Mirror of https://github.com/planetmint/planetmint.git (synced 2025-07-04 03:32:29 +00:00)
Fixing issues (#300)

* made multiprocessing usage explicit and easily identifiable
* fixed error messaging and moved "API not found" reports to debug information
* removed acceptance tests
* removed obsolete GitHub workflow file
* fixed testcases issue with patching
* changed testcases to not check for error logs, as we moved these to debug logs; checks/asserts can be re-integrated as soon as we are able to set the debug level for the single use cases

Signed-off-by: Jürgen Eckel <juergen@riddleandcode.com>
This commit is contained in:
parent 84ae2ccf2b
commit cfa3b6dcd4
22	.github/workflows/acceptance-test.yml (vendored)

@@ -1,22 +0,0 @@
-# Copyright © 2020 Interplanetary Database Association e.V.,
-# Planetmint and IPDB software contributors.
-# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
-# Code is Apache-2.0 and docs are CC-BY-4.0
-
-name: Acceptance tests
-on: [push, pull_request]
-
-jobs:
-  test:
-    if: ${{ false }}
-    runs-on: ubuntu-latest
-
-    steps:
-      - name: Check out repository code
-        uses: actions/checkout@v3
-
-      - name: Start container
-        run: docker-compose up -d planetmint
-
-      - name: Run test
-        run: docker-compose -f docker-compose.yml run --rm python-acceptance pytest /src
11	Makefile

@@ -1,4 +1,4 @@
-.PHONY: help run start stop logs lint test test-unit test-unit-watch test-acceptance test-integration cov docs docs-acceptance clean reset release dist check-deps clean-build clean-pyc clean-test
+.PHONY: help run start stop logs lint test test-unit test-unit-watch test-integration cov docs clean reset release dist check-deps clean-build clean-pyc clean-test
 
 .DEFAULT_GOAL := help
 
@@ -77,7 +77,7 @@ lint: check-py-deps ## Lint the project
 format: check-py-deps ## Format the project
 	black -l 119 .
 
-test: check-deps test-unit #test-acceptance ## Run unit and acceptance tests
+test: check-deps test-unit ## Run unit
 
 test-unit: check-deps ## Run all tests once or specify a file/test with TEST=tests/file.py::Class::test
 	@$(DC) up -d tarantool
@@ -93,8 +93,7 @@ test-unit: check-deps ## Run all tests once or specify a file/test with TEST=tes
 test-unit-watch: check-deps ## Run all tests and wait. Every time you change code, tests will be run again
 	@$(DC) run --rm --no-deps planetmint pytest -f
 
-test-acceptance: check-deps ## Run all acceptance tests
-	@./scripts/run-acceptance-test.sh
-
 test-integration: check-deps ## Run all integration tests
 	@./scripts/run-integration-test.sh
@@ -107,10 +106,6 @@ docs: check-deps ## Generate HTML documentation and open it in the browser
 	@$(DC) run --rm --no-deps bdocs make -C docs/root html
 	$(BROWSER) docs/root/build/html/index.html
 
-docs-acceptance: check-deps ## Create documentation for acceptance tests
-	@$(DC) run --rm python-acceptance pycco -i -s /src -d /docs
-	$(BROWSER) acceptance/python/docs/index.html
-
 docs-integration: check-deps ## Create documentation for integration tests
 	@$(DC) run --rm python-integration pycco -i -s /src -d /docs
 	$(BROWSER) integration/python/docs/index.html
@@ -1,27 +0,0 @@
-<!---
-Copyright © 2020 Interplanetary Database Association e.V.,
-Planetmint and IPDB software contributors.
-SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
-Code is Apache-2.0 and docs are CC-BY-4.0
--->
-
-# Acceptance test suite
-This directory contains the acceptance test suite for Planetmint.
-
-The suite uses Docker Compose to set up a single Planetmint node, run all tests, and finally stop the node. In the future we will add support for a four-node network setup.
-
-## Running the tests
-It should be as easy as `make test-acceptance`.
-
-Note that `make test-acceptance` will take some time to start the node and shut it down. If you are developing a test, or you wish to run a specific test in the acceptance test suite, first start the node with `make start`. After the node is running, you can run `pytest` inside the `python-acceptance` container with:
-
-```bash
-docker-compose run --rm python-acceptance pytest <use whatever option you need>
-```
-
-## Writing and documenting the tests
-Tests are sometimes difficult to read. For acceptance tests, we try to be really explicit about what the test is doing, so please write code that is *simple* and easy to understand. We decided to use literate-programming documentation. To generate the documentation run:
-
-```bash
-make docs-acceptance
-```
1	acceptance/python/.gitignore (vendored)

@@ -1 +0,0 @@
-docs
@@ -1,18 +0,0 @@
-FROM python:3.9
-
-RUN apt-get update \
-    && pip install -U pip \
-    && apt-get autoremove \
-    && apt-get clean
-RUN apt-get install -y vim zsh build-essential cmake git
-
-RUN mkdir -p /src
-RUN /usr/local/bin/python -m pip install --upgrade pip
-RUN pip install --upgrade meson ninja
-RUN pip install --upgrade \
-    pycco \
-    websocket-client~=0.47.0 \
-    pytest~=3.0 \
-    planetmint-driver>=0.9.2 \
-    blns
-RUN pip install planetmint-ipld>=0.0.3
@@ -1,86 +0,0 @@
-# Copyright © 2020 Interplanetary Database Association e.V.,
-# Planetmint and IPDB software contributors.
-# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
-# Code is Apache-2.0 and docs are CC-BY-4.0
-
-import pytest
-
-CONDITION_SCRIPT = """Scenario 'ecdh': create the signature of an object
-Given I have the 'keyring'
-Given that I have a 'string dictionary' named 'houses'
-When I create the signature of 'houses'
-Then print the 'signature'"""
-
-FULFILL_SCRIPT = """Scenario 'ecdh': Bob verifies the signature from Alice
-Given I have a 'ecdh public key' from 'Alice'
-Given that I have a 'string dictionary' named 'houses'
-Given I have a 'signature' named 'signature'
-When I verify the 'houses' has a signature in 'signature' by 'Alice'
-Then print the string 'ok'"""
-
-SK_TO_PK = """Scenario 'ecdh': Create the keypair
-Given that I am known as '{}'
-Given I have the 'keyring'
-When I create the ecdh public key
-When I create the bitcoin address
-Then print my 'ecdh public key'
-Then print my 'bitcoin address'"""
-
-GENERATE_KEYPAIR = """Scenario 'ecdh': Create the keypair
-Given that I am known as 'Pippo'
-When I create the ecdh key
-When I create the bitcoin key
-Then print data"""
-
-INITIAL_STATE = {"also": "more data"}
-SCRIPT_INPUT = {
-    "houses": [
-        {
-            "name": "Harry",
-            "team": "Gryffindor",
-        },
-        {
-            "name": "Draco",
-            "team": "Slytherin",
-        },
-    ],
-}
-
-metadata = {"units": 300, "type": "KG"}
-
-ZENROOM_DATA = {"that": "is my data"}
-
-
-@pytest.fixture
-def gen_key_zencode():
-    return GENERATE_KEYPAIR
-
-
-@pytest.fixture
-def secret_key_to_private_key_zencode():
-    return SK_TO_PK
-
-
-@pytest.fixture
-def fulfill_script_zencode():
-    return FULFILL_SCRIPT
-
-
-@pytest.fixture
-def condition_script_zencode():
-    return CONDITION_SCRIPT
-
-
-@pytest.fixture
-def zenroom_house_assets():
-    return SCRIPT_INPUT
-
-
-@pytest.fixture
-def zenroom_script_input():
-    return SCRIPT_INPUT
-
-
-@pytest.fixture
-def zenroom_data():
-    return ZENROOM_DATA
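The deleted fixtures above do nothing more than hand module-level constants to tests. A minimal self-contained sketch of the same pattern (the names mirror the deleted `conftest.py`, but the `@pytest.fixture` wiring is replaced by a plain function so the sketch runs standalone):

```python
# Shared test data lives in one module-level constant.
SCRIPT_INPUT = {
    "houses": [
        {"name": "Harry", "team": "Gryffindor"},
        {"name": "Draco", "team": "Slytherin"},
    ],
}


def zenroom_script_input():
    # In the deleted conftest.py this function carries @pytest.fixture,
    # so pytest injects the constant into any test that names it.
    return SCRIPT_INPUT


# Every consumer sees the same object, so test data stays in one place.
assert zenroom_script_input() is SCRIPT_INPUT
assert zenroom_script_input()["houses"][1]["team"] == "Slytherin"
```

Keeping fixtures this thin means a test's inputs can be traced back to a single constant at the top of the file.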
@@ -1,174 +0,0 @@
-# Copyright © 2020 Interplanetary Database Association e.V.,
-# Planetmint and IPDB software contributors.
-# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
-# Code is Apache-2.0 and docs are CC-BY-4.0
-
-# # Basic Acceptance Test
-# Here we check that the primitives of the system behave as expected.
-# As you will see, this script tests basic stuff like:
-#
-# - create a transaction
-# - check if the transaction is stored
-# - check for the outputs of a given public key
-# - transfer the transaction to another key
-#
-# We run a series of checks for each step, that is retrieving the transaction from
-# the remote system, and also checking the `outputs` of a given public key.
-
-# ## Imports
-# We need some utils from the `os` package, as we will interact with
-# env variables.
-import os
-
-# For this test case we import and use the Python Driver.
-from planetmint_driver import Planetmint
-from planetmint_driver.crypto import generate_keypair
-from ipld import multihash, marshal
-
-
-def test_get_tests():
-    # ## Set up a connection to Planetmint
-    # To use Planetmint we need a connection. Here we create one. By default we
-    # connect to localhost, but you can override this value using the env variable
-    # called `PLANETMINT_ENDPOINT`; a valid value must include the schema:
-    # `https://example.com:9984`
-    bdb = Planetmint(os.environ.get("PLANETMINT_ENDPOINT"))
-
-    # ## Create keypairs
-    # This test requires the interaction between two actors with their own keypair.
-    # The two keypairs will be called—drum roll—Alice and Bob.
-    alice, bob = generate_keypair(), generate_keypair()
-
-    # ## Alice registers her bike in Planetmint
-    # Alice has a nice bike, and here she creates the "digital twin"
-    # of her bike.
-    bike = {"data": multihash(marshal({"bicycle": {"serial_number": 420420}}))}
-
-    # She prepares a `CREATE` transaction...
-    prepared_creation_tx = bdb.transactions.prepare(operation="CREATE", signers=alice.public_key, asset=bike)
-
-    # ... and she fulfills it with her private key.
-    fulfilled_creation_tx = bdb.transactions.fulfill(prepared_creation_tx, private_keys=alice.private_key)
-
-    # We will use the `id` of this transaction several times, so we store it in
-    # a variable with a short and easy name.
-    bike_id = fulfilled_creation_tx["id"]
-
-    # Now she is ready to send it to the Planetmint network.
-    sent_transfer_tx = bdb.transactions.send_commit(fulfilled_creation_tx)
-
-    # And just to be 100% sure, she also checks if she can retrieve
-    # it from the Planetmint node.
-    assert bdb.transactions.retrieve(bike_id), "Cannot find transaction {}".format(bike_id)
-
-    # Alice is now the proud owner of one unspent asset.
-    assert len(bdb.outputs.get(alice.public_key, spent=False)) == 1
-    assert bdb.outputs.get(alice.public_key)[0]["transaction_id"] == bike_id
-
-    # ## Alice transfers her bike to Bob
-    # After registering her bike, Alice is ready to transfer it to Bob.
-    # She needs to create a new `TRANSFER` transaction.
-
-    # A `TRANSFER` transaction contains a pointer to the original asset. The original asset
-    # is identified by the `id` of the `CREATE` transaction that defined it.
-    transfer_asset = {"id": bike_id}
-
-    # Alice wants to spend the one and only output available, the one with index `0`.
-    output_index = 0
-    output = fulfilled_creation_tx["outputs"][output_index]
-
-    # Here, she defines the `input` of the `TRANSFER` transaction. The `input` contains
-    # several keys:
-    #
-    # - `fulfillment`, taken from the previous `CREATE` transaction.
-    # - `fulfills`, that specifies which condition she is fulfilling.
-    # - `owners_before`.
-    transfer_input = {
-        "fulfillment": output["condition"]["details"],
-        "fulfills": {"output_index": output_index, "transaction_id": fulfilled_creation_tx["id"]},
-        "owners_before": output["public_keys"],
-    }
-
-    # Now that all the elements are set, she creates the actual transaction...
-    prepared_transfer_tx = bdb.transactions.prepare(
-        operation="TRANSFER", asset=transfer_asset, inputs=transfer_input, recipients=bob.public_key
-    )
-
-    # ... and signs it with her private key.
-    fulfilled_transfer_tx = bdb.transactions.fulfill(prepared_transfer_tx, private_keys=alice.private_key)
-
-    # She finally sends the transaction to a Planetmint node.
-    sent_transfer_tx = bdb.transactions.send_commit(fulfilled_transfer_tx)
-
-    # And just to be 100% sure, she also checks if she can retrieve
-    # it from the Planetmint node.
-    assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"]) == sent_transfer_tx
-
-    # Now Alice has zero unspent transactions.
-    assert len(bdb.outputs.get(alice.public_key, spent=False)) == 0
-
-    # While Bob has one.
-    assert len(bdb.outputs.get(bob.public_key, spent=False)) == 1
-
-    # Bob double-checks that what he got was the actual bike.
-    bob_tx_id = bdb.outputs.get(bob.public_key, spent=False)[0]["transaction_id"]
-    assert bdb.transactions.retrieve(bob_tx_id) == sent_transfer_tx
-
-    transfer_asset = {"id": bike_id}
-
-    # Bob wants to spend the one and only output available, the one with index `0`.
-    output_index = 0
-    output = fulfilled_transfer_tx["outputs"][output_index]
-
-    # Here, he defines the `input` of the `TRANSFER` transaction. The `input` contains
-    # several keys:
-    #
-    # - `fulfillment`, taken from the previous `TRANSFER` transaction.
-    # - `fulfills`, that specifies which condition he is fulfilling.
-    # - `owners_before`.
-    transfer_input = {
-        "fulfillment": output["condition"]["details"],
-        "fulfills": {"output_index": output_index, "transaction_id": fulfilled_transfer_tx["id"]},
-        "owners_before": output["public_keys"],
-    }
-
-    # Now that all the elements are set, he creates the actual transaction...
-    prepared_transfer_tx = bdb.transactions.prepare(
-        operation="TRANSFER", asset=transfer_asset, inputs=transfer_input, recipients=bob.public_key
-    )
-
-    # ... and signs it with his private key.
-    fulfilled_transfer_tx = bdb.transactions.fulfill(prepared_transfer_tx, private_keys=bob.private_key)
-
-    # He finally sends the transaction to a Planetmint node.
-    sent_transfer_tx = bdb.transactions.send_commit(fulfilled_transfer_tx)
-
-    assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"]) == sent_transfer_tx
-
-    # from urllib3 import request
-    import urllib3
-    import json
-
-    http = urllib3.PoolManager()
-
-    # verify that 3 transactions contain the asset_id
-    asset_id = bike_id
-    url = "http://planetmint:9984/api/v1/transactions?asset_id=" + asset_id
-    r = http.request("GET", url)
-    tmp_json = http.request("GET", url)
-    tmp_json = json.loads(tmp_json.data.decode("utf-8"))
-    assert len(tmp_json) == 3
-
-    # verify that one transaction is the create TX
-    url = "http://planetmint:9984/api/v1/transactions?asset_id=" + asset_id + "&operation=CREATE"
-    r = http.request("GET", url)
-    tmp_json = http.request("GET", url)
-    tmp_json = json.loads(tmp_json.data.decode("utf-8"))
-    assert len(tmp_json) == 1
-
-    # verify that 2 transactions are of type transfer
-    url = "http://planetmint:9984/api/v1/transactions?asset_id=" + asset_id + "&operation=transfer"
-    r = http.request("GET", url)
-    tmp_json = http.request("GET", url)
-    tmp_json = json.loads(tmp_json.data.decode("utf-8"))
-    assert len(tmp_json) == 2
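The HTTP checks in the deleted test build the same `transactions` query URL three times by string concatenation. A small helper capturing that pattern (the helper name is illustrative; the base URL is the one the deleted test assumes inside its Docker Compose network):

```python
def tx_query_url(base, asset_id, operation=None):
    """Build a /api/v1/transactions query URL like the ones used above."""
    url = f"{base}/api/v1/transactions?asset_id={asset_id}"
    if operation is not None:
        url += f"&operation={operation}"
    return url


# Endpoint assumed by the deleted acceptance test.
base = "http://planetmint:9984"
assert tx_query_url(base, "abc123") == base + "/api/v1/transactions?asset_id=abc123"
assert tx_query_url(base, "abc123", "CREATE").endswith("&operation=CREATE")
```

This also makes the `operation=CREATE` vs `operation=transfer` variants above a one-argument change instead of a copy-pasted URL.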
@@ -1,115 +0,0 @@
-# Copyright © 2020 Interplanetary Database Association e.V.,
-# Planetmint and IPDB software contributors.
-# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
-# Code is Apache-2.0 and docs are CC-BY-4.0
-
-# # Basic Acceptance Test
-# Here we check that the primitives of the system behave as expected.
-# As you will see, this script tests basic stuff like:
-#
-# - create a transaction
-# - check if the transaction is stored
-# - check for the outputs of a given public key
-# - transfer the transaction to another key
-#
-# We run a series of checks for each step, that is retrieving the transaction from
-# the remote system, and also checking the `outputs` of a given public key.
-
-# ## Imports
-# We need some utils from the `os` package, as we will interact with
-# env variables.
-import os
-
-# For this test case we import and use the Python Driver.
-from planetmint_driver import Planetmint
-from planetmint_driver.crypto import generate_keypair
-from ipld import multihash, marshal
-
-
-def test_basic():
-    # ## Set up a connection to Planetmint
-    # To use Planetmint we need a connection. Here we create one. By default we
-    # connect to localhost, but you can override this value using the env variable
-    # called `PLANETMINT_ENDPOINT`; a valid value must include the schema:
-    # `https://example.com:9984`
-    bdb = Planetmint(os.environ.get("PLANETMINT_ENDPOINT"))
-
-    # ## Create keypairs
-    # This test requires the interaction between two actors with their own keypair.
-    # The two keypairs will be called—drum roll—Alice and Bob.
-    alice, bob = generate_keypair(), generate_keypair()
-
-    # ## Alice registers her bike in Planetmint
-    # Alice has a nice bike, and here she creates the "digital twin"
-    # of her bike.
-    bike = [{"data": multihash(marshal({"bicycle": {"serial_number": 420420}}))}]
-
-    # She prepares a `CREATE` transaction...
-    prepared_creation_tx = bdb.transactions.prepare(operation="CREATE", signers=alice.public_key, assets=bike)
-
-    # ... and she fulfills it with her private key.
-    fulfilled_creation_tx = bdb.transactions.fulfill(prepared_creation_tx, private_keys=alice.private_key)
-
-    # We will use the `id` of this transaction several times, so we store it in
-    # a variable with a short and easy name.
-    bike_id = fulfilled_creation_tx["id"]
-
-    # Now she is ready to send it to the Planetmint network.
-    sent_transfer_tx = bdb.transactions.send_commit(fulfilled_creation_tx)
-
-    # And just to be 100% sure, she also checks if she can retrieve
-    # it from the Planetmint node.
-    assert bdb.transactions.retrieve(bike_id), "Cannot find transaction {}".format(bike_id)
-
-    # Alice is now the proud owner of one unspent asset.
-    assert len(bdb.outputs.get(alice.public_key, spent=False)) == 1
-    assert bdb.outputs.get(alice.public_key)[0]["transaction_id"] == bike_id
-
-    # ## Alice transfers her bike to Bob
-    # After registering her bike, Alice is ready to transfer it to Bob.
-    # She needs to create a new `TRANSFER` transaction.
-
-    # A `TRANSFER` transaction contains a pointer to the original asset. The original asset
-    # is identified by the `id` of the `CREATE` transaction that defined it.
-    transfer_assets = [{"id": bike_id}]
-
-    # Alice wants to spend the one and only output available, the one with index `0`.
-    output_index = 0
-    output = fulfilled_creation_tx["outputs"][output_index]
-
-    # Here, she defines the `input` of the `TRANSFER` transaction. The `input` contains
-    # several keys:
-    #
-    # - `fulfillment`, taken from the previous `CREATE` transaction.
-    # - `fulfills`, that specifies which condition she is fulfilling.
-    # - `owners_before`.
-    transfer_input = {
-        "fulfillment": output["condition"]["details"],
-        "fulfills": {"output_index": output_index, "transaction_id": fulfilled_creation_tx["id"]},
-        "owners_before": output["public_keys"],
-    }
-
-    # Now that all the elements are set, she creates the actual transaction...
-    prepared_transfer_tx = bdb.transactions.prepare(
-        operation="TRANSFER", assets=transfer_assets, inputs=transfer_input, recipients=bob.public_key
-    )
-
-    # ... and signs it with her private key.
-    fulfilled_transfer_tx = bdb.transactions.fulfill(prepared_transfer_tx, private_keys=alice.private_key)
-
-    # She finally sends the transaction to a Planetmint node.
-    sent_transfer_tx = bdb.transactions.send_commit(fulfilled_transfer_tx)
-
-    # And just to be 100% sure, she also checks if she can retrieve
-    # it from the Planetmint node.
-    assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"]) == sent_transfer_tx
-
-    # Now Alice has zero unspent transactions.
-    assert len(bdb.outputs.get(alice.public_key, spent=False)) == 0
-
-    # While Bob has one.
-    assert len(bdb.outputs.get(bob.public_key, spent=False)) == 1
-
-    # Bob double-checks that what he got was the actual bike.
-    bob_tx_id = bdb.outputs.get(bob.public_key, spent=False)[0]["transaction_id"]
-    assert bdb.transactions.retrieve(bob_tx_id) == sent_transfer_tx
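The `transfer_input` dict is assembled the same way in every deleted test. A self-contained sketch of that assembly, using a stand-in transaction dict with the same shape as the driver's output (the helper name and all values are illustrative, not from the deleted suite):

```python
def make_transfer_input(tx, output_index=0):
    # Assemble the TRANSFER input from one output of a previous transaction,
    # mirroring the structure the deleted tests build inline:
    # fulfillment, fulfills, owners_before.
    output = tx["outputs"][output_index]
    return {
        "fulfillment": output["condition"]["details"],
        "fulfills": {"output_index": output_index, "transaction_id": tx["id"]},
        "owners_before": output["public_keys"],
    }


# A stand-in transaction shaped like the driver's output (values are fake).
fake_tx = {
    "id": "tx-1",
    "outputs": [
        {"condition": {"details": {"type": "ed25519-sha-256"}},
         "public_keys": ["alice-pk"]}
    ],
}
inp = make_transfer_input(fake_tx)
assert inp["fulfills"] == {"output_index": 0, "transaction_id": "tx-1"}
assert inp["owners_before"] == ["alice-pk"]
```

Note how `fulfills` points back at the spent output by `(transaction_id, output_index)` while `owners_before` carries the keys that must sign the spend.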
@ -1,170 +0,0 @@
|
|||||||
# Copyright © 2020 Interplanetary Database Association e.V.,
|
|
||||||
# Planetmint and IPDB software contributors.
|
|
||||||
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
|
|
||||||
# Code is Apache-2.0 and docs are CC-BY-4.0
|
|
||||||
|
|
||||||
# # Divisible assets integration testing
|
|
||||||
# This test checks if we can successfully divide assets.
|
|
||||||
# The script tests various things like:
|
|
||||||
#
|
|
||||||
# - create a transaction with a divisible asset and issue them to someone
|
|
||||||
# - check if the transaction is stored and has the right amount of tokens
|
|
||||||
# - spend some tokens
|
|
||||||
# - try to spend more tokens than available
|
|
||||||
#
|
|
||||||
# We run a series of checks for each step, that is retrieving
|
|
||||||
# the transaction from the remote system, and also checking the `amount`
|
|
||||||
# of a given transaction.
|
|
||||||
|
|
||||||
# ## Imports
|
|
||||||
# We need some utils from the `os` package, we will interact with
|
|
||||||
# env variables.
|
|
||||||
# We need the `pytest` package to catch the `BadRequest` exception properly.
|
|
||||||
# And of course, we also need the `BadRequest`.
|
|
||||||
import os
|
|
||||||
import pytest
|
|
||||||
from planetmint_driver.exceptions import BadRequest
|
|
||||||
|
|
||||||
# For this test case we import and use the Python Driver.
|
|
||||||
from planetmint_driver import Planetmint
|
|
||||||
from planetmint_driver.crypto import generate_keypair
|
|
||||||
from ipld import multihash, marshal
|
|
||||||
|
|
||||||
|
|
||||||
def test_divisible_assets():
|
|
||||||
# ## Set up a connection to Planetmint
|
|
||||||
# Check [test_basic.py](./test_basic.html) to get some more details
|
|
||||||
# about the endpoint.
|
|
||||||
bdb = Planetmint(os.environ.get("PLANETMINT_ENDPOINT"))
|
|
||||||
|
|
||||||
# Oh look, it is Alice again and she brought her friend Bob along.
|
|
||||||
alice, bob = generate_keypair(), generate_keypair()
|
|
||||||
|
|
||||||
# ## Alice creates a time sharing token
|
|
||||||
# Alice wants to go on vacation, while Bobs bike just broke down.
|
|
||||||
# Alice decides to rent her bike to Bob while she is gone.
|
|
||||||
# So she prepares a `CREATE` transaction to issues 10 tokens.
|
|
||||||
# First, she prepares an asset for a time sharing token. As you can see in
|
|
||||||
# the description, Bob and Alice agree that each token can be used to ride
|
|
||||||
# the bike for one hour.
|
|
||||||
|
|
||||||
bike_token = [
|
|
||||||
{
|
|
||||||
"data": multihash(
|
|
||||||
marshal(
|
|
||||||
{
|
|
||||||
"token_for": {"bike": {"serial_number": 420420}},
|
|
||||||
"description": "Time share token. Each token equals one hour of riding.",
|
|
||||||
}
|
|
||||||
)
|
|
||||||
),
|
|
||||||
}
|
|
||||||
]
|
|
||||||
|
|
||||||
# She prepares a `CREATE` transaction and issues 10 tokens.
|
|
||||||
# Here, Alice defines in a tuple that she wants to assign
|
|
||||||
# these 10 tokens to Bob.
|
|
||||||
prepared_token_tx = bdb.transactions.prepare(
|
|
||||||
operation="CREATE", signers=alice.public_key, recipients=[([bob.public_key], 10)], assets=bike_token
|
|
||||||
)
|
|
||||||
|
|
||||||
# She fulfills and sends the transaction.
|
|
||||||
fulfilled_token_tx = bdb.transactions.fulfill(prepared_token_tx, private_keys=alice.private_key)
|
|
||||||
|
|
||||||
bdb.transactions.send_commit(fulfilled_token_tx)
|
|
||||||
|
|
||||||
# We store the `id` of the transaction to use it later on.
|
|
||||||
bike_token_id = fulfilled_token_tx["id"]
|
|
||||||
|
|
||||||
# Let's check if the transaction was successful.
|
|
||||||
assert bdb.transactions.retrieve(bike_token_id), "Cannot find transaction {}".format(bike_token_id)
|
|
||||||
|
|
||||||
# Bob owns 10 tokens now.
|
|
||||||
assert bdb.transactions.retrieve(bike_token_id)["outputs"][0]["amount"] == "10"
|
|
||||||
|
|
||||||
# ## Bob wants to use the bike
|
|
||||||
# Now that Bob got the tokens and the sun is shining, he wants to get out
|
|
||||||
# with the bike for three hours.
|
|
||||||
# To use the bike he has to send the tokens back to Alice.
|
|
||||||
# To learn about the details of transferring a transaction check out
|
|
||||||
# [test_basic.py](./test_basic.html)
|
|
||||||
transfer_assets = [{"id": bike_token_id}]
|
|
||||||
|
|
||||||
output_index = 0
|
|
||||||
output = fulfilled_token_tx["outputs"][output_index]
|
|
||||||
transfer_input = {
|
|
||||||
"fulfillment": output["condition"]["details"],
|
|
||||||
"fulfills": {"output_index": output_index, "transaction_id": fulfilled_token_tx["id"]},
|
|
||||||
"owners_before": output["public_keys"],
|
|
||||||
}
|
|
||||||
|
|
||||||
# To use the tokens Bob has to reassign 7 tokens to himself and the
|
|
||||||
# amount he wants to use to Alice.
|
|
||||||
prepared_transfer_tx = bdb.transactions.prepare(
|
|
||||||
operation="TRANSFER",
|
|
||||||
asset=transfer_assets,
|
|
||||||
inputs=transfer_input,
|
|
||||||
recipients=[([alice.public_key], 3), ([bob.public_key], 7)],
|
|
||||||
)
|
|
||||||
|
|
||||||
# He signs and sends the transaction.
|
|
||||||
fulfilled_transfer_tx = bdb.transactions.fulfill(prepared_transfer_tx, private_keys=bob.private_key)
|
|
||||||
|
|
||||||
sent_transfer_tx = bdb.transactions.send_commit(fulfilled_transfer_tx)
|
|
||||||
|
|
||||||
# First, Bob checks if the transaction was successful.
|
|
||||||
assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"]) == sent_transfer_tx
|
|
||||||
# There are two outputs in the transaction now.
|
|
||||||
# The first output shows that Alice got back 3 tokens...
assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"])["outputs"][0]["amount"] == "3"

# ... while Bob still has 7 left.
assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"])["outputs"][1]["amount"] == "7"

# ## Bob wants to ride the bike again
# It's been a week and Bob wants to ride the bike again.
# Now he wants to ride for 8 hours, that's a lot Bob!
# He prepares the transaction again.

transfer_assets = [{"id": bike_token_id}]

# This time we need an `output_index` of 1, since we have two outputs
# in the `fulfilled_transfer_tx` we created before. The first output with
# index 0 is for Alice and the second output is for Bob.
# Since Bob wants to spend more of his tokens he has to provide the
# correct output with the correct amount of tokens.
output_index = 1

output = fulfilled_transfer_tx["outputs"][output_index]

transfer_input = {
    "fulfillment": output["condition"]["details"],
    "fulfills": {"output_index": output_index, "transaction_id": fulfilled_transfer_tx["id"]},
    "owners_before": output["public_keys"],
}

# This time Bob only provides Alice in the `recipients` because he wants
# to spend all his tokens.
prepared_transfer_tx = bdb.transactions.prepare(
    operation="TRANSFER", assets=transfer_assets, inputs=transfer_input, recipients=[([alice.public_key], 8)]
)

fulfilled_transfer_tx = bdb.transactions.fulfill(prepared_transfer_tx, private_keys=bob.private_key)

# Oh Bob, what have you done?! You tried to spend more tokens than you had.
# Remember Bob, last time you spent 3 tokens already,
# so you only have 7 left.
with pytest.raises(BadRequest) as error:
    bdb.transactions.send_commit(fulfilled_transfer_tx)

# Now Bob gets an error saying that the amount he wanted to spend is
# higher than the amount of tokens he has left.
assert error.value.args[0] == 400
message = (
    "Invalid transaction (AmountError): The amount used in the "
    "inputs `7` needs to be same as the amount used in the "
    "outputs `8`"
)
assert error.value.args[2]["message"] == message

# We have to stop this test now, I am sorry, but Bob is pretty upset
# about his mistake. See you next time :)
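The amount check that trips Bob up can be sketched without the driver. Output amounts are serialized as strings in Planetmint transactions, so computing a spendable balance means converting them first. A minimal helper (hypothetical, not part of the driver API; the keys and transaction below are stand-ins):

```python
def spendable_amount(tx, public_key):
    """Sum the token amounts of all outputs in `tx` owned by `public_key`.

    Output amounts are strings in Planetmint transactions, so they are
    converted to int before summing.
    """
    return sum(
        int(output["amount"])
        for output in tx["outputs"]
        if public_key in output["public_keys"]
    )


# Mirroring the transfer above: Alice holds 3 tokens and Bob holds 7.
tx = {
    "outputs": [
        {"amount": "3", "public_keys": ["alice_pk"]},
        {"amount": "7", "public_keys": ["bob_pk"]},
    ]
}
assert spendable_amount(tx, "bob_pk") == 7  # so spending 8 must fail
```

Checking this locally before preparing a TRANSFER avoids the round trip that ends in a 400 `AmountError`.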
@@ -1,49 +0,0 @@
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0

# # Double Spend testing
# This test challenges the system with double spends.

import os
from uuid import uuid4
from threading import Thread
import queue

import planetmint_driver.exceptions
from planetmint_driver import Planetmint
from planetmint_driver.crypto import generate_keypair
from ipld import multihash, marshal


def test_double_create():
    bdb = Planetmint(os.environ.get("PLANETMINT_ENDPOINT"))
    alice = generate_keypair()

    results = queue.Queue()

    tx = bdb.transactions.fulfill(
        bdb.transactions.prepare(
            operation="CREATE", signers=alice.public_key, assets=[{"data": multihash(marshal({"uuid": str(uuid4())}))}]
        ),
        private_keys=alice.private_key,
    )

    def send_and_queue(tx):
        try:
            bdb.transactions.send_commit(tx)
            results.put("OK")
        except planetmint_driver.exceptions.TransportError:
            results.put("FAIL")

    t1 = Thread(target=send_and_queue, args=(tx,))
    t2 = Thread(target=send_and_queue, args=(tx,))

    t1.start()
    t2.start()

    results = [results.get(timeout=2), results.get(timeout=2)]

    assert results.count("OK") == 1
    assert results.count("FAIL") == 1
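The pattern the deleted test relies on — two concurrent commits of the same transaction, exactly one of which may succeed — can be reproduced with only the standard library. This is a toy sketch, not the Planetmint validation logic: `ToyLedger` is a hypothetical stand-in that accepts each transaction id once, guarded by a lock:

```python
import queue
import threading


class ToyLedger:
    """Hypothetical ledger: accepts each transaction id at most once."""

    def __init__(self):
        self._seen = set()
        self._lock = threading.Lock()

    def commit(self, txid):
        with self._lock:  # concurrent commits cannot both pass the check
            if txid in self._seen:
                raise ValueError("double spend: %s" % txid)
            self._seen.add(txid)


def send_and_queue(ledger, txid, results):
    try:
        ledger.commit(txid)
        results.put("OK")
    except ValueError:
        results.put("FAIL")


ledger = ToyLedger()
results = queue.Queue()
threads = [
    threading.Thread(target=send_and_queue, args=(ledger, "tx-1", results))
    for _ in range(2)
]
for t in threads:
    t.start()
for t in threads:
    t.join()

outcome = [results.get(timeout=2), results.get(timeout=2)]
assert outcome.count("OK") == 1
assert outcome.count("FAIL") == 1
```

Whichever interleaving the scheduler picks, the lock guarantees the same outcome the acceptance test asserts: one `OK`, one `FAIL`.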
@@ -1,107 +0,0 @@
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0

# # Multiple owners integration testing
# This test checks if we can successfully create and transfer a transaction
# with multiple owners.
# The script tests various things like:
#
# - create a transaction with multiple owners
# - check if the transaction is stored and has the right amount of public keys
# - transfer the transaction to a third person
#
# We run a series of checks for each step, that is retrieving
# the transaction from the remote system, and also checking the public keys
# of a given transaction.


# ## Imports
# We need some utils from the `os` package, we will interact with
# env variables.
import os

# For this test case we import and use the Python Driver.
from planetmint_driver import Planetmint
from planetmint_driver.crypto import generate_keypair
from ipld import multihash, marshal


def test_multiple_owners():
    # ## Set up a connection to Planetmint
    # Check [test_basic.py](./test_basic.html) to get some more details
    # about the endpoint.
    bdb = Planetmint(os.environ.get("PLANETMINT_ENDPOINT"))

    # Hey Alice and Bob, nice to see you again!
    alice, bob = generate_keypair(), generate_keypair()

    # ## Alice and Bob create a transaction
    # Alice and Bob just moved into a shared flat, no one can afford these
    # high rents anymore. Bob suggests getting a dish washer for the
    # kitchen. Alice agrees and here they go, creating the asset for their
    # dish washer.
    dw_asset = {"data": multihash(marshal({"dish washer": {"serial_number": 1337}}))}

    # They prepare a `CREATE` transaction. To have multiple owners, both
    # Bob and Alice need to be the recipients.
    prepared_dw_tx = bdb.transactions.prepare(
        operation="CREATE", signers=alice.public_key, recipients=(alice.public_key, bob.public_key), assets=[dw_asset]
    )

    # Now they both sign the transaction by providing their private keys,
    # and send it afterwards.
    fulfilled_dw_tx = bdb.transactions.fulfill(prepared_dw_tx, private_keys=[alice.private_key, bob.private_key])

    bdb.transactions.send_commit(fulfilled_dw_tx)

    # We store the `id` of the transaction to use it later on.
    dw_id = fulfilled_dw_tx["id"]

    # Let's check if the transaction was successful.
    assert bdb.transactions.retrieve(dw_id), "Cannot find transaction {}".format(dw_id)

    # The transaction should have two public keys in the outputs.
    assert len(bdb.transactions.retrieve(dw_id)["outputs"][0]["public_keys"]) == 2

    # ## Alice and Bob transfer a transaction to Carol.
    # Alice and Bob save a lot of money living together. They often go out
    # for dinner and don't cook at home. But now they don't have any dishes to
    # wash, so they decide to sell the dish washer to their friend Carol.

    # Hey Carol, nice to meet you!
    carol = generate_keypair()

    # Alice and Bob prepare the transaction to transfer the dish washer to
    # Carol.
    transfer_assets = [{"id": dw_id}]

    output_index = 0
    output = fulfilled_dw_tx["outputs"][output_index]
    transfer_input = {
        "fulfillment": output["condition"]["details"],
        "fulfills": {"output_index": output_index, "transaction_id": fulfilled_dw_tx["id"]},
        "owners_before": output["public_keys"],
    }

    # Now they create the transaction...
    prepared_transfer_tx = bdb.transactions.prepare(
        operation="TRANSFER", assets=transfer_assets, inputs=transfer_input, recipients=carol.public_key
    )

    # ... and sign it with their private keys, then send it.
    fulfilled_transfer_tx = bdb.transactions.fulfill(
        prepared_transfer_tx, private_keys=[alice.private_key, bob.private_key]
    )

    sent_transfer_tx = bdb.transactions.send_commit(fulfilled_transfer_tx)

    # They check if the transaction was successful.
    assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"]) == sent_transfer_tx

    # The owners before should include both Alice and Bob.
    assert len(bdb.transactions.retrieve(fulfilled_transfer_tx["id"])["inputs"][0]["owners_before"]) == 2

    # While the new owner is Carol.
    assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"])["outputs"][0]["public_keys"][0] == carol.public_key
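The step that makes shared ownership work is building the TRANSFER input from the spent CREATE output: the output's `public_keys` become the input's `owners_before`, which is why the test must fulfill with both private keys. A sketch on plain dicts (the id and keys are hypothetical placeholders):

```python
# A pared-down CREATE transaction with one output co-owned by two keys.
create_tx = {
    "id": "dw-tx-id",  # hypothetical transaction id
    "outputs": [
        {"condition": {"details": {}}, "public_keys": ["alice_pk", "bob_pk"]},
    ],
}

output_index = 0
output = create_tx["outputs"][output_index]

# The spent output's public keys carry over as `owners_before`.
transfer_input = {
    "fulfillment": output["condition"]["details"],
    "fulfills": {"output_index": output_index, "transaction_id": create_tx["id"]},
    "owners_before": output["public_keys"],
}

assert len(transfer_input["owners_before"]) == 2  # both co-owners must sign
```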
@@ -1,134 +0,0 @@
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0

# ## Testing potentially hazardous strings
# This test uses a library of `naughty` strings (code injections, weird unicode chars., etc.) as both keys and values.
# We look for either a successful tx or, when a naughty string used as a key violates some key
# constraints, a well formatted error message.

# ## Imports
# We need some utils from the `os` package, we will interact with
# env variables.
import os

# Since the naughty strings get encoded and decoded in odd ways,
# we'll use a regex to sweep those details under the rug.
import re

# We'll use a nice library of naughty strings...
from blns import blns

# And parameterize our test so each one is treated as a separate test case.
import pytest

# For this test case we import and use the Python Driver.
from planetmint_driver import Planetmint
from planetmint_driver.crypto import generate_keypair
from planetmint_driver.exceptions import BadRequest
from ipld import multihash, marshal

naughty_strings = blns.all()
skipped_naughty_strings = [
    "1.00",
    "$1.00",
    "-1.00",
    "-$1.00",
    "0.00",
    "0..0",
    ".",
    "0.0.0",
    "-.",
    ",./;'[]\\-=",
    "ثم نفس سقطت وبالتحديد،, جزيرتي باستخدام أن دنو. إذ هنا؟ الستار وتنصيب كان. أهّل ايطاليا، بريطانيا-فرنسا قد أخذ. سليمان، إتفاقية بين ما, يذكر الحدود أي بعد, معاملة بولندا، الإطلاق عل إيو.",
    "test\x00",
    "Ṱ̺̺̕o͞ ̷i̲̬͇̪͙n̝̗͕v̟̜̘̦͟o̶̙̰̠kè͚̮̺̪̹̱̤ ̖t̝͕̳̣̻̪͞h̼͓̲̦̳̘̲e͇̣̰̦̬͎ ̢̼̻̱̘h͚͎͙̜̣̲ͅi̦̲̣̰̤v̻͍e̺̭̳̪̰-m̢iͅn̖̺̞̲̯̰d̵̼̟͙̩̼̘̳ ̞̥̱̳̭r̛̗̘e͙p͠r̼̞̻̭̗e̺̠̣͟s̘͇̳͍̝͉e͉̥̯̞̲͚̬͜ǹ̬͎͎̟̖͇̤t͍̬̤͓̼̭͘ͅi̪̱n͠g̴͉ ͏͉ͅc̬̟h͡a̫̻̯͘o̫̟̖͍̙̝͉s̗̦̲.̨̹͈̣",
    "̡͓̞ͅI̗̘̦͝n͇͇͙v̮̫ok̲̫̙͈i̖͙̭̹̠̞n̡̻̮̣̺g̲͈͙̭͙̬͎ ̰t͔̦h̞̲e̢̤ ͍̬̲͖f̴̘͕̣è͖ẹ̥̩l͖͔͚i͓͚̦͠n͖͍̗͓̳̮g͍ ̨o͚̪͡f̘̣̬ ̖̘͖̟͙̮c҉͔̫͖͓͇͖ͅh̵̤̣͚͔á̗̼͕ͅo̼̣̥s̱͈̺̖̦̻͢.̛̖̞̠̫̰",
    "̗̺͖̹̯͓Ṯ̤͍̥͇͈h̲́e͏͓̼̗̙̼̣͔ ͇̜̱̠͓͍ͅN͕͠e̗̱z̘̝̜̺͙p̤̺̹͍̯͚e̠̻̠͜r̨̤͍̺̖͔̖̖d̠̟̭̬̝͟i̦͖̩͓͔̤a̠̗̬͉̙n͚͜ ̻̞̰͚ͅh̵͉i̳̞v̢͇ḙ͎͟-҉̭̩̼͔m̤̭̫i͕͇̝̦n̗͙ḍ̟ ̯̲͕͞ǫ̟̯̰̲͙̻̝f ̪̰̰̗̖̭̘͘c̦͍̲̞͍̩̙ḥ͚a̮͎̟̙͜ơ̩̹͎s̤.̝̝ ҉Z̡̖̜͖̰̣͉̜a͖̰͙̬͡l̲̫̳͍̩g̡̟̼̱͚̞̬ͅo̗͜.̟",
    "̦H̬̤̗̤͝e͜ ̜̥̝̻͍̟́w̕h̖̯͓o̝͙̖͎̱̮ ҉̺̙̞̟͈W̷̼̭a̺̪͍į͈͕̭͙̯̜t̶̼̮s̘͙͖̕ ̠̫̠B̻͍͙͉̳ͅe̵h̵̬͇̫͙i̹͓̳̳̮͎̫̕n͟d̴̪̜̖ ̰͉̩͇͙̲͞ͅT͖̼͓̪͢h͏͓̮̻e̬̝̟ͅ ̤̹̝W͙̞̝͔͇͝ͅa͏͓͔̹̼̣l̴͔̰̤̟͔ḽ̫.͕",
    '"><script>alert(document.title)</script>',
    "'><script>alert(document.title)</script>",
    "><script>alert(document.title)</script>",
    "</script><script>alert(document.title)</script>",
    "< / script >< script >alert(document.title)< / script >",
    " onfocus=alert(document.title) autofocus ",
    '" onfocus=alert(document.title) autofocus ',
    "' onfocus=alert(document.title) autofocus ",
    "<script>alert(document.title)</script>",
    "/dev/null; touch /tmp/blns.fail ; echo",
    "../../../../../../../../../../../etc/passwd%00",
    "../../../../../../../../../../../etc/hosts",
    "() { 0; }; touch /tmp/blns.shellshock1.fail;",
    "() { _; } >_[$($())] { touch /tmp/blns.shellshock2.fail; }",
]

naughty_strings = [naughty for naughty in naughty_strings if naughty not in skipped_naughty_strings]


# This is our base test case, but we'll reuse it to send naughty strings as both keys and values.
def send_naughty_tx(assets, metadata):
    # ## Set up a connection to Planetmint
    # Check [test_basic.py](./test_basic.html) to get some more details
    # about the endpoint.
    bdb = Planetmint(os.environ.get("PLANETMINT_ENDPOINT"))

    # Here's Alice.
    alice = generate_keypair()

    # Alice is in a naughty mood today, so she creates a tx with some naughty strings.
    prepared_transaction = bdb.transactions.prepare(
        operation="CREATE", signers=alice.public_key, assets=assets, metadata=metadata
    )

    # She fulfills the transaction.
    fulfilled_transaction = bdb.transactions.fulfill(prepared_transaction, private_keys=alice.private_key)

    # The fulfilled tx gets sent to the BDB network.
    try:
        sent_transaction = bdb.transactions.send_commit(fulfilled_transaction)
    except BadRequest as e:
        sent_transaction = e

    # If her key contained a '.', began with a '$', or contained a NUL character...
    regex = r".*\..*|\$.*|.*\x00.*"
    key = next(iter(metadata))
    if re.match(regex, key):
        # ... then she expects a nicely formatted error code.
        status_code = sent_transaction.status_code
        error = sent_transaction.error
        regex = (
            r"\{\s*\n*"
            r'\s*"message":\s*"Invalid transaction \(ValidationError\):\s*'
            r"Invalid key name.*The key name cannot contain characters.*\n*"
            r'\s*"status":\s*400\n*'
            r"\s*\}\n*"
        )
        assert status_code == 400
        assert re.fullmatch(regex, error), sent_transaction
    # Otherwise, she expects to see her transaction in the database.
    elif "id" in sent_transaction.keys():
        tx_id = sent_transaction["id"]
        assert bdb.transactions.retrieve(tx_id)
    # If neither condition was true, then something weird happened...
    else:
        raise TypeError(sent_transaction)


@pytest.mark.parametrize("naughty_string", naughty_strings, ids=naughty_strings)
def test_naughty_keys(naughty_string):
    assets = [{"data": multihash(marshal({naughty_string: "nice_value"}))}]
    metadata = multihash(marshal({naughty_string: "nice_value"}))

    send_naughty_tx(assets, metadata)


@pytest.mark.parametrize("naughty_string", naughty_strings, ids=naughty_strings)
def test_naughty_values(naughty_string):
    assets = [{"data": multihash(marshal({"nice_key": naughty_string}))}]
    metadata = multihash(marshal({"nice_key": naughty_string}))

    send_naughty_tx(assets, metadata)
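The key check at the heart of the deleted test is a single regex: a metadata key is rejected when it contains a `.`, starts with `$`, or contains a NUL character. It can be exercised on its own:

```python
import re

# Same pattern as in send_naughty_tx above: dot anywhere, leading '$',
# or an embedded NUL byte marks a key as invalid.
NAUGHTY_KEY = re.compile(r".*\..*|\$.*|.*\x00.*")

assert NAUGHTY_KEY.match("1.00")       # dot anywhere in the key
assert NAUGHTY_KEY.match("$where")     # key starting with a dollar sign
assert NAUGHTY_KEY.match("test\x00")   # embedded NUL character
assert not NAUGHTY_KEY.match("nice_key")  # plain keys pass
```

This also explains the `skipped_naughty_strings` list: entries such as `"1.00"` or `"$1.00"` would trip the key check by design, so they are excluded rather than tested.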
@@ -1,135 +0,0 @@
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0

# # Stream Acceptance Test
# This test checks if the event stream works correctly. The basic idea of this
# test is to generate some random **valid** transactions, send them to a
# Planetmint node, and expect those transactions to be returned by the valid
# transactions Stream API. During this test, two threads work together,
# sharing a queue to exchange events.
#
# - The *main thread* first creates and sends the transactions to Planetmint;
#   then it runs through all events in the shared queue to check if all
#   transactions sent have been validated by Planetmint.
# - The *listen thread* listens to the events coming from Planetmint and puts
#   them in a queue shared with the main thread.

import os
import queue
import json
from threading import Thread, Event
from uuid import uuid4
from ipld import multihash, marshal

# For this script, we need to set up a websocket connection, that's the reason
# we import the
# [websocket](https://github.com/websocket-client/websocket-client) module
from websocket import create_connection

from planetmint_driver import Planetmint
from planetmint_driver.crypto import generate_keypair


def test_stream():
    # ## Set up the test
    # We use the env variable `PLANETMINT_ENDPOINT` to know where to connect.
    # Check [test_basic.py](./test_basic.html) for more information.
    BDB_ENDPOINT = os.environ.get("PLANETMINT_ENDPOINT")

    # *That's pretty bad, but let's do like this for now.*
    WS_ENDPOINT = "ws://{}:9985/api/v1/streams/valid_transactions".format(BDB_ENDPOINT.rsplit(":")[0])

    bdb = Planetmint(BDB_ENDPOINT)

    # Hello to Alice again, she is pretty active in those tests, good job
    # Alice!
    alice = generate_keypair()

    # We need a few variables to keep the state, specifically we need `sent` to
    # keep track of all transactions Alice sent to Planetmint, while `received`
    # are the transactions Planetmint validated and sent back to her.
    sent = []
    received = queue.Queue()

    # In this test we use a websocket. The websocket must be started **before**
    # sending transactions to Planetmint, otherwise we might lose some
    # transactions. The `ws_ready` event is used to synchronize the main thread
    # with the listen thread.
    ws_ready = Event()

    # ## Listening to events
    # This is the function run by the complementary thread.
    def listen():
        # First we connect to the remote endpoint using the WebSocket protocol.
        ws = create_connection(WS_ENDPOINT)

        # After the connection has been set up, we can signal the main thread
        # to proceed (continue reading, it should make sense in a second.)
        ws_ready.set()

        # It's time to consume all events coming from the Planetmint stream API.
        # Every time a new event is received, it is put in the queue shared
        # with the main thread.
        while True:
            result = ws.recv()
            received.put(result)

    # Put `listen` in a thread, and start it. Note that `listen` is a local
    # function and it can access all variables in the enclosing function.
    t = Thread(target=listen, daemon=True)
    t.start()

    # ## Pushing the transactions to Planetmint
    # After starting the listen thread, we wait for it to connect, and then we
    # proceed.
    ws_ready.wait()

    # Here we prepare, sign, and send ten different `CREATE` transactions. To
    # make sure each transaction is different from the other, we generate a
    # random `uuid`.
    for _ in range(10):
        tx = bdb.transactions.fulfill(
            bdb.transactions.prepare(
                operation="CREATE",
                signers=alice.public_key,
                assets=[{"data": multihash(marshal({"uuid": str(uuid4())}))}],
            ),
            private_keys=alice.private_key,
        )
        # We don't want to wait for each transaction to be in a block. By using
        # `async` mode, we make sure that the driver returns as soon as the
        # transaction is pushed to the Planetmint API. Remember: we expect all
        # transactions to be in the shared queue: this is a two phase test,
        # first we send a bunch of transactions, then we check if they are
        # valid (and, in this case, they should be).
        bdb.transactions.send_async(tx)

        # The `id` of every sent transaction is then stored in a list.
        sent.append(tx["id"])

    # ## Check the valid transactions coming from Planetmint
    # Now we are ready to check if Planetmint did its job. A simple way to
    # check if all sent transactions have been processed is to **remove** from
    # `sent` the transactions we get from the *listen thread*. At one point in
    # time, `sent` should be empty, and we exit the test.
    while sent:
        # To avoid waiting forever, we have an arbitrary timeout of 5
        # seconds: it should be enough time for Planetmint to create
        # blocks, in fact a new block is created every second. If we hit
        # the timeout, then game over ¯\\\_(ツ)\_/¯
        try:
            event = received.get(timeout=5)
            txid = json.loads(event)["transaction_id"]
        except queue.Empty:
            assert False, "Did not receive all expected transactions"

        # The last thing is to try to remove the `txid` from the set of sent
        # transactions. If this test is running in parallel with others, we
        # might get a transaction id of another test, and `remove` can fail.
        # It's OK if this happens.
        try:
            sent.remove(txid)
        except ValueError:
            pass
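The synchronization scheme of the stream test, reduced to standard-library parts: a listener thread signals readiness via an `Event`, then feeds a shared queue, while the main thread drains it and crosses ids off the `sent` list. The websocket is replaced here by a hypothetical in-memory list of events:

```python
import json
import queue
import threading

received = queue.Queue()
ready = threading.Event()
sent = ["tx-%d" % i for i in range(10)]  # hypothetical transaction ids


def listen(events):
    ready.set()  # "connection" established; main thread may start sending
    for event in events:
        received.put(event)


# Simulated stream events, one per sent transaction.
events = [json.dumps({"transaction_id": txid}) for txid in sent]
t = threading.Thread(target=listen, args=(events,), daemon=True)
t.start()

ready.wait()
while sent:
    event = received.get(timeout=5)  # queue.Empty here would fail the test
    txid = json.loads(event)["transaction_id"]
    try:
        sent.remove(txid)
    except ValueError:
        pass  # an id from a parallel test run; ignore it

assert sent == []
```

Starting the listener before producing events is the point: otherwise events emitted during the gap are simply lost, exactly the race the original test guards against with `ws_ready`.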
@@ -1,162 +0,0 @@
import os
import json
import base58
from hashlib import sha3_256
from planetmint_cryptoconditions.types.ed25519 import Ed25519Sha256
from planetmint_cryptoconditions.types.zenroom import ZenroomSha256
from zenroom import zencode_exec
from planetmint_driver import Planetmint
from planetmint_driver.crypto import generate_keypair
from ipld import multihash, marshal


def test_zenroom_signing(
    gen_key_zencode,
    secret_key_to_private_key_zencode,
    fulfill_script_zencode,
    zenroom_data,
    zenroom_house_assets,
    zenroom_script_input,
    condition_script_zencode,
):
    biolabs = generate_keypair()
    version = "2.0"

    alice = json.loads(zencode_exec(gen_key_zencode).output)["keyring"]
    bob = json.loads(zencode_exec(gen_key_zencode).output)["keyring"]

    zen_public_keys = json.loads(
        zencode_exec(secret_key_to_private_key_zencode.format("Alice"), keys=json.dumps({"keyring": alice})).output
    )
    zen_public_keys.update(
        json.loads(
            zencode_exec(secret_key_to_private_key_zencode.format("Bob"), keys=json.dumps({"keyring": bob})).output
        )
    )

    zenroomscpt = ZenroomSha256(script=fulfill_script_zencode, data=zenroom_data, keys=zen_public_keys)
    print(f"zenroom is: {zenroomscpt.script}")

    # CRYPTO-CONDITIONS: generate the condition uri
    condition_uri_zen = zenroomscpt.condition.serialize_uri()
    print(f"\nzenroom condition URI: {condition_uri_zen}")

    # CRYPTO-CONDITIONS: construct an unsigned fulfillment dictionary
    unsigned_fulfillment_dict_zen = {
        "type": zenroomscpt.TYPE_NAME,
        "public_key": base58.b58encode(biolabs.public_key).decode(),
    }
    output = {
        "amount": "10",
        "condition": {
            "details": unsigned_fulfillment_dict_zen,
            "uri": condition_uri_zen,
        },
        "public_keys": [
            biolabs.public_key,
        ],
    }
    input_ = {
        "fulfillment": None,
        "fulfills": None,
        "owners_before": [
            biolabs.public_key,
        ],
    }

    script_ = {
        "code": {"type": "zenroom", "raw": "test_string", "parameters": [{"obj": "1"}, {"obj": "2"}]},  # obsolete
        "state": "dd8bbd234f9869cab4cc0b84aa660e9b5ef0664559b8375804ee8dce75b10576",
        "input": zenroom_script_input,
        "output": ["ok"],
        "policies": {},
    }
    metadata = {"result": {"output": ["ok"]}}

    token_creation_tx = {
        "operation": "CREATE",
        "assets": [{"data": multihash(marshal({"test": "my asset"}))}],
        "metadata": multihash(marshal(metadata)),
        "script": script_,
        "outputs": [
            output,
        ],
        "inputs": [
            input_,
        ],
        "version": version,
        "id": None,
    }

    # JSON: serialize the transaction-without-id to a json formatted string
    tx = json.dumps(
        token_creation_tx,
        sort_keys=True,
        separators=(",", ":"),
        ensure_ascii=False,
    )
    script_ = json.dumps(script_)
    # major workflow:
    # we store the fulfill script in the transaction/message (zenroom-sha)
    # the condition script is used to fulfill the transaction and create the signature
    #
    # the server should pick the fulfill script, recreate the zenroom-sha and verify the signature

    signed_input = zenroomscpt.sign(script_, condition_script_zencode, alice)

    input_signed = json.loads(signed_input)
    input_signed["input"]["signature"] = input_signed["output"]["signature"]
    del input_signed["output"]["signature"]
    del input_signed["output"]["logs"]
    input_signed["output"] = ["ok"]  # define the expected output that is to be compared
    input_msg = json.dumps(input_signed)

    assert zenroomscpt.validate(message=input_msg)

    tx = json.loads(tx)
    fulfillment_uri_zen = zenroomscpt.serialize_uri()

    tx["inputs"][0]["fulfillment"] = fulfillment_uri_zen
    tx["script"] = input_signed
    tx["id"] = None
    json_str_tx = json.dumps(tx, sort_keys=True, skipkeys=False, separators=(",", ":"))
    # SHA3: hash the serialized id-less transaction to generate the id
    shared_creation_txid = sha3_256(json_str_tx.encode()).hexdigest()
    tx["id"] = shared_creation_txid
    # tx = json.dumps(tx)
    # `https://example.com:9984`
    print(f"TX \n{tx}")
    plntmnt = Planetmint(os.environ.get("PLANETMINT_ENDPOINT"))
    sent_transfer_tx = plntmnt.transactions.send_commit(tx)

    print(f"\n\nstatus and result : + {sent_transfer_tx}")
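The id-derivation step near the end of the zenroom test is independent of zenroom and can be isolated: the transaction id is the SHA3-256 hex digest of the compact, key-sorted JSON serialization of the transaction with its `id` set to `None`. A small sketch (the transaction content is a placeholder):

```python
import json
from hashlib import sha3_256


def derive_txid(tx):
    """SHA3-256 hex digest of the canonical JSON of the id-less transaction."""
    tx = dict(tx, id=None)  # the id field never participates in its own hash
    serialized = json.dumps(tx, sort_keys=True, skipkeys=False, separators=(",", ":"))
    return sha3_256(serialized.encode()).hexdigest()


tx = {"operation": "CREATE", "version": "2.0", "id": None}
txid = derive_txid(tx)
assert len(txid) == 64  # SHA3-256 digest rendered as hex
assert derive_txid(dict(tx, id="anything")) == txid  # id is blanked first
```

The `sort_keys=True` and `separators=(",", ":")` arguments matter: any whitespace or key-order difference would change the digest, so every party must serialize identically to agree on the id.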
@@ -86,17 +86,6 @@ services:
     image: appropriate/curl
     command: /bin/sh -c "curl -s http://planetmint:9984/ > /dev/null && curl -s http://tendermint:26657/ > /dev/null"
 
-  # Planetmint setup to do acceptance testing with Python
-  python-acceptance:
-    build:
-      context: .
-      dockerfile: ./acceptance/python/Dockerfile
-    volumes:
-      - ./acceptance/python/docs:/docs
-      - ./acceptance/python/src:/src
-    environment:
-      - PLANETMINT_ENDPOINT=planetmint
-
   # Build docs only
   # docker-compose build bdocs
   # docker-compose up -d bdocs
@@ -10,7 +10,7 @@ for ``argparse.ArgumentParser``.
 import argparse
 import builtins
 import functools
-import multiprocessing as mp
+import multiprocessing
 import sys
 import planetmint
 import planetmint.config_utils
@@ -132,7 +132,7 @@ def start(parser, argv, scope):
     if args.multiprocess is False:
         args.multiprocess = 1
     elif args.multiprocess is None:
-        args.multiprocess = mp.cpu_count()
+        args.multiprocess = multiprocessing.cpu_count()
 
     return func(args)
 
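The `--multiprocess` normalization touched by this hunk follows a three-way convention: `False` (flag absent) means one process, `None` (flag given without a value) means one process per core, and an explicit number is taken as-is. A sketch of that logic (the function name is hypothetical; the real code inlines it in `start`):

```python
import multiprocessing


def normalize_multiprocess(value):
    """Map the --multiprocess argparse value to a worker count."""
    if value is False:
        return 1  # flag not given: single process
    if value is None:
        return multiprocessing.cpu_count()  # bare flag: use all cores
    return value  # explicit count


assert normalize_multiprocess(False) == 1
assert normalize_multiprocess(4) == 4
assert normalize_multiprocess(None) >= 1  # cpu_count() of the host
```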
@@ -5,7 +5,7 @@
 
 from queue import Empty
 from collections import defaultdict
-from multiprocessing import Queue
+import multiprocessing
 
 
 POISON_PILL = "POISON_PILL"
@@ -46,8 +46,8 @@ class Exchange:
     """Dispatch events to subscribers."""
 
     def __init__(self):
-        self.publisher_queue = Queue()
-        self.started_queue = Queue()
+        self.publisher_queue = multiprocessing.Queue()
+        self.started_queue = multiprocessing.Queue()
 
         # Map <event_types -> queues>
         self.queues = defaultdict(list)
@@ -80,7 +80,7 @@ class Exchange:
         if event_types is None:
             event_types = EventTypes.ALL
 
-        queue = Queue()
+        queue = multiprocessing.Queue()
         self.queues[event_types].append(queue)
         return queue
 
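The `Exchange` keeps a fan-out map from event types to subscriber queues; dispatching puts each event on every queue registered for its type. A reduced sketch of that structure, using string event types as placeholders for the real bitmask constants:

```python
import multiprocessing
from collections import defaultdict

# One list of subscriber queues per event type, as in Exchange.queues.
queues = defaultdict(list)


def subscribe(event_types):
    """Register and return a new subscriber queue for the given event types."""
    q = multiprocessing.Queue()
    queues[event_types].append(q)
    return q


sub_a = subscribe("BLOCK_VALID")
sub_b = subscribe("BLOCK_VALID")

# Dispatch: every subscriber of the type receives its own copy of the event.
for q in queues["BLOCK_VALID"]:
    q.put({"height": 1})

assert sub_a.get(timeout=2)["height"] == 1
assert sub_b.get(timeout=2)["height"] == 1
```

Using `multiprocessing.Queue` (the point of this hunk) rather than `queue.Queue` is what lets subscribers live in separate worker processes.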
@ -3,7 +3,7 @@
|
|||||||
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
|
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
|
||||||
# Code is Apache-2.0 and docs are CC-BY-4.0
|
# Code is Apache-2.0 and docs are CC-BY-4.0
|
||||||
|
|
||||||
import multiprocessing as mp
|
import multiprocessing
|
||||||
|
|
||||||
from collections import defaultdict
|
from collections import defaultdict
|
||||||
from planetmint import App
|
from planetmint import App
|
||||||
@@ -44,17 +44,17 @@ EXIT = "exit"


 class ParallelValidator:
-    def __init__(self, number_of_workers=mp.cpu_count()):
+    def __init__(self, number_of_workers=multiprocessing.cpu_count()):
         self.number_of_workers = number_of_workers
         self.transaction_index = 0
-        self.routing_queues = [mp.Queue() for _ in range(self.number_of_workers)]
+        self.routing_queues = [multiprocessing.Queue() for _ in range(self.number_of_workers)]
         self.workers = []
-        self.results_queue = mp.Queue()
+        self.results_queue = multiprocessing.Queue()

     def start(self):
         for routing_queue in self.routing_queues:
             worker = ValidationWorker(routing_queue, self.results_queue)
-            process = mp.Process(target=worker.run)
+            process = multiprocessing.Process(target=worker.run)
             process.start()
             self.workers.append(process)
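The queue-per-worker pattern touched in the hunk above can be sketched in isolation. This is a hypothetical miniature, not the real `ValidationWorker`: a `square` worker stands in for transaction validation, using the fully qualified `multiprocessing` names the commit switches to:

```python
import multiprocessing


def square(routing_queue, results_queue):
    """Stand-in for ValidationWorker.run: drain the routing queue until a
    sentinel arrives, pushing each result onto the shared results queue."""
    while True:
        item = routing_queue.get()
        if item is None:  # sentinel: shut the worker down
            break
        results_queue.put(item * item)


if __name__ == "__main__":
    routing_queue = multiprocessing.Queue()
    results_queue = multiprocessing.Queue()
    process = multiprocessing.Process(target=square, args=(routing_queue, results_queue))
    process.start()
    for n in range(3):
        routing_queue.put(n)
    routing_queue.put(None)
    process.join()
    print(sorted(results_queue.get() for _ in range(3)))  # [0, 1, 4]
```

Spelling out `multiprocessing.Queue()` instead of an `mp` alias (or a bare `Queue()`) makes it unambiguous at each call site that these are inter-process queues, not `queue.Queue` thread queues — which is the point of the rename in this commit.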
@@ -6,7 +6,7 @@
 import contextlib
 import threading
 import queue
-import multiprocessing as mp
+import multiprocessing
 import json
 import setproctitle
@@ -19,7 +19,7 @@ from transactions.common.crypto import key_pair_from_ed25519_key

 class ProcessGroup(object):
     def __init__(self, concurrency=None, group=None, target=None, name=None, args=None, kwargs=None, daemon=None):
-        self.concurrency = concurrency or mp.cpu_count()
+        self.concurrency = concurrency or multiprocessing.cpu_count()
         self.group = group
         self.target = target
         self.name = name
@@ -30,7 +30,7 @@ class ProcessGroup(object):

     def start(self):
         for i in range(self.concurrency):
-            proc = mp.Process(
+            proc = multiprocessing.Process(
                 group=self.group,
                 target=self.target,
                 name=self.name,
@@ -42,7 +42,7 @@ class ProcessGroup(object):
         self.processes.append(proc)


-class Process(mp.Process):
+class Process(multiprocessing.Process):
     """Wrapper around multiprocessing.Process that uses
     setproctitle to set the name of the process when running
     the target task.
@@ -9,7 +9,7 @@ The application is implemented in Flask and runs using Gunicorn.
 """

 import copy
-import multiprocessing
+from multiprocessing import cpu_count
 import gunicorn.app.base

 from flask import Flask
@@ -102,7 +102,7 @@ def create_server(settings, log_config=None, planetmint_factory=None):
     settings = copy.deepcopy(settings)

     if not settings.get("workers"):
-        settings["workers"] = (multiprocessing.cpu_count() * 2) + 1
+        settings["workers"] = (cpu_count() * 2) + 1

     if not settings.get("threads"):
         # Note: Threading is not recommended currently, as the frontend workload
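The default rewritten in the hunk above is the common Gunicorn `(2 × CPUs) + 1` worker heuristic. A minimal sketch of the same fallback logic, assuming a plain `settings` dict (`apply_worker_default` is an illustrative name, not the project's API):

```python
from multiprocessing import cpu_count


def apply_worker_default(settings):
    # Mirror the fallback in create_server: only fill in a worker count
    # when the caller has not configured one explicitly.
    if not settings.get("workers"):
        settings["workers"] = (cpu_count() * 2) + 1
    return settings


print(apply_worker_default({"workers": 4})["workers"])  # 4: an explicit value wins
print(apply_worker_default({})["workers"])              # e.g. 17 on an 8-core machine
```

Importing `cpu_count` directly keeps the module from depending on the whole `multiprocessing` namespace when the only thing it needs is the core count.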
@@ -22,7 +22,7 @@ def make_error(status_code, message=None):
     request_info = {"method": request.method, "path": request.path}
     request_info.update(response_content)

-    logger.error("HTTP API error: %(status)s - %(method)s:%(path)s - %(message)s", request_info)
+    logger.debug("HTTP API error: %(status)s - %(method)s:%(path)s - %(message)s", request_info)

     response = jsonify(response_content)
     response.status_code = status_code
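Downgrading this message from `logger.error` to `logger.debug` means it is dropped under the usual `INFO`/`WARNING` thresholds, which is also why this commit disables the error-log asserts in the test suite. A small stdlib sketch of that filtering (logger name and handler are illustrative):

```python
import logging

logger = logging.getLogger("demo.http_api")
logger.setLevel(logging.INFO)  # typical production default

records = []
handler = logging.Handler()
handler.emit = lambda record: records.append(record.getMessage())
logger.addHandler(handler)

logger.debug("HTTP API error: %(status)s", {"status": 404})  # suppressed below INFO
logger.error("HTTP API error: %(status)s", {"status": 500})  # still recorded

print(records)  # ['HTTP API error: 500']
```

To see the downgraded message again, a node operator would have to raise the logger's verbosity to `DEBUG` in the logging configuration.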
@@ -1,37 +0,0 @@
-#!/usr/bin/env bash
-# Copyright © 2020 Interplanetary Database Association e.V.,
-# Planetmint and IPDB software contributors.
-# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
-# Code is Apache-2.0 and docs are CC-BY-4.0
-
-
-# Set up a Planetmint node and return only when we are able to connect to both
-# the Planetmint container *and* the Tendermint container.
-setup () {
-    docker-compose up -d planetmint
-
-    # Try to connect to the containers for maximum three times, and wait
-    # one second between tries.
-    for i in $(seq 3); do
-        if $(docker-compose run --rm curl-client); then
-            break
-        else
-            sleep 1
-        fi
-    done
-}
-
-run_test () {
-    docker-compose run --rm python-acceptance pytest /src
-}
-
-teardown () {
-    docker-compose down
-}
-
-setup
-run_test
-exitcode=$?
-teardown
-
-exit $exitcode
@@ -177,14 +177,15 @@ def test_post_create_transaction_with_invalid_id(mock_logger, b, client):
     ).format(InvalidHash.__name__, tx["id"])
     assert res.status_code == expected_status_code
     assert res.json["message"] == expected_error_message
-    assert mock_logger.error.called
-    assert "HTTP API error: %(status)s - %(method)s:%(path)s - %(message)s" in mock_logger.error.call_args[0]
-    assert {
-        "message": expected_error_message,
-        "status": expected_status_code,
-        "method": "POST",
-        "path": TX_ENDPOINT,
-    } in mock_logger.error.call_args[0]
+    # TODO change the loglevel to DEBUG for this test case to enable the following 3 asserts
+    # assert mock_logger.error.called
+    # assert "HTTP API error: %(status)s - %(method)s:%(path)s - %(message)s" in mock_logger.error.call_args[0]
+    # assert {
+    #     "message": expected_error_message,
+    #     "status": expected_status_code,
+    #     "method": "POST",
+    #     "path": TX_ENDPOINT,
+    # } in mock_logger.error.call_args[0]
     # TODO put back caplog based asserts once possible
     # assert caplog.records[0].args['status'] == expected_status_code
     # assert caplog.records[0].args['message'] == expected_error_message
@@ -215,14 +216,15 @@ def test_post_create_transaction_with_invalid_signature(mock_logger, b, client):
     )
     assert res.status_code == expected_status_code
     assert res.json["message"] == expected_error_message
-    assert mock_logger.error.called
-    assert "HTTP API error: %(status)s - %(method)s:%(path)s - %(message)s" in mock_logger.error.call_args[0]
-    assert {
-        "message": expected_error_message,
-        "status": expected_status_code,
-        "method": "POST",
-        "path": TX_ENDPOINT,
-    } in mock_logger.error.call_args[0]
+    # TODO change the loglevel to DEBUG for this test case to enable the following 3 asserts
+    # assert mock_logger.error.called
+    # assert "HTTP API error: %(status)s - %(method)s:%(path)s - %(message)s" in mock_logger.error.call_args[0]
+    # assert {
+    #     "message": expected_error_message,
+    #     "status": expected_status_code,
+    #     "method": "POST",
+    #     "path": TX_ENDPOINT,
+    # } in mock_logger.error.call_args[0]
     # TODO put back caplog based asserts once possible
     # assert caplog.records[0].args['status'] == expected_status_code
     # assert caplog.records[0].args['message'] == expected_error_message
@@ -265,14 +267,15 @@ def test_post_create_transaction_with_invalid_schema(mock_logger, client):
     )
     assert res.status_code == expected_status_code
     assert res.json["message"] == expected_error_message
-    assert mock_logger.error.called
-    assert "HTTP API error: %(status)s - %(method)s:%(path)s - %(message)s" in mock_logger.error.call_args[0]
-    assert {
-        "message": expected_error_message,
-        "status": expected_status_code,
-        "method": "POST",
-        "path": TX_ENDPOINT,
-    } in mock_logger.error.call_args[0]
+    # TODO change the loglevel to DEBUG for this test case to enable the following 3 asserts
+    # assert mock_logger.error.called
+    # assert "HTTP API error: %(status)s - %(method)s:%(path)s - %(message)s" in mock_logger.error.call_args[0]
+    # assert {
+    #     "message": expected_error_message,
+    #     "status": expected_status_code,
+    #     "method": "POST",
+    #     "path": TX_ENDPOINT,
+    # } in mock_logger.error.call_args[0]
     # TODO put back caplog based asserts once possible
     # assert caplog.records[0].args['status'] == expected_status_code
     # assert caplog.records[0].args['message'] == expected_error_message
@@ -312,14 +315,15 @@ def test_post_invalid_transaction(
     expected_error_message = "Invalid transaction ({}): {}".format(exc, msg)
     assert res.status_code == expected_status_code
     assert res.json["message"] == "Invalid transaction ({}): {}".format(exc, msg)
-    assert mock_logger.error.called
-    assert "HTTP API error: %(status)s - %(method)s:%(path)s - %(message)s" in mock_logger.error.call_args[0]
-    assert {
-        "message": expected_error_message,
-        "status": expected_status_code,
-        "method": "POST",
-        "path": TX_ENDPOINT,
-    } in mock_logger.error.call_args[0]
+    # TODO change the loglevel to DEBUG for this test case to enable the following 3 asserts
+    # assert mock_logger.error.called
+    # assert "HTTP API error: %(status)s - %(method)s:%(path)s - %(message)s" in mock_logger.error.call_args[0]
+    # assert {
+    #     "message": expected_error_message,
+    #     "status": expected_status_code,
+    #     "method": "POST",
+    #     "path": TX_ENDPOINT,
+    # } in mock_logger.error.call_args[0]
     # TODO put back caplog based asserts once possible
     # assert caplog.records[2].args['status'] == expected_status_code
     # assert caplog.records[2].args['message'] == expected_error_message