GitHub actions (#234)

* creating first github action

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* fix syntax error

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* renamed action, using black stable

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* updated checkout action on workflow black

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* formatted code with black

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* replaced lint with black service

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* removed black service added black check to makefile

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* replaced flake8 with black

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added pull_request to black actions trigger

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* replaced flake8 with black style checker (#212)

* updated version number to 1.0.0

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* creating first github action

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* fix syntax error

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* renamed action, using black stable

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* updated checkout action on workflow black

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* formatted code with black

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* version bump

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* removed some comments and unused import

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* replaced lint with black service

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* removed black service added black check to makefile

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* replaced flake8 with black

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added pull_request to black actions trigger

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* started on unit test workflow

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* removed run step

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* fixed typo

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* testing docker-compose

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* check docker-compose

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* try running pytest

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* check out -f

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* changed path

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* increased health check retries, added job dependency

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added path to docker-compose.yml to test action

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* moved container startup to test step

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added checkout step to test job

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* different kind of execution

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* checking build step

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* fixed missing keyword

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added checkout to build step

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* storing artifacts

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added needs

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* changed Dockerfile-dev to python-slim

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added job matrix back in

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added abci to build job matrix

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* updated test job steps

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* fixed typo

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* replaced docker exec with docker-compose exec for abci test

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added first version of acceptance and integration test action

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added runs-on

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* fixed syntax error

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* reverted to docker exec

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added copyright notice and env to start container step

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* separated abci from non abci test job

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* renamed pytest workflow to unit-test

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added codecov workflow

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added pytest install to codecov step

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added pip install

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* moved codecov to unit-test

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* show files

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* changed paths

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* removed debug job steps

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* renamed black to lint, added audit workflow

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* checking if dc down is necessary

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* removed dc down step from acceptance and integration

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* fixed lint error

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added tox documentation to github actions (#226)

* added documentation job

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added docs dependency install to docs workflow

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* add more dependencies

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* install rapidjson manually

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added python-rapidjson to docs requirements text

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* changed gh config on tox.ini

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added base58 to docs require

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* changed docs require to dev

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* reversed changes to docs require

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* changed gh to gh-actions

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* increased verbosity for debugging

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* added -e docsroot manually

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* removed verbosity

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* removed travis ci files

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

* changed audit step to trigger on schedule

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>

Signed-off-by: Lorenz Herzberger <lorenzherzberger@gmail.com>
Co-authored-by: enesturk <enes.m.turk@gmail.com>
Authored by Lorenz Herzberger, 2022-08-18 09:45:51 +02:00, committed by GitHub
parent e88bb41c70
commit 8abbef00fe
GPG Key ID: 4AEE18F83AFDEB23 (no known key found for this signature in database)
151 changed files with 4721 additions and 5201 deletions


@@ -1,12 +0,0 @@
#!/bin/bash
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
set -e -x
if [[ -z ${TOXENV} ]] && [[ ${PLANETMINT_CI_ABCI} != 'enable' ]] && [[ ${PLANETMINT_ACCEPTANCE_TEST} != 'enable' ]]; then
codecov -v -f htmlcov/coverage.xml
fi

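The deleted Travis helper above uploads coverage only for the plain unit-test job. Its env-var gating can be sketched in Python (illustrative only; the variable names come from the script, the function name is mine):

```python
def should_upload_coverage(env: dict) -> bool:
    """Mirror of the shell guard: upload coverage only when no tox env is
    set and neither the ABCI nor the acceptance variant is enabled."""
    return (
        not env.get("TOXENV")
        and env.get("PLANETMINT_CI_ABCI") != "enable"
        and env.get("PLANETMINT_ACCEPTANCE_TEST") != "enable"
    )

# The plain unit-test job uploads; every special variant skips the upload.
assert should_upload_coverage({}) is True
assert should_upload_coverage({"TOXENV": "docsroot"}) is False
assert should_upload_coverage({"PLANETMINT_CI_ABCI": "enable"}) is False
```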

@@ -1,20 +0,0 @@
#!/bin/bash
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
if [[ -n ${TOXENV} ]]; then
sudo apt-get update
sudo apt-get install zsh
fi
if [[ -z ${TOXENV} ]]; then
sudo apt-get update
sudo apt-get -y -o Dpkg::Options::="--force-confnew" install docker-ce
sudo rm /usr/local/bin/docker-compose
curl -L https://github.com/docker/compose/releases/download/${DOCKER_COMPOSE_VERSION}/docker-compose-`uname -s`-`uname -m` > docker-compose
chmod +x docker-compose
sudo mv docker-compose /usr/local/bin
fi


@@ -1,18 +0,0 @@
#!/bin/bash
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
set -e -x
if [[ -z ${TOXENV} ]]; then
if [[ ${PLANETMINT_CI_ABCI} == 'enable' ]]; then
docker-compose up -d planetmint
else
docker-compose up -d bdb
fi
fi


@@ -1,19 +0,0 @@
#!/bin/bash
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
set -e -x
pip install --upgrade pip
if [[ -n ${TOXENV} ]]; then
pip install --upgrade tox
elif [[ ${PLANETMINT_CI_ABCI} == 'enable' ]]; then
docker-compose build --no-cache --build-arg abci_status=enable planetmint
else
docker-compose build --no-cache planetmint
pip install --upgrade codecov
fi


@@ -1,21 +0,0 @@
#!/bin/bash
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
set -e -x
if [[ -n ${TOXENV} ]]; then
tox -e ${TOXENV}
elif [[ ${PLANETMINT_CI_ABCI} == 'enable' ]]; then
docker-compose exec planetmint pytest -v -m abci
elif [[ ${PLANETMINT_ACCEPTANCE_TEST} == 'enable' ]]; then
./scripts/run-acceptance-test.sh
elif [[ ${PLANETMINT_INTEGRATION_TEST} == 'enable' ]]; then
docker-compose down # TODO: remove after ci optimization
./scripts/run-integration-test.sh
else
docker-compose exec planetmint pytest -v --cov=planetmint --cov-report xml:htmlcov/coverage.xml
fi

.github/workflows/acceptance-test.yml (new file, 21 lines)

@@ -0,0 +1,21 @@
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
name: Acceptance tests
on: [push, pull_request]
jobs:
test:
runs-on: ubuntu-latest
steps:
- name: Check out repository code
uses: actions/checkout@v3
- name: Start container
run: docker-compose up -d planetmint
- name: Run test
run: docker-compose -f docker-compose.yml run --rm python-acceptance pytest /src

.github/workflows/audit.yml (new file, 36 lines)

@@ -0,0 +1,36 @@
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
name: Audit
on:
schedule:
- cron: '0 2 * * *'
jobs:
audit:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v3
- name: Setup python
uses: actions/setup-python@v4
with:
python-version: 3.9
- name: Install pip-audit
run: pip install --upgrade pip pip-audit
- name: Install dependencies
run: pip install .
- name: Create requirements.txt
run: pip freeze > requirements.txt
- name: Audit dependencies
run: pip-audit

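The audit workflow above delegates the real work to pip-audit and the PyPI advisory database. The core idea — compare pinned versions from a frozen requirements list against known advisories — can be illustrated with a toy sketch (the advisory data and names below are made up):

```python
# Toy illustration of a dependency audit. Real audits use pip-audit; this
# only shows the matching step against a hardcoded, hypothetical advisory list.
ADVISORIES = {  # package -> set of known-bad versions (made-up data)
    "examplelib": {"1.0.0", "1.0.1"},
}

def audit(requirements: list) -> list:
    """Flag any `name==version` pin that appears in the advisory list."""
    findings = []
    for line in requirements:
        name, _, version = line.partition("==")
        if version in ADVISORIES.get(name, set()):
            findings.append("{} {} has a known advisory".format(name, version))
    return findings

assert audit(["examplelib==1.0.0"]) == ["examplelib 1.0.0 has a known advisory"]
assert audit(["examplelib==1.0.2", "otherlib==2.0"]) == []
```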
.github/workflows/documenation.yml (new file, 35 lines)

@@ -0,0 +1,35 @@
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
name: Documentation
on: [push, pull_request]
jobs:
documentation:
runs-on: ubuntu-latest
steps:
- name: Check out repository code
uses: actions/checkout@v3
- name: Setup python
uses: actions/setup-python@v4
with:
python-version: 3.9
- name: Install tox
run: python -m pip install --upgrade tox tox-gh-actions
- name: Install dependencies
run: pip install .'[dev]'
- name: Run tox
run: tox -e docsroot

.github/workflows/integration-test.yml (new file, 18 lines)

@@ -0,0 +1,18 @@
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
name: Integration tests
on: [push, pull_request]
jobs:
test:
runs-on: ubuntu-latest
steps:
- name: Check out repository code
uses: actions/checkout@v3
- name: Start test run
run: docker-compose -f docker-compose.integration.yml up test

.github/workflows/lint.yml (new file, 17 lines)

@@ -0,0 +1,17 @@
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
name: Lint
on: [push, pull_request]
jobs:
lint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: psf/black@stable
with:
options: "--check -l 119"
src: "."

.github/workflows/unit-test.yml (new file, 109 lines)

@@ -0,0 +1,109 @@
# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
name: Unit tests
on: [push, pull_request]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
include:
- abci_enabled: "ABCI enabled"
abci: "enabled"
- abci_disabled: "ABCI disabled"
abci: "disabled"
steps:
- name: Check out repository code
uses: actions/checkout@v3
- name: Build container
run: |
if [[ "${{ matrix.abci }}" == "enabled" ]]; then
docker-compose -f docker-compose.yml build --no-cache --build-arg abci_status=enable planetmint
fi
if [[ "${{ matrix.abci }}" == "disabled" ]]; then
docker-compose -f docker-compose.yml build --no-cache planetmint
fi
- name: Save image
run: docker save -o planetmint.tar planetmint_planetmint
- name: Upload image
uses: actions/upload-artifact@v3
with:
name: planetmint-abci-${{matrix.abci}}
path: planetmint.tar
retention-days: 5
test-with-abci:
runs-on: ubuntu-latest
needs: build
strategy:
matrix:
include:
- db: "MongoDB with ABCI"
host: "mongodb"
port: 27017
abci: "enabled"
- db: "Tarantool with ABCI"
host: "tarantool"
port: 3303
abci: "enabled"
steps:
- name: Check out repository code
uses: actions/checkout@v3
- name: Download planetmint
uses: actions/download-artifact@v3
with:
name: planetmint-abci-enabled
- name: Load planetmint
run: docker load -i planetmint.tar
- name: Start containers
run: docker-compose -f docker-compose.yml up -d planetmint
- name: Run tests
run: docker exec planetmint_planetmint_1 pytest -v -m abci
test-without-abci:
runs-on: ubuntu-latest
needs: build
strategy:
matrix:
include:
- db: "MongoDB without ABCI"
host: "mongodb"
port: 27017
- db: "Tarantool without ABCI"
host: "tarantool"
port: 3303
steps:
- name: Check out repository code
uses: actions/checkout@v3
- name: Download planetmint
uses: actions/download-artifact@v3
with:
name: planetmint-abci-disabled
- name: Load planetmint
run: docker load -i planetmint.tar
- name: Start containers
run: docker-compose -f docker-compose.yml up -d bdb
- name: Run tests
run: docker exec planetmint_planetmint_1 pytest -v --cov=planetmint --cov-report xml:htmlcov/coverage.xml
- name: Upload Coverage to Codecov
uses: codecov/codecov-action@v3

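The build job above fans out via `strategy.matrix.include`: with no top-level matrix axes, each `include` entry becomes one job. A simplified model of that expansion (not the real Actions algorithm; the label key is normalized to `name` here, whereas the workflow above uses ad-hoc keys):

```python
def expand_matrix(include: list) -> list:
    """Simplified GitHub Actions matrix expansion: with only `include`
    entries and no axes, each entry is one job configuration."""
    return [dict(entry) for entry in include]

build_matrix = [
    {"name": "ABCI enabled", "abci": "enabled"},
    {"name": "ABCI disabled", "abci": "disabled"},
]

jobs = expand_matrix(build_matrix)
# Two build jobs run in parallel, one per ABCI setting; each publishes
# its docker image as an artifact named after matrix.abci.
assert len(jobs) == 2
assert jobs[0]["abci"] == "enabled"
```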

@@ -1,64 +0,0 @@
# Copyright © 2020, 2021 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
sudo: required
dist: focal
services:
- docker
language: python
cache: pip
python:
- 3.9
env:
global:
- DOCKER_COMPOSE_VERSION=1.29.2
matrix:
- TOXENV=flake8
- TOXENV=docsroot
matrix:
fast_finish: true
include:
- python: 3.9
env:
- PLANETMINT_DATABASE_BACKEND=tarantool_db
- PLANETMINT_DATABASE_SSL=
- python: 3.9
env:
- PLANETMINT_DATABASE_BACKEND=tarantool_db
- PLANETMINT_DATABASE_SSL=
- PLANETMINT_CI_ABCI=enable
- python: 3.9
env:
- PLANETMINT_DATABASE_BACKEND=localmongodb
- PLANETMINT_DATABASE_SSL=
- python: 3.9
env:
- PLANETMINT_DATABASE_BACKEND=localmongodb
- PLANETMINT_DATABASE_SSL=
- PLANETMINT_CI_ABCI=enable
- python: 3.9
env:
- PLANETMINT_ACCEPTANCE_TEST=enable
- python: 3.9
env:
- PLANETMINT_INTEGRATION_TEST=enable
before_install: sudo .ci/travis-before-install.sh
install: .ci/travis-install.sh
before_script: .ci/travis-before-script.sh
script: .ci/travis_script.sh
after_success: .ci/travis-after-success.sh


@@ -1,9 +1,9 @@
 ARG python_version=3.9
-FROM python:${python_version}
+FROM python:${python_version}-slim
 LABEL maintainer "contact@ipdb.global"
 RUN apt-get update \
-    && apt-get install -y git zsh\
+    && apt-get install -y git zsh curl\
     && apt-get install -y tarantool-common\
     && apt-get install -y vim build-essential cmake\
     && pip install -U pip \


@@ -47,6 +47,7 @@ HELP := python -c "$$PRINT_HELP_PYSCRIPT"
 ECHO := /usr/bin/env echo
 IS_DOCKER_COMPOSE_INSTALLED := $(shell command -v docker-compose 2> /dev/null)
+IS_BLACK_INSTALLED := $(shell command -v black 2> /dev/null)
 ################
 # Main targets #
@@ -70,8 +71,11 @@ stop: check-deps ## Stop Planetmint
 logs: check-deps ## Attach to the logs
 	@$(DC) logs -f planetmint
-lint: check-deps ## Lint the project
-	@$(DC) up lint
+lint: check-py-deps ## Lint the project
+	black --check -l 119 .
+format: check-py-deps ## Format the project
+	black -l 119 .
 test: check-deps test-unit test-acceptance ## Run unit and acceptance tests
@@ -132,3 +136,11 @@ ifndef IS_DOCKER_COMPOSE_INSTALLED
 	@$(ECHO)
 	@$(DC) # docker-compose is not installed, so we call it to generate an error and exit
 endif
+check-py-deps:
+ifndef IS_BLACK_INSTALLED
+	@$(ECHO) "Error: black is not installed"
+	@$(ECHO)
+	@$(ECHO) "You need to activate your virtual environment and install the test dependencies"
+	black # black is not installed, so we call it to generate an error and exit
+endif

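The `IS_BLACK_INSTALLED := $(shell command -v black 2> /dev/null)` guard in the Makefile is just a PATH lookup. The same probe in Python, as a sketch of the pattern:

```python
import shutil

def tool_available(name: str) -> bool:
    """Equivalent of the Makefile's `command -v` probe: True iff the
    named executable is found on PATH."""
    return shutil.which(name) is not None

# `sh` exists on any POSIX system; a made-up name does not.
assert tool_available("sh") is True
assert tool_available("definitely-not-a-real-tool-12345") is False
```

The Makefile's trick of invoking the missing tool (`black # …generate an error and exit`) serves the same purpose as raising here: fail fast with a visible error instead of a confusing one later.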

@@ -82,11 +82,11 @@ x = 'name: {}; score: {}'.format(name, n)
 we use the `format()` version. The [official Python documentation says](https://docs.python.org/2/library/stdtypes.html#str.format), "This method of string formatting is the new standard in Python 3, and should be preferred to the % formatting described in String Formatting Operations in new code."
-## Running the Flake8 Style Checker
+## Running the Black Style Checker
-We use [Flake8](http://flake8.pycqa.org/en/latest/index.html) to check our Python code style. Once you have it installed, you can run it using:
+We use [Black](https://black.readthedocs.io/en/stable/) to check our Python code style. Once you have it installed, you can run it using:
 ```text
-flake8 --max-line-length 119 planetmint/
+black --check -l 119 .
 ```

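The `-l 119` flag makes Black target 119-column lines. Black does far more than length checking (quote normalization, wrapping, magic trailing commas), but the length constraint alone can be sketched with the stdlib (illustrative helper, not part of Black):

```python
def too_long_lines(source: str, limit: int = 119) -> list:
    """Return the 1-based numbers of lines exceeding the column limit."""
    return [i for i, line in enumerate(source.splitlines(), start=1) if len(line) > limit]

# Line 2 below is 187 characters, well past the 119-column limit.
src = "x = 1\n" + "y = " + "'a' + " * 30 + "'b'\n"
assert too_long_lines(src, limit=119) == [2]
assert too_long_lines("short = True\n") == []
```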

@@ -31,7 +31,7 @@ def test_basic():
     # connect to localhost, but you can override this value using the env variable
     # called `PLANETMINT_ENDPOINT`, a valid value must include the schema:
     # `https://example.com:9984`
-    bdb = Planetmint(os.environ.get('PLANETMINT_ENDPOINT'))
+    bdb = Planetmint(os.environ.get("PLANETMINT_ENDPOINT"))

     # ## Create keypairs
     # This test requires the interaction between two actors with their own keypair.
@@ -41,33 +41,28 @@ def test_basic():
     # ## Alice registers her bike in Planetmint
     # Alice has a nice bike, and here she creates the "digital twin"
     # of her bike.
-    bike = {'data': {'bicycle': {'serial_number': 420420}}}
+    bike = {"data": {"bicycle": {"serial_number": 420420}}}

     # She prepares a `CREATE` transaction...
-    prepared_creation_tx = bdb.transactions.prepare(
-        operation='CREATE',
-        signers=alice.public_key,
-        asset=bike)
+    prepared_creation_tx = bdb.transactions.prepare(operation="CREATE", signers=alice.public_key, asset=bike)

     # ... and she fulfills it with her private key.
-    fulfilled_creation_tx = bdb.transactions.fulfill(
-        prepared_creation_tx,
-        private_keys=alice.private_key)
+    fulfilled_creation_tx = bdb.transactions.fulfill(prepared_creation_tx, private_keys=alice.private_key)

     # We will use the `id` of this transaction several time, so we store it in
     # a variable with a short and easy name
-    bike_id = fulfilled_creation_tx['id']
+    bike_id = fulfilled_creation_tx["id"]

     # Now she is ready to send it to the Planetmint Network.
     sent_transfer_tx = bdb.transactions.send_commit(fulfilled_creation_tx)

     # And just to be 100% sure, she also checks if she can retrieve
     # it from the Planetmint node.
-    assert bdb.transactions.retrieve(bike_id), 'Cannot find transaction {}'.format(bike_id)
+    assert bdb.transactions.retrieve(bike_id), "Cannot find transaction {}".format(bike_id)

     # Alice is now the proud owner of one unspent asset.
     assert len(bdb.outputs.get(alice.public_key, spent=False)) == 1
-    assert bdb.outputs.get(alice.public_key)[0]['transaction_id'] == bike_id
+    assert bdb.outputs.get(alice.public_key)[0]["transaction_id"] == bike_id

     # ## Alice transfers her bike to Bob
     # After registering her bike, Alice is ready to transfer it to Bob.
@@ -75,11 +70,11 @@ def test_basic():
     # A `TRANSFER` transaction contains a pointer to the original asset. The original asset
     # is identified by the `id` of the `CREATE` transaction that defined it.
-    transfer_asset = {'id': bike_id}
+    transfer_asset = {"id": bike_id}

     # Alice wants to spend the one and only output available, the one with index `0`.
     output_index = 0
-    output = fulfilled_creation_tx['outputs'][output_index]
+    output = fulfilled_creation_tx["outputs"][output_index]

     # Here, she defines the `input` of the `TRANSFER` transaction. The `input` contains
     # several keys:
@@ -87,29 +82,26 @@ def test_basic():
     # - `fulfillment`, taken from the previous `CREATE` transaction.
     # - `fulfills`, that specifies which condition she is fulfilling.
     # - `owners_before`.
-    transfer_input = {'fulfillment': output['condition']['details'],
-                      'fulfills': {'output_index': output_index,
-                                   'transaction_id': fulfilled_creation_tx['id']},
-                      'owners_before': output['public_keys']}
+    transfer_input = {
+        "fulfillment": output["condition"]["details"],
+        "fulfills": {"output_index": output_index, "transaction_id": fulfilled_creation_tx["id"]},
+        "owners_before": output["public_keys"],
+    }

     # Now that all the elements are set, she creates the actual transaction...
     prepared_transfer_tx = bdb.transactions.prepare(
-        operation='TRANSFER',
-        asset=transfer_asset,
-        inputs=transfer_input,
-        recipients=bob.public_key)
+        operation="TRANSFER", asset=transfer_asset, inputs=transfer_input, recipients=bob.public_key
+    )

     # ... and signs it with her private key.
-    fulfilled_transfer_tx = bdb.transactions.fulfill(
-        prepared_transfer_tx,
-        private_keys=alice.private_key)
+    fulfilled_transfer_tx = bdb.transactions.fulfill(prepared_transfer_tx, private_keys=alice.private_key)

     # She finally sends the transaction to a Planetmint node.
     sent_transfer_tx = bdb.transactions.send_commit(fulfilled_transfer_tx)

     # And just to be 100% sure, she also checks if she can retrieve
     # it from the Planetmint node.
-    assert bdb.transactions.retrieve(fulfilled_transfer_tx['id']) == sent_transfer_tx
+    assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"]) == sent_transfer_tx

     # Now Alice has zero unspent transactions.
     assert len(bdb.outputs.get(alice.public_key, spent=False)) == 0
@@ -118,5 +110,5 @@ def test_basic():
     assert len(bdb.outputs.get(bob.public_key, spent=False)) == 1

     # Bob double checks what he got was the actual bike.
-    bob_tx_id = bdb.outputs.get(bob.public_key, spent=False)[0]['transaction_id']
+    bob_tx_id = bdb.outputs.get(bob.public_key, spent=False)[0]["transaction_id"]
     assert bdb.transactions.retrieve(bob_tx_id) == sent_transfer_tx


@ -34,7 +34,7 @@ def test_divisible_assets():
# ## Set up a connection to Planetmint # ## Set up a connection to Planetmint
# Check [test_basic.py](./test_basic.html) to get some more details # Check [test_basic.py](./test_basic.html) to get some more details
# about the endpoint. # about the endpoint.
bdb = Planetmint(os.environ.get('PLANETMINT_ENDPOINT')) bdb = Planetmint(os.environ.get("PLANETMINT_ENDPOINT"))
# Oh look, it is Alice again and she brought her friend Bob along. # Oh look, it is Alice again and she brought her friend Bob along.
alice, bob = generate_keypair(), generate_keypair() alice, bob = generate_keypair(), generate_keypair()
@ -48,13 +48,9 @@ def test_divisible_assets():
# the bike for one hour. # the bike for one hour.
bike_token = { bike_token = {
'data': { "data": {
'token_for': { "token_for": {"bike": {"serial_number": 420420}},
'bike': { "description": "Time share token. Each token equals one hour of riding.",
'serial_number': 420420
}
},
'description': 'Time share token. Each token equals one hour of riding.',
}, },
} }
@ -62,28 +58,22 @@ def test_divisible_assets():
# Here, Alice defines in a tuple that she wants to assign # Here, Alice defines in a tuple that she wants to assign
# these 10 tokens to Bob. # these 10 tokens to Bob.
prepared_token_tx = bdb.transactions.prepare( prepared_token_tx = bdb.transactions.prepare(
operation='CREATE', operation="CREATE", signers=alice.public_key, recipients=[([bob.public_key], 10)], asset=bike_token
signers=alice.public_key, )
recipients=[([bob.public_key], 10)],
asset=bike_token)
# She fulfills and sends the transaction. # She fulfills and sends the transaction.
fulfilled_token_tx = bdb.transactions.fulfill( fulfilled_token_tx = bdb.transactions.fulfill(prepared_token_tx, private_keys=alice.private_key)
prepared_token_tx,
private_keys=alice.private_key)
bdb.transactions.send_commit(fulfilled_token_tx) bdb.transactions.send_commit(fulfilled_token_tx)
# We store the `id` of the transaction to use it later on. # We store the `id` of the transaction to use it later on.
bike_token_id = fulfilled_token_tx['id'] bike_token_id = fulfilled_token_tx["id"]
# Let's check if the transaction was successful. # Let's check if the transaction was successful.
assert bdb.transactions.retrieve(bike_token_id), \ assert bdb.transactions.retrieve(bike_token_id), "Cannot find transaction {}".format(bike_token_id)
'Cannot find transaction {}'.format(bike_token_id)
# Bob owns 10 tokens now. # Bob owns 10 tokens now.
assert bdb.transactions.retrieve(bike_token_id)['outputs'][0][ assert bdb.transactions.retrieve(bike_token_id)["outputs"][0]["amount"] == "10"
'amount'] == '10'
    # ## Bob wants to use the bike
    # Now that Bob got the tokens and the sun is shining, he wants to get out
@@ -91,49 +81,45 @@ def test_divisible_assets():
    # To use the bike he has to send the tokens back to Alice.
    # To learn about the details of transferring a transaction check out
    # [test_basic.py](./test_basic.html)
    transfer_asset = {"id": bike_token_id}
    output_index = 0
    output = fulfilled_token_tx["outputs"][output_index]
    transfer_input = {
        "fulfillment": output["condition"]["details"],
        "fulfills": {"output_index": output_index, "transaction_id": fulfilled_token_tx["id"]},
        "owners_before": output["public_keys"],
    }
    # To use the tokens Bob has to reassign 7 tokens to himself and the
    # amount he wants to use to Alice.
    prepared_transfer_tx = bdb.transactions.prepare(
        operation="TRANSFER",
        asset=transfer_asset,
        inputs=transfer_input,
        recipients=[([alice.public_key], 3), ([bob.public_key], 7)],
    )
    # He signs and sends the transaction.
    fulfilled_transfer_tx = bdb.transactions.fulfill(prepared_transfer_tx, private_keys=bob.private_key)
    sent_transfer_tx = bdb.transactions.send_commit(fulfilled_transfer_tx)
    # First, Bob checks if the transaction was successful.
    assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"]) == sent_transfer_tx
    # There are two outputs in the transaction now.
    # The first output shows that Alice got back 3 tokens...
    assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"])["outputs"][0]["amount"] == "3"
    # ... while Bob still has 7 left.
    assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"])["outputs"][1]["amount"] == "7"
    # ## Bob wants to ride the bike again
    # It's been a week and Bob wants to ride the bike again.
    # Now he wants to ride for 8 hours, that's a lot Bob!
    # He prepares the transaction again.
    transfer_asset = {"id": bike_token_id}
    # This time we need an `output_index` of 1, since we have two outputs
    # in the `fulfilled_transfer_tx` we created before. The first output with
    # index 0 is for Alice and the second output is for Bob.
@@ -141,24 +127,21 @@ def test_divisible_assets():
    # correct output with the correct amount of tokens.
    output_index = 1
    output = fulfilled_transfer_tx["outputs"][output_index]
    transfer_input = {
        "fulfillment": output["condition"]["details"],
        "fulfills": {"output_index": output_index, "transaction_id": fulfilled_transfer_tx["id"]},
        "owners_before": output["public_keys"],
    }
    # This time Bob only provides Alice in the `recipients` because he wants
    # to spend all his tokens
    prepared_transfer_tx = bdb.transactions.prepare(
        operation="TRANSFER", asset=transfer_asset, inputs=transfer_input, recipients=[([alice.public_key], 8)]
    )
    fulfilled_transfer_tx = bdb.transactions.fulfill(prepared_transfer_tx, private_keys=bob.private_key)
    # Oh Bob, what have you done?! You tried to spend more tokens than you had.
    # Remember Bob, last time you spent 3 tokens already,
@@ -169,10 +152,12 @@ def test_divisible_assets():
    # Now Bob gets an error saying that the amount he wanted to spend is
    # higher than the amount of tokens he has left.
    assert error.value.args[0] == 400
    message = (
        "Invalid transaction (AmountError): The amount used in the "
        "inputs `7` needs to be same as the amount used in the "
        "outputs `8`"
    )
    assert error.value.args[2]["message"] == message
    # We have to stop this test now, I am sorry, but Bob is pretty upset
    # about his mistake. See you next time :)
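Illustrative sketch only: the AmountError Bob hits above comes from a conservation rule — a `TRANSFER` must spend exactly as many tokens in its outputs as its inputs provide. The toy `check_amounts` helper below is an invented name for this example, not Planetmint's server-side validator.

```python
def check_amounts(input_amounts, output_amounts):
    """Raise ValueError when a TRANSFER does not conserve token amounts."""
    spent = sum(input_amounts)
    produced = sum(output_amounts)
    if spent != produced:
        # Mirrors the shape of the AmountError message asserted in the test.
        raise ValueError(
            f"Invalid transaction (AmountError): The amount used in the "
            f"inputs `{spent}` needs to be same as the amount used in the "
            f"outputs `{produced}`"
        )
    return True


# Bob's first transfer conserves the 10 tokens (3 back to Alice, 7 to himself)...
assert check_amounts([10], [3, 7])
# ...but his second attempt turns a 7-token input into an 8-token output.
try:
    check_amounts([7], [8])
except ValueError as err:
    assert "`7`" in str(err) and "`8`" in str(err)
```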


@@ -17,24 +17,22 @@ from planetmint_driver.crypto import generate_keypair
def test_double_create():
    bdb = Planetmint(os.environ.get("PLANETMINT_ENDPOINT"))
    alice = generate_keypair()
    results = queue.Queue()
    tx = bdb.transactions.fulfill(
        bdb.transactions.prepare(operation="CREATE", signers=alice.public_key, asset={"data": {"uuid": str(uuid4())}}),
        private_keys=alice.private_key,
    )
    def send_and_queue(tx):
        try:
            bdb.transactions.send_commit(tx)
            results.put("OK")
        except planetmint_driver.exceptions.TransportError as e:
            results.put("FAIL")
    t1 = Thread(target=send_and_queue, args=(tx,))
    t2 = Thread(target=send_and_queue, args=(tx,))
@@ -44,5 +42,5 @@ def test_double_create():
    results = [results.get(timeout=2), results.get(timeout=2)]
    assert results.count("OK") == 1
    assert results.count("FAIL") == 1
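The test above expects the network to accept a duplicate `CREATE` exactly once. A minimal stand-in for that behaviour, using the same thread-plus-queue pattern: the in-memory `committed` set and lock simulate the network's double-spend guard and are assumptions of this sketch, not part of the driver.

```python
import queue
from threading import Lock, Thread

committed = set()
commit_lock = Lock()
results = queue.Queue()


def send_commit(tx_id):
    # First committer of a given id wins; any repeat is rejected.
    with commit_lock:
        if tx_id in committed:
            raise RuntimeError("duplicate transaction")
        committed.add(tx_id)


def send_and_queue(tx_id):
    try:
        send_commit(tx_id)
        results.put("OK")
    except RuntimeError:
        results.put("FAIL")


threads = [Thread(target=send_and_queue, args=("tx-1",)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

outcome = [results.get(timeout=2), results.get(timeout=2)]
assert outcome.count("OK") == 1
assert outcome.count("FAIL") == 1
```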


@@ -31,7 +31,7 @@ def test_multiple_owners():
    # ## Set up a connection to Planetmint
    # Check [test_basic.py](./test_basic.html) to get some more details
    # about the endpoint.
    bdb = Planetmint(os.environ.get("PLANETMINT_ENDPOINT"))
    # Hey Alice and Bob, nice to see you again!
    alice, bob = generate_keypair(), generate_keypair()
@@ -41,40 +41,28 @@ def test_multiple_owners():
    # high rents anymore. Bob suggests to get a dish washer for the
    # kitchen. Alice agrees and here they go, creating the asset for their
    # dish washer.
    dw_asset = {"data": {"dish washer": {"serial_number": 1337}}}
    # They prepare a `CREATE` transaction. To have multiple owners, both
    # Bob and Alice need to be the recipients.
    prepared_dw_tx = bdb.transactions.prepare(
        operation="CREATE", signers=alice.public_key, recipients=(alice.public_key, bob.public_key), asset=dw_asset
    )
    # Now they both sign the transaction by providing their private keys.
    # And send it afterwards.
    fulfilled_dw_tx = bdb.transactions.fulfill(prepared_dw_tx, private_keys=[alice.private_key, bob.private_key])
    bdb.transactions.send_commit(fulfilled_dw_tx)
    # We store the `id` of the transaction to use it later on.
    dw_id = fulfilled_dw_tx["id"]
    # Let's check if the transaction was successful.
    assert bdb.transactions.retrieve(dw_id), "Cannot find transaction {}".format(dw_id)
    # The transaction should have two public keys in the outputs.
    assert len(bdb.transactions.retrieve(dw_id)["outputs"][0]["public_keys"]) == 2
    # ## Alice and Bob transfer a transaction to Carol.
    # Alice and Bob save a lot of money living together. They often go out
@@ -86,39 +74,33 @@ def test_multiple_owners():
    # Alice and Bob prepare the transaction to transfer the dish washer to
    # Carol.
    transfer_asset = {"id": dw_id}
    output_index = 0
    output = fulfilled_dw_tx["outputs"][output_index]
    transfer_input = {
        "fulfillment": output["condition"]["details"],
        "fulfills": {"output_index": output_index, "transaction_id": fulfilled_dw_tx["id"]},
        "owners_before": output["public_keys"],
    }
    # Now they create the transaction...
    prepared_transfer_tx = bdb.transactions.prepare(
        operation="TRANSFER", asset=transfer_asset, inputs=transfer_input, recipients=carol.public_key
    )
    # ... and sign it with their private keys, then send it.
    fulfilled_transfer_tx = bdb.transactions.fulfill(
        prepared_transfer_tx, private_keys=[alice.private_key, bob.private_key]
    )
    sent_transfer_tx = bdb.transactions.send_commit(fulfilled_transfer_tx)
    # They check if the transaction was successful.
    assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"]) == sent_transfer_tx
    # The owners before should include both Alice and Bob.
    assert len(bdb.transactions.retrieve(fulfilled_transfer_tx["id"])["inputs"][0]["owners_before"]) == 2
    # While the new owner is Carol.
    assert bdb.transactions.retrieve(fulfilled_transfer_tx["id"])["outputs"][0]["public_keys"][0] == carol.public_key
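The rule this test exercises can be sketched in isolation: a transfer of a jointly owned output is only valid once every key in `owners_before` has signed. The dict of "signatures" below is a deliberate simplification of the crypto-conditions fulfillment Planetmint actually verifies; the function name is made up for this example.

```python
def transfer_is_fully_signed(owners_before, signatures):
    """A transfer is valid only when every prior owner has contributed a signature."""
    return set(owners_before) <= set(signatures)


owners = ["alice_pub", "bob_pub"]
# Alice signing alone is not enough to move the dish washer...
assert not transfer_is_fully_signed(owners, {"alice_pub": "sig-a"})
# ...but with Bob's signature as well, the transfer to Carol can go through.
assert transfer_is_fully_signed(owners, {"alice_pub": "sig-a", "bob_pub": "sig-b"})
```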


@@ -32,15 +32,36 @@ from planetmint_driver.exceptions import BadRequest
naughty_strings = blns.all()
skipped_naughty_strings = [
    "1.00",
    "$1.00",
    "-1.00",
    "-$1.00",
    "0.00",
    "0..0",
    ".",
    "0.0.0",
    "-.",
    ",./;'[]\\-=",
    "ثم نفس سقطت وبالتحديد،, جزيرتي باستخدام أن دنو. إذ هنا؟ الستار وتنصيب كان. أهّل ايطاليا، بريطانيا-فرنسا قد أخذ. سليمان، إتفاقية بين ما, يذكر الحدود أي بعد, معاملة بولندا، الإطلاق عل إيو.",
    "test\x00",
    "Ṱ̺̺̕o͞ ̷i̲̬͇̪͙n̝̗͕v̟̜̘̦͟o̶̙̰̠kè͚̮̺̪̹̱̤ ̖t̝͕̳̣̻̪͞h̼͓̲̦̳̘̲e͇̣̰̦̬͎ ̢̼̻̱̘h͚͎͙̜̣̲ͅi̦̲̣̰̤v̻͍e̺̭̳̪̰-m̢iͅn̖̺̞̲̯̰d̵̼̟͙̩̼̘̳ ̞̥̱̳̭r̛̗̘e͙p͠r̼̞̻̭̗e̺̠̣͟s̘͇̳͍̝͉e͉̥̯̞̲͚̬͜ǹ̬͎͎̟̖͇̤t͍̬̤͓̼̭͘ͅi̪̱n͠g̴͉ ͏͉ͅc̬̟h͡a̫̻̯͘o̫̟̖͍̙̝͉s̗̦̲.̨̹͈̣",
    "̡͓̞ͅI̗̘̦͝n͇͇͙v̮̫ok̲̫̙͈i̖͙̭̹̠̞n̡̻̮̣̺g̲͈͙̭͙̬͎ ̰t͔̦h̞̲e̢̤ ͍̬̲͖f̴̘͕̣è͖ẹ̥̩l͖͔͚i͓͚̦͠n͖͍̗͓̳̮g͍ ̨o͚̪͡f̘̣̬ ̖̘͖̟͙̮c҉͔̫͖͓͇͖ͅh̵̤̣͚͔á̗̼͕ͅo̼̣̥s̱͈̺̖̦̻͢.̛̖̞̠̫̰",
    "̗̺͖̹̯͓Ṯ̤͍̥͇͈h̲́e͏͓̼̗̙̼̣͔ ͇̜̱̠͓͍ͅN͕͠e̗̱z̘̝̜̺͙p̤̺̹͍̯͚e̠̻̠͜r̨̤͍̺̖͔̖̖d̠̟̭̬̝͟i̦͖̩͓͔̤a̠̗̬͉̙n͚͜ ̻̞̰͚ͅh̵͉i̳̞v̢͇ḙ͎͟-҉̭̩̼͔m̤̭̫i͕͇̝̦n̗͙ḍ̟ ̯̲͕͞ǫ̟̯̰̲͙̻̝f ̪̰̰̗̖̭̘͘c̦͍̲̞͍̩̙ḥ͚a̮͎̟̙͜ơ̩̹͎s̤.̝̝ ҉Z̡̖̜͖̰̣͉̜a͖̰͙̬͡l̲̫̳͍̩g̡̟̼̱͚̞̬ͅo̗͜.̟",
    "̦H̬̤̗̤͝e͜ ̜̥̝̻͍̟́w̕h̖̯͓o̝͙̖͎̱̮ ҉̺̙̞̟͈W̷̼̭a̺̪͍į͈͕̭͙̯̜t̶̼̮s̘͙͖̕ ̠̫̠B̻͍͙͉̳ͅe̵h̵̬͇̫͙i̹͓̳̳̮͎̫̕n͟d̴̪̜̖ ̰͉̩͇͙̲͞ͅT͖̼͓̪͢h͏͓̮̻e̬̝̟ͅ ̤̹̝W͙̞̝͔͇͝ͅa͏͓͔̹̼̣l̴͔̰̤̟͔ḽ̫.͕",
    '"><script>alert(document.title)</script>',
    "'><script>alert(document.title)</script>",
    "><script>alert(document.title)</script>",
    "</script><script>alert(document.title)</script>",
    "< / script >< script >alert(document.title)< / script >",
    " onfocus=alert(document.title) autofocus ",
    '" onfocus=alert(document.title) autofocus ',
    "' onfocus=alert(document.title) autofocus ",
    "scriptalert(document.title)/script",
    "/dev/null; touch /tmp/blns.fail ; echo",
    "../../../../../../../../../../../etc/passwd%00",
    "../../../../../../../../../../../etc/hosts",
    "() { 0; }; touch /tmp/blns.shellshock1.fail;",
    "() { _; } >_[$($())] { touch /tmp/blns.shellshock2.fail; }",
]
naughty_strings = [naughty for naughty in naughty_strings if naughty not in skipped_naughty_strings]
@@ -50,22 +71,18 @@ def send_naughty_tx(asset, metadata):
    # ## Set up a connection to Planetmint
    # Check [test_basic.py](./test_basic.html) to get some more details
    # about the endpoint.
    bdb = Planetmint(os.environ.get("PLANETMINT_ENDPOINT"))
    # Here's Alice.
    alice = generate_keypair()
    # Alice is in a naughty mood today, so she creates a tx with some naughty strings
    prepared_transaction = bdb.transactions.prepare(
        operation="CREATE", signers=alice.public_key, asset=asset, metadata=metadata
    )
    # She fulfills the transaction
    fulfilled_transaction = bdb.transactions.fulfill(prepared_transaction, private_keys=alice.private_key)
    # The fulfilled tx gets sent to the BDB network
    try:
@@ -74,23 +91,24 @@ def send_naughty_tx(asset, metadata):
        sent_transaction = e
    # If her key contained a '.', began with a '$', or contained a NUL character
    regex = ".*\..*|\$.*|.*\x00.*"
    key = next(iter(metadata))
    if re.match(regex, key):
        # Then she expects a nicely formatted error code
        status_code = sent_transaction.status_code
        error = sent_transaction.error
        regex = (
            r"\{\s*\n*"
            r'\s*"message":\s*"Invalid transaction \(ValidationError\):\s*'
            r"Invalid key name.*The key name cannot contain characters.*\n*"
            r'\s*"status":\s*400\n*'
            r"\s*\}\n*"
        )
        assert status_code == 400
        assert re.fullmatch(regex, error), sent_transaction
    # Otherwise, she expects to see her transaction in the database
    elif "id" in sent_transaction.keys():
        tx_id = sent_transaction["id"]
        assert bdb.transactions.retrieve(tx_id)
    # If neither condition was true, then something weird happened...
    else:
@@ -100,8 +118,8 @@ def send_naughty_tx(asset, metadata):
@pytest.mark.parametrize("naughty_string", naughty_strings, ids=naughty_strings)
def test_naughty_keys(naughty_string):
    asset = {"data": {naughty_string: "nice_value"}}
    metadata = {naughty_string: "nice_value"}
    send_naughty_tx(asset, metadata)
@@ -109,7 +127,7 @@ def test_naughty_keys(naughty_string):
@pytest.mark.parametrize("naughty_string", naughty_strings, ids=naughty_strings)
def test_naughty_values(naughty_string):
    asset = {"data": {"nice_key": naughty_string}}
    metadata = {"nice_key": naughty_string}
    send_naughty_tx(asset, metadata)
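The branch taken in `send_naughty_tx` hinges on the key regex from the test: metadata keys containing a `.`, starting with `$`, or containing a NUL byte are expected to come back as a 400 error, everything else should land in the database. A quick standalone check of which keys that regex flags:

```python
import re

# Same pattern as in the test above: dot anywhere, leading '$', or a NUL byte.
regex = ".*\..*|\$.*|.*\x00.*"

rejected = ["has.dot", "$starts_with_dollar", "has\x00nul"]
accepted = ["nice_key", "1-00", "plain"]

assert all(re.match(regex, key) for key in rejected)
assert not any(re.match(regex, key) for key in accepted)
```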


@@ -35,10 +35,10 @@ def test_stream():
    # ## Set up the test
    # We use the env variable `PLANETMINT_ENDPOINT` to know where to connect.
    # Check [test_basic.py](./test_basic.html) for more information.
    BDB_ENDPOINT = os.environ.get("PLANETMINT_ENDPOINT")
    # *That's pretty bad, but let's do like this for now.*
    WS_ENDPOINT = "ws://{}:9985/api/v1/streams/valid_transactions".format(BDB_ENDPOINT.rsplit(":")[0])
    bdb = Planetmint(BDB_ENDPOINT)
@@ -91,10 +91,10 @@ def test_stream():
    for _ in range(10):
        tx = bdb.transactions.fulfill(
            bdb.transactions.prepare(
                operation="CREATE", signers=alice.public_key, asset={"data": {"uuid": str(uuid4())}}
            ),
            private_keys=alice.private_key,
        )
        # We don't want to wait for each transaction to be in a block. By using
        # `async` mode, we make sure that the driver returns as soon as the
        # transaction is pushed to the Planetmint API. Remember: we expect all
@@ -104,7 +104,7 @@ def test_stream():
        bdb.transactions.send_async(tx)
        # The `id` of every sent transaction is then stored in a list.
        sent.append(tx["id"])
    # ## Check the valid transactions coming from Planetmint
    # Now we are ready to check if Planetmint did its job. A simple way to
@@ -118,9 +118,9 @@ def test_stream():
    # the timeout, then game over ¯\\\_(ツ)\_/¯
    try:
        event = received.get(timeout=5)
        txid = json.loads(event)["transaction_id"]
    except queue.Empty:
        assert False, "Did not receive all expected transactions"
    # Last thing is to try to remove the `txid` from the set of sent
    # transactions. If this test is running in parallel with others, we
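The websocket URL in this test is derived from the HTTP endpoint by stripping any port with `rsplit(":")` — the derivation the "*That's pretty bad*" comment apologizes for. A standalone sketch (the hostnames are placeholders):

```python
def ws_endpoint(bdb_endpoint):
    # rsplit(":")[0] keeps everything before the first colon, dropping a port
    # like ":9984" when present.
    host = bdb_endpoint.rsplit(":")[0]
    return "ws://{}:9985/api/v1/streams/valid_transactions".format(host)


assert ws_endpoint("planetmint") == "ws://planetmint:9985/api/v1/streams/valid_transactions"
assert ws_endpoint("localhost:9984") == "ws://localhost:9985/api/v1/streams/valid_transactions"
# The fragility the test comments on: a scheme-qualified endpoint breaks it,
# because the first colon is the one after "http".
assert ws_endpoint("http://planetmint:9984").startswith("ws://http:")
```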


@@ -9,73 +9,80 @@ from planetmint_driver import Planetmint
from planetmint_driver.crypto import generate_keypair
def test_zenroom_signing(
    gen_key_zencode,
    secret_key_to_private_key_zencode,
    fulfill_script_zencode,
    zenroom_data,
    zenroom_house_assets,
    condition_script_zencode,
):
    biolabs = generate_keypair()
    version = "2.0"
    alice = json.loads(zencode_exec(gen_key_zencode).output)["keyring"]
    bob = json.loads(zencode_exec(gen_key_zencode).output)["keyring"]
    zen_public_keys = json.loads(
        zencode_exec(secret_key_to_private_key_zencode.format("Alice"), keys=json.dumps({"keyring": alice})).output
    )
    zen_public_keys.update(
        json.loads(
            zencode_exec(secret_key_to_private_key_zencode.format("Bob"), keys=json.dumps({"keyring": bob})).output
        )
    )
    zenroomscpt = ZenroomSha256(script=fulfill_script_zencode, data=zenroom_data, keys=zen_public_keys)
    print(f"zenroom is: {zenroomscpt.script}")
    # CRYPTO-CONDITIONS: generate the condition uri
    condition_uri_zen = zenroomscpt.condition.serialize_uri()
    print(f"\nzenroom condition URI: {condition_uri_zen}")
    # CRYPTO-CONDITIONS: construct an unsigned fulfillment dictionary
    unsigned_fulfillment_dict_zen = {
        "type": zenroomscpt.TYPE_NAME,
        "public_key": base58.b58encode(biolabs.public_key).decode(),
    }
    output = {
        "amount": "10",
        "condition": {
            "details": unsigned_fulfillment_dict_zen,
            "uri": condition_uri_zen,
        },
        "public_keys": [
            biolabs.public_key,
        ],
    }
    input_ = {
        "fulfillment": None,
        "fulfills": None,
        "owners_before": [
            biolabs.public_key,
        ],
    }
    metadata = {"result": {"output": ["ok"]}}
    token_creation_tx = {
        "operation": "CREATE",
        "asset": zenroom_house_assets,
        "metadata": metadata,
        "outputs": [
            output,
        ],
        "inputs": [
            input_,
        ],
        "version": version,
        "id": None,
    }
    # JSON: serialize the transaction-without-id to a json formatted string
    message = json.dumps(
        token_creation_tx,
        sort_keys=True,
        separators=(",", ":"),
        ensure_ascii=False,
    )
@@ -85,30 +92,22 @@ def test_zenroom_signing(gen_key_zencode, secret_key_to_private_key_zencode,
    #
    # the server should pick the fulfill script and recreate the zenroom-sha and verify the signature
    message = zenroomscpt.sign(message, condition_script_zencode, alice)
    assert zenroomscpt.validate(message=message)
    message = json.loads(message)
    fulfillment_uri_zen = zenroomscpt.serialize_uri()
    message["inputs"][0]["fulfillment"] = fulfillment_uri_zen
    tx = message
    tx["id"] = None
    json_str_tx = json.dumps(tx, sort_keys=True, skipkeys=False, separators=(",", ":"))
    # SHA3: hash the serialized id-less transaction to generate the id
    shared_creation_txid = sha3_256(json_str_tx.encode()).hexdigest()
    message["id"] = shared_creation_txid
    # `https://example.com:9984`
    plntmnt = Planetmint(os.environ.get("PLANETMINT_ENDPOINT"))
    sent_transfer_tx = plntmnt.transactions.send_commit(message)
    print(f"\n\nstatus and result : + {sent_transfer_tx}")
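The id derivation used above can be reproduced with the standard library alone: dump the id-less transaction to canonical JSON (sorted keys, no whitespace) and take the SHA3-256 hex digest. This sketch uses `hashlib.sha3_256`, which may differ from wherever the test imports its `sha3_256`, and a trimmed-down transaction dict for brevity.

```python
import json
from hashlib import sha3_256

tx = {"operation": "CREATE", "version": "2.0", "id": None}
# Canonical form: keys sorted, compact separators, so the digest is stable.
canonical = json.dumps(tx, sort_keys=True, skipkeys=False, separators=(",", ":"))
txid = sha3_256(canonical.encode()).hexdigest()

# The digest is deterministic and independent of the original key order.
shuffled = {"id": None, "version": "2.0", "operation": "CREATE"}
assert json.dumps(shuffled, sort_keys=True, separators=(",", ":")) == canonical
assert len(txid) == 64
```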


@@ -60,8 +60,8 @@ services:
      test: ["CMD", "bash", "-c", "curl http://planetmint:9984 && curl http://tendermint:26657/abci_query"]
      interval: 3s
      timeout: 5s
      retries: 5
    command: 'scripts/entrypoint.sh'
    restart: always
  tendermint:
@@ -119,16 +119,6 @@ services:
    volumes:
      - ./docs/root/build/html:/usr/share/nginx/html
  # Remove all build, test, coverage and Python artifacts
  clean:
    image: alpine


@@ -20,28 +20,36 @@ from planetmint.web import server
TPLS = {}
TPLS[
    "index-response"
] = """\
HTTP/1.1 200 OK
Content-Type: application/json
%(index)s
"""
TPLS[
    "api-index-response"
] = """\
HTTP/1.1 200 OK
Content-Type: application/json
%(api_index)s
"""
TPLS[
    "get-tx-id-request"
] = """\
GET /api/v1/transactions/%(txid)s HTTP/1.1
Host: example.com
"""
TPLS[
    "get-tx-id-response"
] = """\
HTTP/1.1 200 OK
Content-Type: application/json
@@ -49,14 +57,18 @@ Content-Type: application/json
"""
TPLS[
    "get-tx-by-asset-request"
] = """\
GET /api/v1/transactions?operation=TRANSFER&asset_id=%(txid)s HTTP/1.1
Host: example.com
"""
TPLS[
    "get-tx-by-asset-response"
] = """\
HTTP/1.1 200 OK
Content-Type: application/json
@@ -64,7 +76,9 @@ Content-Type: application/json
%(tx_transfer_last)s]
"""
TPLS[
    "post-tx-request"
] = """\
POST /api/v1/transactions?mode=async HTTP/1.1
Host: example.com
Content-Type: application/json
@@ -73,7 +87,9 @@ Content-Type: application/json
"""
TPLS[
    "post-tx-response"
] = """\
HTTP/1.1 202 Accepted
Content-Type: application/json
@@ -81,14 +97,18 @@ Content-Type: application/json
"""
TPLS[
    "get-block-request"
] = """\
GET /api/v1/blocks/%(blockid)s HTTP/1.1
Host: example.com
"""
TPLS[
    "get-block-response"
] = """\
HTTP/1.1 200 OK
Content-Type: application/json
@@ -96,14 +116,18 @@ Content-Type: application/json
"""
TPLS[
    "get-block-txid-request"
] = """\
GET /api/v1/blocks?transaction_id=%(txid)s HTTP/1.1
Host: example.com
"""
TPLS[
    "get-block-txid-response"
] = """\
HTTP/1.1 200 OK
Content-Type: application/json
@@ -121,83 +145,84 @@ def main():
    client = server.create_app().test_client()
    host = "example.com:9984"
    # HTTP Index
    res = client.get("/", environ_overrides={"HTTP_HOST": host})
    res_data = json.loads(res.data.decode())
    ctx["index"] = pretty_json(res_data)
    # API index
    res = client.get("/api/v1/", environ_overrides={"HTTP_HOST": host})
    ctx["api_index"] = pretty_json(json.loads(res.data.decode()))
    # tx create
    privkey = "CfdqtD7sS7FgkMoGPXw55MVGGFwQLAoHYTcBhZDtF99Z"
    pubkey = "4K9sWUMFwTgaDGPfdynrbxWqWS6sWmKbZoTjxLtVUibD"
    asset = {"msg": "Hello Planetmint!"}
    tx = Create.generate([pubkey], [([pubkey], 1)], asset=asset, metadata={"sequence": 0})
    tx = tx.sign([privkey])
    ctx["tx"] = pretty_json(tx.to_dict())
    ctx["public_keys"] = tx.outputs[0].public_keys[0]
    ctx["txid"] = tx.id
    # tx transfer
    privkey_transfer = "3AeWpPdhEZzWLYfkfYHBfMFC2r1f8HEaGS9NtbbKssya"
    pubkey_transfer = "3yfQPHeWAa1MxTX9Zf9176QqcpcnWcanVZZbaHb8B3h9"
    cid = 0
    input_ = Input(
        fulfillment=tx.outputs[cid].fulfillment,
        fulfills=TransactionLink(txid=tx.id, output=cid),
        owners_before=tx.outputs[cid].public_keys,
    )
    tx_transfer = Transfer.generate([input_], [([pubkey_transfer], 1)], asset_id=tx.id, metadata={"sequence": 1})
    tx_transfer = tx_transfer.sign([privkey])
    ctx["tx_transfer"] = pretty_json(tx_transfer.to_dict())
    ctx["public_keys_transfer"] = tx_transfer.outputs[0].public_keys[0]
    ctx["tx_transfer_id"] = tx_transfer.id
    # privkey_transfer_last = 'sG3jWDtdTXUidBJK53ucSTrosktG616U3tQHBk81eQe'
    pubkey_transfer_last = "3Af3fhhjU6d9WecEM9Uw5hfom9kNEwE7YuDWdqAUssqm"
    cid = 0
    input_ = Input(
        fulfillment=tx_transfer.outputs[cid].fulfillment,
        fulfills=TransactionLink(txid=tx_transfer.id, output=cid),
        owners_before=tx_transfer.outputs[cid].public_keys,
    )
    tx_transfer_last = Transfer.generate(
        [input_], [([pubkey_transfer_last], 1)], asset_id=tx.id, metadata={"sequence": 2}
    )
    tx_transfer_last = tx_transfer_last.sign([privkey_transfer])
    ctx["tx_transfer_last"] = pretty_json(tx_transfer_last.to_dict())
    ctx["tx_transfer_last_id"] = tx_transfer_last.id
    ctx["public_keys_transfer_last"] = tx_transfer_last.outputs[0].public_keys[0]
    # block
    node_private = "5G2kE1zJAgTajkVSbPAQWo4c2izvtwqaNHYsaNpbbvxX"
    node_public = "DngBurxfeNVKZWCEcDnLj1eMPAS7focUZTE5FndFGuHT"
    signature = "53wxrEQDYk1dXzmvNSytbCfmNVnPqPkDQaTnAe8Jf43s6ssejPxezkCvUnGTnduNUmaLjhaan1iRLi3peu6s5DzA"
    app_hash = "f6e0c49c6d94d6924351f25bb334cf2a99af4206339bf784e741d1a5ab599056"
    block = lib.Block(height=1, transactions=[tx.to_dict()], app_hash=app_hash)
    block_dict = block._asdict()
    block_dict.pop("app_hash")
    ctx["block"] = pretty_json(block_dict)
    ctx["blockid"] = block.height
    # block status
    block_list = [block.height]
    ctx["block_list"] = pretty_json(block_list)
    base_path = os.path.join(os.path.dirname(__file__), "source/connecting/http-samples")
    if not os.path.exists(base_path):
        os.makedirs(base_path)
    for name, tpl in TPLS.items():
        path = os.path.join(base_path, name + ".http")
        code = tpl % ctx
        with open(path, "w") as handle:
            handle.write(code)
@@ -206,5 +231,5 @@ def setup(*_):
    main()


if __name__ == "__main__":
    main()

View File

@@ -82,11 +82,11 @@ x = 'name: {}; score: {}'.format(name, n)
we use the `format()` version. The [official Python documentation says](https://docs.python.org/2/library/stdtypes.html#str.format), "This method of string formatting is the new standard in Python 3, and should be preferred to the % formatting described in String Formatting Operations in new code."
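As a quick self-contained illustration (the values `name` and `n` here are just placeholders), both styles produce the same string, but the `format()` call is the one new code should use:

```python
# Old %-style interpolation, shown only for comparison
name, n = "Alice", 3
old_style = "name: %s; score: %s" % (name, n)

# Preferred str.format() style
new_style = "name: {}; score: {}".format(name, n)

assert old_style == new_style == "name: Alice; score: 3"
```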
## Running the Black Style Checker

We use [Black](https://black.readthedocs.io/en/stable/) to check our Python code style. Once you have it installed, you can run it using:

```text
black --check -l 119 .
```
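When the check reports violations, the same tool can fix them: dropping the `--check` flag tells Black to rewrite the files in place instead of just reporting differences.

```text
black -l 119 .
```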

View File

@@ -32,5 +32,4 @@ class Hosts:
    def assert_transaction(self, tx_id) -> None:
        txs = self.get_transactions(tx_id)
        for tx in txs:
            assert txs[0] == tx, "Cannot find transaction {}".format(tx_id)

View File

@@ -14,7 +14,7 @@ import time
def test_basic():
    # Set up a connection to the Planetmint integration test nodes
    hosts = Hosts("/shared/hostnames")
    pm_alpha = hosts.get_connection()
    # generate a keypair
@@ -22,62 +22,64 @@ def test_basic():
    # create a digital asset for Alice
    game_boy_token = {
        "data": {
            "hash": "0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF",
            "storageID": "0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF",
        },
    }
    # prepare the transaction with the digital asset and issue 10 tokens to bob
    prepared_creation_tx = pm_alpha.transactions.prepare(
        operation="CREATE",
        metadata={
            "hash": "0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF",
            "storageID": "0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF",
        },
        signers=alice.public_key,
        recipients=[([alice.public_key], 10)],
        asset=game_boy_token,
    )
    # fulfill and send the transaction
    fulfilled_creation_tx = pm_alpha.transactions.fulfill(prepared_creation_tx, private_keys=alice.private_key)
    pm_alpha.transactions.send_commit(fulfilled_creation_tx)
    time.sleep(1)
    creation_tx_id = fulfilled_creation_tx["id"]
    # Assert that transaction is stored on all planetmint nodes
    hosts.assert_transaction(creation_tx_id)
    # Transfer
    # create the output and input for the transaction
    transfer_asset = {"id": creation_tx_id}
    output_index = 0
    output = fulfilled_creation_tx["outputs"][output_index]
    transfer_input = {
        "fulfillment": output["condition"]["details"],
        "fulfills": {"output_index": output_index, "transaction_id": transfer_asset["id"]},
        "owners_before": output["public_keys"],
    }
    # prepare the transaction and use 3 tokens
    prepared_transfer_tx = pm_alpha.transactions.prepare(
        operation="TRANSFER",
        asset=transfer_asset,
        inputs=transfer_input,
        metadata={
            "hash": "0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF",
            "storageID": "0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF",
        },
        recipients=[([alice.public_key], 10)],
    )
    # fulfill and send the transaction
    fulfilled_transfer_tx = pm_alpha.transactions.fulfill(prepared_transfer_tx, private_keys=alice.private_key)
    sent_transfer_tx = pm_alpha.transactions.send_commit(fulfilled_transfer_tx)
    time.sleep(1)
    transfer_tx_id = sent_transfer_tx["id"]
    # Assert that transaction is stored on both planetmint nodes
    hosts.assert_transaction(transfer_tx_id)

View File

@@ -33,7 +33,7 @@ def test_divisible_assets():
    # ## Set up a connection to Planetmint
    # Check [test_basic.py](./test_basic.html) to get some more details
    # about the endpoint.
    hosts = Hosts("/shared/hostnames")
    pm = hosts.get_connection()
    # Oh look, it is Alice again and she brought her friend Bob along.
@@ -48,13 +48,9 @@ def test_divisible_assets():
    # the bike for one hour.
    bike_token = {
        "data": {
            "token_for": {"bike": {"serial_number": 420420}},
            "description": "Time share token. Each token equals one hour of riding.",
        },
    }
@@ -62,28 +58,22 @@ def test_divisible_assets():
    # Here, Alice defines in a tuple that she wants to assign
    # these 10 tokens to Bob.
    prepared_token_tx = pm.transactions.prepare(
        operation="CREATE", signers=alice.public_key, recipients=[([bob.public_key], 10)], asset=bike_token
    )
    # She fulfills and sends the transaction.
    fulfilled_token_tx = pm.transactions.fulfill(prepared_token_tx, private_keys=alice.private_key)
    pm.transactions.send_commit(fulfilled_token_tx)
    # We store the `id` of the transaction to use it later on.
    bike_token_id = fulfilled_token_tx["id"]
    # Let's check if the transaction was successful.
    assert pm.transactions.retrieve(bike_token_id), "Cannot find transaction {}".format(bike_token_id)
    # Bob owns 10 tokens now.
    assert pm.transactions.retrieve(bike_token_id)["outputs"][0]["amount"] == "10"
    # ## Bob wants to use the bike
    # Now that Bob got the tokens and the sun is shining, he wants to get out
@@ -91,51 +81,47 @@ def test_divisible_assets():
    # To use the bike he has to send the tokens back to Alice.
    # To learn about the details of transferring a transaction check out
    # [test_basic.py](./test_basic.html)
    transfer_asset = {"id": bike_token_id}
    output_index = 0
    output = fulfilled_token_tx["outputs"][output_index]
    transfer_input = {
        "fulfillment": output["condition"]["details"],
        "fulfills": {"output_index": output_index, "transaction_id": fulfilled_token_tx["id"]},
        "owners_before": output["public_keys"],
    }
    # To use the tokens Bob has to reassign 7 tokens to himself and the
    # amount he wants to use to Alice.
    prepared_transfer_tx = pm.transactions.prepare(
        operation="TRANSFER",
        asset=transfer_asset,
        inputs=transfer_input,
        recipients=[([alice.public_key], 3), ([bob.public_key], 7)],
    )
    # He signs and sends the transaction.
    fulfilled_transfer_tx = pm.transactions.fulfill(prepared_transfer_tx, private_keys=bob.private_key)
    sent_transfer_tx = pm.transactions.send_commit(fulfilled_transfer_tx)
    # First, Bob checks if the transaction was successful.
    assert pm.transactions.retrieve(fulfilled_transfer_tx["id"]) == sent_transfer_tx
    hosts.assert_transaction(fulfilled_transfer_tx["id"])
    # There are two outputs in the transaction now.
    # The first output shows that Alice got back 3 tokens...
    assert pm.transactions.retrieve(fulfilled_transfer_tx["id"])["outputs"][0]["amount"] == "3"
    # ... while Bob still has 7 left.
    assert pm.transactions.retrieve(fulfilled_transfer_tx["id"])["outputs"][1]["amount"] == "7"
    # ## Bob wants to ride the bike again
    # It's been a week and Bob wants to ride the bike again.
    # Now he wants to ride for 8 hours, that's a lot Bob!
    # He prepares the transaction again.
    transfer_asset = {"id": bike_token_id}
    # This time we need an `output_index` of 1, since we have two outputs
    # in the `fulfilled_transfer_tx` we created before. The first output with
    # index 0 is for Alice and the second output is for Bob.
@@ -143,24 +129,21 @@ def test_divisible_assets():
    # correct output with the correct amount of tokens.
    output_index = 1
    output = fulfilled_transfer_tx["outputs"][output_index]
    transfer_input = {
        "fulfillment": output["condition"]["details"],
        "fulfills": {"output_index": output_index, "transaction_id": fulfilled_transfer_tx["id"]},
        "owners_before": output["public_keys"],
    }
    # This time Bob only provides Alice in the `recipients` because he wants
    # to spend all his tokens
    prepared_transfer_tx = pm.transactions.prepare(
        operation="TRANSFER", asset=transfer_asset, inputs=transfer_input, recipients=[([alice.public_key], 8)]
    )
    fulfilled_transfer_tx = pm.transactions.fulfill(prepared_transfer_tx, private_keys=bob.private_key)
    # Oh Bob, what have you done?! You tried to spend more tokens than you had.
    # Remember Bob, last time you spent 3 tokens already,
@@ -171,10 +154,12 @@ def test_divisible_assets():
    # Now Bob gets an error saying that the amount he wanted to spend is
    # higher than the amount of tokens he has left.
    assert error.value.args[0] == 400
    message = (
        "Invalid transaction (AmountError): The amount used in the "
        "inputs `7` needs to be same as the amount used in the "
        "outputs `8`"
    )
    assert error.value.args[2]["message"] == message
    # We have to stop this test now, I am sorry, but Bob is pretty upset
    # about his mistake. See you next time :)

View File

@@ -16,25 +16,23 @@ from .helper.hosts import Hosts
def test_double_create():
    hosts = Hosts("/shared/hostnames")
    pm = hosts.get_connection()
    alice = generate_keypair()
    results = queue.Queue()
    tx = pm.transactions.fulfill(
        pm.transactions.prepare(operation="CREATE", signers=alice.public_key, asset={"data": {"uuid": str(uuid4())}}),
        private_keys=alice.private_key,
    )

    def send_and_queue(tx):
        try:
            pm.transactions.send_commit(tx)
            results.put("OK")
        except planetmint_driver.exceptions.TransportError:
            results.put("FAIL")

    t1 = Thread(target=send_and_queue, args=(tx,))
    t2 = Thread(target=send_and_queue, args=(tx,))
@@ -44,5 +42,5 @@ def test_double_create():
    results = [results.get(timeout=2), results.get(timeout=2)]
    assert results.count("OK") == 1
    assert results.count("FAIL") == 1

View File

@@ -28,7 +28,7 @@ from .helper.hosts import Hosts
def test_multiple_owners():
    # Set up a connection to the Planetmint integration test nodes
    hosts = Hosts("/shared/hostnames")
    pm_alpha = hosts.get_connection()
    # Generate Keypairs for Alice and Bob!
@@ -39,32 +39,22 @@ def test_multiple_owners():
    # high rents anymore. Bob suggests to get a dish washer for the
    # kitchen. Alice agrees and here they go, creating the asset for their
    # dish washer.
    dw_asset = {"data": {"dish washer": {"serial_number": 1337}}}
    # They prepare a `CREATE` transaction. To have multiple owners, both
    # Bob and Alice need to be the recipients.
    prepared_dw_tx = pm_alpha.transactions.prepare(
        operation="CREATE", signers=alice.public_key, recipients=(alice.public_key, bob.public_key), asset=dw_asset
    )
    # Now they both sign the transaction by providing their private keys.
    # And send it afterwards.
    fulfilled_dw_tx = pm_alpha.transactions.fulfill(prepared_dw_tx, private_keys=[alice.private_key, bob.private_key])
    pm_alpha.transactions.send_commit(fulfilled_dw_tx)
    # We store the `id` of the transaction to use it later on.
    dw_id = fulfilled_dw_tx["id"]
    time.sleep(1)
@@ -72,12 +62,10 @@ def test_multiple_owners():
    hosts.assert_transaction(dw_id)
    # Let's check if the transaction was successful.
    assert pm_alpha.transactions.retrieve(dw_id), "Cannot find transaction {}".format(dw_id)
    # The transaction should have two public keys in the outputs.
    assert len(pm_alpha.transactions.retrieve(dw_id)["outputs"][0]["public_keys"]) == 2
    # ## Alice and Bob transfer a transaction to Carol.
    # Alice and Bob save a lot of money living together. They often go out
@@ -89,43 +77,39 @@ def test_multiple_owners():
    # Alice and Bob prepare the transaction to transfer the dish washer to
    # Carol.
    transfer_asset = {"id": dw_id}
    output_index = 0
    output = fulfilled_dw_tx["outputs"][output_index]
    transfer_input = {
        "fulfillment": output["condition"]["details"],
        "fulfills": {"output_index": output_index, "transaction_id": fulfilled_dw_tx["id"]},
        "owners_before": output["public_keys"],
    }
    # Now they create the transaction...
    prepared_transfer_tx = pm_alpha.transactions.prepare(
        operation="TRANSFER", asset=transfer_asset, inputs=transfer_input, recipients=carol.public_key
    )
    # ... and sign it with their private keys, then send it.
    fulfilled_transfer_tx = pm_alpha.transactions.fulfill(
        prepared_transfer_tx, private_keys=[alice.private_key, bob.private_key]
    )
    sent_transfer_tx = pm_alpha.transactions.send_commit(fulfilled_transfer_tx)
    time.sleep(1)
    # Now compare if both nodes returned the same transaction
    hosts.assert_transaction(fulfilled_transfer_tx["id"])
    # They check if the transaction was successful.
    assert pm_alpha.transactions.retrieve(fulfilled_transfer_tx["id"]) == sent_transfer_tx
    # The owners before should include both Alice and Bob.
    assert len(pm_alpha.transactions.retrieve(fulfilled_transfer_tx["id"])["inputs"][0]["owners_before"]) == 2
    # While the new owner is Carol.
    assert (
        pm_alpha.transactions.retrieve(fulfilled_transfer_tx["id"])["outputs"][0]["public_keys"][0] == carol.public_key
    )

View File

@@ -27,6 +27,40 @@ from planetmint_driver.exceptions import BadRequest
from .helper.hosts import Hosts
naughty_strings = blns.all()
skipped_naughty_strings = [
"1.00",
"$1.00",
"-1.00",
"-$1.00",
"0.00",
"0..0",
".",
"0.0.0",
"-.",
",./;'[]\\-=",
"ثم نفس سقطت وبالتحديد،, جزيرتي باستخدام أن دنو. إذ هنا؟ الستار وتنصيب كان. أهّل ايطاليا، بريطانيا-فرنسا قد أخذ. سليمان، إتفاقية بين ما, يذكر الحدود أي بعد, معاملة بولندا، الإطلاق عل إيو.",
"test\x00",
"Ṱ̺̺̕o͞ ̷i̲̬͇̪͙n̝̗͕v̟̜̘̦͟o̶̙̰̠kè͚̮̺̪̹̱̤ ̖t̝͕̳̣̻̪͞h̼͓̲̦̳̘̲e͇̣̰̦̬͎ ̢̼̻̱̘h͚͎͙̜̣̲ͅi̦̲̣̰̤v̻͍e̺̭̳̪̰-m̢iͅn̖̺̞̲̯̰d̵̼̟͙̩̼̘̳ ̞̥̱̳̭r̛̗̘e͙p͠r̼̞̻̭̗e̺̠̣͟s̘͇̳͍̝͉e͉̥̯̞̲͚̬͜ǹ̬͎͎̟̖͇̤t͍̬̤͓̼̭͘ͅi̪̱n͠g̴͉ ͏͉ͅc̬̟h͡a̫̻̯͘o̫̟̖͍̙̝͉s̗̦̲.̨̹͈̣",
"̡͓̞ͅI̗̘̦͝n͇͇͙v̮̫ok̲̫̙͈i̖͙̭̹̠̞n̡̻̮̣̺g̲͈͙̭͙̬͎ ̰t͔̦h̞̲e̢̤ ͍̬̲͖f̴̘͕̣è͖ẹ̥̩l͖͔͚i͓͚̦͠n͖͍̗͓̳̮g͍ ̨o͚̪͡f̘̣̬ ̖̘͖̟͙̮c҉͔̫͖͓͇͖ͅh̵̤̣͚͔á̗̼͕ͅo̼̣̥s̱͈̺̖̦̻͢.̛̖̞̠̫̰",
"̗̺͖̹̯͓Ṯ̤͍̥͇͈h̲́e͏͓̼̗̙̼̣͔ ͇̜̱̠͓͍ͅN͕͠e̗̱z̘̝̜̺͙p̤̺̹͍̯͚e̠̻̠͜r̨̤͍̺̖͔̖̖d̠̟̭̬̝͟i̦͖̩͓͔̤a̠̗̬͉̙n͚͜ ̻̞̰͚ͅh̵͉i̳̞v̢͇ḙ͎͟-҉̭̩̼͔m̤̭̫i͕͇̝̦n̗͙ḍ̟ ̯̲͕͞ǫ̟̯̰̲͙̻̝f ̪̰̰̗̖̭̘͘c̦͍̲̞͍̩̙ḥ͚a̮͎̟̙͜ơ̩̹͎s̤.̝̝ ҉Z̡̖̜͖̰̣͉̜a͖̰͙̬͡l̲̫̳͍̩g̡̟̼̱͚̞̬ͅo̗͜.̟",
"̦H̬̤̗̤͝e͜ ̜̥̝̻͍̟́w̕h̖̯͓o̝͙̖͎̱̮ ҉̺̙̞̟͈W̷̼̭a̺̪͍į͈͕̭͙̯̜t̶̼̮s̘͙͖̕ ̠̫̠B̻͍͙͉̳ͅe̵h̵̬͇̫͙i̹͓̳̳̮͎̫̕n͟d̴̪̜̖ ̰͉̩͇͙̲͞ͅT͖̼͓̪͢h͏͓̮̻e̬̝̟ͅ ̤̹̝W͙̞̝͔͇͝ͅa͏͓͔̹̼̣l̴͔̰̤̟͔ḽ̫.͕",
'"><script>alert(document.title)</script>',
"'><script>alert(document.title)</script>",
"><script>alert(document.title)</script>",
"</script><script>alert(document.title)</script>",
"< / script >< script >alert(document.title)< / script >",
" onfocus=alert(document.title) autofocus ",
'" onfocus=alert(document.title) autofocus ',
"' onfocus=alert(document.title) autofocus ",
"scriptalert(document.title)/script",
"/dev/null; touch /tmp/blns.fail ; echo",
"../../../../../../../../../../../etc/passwd%00",
"../../../../../../../../../../../etc/hosts",
"() { 0; }; touch /tmp/blns.shellshock1.fail;",
"() { _; } >_[$($())] { touch /tmp/blns.shellshock2.fail; }",
]
naughty_strings = [naughty for naughty in naughty_strings if naughty not in skipped_naughty_strings]
# This is our base test case, but we'll reuse it to send naughty strings as both keys and values.
@@ -34,7 +68,7 @@ def send_naughty_tx(asset, metadata):
    # ## Set up a connection to Planetmint
    # Check [test_basic.py](./test_basic.html) to get some more details
    # about the endpoint.
    hosts = Hosts("/shared/hostnames")
    pm = hosts.get_connection()
    # Here's Alice.
@@ -42,15 +76,11 @@ def send_naughty_tx(asset, metadata):
    # Alice is in a naughty mood today, so she creates a tx with some naughty strings
    prepared_transaction = pm.transactions.prepare(
        operation="CREATE", signers=alice.public_key, asset=asset, metadata=metadata
    )
    # She fulfills the transaction
    fulfilled_transaction = pm.transactions.fulfill(prepared_transaction, private_keys=alice.private_key)
    # The fulfilled tx gets sent to the pm network
    try:
@@ -59,23 +89,24 @@ def send_naughty_tx(asset, metadata):
        sent_transaction = e
    # If her key contained a '.', began with a '$', or contained a NUL character
    regex = r".*\..*|\$.*|.*\x00.*"
    key = next(iter(metadata))
    if re.match(regex, key):
        # Then she expects a nicely formatted error code
        status_code = sent_transaction.status_code
        error = sent_transaction.error
        regex = (
            r"\{\s*\n*"
            r'\s*"message":\s*"Invalid transaction \(ValidationError\):\s*'
            r"Invalid key name.*The key name cannot contain characters.*\n*"
            r'\s*"status":\s*400\n*'
            r"\s*\}\n*"
        )
        assert status_code == 400
        assert re.fullmatch(regex, error), sent_transaction
    # Otherwise, she expects to see her transaction in the database
    elif "id" in sent_transaction.keys():
        tx_id = sent_transaction["id"]
        assert pm.transactions.retrieve(tx_id)
    # If neither condition was true, then something weird happened...
    else:
@ -85,8 +116,8 @@ def send_naughty_tx(asset, metadata):
@pytest.mark.parametrize("naughty_string", naughty_strings, ids=naughty_strings) @pytest.mark.parametrize("naughty_string", naughty_strings, ids=naughty_strings)
def test_naughty_keys(naughty_string): def test_naughty_keys(naughty_string):
asset = {'data': {naughty_string: 'nice_value'}} asset = {"data": {naughty_string: "nice_value"}}
metadata = {naughty_string: 'nice_value'} metadata = {naughty_string: "nice_value"}
send_naughty_tx(asset, metadata) send_naughty_tx(asset, metadata)
@ -94,7 +125,7 @@ def test_naughty_keys(naughty_string):
@pytest.mark.parametrize("naughty_string", naughty_strings, ids=naughty_strings) @pytest.mark.parametrize("naughty_string", naughty_strings, ids=naughty_strings)
def test_naughty_values(naughty_string): def test_naughty_values(naughty_string):
asset = {'data': {'nice_key': naughty_string}} asset = {"data": {"nice_key": naughty_string}}
metadata = {'nice_key': naughty_string} metadata = {"nice_key": naughty_string}
send_naughty_tx(asset, metadata) send_naughty_tx(asset, metadata)
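The key-validation rule this test encodes can be exercised in isolation; a minimal sketch using only the regex from the test above (the helper name `is_rejected_key` is made up for illustration):

```python
import re

# The pattern the test expects the server to enforce: a key is rejected if it
# contains a '.', begins with a '$', or contains a NUL character.
KEY_REGEX = r".*\..*|\$.*|.*\x00.*"


def is_rejected_key(key: str) -> bool:
    """Return True if the key would trigger the ValidationError path above."""
    return re.match(KEY_REGEX, key) is not None


print(is_rejected_key("a.b"))  # True  (contains a dot)
print(is_rejected_key("$set"))  # True  (begins with '$')
print(is_rejected_key("a\x00b"))  # True  (contains NUL)
print(is_rejected_key("nice_key"))  # False (plain key)
```

Note that `re.match` anchors only at the start, which is why `\$.*` catches keys beginning with `$` but not keys containing it elsewhere.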


@@ -35,11 +35,11 @@ def test_stream():
    # ## Set up the test
    # We use the env variable `BICHAINDB_ENDPOINT` to know where to connect.
    # Check [test_basic.py](./test_basic.html) for more information.
    hosts = Hosts("/shared/hostnames")
    pm = hosts.get_connection()
    # *That's pretty bad, but let's do like this for now.*
    WS_ENDPOINT = "ws://{}:9985/api/v1/streams/valid_transactions".format(hosts.hostnames[0])
    # Hello to Alice again, she is pretty active in those tests, good job
    # Alice!
@@ -90,10 +90,10 @@ def test_stream():
    for _ in range(10):
        tx = pm.transactions.fulfill(
            pm.transactions.prepare(
                operation="CREATE", signers=alice.public_key, asset={"data": {"uuid": str(uuid4())}}
            ),
            private_keys=alice.private_key,
        )
        # We don't want to wait for each transaction to be in a block. By using
        # `async` mode, we make sure that the driver returns as soon as the
        # transaction is pushed to the Planetmint API. Remember: we expect all
@@ -103,7 +103,7 @@ def test_stream():
        pm.transactions.send_async(tx)
        # The `id` of every sent transaction is then stored in a list.
        sent.append(tx["id"])
    # ## Check the valid transactions coming from Planetmint
    # Now we are ready to check if Planetmint did its job. A simple way to
@@ -117,9 +117,9 @@ def test_stream():
    # the timeout, then game over ¯\\\_(ツ)\_/¯
    try:
        event = received.get(timeout=5)
        txid = json.loads(event)["transaction_id"]
    except queue.Empty:
        assert False, "Did not receive all expected transactions"
    # Last thing is to try to remove the `txid` from the set of sent
    # transactions. If this test is running in parallel with others, we


@@ -18,27 +18,22 @@ from .helper.hosts import Hosts
def prepare_condition_details(condition: ThresholdSha256):
    condition_details = {"subconditions": [], "threshold": condition.threshold, "type": condition.TYPE_NAME}
    for s in condition.subconditions:
        if s["type"] == "fulfillment" and s["body"].TYPE_NAME == "ed25519-sha-256":
            condition_details["subconditions"].append(
                {"type": s["body"].TYPE_NAME, "public_key": base58.b58encode(s["body"].public_key).decode()}
            )
        else:
            condition_details["subconditions"].append(prepare_condition_details(s["body"]))
    return condition_details
def test_threshold():
    # Setup connection to test nodes
    hosts = Hosts("/shared/hostnames")
    pm = hosts.get_connection()
    # Generate keypairs for Alice, Bob and Carol!
@@ -49,13 +44,7 @@ def test_threshold():
    # high rents anymore. Bob suggests to get a dish washer for the
    # kitchen. Alice agrees and here they go, creating the asset for their
    # dish washer.
    dw_asset = {"data": {"dish washer": {"serial_number": 1337}}}
    # Create subfulfillments
    alice_ed25519 = Ed25519Sha256(public_key=base58.b58decode(alice.public_key))
@@ -74,37 +63,37 @@ def test_threshold():
    # Assemble output and input for the handcrafted tx
    output = {
        "amount": "1",
        "condition": {
            "details": condition_details,
            "uri": condition_uri,
        },
        "public_keys": (alice.public_key, bob.public_key, carol.public_key),
    }
    # The yet to be fulfilled input:
    input_ = {
        "fulfillment": None,
        "fulfills": None,
        "owners_before": (alice.public_key, bob.public_key),
    }
    # Assemble the handcrafted transaction
    handcrafted_dw_tx = {
        "operation": "CREATE",
        "asset": dw_asset,
        "metadata": None,
        "outputs": (output,),
        "inputs": (input_,),
        "version": "2.0",
        "id": None,
    }
    # Create sha3-256 of message to sign
    message = json.dumps(
        handcrafted_dw_tx,
        sort_keys=True,
        separators=(",", ":"),
        ensure_ascii=False,
    )
    message = sha3.sha3_256(message.encode())
@@ -121,19 +110,19 @@ def test_threshold():
    fulfillment_uri = fulfillment_threshold.serialize_uri()
    handcrafted_dw_tx["inputs"][0]["fulfillment"] = fulfillment_uri
    # Create tx_id for handcrafted_dw_tx and send tx commit
    json_str_tx = json.dumps(
        handcrafted_dw_tx,
        sort_keys=True,
        separators=(",", ":"),
        ensure_ascii=False,
    )
    dw_creation_txid = sha3.sha3_256(json_str_tx.encode()).hexdigest()
    handcrafted_dw_tx["id"] = dw_creation_txid
    pm.transactions.send_commit(handcrafted_dw_tx)
@@ -144,18 +133,12 @@ def test_weighted_threshold():
def test_weighted_threshold():
    hosts = Hosts("/shared/hostnames")
    pm = hosts.get_connection()
    alice, bob, carol = generate_keypair(), generate_keypair(), generate_keypair()
    asset = {"data": {"trashcan": {"animals": ["racoon_1", "racoon_2"]}}}
    alice_ed25519 = Ed25519Sha256(public_key=base58.b58decode(alice.public_key))
    bob_ed25519 = Ed25519Sha256(public_key=base58.b58decode(bob.public_key))
@@ -175,37 +158,37 @@ def test_weighted_threshold():
    # Assemble output and input for the handcrafted tx
    output = {
        "amount": "1",
        "condition": {
            "details": condition_details,
            "uri": condition_uri,
        },
        "public_keys": (alice.public_key, bob.public_key, carol.public_key),
    }
    # The yet to be fulfilled input:
    input_ = {
        "fulfillment": None,
        "fulfills": None,
        "owners_before": (alice.public_key, bob.public_key),
    }
    # Assemble the handcrafted transaction
    handcrafted_tx = {
        "operation": "CREATE",
        "asset": asset,
        "metadata": None,
        "outputs": (output,),
        "inputs": (input_,),
        "version": "2.0",
        "id": None,
    }
    # Create sha3-256 of message to sign
    message = json.dumps(
        handcrafted_tx,
        sort_keys=True,
        separators=(",", ":"),
        ensure_ascii=False,
    )
    message = sha3.sha3_256(message.encode())
@@ -224,19 +207,19 @@ def test_weighted_threshold():
    fulfillment_uri = fulfillment_threshold.serialize_uri()
    handcrafted_tx["inputs"][0]["fulfillment"] = fulfillment_uri
    # Create tx_id for handcrafted_tx and send tx commit
    json_str_tx = json.dumps(
        handcrafted_tx,
        sort_keys=True,
        separators=(",", ":"),
        ensure_ascii=False,
    )
    creation_tx_id = sha3.sha3_256(json_str_tx.encode()).hexdigest()
    handcrafted_tx["id"] = creation_tx_id
    pm.transactions.send_commit(handcrafted_tx)
@@ -254,50 +237,50 @@ def test_weighted_threshold():
    # Assemble output and input for the handcrafted tx
    transfer_output = {
        "amount": "1",
        "condition": {
            "details": {
                "type": alice_transfer_ed25519.TYPE_NAME,
                "public_key": base58.b58encode(alice_transfer_ed25519.public_key).decode(),
            },
            "uri": transfer_condition_uri,
        },
        "public_keys": (alice.public_key,),
    }
    # The yet to be fulfilled input:
    transfer_input_ = {
        "fulfillment": None,
        "fulfills": {"transaction_id": creation_tx_id, "output_index": 0},
        "owners_before": (alice.public_key, bob.public_key, carol.public_key),
    }
    # Assemble the handcrafted transaction
    handcrafted_transfer_tx = {
        "operation": "TRANSFER",
        "asset": {"id": creation_tx_id},
        "metadata": None,
        "outputs": (transfer_output,),
        "inputs": (transfer_input_,),
        "version": "2.0",
        "id": None,
    }
    # Create sha3-256 of message to sign
    message = json.dumps(
        handcrafted_transfer_tx,
        sort_keys=True,
        separators=(",", ":"),
        ensure_ascii=False,
    )
    message = sha3.sha3_256(message.encode())
    message.update(
        "{}{}".format(
            handcrafted_transfer_tx["inputs"][0]["fulfills"]["transaction_id"],
            handcrafted_transfer_tx["inputs"][0]["fulfills"]["output_index"],
        ).encode()
    )
    # Sign message with Alice's and Bob's private key
    bob_transfer_ed25519.sign(message.digest(), base58.b58decode(bob.private_key))
@@ -314,19 +297,19 @@ def test_weighted_threshold():
    fulfillment_uri = fulfillment_threshold.serialize_uri()
    handcrafted_transfer_tx["inputs"][0]["fulfillment"] = fulfillment_uri
    # Create tx_id for handcrafted_transfer_tx and send tx commit
    json_str_tx = json.dumps(
        handcrafted_transfer_tx,
        sort_keys=True,
        separators=(",", ":"),
        ensure_ascii=False,
    )
    transfer_tx_id = sha3.sha3_256(json_str_tx.encode()).hexdigest()
    handcrafted_transfer_tx["id"] = transfer_tx_id
    pm.transactions.send_commit(handcrafted_transfer_tx)
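The transaction-id computation repeated in both tests above (canonical JSON, then SHA3-256) can be reproduced with only the standard library; a minimal sketch, assuming `hashlib.sha3_256` produces the same digests as the `sha3` package used in the tests (both implement FIPS-202 SHA-3):

```python
import hashlib
import json


def compute_tx_id(tx: dict) -> str:
    # Serialize with the same canonical settings used in the tests above:
    # sorted keys, compact separators, raw (non-escaped) unicode.
    canonical = json.dumps(tx, sort_keys=True, separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha3_256(canonical.encode()).hexdigest()


# A toy transaction skeleton; real transactions carry outputs/inputs too.
tx = {"operation": "CREATE", "asset": {"data": {"n": 1}}, "metadata": None, "version": "2.0", "id": None}
tx["id"] = compute_tx_id(tx)
print(len(tx["id"]))  # 64 hex characters
```

The id is computed while `"id"` is still `None`, then written back, exactly as `handcrafted_dw_tx["id"] = dw_creation_txid` does above.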


@@ -38,9 +38,7 @@ def test_zenroom_signing(
        )
    )
    zenroomscpt = ZenroomSha256(script=fulfill_script_zencode, data=zenroom_data, keys=zen_public_keys)
    print(f"zenroom is: {zenroomscpt.script}")
    # CRYPTO-CONDITIONS: generate the condition uri
# CRYPTO-CONDITIONS: generate the condition uri # CRYPTO-CONDITIONS: generate the condition uri


@@ -15,19 +15,19 @@ def edit_genesis() -> None:
    for file_name in file_names:
        file = open(file_name)
        genesis = json.load(file)
        validators.extend(genesis["validators"])
        file.close()
    genesis_file = open(file_names[0])
    genesis_json = json.load(genesis_file)
    genesis_json["validators"] = validators
    genesis_file.close()
    with open("/shared/genesis.json", "w") as f:
        json.dump(genesis_json, f, indent=True)
    return None
if __name__ == "__main__":
    edit_genesis()
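The merge logic of `edit_genesis` can be tried outside the container; a sketch under the assumption that each node's genesis file carries a `validators` list (the helper name and the temp-file setup are invented, the real script reads fixed paths and writes `/shared/genesis.json`):

```python
import json
import os
import tempfile


def merge_validators(file_names):
    # Collect the validator sets of every node's genesis file, then build a
    # combined genesis based on the first file, as edit_genesis() does above.
    validators = []
    for file_name in file_names:
        with open(file_name) as f:
            validators.extend(json.load(f)["validators"])
    with open(file_names[0]) as f:
        genesis_json = json.load(f)
    genesis_json["validators"] = validators
    return genesis_json


# Hypothetical two-node setup: each node brings one validator.
paths = []
for name in ("node0", "node1"):
    fd, path = tempfile.mkstemp(suffix=".json")
    with os.fdopen(fd, "w") as f:
        json.dump({"chain_id": "test", "validators": [{"name": name}]}, f)
    paths.append(path)

merged = merge_validators(paths)
print(len(merged["validators"]))  # 2
```

Using `with open(...)` also avoids the manual `file.close()` calls in the original.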


@@ -31,25 +31,27 @@ import re
from dateutil.parser import parse
lineformat = re.compile(
    r"(?P<ipaddress>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}) - - "
    r"\[(?P<dateandtime>\d{2}\/[a-z]{3}\/\d{4}:\d{2}:\d{2}:\d{2} "
    r'(\+|\-)\d{4})\] ((\"(GET|POST) )(?P<url>.+)(http\/1\.1")) '
    r"(?P<statuscode>\d{3}) "
    r"(?P<bytessent>\d+) "
    r'(["](?P<refferer>(\-)|(.+))["]) '
    r'(["](?P<useragent>.+)["])',
    re.IGNORECASE,
)
filepath = sys.argv[1]
logline_list = []
with open(filepath) as csvfile:
    csvreader = csv.reader(csvfile, delimiter=",")
    for row in csvreader:
        if row and (row[8] != "LogEntry"):
            # because the first line is just the column headers, such as 'LogEntry'
            logline = row[8]
            print(logline + "\n")
            logline_data = re.search(lineformat, logline)
            if logline_data:
                logline_dict = logline_data.groupdict()
@@ -63,20 +65,19 @@ total_bytes_sent = 0
tstamp_list = []
for lldict in logline_list:
    total_bytes_sent += int(lldict["bytessent"])
    dt = lldict["dateandtime"]
    # https://tinyurl.com/lqjnhot
    dtime = parse(dt[:11] + " " + dt[12:])
    tstamp_list.append(dtime.timestamp())
print("Number of log lines seen: {}".format(len(logline_list)))
# Time range
trange_sec = max(tstamp_list) - min(tstamp_list)
trange_days = trange_sec / 60.0 / 60.0 / 24.0
print("Time range seen (days): {}".format(trange_days))
print("Total bytes sent: {}".format(total_bytes_sent))
print("Average bytes sent per day (out via GET): {}".format(total_bytes_sent / trange_days))
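The `lineformat` pattern can be checked against a single log line without the CSV plumbing; a self-contained sketch with a made-up sample line (only stdlib `re`, no `dateutil`):

```python
import re

# Same pattern as in the script above, rebuilt here so the sketch stands alone.
lineformat = re.compile(
    r"(?P<ipaddress>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}) - - "
    r"\[(?P<dateandtime>\d{2}\/[a-z]{3}\/\d{4}:\d{2}:\d{2}:\d{2} "
    r'(\+|\-)\d{4})\] ((\"(GET|POST) )(?P<url>.+)(http\/1\.1")) '
    r"(?P<statuscode>\d{3}) "
    r"(?P<bytessent>\d+) "
    r'(["](?P<refferer>(\-)|(.+))["]) '
    r'(["](?P<useragent>.+)["])',
    re.IGNORECASE,
)

# Hypothetical nginx-style access-log line.
sample = '127.0.0.1 - - [10/Oct/2020:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 1024 "-" "curl/7.68.0"'
match = re.search(lineformat, sample)
fields = match.groupdict()
print(fields["ipaddress"], fields["statuscode"], fields["bytessent"])  # 127.0.0.1 200 1024
```

`re.IGNORECASE` is what lets the lowercase `[a-z]{3}` month and `http\/1\.1` match `Oct` and `HTTP/1.1` in real log lines.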


@@ -14,15 +14,16 @@ from planetmint.backend.exceptions import ConnectionError
from planetmint.transactions.common.exceptions import ConfigurationError
BACKENDS = {
    "tarantool_db": "planetmint.backend.tarantool.connection.TarantoolDBConnection",
    "localmongodb": "planetmint.backend.localmongodb.connection.LocalMongoDBConnection",
}
logger = logging.getLogger(__name__)
def connect(
    host: str = None, port: int = None, login: str = None, password: str = None, backend: str = None, **kwargs
):
    try:
        backend = backend
        if not backend and kwargs and kwargs.get("backend"):
@@ -37,40 +38,57 @@ def connect(host: str = None, port: int = None, login: str = None, password: str
            raise ConfigurationError
    host = host or Config().get()["database"]["host"] if not kwargs.get("host") else kwargs["host"]
    port = port or Config().get()["database"]["port"] if not kwargs.get("port") else kwargs["port"]
    login = login or Config().get()["database"]["login"] if not kwargs.get("login") else kwargs["login"]
    password = password or Config().get()["database"]["password"]
    try:
        if backend == "tarantool_db":
            modulepath, _, class_name = BACKENDS[backend].rpartition(".")
            Class = getattr(import_module(modulepath), class_name)
            return Class(host=host, port=port, user=login, password=password, kwargs=kwargs)
        elif backend == "localmongodb":
            modulepath, _, class_name = BACKENDS[backend].rpartition(".")
            Class = getattr(import_module(modulepath), class_name)
            dbname = _kwargs_parser(key="name", kwargs=kwargs) or Config().get()["database"]["name"]
            replicaset = _kwargs_parser(key="replicaset", kwargs=kwargs) or Config().get()["database"]["replicaset"]
            ssl = _kwargs_parser(key="ssl", kwargs=kwargs) or Config().get()["database"]["ssl"]
            login = (
                login or Config().get()["database"]["login"]
                if _kwargs_parser(key="login", kwargs=kwargs) is None
                else _kwargs_parser(key="login", kwargs=kwargs)  # noqa: E501
            )
            password = (
                password or Config().get()["database"]["password"]
                if _kwargs_parser(key="password", kwargs=kwargs) is None
                else _kwargs_parser(key="password", kwargs=kwargs)  # noqa: E501
            )
            ca_cert = _kwargs_parser(key="ca_cert", kwargs=kwargs) or Config().get()["database"]["ca_cert"]
            certfile = _kwargs_parser(key="certfile", kwargs=kwargs) or Config().get()["database"]["certfile"]
            keyfile = _kwargs_parser(key="keyfile", kwargs=kwargs) or Config().get()["database"]["keyfile"]
            keyfile_passphrase = (
                _kwargs_parser(key="keyfile_passphrase", kwargs=kwargs)
                or Config().get()["database"]["keyfile_passphrase"]
            )
            crlfile = _kwargs_parser(key="crlfile", kwargs=kwargs) or Config().get()["database"]["crlfile"]
            max_tries = _kwargs_parser(key="max_tries", kwargs=kwargs)
            connection_timeout = _kwargs_parser(key="connection_timeout", kwargs=kwargs)
            return Class(
                host=host,
                port=port,
                dbname=dbname,
                max_tries=max_tries,
                connection_timeout=connection_timeout,
                replicaset=replicaset,
                ssl=ssl,
                login=login,
                password=password,
                ca_cert=ca_cert,
                certfile=certfile,
                keyfile=keyfile,
                keyfile_passphrase=keyfile_passphrase,
                crlfile=crlfile,
            )
    except tarantool.error.NetworkError as network_err:
        print(f"Host {host}:{port} can't be reached.\n{network_err}")
        raise network_err
@@ -81,15 +99,14 @@ def _kwargs_parser(key, kwargs):
        return kwargs[key]
    return None
class Connection:
    """Connection class interface.
    All backend implementations should provide a connection class that inherits
    from and implements this class.
    """
    def __init__(self, host=None, port=None, dbname=None, connection_timeout=None, max_tries=None, **kwargs):
        """Create a new :class:`~.Connection` instance.
        Args:
            host (str): the host to connect to.
@@ -104,14 +121,15 @@ class Connection:
                configuration's ``database`` settings
        """
        dbconf = Config().get()["database"]
        self.host = host or dbconf["host"]
        self.port = port or dbconf["port"]
        self.dbname = dbname or dbconf["name"]
        self.connection_timeout = (
            connection_timeout if connection_timeout is not None else dbconf["connection_timeout"]
        )
        self.max_tries = max_tries if max_tries is not None else dbconf["max_tries"]
        self.max_tries_counter = range(self.max_tries) if self.max_tries != 0 else repeat(0)
        self._conn = None
@@ -149,11 +167,16 @@ class Connection:
        try:
            self._conn = self._connect()
        except ConnectionError as exc:
            logger.warning(
                "Attempt %s/%s. Connection to %s:%s failed after %sms.",
                attempt,
                self.max_tries if self.max_tries != 0 else "",
                self.host,
                self.port,
                self.connection_timeout,
            )
            if attempt == self.max_tries:
                logger.critical("Cannot connect to the Database. Giving up.")
                raise ConnectionError() from exc
        else:
            break
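The `max_tries_counter` line above encodes a small trick: `max_tries == 0` means "retry forever" via an infinite iterator. A standalone sketch of that choice (the helper name `attempts` is invented):

```python
from itertools import islice, repeat


def attempts(max_tries: int):
    # Mirrors Connection.max_tries_counter: a finite range when max_tries is
    # set, an endless stream of attempt indices (repeat(0)) when it is 0.
    return range(max_tries) if max_tries != 0 else repeat(0)


print(list(attempts(3)))  # [0, 1, 2]
print(list(islice(attempts(0), 5)))  # [0, 0, 0, 0, 0]
```

This is also why the warning above formats the attempt count as `%s/%s` with an empty string when `max_tries` is 0: there is no finite total to report.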


@@ -8,20 +8,28 @@ from ssl import CERT_REQUIRED
import pymongo
from planetmint.config import Config
from planetmint.backend.exceptions import DuplicateKeyError, OperationError, ConnectionError
from planetmint.transactions.common.exceptions import ConfigurationError
from planetmint.utils import Lazy
from planetmint.backend.connection import Connection
logger = logging.getLogger(__name__)
class LocalMongoDBConnection(Connection):
    def __init__(
        self,
        replicaset=None,
        ssl=None,
        login=None,
        password=None,
        ca_cert=None,
        certfile=None,
        keyfile=None,
        keyfile_passphrase=None,
        crlfile=None,
        **kwargs,
    ):
        """Create a new Connection instance.
        Args:
@@ -32,15 +40,15 @@ class LocalMongoDBConnection(Connection):
        """
        super().__init__(**kwargs)
        self.replicaset = replicaset or Config().get()["database"]["replicaset"]
        self.ssl = ssl if ssl is not None else Config().get()["database"]["ssl"]
        self.login = login or Config().get()["database"]["login"]
        self.password = password or Config().get()["database"]["password"]
        self.ca_cert = ca_cert or Config().get()["database"]["ca_cert"]
        self.certfile = certfile or Config().get()["database"]["certfile"]
        self.keyfile = keyfile or Config().get()["database"]["keyfile"]
        self.keyfile_passphrase = keyfile_passphrase or Config().get()["database"]["keyfile_passphrase"]
        self.crlfile = crlfile or Config().get()["database"]["crlfile"]
        if not self.ssl:
            self.ssl = False
        if not self.keyfile_passphrase:
@@ -66,15 +74,14 @@ class LocalMongoDBConnection(Connection):
        try:
            return query.run(self.conn)
        except pymongo.errors.AutoReconnect:
            logger.warning("Lost connection to the database, retrying query.")
            return query.run(self.conn)
        except pymongo.errors.AutoReconnect as exc:
            raise ConnectionError from exc
        except pymongo.errors.DuplicateKeyError as exc:
            raise DuplicateKeyError from exc
        except pymongo.errors.OperationFailure as exc:
            print(f"DETAILS: {exc.details}")
            raise OperationError from exc
    def _connect(self):
@@ -95,19 +102,21 @@ class LocalMongoDBConnection(Connection):
        # `ConnectionFailure`.
        # The presence of ca_cert, certfile, keyfile, crlfile implies the
        # use of certificates for TLS connectivity.
        if self.ca_cert is None or self.certfile is None or self.keyfile is None or self.crlfile is None:
            client = pymongo.MongoClient(
                self.host,
self.port, self.port,
replicaset=self.replicaset, replicaset=self.replicaset,
serverselectiontimeoutms=self.connection_timeout, serverselectiontimeoutms=self.connection_timeout,
ssl=self.ssl, ssl=self.ssl,
**MONGO_OPTS) **MONGO_OPTS,
)
if self.login is not None and self.password is not None: if self.login is not None and self.password is not None:
client[self.dbname].authenticate(self.login, self.password) client[self.dbname].authenticate(self.login, self.password)
else: else:
logger.info('Connecting to MongoDB over TLS/SSL...') logger.info("Connecting to MongoDB over TLS/SSL...")
client = pymongo.MongoClient(self.host, client = pymongo.MongoClient(
self.host,
self.port, self.port,
replicaset=self.replicaset, replicaset=self.replicaset,
serverselectiontimeoutms=self.connection_timeout, serverselectiontimeoutms=self.connection_timeout,
@ -118,21 +127,20 @@ class LocalMongoDBConnection(Connection):
ssl_pem_passphrase=self.keyfile_passphrase, ssl_pem_passphrase=self.keyfile_passphrase,
ssl_crlfile=self.crlfile, ssl_crlfile=self.crlfile,
ssl_cert_reqs=CERT_REQUIRED, ssl_cert_reqs=CERT_REQUIRED,
**MONGO_OPTS) **MONGO_OPTS,
)
if self.login is not None: if self.login is not None:
client[self.dbname].authenticate(self.login, client[self.dbname].authenticate(self.login, mechanism="MONGODB-X509")
mechanism='MONGODB-X509')
return client return client
except (pymongo.errors.ConnectionFailure, except (pymongo.errors.ConnectionFailure, pymongo.errors.OperationFailure) as exc:
pymongo.errors.OperationFailure) as exc: logger.info("Exception in _connect(): {}".format(exc))
logger.info('Exception in _connect(): {}'.format(exc))
raise ConnectionError(str(exc)) from exc raise ConnectionError(str(exc)) from exc
except pymongo.errors.ConfigurationError as exc: except pymongo.errors.ConfigurationError as exc:
raise ConfigurationError from exc raise ConfigurationError from exc
MONGO_OPTS = { MONGO_OPTS = {
'socketTimeoutMS': 20000, "socketTimeoutMS": 20000,
} }
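The `__init__` above mixes two fallback idioms: `value or Config()...` for most options, but `ssl if ssl is not None else ...` for the boolean. The difference matters, since `or` also discards falsy-but-meaningful values. A minimal sketch (names and `DEFAULTS` table are illustrative, not from the codebase):

```python
# Hypothetical defaults standing in for Config().get()["database"].
DEFAULTS = {"login": "root", "ssl": True}


def pick_truthy(value, key):
    # `value or default` replaces ANY falsy value (None, "", 0, False).
    return value or DEFAULTS[key]


def pick_not_none(value, key):
    # `value if value is not None else default` only replaces None,
    # so an explicit False survives -- needed for boolean flags like ssl.
    return value if value is not None else DEFAULTS[key]


print(pick_truthy("", "login"))     # falls back to "root"
print(pick_not_none(False, "ssl"))  # keeps the explicit False
```

This is why `ssl` gets the `is not None` form: with plain `or`, passing `ssl=False` would silently re-enable the configured default.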


@@ -15,11 +15,10 @@ register_query = module_dispatch_registrar(convert)
@register_query(LocalMongoDBConnection)
def prepare_asset(connection, transaction_type, transaction_id, filter_operation, asset):
    if transaction_type == filter_operation:
        asset["id"] = transaction_id
    return asset


@register_query(LocalMongoDBConnection)
def prepare_metadata(connection, transaction_id, metadata):
    return {"id": transaction_id, "metadata": metadata}
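The `@register_query(LocalMongoDBConnection)` decorator registers each function for one connection class; the real helper lives in the backend utilities, but the pattern is essentially `functools.singledispatch` keyed on the connection argument. A minimal sketch with a stand-in connection class:

```python
from functools import singledispatch

# Sketch of the dispatch pattern behind register_query/module_dispatch_registrar.
# FakeMongoConnection is a hypothetical stand-in for LocalMongoDBConnection.


@singledispatch
def prepare_metadata(connection, transaction_id, metadata):
    # Generic fallback when no backend is registered for this connection type.
    raise NotImplementedError("no backend registered for this connection type")


class FakeMongoConnection:
    pass


@prepare_metadata.register(FakeMongoConnection)
def _(connection, transaction_id, metadata):
    return {"id": transaction_id, "metadata": metadata}


doc = prepare_metadata(FakeMongoConnection(), "tx1", {"k": "v"})
print(doc)  # {'id': 'tx1', 'metadata': {'k': 'v'}}
```

Calling the function with an unregistered connection type falls through to the generic implementation, which is how the backend-agnostic modules below raise `NotImplementedError`.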


@@ -1,4 +1,5 @@
from functools import singledispatch

# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
@@ -19,104 +20,80 @@ register_query = module_dispatch_registrar(backend.query)
@register_query(LocalMongoDBConnection)
def store_transactions(conn, signed_transactions):
    return conn.run(conn.collection("transactions").insert_many(signed_transactions))


@register_query(LocalMongoDBConnection)
def get_transaction(conn, transaction_id):
    return conn.run(conn.collection("transactions").find_one({"id": transaction_id}, {"_id": 0}))


@register_query(LocalMongoDBConnection)
def get_transactions(conn, transaction_ids):
    try:
        return conn.run(
            conn.collection("transactions").find({"id": {"$in": transaction_ids}}, projection={"_id": False})
        )
    except IndexError:
        pass


@register_query(LocalMongoDBConnection)
def store_metadatas(conn, metadata):
    return conn.run(conn.collection("metadata").insert_many(metadata, ordered=False))


@register_query(LocalMongoDBConnection)
def get_metadata(conn, transaction_ids):
    return conn.run(conn.collection("metadata").find({"id": {"$in": transaction_ids}}, projection={"_id": False}))


@register_query(LocalMongoDBConnection)
def store_asset(conn, asset):
    try:
        return conn.run(conn.collection("assets").insert_one(asset))
    except DuplicateKeyError:
        pass


@register_query(LocalMongoDBConnection)
def store_assets(conn, assets):
    return conn.run(conn.collection("assets").insert_many(assets, ordered=False))


@register_query(LocalMongoDBConnection)
def get_asset(conn, asset_id):
    try:
        return conn.run(conn.collection("assets").find_one({"id": asset_id}, {"_id": 0, "id": 0}))
    except IndexError:
        pass


@register_query(LocalMongoDBConnection)
def get_assets(conn, asset_ids):
    return conn.run(conn.collection("assets").find({"id": {"$in": asset_ids}}, projection={"_id": False}))


@register_query(LocalMongoDBConnection)
def get_spent(conn, transaction_id, output):
    query = {
        "inputs": {
            "$elemMatch": {"$and": [{"fulfills.transaction_id": transaction_id}, {"fulfills.output_index": output}]}
        }
    }

    return conn.run(conn.collection("transactions").find(query, {"_id": 0}))


@register_query(LocalMongoDBConnection)
def get_latest_block(conn):
    return conn.run(conn.collection("blocks").find_one(projection={"_id": False}, sort=[("height", DESCENDING)]))


@register_query(LocalMongoDBConnection)
def store_block(conn, block):
    try:
        return conn.run(conn.collection("blocks").insert_one(block))
    except DuplicateKeyError:
        pass
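The `$elemMatch` filter in `get_spent` is easy to misread: it matches a transaction only if a *single* input satisfies both conditions at once, not one condition on one input and the other on a different input. A pure-Python sketch of the same predicate (hypothetical helper, for illustration only):

```python
# What the `$elemMatch`/`$and` filter in get_spent() asks MongoDB to do,
# expressed over a plain dict: match if ANY one input fulfils both fields.


def spends_output(tx, transaction_id, output_index):
    return any(
        inp.get("fulfills", {}).get("transaction_id") == transaction_id
        and inp.get("fulfills", {}).get("output_index") == output_index
        for inp in tx.get("inputs", [])
    )


tx = {"inputs": [{"fulfills": {"transaction_id": "a", "output_index": 0}}]}
print(spends_output(tx, "a", 0))  # True
print(spends_output(tx, "a", 1))  # False
```

Without `$elemMatch`, a query like `{"inputs.fulfills.transaction_id": ..., "inputs.fulfills.output_index": ...}` would allow the two conditions to be satisfied by different array elements.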
@@ -125,32 +102,47 @@ def store_block(conn, block):
def get_txids_filtered(conn, asset_id, operation=None, last_tx=None):
    match = {
        Transaction.CREATE: {"operation": "CREATE", "id": asset_id},
        Transaction.TRANSFER: {"operation": "TRANSFER", "asset.id": asset_id},
        None: {"$or": [{"asset.id": asset_id}, {"id": asset_id}]},
    }[operation]

    cursor = conn.run(conn.collection("transactions").find(match))

    if last_tx:
        cursor = cursor.sort([("$natural", DESCENDING)]).limit(1)

    return (elem["id"] for elem in cursor)


@register_query(LocalMongoDBConnection)
def text_search(
    conn,
    search,
    *,
    language="english",
    case_sensitive=False,
    diacritic_sensitive=False,
    text_score=False,
    limit=0,
    table="assets"
):
    cursor = conn.run(
        conn.collection(table)
        .find(
            {
                "$text": {
                    "$search": search,
                    "$language": language,
                    "$caseSensitive": case_sensitive,
                    "$diacriticSensitive": diacritic_sensitive,
                }
            },
            {"score": {"$meta": "textScore"}, "_id": False},
        )
        .sort([("score", {"$meta": "textScore"})])
        .limit(limit)
    )

    if text_score:
        return cursor
@@ -159,58 +151,54 @@ def text_search(conn, search, *, language='english', case_sensitive=False,
def _remove_text_score(asset):
    asset.pop("score", None)
    return asset


@register_query(LocalMongoDBConnection)
def get_owned_ids(conn, owner):
    cursor = conn.run(
        conn.collection("transactions").aggregate(
            [{"$match": {"outputs.public_keys": owner}}, {"$project": {"_id": False}}]
        )
    )
    return cursor


@register_query(LocalMongoDBConnection)
def get_spending_transactions(conn, inputs):
    transaction_ids = [i["transaction_id"] for i in inputs]
    output_indexes = [i["output_index"] for i in inputs]
    query = {
        "inputs": {
            "$elemMatch": {
                "$and": [
                    {"fulfills.transaction_id": {"$in": transaction_ids}},
                    {"fulfills.output_index": {"$in": output_indexes}},
                ]
            }
        }
    }

    cursor = conn.run(conn.collection("transactions").find(query, {"_id": False}))
    return cursor


@register_query(LocalMongoDBConnection)
def get_block(conn, block_id):
    return conn.run(conn.collection("blocks").find_one({"height": block_id}, projection={"_id": False}))


@register_query(LocalMongoDBConnection)
def get_block_with_transaction(conn, txid):
    return conn.run(conn.collection("blocks").find({"transactions": txid}, projection={"_id": False, "height": True}))


@register_query(LocalMongoDBConnection)
def delete_transactions(conn, txn_ids):
    conn.run(conn.collection("assets").delete_many({"id": {"$in": txn_ids}}))
    conn.run(conn.collection("metadata").delete_many({"id": {"$in": txn_ids}}))
    conn.run(conn.collection("transactions").delete_many({"id": {"$in": txn_ids}}))


@register_query(LocalMongoDBConnection)
@@ -218,7 +206,7 @@ def store_unspent_outputs(conn, *unspent_outputs):
    if unspent_outputs:
        try:
            return conn.run(
                conn.collection("utxos").insert_many(
                    unspent_outputs,
                    ordered=False,
                )
@@ -232,14 +220,19 @@ def store_unspent_outputs(conn, *unspent_outputs):
def delete_unspent_outputs(conn, *unspent_outputs):
    if unspent_outputs:
        return conn.run(
            conn.collection("utxos").delete_many(
                {
                    "$or": [
                        {
                            "$and": [
                                {"transaction_id": unspent_output["transaction_id"]},
                                {"output_index": unspent_output["output_index"]},
                            ],
                        }
                        for unspent_output in unspent_outputs
                    ]
                }
            )
        )
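`delete_unspent_outputs` builds one `$and` pair per UTXO and combines them under a single `$or`, so one `delete_many` round-trip removes exactly the listed outputs. The filter construction can be isolated into a small helper (hypothetical name, for illustration):

```python
# Sketch of the filter document delete_unspent_outputs() sends to MongoDB:
# each UTXO contributes a ($and of transaction_id + output_index) clause,
# and the clauses are OR-ed together.


def build_utxo_filter(unspent_outputs):
    return {
        "$or": [
            {
                "$and": [
                    {"transaction_id": u["transaction_id"]},
                    {"output_index": u["output_index"]},
                ]
            }
            for u in unspent_outputs
        ]
    }


flt = build_utxo_filter([{"transaction_id": "t1", "output_index": 0}])
print(flt["$or"][0]["$and"][0])  # {'transaction_id': 't1'}
```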
@@ -247,51 +240,36 @@ def delete_unspent_outputs(conn, *unspent_outputs):
def get_unspent_outputs(conn, *, query=None):
    if query is None:
        query = {}
    return conn.run(conn.collection("utxos").find(query, projection={"_id": False}))


@register_query(LocalMongoDBConnection)
def store_pre_commit_state(conn, state):
    return conn.run(conn.collection("pre_commit").replace_one({}, state, upsert=True))


@register_query(LocalMongoDBConnection)
def get_pre_commit_state(connection):
    return connection.run(connection.collection("pre_commit").find_one())


@register_query(LocalMongoDBConnection)
def store_validator_set(conn, validators_update):
    height = validators_update["height"]
    return conn.run(conn.collection("validators").replace_one({"height": height}, validators_update, upsert=True))


@register_query(LocalMongoDBConnection)
def delete_validator_set(conn, height):
    return conn.run(conn.collection("validators").delete_many({"height": height}))


@register_query(LocalMongoDBConnection)
def store_election(conn, election_id, height, is_concluded):
    return conn.run(
        conn.collection("elections").replace_one(
            {"election_id": election_id, "height": height},
            {"election_id": election_id, "height": height, "is_concluded": is_concluded},
            upsert=True,
        )
    )
@@ -299,29 +277,22 @@ def store_election(conn, election_id, height, is_concluded):
@register_query(LocalMongoDBConnection)
def store_elections(conn, elections):
    return conn.run(conn.collection("elections").insert_many(elections))


@register_query(LocalMongoDBConnection)
def delete_elections(conn, height):
    return conn.run(conn.collection("elections").delete_many({"height": height}))


@register_query(LocalMongoDBConnection)
def get_validator_set(conn, height=None):
    query = {}
    if height is not None:
        query = {"height": {"$lte": height}}

    cursor = conn.run(
        conn.collection("validators").find(query, projection={"_id": False}).sort([("height", DESCENDING)]).limit(1)
    )

    return next(cursor, None)
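The `find(...).sort([("height", DESCENDING)]).limit(1)` chain in `get_validator_set` means "the newest record at or below the requested height", with `next(cursor, None)` handling the empty case. A pure-Python equivalent, as an illustration only:

```python
# What get_validator_set() computes, expressed over a plain list of dicts.


def latest_validator_set(records, height=None):
    eligible = [r for r in records if height is None or r["height"] <= height]
    eligible.sort(key=lambda r: r["height"], reverse=True)  # newest first
    return next(iter(eligible[:1]), None)  # None when nothing matches


records = [{"height": 1, "validators": ["a"]}, {"height": 5, "validators": ["b"]}]
print(latest_validator_set(records, height=4))  # {'height': 1, 'validators': ['a']}
```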
@@ -329,35 +300,27 @@ def get_validator_set(conn, height=None):
@register_query(LocalMongoDBConnection)
def get_election(conn, election_id):
    query = {"election_id": election_id}

    return conn.run(
        conn.collection("elections").find_one(query, projection={"_id": False}, sort=[("height", DESCENDING)])
    )


@register_query(LocalMongoDBConnection)
def get_asset_tokens_for_public_key(conn, asset_id, public_key):
    query = {"outputs.public_keys": [public_key], "asset.id": asset_id}

    cursor = conn.run(conn.collection("transactions").aggregate([{"$match": query}, {"$project": {"_id": False}}]))
    return cursor


@register_query(LocalMongoDBConnection)
def store_abci_chain(conn, height, chain_id, is_synced=True):
    return conn.run(
        conn.collection("abci_chains").replace_one(
            {"height": height},
            {"height": height, "chain_id": chain_id, "is_synced": is_synced},
            upsert=True,
        )
    )
@@ -365,14 +328,9 @@ def store_abci_chain(conn, height, chain_id, is_synced=True):
@register_query(LocalMongoDBConnection)
def delete_abci_chain(conn, height):
    return conn.run(conn.collection("abci_chains").delete_many({"height": height}))


@register_query(LocalMongoDBConnection)
def get_latest_abci_chain(conn):
    return conn.run(conn.collection("abci_chains").find_one(projection={"_id": False}, sort=[("height", DESCENDING)]))


@@ -20,48 +20,48 @@ register_schema = module_dispatch_registrar(backend.schema)
INDEXES = {
    "transactions": [
        ("id", dict(unique=True, name="transaction_id")),
        ("asset.id", dict(name="asset_id")),
        ("outputs.public_keys", dict(name="outputs")),
        (
            [("inputs.fulfills.transaction_id", ASCENDING), ("inputs.fulfills.output_index", ASCENDING)],
            dict(name="inputs"),
        ),
    ],
    "assets": [
        ("id", dict(name="asset_id", unique=True)),
        ([("$**", TEXT)], dict(name="text")),
    ],
    "blocks": [
        ([("height", DESCENDING)], dict(name="height", unique=True)),
    ],
    "metadata": [
        ("id", dict(name="transaction_id", unique=True)),
        ([("$**", TEXT)], dict(name="text")),
    ],
    "utxos": [
        ([("transaction_id", ASCENDING), ("output_index", ASCENDING)], dict(name="utxo", unique=True)),
    ],
    "pre_commit": [
        ("height", dict(name="height", unique=True)),
    ],
    "elections": [
        ([("height", DESCENDING), ("election_id", ASCENDING)], dict(name="election_id_height", unique=True)),
    ],
    "validators": [
        ("height", dict(name="height", unique=True)),
    ],
    "abci_chains": [
        ("height", dict(name="height", unique=True)),
        ("chain_id", dict(name="chain_id", unique=True)),
    ],
}


@register_schema(LocalMongoDBConnection)
def create_database(conn, dbname):
    logger.info("Create database `%s`.", dbname)
    # TODO: read and write concerns can be declared here
    conn.conn.get_database(dbname)
@@ -72,15 +72,15 @@ def create_tables(conn, dbname):
        # create the table
        # TODO: read and write concerns can be declared here
        try:
            logger.info(f"Create `{table_name}` table.")
            conn.conn[dbname].create_collection(table_name)
        except CollectionInvalid:
            logger.info(f"Collection {table_name} already exists.")
        create_indexes(conn, dbname, table_name, INDEXES[table_name])


def create_indexes(conn, dbname, collection, indexes):
    logger.info(f"Ensure secondary indexes for `{collection}`.")
    for fields, kwargs in indexes:
        conn.conn[dbname][collection].create_index(fields, **kwargs)
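Each entry in the `INDEXES` table above is a `(fields, kwargs)` pair that `create_indexes` passes straight to `create_index()`. The data-driven shape can be exercised without a live pymongo connection by substituting a stub collection (the stub and the trimmed sample table are illustrative):

```python
# Trimmed-down sample of the INDEXES table, consumed the same way
# create_indexes() does, but against a stub instead of pymongo.

INDEXES_SAMPLE = {
    "pre_commit": [("height", dict(name="height", unique=True))],
}


class StubCollection:
    def __init__(self):
        self.created = []

    def create_index(self, fields, **kwargs):
        # Record what a real pymongo Collection.create_index() would receive.
        self.created.append((fields, kwargs))


coll = StubCollection()
for fields, kwargs in INDEXES_SAMPLE["pre_commit"]:
    coll.create_index(fields, **kwargs)

print(coll.created)  # [('height', {'name': 'height', 'unique': True})]
```

Keeping the index definitions as data rather than code is what lets one loop create single-field, compound, and wildcard text indexes alike.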


@@ -215,8 +215,17 @@ def get_txids_filtered(connection, asset_id, operation=None):
@singledispatch
def text_search(
    conn,
    search,
    *,
    language="english",
    case_sensitive=False,
    diacritic_sensitive=False,
    text_score=False,
    limit=0,
    table=None
):
    """Return all the assets that match the text search.

    The results are sorted by text score.
@@ -243,8 +252,7 @@ def text_search(conn, search, *, language='english', case_sensitive=False,
        OperationError: If the backend does not support text search
    """
    raise OperationError("This query is only supported when running " "Planetmint with MongoDB as the backend.")


@singledispatch
@@ -384,8 +392,7 @@ def get_validator_set(conn, height):
@singledispatch
def get_election(conn, election_id):
    """Return the election record"""
    raise NotImplementedError
@@ -432,6 +439,5 @@ def get_latest_abci_chain(conn):
@singledispatch
def _group_transaction_by_ids(txids: list, connection):
    """Returns the transactions object (JSON TYPE), from list of ids."""
    raise NotImplementedError
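The bare `*` in the `text_search` signature makes every parameter after `search` keyword-only, so callers cannot scramble the long run of boolean flags by position. A minimal sketch of the same signature shape (stub function, not the real API):

```python
# Illustration of the keyword-only boundary used by text_search().


def text_search_stub(conn, search, *, language="english", limit=0, table=None):
    return {"search": search, "language": language, "limit": limit, "table": table}


print(text_search_stub(None, "spam", limit=5)["limit"])  # 5

try:
    text_search_stub(None, "spam", "english")  # positional flag is rejected
except TypeError as exc:
    print("rejected:", type(exc).__name__)
```

With six look-alike flags (`case_sensitive`, `text_score`, ...), forcing keywords turns a silent argument swap into an immediate `TypeError`.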


@@ -12,23 +12,74 @@ from planetmint.config import Config
from planetmint.backend.connection import connect
from planetmint.transactions.common.exceptions import ValidationError
from planetmint.transactions.common.utils import (
    validate_all_values_for_key_in_obj,
    validate_all_values_for_key_in_list,
)

logger = logging.getLogger(__name__)

# Tables/collections that every backend database must create
TABLES = (
    "transactions",
    "blocks",
    "assets",
    "metadata",
    "validators",
    "elections",
    "pre_commit",
    "utxos",
    "abci_chains",
)

SPACE_NAMES = (
    "abci_chains",
    "assets",
    "blocks",
    "blocks_tx",
    "elections",
    "meta_data",
    "pre_commits",
    "validators",
    "transactions",
    "inputs",
    "outputs",
    "keys",
    "utxos",
)

VALID_LANGUAGES = (
    "danish",
    "dutch",
    "english",
    "finnish",
    "french",
    "german",
    "hungarian",
    "italian",
    "norwegian",
    "portuguese",
    "romanian",
    "russian",
    "spanish",
    "swedish",
    "turkish",
    "none",
    "da",
    "nl",
    "en",
    "fi",
    "fr",
    "de",
    "hu",
    "it",
    "nb",
    "pt",
    "ro",
    "ru",
    "es",
    "sv",
    "tr",
)
@singledispatch
@@ -84,7 +135,7 @@ def init_database(connection=None, dbname=None):
    """
    connection = connection or connect()
    dbname = dbname or Config().get()["database"]["name"]

    create_database(connection, dbname)
    create_tables(connection, dbname)
@@ -102,14 +153,14 @@ def validate_language_key(obj, key):
    Raises:
        ValidationError: will raise exception in case language is not valid.
    """
    backend = Config().get()["database"]["backend"]

    if backend == "localmongodb":
        data = obj.get(key, {})
        if isinstance(data, dict):
            validate_all_values_for_key_in_obj(data, "language", validate_language)
        elif isinstance(data, list):
            validate_all_values_for_key_in_list(data, "language", validate_language)


def validate_language(value):
@@ -126,8 +177,10 @@ def validate_language(value):
        ValidationError: will raise exception in case language is not valid.
    """
    if value not in VALID_LANGUAGES:
        error_str = (
            "MongoDB does not support text search for the "
            'language "{}". If you do not understand this error '
            'message then please rename key/field "language" to '
            'something else like "lang".'
        ).format(value)
        raise ValidationError(error_str)
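The membership check above is the whole validation: anything outside `VALID_LANGUAGES` is rejected with a descriptive `ValidationError`. A self-contained sketch against a trimmed-down language table (the stand-in exception class and shortened tuple are assumptions; the full tuple lives in the module):

```python
# Sketch of validate_language() with a reduced VALID_LANGUAGES sample.

VALID = ("english", "german", "none", "en", "de")


class ValidationError(Exception):  # stand-in for the planetmint exception
    pass


def validate_language(value):
    if value not in VALID:
        raise ValidationError(
            'MongoDB does not support text search for the language "{}". '
            'If you do not understand this error message then please rename '
            'key/field "language" to something else like "lang".'.format(value)
        )


validate_language("en")  # accepted, returns None
try:
    validate_language("klingon")
except ValidationError:
    print("rejected")
```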


@@ -16,11 +16,10 @@ register_query = module_dispatch_registrar(convert)
def prepare_asset(connection, transaction_type, transaction_id, filter_operation, asset):
    asset_id = transaction_id
    if transaction_type != filter_operation:
        asset_id = asset["id"]
    return tuple([asset, transaction_id, asset_id])


@register_query(TarantoolDBConnection)
def prepare_metadata(connection, transaction_id, metadata):
    return {"id": transaction_id, "metadata": metadata}


@@ -57,40 +57,22 @@ def store_transactions(connection, signed_transactions: list):
        txprepare = TransactionDecompose(transaction)
        txtuples = txprepare.convert_to_tuple()
        try:
            connection.run(connection.space("transactions").insert(txtuples["transactions"]), only_data=False)
        except:  # This is used for omitting duplicate error in database for test -> test_bigchain_api::test_double_inclusion  # noqa: E501, E722
            continue
        for _in in txtuples["inputs"]:
            connection.run(connection.space("inputs").insert(_in), only_data=False)
        for _out in txtuples["outputs"]:
            connection.run(connection.space("outputs").insert(_out), only_data=False)
        for _key in txtuples["keys"]:
            connection.run(connection.space("keys").insert(_key), only_data=False)
        if txtuples["metadata"] is not None:
            connection.run(connection.space("meta_data").insert(txtuples["metadata"]), only_data=False)
        if txtuples["asset"] is not None:
            connection.run(connection.space("assets").insert(txtuples["asset"]), only_data=False)


@register_query(TarantoolDBConnection)
@@ -110,7 +92,8 @@ def store_metadatas(connection, metadata: list):
    for meta in metadata:
        connection.run(
            connection.space("meta_data").insert(
                (meta["id"], json.dumps(meta["data"] if not "metadata" in meta else meta["metadata"]))
            )  # noqa: E713
        )
@@ -118,9 +101,7 @@ def store_metadatas(connection, metadata: list):
def get_metadata(connection, transaction_ids: list):
    _returned_data = []
    for _id in transaction_ids:
        metadata = connection.run(connection.space("meta_data").select(_id, index="id_search"))
        if metadata is not None:
            if len(metadata) > 0:
                metadata[0] = list(metadata[0])
@@ -139,14 +120,13 @@ def store_asset(connection, asset):
            return tuple(obj)
        else:
            return (json.dumps(obj), obj["id"], obj["id"])

    try:
        return connection.run(connection.space("assets").insert(convert(asset)), only_data=False)
    except DatabaseError:
        pass


@register_query(TarantoolDBConnection)
def store_assets(connection, assets: list):
    for asset in assets:
@@ -155,9 +135,7 @@ def store_assets(connection, assets: list):
@register_query(TarantoolDBConnection)
def get_asset(connection, asset_id: str):
    _data = connection.run(connection.space("assets").select(asset_id, index="txid_search"))
    return json.loads(_data[0][0]) if len(_data) > 0 else []
@@ -166,9 +144,7 @@ def get_asset(connection, asset_id: str):
def get_assets(connection, assets_ids: list) -> list:
    _returned_data = []
    for _id in list(set(assets_ids)):
        res = connection.run(connection.space("assets").select(_id, index="txid_search"))
        _returned_data.append(res[0])

    sorted_assets = sorted(_returned_data, key=lambda k: k[1], reverse=False)
@@ -186,17 +162,13 @@ def get_spent(connection, fullfil_transaction_id: str, fullfil_output_index: str
@register_query(TarantoolDBConnection)
def get_latest_block(connection):  # TODO Here is used DESCENDING OPERATOR
    _all_blocks = connection.run(connection.space("blocks").select())
    block = {"app_hash": "", "height": 0, "transactions": []}
    if _all_blocks is not None:
        if len(_all_blocks) > 0:
            _block = sorted(_all_blocks, key=itemgetter(1), reverse=True)[0]
            _txids = connection.run(connection.space("blocks_tx").select(_block[2], index="block_search"))
            block["app_hash"] = _block[0]
            block["height"] = _block[1]
            block["transactions"] = [tx[0] for tx in _txids]
@@ -209,27 +181,22 @@ def get_latest_block(connection):  # TODO Here is used DESCENDING OPERATOR
def store_block(connection, block: dict):
    block_unique_id = token_hex(8)
    connection.run(
        connection.space("blocks").insert((block["app_hash"], block["height"], block_unique_id)), only_data=False
    )
    for txid in block["transactions"]:
        connection.run(connection.space("blocks_tx").insert((txid, block_unique_id)), only_data=False)
@register_query(TarantoolDBConnection)
def get_txids_filtered(
    connection, asset_id: str, operation: str = None, last_tx: any = None
):  # TODO here is used 'OR' operator
    actions = {
        "CREATE": {"sets": ["CREATE", asset_id], "index": "transaction_search"},
        # 1 - operation, 2 - id (only in transactions) +
        "TRANSFER": {"sets": ["TRANSFER", asset_id], "index": "transaction_search"},
        # 1 - operation, 2 - asset.id (linked mode) + OPERATOR OR
        None: {"sets": [asset_id, asset_id]},
    }[operation]
    _transactions = []
    if actions["sets"][0] == "CREATE":  # +
@@ -237,9 +204,7 @@ def get_txids_filtered(connection, asset_id: str, operation: str = None,
            connection.space("transactions").select([operation, asset_id], index=actions["index"])
        )
    elif actions["sets"][0] == "TRANSFER":  # +
        _assets = connection.run(connection.space("assets").select([asset_id], index="only_asset_search"))
        for asset in _assets:
            _txid = asset[1]
            _transactions = connection.run(
@@ -248,12 +213,8 @@ def get_txids_filtered(connection, asset_id: str, operation: str = None,
            if len(_transactions) != 0:
                break
    else:
        _tx_ids = connection.run(connection.space("transactions").select([asset_id], index="id_search"))
        _assets_ids = connection.run(connection.space("assets").select([asset_id], index="only_asset_search"))
        return tuple(set([sublist[1] for sublist in _assets_ids] + [sublist[0] for sublist in _tx_ids]))

    if last_tx:
@@ -261,43 +222,34 @@ def get_txids_filtered(connection, asset_id: str, operation: str = None,
    return tuple([elem[0] for elem in _transactions])
@register_query(TarantoolDBConnection)
def text_search(conn, search, table="assets", limit=0):
    pattern = ".{}.".format(search)
    field_no = 1 if table == "assets" else 2  # 2 for meta_data
    res = conn.run(conn.space(table).call("indexed_pattern_search", (table, field_no, pattern)))

    to_return = []

    if len(res[0]):  # NEEDS BEAUTIFICATION
        if table == "assets":
            for result in res[0]:
                to_return.append({"data": json.loads(result[0])["data"], "id": result[1]})
        else:
            for result in res[0]:
                to_return.append({"metadata": json.loads(result[1]), "id": result[0]})

    return to_return if limit == 0 else to_return[:limit]
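The trailing conditional in `text_search` treats `limit=0` as "no limit" rather than "return nothing". A tiny stand-alone sketch of that slicing convention (with a hypothetical `apply_limit` helper that is not part of the module):

```python
# limit == 0 means "return everything"; any positive limit slices the results.
def apply_limit(results, limit=0):
    return results if limit == 0 else results[:limit]


rows = [{"id": i} for i in range(5)]
assert apply_limit(rows) == rows          # default: no truncation
assert apply_limit(rows, limit=2) == rows[:2]  # keep only the first two hits
```

Note that a negative `limit` would slice from the end (`results[:-1]` drops the last element), so callers are expected to pass zero or a positive count.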
def _remove_text_score(asset):
    asset.pop("score", None)
    return asset
@register_query(TarantoolDBConnection)
def get_owned_ids(connection, owner: str):
    _keys = connection.run(connection.space("keys").select(owner, index="keys_search"))
    if _keys is None or len(_keys) == 0:
        return []
    _transactionids = list(set([key[1] for key in _keys]))
@@ -310,9 +262,11 @@ def get_spending_transactions(connection, inputs):
    _transactions = []
    for inp in inputs:
        _trans_list = get_spent(
            fullfil_transaction_id=inp["transaction_id"],
            fullfil_output_index=inp["output_index"],
            connection=connection,
        )
        _transactions.extend(_trans_list)
    return _transactions
@@ -320,28 +274,20 @@ def get_spending_transactions(connection, inputs):
@register_query(TarantoolDBConnection)
def get_block(connection, block_id=[]):
    _block = connection.run(connection.space("blocks").select(block_id, index="block_search", limit=1))
    if _block is None or len(_block) == 0:
        return []
    _block = _block[0]
    _txblock = connection.run(connection.space("blocks_tx").select(_block[2], index="block_search"))
    return {"app_hash": _block[0], "height": _block[1], "transactions": [_tx[0] for _tx in _txblock]}
@register_query(TarantoolDBConnection)
def get_block_with_transaction(connection, txid: str):
    _all_blocks_tx = connection.run(connection.space("blocks_tx").select(txid, index="id_search"))
    if _all_blocks_tx is None or len(_all_blocks_tx) == 0:
        return []
    _block = connection.run(connection.space("blocks").select(_all_blocks_tx[0][1], index="block_id_search"))
    return [{"height": _height[1]} for _height in _block]
@@ -373,7 +319,7 @@ def store_unspent_outputs(connection, *unspent_outputs: list):
    if unspent_outputs:
        for utxo in unspent_outputs:
            output = connection.run(
                connection.space("utxos").insert((utxo["transaction_id"], utxo["output_index"], dumps(utxo)))
            )
            result.append(output)
    return result
@@ -384,42 +330,36 @@ def delete_unspent_outputs(connection, *unspent_outputs: list):
    result = []
    if unspent_outputs:
        for utxo in unspent_outputs:
            output = connection.run(connection.space("utxos").delete((utxo["transaction_id"], utxo["output_index"])))
            result.append(output)
    return result


@register_query(TarantoolDBConnection)
def get_unspent_outputs(connection, query=None):  # for now we don't have implementation for 'query'.
    _utxos = connection.run(connection.space("utxos").select([]))
    return [loads(utx[2]) for utx in _utxos]
@register_query(TarantoolDBConnection)
def store_pre_commit_state(connection, state: dict):
    _precommit = connection.run(connection.space("pre_commits").select([], limit=1))
    _precommitTuple = (
        (token_hex(8), state["height"], state["transactions"])
        if _precommit is None or len(_precommit) == 0
        else _precommit[0]
    )
    connection.run(
        connection.space("pre_commits").upsert(
            _precommitTuple, op_list=[("=", 1, state["height"]), ("=", 2, state["transactions"])], limit=1
        ),
        only_data=False,
    )
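The conditional expression black wraps above picks an existing pre-commit row when one is found and otherwise mints a fresh random id. A hedged stand-alone sketch of just that selection logic, with a hypothetical `precommit_tuple` helper extracted for illustration:

```python
from secrets import token_hex


def precommit_tuple(existing, state):
    # Reuse the stored row (commit_id, height, transactions) if present;
    # otherwise start a new one keyed by a random 16-hex-char id.
    return (
        (token_hex(8), state["height"], state["transactions"])
        if existing is None or len(existing) == 0
        else existing[0]
    )


state = {"height": 5, "transactions": ["tx1"]}
fresh = precommit_tuple([], state)       # no stored row -> new id, new fields
kept = precommit_tuple([("abc", 4, [])], state)  # stored row wins unchanged
```

The subsequent `upsert` then overwrites fields 1 and 2 via `op_list`, so the height and transactions are updated either way; only the commit id survives from the stored tuple.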
@register_query(TarantoolDBConnection)
def get_pre_commit_state(connection):
    _commit = connection.run(connection.space("pre_commits").select([], index="id_search"))
    if _commit is None or len(_commit) == 0:
        return None
    _commit = sorted(_commit, key=itemgetter(1), reverse=False)[0]
@@ -428,39 +368,32 @@ def get_pre_commit_state(connection):
@register_query(TarantoolDBConnection)
def store_validator_set(conn, validators_update: dict):
    _validator = conn.run(conn.space("validators").select(validators_update["height"], index="height_search", limit=1))
    unique_id = token_hex(8) if _validator is None or len(_validator) == 0 else _validator[0][0]
    conn.run(
        conn.space("validators").upsert(
            (unique_id, validators_update["height"], validators_update["validators"]),
            op_list=[("=", 1, validators_update["height"]), ("=", 2, validators_update["validators"])],
            limit=1,
        ),
        only_data=False,
    )
@register_query(TarantoolDBConnection)
def delete_validator_set(connection, height: int):
    _validators = connection.run(connection.space("validators").select(height, index="height_search"))
    for _valid in _validators:
        connection.run(connection.space("validators").delete(_valid[0]), only_data=False)
@register_query(TarantoolDBConnection)
def store_election(connection, election_id: str, height: int, is_concluded: bool):
    connection.run(
        connection.space("elections").upsert(
            (election_id, height, is_concluded), op_list=[("=", 1, height), ("=", 2, is_concluded)], limit=1
        ),
        only_data=False,
    )
@@ -468,33 +401,27 @@ def store_election(connection, election_id: str, height: int, is_concluded: bool
def store_elections(connection, elections: list):
    for election in elections:
        _election = connection.run(  # noqa: F841
            connection.space("elections").insert(
                (election["election_id"], election["height"], election["is_concluded"])
            ),
            only_data=False,
        )
@register_query(TarantoolDBConnection)
def delete_elections(connection, height: int):
    _elections = connection.run(connection.space("elections").select(height, index="height_search"))
    for _elec in _elections:
        connection.run(connection.space("elections").delete(_elec[0]), only_data=False)
@register_query(TarantoolDBConnection)
def get_validator_set(connection, height: int = None):
    _validators = connection.run(connection.space("validators").select())
    if height is not None and _validators is not None:
        _validators = [
            {"height": validator[1], "validators": validator[2]} for validator in _validators if validator[1] <= height
        ]
        return next(iter(sorted(_validators, key=lambda k: k["height"], reverse=True)), None)
    elif _validators is not None:
        _validators = [{"height": validator[1], "validators": validator[2]} for validator in _validators]
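The `height` branch of `get_validator_set` filters stored rows down to heights at or below the requested one, then returns the newest of those (or `None` when nothing qualifies, courtesy of `next(iter(...), None)`). A minimal stand-alone sketch of that selection, using a hypothetical in-memory `validators` list in place of the Tarantool space:

```python
# Each stored row is modeled as (height, validators) - a stand-in for the
# tuples the "validators" space would return.
validators = [(3, ["v1"]), (7, ["v2"]), (10, ["v3"])]


def latest_at(height):
    # Keep only sets registered at or below the requested height...
    eligible = [{"height": h, "validators": v} for h, v in validators if h <= height]
    # ...and take the highest of those; next(..., None) avoids an
    # IndexError when no validator set qualifies.
    return next(iter(sorted(eligible, key=lambda k: k["height"], reverse=True)), None)
```

So `latest_at(8)` picks the set registered at height 7, while `latest_at(1)` yields `None` because no set predates height 1.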
@@ -504,9 +431,7 @@ def get_validator_set(connection, height: int = None):
@register_query(TarantoolDBConnection)
def get_election(connection, election_id: str):
    _elections = connection.run(connection.space("elections").select(election_id, index="id_search"))
    if _elections is None or len(_elections) == 0:
        return None
    _election = sorted(_elections, key=itemgetter(0), reverse=True)[0]
@@ -514,13 +439,12 @@ def get_election(connection, election_id: str):
@register_query(TarantoolDBConnection)
def get_asset_tokens_for_public_key(
    connection, asset_id: str, public_key: str
):  # FIXME Something can be wrong with this function ! (public_key) is not used  # noqa: E501
    # space = connection.space("keys")
    # _keys = space.select([public_key], index="keys_search")
    _transactions = connection.run(connection.space("assets").select([asset_id], index="assetid_search"))
    # _transactions = _transactions
    # _keys = _keys.data
    _grouped_transactions = _group_transaction_by_ids(connection=connection, txids=[_tx[1] for _tx in _transactions])
@@ -531,30 +455,23 @@ def get_asset_tokens_for_public_key(connection, asset_id: str,
def store_abci_chain(connection, height: int, chain_id: str, is_synced: bool = True):
    hash_id_primarykey = sha256(dumps(obj={"height": height}).encode()).hexdigest()
    connection.run(
        connection.space("abci_chains").upsert(
            (height, is_synced, chain_id, hash_id_primarykey),
            op_list=[("=", 0, height), ("=", 1, is_synced), ("=", 2, chain_id)],
        ),
        only_data=False,
    )
@register_query(TarantoolDBConnection)
def delete_abci_chain(connection, height: int):
    hash_id_primarykey = sha256(dumps(obj={"height": height}).encode()).hexdigest()
    connection.run(connection.space("abci_chains").delete(hash_id_primarykey), only_data=False)
@register_query(TarantoolDBConnection)
def get_latest_abci_chain(connection):
    _all_chains = connection.run(connection.space("abci_chains").select())
    if _all_chains is None or len(_all_chains) == 0:
        return None
    _chain = sorted(_all_chains, key=itemgetter(0), reverse=True)[0]


@@ -9,9 +9,21 @@ from planetmint.backend.tarantool.connection import TarantoolDBConnection
logger = logging.getLogger(__name__)

register_schema = module_dispatch_registrar(backend.schema)

SPACE_NAMES = (
    "abci_chains",
    "assets",
    "blocks",
    "blocks_tx",
    "elections",
    "meta_data",
    "pre_commits",
    "validators",
    "transactions",
    "inputs",
    "outputs",
    "keys",
    "utxos",
)
SPACE_COMMANDS = {
    "abci_chains": "abci_chains = box.schema.space.create('abci_chains', {engine='memtx', is_sync = false})",
@@ -26,110 +38,86 @@ SPACE_COMMANDS = {
    "inputs": "inputs = box.schema.space.create('inputs')",
    "outputs": "outputs = box.schema.space.create('outputs')",
    "keys": "keys = box.schema.space.create('keys')",
    "utxos": "utxos = box.schema.space.create('utxos', {engine = 'memtx' , is_sync = false})",
}
INDEX_COMMANDS = {
    "abci_chains": {
        "id_search": "abci_chains:create_index('id_search' ,{type='hash', parts={'id'}})",
        "height_search": "abci_chains:create_index('height_search' ,{type='tree', unique=false, parts={'height'}})",
    },
    "assets": {
        "txid_search": "assets:create_index('txid_search', {type='hash', parts={'tx_id'}})",
        "assetid_search": "assets:create_index('assetid_search', {type='tree',unique=false, parts={'asset_id', 'tx_id'}})",  # noqa: E501
        "only_asset_search": "assets:create_index('only_asset_search', {type='tree', unique=false, parts={'asset_id'}})",  # noqa: E501
        "text_search": "assets:create_index('secondary', {unique=false,parts={1,'string'}})",
    },
    "blocks": {
        "id_search": "blocks:create_index('id_search' , {type='hash' , parts={'block_id'}})",
        "block_search": "blocks:create_index('block_search' , {type='tree', unique = false, parts={'height'}})",
        "block_id_search": "blocks:create_index('block_id_search', {type = 'hash', parts ={'block_id'}})",
    },
    "blocks_tx": {
        "id_search": "blocks_tx:create_index('id_search',{ type = 'hash', parts={'transaction_id'}})",
        "block_search": "blocks_tx:create_index('block_search', {type = 'tree',unique=false, parts={'block_id'}})",
    },
    "elections": {
        "id_search": "elections:create_index('id_search' , {type='hash', parts={'election_id'}})",
        "height_search": "elections:create_index('height_search' , {type='tree',unique=false, parts={'height'}})",
        "update_search": "elections:create_index('update_search', {type='tree', unique=false, parts={'election_id', 'height'}})",  # noqa: E501
    },
    "meta_data": {
        "id_search": "meta_datas:create_index('id_search', { type='hash' , parts={'transaction_id'}})",
        "text_search": "meta_datas:create_index('secondary', {unique=false,parts={2,'string'}})",
    },
    "pre_commits": {
        "id_search": "pre_commits:create_index('id_search', {type ='hash' , parts={'commit_id'}})",
        "height_search": "pre_commits:create_index('height_search', {type ='tree',unique=true, parts={'height'}})",
    },
    "validators": {
        "id_search": "validators:create_index('id_search' , {type='hash' , parts={'validator_id'}})",
        "height_search": "validators:create_index('height_search' , {type='tree', unique=true, parts={'height'}})",
    },
    "transactions": {
        "id_search": "transactions:create_index('id_search' , {type = 'hash' , parts={'transaction_id'}})",
        "transaction_search": "transactions:create_index('transaction_search' , {type = 'tree',unique=false, parts={'operation', 'transaction_id'}})",  # noqa: E501
    },
    "inputs": {
        "delete_search": "inputs:create_index('delete_search' , {type = 'hash', parts={'input_id'}})",
        "spent_search": "inputs:create_index('spent_search' , {type = 'tree', unique=false, parts={'fulfills_transaction_id', 'fulfills_output_index'}})",  # noqa: E501
        "id_search": "inputs:create_index('id_search', {type = 'tree', unique=false, parts = {'transaction_id'}})",
    },
    "outputs": {
        "unique_search": "outputs:create_index('unique_search' ,{type='hash', parts={'output_id'}})",
        "id_search": "outputs:create_index('id_search' ,{type='tree', unique=false, parts={'transaction_id'}})",
    },
    "keys": {
        "id_search": "keys:create_index('id_search', {type = 'hash', parts={'id'}})",
        "keys_search": "keys:create_index('keys_search', {type = 'tree', unique=false, parts={'public_key'}})",
        "txid_search": "keys:create_index('txid_search', {type = 'tree', unique=false, parts={'transaction_id'}})",
        "output_search": "keys:create_index('output_search', {type = 'tree', unique=false, parts={'output_id'}})",
    },
    "utxos": {
        "id_search": "utxos:create_index('id_search', {type='hash' , parts={'transaction_id', 'output_index'}})",
"transaction_search": "utxos:create_index('transaction_search', {type='tree', unique=false, parts={'transaction_id'}})", # noqa: E501 "transaction_search": "utxos:create_index('transaction_search', {type='tree', unique=false, parts={'transaction_id'}})", # noqa: E501
"index_Search": "utxos:create_index('index_search', {type='tree', unique=false, parts={'output_index'}})" "index_Search": "utxos:create_index('index_search', {type='tree', unique=false, parts={'output_index'}})",
} },
} }
SCHEMA_COMMANDS = {
    "abci_chains": "abci_chains:format({{name='height' , type='integer'},{name='is_synched' , type='boolean'},{name='chain_id',type='string'}, {name='id', type='string'}})",  # noqa: E501
    "assets": "assets:format({{name='data' , type='string'}, {name='tx_id', type='string'}, {name='asset_id', type='string'}})",  # noqa: E501
    "blocks": "blocks:format{{name='app_hash',type='string'},{name='height' , type='integer'},{name='block_id' , type='string'}}",  # noqa: E501
    "blocks_tx": "blocks_tx:format{{name='transaction_id', type = 'string'}, {name = 'block_id', type = 'string'}}",
    "elections": "elections:format({{name='election_id' , type='string'},{name='height' , type='integer'}, {name='is_concluded' , type='boolean'}})",  # noqa: E501
    "meta_data": "meta_datas:format({{name='transaction_id' , type='string'}, {name='meta_data' , type='string'}})",  # noqa: E501
    "pre_commits": "pre_commits:format({{name='commit_id', type='string'}, {name='height',type='integer'}, {name='transactions',type=any}})",  # noqa: E501
    "validators": "validators:format({{name='validator_id' , type='string'},{name='height',type='integer'},{name='validators' , type='any'}})",  # noqa: E501
    "transactions": "transactions:format({{name='transaction_id' , type='string'}, {name='operation' , type='string'}, {name='version' ,type='string'}, {name='dict_map', type='any'}})",  # noqa: E501
    "inputs": "inputs:format({{name='transaction_id' , type='string'}, {name='fulfillment' , type='any'}, {name='owners_before' , type='array'}, {name='fulfills_transaction_id', type = 'string'}, {name='fulfills_output_index', type = 'string'}, {name='input_id', type='string'}, {name='input_index', type='number'}})",  # noqa: E501
    "outputs": "outputs:format({{name='transaction_id' , type='string'}, {name='amount' , type='string'}, {name='uri', type='string'}, {name='details_type', type='string'}, {name='details_public_key', type='any'}, {name = 'output_id', type = 'string'}, {name='treshold', type='any'}, {name='subconditions', type='any'}, {name='output_index', type='number'}})",  # noqa: E501
    "keys": "keys:format({{name = 'id', type='string'}, {name = 'transaction_id', type = 'string'} ,{name = 'output_id', type = 'string'}, {name = 'public_key', type = 'string'}, {name = 'key_index', type = 'integer'}})",  # noqa: E501
    "utxos": "utxos:format({{name='transaction_id' , type='string'}, {name='output_index' , type='integer'}, {name='utxo_dict', type='string'}})",  # noqa: E501
}
SCHEMA_DROP_COMMANDS = {
@@ -145,7 +133,7 @@ SCHEMA_DROP_COMMANDS = {
    "inputs": "box.space.inputs:drop()",
    "outputs": "box.space.outputs:drop()",
    "keys": "box.space.keys:drop()",
    "utxos": "box.space.utxos:drop()",
}
@@ -159,24 +147,24 @@ def drop_database(connection, not_used=None):
        except Exception:
            print(f"Unexpected error while trying to drop space '{_space}'")


@register_schema(TarantoolDBConnection)
def create_database(connection, dbname):
    """
    For tarantool implementation, this function runs
    create_tables, to initiate spaces, schema and indexes.
    """
    logger.info("Create database `%s`.", dbname)
    create_tables(connection, dbname)


def run_command_with_output(command):
    from subprocess import run

    host_port = "%s:%s" % (Config().get()["database"]["host"], Config().get()["database"]["port"])
    output = run(["tarantoolctl", "connect", host_port], input=command, capture_output=True).stderr
    output = output.decode()
    return output
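A minimal, portable sketch of the `run_command_with_output` pattern above: feed a command string to a subprocess over stdin and decode what comes back. Here the Python interpreter stands in for `tarantoolctl connect host:port` so the sketch runs anywhere; note the real function reads `stderr` because that is where `tarantoolctl` writes its output:

```python
# Hypothetical helper mirroring run_command_with_output, but reading stdout
# and using the current interpreter as the target process.
import sys
from subprocess import run


def run_with_output(argv, command):
    result = run(argv, input=command.encode(), capture_output=True)
    return result.stdout.decode()


# `python -` executes the program it reads from stdin.
out = run_with_output([sys.executable, "-"], "print(2 + 2)")
```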


@@ -41,13 +41,16 @@ class TransactionDecompose:
            "outputs": [],
            "keys": [],
            "metadata": None,
            "asset": None,
        }

    def get_map(self, dictionary: dict = None):
        return (
            _save_keys_order(dictionary=dictionary)
            if dictionary is not None
            else _save_keys_order(dictionary=self._transaction)
        )

    def __create_hash(self, n: int):
        return token_hex(n)
@@ -71,13 +74,17 @@ class TransactionDecompose:
        input_index = 0
        for _input in self._transaction["inputs"]:
            _inputs.append(
                (
                    self._transaction["id"],
                    _input["fulfillment"],
                    _input["owners_before"],
                    _input["fulfills"]["transaction_id"] if _input["fulfills"] is not None else "",
                    str(_input["fulfills"]["output_index"]) if _input["fulfills"] is not None else "",
                    self.__create_hash(7),
                    input_index,
                )
            )
            input_index = input_index + 1
        return _inputs
@@ -88,7 +95,8 @@ class TransactionDecompose:
        for _output in self._transaction["outputs"]:
            output_id = self.__create_hash(7)
            if _output["condition"]["details"].get("subconditions") is None:
                tmp_output = (
                    self._transaction["id"],
                    _output["amount"],
                    _output["condition"]["uri"],
                    _output["condition"]["details"]["type"],
@@ -96,10 +104,11 @@ class TransactionDecompose:
                    output_id,
                    None,
                    None,
                    output_index,
                )
            else:
                tmp_output = (
                    self._transaction["id"],
                    _output["amount"],
                    _output["condition"]["uri"],
                    _output["condition"]["details"]["type"],
@@ -107,7 +116,7 @@ class TransactionDecompose:
                    output_id,
                    _output["condition"]["details"]["threshold"],
                    _output["condition"]["details"]["subconditions"],
                    output_index,
                )
            _outputs.append(tmp_output)
@@ -121,10 +130,7 @@ class TransactionDecompose:
    def __prepare_transaction(self):
        _map = self.get_map()
        return (self._transaction["id"], self._transaction["operation"], self._transaction["version"], _map)

    def convert_to_tuple(self):
        self._metadata_check()
@@ -138,7 +144,6 @@ class TransactionCompose:
class TransactionCompose:
    def __init__(self, db_results):
        self.db_results = db_results
        self._map = self.db_results["transaction"][3]


@@ -1,11 +1,13 @@
import subprocess


def run_cmd(commands: list, config: dict):
    ret = subprocess.Popen(
        ["%s %s:%s < %s" % ("tarantoolctl connect", "localhost", "3303", "planetmint/backend/tarantool/init.lua")],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        universal_newlines=True,
        bufsize=0,
        shell=True,
    )
    return True if ret >= 0 else False


@@ -19,10 +19,12 @@ def module_dispatch_registrar(module):
            return dispatch_registrar.register(obj_type)(func)
        except AttributeError as ex:
            raise ModuleDispatchRegistrationError(
                (
                    "`{module}` does not contain a single-dispatchable "
                    "function named `{func}`. The module being registered "
                    "was not implemented correctly!"
                ).format(func=func_name, module=module.__name__)
            ) from ex

    return wrapper
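The registrar above wraps registration on a single-dispatch function and turns a missing `register` attribute into a module-level error. A stripped-down sketch of the underlying `functools.singledispatch` mechanism it relies on (the `describe` function and its return values are made up for illustration):

```python
# Minimal single-dispatch example: one generic function, per-type overloads
# registered after the fact -- the same .register() call the wrapper uses.
from functools import singledispatch


@singledispatch
def describe(obj):
    # Fallback for types with no registered implementation.
    return "unknown"


@describe.register(int)
def _(obj):
    return "integer"


@describe.register(str)
def _(obj):
    return "string"
```

Calling `describe.register` on an object that is not a singledispatch function raises `AttributeError`, which is exactly the case `module_dispatch_registrar` converts into `ModuleDispatchRegistrationError`.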


@@ -1,31 +1,28 @@
elections = {
    "upsert-validator": {
        "help": "Propose a change to the validator set",
        "args": {
            "public_key": {"help": "Public key of the validator to be added/updated/removed."},
            "power": {
                "type": int,
                "help": "The proposed power for the validator. Setting to 0 will remove the validator.",
            },
            "node_id": {"help": "The node_id of the validator."},
            "--private-key": {
                "dest": "sk",
                "required": True,
                "help": "Path to the private key of the election initiator.",
            },
        },
    },
    "chain-migration": {
        "help": "Call for a halt to block production to allow for a version change across breaking changes.",
        "args": {
            "--private-key": {
                "dest": "sk",
                "required": True,
                "help": "Path to the private key of the election initiator.",
            }
        },
    },
}
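This dict doubles as an argparse spec: each key becomes a subcommand and each `args` entry becomes an `add_argument` call. A self-contained sketch of that loop, with a trimmed-down copy of the dict (the sample argument values below are made up):

```python
# Build one subparser per election type from a declarative dict,
# mirroring the loop in create_parser().
import argparse

elections = {
    "upsert-validator": {
        "help": "Propose a change to the validator set",
        "args": {
            "public_key": {"help": "Public key of the validator."},
            "power": {"type": int, "help": "Proposed power."},
            "--private-key": {"dest": "sk", "required": True, "help": "Key path."},
        },
    },
}

parser = argparse.ArgumentParser()
sub = parser.add_subparsers(dest="election_type")
for name, data in elections.items():
    p = sub.add_parser(name, help=data["help"])
    for arg, kwargs in data["args"].items():
        p.add_argument(arg, **kwargs)

ns = parser.parse_args(["upsert-validator", "ABC", "10", "--private-key", "key.json"])
```

Names without a leading `--` become positionals, so ordering in the dict matters for how the command line is parsed.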


@@ -18,18 +18,15 @@ from planetmint.backend.tarantool.connection import TarantoolDBConnection
from planetmint.core import rollback
from planetmint.utils import load_node_key
from planetmint.transactions.common.transaction_mode_types import BROADCAST_TX_COMMIT
from planetmint.transactions.common.exceptions import DatabaseDoesNotExist, ValidationError
from planetmint.transactions.types.elections.vote import Vote
from planetmint.transactions.types.elections.chain_migration_election import ChainMigrationElection
import planetmint
from planetmint import backend, ValidatorElection, Planetmint
from planetmint.backend import schema
from planetmint.backend import tarantool
from planetmint.commands import utils
from planetmint.commands.utils import configure_planetmint, input_on_stderr
from planetmint.log import setup_logging
from planetmint.tendermint_utils import public_key_from_base64
from planetmint.commands.election_types import elections
@@ -53,7 +50,7 @@ def run_show_config(args):
    # the system needs to be configured, then display information on how to
    # configure the system.
    _config = Config().get()
    del _config["CONFIGURED"]
    print(json.dumps(_config, indent=4, sort_keys=True))
@@ -64,47 +61,47 @@ def run_configure(args):
    config_file_exists = False
    # if the config path is `-` then it's stdout
    if config_path != "-":
        config_file_exists = os.path.exists(config_path)

    if config_file_exists and not args.yes:
        want = input_on_stderr(
            "Config file `{}` exists, do you want to " "override it? (cannot be undone) [y/N]: ".format(config_path)
        )
        if want != "y":
            return

    Config().init_config(args.backend)
    conf = Config().get()
    # select the correct config defaults based on the backend
    print("Generating default configuration for backend {}".format(args.backend), file=sys.stderr)
    database_keys = Config().get_db_key_map(args.backend)

    if not args.yes:
        for key in ("bind",):
            val = conf["server"][key]
            conf["server"][key] = input_on_stderr("API Server {}? (default `{}`): ".format(key, val), val)

        for key in ("scheme", "host", "port"):
            val = conf["wsserver"][key]
            conf["wsserver"][key] = input_on_stderr("WebSocket Server {}? (default `{}`): ".format(key, val), val)

        for key in database_keys:
            val = conf["database"][key]
            conf["database"][key] = input_on_stderr("Database {}? (default `{}`): ".format(key, val), val)

        for key in ("host", "port"):
            val = conf["tendermint"][key]
            conf["tendermint"][key] = input_on_stderr("Tendermint {}? (default `{}`)".format(key, val), val)

    if config_path != "-":
        planetmint.config_utils.write_config(conf, config_path)
    else:
        print(json.dumps(conf, indent=4, sort_keys=True))

    Config().set(conf)

    print("Configuration written to {}".format(config_path), file=sys.stderr)
    print("Ready to go!", file=sys.stderr)
@configure_planetmint
@@ -114,21 +111,19 @@ def run_election(args):
    b = Planetmint()

    # Call the function specified by args.action, as defined above
    globals()[f"run_election_{args.action}"](args, b)


def run_election_new(args, planet):
    election_type = args.election_type.replace("-", "_")
    globals()[f"run_election_new_{election_type}"](args, planet)


def create_new_election(sk, planet, election_class, data):
    try:
        key = load_node_key(sk)
        voters = election_class.recipients(planet)
        election = election_class.generate([key.public_key], voters, data, None).sign([key.private_key])
        election.validate(planet)
    except ValidationError as e:
        logger.error(e)
@@ -138,11 +133,11 @@ def create_new_election(sk, planet, election_class, data):
        return False

    resp = planet.write_transaction(election, BROADCAST_TX_COMMIT)
    if resp == (202, ""):
        logger.info("[SUCCESS] Submitted proposal with id: {}".format(election.id))
        return election.id
    else:
        logger.error("Failed to commit election proposal")
        return False
@@ -161,10 +156,9 @@ def run_election_new_upsert_validator(args, planet):
    """

    new_validator = {
        "public_key": {"value": public_key_from_base64(args.public_key), "type": "ed25519-base16"},
        "power": args.power,
        "node_id": args.node_id,
    }

    return create_new_election(args.sk, planet, ValidatorElection, new_validator)
@@ -202,23 +196,21 @@ def run_election_approve(args, planet):
    if len(voting_powers) > 0:
        voting_power = voting_powers[0]
    else:
        logger.error("The key you provided does not match any of the eligible voters in this election.")
        return False

    inputs = [i for i in tx.to_inputs() if key.public_key in i.owners_before]
    election_pub_key = ValidatorElection.to_public_key(tx.id)

    approval = Vote.generate(inputs, [([election_pub_key], voting_power)], tx.id).sign([key.private_key])
    approval.validate(planet)

    resp = planet.write_transaction(approval, BROADCAST_TX_COMMIT)

    if resp == (202, ""):
        logger.info("[SUCCESS] Your vote has been submitted")
        return approval.id
    else:
        logger.error("Failed to commit vote")
        return False
@@ -234,7 +226,7 @@ def run_election_show(args, planet):
    election = planet.get_transaction(args.election_id)
    if not election:
        logger.error(f"No election found with election_id {args.election_id}")
        return

    response = election.show_election(planet)
@@ -260,11 +252,12 @@ def run_drop(args):
    """Drop the database"""
    if not args.yes:
        response = input_on_stderr("Do you want to drop `{}` database? [y/n]: ")
        if response != "y":
            return

    from planetmint.backend.connection import connect

    conn = connect()
    try:
        schema.drop_database(conn)
@ -284,115 +277,103 @@ def run_start(args):
setup_logging() setup_logging()
if not args.skip_initialize_database: if not args.skip_initialize_database:
logger.info('Initializing database') logger.info("Initializing database")
_run_init() _run_init()
logger.info('Planetmint Version %s', planetmint.version.__version__) logger.info("Planetmint Version %s", planetmint.version.__version__)
run_recover(planetmint.lib.Planetmint()) run_recover(planetmint.lib.Planetmint())
logger.info('Starting Planetmint main process.') logger.info("Starting Planetmint main process.")
from planetmint.start import start from planetmint.start import start
start(args) start(args)
def run_tendermint_version(args): def run_tendermint_version(args):
"""Show the supported Tendermint version(s)""" """Show the supported Tendermint version(s)"""
supported_tm_ver = { supported_tm_ver = {
'description': 'Planetmint supports the following Tendermint version(s)', "description": "Planetmint supports the following Tendermint version(s)",
'tendermint': __tm_supported_versions__, "tendermint": __tm_supported_versions__,
} }
print(json.dumps(supported_tm_ver, indent=4, sort_keys=True)) print(json.dumps(supported_tm_ver, indent=4, sort_keys=True))
def create_parser(): def create_parser():
parser = argparse.ArgumentParser( parser = argparse.ArgumentParser(description="Control your Planetmint node.", parents=[utils.base_parser])
description='Control your Planetmint node.',
parents=[utils.base_parser])
# all the commands are contained in the subparsers object, # all the commands are contained in the subparsers object,
# the command selected by the user will be stored in `args.command` # the command selected by the user will be stored in `args.command`
# that is used by the `main` function to select which other # that is used by the `main` function to select which other
# function to call. # function to call.
subparsers = parser.add_subparsers(title='Commands', subparsers = parser.add_subparsers(title="Commands", dest="command")
dest='command')
# parser for writing a config file # parser for writing a config file
config_parser = subparsers.add_parser('configure', config_parser = subparsers.add_parser("configure", help="Prepare the config file.")
help='Prepare the config file.')
config_parser.add_argument('backend', config_parser.add_argument(
choices=['tarantool_db', 'localmongodb'], "backend",
default='tarantool_db', choices=["tarantool_db", "localmongodb"],
const='tarantool_db', default="tarantool_db",
nargs='?', const="tarantool_db",
help='The backend to use. It can only be ' nargs="?",
'"tarantool_db", currently.') help="The backend to use. It can only be " '"tarantool_db", currently.',
)
# parser for managing elections # parser for managing elections
election_parser = subparsers.add_parser('election', election_parser = subparsers.add_parser("election", help="Manage elections.")
help='Manage elections.')
election_subparser = election_parser.add_subparsers(title='Action', election_subparser = election_parser.add_subparsers(title="Action", dest="action")
dest='action')
new_election_parser = election_subparser.add_parser('new', new_election_parser = election_subparser.add_parser("new", help="Calls a new election.")
help='Calls a new election.')
new_election_subparser = new_election_parser.add_subparsers(title='Election_Type', new_election_subparser = new_election_parser.add_subparsers(title="Election_Type", dest="election_type")
dest='election_type')
# Parser factory for each type of new election, so we get a bunch of commands that look like this: # Parser factory for each type of new election, so we get a bunch of commands that look like this:
# election new <some_election_type> <args>... # election new <some_election_type> <args>...
for name, data in elections.items(): for name, data in elections.items():
args = data['args'] args = data["args"]
generic_parser = new_election_subparser.add_parser(name, help=data['help']) generic_parser = new_election_subparser.add_parser(name, help=data["help"])
for arg, kwargs in args.items(): for arg, kwargs in args.items():
generic_parser.add_argument(arg, **kwargs) generic_parser.add_argument(arg, **kwargs)
    approve_election_parser = election_subparser.add_parser("approve", help="Approve the election.")
    approve_election_parser.add_argument("election_id", help="The election_id of the election.")
    approve_election_parser.add_argument(
        "--private-key", dest="sk", required=True, help="Path to the private key of the election initiator."
    )

    show_election_parser = election_subparser.add_parser("show", help="Provides information about an election.")
    show_election_parser.add_argument("election_id", help="The transaction id of the election you wish to query.")

    # parsers for showing/exporting config values
    subparsers.add_parser("show-config", help="Show the current configuration")

    # parser for database-level commands
    subparsers.add_parser("init", help="Init the database")
    subparsers.add_parser("drop", help="Drop the database")

    # parser for starting Planetmint
    start_parser = subparsers.add_parser("start", help="Start Planetmint")
    start_parser.add_argument(
        "--no-init",
        dest="skip_initialize_database",
        default=False,
        action="store_true",
        help="Skip database initialization",
    )

    subparsers.add_parser("tendermint-version", help="Show the Tendermint supported versions")

    start_parser.add_argument(
        "--experimental-parallel-validation",
        dest="experimental_parallel_validation",
        default=False,
        action="store_true",
        help="💀 EXPERIMENTAL: parallelize validation for better throughput 💀",
    )

    return parser
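The parser built above is consumed by a `run_<command>` dispatch (shown further down in `start()`). A minimal, self-contained sketch of that pattern — the `show-config` and `init` handlers here are hypothetical stand-ins, not the real Planetmint commands:

```python
import argparse


def build_parser():
    parser = argparse.ArgumentParser(prog="planetmint-sketch")
    subparsers = parser.add_subparsers(title="Commands", dest="command")
    subparsers.add_parser("show-config", help="Show the current configuration")
    subparsers.add_parser("init", help="Init the database")
    return parser


def run_show_config(args):
    return "show-config ran"


def run_init(args):
    return "init ran"


def dispatch(argv):
    args = build_parser().parse_args(argv)
    # dashes in the command name become underscores in the function name
    func = globals().get("run_" + args.command.replace("-", "_"))
    if not func:
        raise NotImplementedError("Command `{}` not yet implemented".format(args.command))
    return func(args)
```

Registering a subcommand is then just defining another `run_<name>` function in scope.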

View File

@ -30,22 +30,22 @@ def configure_planetmint(command):
        The command wrapper function.
    """

    @functools.wraps(command)
    def configure(args):
        config_from_cmdline = None
        try:
            if args.log_level is not None:
                config_from_cmdline = {
                    "log": {
                        "level_console": args.log_level,
                        "level_logfile": args.log_level,
                    },
                    "server": {"loglevel": args.log_level},
                }
        except AttributeError:
            pass
        planetmint.config_utils.autoconfigure(filename=args.config, config=config_from_cmdline, force=True)
        command(args)

    return configure
@ -53,13 +53,13 @@ def configure_planetmint(command):
def _convert(value, default=None, convert=None):
    def convert_bool(value):
        if value.lower() in ("true", "t", "yes", "y"):
            return True
        if value.lower() in ("false", "f", "no", "n"):
            return False
        raise ValueError("{} cannot be converted to bool".format(value))

    if value == "":
        value = None

    if convert is None:
@ -80,7 +80,7 @@ def _convert(value, default=None, convert=None):
# We need this because `input` always prints on stdout, while it should print
# to stderr. It's a very old bug, check it out here:
# - https://bugs.python.org/issue1927
def input_on_stderr(prompt="", default=None, convert=None):
    """Output a string to stderr and wait for input.

    Args:
@ -92,7 +92,7 @@ def input_on_stderr(prompt='', default=None, convert=None):
            ``default`` will be used.
    """
    print(prompt, end="", file=sys.stderr)
    value = builtins.input()
    return _convert(value, default, convert)
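The behavior visible in these two hunks — prompt on stderr, empty reply falls back to the default, otherwise the converter is applied — can be sketched as follows. The `reply` parameter stands in for `builtins.input()` so the sketch is testable without a terminal:

```python
import sys


def convert_bool(value):
    # mirrors the checks in _convert's inner helper
    if value.lower() in ("true", "t", "yes", "y"):
        return True
    if value.lower() in ("false", "f", "no", "n"):
        return False
    raise ValueError("{} cannot be converted to bool".format(value))


def prompt_on_stderr(prompt, reply, default=None, convert=convert_bool):
    # `input` prints its prompt on stdout, so the prompt is written
    # to stderr by hand; `reply` stands in for builtins.input() here
    print(prompt, end="", file=sys.stderr)
    if reply == "":
        return default
    return convert(reply)
```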
@ -121,14 +121,13 @@ def start(parser, argv, scope):
    # look up in the current scope for a function called 'run_<command>'
    # replacing all the dashes '-' with the lowercase character '_'
    func = scope.get("run_" + args.command.replace("-", "_"))

    # if no command has been found, raise a `NotImplementedError`
    if not func:
        raise NotImplementedError("Command `{}` not yet implemented".format(args.command))

    args.multiprocess = getattr(args, "multiprocess", False)

    if args.multiprocess is False:
        args.multiprocess = 1
@ -138,24 +137,28 @@ def start(parser, argv, scope):
    return func(args)


base_parser = argparse.ArgumentParser(add_help=False, prog="planetmint")

base_parser.add_argument(
    "-c", "--config", help="Specify the location of the configuration file " '(use "-" for stdout)'
)

# NOTE: this flag should not have any default value because that will override
# the environment variables provided to configure the logger.
base_parser.add_argument(
    "-l",
    "--log-level",
    type=str.upper,  # convert to uppercase for comparison to choices
    choices=["DEBUG", "BENCHMARK", "INFO", "WARNING", "ERROR", "CRITICAL"],
    help="Log level",
)

base_parser.add_argument(
    "-y",
    "--yes",
    "--yes-please",
    action="store_true",
    help='Assume "yes" as answer to all prompts and run ' "non-interactively",
)

base_parser.add_argument("-v", "--version", action="version", version="%(prog)s {}".format(__version__))
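`base_parser` is created with `add_help=False`, the usual argparse idiom for a parent parser whose flags are shared with other parsers via `parents=[...]`. A small sketch of that reuse (the `start` subcommand mirrors the one above; everything else is a minimal stand-in):

```python
import argparse

# shared flags live on a helper parser created with add_help=False,
# so they can be inherited without a duplicate -h/--help option
base_parser = argparse.ArgumentParser(add_help=False, prog="planetmint")
base_parser.add_argument("-c", "--config", help="Specify the location of the configuration file")
base_parser.add_argument(
    "-y",
    "--yes",
    "--yes-please",
    action="store_true",
    help='Assume "yes" as answer to all prompts and run non-interactively',
)

# the full CLI parser inherits those flags via `parents`
parser = argparse.ArgumentParser(parents=[base_parser], prog="planetmint")
subparsers = parser.add_subparsers(dest="command")
subparsers.add_parser("start")

args = parser.parse_args(["-y", "start"])
```

Note that `--yes-please` stores into the same `yes` destination because argparse derives `dest` from the first long option.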

View File

@ -1,6 +1,7 @@
import copy
import logging
import os

# from planetmint.log import DEFAULT_LOGGING_CONFIG as log_config
from planetmint.version import __version__  # noqa
@ -15,7 +16,6 @@ class Singleton(type):
class Config(metaclass=Singleton):
    def __init__(self):
        # from functools import reduce
        # PORT_NUMBER = reduce(lambda x, y: x * y, map(ord, 'Planetmint')) % 2**16
@ -26,27 +26,27 @@ class Config(metaclass=Singleton):
        # _base_database_localmongodb.keys() because dicts are unordered.
        # I tried to configure
        self.log_config = DEFAULT_LOGGING_CONFIG
        db = "tarantool_db"
        self.__private_database_keys_map = {  # TODO Check if it is working after removing 'name' field
            "tarantool_db": ("host", "port"),
            "localmongodb": ("host", "port", "name"),
        }
        self.__private_database_localmongodb = {
            "backend": "localmongodb",
            "host": "localhost",
            "port": 27017,
            "name": "bigchain",
            "replicaset": None,
            "login": None,
            "password": None,
            "connection_timeout": 5000,
            "max_tries": 3,
            "ssl": False,
            "ca_cert": None,
            "certfile": None,
            "keyfile": None,
            "keyfile_passphrase": None,
            "crlfile": None,
        }
        self.__private_init_config = {
            "absolute_path": os.path.dirname(os.path.abspath(__file__)) + "/backend/tarantool/init.lua"
@ -56,71 +56,68 @@ class Config(metaclass=Singleton):
            "absolute_path": os.path.dirname(os.path.abspath(__file__)) + "/backend/tarantool/drop.lua"
        }
        self.__private_database_tarantool = {
            "backend": "tarantool_db",
            "connection_timeout": 5000,
            "max_tries": 3,
            "name": "universe",
            "reconnect_delay": 0.5,
            "host": "localhost",
            "port": 3303,
            "connect_now": True,
            "encoding": "utf-8",
            "login": "guest",
            "password": "",
            "service": "tarantoolctl connect",
            "init_config": self.__private_init_config,
            "drop_config": self.__private_drop_config,
        }
        self.__private_database_map = {
            "tarantool_db": self.__private_database_tarantool,
            "localmongodb": self.__private_database_localmongodb,
        }
        self.__private_config = {
            "server": {
                # Note: this section supports all the Gunicorn settings:
                # - http://docs.gunicorn.org/en/stable/settings.html
                "bind": "localhost:9984",
                "loglevel": logging.getLevelName(self.log_config["handlers"]["console"]["level"]).lower(),
                "workers": None,  # if None, the value will be cpu_count * 2 + 1
            },
            "wsserver": {
                "scheme": "ws",
                "host": "localhost",
                "port": 9985,
                "advertised_scheme": "ws",
                "advertised_host": "localhost",
                "advertised_port": 9985,
            },
            "tendermint": {
                "host": "localhost",
                "port": 26657,
                "version": "v0.31.5",  # look for __tm_supported_versions__
            },
            "database": self.__private_database_map,
            "log": {
                "file": self.log_config["handlers"]["file"]["filename"],
                "error_file": self.log_config["handlers"]["errors"]["filename"],
                "level_console": logging.getLevelName(self.log_config["handlers"]["console"]["level"]).lower(),
                "level_logfile": logging.getLevelName(self.log_config["handlers"]["file"]["level"]).lower(),
                "datefmt_console": self.log_config["formatters"]["console"]["datefmt"],
                "datefmt_logfile": self.log_config["formatters"]["file"]["datefmt"],
                "fmt_console": self.log_config["formatters"]["console"]["format"],
                "fmt_logfile": self.log_config["formatters"]["file"]["format"],
                "granular_levels": {},
            },
        }

        self._private_real_config = copy.deepcopy(self.__private_config)
        # select the correct config defaults based on the backend
        self._private_real_config["database"] = self.__private_database_map[db]

    def init_config(self, db):
        self._private_real_config = copy.deepcopy(self.__private_config)
        # select the correct config defaults based on the backend
        self._private_real_config["database"] = self.__private_database_map[db]
        return self._private_real_config

    def get(self):
@ -135,52 +132,55 @@ class Config(metaclass=Singleton):
    def get_db_map(self, db):
        return self.__private_database_map[db]


DEFAULT_LOG_DIR = os.getcwd()

DEFAULT_LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "console": {
            "class": "logging.Formatter",
            "format": (
                "[%(asctime)s] [%(levelname)s] (%(name)s) " "%(message)s (%(processName)-10s - pid: %(process)d)"
            ),
            "datefmt": "%Y-%m-%d %H:%M:%S",
        },
        "file": {
            "class": "logging.Formatter",
            "format": (
                "[%(asctime)s] [%(levelname)s] (%(name)s) " "%(message)s (%(processName)-10s - pid: %(process)d)"
            ),
            "datefmt": "%Y-%m-%d %H:%M:%S",
        },
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "console",
            "level": logging.INFO,
        },
        "file": {
            "class": "logging.handlers.RotatingFileHandler",
            "filename": os.path.join(DEFAULT_LOG_DIR, "planetmint.log"),
            "mode": "w",
            "maxBytes": 209715200,
            "backupCount": 5,
            "formatter": "file",
            "level": logging.INFO,
        },
        "errors": {
            "class": "logging.handlers.RotatingFileHandler",
            "filename": os.path.join(DEFAULT_LOG_DIR, "planetmint-errors.log"),
            "mode": "w",
            "maxBytes": 209715200,
            "backupCount": 5,
            "formatter": "file",
            "level": logging.ERROR,
        },
    },
    "loggers": {},
    "root": {
        "level": logging.DEBUG,
        "handlers": ["console", "file", "errors"],
    },
}

View File

@ -29,16 +29,16 @@ from planetmint.transactions.common import exceptions
from planetmint.validation import BaseValidationRules

# TODO: move this to a proper configuration file for logging
logging.getLogger("requests").setLevel(logging.WARNING)
logger = logging.getLogger(__name__)

CONFIG_DEFAULT_PATH = os.environ.setdefault(
    "PLANETMINT_CONFIG_PATH",
    os.path.join(os.path.expanduser("~"), ".planetmint"),
)

CONFIG_PREFIX = "PLANETMINT"
CONFIG_SEP = "_"


def map_leafs(func, mapping):
@ -96,21 +96,21 @@ def file_config(filename=None):
        dict: The config values in the specified config file (or the
              file at CONFIG_DEFAULT_PATH, if filename == None)
    """
    logger.debug("On entry into file_config(), filename = {}".format(filename))
    if filename is None:
        filename = CONFIG_DEFAULT_PATH

    logger.debug("file_config() will try to open `{}`".format(filename))
    with open(filename) as f:
        try:
            config = json.load(f)
        except ValueError as err:
            raise exceptions.ConfigurationError(
                "Failed to parse the JSON configuration from `{}`, {}".format(filename, err)
            )

    logger.info("Configuration loaded from `{}`".format(filename))

    return config
@ -136,7 +136,7 @@ def env_config(config):
    return map_leafs(load_from_env, config)


def update_types(config, reference, list_sep=":"):
    """Return a new configuration where all the values types
    are aligned with the ones in the default configuration
    """
@ -192,7 +192,7 @@ def set_config(config):
    _config = Config().get()
    # Update the default config with whatever is in the passed config
    update(_config, update_types(config, _config))
    _config["CONFIGURED"] = True
    Config().set(_config)
@ -208,7 +208,7 @@ def update_config(config):
    _config = Config().get()
    # Update the default config with whatever is in the passed config
    update(_config, update_types(config, _config))
    _config["CONFIGURED"] = True
    Config().set(_config)
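`set_config` and `update_config` both rely on a recursive `update` that merges a partial config into the defaults without dropping sibling keys. The sketch below is an assumption about `update`'s behavior, inferred from how it is called here — not the project's actual implementation:

```python
import collections.abc


def update(d, u):
    # recursively merge mapping `u` into mapping `d`, in place;
    # non-mapping values in `u` overwrite, nested mappings recurse
    for k, v in u.items():
        if isinstance(v, collections.abc.Mapping):
            d[k] = update(d.get(k, {}), v)
        else:
            d[k] = v
    return d


cfg = {
    "log": {"level_console": "info", "level_logfile": "info"},
    "server": {"bind": "localhost:9984"},
}
update(cfg, {"log": {"level_console": "debug"}})
```

After the merge, only `log.level_console` changes; `log.level_logfile` and the `server` section survive untouched.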
@ -223,12 +223,12 @@ def write_config(config, filename=None):
    if not filename:
        filename = CONFIG_DEFAULT_PATH

    with open(filename, "w") as f:
        json.dump(config, f, indent=4)


def is_configured():
    return bool(Config().get().get("CONFIGURED"))
def autoconfigure(filename=None, config=None, force=False):
@ -236,7 +236,7 @@ def autoconfigure(filename=None, config=None, force=False):
    been initialized.
    """
    if not force and is_configured():
        logger.debug("System already configured, skipping autoconfiguration")
        return

    # start with the current configuration
@ -249,7 +249,7 @@ def autoconfigure(filename=None, config=None, force=False):
        if filename:
            raise
        else:
            logger.info("Cannot find config file `%s`." % e.filename)

    # override configuration with env variables
    newconfig = env_config(newconfig)
@ -277,20 +277,20 @@ def load_validation_plugin(name=None):
    # We should probably support Requirements specs in the config, e.g.
    # validation_plugin: 'my-plugin-package==0.0.1;default'
    plugin = None

    for entry_point in iter_entry_points("planetmint.validation", name):
        plugin = entry_point.load()

    # No matching entry_point found
    if not plugin:
        raise ResolutionError("No plugin found in group `planetmint.validation` with name `{}`".format(name))

    # Is this strictness desirable?
    # It will probably reduce developer headaches in the wild.
    if not issubclass(plugin, (BaseValidationRules,)):
        raise TypeError(
            'object of type "{}" does not implement `planetmint.'
            "validation.BaseValidationRules`".format(type(plugin))
        )

    return plugin
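The `issubclass` guard above is the "strictness" the comment refers to: whatever the entry point loads must inherit from `BaseValidationRules`. A sketch with stand-in classes (no real entry points are queried here):

```python
class BaseValidationRules:  # stand-in for planetmint.validation.BaseValidationRules
    pass


class GoodPlugin(BaseValidationRules):
    pass


class BadPlugin:
    pass


def check_plugin(plugin):
    # mirrors the strictness check in load_validation_plugin
    if not issubclass(plugin, (BaseValidationRules,)):
        raise TypeError(
            'object of type "{}" does not implement `planetmint.'
            "validation.BaseValidationRules`".format(type(plugin))
        )
    return plugin


checked = check_plugin(GoodPlugin)
try:
    check_plugin(BadPlugin)
    rejected = False
except TypeError:
    rejected = True
```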
@ -302,7 +302,7 @@ def load_events_plugins(names=None):
        return plugins

    for name in names:
        for entry_point in iter_entry_points("planetmint.events", name):
            plugins.append((name, entry_point.load()))

    return plugins

View File

@ -18,12 +18,11 @@ from tendermint.abci.types_pb2 import (
    ResponseDeliverTx,
    ResponseBeginBlock,
    ResponseEndBlock,
    ResponseCommit,
)

from planetmint import Planetmint
from planetmint.transactions.types.elections.election import Election
from planetmint.tendermint_utils import decode_transaction, calculate_hash
from planetmint.lib import Block
import planetmint.upsert_validator.validator_utils as vutils
from planetmint.events import EventTypes, Event
@ -42,40 +41,41 @@ class App(BaseApplication):
    def __init__(self, planetmint_node=None, events_queue=None):
        # super().__init__(abci)
        logger.debug("Checking values of types")
        logger.debug(dir(types_pb2))
        self.events_queue = events_queue
        self.planetmint_node = planetmint_node or Planetmint()
        self.block_txn_ids = []
        self.block_txn_hash = ""
        self.block_transactions = []
        self.validators = None
        self.new_height = None
        self.chain = self.planetmint_node.get_latest_abci_chain()

    def log_abci_migration_error(self, chain_id, validators):
        logger.error(
            "An ABCI chain migration is in process. "
            "Download the new ABCI client and configure it with "
            f"chain_id={chain_id} and validators={validators}."
        )

    def abort_if_abci_chain_is_not_synced(self):
        if self.chain is None or self.chain["is_synced"]:
            return
        validators = self.planetmint_node.get_validators()
        self.log_abci_migration_error(self.chain["chain_id"], validators)
        sys.exit(1)

    def init_chain(self, genesis):
        """Initialize chain upon genesis or a migration"""
        app_hash = ""
        height = 0
        known_chain = self.planetmint_node.get_latest_abci_chain()
        if known_chain is not None:
            chain_id = known_chain["chain_id"]

            if known_chain["is_synced"]:
                msg = f"Got invalid InitChain ABCI request ({genesis}) - " f"the chain {chain_id} is already synced."
                logger.error(msg)
                sys.exit(1)

            if chain_id != genesis.chain_id:
@ -84,22 +84,19 @@ class App(BaseApplication):
                sys.exit(1)

            # set migration values for app hash and height
            block = self.planetmint_node.get_latest_block()
            app_hash = "" if block is None else block["app_hash"]
            height = 0 if block is None else block["height"] + 1

        known_validators = self.planetmint_node.get_validators()
        validator_set = [vutils.decode_validator(v) for v in genesis.validators]

        if known_validators and known_validators != validator_set:
            self.log_abci_migration_error(known_chain["chain_id"], known_validators)
            sys.exit(1)

        block = Block(app_hash=app_hash, height=height, transactions=[])
        self.planetmint_node.store_block(block._asdict())
        self.planetmint_node.store_validator_set(height + 1, validator_set)

        abci_chain_height = 0 if known_chain is None else known_chain["height"]
        self.planetmint_node.store_abci_chain(abci_chain_height, genesis.chain_id, True)
        self.chain = {"height": abci_chain_height, "is_synced": True, "chain_id": genesis.chain_id}
        return ResponseInitChain()

    def info(self, request):
@ -118,12 +115,12 @@ class App(BaseApplication):
        r = ResponseInfo()
        block = self.planetmint_node.get_latest_block()
        if block:
            chain_shift = 0 if self.chain is None else self.chain["height"]
            r.last_block_height = block["height"] - chain_shift
            r.last_block_app_hash = block["app_hash"].encode("utf-8")
        else:
            r.last_block_height = 0
            r.last_block_app_hash = b""
        return r

    def check_tx(self, raw_transaction):
@ -136,13 +133,13 @@ class App(BaseApplication):
        self.abort_if_abci_chain_is_not_synced()

        logger.debug("check_tx: %s", raw_transaction)
        transaction = decode_transaction(raw_transaction)
        if self.planetmint_node.is_valid_transaction(transaction):
            logger.debug("check_tx: VALID")
            return ResponseCheckTx(code=OkCode)
        else:
            logger.debug("check_tx: INVALID")
            return ResponseCheckTx(code=CodeTypeError)
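The `check_tx` flow above — decode the raw bytes, validate, answer with an OK or error code — can be sketched outside the ABCI machinery. This is a plausible sketch only: the real wire format lives in `planetmint.tendermint_utils.decode_transaction`, and UTF-8 JSON plus a required `"id"` field are assumptions made here for illustration:

```python
import json


def decode_transaction(raw):
    # assumption: the raw ABCI payload is a UTF-8 JSON document
    return json.loads(raw.decode("utf8"))


def is_valid_transaction(tx):
    # stand-in validity rule: a transaction needs an "id" field
    return isinstance(tx, dict) and "id" in tx


def check_tx(raw_transaction):
    # mirrors the OkCode / CodeTypeError split in App.check_tx
    tx = decode_transaction(raw_transaction)
    return "OK" if is_valid_transaction(tx) else "ERROR"
```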
    def begin_block(self, req_begin_block):
@ -153,10 +150,9 @@ class App(BaseApplication):
        """
        self.abort_if_abci_chain_is_not_synced()

        chain_shift = 0 if self.chain is None else self.chain["height"]
        # req_begin_block.header.num_txs not found, so removing it.
        logger.debug("BEGIN BLOCK, height:%s", req_begin_block.header.height + chain_shift)

        self.block_txn_ids = []
        self.block_transactions = []
@ -171,15 +167,16 @@ class App(BaseApplication):
        self.abort_if_abci_chain_is_not_synced()

        logger.debug("deliver_tx: %s", raw_transaction)
        transaction = self.planetmint_node.is_valid_transaction(
            decode_transaction(raw_transaction), self.block_transactions
        )

        if not transaction:
            logger.debug("deliver_tx: INVALID")
            return ResponseDeliverTx(code=CodeTypeError)
        else:
            logger.debug("storing tx")
            self.block_txn_ids.append(transaction.id)
            self.block_transactions.append(transaction)
            return ResponseDeliverTx(code=OkCode)
@ -194,28 +191,25 @@ class App(BaseApplication):
        self.abort_if_abci_chain_is_not_synced()

        chain_shift = 0 if self.chain is None else self.chain["height"]
        height = request_end_block.height + chain_shift
        self.new_height = height

        # store pre-commit state to recover in case there is a crash during
        # `end_block` or `commit`
        logger.debug(f"Updating pre-commit state: {self.new_height}")
        pre_commit_state = dict(height=self.new_height, transactions=self.block_txn_ids)
        self.planetmint_node.store_pre_commit_state(pre_commit_state)

        block_txn_hash = calculate_hash(self.block_txn_ids)
        block = self.planetmint_node.get_latest_block()

        if self.block_txn_ids:
            self.block_txn_hash = calculate_hash([block["app_hash"], block_txn_hash])
        else:
            self.block_txn_hash = block["app_hash"]

        validator_update = Election.process_block(self.planetmint_node, self.new_height, self.block_transactions)

        return ResponseEndBlock(validator_updates=validator_update)
@@ -224,29 +218,29 @@ class App(BaseApplication):
        self.abort_if_abci_chain_is_not_synced()

        data = self.block_txn_hash.encode("utf-8")

        # register a new block only when new transactions are received
        if self.block_txn_ids:
            self.planetmint_node.store_bulk_transactions(self.block_transactions)

            block = Block(app_hash=self.block_txn_hash, height=self.new_height, transactions=self.block_txn_ids)
            # NOTE: storing the block should be the last operation during commit
            # this affects crash recovery. Refer BEP#8 for details
            self.planetmint_node.store_block(block._asdict())

        logger.debug(
            "Commit-ing new block with hash: apphash=%s, height=%s, txn ids=%s",
            data,
            self.new_height,
            self.block_txn_ids,
        )

        if self.events_queue:
            event = Event(
                EventTypes.BLOCK_VALID,
                {"height": self.new_height, "hash": self.block_txn_hash, "transactions": self.block_transactions},
            )
            self.events_queue.put(event)

        return ResponseCommit(data=data)
@@ -266,10 +260,10 @@ def rollback(b):
    latest_block = b.get_latest_block()
    if latest_block is None:
        logger.error("Found precommit state but no blocks!")
        sys.exit(1)

    # NOTE: the pre-commit state is always at most 1 block ahead of the committed state
    if latest_block["height"] < pre_commit["height"]:
        Election.rollback(b, pre_commit["height"], pre_commit["transactions"])
        b.delete_transactions(pre_commit["transactions"])


@@ -8,7 +8,7 @@ from collections import defaultdict
from multiprocessing import Queue

POISON_PILL = "POISON_PILL"


class EventTypes:
@@ -73,7 +73,7 @@ class Exchange:
        try:
            self.started_queue.get(timeout=1)
            raise RuntimeError("Cannot create a new subscriber queue while Exchange is running.")
        except Empty:
            pass
@@ -99,7 +99,7 @@ class Exchange:
    def run(self):
        """Start the exchange"""
        self.started_queue.put("STARTED")

        while True:
            event = self.publisher_queue.get()


@@ -8,7 +8,7 @@ from planetmint.backend import query
from planetmint.transactions.common.transaction import TransactionLink


class FastQuery:
    """Database queries that join on block results from a single node."""

    def __init__(self, connection):
@@ -17,11 +17,12 @@ class FastQuery():
    def get_outputs_by_public_key(self, public_key):
        """Get outputs for a public key"""
        txs = list(query.get_owned_ids(self.connection, public_key))
        return [
            TransactionLink(tx["id"], index)
            for tx in txs
            for index, output in enumerate(tx["outputs"])
            if condition_details_has_owner(output["condition"]["details"], public_key)
        ]

    def filter_spent_outputs(self, outputs):
        """Remove outputs that have been spent
@@ -31,9 +32,7 @@ class FastQuery():
        """
        links = [o.to_dict() for o in outputs]
        txs = list(query.get_spending_transactions(self.connection, links))
        spends = {TransactionLink.from_dict(input_["fulfills"]) for tx in txs for input_ in tx["inputs"]}
        return [ff for ff in outputs if ff not in spends]

    def filter_unspent_outputs(self, outputs):
@@ -44,7 +43,5 @@ class FastQuery():
        """
        links = [o.to_dict() for o in outputs]
        txs = list(query.get_spending_transactions(self.connection, links))
        spends = {TransactionLink.from_dict(input_["fulfills"]) for tx in txs for input_ in tx["inputs"]}
        return [ff for ff in outputs if ff in spends]
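The set-difference idea behind `filter_spent_outputs` / `filter_unspent_outputs` can be illustrated with plain tuples standing in for `TransactionLink` (the data below is made up):

```python
# outputs owned by a key, as (txid, output index) pairs
outputs = [("tx_a", 0), ("tx_a", 1), ("tx_b", 0)]
# the outputs referenced by inputs of spending transactions
spends = {("tx_a", 0)}

unspent = [o for o in outputs if o not in spends]  # mirrors filter_spent_outputs
spent = [o for o in outputs if o in spends]        # mirrors filter_unspent_outputs
```

Building `spends` as a set keeps the membership test O(1) per output, which is why both methods collect the spending links into a set comprehension first.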


@@ -25,10 +25,12 @@ import planetmint
from planetmint.config import Config
from planetmint import backend, config_utils, fastquery
from planetmint.models import Transaction
from planetmint.transactions.common.exceptions import SchemaValidationError, ValidationError, DoubleSpend
from planetmint.transactions.common.transaction_mode_types import (
    BROADCAST_TX_COMMIT,
    BROADCAST_TX_ASYNC,
    BROADCAST_TX_SYNC,
)
from planetmint.tendermint_utils import encode_transaction, merkleroot
from planetmint import exceptions as core_exceptions
from planetmint.validation import BaseValidationRules
@@ -60,14 +62,12 @@ class Planetmint(object):
        """
        config_utils.autoconfigure()
        self.mode_commit = BROADCAST_TX_COMMIT
        self.mode_list = (BROADCAST_TX_ASYNC, BROADCAST_TX_SYNC, self.mode_commit)
        self.tendermint_host = Config().get()["tendermint"]["host"]
        self.tendermint_port = Config().get()["tendermint"]["port"]
        self.endpoint = "http://{}:{}/".format(self.tendermint_host, self.tendermint_port)

        validationPlugin = Config().get().get("validation_plugin")

        if validationPlugin:
            self.validation = config_utils.load_validation_plugin(validationPlugin)
@@ -78,16 +78,10 @@ class Planetmint(object):
    def post_transaction(self, transaction, mode):
        """Submit a valid transaction to the mempool."""
        if not mode or mode not in self.mode_list:
            raise ValidationError("Mode must be one of the following {}.".format(", ".join(self.mode_list)))

        tx_dict = transaction.tx_dict if transaction.tx_dict else transaction.to_dict()
        payload = {"method": mode, "jsonrpc": "2.0", "params": [encode_transaction(tx_dict)], "id": str(uuid4())}

        # TODO: handle connection errors!
        return requests.post(self.endpoint, json=payload)
@@ -100,29 +94,29 @@ class Planetmint(object):
    def _process_post_response(self, response, mode):
        logger.debug(response)

        error = response.get("error")
        if error:
            status_code = 500
            message = error.get("message", "Internal Error")
            data = error.get("data", "")

            if "Tx already exists in cache" in data:
                status_code = 400

            return (status_code, message + " - " + data)

        result = response["result"]
        if mode == self.mode_commit:
            check_tx_code = result.get("check_tx", {}).get("code", 0)
            deliver_tx_code = result.get("deliver_tx", {}).get("code", 0)
            error_code = check_tx_code or deliver_tx_code
        else:
            error_code = result.get("code", 0)

        if error_code:
            return (500, "Transaction validation failed")

        return (202, "")
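The branching above can be exercised with a condensed stand-alone version. The response dicts below are fabricated for illustration, and `process_post_response` is a sketch of the method, not the production code:

```python
def process_post_response(response, mode, mode_commit="broadcast_tx_commit"):
    # condensed sketch of Planetmint._process_post_response
    error = response.get("error")
    if error:
        status_code = 500
        message = error.get("message", "Internal Error")
        data = error.get("data", "")
        if "Tx already exists in cache" in data:
            status_code = 400  # a duplicate submission is a client error
        return (status_code, message + " - " + data)

    result = response["result"]
    if mode == mode_commit:
        # commit mode reports per-phase codes; any non-zero code is an error
        error_code = result.get("check_tx", {}).get("code", 0) or result.get("deliver_tx", {}).get("code", 0)
    else:
        error_code = result.get("code", 0)
    return (500, "Transaction validation failed") if error_code else (202, "")
```

A successful async/sync broadcast maps to `(202, "")`, while a duplicate-in-cache error is downgraded from 500 to 400.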
    def store_bulk_transactions(self, transactions):
        txns = []
@@ -132,18 +126,20 @@ class Planetmint(object):
        for t in transactions:
            transaction = t.tx_dict if t.tx_dict else rapidjson.loads(rapidjson.dumps(t.to_dict()))

            asset = transaction.pop("asset")
            metadata = transaction.pop("metadata")

            asset = backend.convert.prepare_asset(
                self.connection,
                transaction_type=transaction["operation"],
                transaction_id=transaction["id"],
                filter_operation=t.CREATE,
                asset=asset,
            )

            metadata = backend.convert.prepare_metadata(
                self.connection, transaction_id=transaction["id"], metadata=metadata
            )

            txn_metadatas.append(metadata)
            assets.append(asset)
@@ -167,14 +163,10 @@ class Planetmint(object):
        transaction incoming into the system for which the UTXO
        set needs to be updated.
        """
        spent_outputs = [spent_output for spent_output in transaction.spent_outputs]
        if spent_outputs:
            self.delete_unspent_outputs(*spent_outputs)
        self.store_unspent_outputs(*[utxo._asdict() for utxo in transaction.unspent_outputs])

    def store_unspent_outputs(self, *unspent_outputs):
        """Store the given ``unspent_outputs`` (utxos).
@@ -184,8 +176,7 @@ class Planetmint(object):
            length tuple or list of unspent outputs.
        """
        if unspent_outputs:
            return backend.query.store_unspent_outputs(self.connection, *unspent_outputs)

    def get_utxoset_merkle_root(self):
        """Returns the merkle root of the utxoset. This implies that
@@ -214,9 +205,7 @@ class Planetmint(object):
        # TODO Once ready, use the already pre-computed utxo_hash field.
        # See common/transactions.py for details.
        hashes = [
            sha3_256("{}{}".format(utxo["transaction_id"], utxo["output_index"]).encode()).digest() for utxo in utxoset
        ]
        # TODO Notice the sorted call!
        return merkleroot(sorted(hashes))
@@ -238,8 +227,7 @@ class Planetmint(object):
            length tuple or list of unspent outputs.
        """
        if unspent_outputs:
            return backend.query.delete_unspent_outputs(self.connection, *unspent_outputs)

    def is_committed(self, transaction_id):
        transaction = backend.query.get_transaction(self.connection, transaction_id)
@@ -251,14 +239,14 @@ class Planetmint(object):
        asset = backend.query.get_asset(self.connection, transaction_id)
        metadata = backend.query.get_metadata(self.connection, [transaction_id])

        if asset:
            transaction["asset"] = asset

        if "metadata" not in transaction:
            metadata = metadata[0] if metadata else None
            if metadata:
                metadata = metadata.get("metadata")

            transaction.update({"metadata": metadata})

        transaction = Transaction.from_dict(transaction)
@@ -268,10 +256,8 @@ class Planetmint(object):
        return backend.query.get_transactions(self.connection, txn_ids)

    def get_transactions_filtered(self, asset_id, operation=None, last_tx=None):
        """Get a list of transactions filtered on some criteria"""
        txids = backend.query.get_txids_filtered(self.connection, asset_id, operation, last_tx)
        for txid in txids:
            yield self.get_transaction(txid)
@@ -297,27 +283,24 @@ class Planetmint(object):
        return self.fastquery.filter_spent_outputs(outputs)

    def get_spent(self, txid, output, current_transactions=[]):
        transactions = backend.query.get_spent(self.connection, txid, output)
        transactions = list(transactions) if transactions else []
        if len(transactions) > 1:
            raise core_exceptions.CriticalDoubleSpend(
                "`{}` was spent more than once. There is a problem with the chain".format(txid)
            )

        current_spent_transactions = []
        for ctxn in current_transactions:
            for ctxn_input in ctxn.inputs:
                if ctxn_input.fulfills and ctxn_input.fulfills.txid == txid and ctxn_input.fulfills.output == output:
                    current_spent_transactions.append(ctxn)

        transaction = None
        if len(transactions) + len(current_spent_transactions) > 1:
            raise DoubleSpend('tx "{}" spends inputs twice'.format(txid))
        elif transactions:
            transaction = backend.query.get_transactions(self.connection, [transactions[0]["id"]])
            transaction = Transaction.from_dict(transaction[0])
        elif current_spent_transactions:
            transaction = current_spent_transactions[0]
@@ -346,17 +329,16 @@ class Planetmint(object):
        block = backend.query.get_block(self.connection, block_id)
        latest_block = self.get_latest_block()
        latest_block_height = latest_block["height"] if latest_block else 0

        if not block and block_id > latest_block_height:
            return

        result = {"height": block_id, "transactions": []}

        if block:
            transactions = backend.query.get_transactions(self.connection, block["transactions"])
            result["transactions"] = [t.to_dict() for t in Transaction.from_db(self, transactions)]

        return result
@@ -372,9 +354,9 @@ class Planetmint(object):
        """
        blocks = list(backend.query.get_block_with_transaction(self.connection, txid))
        if len(blocks) > 1:
            logger.critical("Transaction id %s exists in multiple blocks", txid)

        return [block["height"] for block in blocks]

    def validate_transaction(self, tx, current_transactions=[]):
        """Validate a transaction against the current status of the database."""
@@ -388,10 +370,10 @@ class Planetmint(object):
        try:
            transaction = Transaction.from_dict(tx)
        except SchemaValidationError as e:
            logger.warning("Invalid transaction schema: %s", e.__cause__.message)
            return False
        except ValidationError as e:
            logger.warning("Invalid transaction (%s): %s", type(e).__name__, e)
            return False
        return transaction.validate(self, current_transactions)
@@ -401,10 +383,10 @@ class Planetmint(object):
        try:
            return self.validate_transaction(tx, current_transactions)
        except ValidationError as e:
            logger.warning("Invalid transaction (%s): %s", type(e).__name__, e)
            return False

    def text_search(self, search, *, limit=0, table="assets"):
        """Return an iterator of assets that match the text search

        Args:
@@ -414,8 +396,7 @@ class Planetmint(object):
        Returns:
            iter: An iterator of assets that match the text search.
        """
        return backend.query.text_search(self.connection, search, limit=limit, table=table)

    def get_assets(self, asset_ids):
        """Return a list of assets that match the asset_ids
@@ -450,7 +431,7 @@ class Planetmint(object):
    def get_validators(self, height=None):
        result = self.get_validator_change(height)
        return [] if result is None else result["validators"]

    def get_election(self, election_id):
        return backend.query.get_election(self.connection, election_id)
@@ -466,15 +447,13 @@ class Planetmint(object):
        NOTE: If the validator set already exists at that `height` then an
        exception will be raised.
        """
        return backend.query.store_validator_set(self.connection, {"height": height, "validators": validators})

    def delete_validator_set(self, height):
        return backend.query.delete_validator_set(self.connection, height)

    def store_abci_chain(self, height, chain_id, is_synced=True):
        return backend.query.store_abci_chain(self.connection, height, chain_id, is_synced)

    def delete_abci_chain(self, height):
        return backend.query.delete_abci_chain(self.connection, height)
@@ -499,16 +478,15 @@ class Planetmint(object):
        block = self.get_latest_block()

        suffix = "-migrated-at-height-"
        chain_id = latest_chain["chain_id"]
        block_height_str = str(block["height"])
        new_chain_id = chain_id.split(suffix)[0] + suffix + block_height_str

        self.store_abci_chain(block["height"] + 1, new_chain_id, False)

    def store_election(self, election_id, height, is_concluded):
        return backend.query.store_election(self.connection, election_id, height, is_concluded)

    def store_elections(self, elections):
        return backend.query.store_elections(self.connection, elections)
@@ -517,4 +495,4 @@ class Planetmint(object):
        return backend.query.delete_elections(self.connection, height)


Block = namedtuple("Block", ("app_hash", "height", "transactions"))


@@ -11,11 +11,12 @@ from logging.config import dictConfig as set_logging_config
from planetmint.config import Config, DEFAULT_LOGGING_CONFIG
import os


def _normalize_log_level(level):
    try:
        return level.upper()
    except AttributeError as exc:
        raise ConfigurationError("Log level must be a string!") from exc


def setup_logging():
@@ -32,47 +33,47 @@ def setup_logging():
    """
    logging_configs = DEFAULT_LOGGING_CONFIG
    new_logging_configs = Config().get()["log"]

    if "file" in new_logging_configs:
        filename = new_logging_configs["file"]
        logging_configs["handlers"]["file"]["filename"] = filename

    if "error_file" in new_logging_configs:
        error_filename = new_logging_configs["error_file"]
        logging_configs["handlers"]["errors"]["filename"] = error_filename

    if "level_console" in new_logging_configs:
        level = _normalize_log_level(new_logging_configs["level_console"])
        logging_configs["handlers"]["console"]["level"] = level

    if "level_logfile" in new_logging_configs:
        level = _normalize_log_level(new_logging_configs["level_logfile"])
        logging_configs["handlers"]["file"]["level"] = level

    if "fmt_console" in new_logging_configs:
        fmt = new_logging_configs["fmt_console"]
        logging_configs["formatters"]["console"]["format"] = fmt

    if "fmt_logfile" in new_logging_configs:
        fmt = new_logging_configs["fmt_logfile"]
        logging_configs["formatters"]["file"]["format"] = fmt

    if "datefmt_console" in new_logging_configs:
        fmt = new_logging_configs["datefmt_console"]
        logging_configs["formatters"]["console"]["datefmt"] = fmt

    if "datefmt_logfile" in new_logging_configs:
        fmt = new_logging_configs["datefmt_logfile"]
        logging_configs["formatters"]["file"]["datefmt"] = fmt

    log_levels = new_logging_configs.get("granular_levels", {})

    for logger_name, level in log_levels.items():
        level = _normalize_log_level(level)
        try:
            logging_configs["loggers"][logger_name]["level"] = level
        except KeyError:
            logging_configs["loggers"][logger_name] = {"level": level}

    set_logging_config(logging_configs)
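The override pattern in `setup_logging` can be exercised with the stdlib alone. The config dicts below are made-up minimal stand-ins for `DEFAULT_LOGGING_CONFIG` and the user's `log` section:

```python
import logging
from logging.config import dictConfig

# minimal stand-in for DEFAULT_LOGGING_CONFIG
logging_configs = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {"console": {"class": "logging.StreamHandler", "level": "INFO"}},
    "loggers": {},
    "root": {"handlers": ["console"], "level": "INFO"},
}
# stand-in for the user's "log" config section
new_logging_configs = {"level_console": "debug", "granular_levels": {"planetmint.core": "warning"}}

# same override mechanics as setup_logging above
if "level_console" in new_logging_configs:
    logging_configs["handlers"]["console"]["level"] = new_logging_configs["level_console"].upper()

for logger_name, level in new_logging_configs.get("granular_levels", {}).items():
    logging_configs["loggers"][logger_name] = {"level": level.upper()}

dictConfig(logging_configs)
```

After `dictConfig` runs, the granular override takes effect per logger, so `logging.getLogger("planetmint.core")` is set to WARNING while everything else stays at the defaults.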


@@ -4,16 +4,16 @@
# Code is Apache-2.0 and docs are CC-BY-4.0

from planetmint.backend.schema import validate_language_key
from planetmint.transactions.common.exceptions import InvalidSignature, DuplicateTransaction
from planetmint.transactions.common.schema import validate_transaction_schema
from planetmint.transactions.common.transaction import Transaction
from planetmint.transactions.common.utils import validate_txn_obj, validate_key


class Transaction(Transaction):
    ASSET = "asset"
    METADATA = "metadata"
    DATA = "data"

    def validate(self, planet, current_transactions=[]):
        """Validate transaction spend
@@ -31,11 +31,10 @@ class Transaction(Transaction):
        if self.operation == Transaction.CREATE:
            duplicates = any(txn for txn in current_transactions if txn.id == self.id)
            if planet.is_committed(self.id) or duplicates:
                raise DuplicateTransaction("transaction `{}` already exists".format(self.id))

            if not self.inputs_valid(input_conditions):
                raise InvalidSignature("Transaction signature is invalid.")

        elif self.operation == Transaction.TRANSFER:
            self.validate_transfer_inputs(planet, current_transactions)
@@ -68,7 +67,7 @@ class FastTransaction:
    @property
    def id(self):
        return self.data["id"]

    def to_dict(self):
        return self.data


@@ -39,8 +39,8 @@ class ParallelValidationApp(App):
        return super().end_block(request_end_block)


RESET = "reset"
EXIT = "exit"


class ParallelValidator:
@@ -64,7 +64,7 @@ class ParallelValidator:
    def validate(self, raw_transaction):
        dict_transaction = decode_transaction(raw_transaction)
        index = int(dict_transaction["id"], 16) % self.number_of_workers
        self.routing_queues[index].put((self.transaction_index, dict_transaction))
        self.transaction_index += 1
@ -105,13 +105,11 @@ class ValidationWorker:
def validate(self, dict_transaction): def validate(self, dict_transaction):
try: try:
asset_id = dict_transaction['asset']['id'] asset_id = dict_transaction["asset"]["id"]
except KeyError: except KeyError:
asset_id = dict_transaction['id'] asset_id = dict_transaction["id"]
transaction = self.planetmint.is_valid_transaction( transaction = self.planetmint.is_valid_transaction(dict_transaction, self.validated_transactions[asset_id])
dict_transaction,
self.validated_transactions[asset_id])
if transaction: if transaction:
self.validated_transactions[asset_id].append(transaction) self.validated_transactions[asset_id].append(transaction)
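The routing rule in this hunk can be sketched standalone (assumption: plain lists stand in for the multiprocessing queues). Reducing the hex transaction id modulo the worker count pins transactions with the same id to the same worker, so each worker sees a consistent history:

```python
NUMBER_OF_WORKERS = 4
routing_queues = [[] for _ in range(NUMBER_OF_WORKERS)]

def route(transaction_index, dict_transaction):
    """Enqueue a decoded transaction and return the worker index chosen for it."""
    index = int(dict_transaction["id"], 16) % NUMBER_OF_WORKERS
    routing_queues[index].append((transaction_index, dict_transaction))
    return index

# The same id always maps to the same queue.
assert route(0, {"id": "ff"}) == route(1, {"id": "ff"})
```

Any id-derived integer would do here; the hex parse simply reuses the id string the transaction already carries.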

View File

@@ -40,13 +40,12 @@ def start(args):
    exchange = Exchange()
    # start the web api
    app_server = server.create_server(
-        settings=Config().get()['server'],
-        log_config=Config().get()['log'],
-        planetmint_factory=Planetmint)
-    p_webapi = Process(name='planetmint_webapi', target=app_server.run, daemon=True)
+        settings=Config().get()["server"], log_config=Config().get()["log"], planetmint_factory=Planetmint
+    )
+    p_webapi = Process(name="planetmint_webapi", target=app_server.run, daemon=True)
    p_webapi.start()

-    logger.info(BANNER.format(Config().get()['server']['bind']))
+    logger.info(BANNER.format(Config().get()["server"]["bind"]))

    # start websocket server
    p_websocket_server = Process(

View File

@@ -17,28 +17,28 @@ except ImportError:

def encode_transaction(value):
    """Encode a transaction (dict) to Base64."""
-    return base64.b64encode(json.dumps(value).encode('utf8')).decode('utf8')
+    return base64.b64encode(json.dumps(value).encode("utf8")).decode("utf8")


def decode_transaction(raw):
    """Decode a transaction from bytes to a dict."""
-    return json.loads(raw.decode('utf8'))
+    return json.loads(raw.decode("utf8"))


def decode_transaction_base64(value):
    """Decode a transaction from Base64."""
-    return json.loads(base64.b64decode(value.encode('utf8')).decode('utf8'))
+    return json.loads(base64.b64decode(value.encode("utf8")).decode("utf8"))
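These helpers are pure stdlib, so the quote change is easy to sanity-check with a round trip; the two functions below are lifted verbatim from the hunk above:

```python
import base64
import json

def encode_transaction(value):
    """Encode a transaction (dict) to Base64."""
    return base64.b64encode(json.dumps(value).encode("utf8")).decode("utf8")

def decode_transaction_base64(value):
    """Decode a Base64 payload back into a transaction dict."""
    return json.loads(base64.b64decode(value.encode("utf8")).decode("utf8"))

tx = {"id": "abc123", "operation": "CREATE"}
encoded = encode_transaction(tx)  # JSON text wrapped in Base64
```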

def calculate_hash(key_list):
    if not key_list:
-        return ''
+        return ""

    full_hash = sha3_256()
    for key in key_list:
-        full_hash.update(key.encode('utf8'))
+        full_hash.update(key.encode("utf8"))

    return full_hash.hexdigest()
@@ -59,16 +59,13 @@ def merkleroot(hashes):
    # i.e. an empty list, then the hash of the empty string is returned.
    # This seems too easy but maybe that is good enough? TO REVIEW!
    if not hashes:
-        return sha3_256(b'').hexdigest()
+        return sha3_256(b"").hexdigest()
    # XXX END TEMPORARY -- MUST REVIEW ...
    if len(hashes) == 1:
        return hexlify(hashes[0]).decode()
    if len(hashes) % 2 == 1:
        hashes.append(hashes[-1])
-    parent_hashes = [
-        sha3_256(hashes[i] + hashes[i + 1]).digest()
-        for i in range(0, len(hashes) - 1, 2)
-    ]
+    parent_hashes = [sha3_256(hashes[i] + hashes[i + 1]).digest() for i in range(0, len(hashes) - 1, 2)]
    return merkleroot(parent_hashes)
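The collapsed comprehension is behavior-preserving; pulled out as a self-contained function (new side of the hunk, stdlib only), the recursion can be exercised directly:

```python
from binascii import hexlify
from hashlib import sha3_256

def merkleroot(hashes):
    """Compute the Merkle root of a list of byte digests, duplicating the last leaf on odd counts."""
    if not hashes:
        return sha3_256(b"").hexdigest()
    if len(hashes) == 1:
        return hexlify(hashes[0]).decode()
    if len(hashes) % 2 == 1:
        hashes.append(hashes[-1])
    parent_hashes = [sha3_256(hashes[i] + hashes[i + 1]).digest() for i in range(0, len(hashes) - 1, 2)]
    return merkleroot(parent_hashes)

leaves = [sha3_256(name.encode()).digest() for name in ("tx1", "tx2", "tx3")]
root = merkleroot(leaves)  # a 64-character hex digest
```

Note the function mutates its argument when padding an odd-length list, which is why the odd and explicitly-duplicated cases agree.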
@@ -76,7 +73,7 @@ def public_key64_to_address(base64_public_key):
    """Note this only compatible with Tendermint 0.19.x"""
    ed25519_public_key = public_key_from_base64(base64_public_key)
    encoded_public_key = amino_encoded_public_key(ed25519_public_key)
-    return hashlib.new('ripemd160', encoded_public_key).hexdigest().upper()
+    return hashlib.new("ripemd160", encoded_public_key).hexdigest().upper()


def public_key_from_base64(base64_public_key):
@@ -93,8 +90,8 @@ def public_key_to_base64(ed25519_public_key):

def key_to_base64(ed25519_key):
    ed25519_key = bytes.fromhex(ed25519_key)
-    return base64.b64encode(ed25519_key).decode('utf-8')
+    return base64.b64encode(ed25519_key).decode("utf-8")


def amino_encoded_public_key(ed25519_public_key):
-    return bytes.fromhex('1624DE6220{}'.format(ed25519_public_key))
+    return bytes.fromhex("1624DE6220{}".format(ed25519_public_key))
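The last two helpers are also stdlib-only and worth a quick illustration: `key_to_base64` is a hex-to-Base64 re-encoding, and `amino_encoded_public_key` prepends Tendermint 0.19.x's fixed amino type-prefix bytes to the raw key:

```python
import base64

def key_to_base64(ed25519_key):
    """Re-encode a hex-encoded key as Base64."""
    return base64.b64encode(bytes.fromhex(ed25519_key)).decode("utf-8")

def amino_encoded_public_key(ed25519_public_key):
    """Prefix a hex public key with the legacy amino type bytes (1624DE6220)."""
    return bytes.fromhex("1624DE6220{}".format(ed25519_public_key))

encoded = key_to_base64("00ff")  # Base64 of the two bytes 0x00 0xff
```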

View File

@@ -14,7 +14,7 @@ except ImportError:
    from cryptoconditions import crypto


-CryptoKeypair = namedtuple('CryptoKeypair', ('private_key', 'public_key'))
+CryptoKeypair = namedtuple("CryptoKeypair", ("private_key", "public_key"))


def hash_data(data):
@@ -33,8 +33,7 @@ def generate_key_pair():
    """
    # TODO FOR CC: Adjust interface so that this function becomes unnecessary
-    return CryptoKeypair(
-        *(k.decode() for k in crypto.ed25519_generate_key_pair()))
+    return CryptoKeypair(*(k.decode() for k in crypto.ed25519_generate_key_pair()))


PrivateKey = crypto.Ed25519SigningKey
@@ -43,13 +42,15 @@ PublicKey = crypto.Ed25519VerifyingKey

def key_pair_from_ed25519_key(hex_private_key):
    """Generate base58 encode public-private key pair from a hex encoded private key"""
-    priv_key = crypto.Ed25519SigningKey(bytes.fromhex(hex_private_key)[:32], encoding='bytes')
+    priv_key = crypto.Ed25519SigningKey(bytes.fromhex(hex_private_key)[:32], encoding="bytes")
    public_key = priv_key.get_verifying_key()
-    return CryptoKeypair(private_key=priv_key.encode(encoding='base58').decode('utf-8'),
-                         public_key=public_key.encode(encoding='base58').decode('utf-8'))
+    return CryptoKeypair(
+        private_key=priv_key.encode(encoding="base58").decode("utf-8"),
+        public_key=public_key.encode(encoding="base58").decode("utf-8"),
+    )


def public_key_from_ed25519_key(hex_public_key):
    """Generate base58 public key from hex encoded public key"""
-    public_key = crypto.Ed25519VerifyingKey(bytes.fromhex(hex_public_key), encoding='bytes')
-    return public_key.encode(encoding='base58').decode('utf-8')
+    public_key = crypto.Ed25519VerifyingKey(bytes.fromhex(hex_public_key), encoding="bytes")
+    return public_key.encode(encoding="base58").decode("utf-8")

View File

@@ -40,9 +40,9 @@ class Input(object):
                of a `TRANSFER` Transaction.
        """
        if fulfills is not None and not isinstance(fulfills, TransactionLink):
-            raise TypeError('`fulfills` must be a TransactionLink instance')
+            raise TypeError("`fulfills` must be a TransactionLink instance")
        if not isinstance(owners_before, list):
-            raise TypeError('`owners_before` must be a list instance')
+            raise TypeError("`owners_before` must be a list instance")

        self.fulfillment = fulfillment
        self.fulfills = fulfills
@@ -79,9 +79,9 @@ class Input(object):
            fulfills = None

        input_ = {
-            'owners_before': self.owners_before,
-            'fulfills': fulfills,
-            'fulfillment': fulfillment,
+            "owners_before": self.owners_before,
+            "fulfills": fulfills,
+            "fulfillment": fulfillment,
        }
        return input_
@@ -110,10 +110,10 @@ class Input(object):
        Raises:
            InvalidSignature: If an Input's URI couldn't be parsed.
        """
-        fulfillment = data['fulfillment']
+        fulfillment = data["fulfillment"]
        if not isinstance(fulfillment, (Fulfillment, type(None))):
            try:
-                fulfillment = Fulfillment.from_uri(data['fulfillment'])
+                fulfillment = Fulfillment.from_uri(data["fulfillment"])
            except ASN1DecodeError:
                # TODO Remove as it is legacy code, and simply fall back on
                # ASN1DecodeError
@@ -121,6 +121,6 @@ class Input(object):
            except TypeError:
                # NOTE: See comment about this special case in
                # `Input.to_dict`
-                fulfillment = _fulfillment_from_details(data['fulfillment'])
-        fulfills = TransactionLink.from_dict(data['fulfills'])
-        return cls(fulfillment, data['owners_before'], fulfills)
+                fulfillment = _fulfillment_from_details(data["fulfillment"])
+        fulfills = TransactionLink.from_dict(data["fulfills"])
+        return cls(fulfillment, data["owners_before"], fulfills)

View File

@@ -5,7 +5,7 @@ from functools import lru_cache

class HDict(dict):
    def __hash__(self):
-        return hash(codecs.decode(self['id'], 'hex'))
+        return hash(codecs.decode(self["id"], "hex"))


@lru_cache(maxsize=16384)
@@ -14,12 +14,11 @@ def from_dict(func, *args, **kwargs):

def memoize_from_dict(func):
    @functools.wraps(func)
    def memoized_func(*args, **kwargs):
        if args[1] is None:
            return None
-        elif args[1].get('id', None):
+        elif args[1].get("id", None):
            args = list(args)
            args[1] = HDict(args[1])
            new_args = tuple(args)
@@ -30,7 +29,7 @@ def memoize_from_dict(func):
    return memoized_func


-class ToDictWrapper():
+class ToDictWrapper:
    def __init__(self, tx):
        self.tx = tx
@@ -47,7 +46,6 @@ def to_dict(func, tx_wrapped):

def memoize_to_dict(func):
    @functools.wraps(func)
    def memoized_func(*args, **kwargs):
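The trick this module relies on, wrapping a plain dict in `HDict` so `lru_cache` can hash it by its hex `id`, can be shown in isolation (assumption: a trivial stand-in replaces the real, expensive `from_dict`):

```python
import codecs
from functools import lru_cache

class HDict(dict):
    """A dict made hashable via its hex `id`, so it can serve as an lru_cache key."""
    def __hash__(self):
        return hash(codecs.decode(self["id"], "hex"))

calls = []

@lru_cache(maxsize=16384)
def from_dict(data):
    """Stand-in for the real deserializer; records each cache miss."""
    calls.append(data["id"])
    return dict(data)

tx = HDict({"id": "ab12", "operation": "CREATE"})
from_dict(tx)
from_dict(tx)  # served from the cache; the body runs only once
```

Hashing the decoded id bytes is enough because transaction ids are content hashes, so equal ids imply equal payloads.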

View File

@@ -19,7 +19,7 @@ logger = logging.getLogger(__name__)

def _load_schema(name, version, path=__file__):
    """Load a schema from disk"""
-    path = os.path.join(os.path.dirname(path), version, name + '.yaml')
+    path = os.path.join(os.path.dirname(path), version, name + ".yaml")
    with open(path) as handle:
        schema = yaml.safe_load(handle)
    fast_schema = rapidjson.Validator(rapidjson.dumps(schema))
@@ -27,22 +27,17 @@ def _load_schema(name, version, path=__file__):

# TODO: make this an env var from a config file
-TX_SCHEMA_VERSION = 'v2.0'
+TX_SCHEMA_VERSION = "v2.0"

-TX_SCHEMA_PATH, TX_SCHEMA_COMMON = _load_schema('transaction',
-                                                TX_SCHEMA_VERSION)
-_, TX_SCHEMA_CREATE = _load_schema('transaction_create',
-                                   TX_SCHEMA_VERSION)
-_, TX_SCHEMA_TRANSFER = _load_schema('transaction_transfer',
-                                     TX_SCHEMA_VERSION)
+TX_SCHEMA_PATH, TX_SCHEMA_COMMON = _load_schema("transaction", TX_SCHEMA_VERSION)
+_, TX_SCHEMA_CREATE = _load_schema("transaction_create", TX_SCHEMA_VERSION)
+_, TX_SCHEMA_TRANSFER = _load_schema("transaction_transfer", TX_SCHEMA_VERSION)

-_, TX_SCHEMA_VALIDATOR_ELECTION = _load_schema('transaction_validator_election',
-                                               TX_SCHEMA_VERSION)
+_, TX_SCHEMA_VALIDATOR_ELECTION = _load_schema("transaction_validator_election", TX_SCHEMA_VERSION)

-_, TX_SCHEMA_CHAIN_MIGRATION_ELECTION = _load_schema('transaction_chain_migration_election',
-                                                     TX_SCHEMA_VERSION)
+_, TX_SCHEMA_CHAIN_MIGRATION_ELECTION = _load_schema("transaction_chain_migration_election", TX_SCHEMA_VERSION)

-_, TX_SCHEMA_VOTE = _load_schema('transaction_vote', TX_SCHEMA_VERSION)
+_, TX_SCHEMA_VOTE = _load_schema("transaction_vote", TX_SCHEMA_VERSION)


def _validate_schema(schema, body):
@@ -66,7 +61,7 @@ def _validate_schema(schema, body):
            jsonschema.validate(body, schema[0])
        except jsonschema.ValidationError as exc2:
            raise SchemaValidationError(str(exc2)) from exc2
-        logger.warning('code problem: jsonschema did not raise an exception, wheras rapidjson raised %s', exc)
+        logger.warning("code problem: jsonschema did not raise an exception, wheras rapidjson raised %s", exc)
        raise SchemaValidationError(str(exc)) from exc
@@ -77,7 +72,7 @@ def validate_transaction_schema(tx):
    transaction. TX_SCHEMA_[TRANSFER|CREATE] add additional constraints on top.
    """
    _validate_schema(TX_SCHEMA_COMMON, tx)
-    if tx['operation'] == 'TRANSFER':
+    if tx["operation"] == "TRANSFER":
        _validate_schema(TX_SCHEMA_TRANSFER, tx)
    else:
        _validate_schema(TX_SCHEMA_CREATE, tx)

View File

@@ -120,26 +120,15 @@ class Transaction(object):
        # Asset payloads for 'CREATE' operations must be None or
        # dicts holding a `data` property. Asset payloads for 'TRANSFER'
        # operations must be dicts holding an `id` property.
-        if (
-            operation == self.CREATE
-            and asset is not None
-            and not (isinstance(asset, dict) and "data" in asset)
-        ):
+        if operation == self.CREATE and asset is not None and not (isinstance(asset, dict) and "data" in asset):
            raise TypeError(
                (
                    "`asset` must be None or a dict holding a `data` "
                    " property instance for '{}' Transactions".format(operation)
                )
            )
-        elif operation == self.TRANSFER and not (
-            isinstance(asset, dict) and "id" in asset
-        ):
-            raise TypeError(
-                (
-                    "`asset` must be a dict holding an `id` property "
-                    "for 'TRANSFER' Transactions"
-                )
-            )
+        elif operation == self.TRANSFER and not (isinstance(asset, dict) and "id" in asset):
+            raise TypeError(("`asset` must be a dict holding an `id` property " "for 'TRANSFER' Transactions"))

        if outputs and not isinstance(outputs, list):
            raise TypeError("`outputs` must be a list instance or None")
@@ -298,10 +287,7 @@ class Transaction(object):
            # to decode to convert the bytestring into a python str
            return public_key.decode()

-        key_pairs = {
-            gen_public_key(PrivateKey(private_key)): PrivateKey(private_key)
-            for private_key in private_keys
-        }
+        key_pairs = {gen_public_key(PrivateKey(private_key)): PrivateKey(private_key) for private_key in private_keys}

        tx_dict = self.to_dict()
        tx_dict = Transaction._remove_signatures(tx_dict)
@@ -336,10 +322,7 @@ class Transaction(object):
        elif isinstance(input_.fulfillment, ZenroomSha256):
            return cls._sign_threshold_signature_fulfillment(input_, message, key_pairs)
        else:
-            raise ValueError(
-                "Fulfillment couldn't be matched to "
-                "Cryptocondition fulfillment type."
-            )
+            raise ValueError("Fulfillment couldn't be matched to " "Cryptocondition fulfillment type.")

    @classmethod
    def _sign_zenroom_fulfillment(cls, input_, message, key_pairs):
@@ -359,20 +342,15 @@ class Transaction(object):
        public_key = input_.owners_before[0]
        message = sha3_256(message.encode())
        if input_.fulfills:
-            message.update(
-                "{}{}".format(input_.fulfills.txid, input_.fulfills.output).encode()
-            )
+            message.update("{}{}".format(input_.fulfills.txid, input_.fulfills.output).encode())

        try:
            # cryptoconditions makes no assumptions of the encoding of the
            # message to sign or verify. It only accepts bytestrings
-            input_.fulfillment.sign(
-                message.digest(), base58.b58decode(key_pairs[public_key].encode())
-            )
+            input_.fulfillment.sign(message.digest(), base58.b58decode(key_pairs[public_key].encode()))
        except KeyError:
            raise KeypairMismatchException(
-                "Public key {} is not a pair to "
-                "any of the private keys".format(public_key)
+                "Public key {} is not a pair to " "any of the private keys".format(public_key)
            )
        return input_
@@ -394,20 +372,15 @@ class Transaction(object):
        public_key = input_.owners_before[0]
        message = sha3_256(message.encode())
        if input_.fulfills:
-            message.update(
-                "{}{}".format(input_.fulfills.txid, input_.fulfills.output).encode()
-            )
+            message.update("{}{}".format(input_.fulfills.txid, input_.fulfills.output).encode())

        try:
            # cryptoconditions makes no assumptions of the encoding of the
            # message to sign or verify. It only accepts bytestrings
-            input_.fulfillment.sign(
-                message.digest(), base58.b58decode(key_pairs[public_key].encode())
-            )
+            input_.fulfillment.sign(message.digest(), base58.b58decode(key_pairs[public_key].encode()))
        except KeyError:
            raise KeypairMismatchException(
-                "Public key {} is not a pair to "
-                "any of the private keys".format(public_key)
+                "Public key {} is not a pair to " "any of the private keys".format(public_key)
            )
        return input_
@@ -424,9 +397,7 @@ class Transaction(object):
        input_ = deepcopy(input_)
        message = sha3_256(message.encode())
        if input_.fulfills:
-            message.update(
-                "{}{}".format(input_.fulfills.txid, input_.fulfills.output).encode()
-            )
+            message.update("{}{}".format(input_.fulfills.txid, input_.fulfills.output).encode())

        for owner_before in set(input_.owners_before):
            # TODO: CC should throw a KeypairMismatchException, instead of
@@ -442,15 +413,13 @@ class Transaction(object):
            subffills = ccffill.get_subcondition_from_vk(base58.b58decode(owner_before))
            if not subffills:
                raise KeypairMismatchException(
-                    "Public key {} cannot be found "
-                    "in the fulfillment".format(owner_before)
+                    "Public key {} cannot be found " "in the fulfillment".format(owner_before)
                )
            try:
                private_key = key_pairs[owner_before]
            except KeyError:
                raise KeypairMismatchException(
-                    "Public key {} is not a pair "
-                    "to any of the private keys".format(owner_before)
+                    "Public key {} is not a pair " "to any of the private keys".format(owner_before)
                )

            # cryptoconditions makes no assumptions of the encoding of the
@@ -483,9 +452,7 @@ class Transaction(object):
            # greatly, as we do not have to check against `None` values.
            return self._inputs_valid(["dummyvalue" for _ in self.inputs])
        elif self.operation == self.TRANSFER:
-            return self._inputs_valid(
-                [output.fulfillment.condition_uri for output in outputs]
-            )
+            return self._inputs_valid([output.fulfillment.condition_uri for output in outputs])
        else:
            allowed_ops = ", ".join(self.__class__.ALLOWED_OPERATIONS)
            raise TypeError("`operation` must be one of {}".format(allowed_ops))
@@ -506,9 +473,7 @@ class Transaction(object):
        """
        if len(self.inputs) != len(output_condition_uris):
-            raise ValueError(
-                "Inputs and " "output_condition_uris must have the same count"
-            )
+            raise ValueError("Inputs and " "output_condition_uris must have the same count")

        tx_dict = self.tx_dict if self.tx_dict else self.to_dict()
        tx_dict = Transaction._remove_signatures(tx_dict)
@@ -517,9 +482,7 @@ class Transaction(object):
        def validate(i, output_condition_uri=None):
            """Validate input against output condition URI"""
-            return self._input_valid(
-                self.inputs[i], self.operation, tx_serialized, output_condition_uri
-            )
+            return self._input_valid(self.inputs[i], self.operation, tx_serialized, output_condition_uri)

        return all(validate(i, cond) for i, cond in enumerate(output_condition_uris))
@@ -574,9 +537,7 @@ class Transaction(object):
        else:
            message = sha3_256(message.encode())
            if input_.fulfills:
-                message.update(
-                    "{}{}".format(input_.fulfills.txid, input_.fulfills.output).encode()
-                )
+                message.update("{}{}".format(input_.fulfills.txid, input_.fulfills.output).encode())

        # NOTE: We pass a timestamp to `.validate`, as in case of a timeout
        # condition we'll have to validate against it
@@ -676,19 +637,11 @@ class Transaction(object):
            transactions = [transactions]

        # create a set of the transactions' asset ids
-        asset_ids = {
-            tx.id if tx.operation == tx.CREATE else tx.asset["id"]
-            for tx in transactions
-        }
+        asset_ids = {tx.id if tx.operation == tx.CREATE else tx.asset["id"] for tx in transactions}

        # check that all the transasctions have the same asset id
        if len(asset_ids) > 1:
-            raise AssetIdMismatch(
-                (
-                    "All inputs of all transactions passed"
-                    " need to have the same asset id"
-                )
-            )
+            raise AssetIdMismatch(("All inputs of all transactions passed" " need to have the same asset id"))
        return asset_ids.pop()

    @staticmethod
@@ -712,10 +665,7 @@ class Transaction(object):
        tx_body_serialized = Transaction._to_str(tx_body)
        valid_tx_id = Transaction._to_hash(tx_body_serialized)

        if proposed_tx_id != valid_tx_id:
-            err_msg = (
-                "The transaction's id '{}' isn't equal to "
-                "the hash of its body, i.e. it's not valid."
-            )
+            err_msg = "The transaction's id '{}' isn't equal to " "the hash of its body, i.e. it's not valid."
            raise InvalidHash(err_msg.format(proposed_tx_id))

    @classmethod
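The id check in the hunk above compares the proposed id against a hash of the serialized body. A hedged sketch of that rule (assumptions: sorted-key JSON stands in for Transaction._to_str and sha3_256 for Transaction._to_hash; the real method also strips signatures before hashing):

```python
import json
from hashlib import sha3_256

def _to_str(tx_body):
    """Deterministic serialization stand-in (sorted keys, compact separators)."""
    return json.dumps(tx_body, sort_keys=True, separators=(",", ":"))

def validate_id(tx_body):
    """Raise if the transaction's `id` is not the hash of its body."""
    proposed_tx_id = tx_body["id"]
    valid_tx_id = sha3_256(_to_str(dict(tx_body, id=None)).encode()).hexdigest()
    if proposed_tx_id != valid_tx_id:
        raise ValueError(
            "The transaction's id '{}' isn't equal to the hash of its body, i.e. it's not valid.".format(proposed_tx_id)
        )

body = {"operation": "CREATE", "id": None}
good_tx = dict(body, id=sha3_256(_to_str(body).encode()).hexdigest())
validate_id(good_tx)  # passes silently
```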
@@ -729,27 +679,25 @@ class Transaction(object):
        Returns:
            :class:`~planetmint.transactions.common.transaction.Transaction`
        """
-        operation = (
-            tx.get("operation", Transaction.CREATE)
-            if isinstance(tx, dict)
-            else Transaction.CREATE
-        )
+        operation = tx.get("operation", Transaction.CREATE) if isinstance(tx, dict) else Transaction.CREATE
        cls = Transaction.resolve_class(operation)

        id = None
        try:
-            id = tx['id']
+            id = tx["id"]
        except KeyError:
            id = None
        # tx['asset'] = tx['asset'][0] if isinstance( tx['asset'], list) or isinstance( tx['asset'], tuple) else tx['asset'],  # noqa: E501
        local_dict = {
-            'inputs': tx['inputs'],
-            'outputs': tx['outputs'],
-            'operation': operation,
-            'metadata': tx['metadata'],
-            'asset': tx['asset'],  # [0] if isinstance( tx['asset'], list) or isinstance( tx['asset'], tuple) else tx['asset'],  # noqa: E501
-            'version': tx['version'],
-            'id': id
+            "inputs": tx["inputs"],
+            "outputs": tx["outputs"],
+            "operation": operation,
+            "metadata": tx["metadata"],
+            "asset": tx[
+                "asset"
+            ],  # [0] if isinstance( tx['asset'], list) or isinstance( tx['asset'], tuple) else tx['asset'],  # noqa: E501
+            "version": tx["version"],
+            "id": id,
        }

        if not skip_schema_validation:
@@ -802,14 +750,14 @@ class Transaction(object):
            if asset is not None:
                # This is tarantool specific behaviour needs to be addressed
                tx = tx_map[asset[1]]
-                tx['asset'] = asset[0]
+                tx["asset"] = asset[0]

        tx_ids = list(tx_map.keys())
        metadata_list = list(planet.get_metadata(tx_ids))
        for metadata in metadata_list:
-            if 'id' in metadata:
-                tx = tx_map[metadata['id']]
-                tx.update({'metadata': metadata.get('metadata')})
+            if "id" in metadata:
+                tx = tx_map[metadata["id"]]
+                tx.update({"metadata": metadata.get("metadata")})

        if return_list:
            tx_list = []
@@ -851,9 +799,7 @@ class Transaction(object):
            if input_tx is None:
                raise InputDoesNotExist("input `{}` doesn't exist".format(input_txid))

-            spent = planet.get_spent(
-                input_txid, input_.fulfills.output, current_transactions
-            )
+            spent = planet.get_spent(input_txid, input_.fulfills.output, current_transactions)
            if spent:
                raise DoubleSpend("input `{}` was already spent".format(input_txid))
@@ -869,27 +815,15 @@ class Transaction(object):
        # validate asset id
        asset_id = self.get_asset_id(input_txs)
        if asset_id != self.asset["id"]:
-            raise AssetIdMismatch(
-                (
-                    "The asset id of the input does not"
-                    " match the asset id of the"
-                    " transaction"
-                )
-            )
+            raise AssetIdMismatch(("The asset id of the input does not" " match the asset id of the" " transaction"))

-        input_amount = sum(
-            [input_condition.amount for input_condition in input_conditions]
-        )
-        output_amount = sum(
-            [output_condition.amount for output_condition in self.outputs]
-        )
+        input_amount = sum([input_condition.amount for input_condition in input_conditions])
+        output_amount = sum([output_condition.amount for output_condition in self.outputs])

        if output_amount != input_amount:
            raise AmountError(
                (
-                    "The amount used in the inputs `{}`"
-                    " needs to be same as the amount used"
-                    " in the outputs `{}`"
+                    "The amount used in the inputs `{}`" " needs to be same as the amount used" " in the outputs `{}`"
                ).format(input_amount, output_amount)
            )
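The TRANSFER amount rule in this hunk reduces to a simple conservation check; a standalone sketch (assumption: bare integer amounts replace the condition objects):

```python
class AmountError(Exception):
    pass

def check_amounts(input_amounts, output_amounts):
    """Inputs and outputs of a TRANSFER must sum to the same amount."""
    input_amount = sum(input_amounts)
    output_amount = sum(output_amounts)
    if output_amount != input_amount:
        raise AmountError(
            "The amount used in the inputs `{}` needs to be same as the amount used in the outputs `{}`".format(
                input_amount, output_amount
            )
        )

check_amounts([3, 2], [4, 1])  # balanced: no exception
```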

View File

@@ -3,6 +3,7 @@
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0


class TransactionLink(object):
    """An object for unidirectional linking to a Transaction's Output.
@@ -51,7 +52,7 @@ class TransactionLink(object):
            :class:`~planetmint.transactions.common.transaction.TransactionLink`
        """
        try:
-            return cls(link['transaction_id'], link['output_index'])
+            return cls(link["transaction_id"], link["output_index"])
        except TypeError:
            return cls()
@@ -65,12 +66,11 @@ class TransactionLink(object):
            return None
        else:
            return {
-                'transaction_id': self.txid,
-                'output_index': self.output,
+                "transaction_id": self.txid,
+                "output_index": self.output,
            }

-    def to_uri(self, path=''):
+    def to_uri(self, path=""):
        if self.txid is None and self.output is None:
            return None
-        return '{}/transactions/{}/outputs/{}'.format(path, self.txid,
-                                                      self.output)
+        return "{}/transactions/{}/outputs/{}".format(path, self.txid, self.output)

View File

@@ -3,6 +3,6 @@
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0

-BROADCAST_TX_COMMIT = 'broadcast_tx_commit'
-BROADCAST_TX_ASYNC = 'broadcast_tx_async'
-BROADCAST_TX_SYNC = 'broadcast_tx_sync'
+BROADCAST_TX_COMMIT = "broadcast_tx_commit"
+BROADCAST_TX_ASYNC = "broadcast_tx_async"
+BROADCAST_TX_SYNC = "broadcast_tx_sync"

View File

@@ -75,7 +75,7 @@ def validate_txn_obj(obj_name, obj, key, validation_fun):
    Raises:
        ValidationError: `validation_fun` will raise exception on failure
    """
-    backend = Config().get()['database']['backend']
+    backend = Config().get()["database"]["backend"]

    if backend == "localmongodb":
        data = obj.get(key, {})
@@ -184,9 +184,7 @@ def _fulfillment_to_details(fulfillment):
        }

    if fulfillment.type_name == "threshold-sha-256":
-        subconditions = [
-            _fulfillment_to_details(cond["body"]) for cond in fulfillment.subconditions
-        ]
+        subconditions = [_fulfillment_to_details(cond["body"]) for cond in fulfillment.subconditions]
        return {
            "type": "threshold-sha-256",
            "threshold": fulfillment.threshold,

View File

@@ -10,23 +10,23 @@ from planetmint.transactions.common.output import Output

class Create(Transaction):
    OPERATION = "CREATE"
    ALLOWED_OPERATIONS = (OPERATION,)

    @classmethod
    def validate_create(self, tx_signers, recipients, asset, metadata):
        if not isinstance(tx_signers, list):
            raise TypeError("`tx_signers` must be a list instance")
        if not isinstance(recipients, list):
            raise TypeError("`recipients` must be a list instance")
        if len(tx_signers) == 0:
            raise ValueError("`tx_signers` list cannot be empty")
        if len(recipients) == 0:
            raise ValueError("`recipients` list cannot be empty")
        if not (asset is None or isinstance(asset, dict)):
            raise TypeError("`asset` must be a dict or None")
        if not (metadata is None or isinstance(metadata, dict)):
            raise TypeError("`metadata` must be a dict or None")

        inputs = []
        outputs = []

@@ -34,9 +34,9 @@ class Create(Transaction):
        # generate_outputs
        for recipient in recipients:
            if not isinstance(recipient, tuple) or len(recipient) != 2:
                raise ValueError(
                    ("Each `recipient` in the list must be a" " tuple of `([<list of public keys>]," " <amount>)`")
                )
            pub_keys, amount = recipient
            outputs.append(Output.generate(pub_keys, amount))

@@ -75,4 +75,4 @@ class Create(Transaction):
        """
        (inputs, outputs) = cls.validate_create(tx_signers, recipients, asset, metadata)
        return cls(cls.OPERATION, {"data": asset}, inputs, outputs, metadata)
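The recipient check in `validate_create` expects each entry to be a `([<list of public keys>], <amount>)` tuple. A minimal standalone sketch of that loop, with `Output.generate` replaced by a plain dict for illustration (the `generate_outputs` name and the dict shape are assumptions, not Planetmint API):

```python
# Hypothetical standalone version of the recipient-tuple validation loop
# from Create.validate_create; outputs are plain dicts for illustration.
def generate_outputs(recipients):
    outputs = []
    for recipient in recipients:
        if not isinstance(recipient, tuple) or len(recipient) != 2:
            raise ValueError("Each `recipient` in the list must be a tuple of `([<list of public keys>], <amount>)`")
        pub_keys, amount = recipient
        outputs.append({"public_keys": pub_keys, "amount": amount})
    return outputs

print(generate_outputs([(["pubkey_a", "pubkey_b"], 5)]))
```

A malformed entry such as a bare string raises `ValueError` before any output is built.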


@@ -10,31 +10,31 @@ from copy import deepcopy

class Transfer(Transaction):
    OPERATION = "TRANSFER"
    ALLOWED_OPERATIONS = (OPERATION,)

    @classmethod
    def validate_transfer(cls, inputs, recipients, asset_id, metadata):
        if not isinstance(inputs, list):
            raise TypeError("`inputs` must be a list instance")
        if len(inputs) == 0:
            raise ValueError("`inputs` must contain at least one item")
        if not isinstance(recipients, list):
            raise TypeError("`recipients` must be a list instance")
        if len(recipients) == 0:
            raise ValueError("`recipients` list cannot be empty")

        outputs = []
        for recipient in recipients:
            if not isinstance(recipient, tuple) or len(recipient) != 2:
                raise ValueError(
                    ("Each `recipient` in the list must be a" " tuple of `([<list of public keys>]," " <amount>)`")
                )
            pub_keys, amount = recipient
            outputs.append(Output.generate(pub_keys, amount))

        if not isinstance(asset_id, str):
            raise TypeError("`asset_id` must be a string")

        return (deepcopy(inputs), outputs)

@@ -78,4 +78,4 @@ class Transfer(Transaction):
            :class:`~planetmint.common.transaction.Transaction`
        """
        (inputs, outputs) = cls.validate_transfer(inputs, recipients, asset_id, metadata)
        return cls(cls.OPERATION, {"id": asset_id}, inputs, outputs, metadata)


@@ -6,14 +6,14 @@ from planetmint.transactions.types.elections.election import Election

class ChainMigrationElection(Election):
    OPERATION = "CHAIN_MIGRATION_ELECTION"
    CREATE = OPERATION
    ALLOWED_OPERATIONS = (OPERATION,)
    TX_SCHEMA_CUSTOM = TX_SCHEMA_CHAIN_MIGRATION_ELECTION

    def has_concluded(self, planetmint, *args, **kwargs):
        chain = planetmint.get_latest_abci_chain()
        if chain is not None and not chain["is_synced"]:
            # do not conclude the migration election if
            # there is another migration in progress
            return False

@@ -26,7 +26,7 @@ class ChainMigrationElection(Election):
    def show_election(self, planet):
        output = super().show_election(planet)
        chain = planet.get_latest_abci_chain()
        if chain is None or chain["is_synced"]:
            return output

        output += f'\nchain_id={chain["chain_id"]}'

@@ -34,14 +34,15 @@ class ChainMigrationElection(Election):
        output += f'\napp_hash={block["app_hash"]}'
        validators = [
            {
                "pub_key": {
                    "type": "tendermint/PubKeyEd25519",
                    "value": k,
                },
                "power": v,
            }
            for k, v in self.get_validators(planet).items()
        ]
        output += f"\nvalidators={json.dumps(validators, indent=4)}"
        return output

    def on_rollback(self, planet, new_height):


@@ -12,13 +12,16 @@ from planetmint.transactions.types.assets.create import Create
from planetmint.transactions.types.assets.transfer import Transfer
from planetmint.transactions.types.elections.vote import Vote
from planetmint.transactions.common.exceptions import (
    InvalidSignature,
    MultipleInputsError,
    InvalidProposer,
    UnequalValidatorSet,
    DuplicateTransaction,
)
from planetmint.tendermint_utils import key_from_base64, public_key_to_base64
from planetmint.transactions.common.crypto import public_key_from_ed25519_key
from planetmint.transactions.common.transaction import Transaction
from planetmint.transactions.common.schema import _validate_schema, TX_SCHEMA_COMMON, TX_SCHEMA_CREATE

class Election(Transaction):

@@ -33,9 +36,9 @@ class Election(Transaction):
    # Custom validation schema
    TX_SCHEMA_CUSTOM = None
    # Election Statuses:
    ONGOING = "ongoing"
    CONCLUDED = "concluded"
    INCONCLUSIVE = "inconclusive"
    # Vote ratio to approve an election
    ELECTION_THRESHOLD = 2 / 3

@@ -51,7 +54,7 @@ class Election(Transaction):
        latest_block = planet.get_latest_block()
        if latest_block is None:
            return None
        return planet.get_validator_change(latest_block["height"])

    @classmethod
    def get_validators(cls, planet, height=None):

@@ -61,8 +64,8 @@ class Election(Transaction):
        validators = {}
        for validator in planet.get_validators(height):
            # NOTE: we assume that Tendermint encodes public key in base64
            public_key = public_key_from_ed25519_key(key_from_base64(validator["public_key"]["value"]))
            validators[public_key] = validator["voting_power"]

        return validators

@@ -114,26 +117,25 @@ class Election(Transaction):
        duplicates = any(txn for txn in current_transactions if txn.id == self.id)
        if planet.is_committed(self.id) or duplicates:
            raise DuplicateTransaction("transaction `{}` already exists".format(self.id))

        if not self.inputs_valid(input_conditions):
            raise InvalidSignature("Transaction signature is invalid.")

        current_validators = self.get_validators(planet)

        # NOTE: Proposer should be a single node
        if len(self.inputs) != 1 or len(self.inputs[0].owners_before) != 1:
            raise MultipleInputsError("`tx_signers` must be a list instance of length one")

        # NOTE: Check if the proposer is a validator.
        [election_initiator_node_pub_key] = self.inputs[0].owners_before
        if election_initiator_node_pub_key not in current_validators.keys():
            raise InvalidProposer("Public key is not a part of the validator set")

        # NOTE: Check if all validators have been assigned votes equal to their voting power
        if not self.is_same_topology(current_validators, self.outputs):
            raise UnequalValidatorSet("Validator set must be exactly the same as the outputs of the election")

        return self

@@ -141,10 +143,10 @@ class Election(Transaction):
    def generate(cls, initiator, voters, election_data, metadata=None):
        # Break symmetry in case we need to call an election with the same properties twice
        uuid = uuid4()
        election_data["seed"] = str(uuid)

        (inputs, outputs) = Create.validate_create(initiator, voters, election_data, metadata)
        election = cls(cls.OPERATION, {"data": election_data}, inputs, outputs, metadata)
        cls.validate_schema(election.to_dict())
        return election

@@ -174,21 +176,19 @@ class Election(Transaction):
    def count_votes(cls, election_pk, transactions, getter=getattr):
        votes = 0
        for txn in transactions:
            if getter(txn, "operation") == Vote.OPERATION:
                for output in getter(txn, "outputs"):
                    # NOTE: We enforce that a valid vote for an election id must have only
                    # election_pk in the output public keys; any other public key
                    # alongside election_pk renders the vote invalid.
                    if len(getter(output, "public_keys")) == 1 and [election_pk] == getter(output, "public_keys"):
                        votes = votes + int(getter(output, "amount"))
        return votes

    def get_commited_votes(self, planet, election_pk=None):
        if election_pk is None:
            election_pk = self.to_public_key(self.id)
        txns = list(backend.query.get_asset_tokens_for_public_key(planet.connection, self.id, election_pk))
        return self.count_votes(election_pk, txns, dict.get)

    def has_concluded(self, planet, current_votes=[]):

@@ -208,15 +208,14 @@ class Election(Transaction):
        votes_current = self.count_votes(election_pk, current_votes)

        total_votes = sum(output.amount for output in self.outputs)
        if (votes_committed < (2 / 3) * total_votes) and (votes_committed + votes_current >= (2 / 3) * total_votes):
            return True

        return False

    def get_status(self, planet):
        election = self.get_election(self.id, planet)
        if election and election["is_concluded"]:
            return self.CONCLUDED

        return self.INCONCLUSIVE if self.has_validator_set_changed(planet) else self.ONGOING

@@ -226,11 +225,11 @@ class Election(Transaction):
        if latest_change is None:
            return False

        latest_change_height = latest_change["height"]

        election = self.get_election(self.id, planet)

        return latest_change_height > election["height"]

    def get_election(self, election_id, planet):
        return planet.get_election(election_id)

@@ -239,14 +238,14 @@ class Election(Transaction):
        planet.store_election(self.id, height, is_concluded)

    def show_election(self, planet):
        data = self.asset["data"]
        if "public_key" in data.keys():
            data["public_key"] = public_key_to_base64(data["public_key"]["value"])
        response = ""
        for k, v in data.items():
            if k != "seed":
                response += f"{k}={v}\n"
        response += f"status={self.get_status(planet)}"
        return response

@@ -257,8 +256,7 @@ class Election(Transaction):
            if not isinstance(tx, Election):
                continue

            elections.append({"election_id": tx.id, "height": height, "is_concluded": False})
        return elections

    @classmethod

@@ -268,7 +266,7 @@ class Election(Transaction):
            if not isinstance(tx, Vote):
                continue

            election_id = tx.asset["id"]
            if election_id not in elections:
                elections[election_id] = []
            elections[election_id].append(tx)
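The vote-counting rule in `Election.count_votes` above only credits outputs whose sole public key is the election key, using a pluggable `getter` so it works on both objects (`getattr`) and dicts (`dict.get`). A hedged standalone sketch with dict-shaped transactions (the transaction dicts and the literal `"VOTE"` operation are illustrative stand-ins):

```python
# Simplified count_votes with the dict-based getter that
# get_commited_votes passes in; input shapes are illustrative.
def count_votes(election_pk, transactions, getter=dict.get):
    votes = 0
    for txn in transactions:
        if getter(txn, "operation") == "VOTE":
            for output in getter(txn, "outputs"):
                # Only outputs whose single public key is the election key count.
                if len(getter(output, "public_keys")) == 1 and [election_pk] == getter(output, "public_keys"):
                    votes += int(getter(output, "amount"))
    return votes

txns = [
    {"operation": "VOTE", "outputs": [{"public_keys": ["ELECTION_PK"], "amount": "3"}]},
    {"operation": "VOTE", "outputs": [{"public_keys": ["ELECTION_PK", "other"], "amount": "9"}]},
]
print(count_votes("ELECTION_PK", txns))  # 3 — the second output has two keys and is ignored
```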


@@ -6,12 +6,16 @@

from planetmint.transactions.types.assets.create import Create
from planetmint.transactions.types.assets.transfer import Transfer
from planetmint.transactions.common.schema import (
    _validate_schema,
    TX_SCHEMA_COMMON,
    TX_SCHEMA_TRANSFER,
    TX_SCHEMA_VOTE,
)

class Vote(Transfer):
    OPERATION = "VOTE"
    # NOTE: This class inherits the TRANSFER txn type. The `TRANSFER` property is
    # overridden to re-use methods from the parent class
    TRANSFER = OPERATION

@@ -41,7 +45,7 @@ class Vote(Transfer):
    @classmethod
    def generate(cls, inputs, recipients, election_id, metadata=None):
        (inputs, outputs) = cls.validate_transfer(inputs, recipients, election_id, metadata)
        election_vote = cls(cls.OPERATION, {"id": election_id}, inputs, outputs, metadata)
        cls.validate_schema(election_vote.to_dict())
        return election_vote


@@ -6,12 +6,12 @@

from planetmint.transactions.common.exceptions import InvalidPowerChange
from planetmint.transactions.types.elections.election import Election
from planetmint.transactions.common.schema import TX_SCHEMA_VALIDATOR_ELECTION
from .validator_utils import new_validator_set, encode_validator, validate_asset_public_key

class ValidatorElection(Election):
    OPERATION = "VALIDATOR_ELECTION"
    # NOTE: this transaction class extends Create, so the operation inheritance is achieved
    # by renaming CREATE to VALIDATOR_ELECTION
    CREATE = OPERATION

@@ -19,29 +19,28 @@ class ValidatorElection(Election):
    TX_SCHEMA_CUSTOM = TX_SCHEMA_VALIDATOR_ELECTION

    def validate(self, planet, current_transactions=[]):
        """For more details refer to BEP-21: https://github.com/planetmint/BEPs/tree/master/21"""
        current_validators = self.get_validators(planet)

        super(ValidatorElection, self).validate(planet, current_transactions=current_transactions)

        # NOTE: changing more than 1/3 of the current power is not allowed
        if self.asset["data"]["power"] >= (1 / 3) * sum(current_validators.values()):
            raise InvalidPowerChange("`power` change must be less than 1/3 of total power")

        return self

    @classmethod
    def validate_schema(cls, tx):
        super(ValidatorElection, cls).validate_schema(tx)
        validate_asset_public_key(tx["asset"]["data"]["public_key"])

    def has_concluded(self, planet, *args, **kwargs):
        latest_block = planet.get_latest_block()
        if latest_block is not None:
            latest_block_height = latest_block["height"]
            latest_validator_change = planet.get_validator_change()["height"]

            # TODO change to `latest_block_height + 3` when upgrading to Tendermint 0.24.0.
            if latest_validator_change == latest_block_height + 2:

@@ -51,17 +50,15 @@ class ValidatorElection(Election):
        return super().has_concluded(planet, *args, **kwargs)

    def on_approval(self, planet, new_height):
        validator_updates = [self.asset["data"]]
        curr_validator_set = planet.get_validators(new_height)
        updated_validator_set = new_validator_set(curr_validator_set, validator_updates)

        updated_validator_set = [v for v in updated_validator_set if v["voting_power"] > 0]

        # TODO change to `new_height + 2` when upgrading to Tendermint 0.24.0.
        planet.store_validator_set(new_height + 1, updated_validator_set)
        return encode_validator(self.asset["data"])

    def on_rollback(self, planetmint, new_height):
        # TODO change to `new_height + 2` when upgrading to Tendermint 0.24.0.


@@ -8,67 +8,72 @@ from planetmint.transactions.common.exceptions import InvalidPublicKey

def encode_validator(v):
    ed25519_public_key = v["public_key"]["value"]
    pub_key = keys_pb2.PublicKey(ed25519=bytes.fromhex(ed25519_public_key))

    return types_pb2.ValidatorUpdate(pub_key=pub_key, power=v["power"])

def decode_validator(v):
    return {
        "public_key": {
            "type": "ed25519-base64",
            "value": codecs.encode(v.pub_key.ed25519, "base64").decode().rstrip("\n"),
        },
        "voting_power": v.power,
    }

def new_validator_set(validators, updates):
    validators_dict = {}
    for v in validators:
        validators_dict[v["public_key"]["value"]] = v

    updates_dict = {}
    for u in updates:
        decoder = get_public_key_decoder(u["public_key"])
        public_key64 = base64.b64encode(decoder(u["public_key"]["value"])).decode("utf-8")
        updates_dict[public_key64] = {
            "public_key": {"type": "ed25519-base64", "value": public_key64},
            "voting_power": u["power"],
        }

    new_validators_dict = {**validators_dict, **updates_dict}
    return list(new_validators_dict.values())
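`new_validator_set` merges the current set with the updates by keying both on the base64 public key and letting the updates win in the final dict merge. A minimal sketch of that merge, with the decoder step fixed to base64 for simplicity (the inputs below are illustrative, not real validator data):

```python
import base64

# Simplified new_validator_set: updates overwrite existing validators that
# share the same base64-encoded public key, because later keys win in a
# dict merge. Decoder selection is fixed to base64 for this sketch.
def new_validator_set(validators, updates):
    validators_dict = {v["public_key"]["value"]: v for v in validators}
    updates_dict = {}
    for u in updates:
        # normalize the update's key through a decode/encode round-trip
        public_key64 = base64.b64encode(base64.b64decode(u["public_key"]["value"])).decode("utf-8")
        updates_dict[public_key64] = {
            "public_key": {"type": "ed25519-base64", "value": public_key64},
            "voting_power": u["power"],
        }
    return list({**validators_dict, **updates_dict}.values())

validators = [{"public_key": {"type": "ed25519-base64", "value": "QUJD"}, "voting_power": 10}]
updates = [{"public_key": {"type": "ed25519-base64", "value": "QUJD"}, "power": 0}]
print(new_validator_set(validators, updates))  # the update replaces the existing entry
```

Setting an update's `power` to 0 is how a validator is effectively removed, since `on_approval` filters out entries with non-positive `voting_power`.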
def encode_pk_to_base16(validator):
    pk = validator["public_key"]
    decoder = get_public_key_decoder(pk)
    public_key16 = base64.b16encode(decoder(pk["value"])).decode("utf-8")

    validator["public_key"]["value"] = public_key16
    return validator

def validate_asset_public_key(pk):
    pk_binary = pk["value"].encode("utf-8")
    decoder = get_public_key_decoder(pk)
    try:
        pk_decoded = decoder(pk_binary)
        if len(pk_decoded) != 32:
            raise InvalidPublicKey("Public key should be of size 32 bytes")
    except binascii.Error:
        raise InvalidPublicKey("Invalid `type` specified for public key `value`")

def get_public_key_decoder(pk):
    encoding = pk["type"]
    decoder = base64.b64decode

    if encoding == "ed25519-base16":
        decoder = base64.b16decode
    elif encoding == "ed25519-base32":
        decoder = base64.b32decode
    elif encoding == "ed25519-base64":
        decoder = base64.b64decode
    else:
        raise InvalidPublicKey("Invalid `type` specified for public key `value`")

    return decoder
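`get_public_key_decoder` maps each declared `type` to the matching decoder from Python's `base64` module. A quick self-contained check that the three accepted encodings all round-trip the same 32-byte key (the exact size `validate_asset_public_key` enforces):

```python
import base64

raw = bytes(range(32))  # a 32-byte key, as validate_asset_public_key requires

# The three encodings accepted by get_public_key_decoder, produced with the
# corresponding encoders from the base64 module:
b16 = base64.b16encode(raw)
b32 = base64.b32encode(raw)
b64 = base64.b64encode(raw)

# Each decoder recovers the identical raw key bytes.
assert base64.b16decode(b16) == raw
assert base64.b32decode(b32) == raw
assert base64.b64decode(b64) == raw
```

An unknown `type` falls through to the `else` branch and raises `InvalidPublicKey` rather than silently defaulting.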


@@ -17,9 +17,7 @@ from planetmint.transactions.common.crypto import key_pair_from_ed25519_key

class ProcessGroup(object):
    def __init__(self, concurrency=None, group=None, target=None, name=None, args=None, kwargs=None, daemon=None):
        self.concurrency = concurrency or mp.cpu_count()
        self.group = group
        self.target = target

@@ -31,9 +29,14 @@ class ProcessGroup(object):
    def start(self):
        for i in range(self.concurrency):
            proc = mp.Process(
                group=self.group,
                target=self.target,
                name=self.name,
                args=self.args,
                kwargs=self.kwargs,
                daemon=self.daemon,
            )
            proc.start()
            self.processes.append(proc)

@@ -117,8 +120,8 @@ def condition_details_has_owner(condition_details, owner):
        bool: True if the public key is found in the condition details, False otherwise
    """
    if "subconditions" in condition_details:
        result = condition_details_has_owner(condition_details["subconditions"], owner)
        if result:
            return True

@@ -128,8 +131,7 @@ def condition_details_has_owner(condition_details, owner):
            if result:
                return True
    else:
        if "public_key" in condition_details and owner == condition_details["public_key"]:
            return True
    return False
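`condition_details_has_owner` walks a nested (sub)condition tree looking for a matching `public_key`. A hedged, self-contained sketch of the same recursive search (the `has_owner` name and the input dict shapes are illustrative, not the Planetmint function itself):

```python
# Hypothetical recursive owner search mirroring condition_details_has_owner:
# descend into "subconditions" lists, otherwise compare "public_key" leaves.
def has_owner(details, owner):
    if isinstance(details, list):
        return any(has_owner(d, owner) for d in details)
    if "subconditions" in details:
        if has_owner(details["subconditions"], owner):
            return True
    return details.get("public_key") == owner

tree = {"subconditions": [{"public_key": "alice"}, {"public_key": "bob"}]}
print(has_owner(tree, "bob"))    # True
print(has_owner(tree, "carol"))  # False
```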
@@ -157,7 +159,7 @@ class Lazy:
        return self

    def __getitem__(self, key):
        self.stack.append("__getitem__")
        self.stack.append(([key], {}))
        return self

@@ -184,7 +186,7 @@ class Lazy:

def load_node_key(path):
    with open(path) as json_data:
        priv_validator = json.load(json_data)
        priv_key = priv_validator["priv_key"]["value"]
        hex_private_key = key_from_base64(priv_key)
        return key_pair_from_ed25519_key(hex_private_key)

@@ -200,7 +202,7 @@ def tendermint_version_is_compatible(running_tm_ver):
    """
    # Splitting because version can look like this e.g. 0.22.8-40d6dc2e
    tm_ver = running_tm_ver.split("-")
    if not tm_ver:
        return False
    for ver in __tm_supported_versions__:


@@ -4,7 +4,7 @@
# Code is Apache-2.0 and docs are CC-BY-4.0

class BaseValidationRules:
    """Base validation rules for Planetmint.

    A validation plugin must expose a class inheriting from this one via an entry_point.


@@ -21,7 +21,7 @@ def add_routes(app):
    for (prefix, routes) in API_SECTIONS:
        api = Api(app, prefix=prefix)
        for ((pattern, resource, *args), kwargs) in routes:
            kwargs.setdefault("strict_slashes", False)
            api.add_resource(resource, pattern, *args, **kwargs)

@@ -30,20 +30,20 @@ def r(*args, **kwargs):
ROUTES_API_V1 = [
    r("/", info.ApiV1Index),
    r("assets/", assets.AssetListApi),
    r("metadata/", metadata.MetadataApi),
    r("blocks/<int:block_id>", blocks.BlockApi),
    r("blocks/latest", blocks.LatestBlock),
    r("blocks/", blocks.BlockListApi),
    r("transactions/<string:tx_id>", tx.TransactionApi),
    r("transactions", tx.TransactionListApi),
    r("outputs/", outputs.OutputListApi),
    r("validators/", validators.ValidatorsApi),
]

API_SECTIONS = [
    (None, [r("/", info.RootIndex)]),
    ("/api/v1/", ROUTES_API_V1),
]
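The `r` helper named in the hunk header above is not shown in this diff; judging by how `add_routes` unpacks each entry, it is presumably a small tuple-packing function along these lines (a sketch, not the actual Planetmint source):

```python
def r(*args, **kwargs):
    # Pack a route into the ((pattern, resource, *args), kwargs) shape
    # that add_routes() unpacks when registering flask-restful resources.
    return args, kwargs


# Each ROUTES_API_V1 entry is then a ((pattern, resource), kwargs) pair:
(pattern, resource), kwargs = r("assets/", "AssetListApi")
kwargs.setdefault("strict_slashes", False)  # the same default add_routes applies
```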


@@ -44,13 +44,14 @@ class StandaloneApplication(gunicorn.app.base.BaseApplication):
    def load_config(self):
        # find a better way to pass this such that
        # the custom logger class can access it.
        custom_log_config = self.options.get("custom_log_config")
        self.cfg.env_orig["custom_log_config"] = custom_log_config
        config = dict(
            (key, value) for key, value in self.options.items() if key in self.cfg.settings and value is not None
        )
        config["default_proc_name"] = "planetmint_gunicorn"
        for key, value in config.items():
            # not sure if we need the `key.lower` here, will just
            # keep it for now.

@@ -81,7 +82,7 @@ def create_app(*, debug=False, threads=1, planetmint_factory=None):
    app.debug = debug
    app.config["bigchain_pool"] = utils.pool(planetmint_factory, size=threads)
    add_routes(app)

@@ -101,18 +102,18 @@ def create_server(settings, log_config=None, planetmint_factory=None):
    settings = copy.deepcopy(settings)
    if not settings.get("workers"):
        settings["workers"] = (multiprocessing.cpu_count() * 2) + 1
    if not settings.get("threads"):
        # Note: Threading is not recommended currently, as the frontend workload
        # is largely CPU bound and parallelisation across Python threads makes it
        # slower.
        settings["threads"] = 1
    settings["custom_log_config"] = log_config
    app = create_app(
        debug=settings.get("debug", False), threads=settings["threads"], planetmint_factory=planetmint_factory
    )
    standalone = StandaloneApplication(app, options=settings)
    return standalone


@@ -22,9 +22,9 @@ class StripContentTypeMiddleware:
    def __call__(self, environ, start_response):
        """Run the middleware and then call the original WSGI application."""
        if environ["REQUEST_METHOD"] == "GET":
            try:
                del environ["CONTENT_TYPE"]
            except KeyError:
                pass
        else:
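Filled out, the middleware pattern in this hunk looks roughly like the following. This is a sketch assuming a standard WSGI wrapper with an `__init__` that stores the wrapped app; the truncated `else` branch in the real class is not shown in the diff, so here the request is simply forwarded in all cases:

```python
class StripContentTypeMiddleware:
    """Drop a spurious Content-Type header from GET requests."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        """Run the middleware and then call the original WSGI application."""
        if environ["REQUEST_METHOD"] == "GET":
            try:
                del environ["CONTENT_TYPE"]
            except KeyError:
                pass
        return self.app(environ, start_response)
```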


@@ -30,17 +30,17 @@ class AssetListApi(Resource):
            A list of assets that match the query.
        """
        parser = reqparse.RequestParser()
        parser.add_argument("search", type=str, required=True)
        parser.add_argument("limit", type=int)
        args = parser.parse_args()

        if not args["search"]:
            return make_error(400, "text_search cannot be empty")
        if not args["limit"]:
            # if the limit is not specified do not pass None to `text_search`
            del args["limit"]

        pool = current_app.config["bigchain_pool"]
        with pool() as planet:
            assets = planet.text_search(**args)

@@ -49,7 +49,4 @@ class AssetListApi(Resource):
            # This only works with MongoDB as the backend
            return list(assets)
        except OperationError as e:
            return make_error(400, "({}): {}".format(type(e).__name__, e))


@@ -17,13 +17,13 @@ logger = logging.getLogger(__name__)

def make_error(status_code, message=None):
    if status_code == 404 and message is None:
        message = "Not found"

    response_content = {"status": status_code, "message": message}
    request_info = {"method": request.method, "path": request.path}
    request_info.update(response_content)

    logger.error("HTTP API error: %(status)s - %(method)s:%(path)s - %(message)s", request_info)

    response = jsonify(response_content)
    response.status_code = status_code

@@ -37,10 +37,10 @@ def base_ws_uri():
       customized (typically when running behind NAT, firewall, etc.)
    """
    config_wsserver = Config().get()["wsserver"]

    scheme = config_wsserver["advertised_scheme"]
    host = config_wsserver["advertised_host"]
    port = config_wsserver["advertised_port"]

    return "{}://{}:{}".format(scheme, host, port)
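Decoupled from the `Config` singleton, the URI assembly in `base_ws_uri` can be tried out directly. A hypothetical standalone variant (`build_ws_uri` is an illustrative name, not part of the module):

```python
def build_ws_uri(config_wsserver):
    # Same "{scheme}://{host}:{port}" assembly as base_ws_uri, but taking the
    # wsserver settings as an argument instead of reading Config().get().
    scheme = config_wsserver["advertised_scheme"]
    host = config_wsserver["advertised_host"]
    port = config_wsserver["advertised_port"]
    return "{}://{}:{}".format(scheme, host, port)
```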


@@ -21,7 +21,7 @@ class LatestBlock(Resource):
            A JSON string containing the data about the block.
        """
        pool = current_app.config["bigchain_pool"]
        with pool() as planet:
            block = planet.get_latest_block()

@@ -43,7 +43,7 @@ class BlockApi(Resource):
            A JSON string containing the data about the block.
        """
        pool = current_app.config["bigchain_pool"]
        with pool() as planet:
            block = planet.get_block(block_id=block_id)

@@ -64,12 +64,12 @@ class BlockListApi(Resource):
            "valid", "invalid", "undecided".
        """
        parser = reqparse.RequestParser()
        parser.add_argument("transaction_id", type=str, required=True)
        args = parser.parse_args(strict=True)
        tx_id = args["transaction_id"]

        pool = current_app.config["bigchain_pool"]
        with pool() as planet:
            blocks = planet.get_block_containing_tx(tx_id)


@@ -15,23 +15,20 @@ from planetmint.web.websocket_server import EVENTS_ENDPOINT, EVENTS_ENDPOINT_BLOCKS

class RootIndex(Resource):
    def get(self):
        docs_url = ["https://docs.planetmint.io/projects/server/en/v", version.__version__ + "/"]
        return flask.jsonify(
            {
                "api": {"v1": get_api_v1_info("/api/v1/")},
                "docs": "".join(docs_url),
                "software": "Planetmint",
                "version": version.__version__,
            }
        )


class ApiV1Index(Resource):
    def get(self):
        return flask.jsonify(get_api_v1_info("/"))


def get_api_v1_info(api_prefix):

@@ -41,19 +38,19 @@ def get_api_v1_info(api_prefix):
    websocket_root_tx = base_ws_uri() + EVENTS_ENDPOINT
    websocket_root_block = base_ws_uri() + EVENTS_ENDPOINT_BLOCKS
    docs_url = [
        "https://docs.planetmint.io/projects/server/en/v",
        version.__version__,
        "/http-client-server-api.html",
    ]
    return {
        "docs": "".join(docs_url),
        "transactions": "{}transactions/".format(api_prefix),
        "blocks": "{}blocks/".format(api_prefix),
        "assets": "{}assets/".format(api_prefix),
        "outputs": "{}outputs/".format(api_prefix),
        "streams": websocket_root_tx,
        "streamedblocks": websocket_root_block,
        "metadata": "{}metadata/".format(api_prefix),
        "validators": "{}validators".format(api_prefix),
    }


@@ -30,25 +30,22 @@ class MetadataApi(Resource):
            A list of metadata that match the query.
        """
        parser = reqparse.RequestParser()
        parser.add_argument("search", type=str, required=True)
        parser.add_argument("limit", type=int)
        args = parser.parse_args()

        if not args["search"]:
            return make_error(400, "text_search cannot be empty")
        if not args["limit"]:
            del args["limit"]

        pool = current_app.config["bigchain_pool"]
        with pool() as planet:
            args["table"] = "meta_data"
            metadata = planet.text_search(**args)

        try:
            return list(metadata)
        except OperationError as e:
            return make_error(400, "({}): {}".format(type(e).__name__, e))


@@ -18,14 +18,11 @@ class OutputListApi(Resource):
            A :obj:`list` of :cls:`str` of links to outputs.
        """
        parser = reqparse.RequestParser()
        parser.add_argument("public_key", type=parameters.valid_ed25519, required=True)
        parser.add_argument("spent", type=parameters.valid_bool)
        args = parser.parse_args(strict=True)

        pool = current_app.config["bigchain_pool"]
        with pool() as planet:
            outputs = planet.get_outputs_filtered(args["public_key"], args["spent"])
        return [{"transaction_id": output.txid, "output_index": output.output} for output in outputs]


@@ -6,45 +6,47 @@
import re

from planetmint.transactions.common.transaction_mode_types import (
    BROADCAST_TX_COMMIT,
    BROADCAST_TX_ASYNC,
    BROADCAST_TX_SYNC,
)


def valid_txid(txid):
    if re.match("^[a-fA-F0-9]{64}$", txid):
        return txid.lower()
    raise ValueError("Invalid hash")


def valid_bool(val):
    val = val.lower()
    if val == "true":
        return True
    if val == "false":
        return False
    raise ValueError('Boolean value must be "true" or "false" (lowercase)')


def valid_ed25519(key):
    if re.match("^[1-9a-zA-Z]{43,44}$", key) and not re.match(".*[Il0O]", key):
        return key
    raise ValueError("Invalid base58 ed25519 key")


def valid_operation(op):
    op = op.upper()
    if op == "CREATE":
        return "CREATE"
    if op == "TRANSFER":
        return "TRANSFER"
    raise ValueError('Operation must be "CREATE" or "TRANSFER"')


def valid_mode(mode):
    if mode == "async":
        return BROADCAST_TX_ASYNC
    if mode == "sync":
        return BROADCAST_TX_SYNC
    if mode == "commit":
        return BROADCAST_TX_COMMIT
    raise ValueError('Mode must be "async", "sync" or "commit"')
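These validators are plain functions, so they are easy to exercise outside flask-restful. For example, `valid_txid` and `valid_bool` behave as follows (copied from the module above, minus the broadcast-mode constants):

```python
import re


def valid_txid(txid):
    # 64 hex characters, normalized to lowercase
    if re.match("^[a-fA-F0-9]{64}$", txid):
        return txid.lower()
    raise ValueError("Invalid hash")


def valid_bool(val):
    # Only the exact strings "true"/"false" (case-insensitive) are accepted
    val = val.lower()
    if val == "true":
        return True
    if val == "false":
        return False
    raise ValueError('Boolean value must be "true" or "false" (lowercase)')
```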


@@ -65,9 +65,7 @@ class TransactionListApi(Resource):
            A ``dict`` containing the data about the transaction.
        """
        parser = reqparse.RequestParser()
        parser.add_argument("mode", type=parameters.valid_mode, default=BROADCAST_TX_ASYNC)
        args = parser.parse_args()
        mode = str(args["mode"])

@@ -85,21 +83,15 @@ class TransactionListApi(Resource):
                message="Invalid transaction schema: {}".format(e.__cause__.message),
            )
        except KeyError as e:
            return make_error(400, "Invalid transaction ({}): {}".format(type(e).__name__, e))
        except ValidationError as e:
            return make_error(400, "Invalid transaction ({}): {}".format(type(e).__name__, e))

        with pool() as planet:
            try:
                planet.validate_transaction(tx_obj)
            except ValidationError as e:
                return make_error(400, "Invalid transaction ({}): {}".format(type(e).__name__, e))
            else:
                status_code, message = planet.write_transaction(tx_obj, mode)


@@ -15,7 +15,7 @@ class ValidatorsApi(Resource):
            A JSON string containing the validator set of the current node.
        """
        pool = current_app.config["bigchain_pool"]
        with pool() as planet:
            validators = planet.get_validators()


@@ -15,7 +15,7 @@ class Dispatcher:
    This class implements a simple publish/subscribe pattern.
    """

    def __init__(self, event_source, type="tx"):
        """Create a new instance.

        Args:

@@ -49,20 +49,18 @@ class Dispatcher:
    @staticmethod
    def simplified_block(block):
        txids = []
        for tx in block["transactions"]:
            txids.append(tx.id)
        return {"height": block["height"], "hash": block["hash"], "transaction_ids": txids}

    @staticmethod
    def eventify_block(block):
        for tx in block["transactions"]:
            if tx.asset:
                asset_id = tx.asset.get("id", tx.id)
            else:
                asset_id = tx.id
            yield {"height": block["height"], "asset_id": asset_id, "transaction_id": tx.id}

    async def publish(self):
        """Publish new events to the subscribers."""

@@ -77,9 +75,9 @@ class Dispatcher:
            if isinstance(event, str):
                str_buffer.append(event)
            elif event.type == EventTypes.BLOCK_VALID:
                if self.type == "tx":
                    str_buffer = map(json.dumps, self.eventify_block(event.data))
                elif self.type == "blk":
                    str_buffer = [json.dumps(self.simplified_block(event.data))]
                else:
                    return
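With plain dicts standing in for transaction objects (the real `simplified_block` accesses `tx.id` on model instances), the block-summary step can be sketched like this; the `txid_of` accessor is an illustrative addition, not part of `Dispatcher`:

```python
def simplified_block(block, txid_of=lambda tx: tx["id"]):
    # Reduce a block to its height, hash and transaction ids, mirroring
    # Dispatcher.simplified_block but letting plain dicts stand in for
    # transaction model objects via the txid_of accessor.
    txids = [txid_of(tx) for tx in block["transactions"]]
    return {"height": block["height"], "hash": block["hash"], "transaction_ids": txids}
```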


@@ -29,8 +29,8 @@ from planetmint.web.websocket_dispatcher import Dispatcher

logger = logging.getLogger(__name__)

EVENTS_ENDPOINT = "/api/v1/streams/valid_transactions"
EVENTS_ENDPOINT_BLOCKS = "/api/v1/streams/valid_blocks"


def _multiprocessing_to_asyncio(in_queue, out_queue1, out_queue2, loop):

@@ -51,60 +51,60 @@ def _multiprocessing_to_asyncio(in_queue, out_queue1, out_queue2, loop):
async def websocket_tx_handler(request):
    """Handle a new socket connection."""

    logger.debug("New TX websocket connection.")
    websocket = aiohttp.web.WebSocketResponse()
    await websocket.prepare(request)
    uuid = uuid4()
    request.app["tx_dispatcher"].subscribe(uuid, websocket)

    while True:
        # Consume input buffer
        try:
            msg = await websocket.receive()
        except RuntimeError as e:
            logger.debug("Websocket exception: %s", str(e))
            break
        except CancelledError:
            logger.debug("Websocket closed")
            break
        if msg.type == aiohttp.WSMsgType.CLOSED:
            logger.debug("Websocket closed")
            break
        elif msg.type == aiohttp.WSMsgType.ERROR:
            logger.debug("Websocket exception: %s", websocket.exception())
            break

    request.app["tx_dispatcher"].unsubscribe(uuid)
    return websocket


async def websocket_blk_handler(request):
    """Handle a new socket connection."""

    logger.debug("New BLK websocket connection.")
    websocket = aiohttp.web.WebSocketResponse()
    await websocket.prepare(request)
    uuid = uuid4()
    request.app["blk_dispatcher"].subscribe(uuid, websocket)

    while True:
        # Consume input buffer
        try:
            msg = await websocket.receive()
        except RuntimeError as e:
            logger.debug("Websocket exception: %s", str(e))
            break
        except CancelledError:
            logger.debug("Websocket closed")
            break
        if msg.type == aiohttp.WSMsgType.CLOSED:
            logger.debug("Websocket closed")
            break
        elif msg.type == aiohttp.WSMsgType.ERROR:
            logger.debug("Websocket exception: %s", websocket.exception())
            break

    request.app["blk_dispatcher"].unsubscribe(uuid)
    return websocket

@@ -115,16 +115,16 @@ def init_app(tx_source, blk_source, *, loop=None):
        An aiohttp application.
    """
    blk_dispatcher = Dispatcher(blk_source, "blk")
    tx_dispatcher = Dispatcher(tx_source, "tx")

    # Schedule the dispatchers
    loop.create_task(blk_dispatcher.publish(), name="blk")
    loop.create_task(tx_dispatcher.publish(), name="tx")

    app = aiohttp.web.Application(loop=loop)
    app["tx_dispatcher"] = tx_dispatcher
    app["blk_dispatcher"] = blk_dispatcher
    app.router.add_get(EVENTS_ENDPOINT, websocket_tx_handler)
    app.router.add_get(EVENTS_ENDPOINT_BLOCKS, websocket_blk_handler)
    return app

@@ -139,13 +139,12 @@ def start(sync_event_source, loop=None):
    tx_source = asyncio.Queue(loop=loop)
    blk_source = asyncio.Queue(loop=loop)

    bridge = threading.Thread(
        target=_multiprocessing_to_asyncio, args=(sync_event_source, tx_source, blk_source, loop), daemon=True
    )
    bridge.start()

    app = init_app(tx_source, blk_source, loop=loop)
    aiohttp.web.run_app(
        app, host=Config().get()["wsserver"]["host"], port=Config().get()["wsserver"]["port"], loop=loop
    )


@@ -4,6 +4,3 @@ test=pytest
[coverage:run]
source = .
omit = *test*


@@ -89,21 +89,12 @@ docs_require = [
check_setuptools_features()

dev_require = ["ipdb", "ipython", "watchdog", "logging_tree", "pre-commit", "twine", "ptvsd"]

tests_require = [
    "coverage",
    "pep8",
    "black",
    "hypothesis>=5.3.0",
    "pytest>=3.0.0",
    "pytest-cov==2.8.1",

@@ -116,27 +107,27 @@ tests_require = [
] + docs_require

install_requires = [
    "chardet==3.0.4",
    "aiohttp==3.8.1",
    "abci==0.8.3",
    "planetmint-cryptoconditions>=0.9.9",
    "flask-cors==3.0.10",
    "flask-restful==0.3.9",
    "flask==2.1.2",
    "gunicorn==20.1.0",
    "jsonschema==3.2.0",
    "logstats==0.3.0",
    "packaging>=20.9",
    # TODO Consider not installing the db drivers, or putting them in extras.
    "pymongo==3.11.4",
    "tarantool==0.7.1",
    "python-rapidjson==1.0",
    "pyyaml==5.4.1",
    "requests==2.25.1",
    "setproctitle==1.2.2",
    "werkzeug==2.0.3",
    "nest-asyncio==1.5.5",
    "protobuf==3.20.1",
]

setup(


@@ -8,23 +8,22 @@ import random

from planetmint.transactions.types.assets.create import Create
from planetmint.transactions.types.assets.transfer import Transfer


def test_asset_transfer(b, signed_create_tx, user_pk, user_sk):
    tx_transfer = Transfer.generate(signed_create_tx.to_inputs(), [([user_pk], 1)], signed_create_tx.id)
    tx_transfer_signed = tx_transfer.sign([user_sk])

    b.store_bulk_transactions([signed_create_tx])

    assert tx_transfer_signed.validate(b) == tx_transfer_signed
    assert tx_transfer_signed.asset["id"] == signed_create_tx.id


def test_validate_transfer_asset_id_mismatch(b, signed_create_tx, user_pk, user_sk):
    from planetmint.transactions.common.exceptions import AssetIdMismatch

    tx_transfer = Transfer.generate(signed_create_tx.to_inputs(), [([user_pk], 1)], signed_create_tx.id)
    tx_transfer.asset["id"] = "a" * 64
    tx_transfer_signed = tx_transfer.sign([user_sk])

    b.store_bulk_transactions([signed_create_tx])

@@ -35,6 +34,7 @@ def test_validate_transfer_asset_id_mismatch(b, signed_create_tx, user_pk, user_

def test_get_asset_id_create_transaction(alice, user_pk):
    from planetmint.models import Transaction

    tx_create = Create.generate([alice.public_key], [([user_pk], 1)])
    assert Transaction.get_asset_id(tx_create) == tx_create.id

@@ -42,21 +42,18 @@ def test_get_asset_id_create_transaction(alice, user_pk):
def test_get_asset_id_transfer_transaction(b, signed_create_tx, user_pk):
    from planetmint.models import Transaction

    tx_transfer = Transfer.generate(signed_create_tx.to_inputs(), [([user_pk], 1)], signed_create_tx.id)
    asset_id = Transaction.get_asset_id(tx_transfer)
    assert asset_id == tx_transfer.asset["id"]


def test_asset_id_mismatch(alice, user_pk):
    from planetmint.models import Transaction
    from planetmint.transactions.common.exceptions import AssetIdMismatch

    tx1 = Create.generate([alice.public_key], [([user_pk], 1)], metadata={"msg": random.random()})
    tx1.sign([alice.private_key])
    tx2 = Create.generate([alice.public_key], [([user_pk], 1)], metadata={"msg": random.random()})
    tx2.sign([alice.private_key])

    with pytest.raises(AssetIdMismatch):


@ -19,7 +19,7 @@ from planetmint.transactions.common.exceptions import DoubleSpend
# Single owners_after # Single owners_after
def test_single_in_single_own_single_out_single_own_create(alice, user_pk, b): def test_single_in_single_own_single_out_single_own_create(alice, user_pk, b):
tx = Create.generate([alice.public_key], [([user_pk], 100)], asset={'name': random.random()}) tx = Create.generate([alice.public_key], [([user_pk], 100)], asset={"name": random.random()})
tx_signed = tx.sign([alice.private_key]) tx_signed = tx.sign([alice.private_key])
assert tx_signed.validate(b) == tx_signed assert tx_signed.validate(b) == tx_signed
@ -35,8 +35,7 @@ def test_single_in_single_own_single_out_single_own_create(alice, user_pk, b):
# Single owners_after per output # Single owners_after per output
def test_single_in_single_own_multiple_out_single_own_create(alice, user_pk, b): def test_single_in_single_own_multiple_out_single_own_create(alice, user_pk, b):
tx = Create.generate([alice.public_key], [([user_pk], 50), ([user_pk], 50)], tx = Create.generate([alice.public_key], [([user_pk], 50), ([user_pk], 50)], asset={"name": random.random()})
asset={'name': random.random()})
tx_signed = tx.sign([alice.private_key]) tx_signed = tx.sign([alice.private_key])
assert tx_signed.validate(b) == tx_signed assert tx_signed.validate(b) == tx_signed
@ -53,7 +52,7 @@ def test_single_in_single_own_multiple_out_single_own_create(alice, user_pk, b):
# Multiple owners_after # Multiple owners_after
def test_single_in_single_own_single_out_multiple_own_create(alice, user_pk, b): def test_single_in_single_own_single_out_multiple_own_create(alice, user_pk, b):
tx = Create.generate([alice.public_key], [([user_pk, user_pk], 100)], asset={'name': random.random()}) tx = Create.generate([alice.public_key], [([user_pk, user_pk], 100)], asset={"name": random.random()})
tx_signed = tx.sign([alice.private_key]) tx_signed = tx.sign([alice.private_key])
assert tx_signed.validate(b) == tx_signed assert tx_signed.validate(b) == tx_signed
@ -61,8 +60,8 @@ def test_single_in_single_own_single_out_multiple_own_create(alice, user_pk, b):
assert tx_signed.outputs[0].amount == 100 assert tx_signed.outputs[0].amount == 100
output = tx_signed.outputs[0].to_dict() output = tx_signed.outputs[0].to_dict()
assert 'subconditions' in output['condition']['details'] assert "subconditions" in output["condition"]["details"]
assert len(output['condition']['details']['subconditions']) == 2 assert len(output["condition"]["details"]["subconditions"]) == 2
assert len(tx_signed.inputs) == 1 assert len(tx_signed.inputs) == 1
@@ -75,8 +74,9 @@ def test_single_in_single_own_single_out_multiple_own_create(alice, user_pk, b):
# owners_after
def test_single_in_single_own_multiple_out_mix_own_create(alice, user_pk, b):
    tx = Create.generate(
        [alice.public_key], [([user_pk], 50), ([user_pk, user_pk], 50)], asset={"name": random.random()}
    )
    tx_signed = tx.sign([alice.private_key])
    assert tx_signed.validate(b) == tx_signed
@@ -85,8 +85,8 @@ def test_single_in_single_own_multiple_out_mix_own_create(alice, user_pk, b):
    assert tx_signed.outputs[1].amount == 50
    output_cid1 = tx_signed.outputs[1].to_dict()
    assert "subconditions" in output_cid1["condition"]["details"]
    assert len(output_cid1["condition"]["details"]["subconditions"]) == 2
    assert len(tx_signed.inputs) == 1
@@ -95,11 +95,10 @@ def test_single_in_single_own_multiple_out_mix_own_create(alice, user_pk, b):
# Single input
# Multiple owners_before
# Output combinations already tested above
def test_single_in_multiple_own_single_out_single_own_create(alice, b, user_pk, user_sk):
    from planetmint.transactions.common.utils import _fulfillment_to_details

    tx = Create.generate([alice.public_key, user_pk], [([user_pk], 100)], asset={"name": random.random()})
    tx_signed = tx.sign([alice.private_key, user_sk])
    assert tx_signed.validate(b) == tx_signed
    assert len(tx_signed.outputs) == 1
@@ -107,8 +106,8 @@ def test_single_in_multiple_own_single_out_single_own_create(alice, b, user_pk,
    assert len(tx_signed.inputs) == 1
    ffill = _fulfillment_to_details(tx_signed.inputs[0].fulfillment)
    assert "subconditions" in ffill
    assert len(ffill["subconditions"]) == 2
# TRANSFER divisible asset
@@ -116,16 +115,14 @@ def test_single_in_multiple_own_single_out_single_own_create(alice, b, user_pk,
# Single owners_before
# Single output
# Single owners_after
def test_single_in_single_own_single_out_single_own_transfer(alice, b, user_pk, user_sk):
    # CREATE divisible asset
    tx_create = Create.generate([alice.public_key], [([user_pk], 100)], asset={"name": random.random()})
    tx_create_signed = tx_create.sign([alice.private_key])
    # TRANSFER
    tx_transfer = Transfer.generate(tx_create.to_inputs(), [([alice.public_key], 100)], asset_id=tx_create.id)
    tx_transfer_signed = tx_transfer.sign([user_sk])
    b.store_bulk_transactions([tx_create_signed])
@@ -141,17 +138,16 @@ def test_single_in_single_own_single_out_single_own_transfer(alice, b, user_pk,
# Single owners_before
# Multiple output
# Single owners_after
def test_single_in_single_own_multiple_out_single_own_transfer(alice, b, user_pk, user_sk):
    # CREATE divisible asset
    tx_create = Create.generate([alice.public_key], [([user_pk], 100)], asset={"name": random.random()})
    tx_create_signed = tx_create.sign([alice.private_key])
    # TRANSFER
    tx_transfer = Transfer.generate(
        tx_create.to_inputs(), [([alice.public_key], 50), ([alice.public_key], 50)], asset_id=tx_create.id
    )
    tx_transfer_signed = tx_transfer.sign([user_sk])
    b.store_bulk_transactions([tx_create_signed])
@@ -168,17 +164,16 @@ def test_single_in_single_own_multiple_out_single_own_transfer(alice, b, user_pk
# Single owners_before
# Single output
# Multiple owners_after
def test_single_in_single_own_single_out_multiple_own_transfer(alice, b, user_pk, user_sk):
    # CREATE divisible asset
    tx_create = Create.generate([alice.public_key], [([user_pk], 100)], asset={"name": random.random()})
    tx_create_signed = tx_create.sign([alice.private_key])
    # TRANSFER
    tx_transfer = Transfer.generate(
        tx_create.to_inputs(), [([alice.public_key, alice.public_key], 100)], asset_id=tx_create.id
    )
    tx_transfer_signed = tx_transfer.sign([user_sk])
    b.store_bulk_transactions([tx_create_signed])
@@ -188,8 +183,8 @@ def test_single_in_single_own_single_out_multiple_own_transfer(alice, b, user_pk
    assert tx_transfer_signed.outputs[0].amount == 100
    condition = tx_transfer_signed.outputs[0].to_dict()
    assert "subconditions" in condition["condition"]["details"]
    assert len(condition["condition"]["details"]["subconditions"]) == 2
    assert len(tx_transfer_signed.inputs) == 1
    b.store_bulk_transactions([tx_transfer_signed])
@@ -203,17 +198,18 @@ def test_single_in_single_own_single_out_multiple_own_transfer(alice, b, user_pk
# Multiple outputs
# Mix: one output with a single owners_after, one output with multiple
# owners_after
def test_single_in_single_own_multiple_out_mix_own_transfer(alice, b, user_pk, user_sk):
    # CREATE divisible asset
    tx_create = Create.generate([alice.public_key], [([user_pk], 100)], asset={"name": random.random()})
    tx_create_signed = tx_create.sign([alice.private_key])
    # TRANSFER
    tx_transfer = Transfer.generate(
        tx_create.to_inputs(),
        [([alice.public_key], 50), ([alice.public_key, alice.public_key], 50)],
        asset_id=tx_create.id,
    )
    tx_transfer_signed = tx_transfer.sign([user_sk])
    b.store_bulk_transactions([tx_create_signed])
@@ -224,8 +220,8 @@ def test_single_in_single_own_multiple_out_mix_own_transfer(alice, b, user_pk,
    assert tx_transfer_signed.outputs[1].amount == 50
    output_cid1 = tx_transfer_signed.outputs[1].to_dict()
    assert "subconditions" in output_cid1["condition"]["details"]
    assert len(output_cid1["condition"]["details"]["subconditions"]) == 2
    assert len(tx_transfer_signed.inputs) == 1
@@ -239,18 +235,17 @@ def test_single_in_single_own_multiple_out_mix_own_transfer(alice, b, user_pk,
# Multiple owners_before
# Single output
# Single owners_after
def test_single_in_multiple_own_single_out_single_own_transfer(alice, b, user_pk, user_sk):
    from planetmint.transactions.common.utils import _fulfillment_to_details

    # CREATE divisible asset
    tx_create = Create.generate(
        [alice.public_key], [([alice.public_key, user_pk], 100)], asset={"name": random.random()}
    )
    tx_create_signed = tx_create.sign([alice.private_key])
    # TRANSFER
    tx_transfer = Transfer.generate(tx_create.to_inputs(), [([alice.public_key], 100)], asset_id=tx_create.id)
    tx_transfer_signed = tx_transfer.sign([alice.private_key, user_sk])
    b.store_bulk_transactions([tx_create_signed])
@@ -261,8 +256,8 @@ def test_single_in_multiple_own_single_out_single_own_transfer(alice, b, user_pk
    assert len(tx_transfer_signed.inputs) == 1
    ffill = _fulfillment_to_details(tx_transfer_signed.inputs[0].fulfillment)
    assert "subconditions" in ffill
    assert len(ffill["subconditions"]) == 2
    b.store_bulk_transactions([tx_transfer_signed])
    with pytest.raises(DoubleSpend):
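The `pytest.raises(DoubleSpend)` checks that recur through these tests all enforce one invariant: a transaction output may be consumed at most once. A minimal plain-Python sketch of that bookkeeping (the `SpentIndex` helper is illustrative, not the Planetmint API):

```python
# Illustrative model of the double-spend check, NOT Planetmint code.
class DoubleSpend(Exception):
    """Raised when an already-consumed output is spent again."""


class SpentIndex:
    def __init__(self):
        self._spent = set()  # set of (tx_id, output_index) pairs

    def spend(self, tx_id, output_index):
        key = (tx_id, output_index)
        if key in self._spent:
            raise DoubleSpend(f"output {key} already spent")
        self._spent.add(key)
```

Storing `tx_transfer_signed` marks its inputs as spent, so validating the same transfer a second time must raise.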
@@ -274,16 +269,15 @@ def test_single_in_multiple_own_single_out_single_own_transfer(alice, b, user_pk
# Single owners_before per input
# Single output
# Single owners_after
def test_multiple_in_single_own_single_out_single_own_transfer(alice, b, user_pk, user_sk):
    # CREATE divisible asset
    tx_create = Create.generate(
        [alice.public_key], [([user_pk], 50), ([user_pk], 50)], asset={"name": random.random()}
    )
    tx_create_signed = tx_create.sign([alice.private_key])
    # TRANSFER
    tx_transfer = Transfer.generate(tx_create.to_inputs(), [([alice.public_key], 100)], asset_id=tx_create.id)
    tx_transfer_signed = tx_transfer.sign([user_sk])
    b.store_bulk_transactions([tx_create_signed])
@@ -303,19 +297,19 @@ def test_multiple_in_single_own_single_out_single_own_transfer(alice, b, user_pk
# Multiple owners_before per input
# Single output
# Single owners_after
def test_multiple_in_multiple_own_single_out_single_own_transfer(alice, b, user_pk, user_sk):
    from planetmint.transactions.common.utils import _fulfillment_to_details

    # CREATE divisible asset
    tx_create = Create.generate(
        [alice.public_key],
        [([user_pk, alice.public_key], 50), ([user_pk, alice.public_key], 50)],
        asset={"name": random.random()},
    )
    tx_create_signed = tx_create.sign([alice.private_key])
    # TRANSFER
    tx_transfer = Transfer.generate(tx_create.to_inputs(), [([alice.public_key], 100)], asset_id=tx_create.id)
    tx_transfer_signed = tx_transfer.sign([alice.private_key, user_sk])
    b.store_bulk_transactions([tx_create_signed])
@@ -327,10 +321,10 @@ def test_multiple_in_multiple_own_single_out_single_own_transfer(alice, b, user_
    ffill_fid0 = _fulfillment_to_details(tx_transfer_signed.inputs[0].fulfillment)
    ffill_fid1 = _fulfillment_to_details(tx_transfer_signed.inputs[1].fulfillment)
    assert "subconditions" in ffill_fid0
    assert "subconditions" in ffill_fid1
    assert len(ffill_fid0["subconditions"]) == 2
    assert len(ffill_fid1["subconditions"]) == 2
    b.store_bulk_transactions([tx_transfer_signed])
    with pytest.raises(DoubleSpend):
@@ -343,18 +337,17 @@ def test_multiple_in_multiple_own_single_out_single_own_transfer(alice, b, user_
# owners_before
# Single output
# Single owners_after
def test_muiltiple_in_mix_own_multiple_out_single_own_transfer(alice, b, user_pk, user_sk):
    from planetmint.transactions.common.utils import _fulfillment_to_details

    # CREATE divisible asset
    tx_create = Create.generate(
        [alice.public_key], [([user_pk], 50), ([user_pk, alice.public_key], 50)], asset={"name": random.random()}
    )
    tx_create_signed = tx_create.sign([alice.private_key])
    # TRANSFER
    tx_transfer = Transfer.generate(tx_create.to_inputs(), [([alice.public_key], 100)], asset_id=tx_create.id)
    tx_transfer_signed = tx_transfer.sign([alice.private_key, user_sk])
    b.store_bulk_transactions([tx_create_signed])
@@ -365,9 +358,9 @@ def test_muiltiple_in_mix_own_multiple_out_single_own_transfer(alice, b, user_pk
    ffill_fid0 = _fulfillment_to_details(tx_transfer_signed.inputs[0].fulfillment)
    ffill_fid1 = _fulfillment_to_details(tx_transfer_signed.inputs[1].fulfillment)
    assert "subconditions" not in ffill_fid0
    assert "subconditions" in ffill_fid1
    assert len(ffill_fid1["subconditions"]) == 2
    b.store_bulk_transactions([tx_transfer_signed])
    with pytest.raises(DoubleSpend):
@@ -381,18 +374,18 @@ def test_muiltiple_in_mix_own_multiple_out_single_own_transfer(alice, b, user_pk
# Multiple outputs
# Mix: one output with a single owners_after, one output with multiple
# owners_after
def test_muiltiple_in_mix_own_multiple_out_mix_own_transfer(alice, b, user_pk, user_sk):
    from planetmint.transactions.common.utils import _fulfillment_to_details

    # CREATE divisible asset
    tx_create = Create.generate(
        [alice.public_key], [([user_pk], 50), ([user_pk, alice.public_key], 50)], asset={"name": random.random()}
    )
    tx_create_signed = tx_create.sign([alice.private_key])
    # TRANSFER
    tx_transfer = Transfer.generate(
        tx_create.to_inputs(), [([alice.public_key], 50), ([alice.public_key, user_pk], 50)], asset_id=tx_create.id
    )
    tx_transfer_signed = tx_transfer.sign([alice.private_key, user_sk])
    b.store_bulk_transactions([tx_create_signed])
@@ -404,15 +397,15 @@ def test_muiltiple_in_mix_own_multiple_out_mix_own_transfer(alice, b, user_pk,
    cond_cid0 = tx_transfer_signed.outputs[0].to_dict()
    cond_cid1 = tx_transfer_signed.outputs[1].to_dict()
    assert "subconditions" not in cond_cid0["condition"]["details"]
    assert "subconditions" in cond_cid1["condition"]["details"]
    assert len(cond_cid1["condition"]["details"]["subconditions"]) == 2
    ffill_fid0 = _fulfillment_to_details(tx_transfer_signed.inputs[0].fulfillment)
    ffill_fid1 = _fulfillment_to_details(tx_transfer_signed.inputs[1].fulfillment)
    assert "subconditions" not in ffill_fid0
    assert "subconditions" in ffill_fid1
    assert len(ffill_fid1["subconditions"]) == 2
    b.store_bulk_transactions([tx_transfer_signed])
    with pytest.raises(DoubleSpend):
@@ -429,26 +422,24 @@ def test_multiple_in_different_transactions(alice, b, user_pk, user_sk):
    # CREATE divisible asset
    # `b` creates a divisible asset and assigns 50 shares to `b` and
    # 50 shares to `user_pk`
    tx_create = Create.generate(
        [alice.public_key], [([user_pk], 50), ([alice.public_key], 50)], asset={"name": random.random()}
    )
    tx_create_signed = tx_create.sign([alice.private_key])
    # TRANSFER divisible asset
    # `b` transfers its 50 shares to `user_pk`
    # after this transaction `user_pk` will have a total of 100 shares
    # split across two different transactions
    tx_transfer1 = Transfer.generate(tx_create.to_inputs([1]), [([user_pk], 50)], asset_id=tx_create.id)
    tx_transfer1_signed = tx_transfer1.sign([alice.private_key])
    # TRANSFER
    # `user_pk` combines two different transactions with 50 shares each and
    # transfers a total of 100 shares back to `b`
    tx_transfer2 = Transfer.generate(
        tx_create.to_inputs([0]) + tx_transfer1.to_inputs([0]), [([alice.public_key], 100)], asset_id=tx_create.id
    )
    tx_transfer2_signed = tx_transfer2.sign([user_sk])
    b.store_bulk_transactions([tx_create_signed, tx_transfer1_signed])
@@ -471,15 +462,14 @@ def test_amount_error_transfer(alice, b, user_pk, user_sk):
    from planetmint.transactions.common.exceptions import AmountError

    # CREATE divisible asset
    tx_create = Create.generate([alice.public_key], [([user_pk], 100)], asset={"name": random.random()})
    tx_create_signed = tx_create.sign([alice.private_key])
    b.store_bulk_transactions([tx_create_signed])
    # TRANSFER
    # output amount less than input amount
    tx_transfer = Transfer.generate(tx_create.to_inputs(), [([alice.public_key], 50)], asset_id=tx_create.id)
    tx_transfer_signed = tx_transfer.sign([user_sk])
    with pytest.raises(AmountError):
@@ -487,8 +477,7 @@ def test_amount_error_transfer(alice, b, user_pk, user_sk):
    # TRANSFER
    # output amount greater than input amount
    tx_transfer = Transfer.generate(tx_create.to_inputs(), [([alice.public_key], 101)], asset_id=tx_create.id)
    tx_transfer_signed = tx_transfer.sign([user_sk])
    with pytest.raises(AmountError):
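Both failure cases above reduce to one rule: a TRANSFER must spend exactly what its inputs provide, neither less (50 of 100) nor more (101 of 100). A plain-Python sketch of that balance check (the `validate_amounts` helper is illustrative, not Planetmint code):

```python
# Illustrative re-statement of the divisible-asset amount rule, NOT Planetmint code.
class AmountError(Exception):
    """Output amounts do not balance the spent input amounts."""


def validate_amounts(input_amounts, output_amounts):
    # All amounts must be positive integers...
    if any(a < 1 for a in input_amounts + output_amounts):
        raise AmountError("amounts must be positive integers")
    # ...and the totals must match exactly.
    if sum(output_amounts) != sum(input_amounts):
        raise AmountError(
            f"inputs provide {sum(input_amounts)}, outputs spend {sum(output_amounts)}"
        )
    return True
```

Under this rule `validate_amounts([100], [50, 50])` passes, while `[50]` or `[101]` as outputs raise, matching the two `pytest.raises(AmountError)` branches above.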
@@ -504,13 +493,11 @@ def test_threshold_same_public_key(alice, b, user_pk, user_sk):
    # that does not mean that the code shouldn't work.
    # CREATE divisible asset
    tx_create = Create.generate([alice.public_key], [([user_pk, user_pk], 100)], asset={"name": random.random()})
    tx_create_signed = tx_create.sign([alice.private_key])
    # TRANSFER
    tx_transfer = Transfer.generate(tx_create.to_inputs(), [([alice.public_key], 100)], asset_id=tx_create.id)
    tx_transfer_signed = tx_transfer.sign([user_sk, user_sk])
    b.store_bulk_transactions([tx_create_signed])
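The test above relies on a threshold condition over `[user_pk, user_pk]` being satisfiable by signing twice with the same key: each listed owner is an independent subcondition slot, even when the keys are equal. A toy model of that slot-matching rule (illustrative only, not the crypto-conditions implementation):

```python
# Toy model of threshold-condition slot matching, NOT the crypto-conditions library.
from collections import Counter


def threshold_satisfied(owners_after, signing_keys):
    # Every owner slot needs a matching signature; a duplicated key simply
    # occupies two slots and must therefore sign twice.
    need = Counter(owners_after)
    have = Counter(signing_keys)
    return all(have[key] >= count for key, count in need.items())
```

So `sign([user_sk, user_sk])` fulfils the `[user_pk, user_pk]` output, while a single signature would leave one slot uncovered.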
@@ -524,14 +511,14 @@ def test_threshold_same_public_key(alice, b, user_pk, user_sk):
def test_sum_amount(alice, b, user_pk, user_sk):
    # CREATE divisible asset with 3 outputs with amount 1
    tx_create = Create.generate(
        [alice.public_key], [([user_pk], 1), ([user_pk], 1), ([user_pk], 1)], asset={"name": random.random()}
    )
    tx_create_signed = tx_create.sign([alice.private_key])
    # create a transfer transaction with one output and check if the amount
    # is 3
    tx_transfer = Transfer.generate(tx_create.to_inputs(), [([alice.public_key], 3)], asset_id=tx_create.id)
    tx_transfer_signed = tx_transfer.sign([user_sk])
    b.store_bulk_transactions([tx_create_signed])
@@ -548,14 +535,16 @@ def test_sum_amount(alice, b, user_pk, user_sk):
def test_divide(alice, b, user_pk, user_sk):
    # CREATE divisible asset with 1 output with amount 3
    tx_create = Create.generate([alice.public_key], [([user_pk], 3)], asset={"name": random.random()})
    tx_create_signed = tx_create.sign([alice.private_key])
    # create a transfer transaction with 3 outputs and check if the amount
    # of each output is 1
    tx_transfer = Transfer.generate(
        tx_create.to_inputs(),
        [([alice.public_key], 1), ([alice.public_key], 1), ([alice.public_key], 1)],
        asset_id=tx_create.id,
    )
    tx_transfer_signed = tx_transfer.sign([user_sk])
    b.store_bulk_transactions([tx_create_signed])
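`test_sum_amount` and `test_divide` are mirror images: one merges three 1-share outputs into a single 3-share output, the other splits a 3-share output into three 1-share outputs, and in both directions the total is conserved. A short sketch of that regrouping rule (the `rebalance` helper is illustrative, not the Planetmint API):

```python
# Illustrative sketch of sum/divide semantics, NOT Planetmint code.
def rebalance(input_amounts, output_amounts):
    # A transfer may regroup shares into any number of outputs,
    # provided the total amount is conserved.
    if sum(input_amounts) != sum(output_amounts):
        raise ValueError("amounts must balance")
    return output_amounts


summed = rebalance([1, 1, 1], [3])   # test_sum_amount: merge three shares into one output
divided = rebalance([3], [1, 1, 1])  # test_divide: split one output into three shares
```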


@@ -63,22 +63,10 @@ def test_zenroom_signing():
    alice = json.loads(zencode_exec(GENERATE_KEYPAIR).output)["keyring"]
    bob = json.loads(zencode_exec(GENERATE_KEYPAIR).output)["keyring"]
    zen_public_keys = json.loads(zencode_exec(SK_TO_PK.format("Alice"), keys=json.dumps({"keyring": alice})).output)
    zen_public_keys.update(json.loads(zencode_exec(SK_TO_PK.format("Bob"), keys=json.dumps({"keyring": bob})).output))
    zenroomscpt = ZenroomSha256(script=FULFILL_SCRIPT, data=ZENROOM_DATA, keys=zen_public_keys)
    print(f"zenroom is: {zenroomscpt.script}")
    # CRYPTO-CONDITIONS: generate the condition uri
@@ -107,11 +95,7 @@ def test_zenroom_signing():
            biolabs.public_key,
        ],
    }
    metadata = {"result": {"output": ["ok"]}}
    token_creation_tx = {
        "operation": "CREATE",
        "asset": HOUSE_ASSETS,