Fix 781 and 124

The queue/pipeline used in the block creation process is unbounded.
When transactions arrive faster than blocks are created, the backlog table
fills up quickly and bigchaindb reads all the transactions in the backlog
at once to create a block. This causes memory usage to grow without bound.

Limit the queue size to 1000 transactions for now, since block creation and
voting happen in batches of 1000. The limit can be raised later if the block
size is increased.
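The mechanism behind the fix is ordinary queue backpressure: once a bounded
queue holds its maximum number of pending items, producers must wait (or fail
fast) until a consumer drains a slot. A minimal standalone sketch with the
standard library's `queue.Queue` (using a toy limit of 3 in place of 1000):

```python
import queue

# Bounded queue: once maxsize items are pending, put() blocks (or raises
# queue.Full in non-blocking mode), so the backlog can never grow without limit.
backlog = queue.Queue(maxsize=3)

for tx in range(3):
    backlog.put(tx)  # fills the queue to capacity

try:
    backlog.put("one too many", block=False)
except queue.Full:
    print("queue full: producer must wait for a consumer")

# A consumer draining one item frees a slot for the producer.
backlog.get()
backlog.put("now it fits", block=False)
print(backlog.qsize())  # back at capacity: 3
```

With `block=True` (the default) the producer simply sleeps until space is
available, which is what keeps memory bounded in the pipeline case.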
Krish 2016-12-08 08:54:21 +00:00
parent bd6b9da080
commit 85bb4a9233


@@ -8,7 +8,7 @@ function.
 import logging
 import rethinkdb as r
-from multipipes import Pipeline, Node
+from multipipes import Pipeline, Node, Pipe
 from bigchaindb.models import Transaction
 from bigchaindb.pipelines.utils import ChangeFeed
@@ -161,6 +161,7 @@ def create_pipeline():
     block_pipeline = BlockPipeline()
     pipeline = Pipeline([
+        Pipe(maxsize=1000),
         Node(block_pipeline.filter_tx),
         Node(block_pipeline.validate_tx, fraction_of_cores=1),
         Node(block_pipeline.create, timeout=1),
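Placing the bounded `Pipe` at the head of the pipeline means a slow stage
throttles the changefeed producer itself. The following is a simplified
thread-based sketch of that idea, not the real multipipes API: `stage` and the
lambda transform are hypothetical stand-ins for the pipeline's nodes, with a
bounded queue as the entry pipe.

```python
import queue
import threading

def stage(fn, inbox, outbox):
    """Hypothetical pipeline stage: read from a bounded inbox, write results
    to the next queue. A None sentinel shuts the stage down."""
    while True:
        item = inbox.get()
        if item is None:
            outbox.put(None)  # propagate shutdown downstream
            return
        outbox.put(fn(item))

# Bounded entry pipe, analogous to Pipe(maxsize=1000) in the diff above;
# the producer blocks here instead of buffering the whole backlog in memory.
entry_pipe = queue.Queue(maxsize=1000)
out_q = queue.Queue()

worker = threading.Thread(target=stage,
                          args=(lambda tx: tx * 2, entry_pipe, out_q))
worker.start()

for tx in range(5):
    entry_pipe.put(tx)  # would block if the stage fell 1000 items behind
entry_pipe.put(None)
worker.join()

results = []
while (item := out_q.get()) is not None:
    results.append(item)
print(results)  # [0, 2, 4, 6, 8]
```

In the real pipeline the same backpressure propagates from `create` back
through `validate_tx` and `filter_tx` to the changefeed.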