Algo Runner API

A class for running an algorithm backtest, or running it against the latest pricing data, with automated publishing of the Trading History to S3

class analysis_engine.algo_runner.AlgoRunner(ticker, algo_config=None, start_date=None, end_date=None, history_loc=None, predictions_loc=None, run_on_engine=False, verbose_algo=False, verbose_processor=False, verbose_indicators=False, **kwargs)[source]

Run an algorithm backtest, or run it with the latest pricing data, and publish the compressed Trading History to S3 where it can be used to train AI models

Full Backtest

import analysis_engine.algo_runner as algo_runner
runner = algo_runner.AlgoRunner('SPY')

Run Algorithm with Latest Pricing Data

import analysis_engine.algo_runner as algo_runner
import analysis_engine.plot_trading_history as plot
ticker = 'SPY'
runner = algo_runner.AlgoRunner(ticker)
# run the algorithm with the latest 200 minutes:
df = runner.latest()
print(df[['minute', 'close']].tail(5))
print(
    f'{ticker} - ${df["close"].iloc[-1]} '
    f'at: {df["minute"].iloc[-1]}')

Determine the latest minute or day in the pricing dataset and convert the date and minute columns to datetime objects

latest(date_str=None, start_row=-200, extract_iex=True, extract_yahoo=False, extract_td=True, verbose=False, **kwargs)[source]

Run the algorithm with the latest pricing data. Also supports running a backtest for a historical date in the pricing history (format YYYY-MM-DD)

  • date_str – optional - string start date in YYYY-MM-DD format (default is the latest close date)
  • start_row – negative number of rows back from the end of the dataset (default is -200, meaning the algorithm processes the latest 200 rows of the minute dataset)
  • extract_iex – bool flag for extracting from IEX
  • extract_yahoo – bool flag for extracting from Yahoo which is disabled as of 1/2019
  • extract_td – bool flag for extracting from Tradier
  • verbose – bool flag for logs
  • kwargs – keyword arg dict
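The start_row parameter is a negative offset from the end of the dataset. A minimal pandas sketch of that slicing behavior, using a toy DataFrame in place of the real extracted pricing data:

```python
import pandas as pd

# A toy minute dataset, only to illustrate the start_row semantics -
# the real dataset comes from the extracted IEX/Tradier pricing data.
df = pd.DataFrame({
    'minute': range(1000),
    'close': [100.0 + m * 0.01 for m in range(1000)],
})

start_row = -200  # the default documented by latest()
latest_rows = df.iloc[start_row:]

print(len(latest_rows))               # 200 - only the trailing rows
print(latest_rows['minute'].iloc[0])  # 800 - slicing starts 200 from the end
```

This is why the default of -200 means "process the latest 200 minute rows".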
load_trading_history(s3_access_key=None, s3_secret_key=None, s3_address=None, s3_region=None, s3_bucket=None, s3_key=None, s3_secure=7, **kwargs)[source]

Helper for loading an algorithm Trading History from S3

  • s3_access_key – access key
  • s3_secret_key – secret
  • s3_address – address
  • s3_region – region
  • s3_bucket – bucket
  • s3_key – key
  • s3_secure – secure flag
  • kwargs – support for keyword arg dict
publish_trading_history(records_for_history, pt_s3_access_key=None, pt_s3_secret_key=None, pt_s3_address=None, pt_s3_region=None, pt_s3_bucket=None, pt_s3_key=None, pt_s3_secure=7, **kwargs)[source]

Helper for publishing a trading history to another S3 service like AWS

  • records_for_history – list of dictionaries for the history file
  • pt_s3_access_key – access key
  • pt_s3_secret_key – secret
  • pt_s3_address – address
  • pt_s3_region – region
  • pt_s3_bucket – bucket
  • pt_s3_key – key
  • pt_s3_secure – secure flag
  • kwargs – support for keyword arg dict
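The records_for_history argument is only documented as a list of dictionaries. A sketch of building and compressing such a payload; the field names (ticker, date, close, net_gain) are illustrative assumptions, not the library's schema:

```python
import json
import zlib

# Hypothetical trading-history records - the docs only say
# records_for_history is a list of dictionaries, so these keys
# are assumptions for illustration.
records_for_history = [
    {'ticker': 'SPY', 'date': '2019-02-15', 'close': 277.37, 'net_gain': 1.25},
    {'ticker': 'SPY', 'date': '2019-02-19', 'close': 277.85, 'net_gain': 1.73},
]

# The runner publishes the history compressed; a json + zlib round
# trip sketches what a compressed payload of this shape looks like.
payload = zlib.compress(json.dumps(records_for_history).encode('utf-8'))
restored = json.loads(zlib.decompress(payload).decode('utf-8'))
print(restored == records_for_history)  # True - lossless round trip
```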

Start the algorithm backtest


Wait until the algorithm finishes

Build an Algorithm Backtest Dictionary

Build a dictionary by extracting all required pricing datasets for the algorithm’s indicators out of Redis

This dictionary should be passed to an algorithm’s handle_data method.
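A hedged sketch of what that dictionary could look like; the nesting and the dataset names shown are assumptions based on the dataset names used elsewhere in these docs, not the library's confirmed layout:

```python
# Sketch of a per-ticker dataset dictionary. The 'minute' and 'daily'
# names follow the datasets mentioned in these docs; the exact nesting
# is an assumption, not the confirmed structure the library builds.
dataset_node = {
    'SPY': {
        'minute': None,  # would hold a pandas.DataFrame of minute bars
        'daily': None,   # would hold a pandas.DataFrame of daily bars
    },
}

# A hypothetical call - 'algo' would be an instantiated algorithm object:
# algo.handle_data(data=dataset_node)
print(sorted(dataset_node['SPY'].keys()))  # ['daily', 'minute']
```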

analysis_engine.build_dataset_node.build_dataset_node(ticker, datasets, date=None, service_dict=None, log_label=None, redis_enabled=True, redis_address=None, redis_db=None, redis_password=None, redis_expire=None, redis_key=None, s3_enabled=True, s3_address=None, s3_bucket=None, s3_access_key=None, s3_secret_key=None, s3_region_name=None, s3_secure=False, s3_key=None, verbose=False)[source]

Helper for building a dictionary of cached datasets from Redis.

The datasets are built from the uses_data fields of the algorithm config’s indicators; any indicator that does not set uses_data defaults to minute data

  • ticker – string ticker
  • datasets – list of string dataset names to extract from redis
  • date – optional - string datetime formatted YYYY-MM-DD (default is last trading close date)
  • service_dict – optional - dictionary holding all service connectivity settings for Redis and Minio; if not set, the s3_* and redis_* arguments are used to look up data in Redis and Minio

(Optional) Redis connectivity arguments

  • redis_enabled – bool - toggle for auto-caching all datasets in Redis (default is True)
  • redis_address – Redis connection string format is host:port (default is localhost:6379)
  • redis_db – Redis db to use (default is 0)
  • redis_password – optional - Redis password (default is None)
  • redis_expire – optional - Redis expire value (default is None)
  • redis_key – optional - Redis key (not used; default is None)
  • s3_enabled – bool - toggle for turning on/off Minio or AWS S3 (default is True)
  • s3_address – Minio S3 connection string address format is host:port (default is localhost:9000)
  • s3_bucket – S3 Bucket for storing the artifacts (default is dev) which should be viewable on a browser: http://localhost:9000/minio/dev/
  • s3_access_key – S3 Access key (default is trexaccesskey)
  • s3_secret_key – S3 Secret key (default is trex123321)
  • s3_region_name – S3 region name (default is us-east-1)
  • s3_secure – Transmit using tls encryption (default is False)
  • s3_key – optional - S3 key (not used; default is None)


  • log_label – optional - log label string
  • verbose – optional - flag for debugging (default is False)
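The documented defaults above can be collected into a service_dict. Whether build_dataset_node expects exactly these key names inside service_dict is an assumption; they simply mirror the keyword arguments one-for-one:

```python
# service_dict assembled from the defaults documented above.
# Assumption: the keys mirror the s3_* and redis_* keyword arguments.
service_dict = {
    'redis_enabled': True,
    'redis_address': 'localhost:6379',
    'redis_db': 0,
    'redis_password': None,
    'redis_expire': None,
    's3_enabled': True,
    's3_address': 'localhost:9000',
    's3_bucket': 'dev',
    's3_access_key': 'trexaccesskey',
    's3_secret_key': 'trex123321',
    's3_region_name': 'us-east-1',
    's3_secure': False,
}

# Hypothetical call shape (requires running Redis/Minio services):
# import analysis_engine.build_dataset_node as build_dataset_node
# node = build_dataset_node.build_dataset_node(
#     ticker='SPY', datasets=['minute'], service_dict=service_dict)
print(service_dict['s3_bucket'])  # dev
```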

Run an Algorithm Backtest with the Runner API

Algorithm Runner API Example Script

Run a Full Backtest: -t TICKER -b S3_BUCKET -k S3_KEY -c ALGO_CONFIG

Run the Algorithm with the Latest Pricing Data: -l -t TICKER -b S3_BUCKET -k S3_KEY -c ALGO_CONFIG

Debug by adding -d as an argument


Build and publish a trading history from an algorithm config: -t TICKER -c ALGO_CONFIG -s START_DATE -k S3_KEY -b S3_BUCKET -l