Algorithm - Run Custom Algorithm

This is a wrapper for running your own custom algorithms.

Note

Please refer to sa.py for the latest usage examples.

Example with the command line tool:

bt -t SPY -g /opt/sa/analysis_engine/mocks/example_algo_minute.py
analysis_engine.run_custom_algo.run_custom_algo(mod_path, ticker='SPY', balance=50000, commission=6.0, start_date=None, end_date=None, name='myalgo', auto_fill=True, config_file=None, config_dict=None, load_from_s3_bucket=None, load_from_s3_key=None, load_from_redis_key=None, load_from_file=None, load_compress=False, load_publish=True, load_config=None, report_redis_key=None, report_s3_bucket=None, report_s3_key=None, report_file=None, report_compress=False, report_publish=True, report_config=None, history_redis_key=None, history_s3_bucket=None, history_s3_key=None, history_file=None, history_compress=False, history_publish=True, history_config=None, extract_redis_key=None, extract_s3_bucket=None, extract_s3_key=None, extract_file=None, extract_save_dir=None, extract_compress=False, extract_publish=True, extract_config=None, publish_to_s3=True, publish_to_redis=True, publish_to_slack=True, dataset_type=20000, serialize_datasets=['daily', 'minute', 'quote', 'stats', 'peers', 'news1', 'financials', 'earnings', 'dividends', 'company', 'news', 'calls', 'puts', 'pricing', 'tdcalls', 'tdputs'], compress=False, encoding='utf-8', redis_enabled=True, redis_key=None, redis_address=None, redis_db=None, redis_password=None, redis_expire=None, redis_serializer='json', redis_encoding='utf-8', s3_enabled=True, s3_key=None, s3_address=None, s3_bucket=None, s3_access_key=None, s3_secret_key=None, s3_region_name=None, s3_secure=False, slack_enabled=False, slack_code_block=False, slack_full_width=False, timeseries=None, trade_strategy=None, verbose=False, debug=False, dataset_publish_extract=False, dataset_publish_history=False, dataset_publish_report=False, run_on_engine=False, auth_url='redis://localhost:6379/13', backend_url='redis://localhost:6379/14', include_tasks=['analysis_engine.work_tasks.task_run_algo', 'analysis_engine.work_tasks.get_new_pricing_data', 'analysis_engine.work_tasks.handle_pricing_update_task', 'analysis_engine.work_tasks.prepare_pricing_dataset', 
'analysis_engine.work_tasks.publish_from_s3_to_redis', 'analysis_engine.work_tasks.publish_pricing_update', 'analysis_engine.work_tasks.task_screener_analysis', 'analysis_engine.work_tasks.publish_ticker_aggregate_from_s3'], ssl_options={}, transport_options={}, path_to_config_module='analysis_engine.work_tasks.celery_config', raise_on_err=True)

Run a custom algorithm that derives from the analysis_engine.algo.BaseAlgo class

Note

Make sure to define only one class in an algo module. Imports from other modules work fine.
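To illustrate the single-class rule, here is a minimal sketch of what a custom algo module could look like. Because analysis_engine may not be installed when reading this, a stand-in base class is used below; in a real module you would subclass analysis_engine.algo.BaseAlgo instead, and the constructor and handle_data signatures shown here are assumptions:

```python
# Sketch of a custom algo module layout. In a real module, replace the
# stand-in below with: from analysis_engine.algo import BaseAlgo
class BaseAlgo:
    """Stand-in for analysis_engine.algo.BaseAlgo (assumed interface)."""
    def __init__(self, ticker=None, balance=0.0, **kwargs):
        self.ticker = ticker
        self.balance = balance


class ExampleMinuteAlgo(BaseAlgo):
    """The ONLY class defined in this module - the loader expects one."""

    def handle_data(self, data):
        # inspect the algorithm-ready dataset and decide on buys/sells
        self.latest_dataset = data


# quick local smoke test of the module shape
algo = ExampleMinuteAlgo(ticker='SPY', balance=50000.0)
algo.handle_data({'SPY': [{'minute': []}]})
```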

Algorithm arguments

Parameters:
  • mod_path – file path to custom algorithm class module
  • ticker – ticker symbol
  • balance – float - starting balance capital for creating buys and sells
  • commission – float - cost per buy or sell
  • name – string - name for tracking algorithm in the logs
  • start_date – string - start date for backtest with format YYYY-MM-DD HH:MM:SS
  • end_date – string - end date for backtest with format YYYY-MM-DD HH:MM:SS
  • auto_fill – optional - boolean for auto filling buy and sell orders for backtesting (default is True)
  • config_file – path to a json file containing custom algorithm object member values (like indicator configuration and predict future date units ahead for a backtest)
  • config_dict – optional - dictionary that can be passed to derived class implementations of: def load_from_config(config_dict=config_dict)
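The Algorithm arguments above can be collected into a keyword dictionary before calling run_custom_algo. The sketch below mirrors the command-line example at the top of this page; the start and end dates are hypothetical placeholder values:

```python
# Keyword arguments mirroring the documented Algorithm arguments;
# the dates are hypothetical placeholders.
backtest_kwargs = dict(
    mod_path='/opt/sa/analysis_engine/mocks/example_algo_minute.py',
    ticker='SPY',
    balance=50000.0,   # starting capital for creating buys and sells
    commission=6.0,    # cost per buy or sell
    start_date='2018-11-01 00:00:00',  # YYYY-MM-DD HH:MM:SS
    end_date='2018-11-05 00:00:00',
    name='myalgo',     # name for tracking the algorithm in the logs
    auto_fill=True)    # auto fill buy/sell orders while backtesting

# with the library installed, the call would look like:
# from analysis_engine.run_custom_algo import run_custom_algo
# res = run_custom_algo(**backtest_kwargs)
```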

Timeseries

Parameters:
  • timeseries – optional - string to set day or minute backtesting or live trading (default is minute)

Trading Strategy

Parameters:
  • trade_strategy – optional - string to set the type of Trading Strategy for backtesting or live trading (default is count)
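As a sketch, the two mode arguments above could be resolved to their documented defaults like this (backtest_mode is a hypothetical helper for illustration, not part of the library):

```python
def backtest_mode(timeseries=None, trade_strategy=None):
    """Resolve the documented defaults for the two mode arguments."""
    return {
        'timeseries': timeseries or 'minute',        # 'day' or 'minute'
        'trade_strategy': trade_strategy or 'count',
    }


print(backtest_mode(timeseries='day'))
# → {'timeseries': 'day', 'trade_strategy': 'count'}
```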

Running Distributed Algorithms on the Engine Workers

Parameters:
  • run_on_engine – optional - boolean flag for publishing custom algorithms to Celery ae workers to distribute algorithm workloads (default is False, which runs algos locally); required for distributing algorithms
  • auth_url – Celery broker address (default is redis://localhost:6379/13 or the analysis_engine.consts.WORKER_BROKER_URL environment variable); required for distributing algorithms
  • backend_url – Celery backend address (default is redis://localhost:6379/14 or the analysis_engine.consts.WORKER_BACKEND_URL environment variable); required for distributing algorithms
  • include_tasks – list of modules containing tasks to add (default is analysis_engine.consts.INCLUDE_TASKS)
  • ssl_options – security options dictionary (default is analysis_engine.consts.SSL_OPTIONS)
  • transport_options – transport options dictionary (default is analysis_engine.consts.TRANSPORT_OPTIONS)
  • path_to_config_module – config module for advanced Celery worker connectivity requirements (default is analysis_engine.work_tasks.celery_config or analysis_engine.consts.WORKER_CELERY_CONFIG_MODULE)
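A sketch of the extra keyword arguments needed to distribute an algorithm to the engine workers, using the defaults from the signature above (broker on redis db 13, backend on db 14); in a deployment these would normally come from the WORKER_BROKER_URL and WORKER_BACKEND_URL environment variables instead:

```python
# Extra kwargs to run the algorithm on Celery ae workers instead of locally
engine_kwargs = dict(
    run_on_engine=True,
    auth_url='redis://localhost:6379/13',     # Celery broker address
    backend_url='redis://localhost:6379/14')  # Celery backend address

# merged into the main call as: run_custom_algo(..., **engine_kwargs)
```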

Load Algorithm-Ready Dataset From Source

Use these arguments to load algorithm-ready datasets from supported sources (file, s3 or redis)

Parameters:
  • load_from_s3_bucket – optional - string load the algo from a previously-created s3 bucket holding an s3 key with an algorithm-ready dataset for use with: handle_data
  • load_from_s3_key – optional - string load the algo from a previously-created s3 key holding an algorithm-ready dataset for use with: handle_data
  • load_from_redis_key – optional - string load the algo from a previously-created redis key holding an algorithm-ready dataset for use with: handle_data
  • load_from_file – optional - string path to a previously-created local file holding an algorithm-ready dataset for use with: handle_data
  • load_compress – optional - boolean flag for decompressing the algorithm-ready dataset when loading (True means the dataset must be decompressed to load correctly inside an algorithm to run a backtest)
  • load_publish – boolean - toggle publishing the load progress to slack, s3, redis or a file (default is True)
  • load_config – optional - dictionary for setting member variables to load an algorithm-ready dataset from a file, s3 or redis
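A sketch of loading a previously-published algorithm-ready dataset from s3; set only the source you want to use (s3 needs both a bucket and a key). The bucket and key names below are hypothetical:

```python
# Load an algorithm-ready dataset from s3 (hypothetical bucket/key names)
load_kwargs = dict(
    load_from_s3_bucket='algoready',
    load_from_s3_key='SPY-latest.json',
    load_compress=False,  # True if the dataset was published compressed
    load_publish=True)    # publish load progress to slack/s3/redis/file

# merged into the main call as: run_custom_algo(..., **load_kwargs)
```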

Publishing Control Bool Flags

Parameters:
  • publish_to_s3 – optional - boolean for toggling publishing to s3 on/off (default is True)
  • publish_to_redis – optional - boolean for publishing to redis on/off (default is True)
  • publish_to_slack – optional - boolean for publishing to slack (default is True)

Algorithm Trade History Arguments

Parameters:
  • history_redis_key – optional - string where the algorithm trading history will be stored in a redis key
  • history_s3_bucket – optional - string where the algorithm trading history will be stored in an s3 bucket
  • history_s3_key – optional - string where the algorithm trading history will be stored in an s3 key
  • history_file – optional - string key where the algorithm trading history will be stored in a file serialized as a json-string
  • history_compress – optional - boolean flag for compressing the trading history dataset on publish (True means the dataset will be compressed on publish)
  • history_publish – boolean - toggle publishing the history to s3, redis or a file (default is True)
  • history_config – optional - dictionary for setting member variables to publish an algo trade history to s3, redis, a file or slack

Algorithm Trade Performance Report Arguments (Output Dataset)

Parameters:
  • report_redis_key – optional - string where the algorithm trading performance report (report) will be stored in a redis key
  • report_s3_bucket – optional - string where the algorithm report will be stored in an s3 bucket
  • report_s3_key – optional - string where the algorithm report will be stored in an s3 key
  • report_file – optional - string key where the algorithm report will be stored in a file serialized as a json-string
  • report_compress – optional - boolean flag for compressing the report dataset on publish (True means the dataset will be compressed on publish)
  • report_publish – boolean - toggle publishing the trading performance report to s3, redis or a file (default is True)
  • report_config – optional - dictionary for setting member variables to publish an algo trading performance report to s3, redis, a file or slack

Extract an Algorithm-Ready Dataset Arguments

Parameters:
  • extract_redis_key – optional - string where the extracted algorithm-ready dataset will be stored in a redis key
  • extract_s3_bucket – optional - string where the extracted algorithm-ready dataset will be stored in an s3 bucket
  • extract_s3_key – optional - string where the extracted algorithm-ready dataset will be stored in an s3 key
  • extract_file – optional - string key where the extracted algorithm-ready dataset will be stored in a file serialized as a json-string
  • extract_save_dir – optional - string path to auto-generated files from the algo
  • extract_compress – optional - boolean flag for compressing the algorithm-ready dataset on publish (True means the dataset will be compressed on publish)
  • extract_publish – boolean - toggle publishing the used algorithm-ready dataset to s3, redis or a file (default is True)
  • extract_config – optional - dictionary for setting member variables to publish an extracted algorithm-ready dataset to s3, redis, a file or slack
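The three output datasets (trade history, performance report, and the extracted algorithm-ready dataset) follow the same destination pattern. A sketch with hypothetical bucket and key names, which could be merged into the main call with **:

```python
# Hypothetical s3 destinations for the three output datasets
output_kwargs = dict(
    history_s3_bucket='algohistory',
    history_s3_key='SPY-myalgo-history.json',
    report_s3_bucket='algoreport',
    report_s3_key='SPY-myalgo-report.json',
    extract_s3_bucket='algoready',
    extract_s3_key='SPY-myalgo-ready.json')

# run_custom_algo(..., **output_kwargs) would publish all three to s3
```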

Dataset Arguments

Parameters:
  • dataset_type – optional - dataset type (default is SA_DATASET_TYPE_ALGO_READY)
  • serialize_datasets – optional - list of dataset names to deserialize in the dataset (default is DEFAULT_SERIALIZED_DATASETS)
  • encoding – optional - string for data encoding

Publish Algorithm Datasets to S3, Redis or a File

Parameters:
  • dataset_publish_extract – optional - bool for publishing the algorithm’s algorithm-ready dataset to: s3, redis or file
  • dataset_publish_history – optional - bool for publishing the algorithm’s trading history dataset to: s3, redis or file
  • dataset_publish_report – optional - bool for publishing the algorithm’s trading performance report dataset to: s3, redis or file

Redis connectivity arguments

Parameters:
  • redis_enabled – bool - toggle for auto-caching all datasets in Redis (default is True)
  • redis_key – string - key to save the data in redis (default is None)
  • redis_address – Redis connection string format: host:port (default is localhost:6379)
  • redis_db – Redis db to use (default is 0)
  • redis_password – optional - Redis password (default is None)
  • redis_expire – optional - Redis expire value (default is None)
  • redis_serializer – not used yet - support for future pickle objects in redis
  • redis_encoding – format of the encoded key in redis
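A sketch of the redis connectivity arguments using the documented defaults (localhost:6379, db 0, no password, no expiration):

```python
# Redis connectivity kwargs with the documented defaults
redis_kwargs = dict(
    redis_enabled=True,              # auto-cache all datasets in Redis
    redis_address='localhost:6379',  # host:port
    redis_db=0,
    redis_password=None,
    redis_expire=None)               # keys do not expire
```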

Minio (S3) connectivity arguments

Parameters:
  • s3_enabled – bool - toggle for auto-archiving on Minio (S3) (default is True)
  • s3_key – string - key to save the data in s3 (default is None)
  • s3_address – Minio S3 connection string format: host:port (default is localhost:9000)
  • s3_bucket – S3 Bucket for storing the artifacts (default is dev) which should be viewable on a browser: http://localhost:9000/minio/dev/
  • s3_access_key – S3 Access key (default is trexaccesskey)
  • s3_secret_key – S3 Secret key (default is trex123321)
  • s3_region_name – S3 region name (default is us-east-1)
  • s3_secure – Transmit using tls encryption (default is False)
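A sketch of the Minio (S3) connectivity arguments, using the documented local-development defaults (these credentials are for a local Minio container, not for production use):

```python
# Minio (S3) connectivity kwargs with the documented local-dev defaults
s3_kwargs = dict(
    s3_enabled=True,              # auto-archive artifacts on Minio (S3)
    s3_address='localhost:9000',  # Minio endpoint, host:port
    s3_bucket='dev',              # browsable at http://localhost:9000/minio/dev/
    s3_access_key='trexaccesskey',
    s3_secret_key='trex123321',
    s3_region_name='us-east-1',
    s3_secure=False)              # no TLS for the local Minio container
```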

Slack arguments

Parameters:
  • slack_enabled – optional - boolean for publishing to slack
  • slack_code_block – optional - boolean for publishing as a code block in slack
  • slack_full_width – optional - boolean for publishing to slack using the full width allowed

Debugging arguments

Parameters:
  • debug – optional - bool for debug tracking
  • verbose – optional - bool for increasing logging
  • raise_on_err – boolean - set this to False in production to ensure exceptions do not interrupt services; with the default (True), any exception from the library or your own algorithm is raised immediately, exiting the backtest