Bloomberg Equities Analytics Accelerator Quickstart Guide

This page provides a quickstart guide for the Bloomberg Equities Analytics Accelerator, an FSI Accelerator package that ships with ready-made pipelines, views, and schemas out of the box.

Prerequisites

This quickstart guide is not intended to be a comprehensive kdb Insights install guide. For more information about kdb Insights installation, refer to the kdb Insights Enterprise documentation. This guide assumes the following prerequisites:

  • kdb Insights Enterprise is installed, and:

    • The following credentials have been obtained and configured for Insights users:

      • GUI user

      • API client

    • A KX Downloads Portal bearer token to download packages/charts (represented by BEARER in this guide).

  • Tools used:

    • Access to *nix command-line

    • kdb Insights CLI (kxi)

    • Kubernetes tools:

      • kubectl

      • K9s

    • Helm, installed and logged in to Nexus

    • VS Code extensions:

      • KX kdb

      • Jupyter
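
The KX Downloads Portal bearer token from the prerequisites is referenced as BEARER in the download commands below. A minimal sketch of exporting it, with a placeholder value you should replace with your own token:

```shell
# Placeholder value for illustration only; substitute your real
# KX Downloads Portal bearer token
export BEARER='your-kx-downloads-portal-token'
```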

Quickstart guide

This guide details the following steps to get you started with the Bloomberg Equities Analytics Accelerator:

Download the packages

To download the required FSI packages from the KX Downloads Portal, run the following:

Shell

# Download fsi-lib - the Core FSI Library
curl -s --fail-with-body -D /dev/stderr --oauth2-bearer ${BEARER} -L -OJ https://portal.dl.kx.com/assets/raw/kxi-accelerators/fsi/fsi-lib/packages/1.3.0/fsi-lib-1.3.0.kxi

# Download fsi-app-bbg-eqea - the Bloomberg Equities Analytics Accelerator
curl -s --fail-with-body -D /dev/stderr --oauth2-bearer ${BEARER} -L -OJ https://portal.dl.kx.com/assets/raw/kxi-accelerators/fsi/fsi-bbg-eqea/packages/1.3.0/fsi-app-bbg-eqea-1.3.0.kxi
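
Before installing, a quick sanity check that both package files arrived and are non-empty can save a confusing push failure later (a sketch; the file names match the versions above):

```shell
# Report whether each expected package file exists and is non-empty
for f in fsi-lib-1.3.0.kxi fsi-app-bbg-eqea-1.3.0.kxi; do
  if [ -s "$f" ]; then
    echo "ok: $f"
  else
    echo "missing or empty: $f"
  fi
done
```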

Install the packages

Run the command below to install the FSI packages:

Shell

kxi pm push fsi-lib-1.3.0.kxi
kxi pm push fsi-app-bbg-eqea-1.3.0.kxi

Deploy the database

Run the command below to deploy the database:

Shell

kxi pm deploy fsi-app-bbg-eqea --db fsi-core-db

Information

At this point, you should be able to see the data access points (DAPs) booting up in the GUI or in K9s.
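
If you prefer the command line to K9s, the same check can be sketched with kubectl. The namespace name here is an assumption; substitute the namespace your kdb Insights install uses:

```shell
# List pods in the Insights namespace and filter for the DAPs
NAMESPACE='insights'   # assumption: replace with your install's namespace
kubectl get pods -n "$NAMESPACE" | grep -i dap
```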

Deploy the pipelines

If all the files are in the same S3 region, you can set an environment variable locally so it can be reused in your commands. To do that, run the following:

Shell

export MY_S3_REGION='us-east-2'

Then, deploy the pipelines for:

Order ingest

First, deploy the Order ingest pipeline with your chosen configuration.

If you are using time zone normalization, turn it on by setting the FSI_EQEA_TZNORM boolean to true. In this example it is left as false, which is the default, so it can be omitted.

When normalization is not enabled, orders are ingested as-is, without a strike time calculation or time zone adjustment.

To deploy the pipeline with the target file given, run the following:

Shell

kxi pm deploy fsi-app-bbg-eqea --pipeline eqeaorderingest --env eqeaorderingest:FSI_FILEPATH=':s3://my-bucket/orders.csv' --env eqeaorderingest:FSI_REGION=$MY_S3_REGION
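
For reference, the same deploy with time zone normalization turned on is sketched below; the FSI_EQEA_TZNORM flag comes from the note above, and the file path is the same placeholder bucket:

```shell
# Fall back to the example region if MY_S3_REGION was not exported earlier
: "${MY_S3_REGION:=us-east-2}"

# Order ingest with time zone normalization enabled
kxi pm deploy fsi-app-bbg-eqea --pipeline eqeaorderingest \
  --env eqeaorderingest:FSI_EQEA_TZNORM='true' \
  --env eqeaorderingest:FSI_FILEPATH=':s3://my-bucket/orders.csv' \
  --env eqeaorderingest:FSI_REGION=$MY_S3_REGION
```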

fxRates

If you are using fxRates to calculate values in USD, deploy the pipeline with the target file, as follows:

Shell

kxi pm deploy fsi-app-bbg-eqea --pipeline eqeafxrates --env eqeafxrates:FSI_FILEPATH=':s3://my-bucket/fxRates.csv' --env eqeafxrates:FSI_REGION=$MY_S3_REGION

fxRates are required to calculate the value in USD, which is the foundation of the dashboard metrics.

Market data

Market data covering the time period of the orders is required.

Bloomberg historical data is provided as separate files for trades and quotes, so the files are set in two pipelines. Each is available as either CSV or Parquet; this example uses the CSV pipelines.

Deploy the Bloomberg market data ingestion pipeline with the corresponding Bloomberg historical data:

Shell

kxi pm deploy fsi-app-bbg-eqea --pipeline bbgtradeingest --env bbgtradeingest:FSI_FILEPATH=':s3://my-bucket/trades.csv.gz'  --env bbgtradeingest:FSI_REGION=$MY_S3_REGION

kxi pm deploy fsi-app-bbg-eqea --pipeline bbgquoteingest --env bbgquoteingest:FSI_FILEPATH=':s3://my-bucket/quotes.csv.gz'  --env bbgquoteingest:FSI_REGION=$MY_S3_REGION

Access APIs

Once you have deployed all the pipelines and ingested the data, you can query the tables using the getData or getTicks APIs, then use the generation API to preview the order analytics.
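
As a hypothetical sketch of what such a query can look like over REST (the host name, token variable, endpoint path, and time window are all assumptions to check against your Insights deployment), a getData call might be:

```shell
# Assumed values; replace with your deployment's host and access token
INSIGHTS_HOSTNAME='insights.example.com'
INSIGHTS_TOKEN='your-access-token'

# Query the orderAnalytics table over an illustrative window via getData
curl -X POST "https://${INSIGHTS_HOSTNAME}/servicegateway/kxi/getData" \
  --header "Authorization: Bearer ${INSIGHTS_TOKEN}" \
  --header 'Content-Type: application/json' \
  --data '{"table":"orderAnalytics","startTS":"2000-01-01T00:00:00.000000000","endTS":"2000-01-02T00:00:00.000000000"}'
```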

Run the generation pipeline to persist the order analytics

Once the data is ingested and the generation API is working, you can run an instance of the nightly generation pipeline that persists the order analytics. For details, refer to the documentation on the generation and persistence of orderAnalytics.

To run an instance of the generation pipeline, deploy the pipeline with the corresponding date, as follows:

Shell

kxi pm deploy fsi-app-bbg-eqea --pipeline eqeagentca --env eqeagentca:DATE='2000.01.01'
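
When this pipeline is later scheduled nightly, the date is typically derived rather than hard-coded. A sketch using GNU date (on macOS, the equivalent is `date -v-1d +%Y.%m.%d`):

```shell
# Derive yesterday's date in the yyyy.mm.dd form the DATE variable expects
RUN_DATE=$(date -d 'yesterday' +%Y.%m.%d)
echo "running generation for ${RUN_DATE}"
kxi pm deploy fsi-app-bbg-eqea --pipeline eqeagentca --env eqeagentca:DATE="$RUN_DATE"
```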

View summary of order analytics

Once orderAnalytics has been written into the database, you can query the table and use the summary rollup functions. These summary rollups power the example dashboards.

There are two ways to view a summary of order analytics:

  1. API. Call the summary API on the orderAnalytics table to run rollup aggregations.

  2. Dashboards. Log in to the dashboards to investigate order analytics. The dashboard, named `Equity-Execution-Analysis`, serves as an example of rendering metrics from the summary rollup calculations.

Additional pipeline configuration

To use the full feature set of the accelerator, the following additional reference pipelines are also required:

Exchange reference data

If you are normalizing order data, exchange reference data is required to calculate strike time. To ingest it, run the following command:

Shell

kxi pm deploy fsi-app-bbg-eqea --pipeline eqeaexchange-refdata --env eqeaexchange-refdata:FSI_FILEPATH=':s3://my-bucket/exchangeHours.csv' --env eqeaexchange-refdata:FSI_REGION=$MY_S3_REGION

Composite mapping

If composite tickers are being used for market data but orders are for local tickers, then you must ingest a composite ticker map, as follows:

Shell

kxi pm deploy fsi-app-bbg-eqea --pipeline eqeacompositeticker --env eqeacompositeticker:FSI_FILEPATH=':s3://my-bucket/compositeTickerMap.csv' --env eqeacompositeticker:FSI_REGION=$MY_S3_REGION

This feature also requires a boolean setting to be turned on. It is enabled by default, and is set in a custom file in an overlay:

q

.eqea.mapToCompositeTicker:1b;

Next steps - nightly schedules

Once you are happy with all the pipelines, the next step is to set up a recurring schedule of ingest pipelines that consume all data nightly, followed by a pipeline that generates the analytics.

Refer to the scheduling pipelines documentation for more details on setting up a recurring schedule.