I spoke at the Open Data Science London conference last weekend on the topic of becoming a quant. Part of the talk was aimed at educating practising data scientists on the fact that quantitative finance firms do actually contribute to, and create, many open source projects.
One such project is QSTrader, which I haven't discussed for some time on the site. In today's post I am pleased to announce that our team has been working hard to produce a completely updated version of QSTrader that will be released in the coming months.
QSTrader was initially developed as a modular event-driven backtesting system primarily aimed at equities-based strategies. However, it soon became clear that retail traders and institutional firms alike were improving it beyond what we had initially envisioned.
Although the current version does have basic portfolio handling capabilities, it is far from the multi-account, multi-strategy system that many users have asked for. This motivated us to produce a more sophisticated system "from the ground up".
Our intention is to be able to simulate large allocations of capital from the backtesting/research phase, through forward simulation (e.g. with various mathematical models of asset paths), to paper trading and finally live deployment.
The goal is to improve QSTrader from a simple event-driven equities backtester to a fully-fledged real-time trade engine and performance reporting environment across multiple asset classes, currencies and instruments, using an institutional-style portfolio construction framework.
The following sections describe some of the main proposed components in detail.
In the current version of QSTrader there is no concept of a "liquidity provider" or "brokerage" that the trading system can utilise, either for live market data or portfolio tracking. In the new version a Broker class hierarchy has been designed to handle this position accounting.
The broker entity supports a "master" account with multiple sub-accounts, each tracking their own PnL. Each account can be denominated in a separate currency, allowing multi-region portfolios. The sub-account PnLs will be aggregated to obtain total account PnL, which will be marked-to-market across the various currencies, using point-in-time F/X data.
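As a rough illustration of this sub-account aggregation, a minimal sketch might look like the following. The class and method names here are assumptions for demonstration, not the final QSTrader API:

```python
class SubAccount:
    """A single sub-account with its cash/PnL tracked in a local currency."""
    def __init__(self, currency, cash=0.0):
        self.currency = currency
        self.cash = cash

class Broker:
    """Hypothetical master account holding multiple sub-accounts."""
    def __init__(self, base_currency="USD"):
        self.base_currency = base_currency
        self.sub_accounts = {}

    def create_sub_account(self, name, currency, cash=0.0):
        self.sub_accounts[name] = SubAccount(currency, cash)

    def total_equity(self, fx_rates):
        """Aggregate sub-account balances into the base currency using
        point-in-time FX rates, e.g. {"GBP": 1.25} meaning 1 GBP = 1.25 USD."""
        total = 0.0
        for acct in self.sub_accounts.values():
            rate = 1.0 if acct.currency == self.base_currency else fx_rates[acct.currency]
            total += acct.cash * rate
        return total
```

A USD-denominated master account with a GBP sub-account would then mark its total equity to market simply by supplying the GBP/USD rate for the timestamp in question.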
The trading strategy will interface with the broker entity solely by subscribing/withdrawing funds, creating and deleting sub-accounts, obtaining the latest market data and executing orders. This allows all internal portfolio and position handling logic to be carried out by the broker entity itself.
At this stage we feel this is a sufficiently generic model to support many varieties of trading entities, from small account size retail traders through to "friends and family" managers and even family offices/small quant funds without heavy requirements.
In the future there is likely to be a broader LiquidityProvider class hierarchy, of which the Broker will be a subclass. This would ultimately reflect the varying providers of market access that are available to certain financial market participants (e.g. prime brokerage, ECNs).
There is no support for margin in the current version of QSTrader, but in the new version realistic margin calculations will be provided.
These calculations will initially be based on US margin requirements against Interactive Brokers. IB provides two forms of margin account: a Reg T account and a more sophisticated "portfolio margin" account. The latter takes into account hedging positions to offset risk. Both have non-trivial calculations for margin requirements and liquidation scenarios, which need to be factored in.
In addition, borrowing on margin incurs interest, which also involves non-trivial calculations across multiple currencies and depends upon external point-in-time rates (such as LIBOR). For an example of how Interactive Brokers calculates interest, take a look at their Interest Schedule page. Eventually, QSTrader will support margin and interest in this manner, which will then allow leveraged futures and f/x positions.
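To make the tiered nature of such interest calculations concrete, here is a simple sketch. The tier breakpoints and spreads are made-up placeholders, not Interactive Brokers' actual schedule; a real implementation would load point-in-time benchmark rates per currency:

```python
# Hypothetical tiers: (balance ceiling, spread over the benchmark rate)
TIERS = [
    (100_000.0, 0.015),    # first 100k of borrowed cash
    (1_000_000.0, 0.010),  # next 900k
    (float("inf"), 0.005), # remainder
]

def daily_margin_interest(borrowed, benchmark_rate, day_count=360):
    """Interest accrued for one day on a borrowed cash balance,
    applying each tier's spread only to the slice of the balance
    that falls within that tier."""
    interest = 0.0
    prev_cap = 0.0
    for cap, spread in TIERS:
        slice_amt = max(0.0, min(borrowed, cap) - prev_cap)
        interest += slice_amt * (benchmark_rate + spread) / day_count
        prev_cap = cap
        if borrowed <= cap:
            break
    return interest
```

Borrowing 200,000 at a 1% benchmark under these illustrative tiers accrues interest at 2.5% on the first 100k slice and 2.0% on the second, on an actual/360 basis.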
Commissions will be much more accurately calculated in the new version of QSTrader.
A particular broker in the UK that our team utilises for live trades provides a tiered commission structure, depending on total consideration and frequency of trading, with an introductory rate for the first three months of the account's life. In addition, most share transactions in the UK are subject to a 0.5% stamp duty, although some equities are exempt.
Given that commission is relatively expensive in the UK compared to the US, these costs need to be accurately calculated for a realistic backtest. For our own internal UK usage this has already been developed and works extremely well in our initial research backtests.
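A minimal sketch of such a UK cost model follows. The commission tiers are hypothetical examples, not any specific broker's real schedule; only the 0.5% stamp duty figure comes from the discussion above:

```python
def uk_purchase_cost(consideration, trades_last_month):
    """Estimated total dealing cost for a UK share purchase:
    a frequency-tiered flat commission plus 0.5% stamp duty.
    Tier values are illustrative placeholders."""
    if trades_last_month >= 20:
        commission = 5.95
    elif trades_last_month >= 10:
        commission = 8.95
    else:
        commission = 11.95
    stamp_duty = 0.005 * consideration  # 0.5% on most UK share purchases
    return commission + stamp_duty
```

On a 10,000 GBP purchase the stamp duty alone is 50 GBP, which illustrates why these costs must be modelled explicitly for a realistic UK backtest.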
This is clearly a single example among many. In the future it is hoped that we will be able to support multiple jurisdictions so that QSTrader will truly be an "international" backtester, rather than one heavily coupled to US or UK jurisdictional constraints.
Cashflow support is absent in the current version of QSTrader. An initial account equity is constructed and then no further cash can be added or removed over the lifetime of the strategy.
This is obviously quite unrealistic. High-end retail traders will potentially be withdrawing dividends and trading gains as income, or will be adding a regular cash injection in order to fund the purchase of new assets. Allocations and redemptions in funds occur frequently.
The new version of QSTrader supports cash transfers into and out of the broker, along with allocation across various sub-accounts in multiple currencies. Such a feature is crucial in supporting cash dividends, which are discussed below.
In the new version of QSTrader cash dividends are handled as direct injections of cash into a portfolio sub-account. This requires taking into account the ex-date of the asset, along with ensuring that the position was opened prior to the ex-date.
Eventually the goal is to also support more complex corporate actions such as stock-for-stock splits. By placing cash back into the sub-account in this way it is possible for the trading algorithm itself to choose how the cash is reinvested, rather than have this 'dictated' to the strategy via a total return series.
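The ex-date qualification check described above can be sketched as follows (function and parameter names are illustrative assumptions):

```python
import datetime as dt

def apply_cash_dividend(position_open_date, ex_date, shares, dividend_per_share):
    """Return the cash to inject into the sub-account for this dividend.
    The position qualifies only if it was opened strictly before the
    ex-date; otherwise no cash is credited."""
    if position_open_date < ex_date:
        return shares * dividend_per_share
    return 0.0
```

A position opened on or after the ex-date receives nothing, which is the behaviour the backtest must reproduce to avoid overstating dividend income.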
QSTrader currently handles market data through the use of a PriceHandler entity, which is iterated over to generate bar events. This mechanic has been completely rebuilt in the new version of QSTrader. In particular there is now a decoupling of bar data from trading events. Time-stamped events are now generated by a "simulation timer" entity, at a particular frequency (daily/minutely), that queries an Exchange for its opening hours in order to generate a series of events that the backtest will respond to.
Instead of working directly with "bars", the trading strategy entity now calls a method such as get_latest_price_volume(asset), which returns the latest market price as far as the broker understands it. Behind the scenes the bar data DataFrame is queried to produce the correct market price for a particular timestamp.
Effectively this means that the frequency of bar data provided and frequency of signal generation are decoupled. For instance it would be possible to use an hourly rebalance schedule on minutely bar data. This makes the transition from backtest to live trading much more straightforward.
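The "latest known price at an arbitrary timestamp" lookup behind this decoupling is essentially an as-of query. A minimal sketch, using a sorted list rather than a DataFrame and with invented class/method names, might look like:

```python
import bisect

class PriceStore:
    """Stores bars at one frequency and answers as-of price queries
    at any simulation timestamp, so signal generation need not run
    at the bar frequency."""
    def __init__(self, bars):
        # bars: list of (timestamp, close_price), sorted by timestamp
        self.times = [t for t, _ in bars]
        self.prices = [p for _, p in bars]

    def get_latest_price(self, ts):
        """Most recent price at or before ts; None if ts precedes
        the first bar (no lookahead into future bars)."""
        idx = bisect.bisect_right(self.times, ts) - 1
        return self.prices[idx] if idx >= 0 else None
```

An hourly rebalance schedule over minutely bars then simply calls get_latest_price at each hourly timestamp, exactly as it would query a live broker feed.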
The Exchange entity will also be aware of geographic-specific closing times/holidays both in the past and in the future, once again closing the "delta" between backtest simulation and live trading results.
In all previous iterations of our event-driven backtesters we have utilised a Strategy model to take in market data and generate Order objects. This is certainly sufficient for a simple while-loop event-driven system. However, it is insufficient for the research requirements of a modern small quant fund. A more sophisticated approach to forecasting is required.
In professional quant environments it is often the case that researchers work to develop "alphas", which are forecasts generated on a particular asset. These "alphas" are often combined into "weighted alphas" that are then fed into a portfolio construction model, along with risk management guidelines and transaction cost estimations for rebalances.
Hence the concept of a Signal has been eliminated and replaced with a Forecast, which is generated by an AlphaModel entity. This provides a multi-strategy approach to the system, where various AlphaModels can be combined to produce Forecast combinations. In particular the AlphaModel does not generate Order entities. This job is left to the PortfolioConstructionModel described below.
This framework is extremely flexible, as a Forecast entity is tagged with an asset, a floating point value (which could represent almost any type of "signal") and a date over which the forecast extends. Extensive examples will be provided within the codebase to outline how this would support typical quant strategies such as time-series momentum, stat-arb, factor construction and even alternative data based strategies.
Portfolio construction is arguably the most important step in creating a sophisticated quant model in institutional settings.
The new PortfolioConstructionModel (PCM) entity is designed to replace the RiskManager objects that previously acted on a set of Order objects generated by a Strategy. Instead the PCM takes in a handle to a RiskModel and a TransactionCostModel, both of which provide "opinions" on whether a set of Order instances should be modified, cancelled or added to. For example, the risk model may wish to introduce a hedge or reduce exposure to a particular market sector. The transaction cost model may estimate that the cost of a rebalance is too high compared to the expected return from the trade, in which case the order will be cancelled.
The job of the PCM is to weigh the "opinions" of the alpha forecasts, the risk model and the transaction cost estimator in order to construct an idealised, or desired, portfolio. This portfolio will then be compared against the current portfolio at the brokerage and a set of "delta" orders will be generated to update the broker portfolio to the desired portfolio.
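The delta-order step itself is straightforward to sketch. Representing orders as signed share quantities is an assumption here; the real system would emit proper Order entities:

```python
def generate_delta_orders(current, target):
    """Given current and target portfolios as dicts of
    asset -> share quantity, return the signed order quantities
    needed to move the current portfolio to the target."""
    orders = {}
    for asset in set(current) | set(target):
        delta = target.get(asset, 0) - current.get(asset, 0)
        if delta != 0:
            orders[asset] = delta  # positive = buy, negative = sell
    return orders
```

Assets absent from the target are fully liquidated, while new assets generate opening buys, so only the differences between the two portfolios ever reach the broker.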
Clearly there are many such ways that portfolio construction can occur. Some specific methods will be added to QSTrader as examples upon which you can build or modify for your own purposes. This includes basic structures such as equal weighting, fixed dollar proportion weighting and inverse-vol weighting (so called "risk parity"). It will also include more advanced systems such as mean-variance optimisation (based on ideas from Modern Portfolio Theory), a Black-Litterman approach and even newer techniques such as Marcos Lopez de Prado's Hierarchical Risk Parity.
By adding these examples and testing them thoroughly, it is hoped that end users will be able to modify them for their own needs and minimise their "time to market" for their own strategy development.
The previous PositionSizer classes were confusing to some users, so we have overhauled this process to reflect a more institutional-style "portfolio construction with risk controls" approach.
The new RiskModel hierarchy allows calculation of various risk metrics appropriate to the trading strategy employed. For instance, it could be designed to keep track of asset volatility through the use of historical standard deviations or by utilising a stochastic volatility model. It could also keep track of sector allocation risk and produce warnings if a sector exposure becomes too high.
In large asset allocation firms rebalances are often carried out on a weekly or monthly basis in order to minimise tracking error. There is a natural tension here: frequent rebalancing reduces tracking error but increases transaction costs. The new TransactionCostModel class hierarchy has been introduced in an attempt to quantify rebalance costs and thus suggest that portfolios be rebalanced only when the costs do not drag too significantly on portfolio performance.
Both of these models provide guidance to the PCM in order to aid its construction methodology. Clearly there will be some overlap for certain models. For instance, in a mean-variance optimiser, is it the job of the RiskModel or the PCM to calculate the covariance matrix between assets? The QSTrader framework provides sufficient flexibility to leave this decision up to the PM or retail trader managing the account/portfolio.
The current QSTrader backtest simulation in the TradingSession uses a simple while-loop event handler to dispatch events to the various components within the system.
The new version decouples the market events from the bar data, in order to allow "pre-market" and "post-market" events for each trading day. These events allow stock splits, cash dividends, investor cashflows, mark-to-market calculations and other broker/exchange related constraints to be calculated outside of the main intraday trading session.
Each simulated entity that needs to keep track of time also has an update method, which is called to ensure that information does not propagate from the future into the past (so-called lookahead bias). It also keeps all components "in sync".
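One simple guard this update mechanism can provide is a refusal to move backwards in time. The interface below is an assumption, not the final API:

```python
class SimulatedEntity:
    """Hypothetical base class for time-aware components: update()
    advances the entity's clock and rejects out-of-order timestamps,
    a basic defence against lookahead bias."""
    def __init__(self):
        self.current_time = None

    def update(self, ts):
        if self.current_time is not None and ts < self.current_time:
            raise ValueError("update() called with a timestamp in the past")
        self.current_time = ts
```

Driving every component's update from the single simulation timer then keeps broker, exchange and strategy clocks synchronised by construction.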
To date most of the Broker hierarchy has been constructed. This excludes margin, futures and f/x handling. However, it does include cashflows, commission plans for certain brokerages, basic handling of cash dividends and general position tracking. The Exchange hierarchy has also been developed, although specific calendar entities for the major exchanges, such as the NYSE and the LSE, have not yet been written. Bar data can be read into the system and queried. The PortfolioConstructionModel hierarchy has been partially developed, with an EqualWeightPCM and a FixedWeightPCM both tested.
Crucially, all currently committed code on our refactor branches now has 100% code coverage.
The development process for QSTrader has been fundamentally overhauled. Previously many in the team were communicating through Slack and adding pull requests for various desired features. There was no official versioning, nor was the repository added to PyPI in order to allow installation with pip.
Our team has now adopted the more advanced Gitflow Workflow for version control and continuous integration. This means that all feature development will occur independently in separate branches split off from a main develop branch. Periodically, these features will be merged into develop and a release-*.*.* branch will be created with a specific version number. This release branch will be a point-in-time branch consisting of test and documentation additions, but will crucially not allow any new features to be added for that release. When a release is ready to be shipped it will be merged into the master branch, along with the develop branch. This release will then be tagged with a specific version number that will be synchronised with PyPI to ensure proper versioning. At this stage new features will be added to develop and the process will continue.
One of the most significant changes to the development workflow is the fact that support for Python 2.7 will be dropped leaving only 3.4, 3.5 and 3.6 as supported versions. It is also likely that support for 3.4 will be dropped in the future if certain code features warrant it. This is less of a concern than it used to be given the wide acceptance of Continuum Analytics' Anaconda distribution, which allows a straightforward installation of a Python 3.6 scientific stack on Windows, Mac and Linux.
Another major change to the testing of QSTrader is that it will now require 100% code coverage in order for a release branch to be merged into master. This may seem like a tall order, but given the "mission criticality" of the system, it is crucial for every line of code to be covered by at least one unit test. We have worked hard to ensure that this is now the case for the new version and will continue to do so as development progresses.
Community pull requests are most certainly welcome, but we do ask that they are made against the develop branch, rather than master, in order to maintain the Gitflow Workflow.
We will also be listing a series of desired features and will be providing developer guidelines so that contributions can be more directed going forward.
At this stage most of the work has been added to our own internal private QSTrader repository, which is separate from the public QSTrader repository. In the next few weeks code will be made available on development branches for those who are interested in early-stage alpha testing.
Our aim is to have a well-tested system ready for beta testing by the end of this year or very early into 2018. I will also be providing more regular updates of the system development progress as the team continues to code up new features.
QSTrader is, and always will be, a freely available, commercially permissive, open source, community-driven project. A large proportion of the current work has been carried out by a dedicated team of volunteer developers.
If you would like to get involved in the development of QSTrader and help shape the future of the project, please get in touch at email@example.com and we will invite you to the developers Slack channel.
In the coming weeks the project page will be overhauled to focus on new version development and a list of desired features will be outlined. If you would like to contribute to any of those features please make yourself known either via email or via the Slack channel.
The QuantStart team are always extremely grateful for the contributions made by the community, especially given the volunteer nature of the work. We would like to thank everybody who has made such strong contributions to the project to date.
The team and I are certainly looking forward to learning how the community makes use of QSTrader, and we are eager to see how development is shaped over the coming months and years of the project.
-Mike