N00b question - how to issue historical orders during a backtest?

How can I have my algorithm issue orders at a given point in time, in the past?

Context: I'd like to test a strategy by feeding it a time series of data and letting it place orders. These orders have to happen at some point in time, and I'm confused about how to specify when an order should be issued.

For example, if we have data for the month of May and the algorithm is able to trade every day, how do I specify that a trade is being requested for May 5th and not May 6th?

Ideally I'd like to use a learning model - in this case the model should "observe" the backtest data sequentially, issue an order, and "evaluate" its performance on future backtest data. I'm confused about how to achieve this in QC.


The material on this website is provided for informational purposes only and does not constitute an offer to sell, a solicitation to buy, or a recommendation or endorsement for any security or strategy, nor does it constitute an offer to provide investment advisory services by QuantConnect. In addition, the material offers no opinion with respect to the suitability of any security or specific investment. QuantConnect makes no guarantees as to the accuracy or completeness of the views expressed in the website. The views are subject to change, and may have become unreliable for various reasons, including changes in market conditions or economic circumstances. All investments involve risk, including loss of principal. You should consult with an investment professional before making any investment decisions.


LEAN is an event-driven backtester. You can think of it as a big while loop in which data is streamed in sequentially (like a video game: you see a box -> you interact with the box -> the box opens). This is different from vectorized backtesting, where we simply loop over the past and compute results in bulk (we already know there were 10 boxes in the past, so we loop through time and open all of them). An important distinction is that at time t, the algorithm can only access information up to time t - just like live trading. Being able to evaluate performance on future backtest data will skew your results and is known as look-ahead bias.
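To make the event-driven picture concrete, here is a minimal sketch in plain Python (not LEAN - the bars and the simple moving-average rule are made up for illustration): the loop replays bars one at a time, and each decision can only use the history accumulated up to that point.

```python
def run_event_driven(bars):
    """Replay bars sequentially; each decision sees only data up to time t."""
    history, decisions = [], []
    for timestamp, close in bars:
        history.append(close)                      # everything known at time t
        mean_so_far = sum(history) / len(history)  # a statistic of the past only
        decisions.append((timestamp, "buy" if close > mean_so_far else "hold"))
    return decisions

# Made-up daily closes
bars = [("2020-05-04", 283.6), ("2020-05-05", 286.2), ("2020-05-06", 284.3)]
print(run_event_driven(bars))
```

Note that the decision on 2020-05-05 cannot peek at the 2020-05-06 bar; a vectorized backtest would have the whole `bars` array in hand from the start, which is exactly where look-ahead bias can creep in.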

So to specify that a trade is requested for May 5th, you can simply do:

from datetime import date

def OnData(self, data):
    """
    Data is streamed in based on the requested resolution,
    and this method is called every time a new slice arrives.
    """
    # self.Time is the algorithm's current (simulated) datetime
    if self.Time.date() == date(2020, 5, 5):
        self.SetHoldings('SPY', 1)

This page may be helpful in understanding how time flows through a backtest.

Thanks Adam, I have a feeling the "Understanding Time" page will clear things up for me - I appreciate it.

I'm coming from a world of more traditional DNN training, where we have training steps and inference steps. When I say "evaluate" I mean calculate the return for the investment decision made, so that the loss function can update the weights in the net.

I don't think this is the same as look-ahead bias, but some quick googling suggests the concept may be more subtle than I think. What do you make of this toy example: you want to train a net to predict the price of some equity one week in the future.

As you backtest it, it needs to: (a) observe data up to point t, (b) make a prediction for the price of an equity one week later, and (c) evaluate the loss function given the true price at time t+one_week.
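Outside of LEAN, steps (a)-(c) amount to a standard supervised setup: pair what the model observes at time t with the realized price at t + one_week when building the training set. A minimal NumPy sketch (the prices and the 5-trading-day horizon are made-up assumptions) - note the future is used only to label past examples, never to inform a decision at time t:

```python
import numpy as np

def make_supervised_pairs(prices, horizon=5):
    """
    Pair the price observed at time t with the realized price at
    t + horizon (5 trading days ~ one week), so a net can be trained
    to predict one week ahead and its loss evaluated on the realized value.
    """
    prices = np.asarray(prices, dtype=float)
    X = prices[:-horizon]   # (a) what the model observes at time t
    y = prices[horizon:]    # (c) the realized price one week later
    return X, y

# Made-up closes for illustration
closes = [100, 101, 99, 102, 103, 105, 104, 106, 108, 107]
X, y = make_supervised_pairs(closes, horizon=5)
# X[0] = 100 pairs with y[0] = 105, the price five trading days later
```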

How would you accomplish this kind of training in QC?

Hey Bobo! Please check out Boot Camp! I am 100% sure with your experience it'll click after doing a few more courses =)


That makes more sense. As Jared mentioned, definitely check out the Boot Camp first to get a feel for the QC API and pore over the Documentation afterwards for more details.

For the toy example, most likely you will want to implement some sort of training process outside of market hours to train the model. Look up the sections on Scheduled Functions, Historical Data, and Machine Learning in the Docs for this. In the scheduled function, you can request a DataFrame of historical data and train the model in a supervised manner, similar to what you are likely used to.

After the model is trained, you can store it and use that to make one-step predictions as data is streamed in via .OnData(). You should also keep in mind model complexity/training times on QC, see this comment.
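A rough sketch of how those pieces might fit together in a classic-API QC algorithm. The "model" here is a trivial placeholder (the mean of recent closes standing in for a trained net), and the schedule, lookback window, and thresholds are all made-up assumptions - check the Docs for the exact Train/History signatures before relying on them:

```python
from AlgorithmImports import *  # umbrella import available in the QC environment

class WeeklyPredictionSketch(QCAlgorithm):
    """Sketch: retrain outside market hours, predict as data streams in."""

    def Initialize(self):
        self.SetStartDate(2020, 5, 1)
        self.SetEndDate(2020, 5, 31)
        self.symbol = self.AddEquity("SPY", Resolution.Daily).Symbol
        self.model = None
        # Retrain every Sunday evening, outside market hours
        self.Train(self.DateRules.Every(DayOfWeek.Sunday),
                   self.TimeRules.At(20, 0),
                   self.TrainModel)

    def TrainModel(self):
        # History only returns data up to the algorithm's current time,
        # so training here cannot leak future bars
        history = self.History(self.symbol, 60, Resolution.Daily)
        if history.empty:
            return
        closes = history["close"].values
        # Placeholder "model": swap in your net's fit() call here
        self.model = closes.mean()

    def OnData(self, data):
        if self.model is None or not data.ContainsKey(self.symbol):
            return
        price = data[self.symbol].Close
        # One-step prediction using the stored model
        self.SetHoldings(self.symbol, 1 if price < self.model else 0)
```

The key design point is the separation: `TrainModel` runs on a schedule with a historical DataFrame (your supervised training step), while `OnData` only performs inference with whatever model was last stored.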

This discussion is closed