
OnData(Slice) and timing

Hi all,

I've been struggling with the OnData(slice) object for some time, and I just can't seem to wrap my head around it. Could someone perhaps help me with this hypothetical scenario?

Let's say I have two indicators that I want to update with data. First I set up my data feeds at daily resolution, and because they are custom data, I am using OnData(slice) rather than OnData(tradebars). Now, once both indicators are fully charged (i.e. ready), I want to trade on some comparison of those indicators. So in my OnData(slice) method, I put in some logic to update these indicators, and some logic that says once we're ready, go ahead and compare the indicator values and make trades. Pretty basic.

My question: How many times will OnData(slice) get called per day in this scenario? Is it once or twice?

Further: if one indicator needs to be charged with minutely data, and the other needs to be charged with daily data, does the OnData(slice) get called minutely? And if so, won't that mean that you need some checks in place to prevent your daily indicator from being "updated" with clones of daily data every minute?

Thanks very much for your answers in advance, and I guarantee you, there'll be more questions, as it seems I'm the slowest person on the planet with this type of thing...

Yo Stephen,
Seems like we're always answering each other's questions. I believe I can be of use here.

OnData(Slice slice) is driven by your subscriptions and fires at the fastest subscribed resolution. Even if you have two indicators at different resolutions, the fastest one wins: tick > second > minute > hour > day.

In this situation I think it would be best to use consolidators:

var minIndicator = RSI(symbol, 10, MovingAverageType.Exponential, Resolution.Minute);
var dailyIndicator = SMA(symbol, 1, Resolution.Daily);

// consolidate our minute resolution into daily bars
var consolidator = new TradeBarConsolidator(TimeSpan.FromDays(1));
consolidator.DataConsolidated += DailyHandler;
SubscriptionManager.AddConsolidator(symbol, consolidator);

public void DailyHandler(object sender, TradeBar data)
{
    // fires once per consolidated day; update the daily indicator here
    // (dailyIndicator needs to be a class field, not a local, for this to compile)
    dailyIndicator.Update(data.EndTime, data.Close);
}



Does that help? I know this kind of stuff can get confusing sometimes.

Hi Travis, many thanks. Yes, this helps immensely. I think this might be a good time for me to brush up on the consolidator.

I think I have a weird situation going on where some of my data feeds update at the end of the day, but one of my other data feeds only gets updated for the current day at 11am on the NEXT day, causing a potential synchronization issue. Looking into it now. I may (more like will) be back for more questions on this.. Thanks very much!


First, the difference between `OnData(slice)` and `OnData(tradebars)` is that `tradebars = slice.Bars`. I advise you to stick with the slice object, which is richer in information.

I haven't given it much thought, but I don't think you need consolidators. You just need some logic to check when new data from the lowest-resolution security/indicator has arrived, then apply your trading logic with the current value of the highest-resolution security/indicator.
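A minimal sketch of that check, assuming a daily custom feed subscribed under the symbol "VIX" and a minute-resolution equity "SPY" (both names, and the comparison logic, are illustrative):

```csharp
// Only run the trading logic on slices where the slow (daily) feed
// actually delivered a new data point; the fast feed is read from
// the Securities cache, which always holds its latest value.
public override void OnData(Slice slice)
{
    if (!slice.ContainsKey("VIX")) return;   // no new daily data this slice

    var dailyValue = slice["VIX"].Value;      // fresh daily custom point
    var fastPrice  = Securities["SPY"].Price; // latest minute-resolution price

    // ... compare indicators / place orders here ...
}
```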


The material on this website is provided for informational purposes only and does not constitute an offer to sell, a solicitation to buy, or a recommendation or endorsement for any security or strategy, nor does it constitute an offer to provide investment advisory services by QuantConnect. In addition, the material offers no opinion with respect to the suitability of any security or specific investment. QuantConnect makes no guarantees as to the accuracy or completeness of the views expressed in the website. The views are subject to change, and may have become unreliable for various reasons, including changes in market conditions or economic circumstances. All investments involve risk, including loss of principal. You should consult with an investment professional before making any investment decisions.


Nice answers :)
PS: "because they are custom data I am using OnData(slice) rather than OnData(tradebars)" - you can also have separate event handlers for your custom types if you wish. Like this:
public void OnData(MyCustomType data) { }




Thanks for the answers everyone.

@Alexandre - I'd very much prefer to use the Slice, but I don't feel like I'm in control of when data arrives in and leaves the slice. I'll explain in my question below.

@Jared - Thanks for pointing those methods out; I may end up needing to use them, as I never see a situation where all of my data is present and accounted for in the same slice. I.e. when my first and second data feeds are there, my third and fourth ones aren't. Then when my third and fourth ones arrive, the first and second are gone.

My problem is that I have no control over when my data is in my hands. I can't even tell what day the data is "for", as it seems to arrive nearly a day and a half later, and sometimes doesn't arrive at all. I believe this to be a limitation of Quandl, which doesn't enforce a uniform refresh cycle across its datasets. I'm left with more questions than before:

1.) When does data "expire" (i.e. leave the slice)? In live testing, all of my data feeds are set to "Daily", yet they never stay within the same slice at the same time. Some come and go and they are never "all in there" at the same time. Is there at least a way to determine what time each datafeed is "good for" so I can predict when it will leave?

2.) Live trading is behaving differently than backtesting. In backtesting, my slice gets called once per day and has all four data feeds in it. In live trading, it is called every second (I have no idea why it is doing this on daily resolution), and is flooding the log, complaining that it is constantly missing data feeds. I left it running for three days and it spun its wheels without accomplishing anything.

3.) Is there a way I can "pull data" from Quandl/QuandlFutures? I'm getting to the point where I think it might be better to just schedule an event at 1pm to look at all of the recent data for all four data feeds, when I know they are fully updated for the prior day. This would remove my headache with understanding this slice business.

4.) Live testing has been inconsistent in its ability to "warm up" fully before go time. One day it will warm up fully, the next it will only warm up 10% and needs to spin for several more days. I also chalk this up to inconsistency in the data feeds.

Very close to throwing my hands up and working this strategy manually, as I've been trying to understand the slice for over a month and I still don't understand how it works.


Hi @Stephen! There are a few key concepts you're missing, I think. Please share a project so we can give more concrete help; otherwise we're stabbing in the dark.

- Slice is a moment in time, and was designed to contain all the information you need. This is the best vehicle for accessing it all at once.

- However, custom data is not fill-forwarded by default, so if you want it present on every slice, set fill forward to true. This will copy old data forward when there's no new entry available. See the AddData() API:
AddData(symbol, resolution, fillDataForward: true, leverage: 1m);

- Quandl data is sporadically updated; we can't control that, sadly. When they don't have data it will either be copied forward or, if you have fill forward disabled, there will be no data. If it's sensitive for your strategy you could sign up with a commercial vendor whose data is updated more reliably, or use your own data source like Dropbox to control what data gets into the algorithm and when.

- Re: Warm up - the Quandl source you were using had several days without data, explaining why it was missing data during your warm up period. The same code is used for warm-up as the primary datafeeds.

- @Michael says the live trading slicing every second was a recent bug and he's pushed a fix to master.

I'd recommend turning off all fill-forward and making your algorithm resilient to missing data. When fill-forward is used it hides information (e.g. that an equity had no trades, or that a custom source delivered no new data).
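One way to sketch that resilience, assuming four feeds and a two-day staleness window (the dictionary, the window, and the count of 4 are all illustrative choices, not part of the original advice):

```csharp
// Cache the latest timestamp and value per feed; only trade when every
// feed has reported recently, so a missing Quandl update simply pauses
// the strategy instead of breaking it.
private readonly Dictionary<string, Tuple<DateTime, decimal>> _latest
    = new Dictionary<string, Tuple<DateTime, decimal>>();

public override void OnData(Slice slice)
{
    foreach (var point in slice.Values)
    {
        _latest[point.Symbol.ToString()] = Tuple.Create(point.EndTime, point.Price);
    }

    // all four feeds present, none older than two days
    var ready = _latest.Count == 4
        && _latest.Values.All(v => Time - v.Item1 <= TimeSpan.FromDays(2));

    if (ready)
    {
        // ... trading logic using _latest values ...
    }
}
```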

Hope that helps!




Ok so I'm going to attack this from the standpoint of a very simple algorithm and see if I can reproduce on paper what I'm saying is happening. This way, I might be able to help anybody who has similar issues after this point :-)

Attached is an algo. It is very simple. It has six daily resolution data feeds. Two are standard equities, two are quandls, two are quandl futures.

As per Jared's suggestion (thanks, by the way), I've created three separate OnData() methods, each designed to accept data from one of the three data types (standard, quandl, quandlfuture). This lets me take control of the data feed process rather than use Slice, lets me judge exactly when data comes into my hands, and means I can potentially hold onto it however long I want.
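The shape of those three handlers might look like this (the `Record` helper and the `QuandlFuture` subclass name are hypothetical; the attached algo is the authoritative version):

```csharp
// One typed handler per data source; each just timestamps and stores
// the latest value, which the end-of-day snapshot then reads back.
public void OnData(TradeBars data)        // standard equities
{
    foreach (var bar in data.Values) Record(bar.Symbol.ToString(), bar.EndTime, bar.Close);
}
public void OnData(Quandl data)           // quandl feeds
{
    Record(data.Symbol.ToString(), data.EndTime, data.Value);
}
public void OnData(QuandlFuture data)     // quandl futures (custom subclass)
{
    Record(data.Symbol.ToString(), data.EndTime, data.Value);
}

// hypothetical helper: store "time pumped" and price for the daily snapshot log
private void Record(string symbol, DateTime time, decimal value) { /* ... */ }
```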

The algo doesn't trade. It just takes data in. At the end of each day, it reports when data was received and what the most recent value of that data was.

The data snapshots at the end of each day look like this:

2012-03-15 00:00:00 DATA SNAPSHOT:
2012-03-15 00:00:00 Symbol: VXX Price: 339.68 Time Pumped: 03/14/2012 00:00 Time End: 03/14/2012 00:00
2012-03-15 00:00:00 Symbol: XIV Price: 10.13 Time Pumped: 03/14/2012 00:00 Time End: 03/14/2012 00:00
2012-03-15 00:00:00 Symbol: YAHOO/INDEX_GSPC Price: 1395.95 Time Pumped: 03/14/2012 00:00 Time End: 03/14/2012 00:00
2012-03-15 00:00:00 Symbol: YAHOO/INDEX_VIX Price: 14.73 Time Pumped: 03/14/2012 00:00 Time End: 03/14/2012 00:00
2012-03-15 00:00:00 Symbol: SCF/CBOE_VX1_EW Price: 17 Time Pumped: 03/14/2012 00:00 Time End: 03/14/2012 00:00
2012-03-15 00:00:00 Symbol: SCF/CBOE_VX2_FW Price: 23.2 Time Pumped: 03/14/2012 00:00 Time End: 03/14/2012 00:00


If you run this, the backtesting works perfectly, as the time pumped is always one day behind. Good.

I'm going to take this algo live and give an update as soon as I can on what the live engine is reporting for my quandl futures. I am hypothesizing that it is exactly one day out of phase with everything else.



Ah you posted right as I did, Jared! Sorry about that. :-)

As for this: However, custom data is not fill forwarded by default, so if you want it to be on every slice, set fill forward true. This will copy old data forward when there's no new entries available. See the AddData() API:
AddData(symbol, resolution, fillDataForward: true, leverage: 1m);


Hah, I think this is probably the key concept I was missing. For some reason, I had it in my head that fill-forwarding was always set to true by default... Yikes.

You are right on the best suggestion being that my algo should be made resilient to data gaps. I'll take a look into that. Will keep this thread up to date as I come to more conclusions on what is going on here...






This discussion is closed