I'm testing a strategy that requires loading many years of historical minute data upfront. This is of course a lot of data, and unfortunately loading it for more than half a dozen equities takes longer than the 5-minute timeout limit:

Algorithm.Initialize() Error: Execution Security Error: Operation timed out - 5 minutes max. Check for recursive loops.

Using the 16GB 8-core nodes did not reduce the loading time, so I'm guessing the bottleneck is not RAM but rather clock speed?

I've already tried breaking the History requests into a few chunks. That cut the loading time in half and allowed me to load more data, but I haven't been able to get the loading any faster than about 20 years of equity minute data per minute of runtime. My current thinking is that this could be acceptable if I could somehow get more than 5 minutes, or if I could load the data into the Object Store.
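For anyone curious what "breaking into chunks" means concretely, the splitting step is just list slicing. Here is a minimal, dependency-free sketch (plain Python, no QuantConnect objects; the symbol list is the one from my algorithm):

```python
def chunked(items, size):
    """Split a list into consecutive chunks of at most `size` elements."""
    return [items[i:i + size] for i in range(0, len(items), size)]

symbols = ["AXAS", "COP", "COG", "OXY", "KMI", "CPE", "CVX", "BP", "EOG", "EPD"]
chunks = chunked(symbols, 3)
# → [['AXAS', 'COP', 'COG'], ['OXY', 'KMI', 'CPE'], ['CVX', 'BP', 'EOG'], ['EPD']]
```

Each inner list is then passed to a separate self.History() call, so no single request has to materialize the full DataFrame at once.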

What are my options?

If I'm already pushing the limits, could I instead load a serialized copy of the historical minute data for dozens of equities into the Object Store?
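To make the Object Store idea concrete, here is a sketch of the serialization round trip I have in mind. The pickling part is plain Python you can verify anywhere; the `self.ObjectStore.SaveBytes` / `ReadBytes` calls in the comments are my understanding of the QuantConnect API for binary payloads, so treat them as an assumption. The bar data below is made up for illustration:

```python
import pickle

# Hypothetical per-symbol minute bars:
# {symbol: [(time, open, high, low, close, volume), ...]}
bars = {
    "COP": [("2021-01-04 09:31", 41.20, 41.30, 41.10, 41.25, 12000)],
    "CVX": [("2021-01-04 09:31", 86.00, 86.10, 85.90, 86.05, 9000)],
}

payload = pickle.dumps(bars)  # bytes, ready to be stored
# In the algorithm, this would look roughly like (assumed API):
#   self.ObjectStore.SaveBytes("minute-history", payload)
# and on a later run:
#   payload = bytes(self.ObjectStore.ReadBytes("minute-history"))

restored = pickle.loads(payload)
assert restored == bars  # lossless round trip
```

If this works within the Object Store size limits, later backtests could skip the History calls entirely and deserialize instead.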


In case anyone running into this problem in the future would like to know how to break similarly large History requests into smaller chunks: this Community has helped me a lot, and I would like to start giving back. Here's how I did it…

#### Your list of equities here.
self.EquitySymbols = ["AXAS","COP","COG","OXY","KMI","CPE","CVX","BP","EOG","EPD"]
self.EquityHistories = []
self.YearsOfHistoricalData = 10

#### Break self.History() requests into chunks.
chunksize = 3 # For 10 years of minute data, only 3 stocks can be collected at a time. Looks to be a RAM limitation.
SymbolChunks = [self.EquitySymbols[i:i + chunksize] for i in range(0, len(self.EquitySymbols), chunksize)]
all_histories = []
##  Manually append rather than inlining to show live progress in Debug output
for c in SymbolChunks:
	##  Get historical minute data directly from QuantConnect, since it's provided for free.
	all_histories.append( self.History( [self.Symbol(i) for i in c], timedelta(weeks=round(self.YearsOfHistoricalData*52)), Resolution.Minute ) )
	self.Debug(f"Done collecting History for chunk of symbols:  {c}")

#### Here is how I access the Historical data.
for symbol in self.EquitySymbols:
	##  Grab our minute data. Could also use next(iter(...)), but this is clearer given the added complexity of chunking.
	for chunk, histories in zip(SymbolChunks, all_histories):
		if symbol in chunk:
			history = histories.loc[symbol]
			# Do we need to store all of this data? Below, only keep what is needed.
			history = history[["open", "high", "low", "close", "volume"]]
			break
	#### Preprocess history further as needed.
	self.EquityHistories.append( history )
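One design note on the access step above: instead of scanning every chunk for each symbol, you can build a symbol-to-history map once after loading and then do constant-time lookups. A dependency-free sketch of the idea, with plain strings standing in for the chunk DataFrames:

```python
def build_history_map(symbol_chunks, chunk_histories):
    """Map each symbol to the history object of the chunk it was loaded in."""
    history_by_symbol = {}
    for chunk, histories in zip(symbol_chunks, chunk_histories):
        for symbol in chunk:
            history_by_symbol[symbol] = histories
    return history_by_symbol

chunks = [["AXAS", "COP", "COG"], ["OXY", "KMI", "CPE"]]
fake_histories = ["hist0", "hist1"]  # placeholders for the chunk DataFrames
lookup = build_history_map(chunks, fake_histories)
# lookup["KMI"] points at the second chunk's history
```

In the algorithm, `chunk_histories` would be `all_histories` and each lookup would be followed by `.loc[symbol]` exactly as in the loop above; the behavior is the same, just without the nested search.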
