Hi,

I have created a simple strategy that just collects prices for 11 tickers, 5 minutes before market close, using a scheduled event. (The idea is that this serves as a template for "doing stuff" intra-day at specific times for specific symbols.)

Resolution is 1 min, but only one scheduled event fires per day (5 mins before close).
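
For context, the setup side is roughly the standard pattern sketched here (the full code is in the attached backtest). The class name, dates and ticker list below are placeholders, the real list has 11 tickers, and self.symbols is the same list used in get_data further down:

from AlgorithmImports import *

class PriceSnapshotAlgorithm(QCAlgorithm):   # placeholder name

    def Initialize(self):
        self.SetStartDate(2021, 1, 1)        # illustrative dates only
        self.SetEndDate(2021, 9, 30)
        self.SetCash(100000)
        # subscribe each ticker at minute resolution and keep its Symbol
        tickers = ["SPY", "QQQ", "AAPL"]     # placeholder list; the real strategy uses 11 tickers
        self.symbols = [self.AddEquity(t, Resolution.Minute).Symbol for t in tickers]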

Time stats by backtest length (B2-8 node; CPU at 100%, but RAM < 200 MB):

 - 1 month = 20 seconds (⅓ of a minute)

 - 3 months = 180s = 3 mins (9x the 1-month time)

 - 9 months = 1800s = 30 mins (10x the 3-month time)

I expect a 9-month backtest to take (approximately) 9x the time of a 1-month backtest. Even if it took 20x, that would be OK. But roughly 90x??
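
To make the scaling concrete, here is a quick check of the ratios (numbers copied from the list above):

# wall-clock seconds per backtest length, from the timings listed above
runtimes = {1: 20, 3: 180, 9: 1800}   # months -> seconds
for months, secs in runtimes.items():
    print(f"{months} month(s): {secs / runtimes[1]:.0f}x the 1-month time")
# prints 1x, 9x and 90x, i.e. far worse than the 1x/3x/9x that linear scaling would give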

I thought it must be memory/garbage collection, but memory is fine (< 200 MB). (There is some improvement if I don't use self.Debug, but my main point still holds.)

I can only assume that I am doing something fundamentally dumb. Can someone please suggest how to fix something so fundamental? Thanks.

Explanation of code:

This is all it is doing (full code in the attached backtest):

def OnData(self, data):
    # schedule the snapshot for 5 minutes before SPY's close
    self.Schedule.On(
        self.DateRules.EveryDay("SPY"),
        self.TimeRules.BeforeMarketClose("SPY", 5),
        self.get_data
    )

def get_data(self):
    # now get latest prices
    T = self.Time
    price_list = []
    for symbol in self.symbols:
        price_list.append(self.Securities[symbol.Value].Price)

    self.Debug(str(T) + ' ' + str(price_list))