I get the following error when trying to run my algorithm on timeframes longer than 2 months across all tickers above $20M in dollar volume (about 1,400 tickers). Reducing that number wouldn't help the problem much, and I can't reduce it anyway because of my strategy.
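
For reference, this is roughly how that universe gets selected (a minimal sketch: the CoarseSelectionFunction shown here is a placeholder for my actual, more involved filter):

def CoarseSelectionFunction(self, coarse):
    # Keep every ticker with more than $20M in dollar volume (~1,400 symbols)
    return [c.Symbol for c in coarse if c.DollarVolume > 20e6]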

Runtime Error: Exception of type 'System.OutOfMemoryException' was thrown. in SubscriptionSynchronizer.cs:line 130

This problem has nothing to do with RAM usage, since only 15% of my RAM gets utilized during the backtests. It apparently stems from the results being too large to upload to QuantConnect: the backtest may have hit the 700MB data limit, which makes the upload time out.

Alex from support was able to help improve the efficiency of my algorithm, but only so much.

I am using this logic to add the Benzinga news data for each security in my universe:

def OnSecuritiesChanged(self, changes):
    for security in changes.AddedSecurities:
        # Subscribe to the BenzingaNews feed for each security entering the
        # universe, and remember the linked news Symbol for later removal
        self.benzingas[security.Symbol] = self.AddData(BenzingaNews, security.Symbol).Symbol

    for security in changes.RemovedSecurities:
        if security.Symbol in self.benzingas:
            # Drop the news subscription when the security leaves the universe
            self.RemoveSecurity(self.benzingas[security.Symbol])
            del self.benzingas[security.Symbol]
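
For completeness, self.benzingas is a plain dict set up in Initialize. A minimal sketch, where the dates, cash, and universe hookup are placeholders rather than my real settings:

def Initialize(self):
    self.SetStartDate(2021, 1, 1)   # placeholder backtest window
    self.SetEndDate(2021, 3, 1)
    self.SetCash(100000)
    self.AddUniverse(self.CoarseSelectionFunction)
    # Maps each equity Symbol to the Symbol of its BenzingaNews subscription
    self.benzingas = {}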

How can I avoid hitting this 700MB data limit in my backtests? What causes the most data usage in backtests?
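
One idea would be to cap how many Benzinga subscriptions are live at once, sketched below; the cap of 500 is an arbitrary value I made up, and I don't know whether fewer subscriptions actually shrink the uploaded results:

def OnSecuritiesChanged(self, changes):
    max_news_subscriptions = 500  # assumed tuning knob, not a platform setting
    for security in changes.AddedSecurities:
        # Stop adding news feeds once the cap is reached
        if len(self.benzingas) >= max_news_subscriptions:
            break
        self.benzingas[security.Symbol] = self.AddData(BenzingaNews, security.Symbol).Symbol

    for security in changes.RemovedSecurities:
        if security.Symbol in self.benzingas:
            self.RemoveSecurity(self.benzingas[security.Symbol])
            del self.benzingas[security.Symbol]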

Just thinking out loud: is there an easy way to give my backtests lookahead bias, so they only subscribe to tickers that will appear in the Benzinga dataset on the next day?
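
The closest thing I can build myself is a trailing filter rather than true lookahead: keep only symbols whose news feed produced something recently. A sketch, with HasRecentNews as a hypothetical helper and an arbitrary 5-day window:

from datetime import timedelta

def HasRecentNews(self, equity_symbol):
    # Hypothetical helper: backward-looking check on whether this symbol's
    # Benzinga feed produced anything in the last 5 days (arbitrary window)
    news_symbol = self.benzingas.get(equity_symbol)
    if news_symbol is None:
        return False
    history = self.History(BenzingaNews, news_symbol, timedelta(days=5))
    return not history.empty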