Memory Utilization Leak?

I am occasionally getting the "Memory Usage Maxed Out - 4096MB max" error when backtesting a Daily-resolution algorithm, so I started logging the memory utilization (GC.GetTotalMemory) at the end of each day.

The interesting thing is that the same algorithm with the same parameters, making the same trades, sees vastly different memory utilization on different runs. Most of the time, my memory utilization holds an average of around 1G. On some runs, however, memory utilization starts at close to 2M after Day 1 and steadily creeps upward, as if garbage collection on the VM instance isn't keeping up.
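For reference, a minimal stand-in for that kind of per-day logging, written against the Python stdlib rather than .NET's GC.GetTotalMemory (the resource module is POSIX-only, and ru_maxrss units differ: kilobytes on Linux, bytes on macOS):

```python
import gc
import resource

def log_memory(tag):
    # Peak resident set size of this process; units are platform-dependent
    # (kilobytes on Linux, bytes on macOS).
    peak_rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    # Count of objects currently tracked by the cyclic garbage collector;
    # a count that grows steadily across days hints at retained references.
    tracked = len(gc.get_objects())
    print(f"{tag}: peak_rss={peak_rss} tracked_objects={tracked}")
    return peak_rss, tracked

peak, tracked = log_memory("end of day")
```

Sampling both numbers at the same point each day makes it easy to tell a flat run from a creeping one.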

Any ideas/suggestions?

Thanks.

Both the Mono and .NET GCs are quite "lazy" and will avoid collecting garbage until the limits of available system memory make it necessary. Even if you try to force a collection by calling the methods for that on the GC class, the call is only taken as a hint and may be ignored. Overall, collection is very non-deterministic in terms of total observed memory use and the stalls incurred in program execution.
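Because a forced collection is only a hint, it helps to verify whether a particular object was actually reclaimed rather than trusting the call; a weak reference does exactly that. A sketch in Python, where CPython's reference counting makes the outcome deterministic (unlike Mono/.NET, where GC.Collect() may defer the work):

```python
import gc
import weakref

class Buffer:
    """Stand-in for a large object we expect to be reclaimed."""
    pass

buf = Buffer()
probe = weakref.ref(buf)     # observes the object without keeping it alive
del buf                      # drop the last strong reference
gc.collect()                 # request a collection (only a hint in .NET)
reclaimed = probe() is None  # a weak ref returns None once the object is gone
```

The same pattern works in .NET with WeakReference: if the weak reference still resolves after a forced collection, the object is being kept alive somewhere.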

The best advice I can give is to make sure class objects (primitives and structs don't count) either get garbage collected immediately or live "forever" (through any period of execution you don't want interrupted). What actually happens, at least in .NET, is that on each collection a surviving object is promoted a generation. Only generations 0 and 1 are collected efficiently; storage of generation 2 objects is optimized on the assumption that they will survive for a long time. If you're creating a lot of objects that are promoted to generation 2 and then die, it may cause performance issues.
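The survive-and-promote mechanics can be observed directly in any generational collector. A sketch using CPython's gc module (also generational, with three generations; the .NET details differ, but the promotion behavior is analogous):

```python
import gc

class Node:
    pass

gc.collect()                           # settle older generations first
kept = [Node() for _ in range(100)]    # stays referenced: will survive
temp = [Node() for _ in range(100)]    # becomes garbage below
del temp                               # drop the short-lived batch

gc.collect(0)  # sweep only generation 0: surviving objects (the kept
               # Nodes) are promoted to generation 1
promoted = any(obj is kept[0] for obj in gc.get_objects(generation=1))
```

Here the short-lived batch never costs an older-generation collection, while the long-lived objects are promoted out of the frequently swept generation; the expensive case is exactly the one described above, objects that reach the oldest generation and then die.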

Lean itself does generate a fair amount of garbage as far as I can tell, so the extent to which you can impact this is limited. I've considered looking into it, but out-of-memory issues in backtests have become so uncommon that I don't think it's necessary anymore.


Thanks Peter. I get that garbage collection is not deterministic, but I would expect memory utilization for the same code with the same inputs to follow the same general trend. I'm tempted to create a quick plot of memory utilization over time to show how different my two samples are: one seems basically flat, and the other starts higher and has a steep upward slope until it hits the limit. It makes me think it's more of a framework/infrastructure issue than something I have much control over.
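One way to quantify the difference between those two runs, rather than eyeballing a plot, is the least-squares slope of the daily samples (the numbers below are illustrative, not the actual readings):

```python
def slope(samples):
    # Ordinary least-squares slope of samples against their index:
    # near zero for a flat run, large and positive for a steady leak.
    n = len(samples)
    mean_x = (n - 1) / 2
    mean_y = sum(samples) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(samples))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

flat = [1000, 1010, 990, 1005, 995]     # MB per day: roughly constant
leaky = [2000, 2400, 2800, 3250, 3700]  # MB per day: steady climb
```

A near-zero slope on one run and a slope of hundreds of MB/day on another, from identical code and inputs, is the kind of evidence that points at the environment rather than the algorithm.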

But, as you said, it doesn't happen a lot, at least.


Thanks Nate and Petter,

We have large, fast, dedicated servers running the backtesting, and we run up to 10 backtests on a server at a time (before moving to another one). We've been looking into this issue for the last year, but it's so rare it's hard to diagnose. We have a hunch that when there is a heavy load (say, all servers maxed for a little while) the GC runs slower or gets pushed to the background. The servers themselves have 128GB-256GB of RAM, so I think C# decides "Hey, we have plenty of RAM, let's do this GC later." Then when it reaches 4GB in software we kill the backtest to preserve the RAM for the community.

We're addressing this in two ways:

1 - We'll soon allow you to upgrade backtesting RAM in steps of 4GB. This should be ready soon and eliminate 99.9% of issues for people unless they're loading large amounts of data into RAM.

2 - In the next couple of months we're moving to a new public cloud with a CPU-based load-balancing system, which should mean there are always spare CPU cycles available and hopefully eliminate this issue. The downside is that the public clouds are half the speed of our beautifully tuned, overclocked 4.7GHz water-cooled dedicated boxes. So we're going to continue offering the water-cooled boxes as an upgrade; they will basically be a fast swim lane.


The material on this website is provided for informational purposes only and does not constitute an offer to sell, a solicitation to buy, or a recommendation or endorsement for any security or strategy, nor does it constitute an offer to provide investment advisory services by QuantConnect. In addition, the material offers no opinion with respect to the suitability of any security or specific investment. QuantConnect makes no guarantees as to the accuracy or completeness of the views expressed in the website. The views are subject to change, and may have become unreliable for various reasons, including changes in market conditions or economic circumstances. All investments involve risk, including loss of principal. You should consult with an investment professional before making any investment decisions.


Hello,

My algorithms are really suffering from this issue.
50% of my backtests have been stopped in the last 3 weeks because of the RAM usage limit.
As Nate mentioned, memory usage in one backtest is flat (~120MB), while in another it suddenly starts to grow up to the limit (above 12000MB).
The same happens within a single backtest: at one point it's flat, then it grows, then it's flat again.
It's completely random. When running two backtests at the same time, the issue almost always appears.

Classifier 0 fully trained - 3.98973894119 - RAM: 701
Classifier 1 fully trained - 4.01481199265 - RAM: 1888
...
Classifier 21 fully trained - 20.2698619366 - RAM: 12271
Classifier 22 fully trained - 20.7529881001 - RAM: 12831
Runtime Error: System.Exception: Execution Security Error: Memory Usage Maxed Out - 12288MB max, with last sample of 13468MB.

I'm attaching an example algorithm showing the general idea of what my algorithms do.
I hope it will help track down and solve the problem.
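Since the growth in that log appears between classifier fits, one mitigation worth trying is to drop each round's intermediate training state explicitly and collect between rounds. A generic sketch, where train_one is a hypothetical stand-in for whatever produces each classifier plus its temporaries:

```python
import gc

def train_all(n, train_one):
    # Keep only the fitted models; release each round's large temporaries
    # before the next round so they cannot pile up across iterations.
    models = []
    for i in range(n):
        model, scratch = train_one(i)  # hypothetical: returns model + temporaries
        models.append(model)
        del scratch                    # drop the big intermediate state now
        gc.collect()                   # ask for a collection between rounds
    return models
```

This doesn't fix a runtime whose collector is deferring work, but it removes the algorithm's own contribution so that any remaining growth is clearly on the infrastructure side.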







This discussion is closed