At the moment I have 2 CPUs / 8 GB RAM for backtesting. I am trying to develop an ML-based algo on top of my strategies. Does anyone have advice on how to size the nodes for backtesting the ML? The setup: 3 symbols, 25-35 features, minute data, and a training sample of ~TBD (I would like 2000). I am running a stripped-down test with only 250 training samples and it is taking forever. If it makes a difference, I am using both xgboost and a neural-net approach.
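In case it helps frame the question, here is a minimal sketch of the kind of standalone timing benchmark one could run to separate raw training cost from backtest-engine overhead. Everything in it is a placeholder: random data stands in for my real features, and the hyperparameters are just illustrative defaults, not my actual config.

```python
# Rough standalone benchmark to estimate xgboost training cost before sizing a node.
# Synthetic data only: random features stand in for the real 25-35 features,
# and the sample counts mirror the 250-sample test vs. the ~2000-sample target.
import time

import numpy as np
from xgboost import XGBClassifier

N_FEATURES = 30  # within the 25-35 feature range

for n_samples in (250, 2000):
    X = np.random.rand(n_samples, N_FEATURES)
    y = np.random.randint(0, 2, size=n_samples)

    model = XGBClassifier(
        n_estimators=200,   # placeholder hyperparameters
        max_depth=6,
        n_jobs=2,           # match the 2 CPUs on the current node
        tree_method="hist",
    )
    start = time.perf_counter()
    model.fit(X, y)
    print(f"{n_samples} samples: {time.perf_counter() - start:.2f}s")
```

If a run like this finishes in seconds outside the backtester, that would suggest the bottleneck is how often the model is retrained inside the backtest loop rather than the node size itself.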
