The algorithm parameters (a.k.a. project parameters) are currently used only as default values, and are only useful when running the optimiser. I think that with minimal adjustments they could also be used for standard backtests, adding great value!

  • Given that the parameters are stored along with the project, they cannot really be called parameters: changing them in order to run a backtest actually changes the state of the project.
  • Currently, writing GetParameter("param-name") after adding "param-name" as a parameter but leaving its value as an empty string causes an exception whenever the algorithm starts. I suggest instead that if some parameters have no default value, clicking the backtest button should display a form for filling them in (or, with the lean cloud backtest command, an option such as --parameters that accepts them as a string or a file path).
  • Currently, backtest results leave no trace of which parameter values were used. This could be vital for reconstructing how a given backtest result was achieved. Since we already have access to the code for a specific backtest run, it would only make sense to also have a page listing the parameters that were used. (By the way, a snapshot of the library / shared code used at that time would also be important to keep, as it might change over time.)
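To illustrate the second point, here is a minimal sketch in plain Python (not the LEAN API; the function name and parameter map schema are made up) of the pre-backtest check I have in mind: collect the parameters whose value was left empty and ask for them up front, instead of raising an exception once the algorithm has already started.

```python
def find_missing_parameters(parameters):
    """Return the names of parameters left without a value.

    `parameters` mimics the project's parameter map (name -> value);
    this helper is hypothetical, not part of LEAN.
    """
    return [name for name, value in parameters.items()
            if value is None or str(value).strip() == ""]


# Before launching, fail fast (or show a form / prompt) rather than
# crashing mid-start:
project_parameters = {"ema-fast": "10", "ema-slow": ""}
missing = find_missing_parameters(project_parameters)
if missing:
    print("Please provide values for: " + ", ".join(missing))
```

The same check could back both the web form and a hypothetical --parameters option in the CLI.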
     
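For the third point, the idea could be as simple as storing the parameter map next to the result payload, plus a short deterministic fingerprint to identify the run at a glance. A hypothetical sketch (the result schema and field names are all made up, not an existing LEAN format):

```python
import hashlib
import json


def tag_backtest(result, parameters):
    """Attach the parameter values used for a run, plus a short
    deterministic fingerprint, to a backtest result record.

    Both the result schema and the field names are hypothetical.
    """
    blob = json.dumps(parameters, sort_keys=True).encode()
    result["parameters"] = dict(parameters)
    result["parameters-hash"] = hashlib.sha1(blob).hexdigest()[:8]
    return result


run = tag_backtest({"statistics": {"Sharpe": "1.2"}},
                   {"ema-fast": "10", "ema-slow": "50"})
```

Identical parameters always produce the same fingerprint, so two runs can be compared without opening the code.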

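And once parameter values are stored with every run, searching by parameter value becomes almost free. A hypothetical sketch over runs stored with a "parameters" mapping (again, the schema is made up):

```python
def find_runs(runs, criteria):
    """Return the runs whose stored parameters match every criterion.

    Each run is a dict with a "parameters" mapping; the schema is
    hypothetical, not an existing LEAN structure.
    """
    return [run for run in runs
            if all(run.get("parameters", {}).get(key) == value
                   for key, value in criteria.items())]


runs = [
    {"name": "run-1", "parameters": {"ema-fast": "10", "ema-slow": "50"}},
    {"name": "run-2", "parameters": {"ema-fast": "20", "ema-slow": "50"}},
]
matches = find_runs(runs, {"ema-fast": "10"})
```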
This would make it much easier to identify a backtest run: currently I use titles to describe which parameters were used, and if there are too many I have to go and look at the code. Also, together with being able to lock a project, this would make it far more realistic and safer to collaborate on a project with somebody who is not very familiar with, or is scared to touch, autosaving code ;) In the future it could also allow searching for a backtest run by parameter value, and many other things. What do you think?