Optimization

Parameters

Introduction

Parameter optimization is the process of finding the optimal algorithm parameters to maximize or minimize an objective function. For instance, you can optimize your indicator parameters to maximize the Sharpe ratio that your algorithm achieves over a backtest. Optimization can help you adjust your strategy to achieve better backtesting performance, but be wary of overfitting. If you select parameter values that model the past too closely, your algorithm may not be robust enough to perform well using out-of-sample data.

Parameters are project variables that your algorithm uses to define the value of internal variables like indicator arguments or the length of lookback windows.

Setting Values

Parameters are stored outside of your algorithm code, but we inject their values into your algorithm when you launch an optimization job. The optimizer adjusts the value of each project parameter across a range and step size that you define to minimize or maximize the objective function. To optimize parameters, add them to your project and call the GetParameter method in your code files. The GetParameter method returns the parameter value with the numeric type of the default value you provide. If you don't provide a default value, the method returns a string, and you need to cast it to the data type you need. If no parameter in your project matches the name you pass to the method and you provide a default value, the method returns the default value.

// C#
var defaultValue = 100;
var parameterValue = GetParameter(parameterName, defaultValue);        // numeric type inferred from the default
var castParameterValue = Convert.ToInt32(GetParameter(parameterName)); // no default: returns a string, cast it yourself

# Python
default_value = 100
parameter_value = self.GetParameter(parameter_name, default_value)   # numeric type inferred from the default
parameter_value = int(self.GetParameter(parameter_name))             # no default: returns a string, cast it yourself
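
For example, a parameter can drive an indicator argument. The following is a minimal sketch in Python; the parameter name "ema-period" and the SPY ticker are illustrative assumptions, not defaults of any project. Because the default value is an integer, GetParameter returns an integer and no cast is needed.

from AlgorithmImports import *

class ParameterizedAlgorithm(QCAlgorithm):

    def Initialize(self):
        self.SetStartDate(2020, 1, 1)
        self.SetEndDate(2021, 1, 1)
        symbol = self.AddEquity("SPY").Symbol

        # "ema-period" is a hypothetical parameter name; the optimizer
        # replaces the value 50 with each candidate value it tests.
        ema_period = self.GetParameter("ema-period", 50)
        self.ema = self.EMA(symbol, ema_period, Resolution.Daily)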

Overfitting

Overfitting occurs when a function is fit too closely to a limited set of training data. Overfitting can occur in your trading algorithms if you have many parameters or if you select parameter values that worked very well in the past but are sensitive to small changes in their values. In these cases, your algorithm is likely fine-tuned to fit the detail and noise of the historical data to the extent that it negatively impacts its live performance. (Image: examples of underfit, optimally fit, and overfit functions.)

An algorithm that is dynamic and generalizes to new data is more likely to survive across different market conditions and apply to other markets.

Look-Ahead Bias

Look-ahead bias occurs when an algorithm makes decisions using data that would not yet have been available. For instance, in optimization jobs, you optimize a set of parameters over a historical backtesting period. After the optimizer finds the optimal parameter values, the backtest period becomes part of the in-sample data. If you run a backtest over the same period using the optimal parameters, look-ahead bias has seeped into your research. In reality, you could not have known the optimal parameters during the testing period until after it was over. To avoid look-ahead bias, optimize on older historical data and test the optimal parameter values on recent historical data. Alternatively, apply walk-forward optimization to optimize the parameters on smaller, rolling batches of history, as sketched below.
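
As a rough illustration of the walk-forward idea, the following sketch splits a date range into rolling train/test windows. It is generic Python, not a LEAN API, and the window lengths are arbitrary assumptions: you would optimize parameters on each training window and then validate them on the test window that immediately follows it.

from datetime import date, timedelta

def walk_forward_windows(start, end, train_days, test_days):
    # Yield (train_start, train_end, test_end) tuples that roll forward
    # through history: optimize on [train_start, train_end), then
    # validate on [train_end, test_end).
    cursor = start
    while cursor + timedelta(days=train_days + test_days) <= end:
        train_end = cursor + timedelta(days=train_days)
        yield cursor, train_end, train_end + timedelta(days=test_days)
        cursor += timedelta(days=test_days)  # advance by one test window

for window in walk_forward_windows(date(2018, 1, 1), date(2021, 1, 1), 365, 90):
    print(window)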

Number of Parameters

The cloud optimizer can optimize up to 2 parameters. There are several reasons for this quota. First, the optimizer only supports the grid search strategy, which is very inefficient: it tests every permutation of parameter values, so the number of backtests that the optimization job must run explodes as you add more parameters, as the sketch below illustrates. Second, the parameter charts that display the optimization results are limited to two dimensions. Third, optimizing many variables increases the likelihood of overfitting to historical data.
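
To see why the backtest count explodes, consider this sketch with two hypothetical parameter ranges; the ranges themselves are arbitrary. Grid search runs one backtest per permutation, so each additional parameter multiplies the total.

from itertools import product

fast_periods = list(range(5, 55, 5))     # 10 candidate values
slow_periods = list(range(50, 250, 10))  # 20 candidate values

# Grid search runs one backtest per permutation of the two ranges.
combinations = list(product(fast_periods, slow_periods))
print(len(combinations))  # 200; a third 10-value parameter would make it 2,000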

We plan to upgrade the parameter charts on the optimization results page to support 3D surface charts. When we upgrade the visualization technology and add more efficient optimization strategies, we will increase the number of parameters you can optimize in the cloud. To optimize more parameters now, run local optimizations with the CLI.
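
For example, assuming you have set up the CLI and have a local project named "My Project" (a placeholder name), you can start a local optimization with a command along these lines:

lean optimize "My Project"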
