Hi All!

Thank you for taking the time to read this. This is a question about what seems to be a minor implementation difference between the TrueRange indicator and the AverageTrueRange indicator when calculating the initial True Range value of a time series.

When there is no previous bar, the former sets the first value to 0.0, while the latter sets it to High - Low of the current bar. For example, on a first bar with High = 105 and Low = 100, TrueRange reports 0 while AverageTrueRange seeds its first True Range sample with 5.

TrueRange.cs [line 66]:

        protected override decimal ComputeNextValue(IBaseDataBar input)
        {
            if (!IsReady)
            {
                _previousInput = input;
                return 0m;
            }

vs AverageTrueRange.cs [line 103]:

        public static decimal ComputeTrueRange(IBaseDataBar previous, IBaseDataBar current)
        {
            var range1 = current.High - current.Low;
            if (previous == null)
            {
                return range1;
            }

The differences in the resulting values are minor; however, there are edge cases where one versus the other will generate different trading actions. Of course, this is noise, and no trading system should be that sensitive.

The problem is that it makes debugging and replicating results in Excel, external frameworks, etc. a bit of a challenge.
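
For what it's worth, here is a minimal, self-contained sketch of the two first-bar rules side by side. The Bar type and method names are made up purely for illustration (they are not LEAN types), but this is roughly what I end up re-implementing externally:

    using System;

    // Bar is a made-up type just for this illustration; it is not LEAN's IBaseDataBar.
    public readonly record struct Bar(decimal High, decimal Low, decimal Close);

    public static class FirstBarTrueRange
    {
        // TrueRange.cs behaviour: no previous bar -> 0
        public static decimal TrueRangeStyle(Bar? previous, Bar current)
            => previous is null ? 0m : WilderTrueRange(previous.Value, current);

        // AverageTrueRange.ComputeTrueRange behaviour: no previous bar -> High - Low
        public static decimal AverageTrueRangeStyle(Bar? previous, Bar current)
            => previous is null ? current.High - current.Low : WilderTrueRange(previous.Value, current);

        // Standard Wilder True Range once a previous close is available
        private static decimal WilderTrueRange(Bar previous, Bar current)
            => Math.Max(current.High - current.Low,
               Math.Max(Math.Abs(current.High - previous.Close),
                        Math.Abs(current.Low - previous.Close)));

        public static void Main()
        {
            var firstBar = new Bar(High: 105m, Low: 100m, Close: 103m);
            Console.WriteLine(TrueRangeStyle(null, firstBar));        // prints 0
            Console.WriteLine(AverageTrueRangeStyle(null, firstBar)); // prints 5
        }
    }

Running it on the same first bar prints 0 for the TrueRange-style rule and 5 for the AverageTrueRange-style rule.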

Does it make sense to make the two indicators produce the same values? If so, I assume I could submit a pull request for TrueRange.cs?
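
The change I have in mind for TrueRange.ComputeNextValue is roughly the following (just a sketch mirroring AverageTrueRange.ComputeTrueRange, not yet run against the existing unit tests):

        protected override decimal ComputeNextValue(IBaseDataBar input)
        {
            if (!IsReady)
            {
                _previousInput = input;
                // With no previous bar, mirror AverageTrueRange.ComputeTrueRange and
                // fall back to the current bar's High - Low instead of returning 0.
                return input.High - input.Low;
            }

            // ... remainder of the method unchanged ...
        }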

Thank you,

Corvin