I've posted before on problems related to this, so maybe I'm just not getting something. If I create consolidated bars, does QC update those bars according to the resolution I have set? That is, if I am performing a calculation on consolidated 15-second bars using second resolution, are those bars changing each second, so that the most recent bar always spans the current time minus fifteen seconds, akin to a rolling window? I believe this is the effect occurring in my algorithm. What I want are discrete bars: if I look at the most recent 1560 15-second bars, then every 15 seconds I should receive a new one and the oldest bar should be discarded from the calculation. Instead, I feel like I am essentially recreating new 15-second bars each second, adding the newest second of data and discarding the oldest. I apologize if I have not articulated this properly.
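To show what I mean, here is a minimal sketch of how I understand consolidation is supposed to behave (the symbol, dates, and handler name are just placeholders, and I'm assuming the classic PascalCase Python API). My understanding is that DataConsolidated should only fire once per completed 15-second bar, not every second:

```python
from AlgorithmImports import *

class ConsolidatorCheck(QCAlgorithm):

    def Initialize(self):
        self.SetStartDate(2021, 1, 1)
        self.SetCash(100000)
        # Subscribe at second resolution; the consolidator builds its bars from this feed
        self.spy = self.AddEquity("SPY", Resolution.Second).Symbol

        # A 15-second TradeBarConsolidator should emit one *completed* bar
        # every 15 seconds via DataConsolidated, not a rolling bar each second
        self.consolidator = TradeBarConsolidator(timedelta(seconds=15))
        self.consolidator.DataConsolidated += self.OnFifteenSecondBar
        self.SubscriptionManager.AddConsolidator(self.spy, self.consolidator)

    def OnFifteenSecondBar(self, sender, bar):
        # This handler should fire once per finished 15-second bar
        self.Debug(f"{bar.EndTime} O:{bar.Open} H:{bar.High} L:{bar.Low} C:{bar.Close}")
```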

Vladimir helped me devise an indicator to calculate the standard deviation of the range of a bar, but when I implemented it, clearly something has gone wrong. I have commented above the two apparent problem areas. When debugging, say I have 1560 values representing the ranges of 15-second bars and I calculate their standard deviation: I should theoretically receive the same value each second, 14 times in a row, until a new bar is created, at which point the standard deviation changes, since the oldest bar is lost and the new one is added. This, however, is not the case; instead I receive new values each second, and sometimes it returns 0.00. (Be careful if you debug: the current line I have will eat a lot of data.)
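For reference, here is a stripped-down sketch of how I think the indicator wiring should look (names are placeholders; I'm assuming the built-in StandardDeviation indicator, updated manually from the consolidator handler so it only ever receives one range sample per completed bar):

```python
from AlgorithmImports import *

class RangeStdDevCheck(QCAlgorithm):

    def Initialize(self):
        self.SetStartDate(2021, 1, 1)
        self.SetCash(100000)
        self.spy = self.AddEquity("SPY", Resolution.Second).Symbol

        # Period = number of samples, i.e. the ranges of the last 1560 bars
        self.range_std = StandardDeviation(1560)

        self.consolidator = TradeBarConsolidator(timedelta(seconds=15))
        self.consolidator.DataConsolidated += self.OnFifteenSecondBar
        self.SubscriptionManager.AddConsolidator(self.spy, self.consolidator)

    def OnFifteenSecondBar(self, sender, bar):
        # Update the indicator only here, once per completed 15-second bar,
        # so each sample is a discrete bar's range rather than a rolling value
        self.range_std.Update(bar.EndTime, bar.High - bar.Low)
        if self.range_std.IsReady:
            # This value should only change when a new 15-second bar arrives
            self.Debug(f"{bar.EndTime} range std dev: {self.range_std.Current.Value}")
```

If I am reading the docs right, the period on StandardDeviation is indeed just the number of samples, so 1560 here would mean the ranges of the most recent 1560 consolidated bars.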

Obviously I am missing something, but I am not sure whether it is related to the data, the standard deviation indicator (the period should simply be the number of data points I wish to include?), or my consolidated bars.
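One more guess about the 0.00 values, purely an assumption on my part: if the indicator simply is not ready yet, it would return its default value until it has all 1560 samples. Pushing historical second bars through the consolidator at the end of Initialize, along the lines below, should rule that out, since DataConsolidated fires for the warm-up bars too:

```python
# Hypothetical warm-up at the end of Initialize in the sketch above;
# each historical second bar is fed to the consolidator, which in turn
# fires OnFifteenSecondBar and updates range_std
for bar in self.History[TradeBar](self.spy, 1560 * 15, Resolution.Second):
    self.consolidator.Update(bar)
```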

https://www.quantconnect.com/u/vladimir_2