Sometimes when you're downloading a year's worth of historical stock data for a ticker, say with the following example command:

dotnet QuantConnect.ToolBox.dll --app=PDL --tickers=XLP --resolution=Hour --from-date=20190101-00:00:00 --to-date=20200101-00:00:00 --security-type=Equity --market=usa --api=xxxxxxxxxx


you'll run into an SSL connection dropped error and have to reissue the command. I've been checking where the downloader left off and adjusting my --from-date accordingly. The issue I'm seeing, however, when I examine the downloaded data is that some rows in the CSV (which is zipped and found at, for example, ~/Lean/Data/equity/usa/hour/qqq.zip) look like the following:

20190102 11:00,502800,505200,502300,503600,2658141
20190102 12:00,503700,505450,502900,505200,1806334
20190102 13:00,505200,506400,504900,506300,1762108
20190102 14:00,506200,506200,503900,504300,1580498
20190102 15:00,504300,505700,501700,505200,0,504400,505800,501800,505300,0
20190102 16:00,505000,505000,489200,490000,0,505300,520200,504800,504800,0
20190102 17:00,490000,502000,490000,500100,0,504800,504800,502000,504000,0
20190102 18:00,500100,500600,498100,498200,0,504000,504000,500000,503800,0
20190102 19:00,498700,498700,498700,498700,0,503800,503800,503800,503800,0

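For what it's worth, this is the quick sanity check I've been running against the zip to spot those concatenated rows and to find the last clean bar before adjusting --from-date. It's just a sketch: the zip path, the assumption that each hour zip holds a single CSV, and the assumption that a well-formed hourly bar has exactly six comma-separated fields (time,open,high,low,close,volume) are mine, so adjust it to your own data folder.

import csv
import io
import os
import zipfile

# Path to the zipped hourly CSV written by the downloader (adjust to your ticker).
zip_path = os.path.expanduser("~/Lean/Data/equity/usa/hour/qqq.zip")

last_good_bar = None
with zipfile.ZipFile(zip_path) as z:
    # Assumes the zip holds a single CSV, as the hour/daily equity zips appear to.
    csv_name = z.namelist()[0]
    with z.open(csv_name) as raw:
        reader = csv.reader(io.TextIOWrapper(raw, encoding="utf-8"))
        for row in reader:
            # Expected hourly bar: time,open,high,low,close,volume (6 fields).
            if len(row) == 6:
                last_good_bar = row[0]
            else:
                print(f"Malformed row ({len(row)} fields): {','.join(row)}")

print(f"Last well-formed bar: {last_good_bar}")

Running this against the zip above flags the rows from 15:00 onward, each of which looks like two OHLCV bars glued onto one timestamp.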
Has anyone run into this behavior when downloading historical stock data via the QuantConnect ToolBox? Is this something to be concerned about, or will the data still work for local backtests? And does the downloader know not to double up data in the CSV files under the Data directory when it is re-run over a date range it has already downloaded?
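For reference, this is roughly how I've been reissuing the command after a drop (again just a rough sketch, and it simply reruns the same command on a non-zero exit; I'm not certain the ToolBox actually exits non-zero on the SSL error, which is part of why I'm wondering whether re-runs double up data):

import subprocess
import time

# Same arguments as the command above; substitute your own API key.
cmd = [
    "dotnet", "QuantConnect.ToolBox.dll",
    "--app=PDL", "--tickers=XLP", "--resolution=Hour",
    "--from-date=20190101-00:00:00", "--to-date=20200101-00:00:00",
    "--security-type=Equity", "--market=usa", "--api=xxxxxxxxxx",
]

for attempt in range(1, 6):
    result = subprocess.run(cmd)
    if result.returncode == 0:
        break
    # Assumes a failed download surfaces as a non-zero exit code.
    print(f"Downloader exited with code {result.returncode}; retrying ({attempt}/5)...")
    time.sleep(10)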