VXX|XIV Trading the Odds Strategy (Work In Progress)

Ok, as promised, I'll open up a bit about this strategy I'm attempting :-)
BTW: kudos to Jared who consolidated (more like completely overhauled) what I had originally put together!

Feast your eyes on this:

My focus is on that green line. Essentially, we alternate going long on VIX (long VXX) when a certain metric is below "1" and going short on VIX (long XIV) when that metric is above "1".

The metric is defined as:

metric = VX30 / HV2

where VX30 is the constant 30-day contract price of VIX futures, interpolated between VX1 and VX2 (some black magic here)
where HV2 is the historical volatility of the S&P 500 index as calculated by:

STDEV(LN(closeGSPC[0] / closeGSPC[1]), LN(closeGSPC[1] / closeGSPC[2])) * sqrt(252) * 100

where closeGSPC[0] is the close of GSPC today, closeGSPC[1] is the close of GSPC yesterday, closeGSPC[2] is the close of GSPC two days ago.
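For anyone porting this, the HV2 formula above can be sketched in Python. The function name is mine, and I'm assuming STDEV means the usual sample standard deviation of the two log returns; only the formula itself comes from the post:

```python
import math

def hv2(close_today, close_1d_ago, close_2d_ago):
    """2-day annualized historical volatility of GSPC, per the formula above."""
    r1 = math.log(close_today / close_1d_ago)
    r2 = math.log(close_1d_ago / close_2d_ago)
    mean = (r1 + r2) / 2.0
    # Sample standard deviation of the two log returns; with n = 2 the
    # (n - 1) divisor is 1, so it reduces to the root of the squared deviations.
    stdev = math.sqrt((r1 - mean) ** 2 + (r2 - mean) ** 2)
    return stdev * math.sqrt(252) * 100
```

With two flat closes the result is zero, and any gap between the two daily returns annualizes up quickly, which is why this input is so twitchy.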

So this algorithm is intended to run well after the market close, or even the next morning before the open, to ensure GSPC has been updated with the most up-to-date information.

Attached is the project so far. The To-Do list:

1.) Rework the calculation of VX1 and VX2. Options:
---Replace the current VIX source (which currently reports an aggregate of futures) with rolling volatility data brought in from an external source (Dropbox, etc.)
---Replace the current VIX source with YAHOO/INDEX_VIX and change the metric benchmark from "1" to "0" (this would give us the orange line in the graph above)
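For reference, the "black magic" interpolation to a constant 30-day price can be sketched as a plain linear blend of the two contracts. Everything here is illustrative (function and parameter names are mine); the real roll math, e.g. CBOE's, also handles calendar and settlement conventions:

```python
def constant_maturity_30d(vx1, vx2, dte1, dte2, target=30.0):
    """Linearly interpolate a constant 30-day futures price between VX1 and VX2.

    vx1, vx2: prices of the front and second VIX futures contracts.
    dte1, dte2: calendar days to expiry of each (assumes dte1 < target <= dte2).
    A sketch only; not taken from the project code.
    """
    # Weight each contract by how close its expiry sits to the 30-day target
    w1 = (dte2 - target) / (dte2 - dte1)
    w2 = (target - dte1) / (dte2 - dte1)
    return w1 * vx1 + w2 * vx2
```

When the target sits exactly halfway between the two expiries this degenerates to a simple average, and as VX1 approaches expiry the weight shifts smoothly onto VX2.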


A few things to note:

1.) Again, the project is not complete (just wanted to remind everyone).
2.) Once it's up and running, it will trade in an extremely volatile manner, which should come as no surprise given that we're trading volatility.
3.) The website listed above uses "simulated data" for VXX and XIV, and I'm not sure what the implications of that are. Realistically, though, we can only backtest to March of 2011 (as noted in my comments in the code).
4.) We are currently experiencing a new maximum drawdown with this strategy (and it is a whopper). So there's some contention about whether the underlying strategy is broken: VIX seems to be indicating suppressed implied volatility in a market that is highly volatile, which means we're sitting in XIV too long.

Once I validate the trades it's attempting to make, I'll post a valid backtest. Stay tuned!


Any progress with this algo? I'm working on the same thing but in Python and I'd be interested in collaborating if you want to complete this project in C#.

Hi! Yes, sorry, it's been a long time since I've updated it. :-) I got diverted by some other work I was doing with the Coarse Universe function, which has been taking a lot of brainpower (what little I have).

I actually started reading heavily into the posts on the tradingtheodds blog (linked in my first post). Given that VXX is derived from the front month VX1 and second month VX2, it sounded like using the pricing from those futures would give better results than the VIX spot price. I recently tested data from Quandl's 'Stevens Continuous Futures' (SCF) dataset as a drop-in replacement for the VIX spot price in the main evaluative equation, and I am getting VERY good results. It takes a bit of tweaking, though.

As for which dataset I used: I'm still experimenting, but if you go to Quandl, search for Stevens Continuous Futures, and then within that database search for "VX1" or "VX2", it'll present you with a ton of options. I've tested a few, and they give much better results than the VIX spot.

So, in other words, I'd pick one of those and replace this line in the code:

_ema.Update(new IndicatorDataPoint(Time, vix.Value - astd));

with this:

_ema.Update(new IndicatorDataPoint(Time, vx1or2.Value - astd));

Then tweak the comparison statement number (1.0m) to get better results for the dataset you've chosen:

if (_ema > 1.0m)

I can't tell yet how much of that tweaking is "curve fitting". I'd like to think that over five years of backtesting it is more of an optimization of the average value you should use, rather than curve fitting :-)
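Since the snippets above are C#, here's roughly how the same EMA-plus-threshold logic might look for the Python port you mentioned. The class and parameter names are illustrative, not from the project, and the default period is a placeholder you'd tune just like the threshold:

```python
class VrpSignal:
    """Sketch of the C# lines above: an EMA of (futures price - historical vol),
    compared against a tunable threshold to pick the long/short vol leg."""

    def __init__(self, period=2, threshold=1.0):
        self.alpha = 2.0 / (period + 1)  # standard EMA smoothing factor
        self.threshold = threshold
        self.ema = None

    def update(self, vx_price, hv):
        # Mirrors _ema.Update(... vx1or2.Value - astd ...) in the C# snippet
        diff = vx_price - hv
        if self.ema is None:
            self.ema = diff
        else:
            self.ema = self.alpha * diff + (1.0 - self.alpha) * self.ema
        return self.ema

    def target(self):
        # Above threshold: short volatility (XIV); below: long volatility (VXX)
        return "XIV" if self.ema > self.threshold else "VXX"
```

The threshold here plays the role of the `1.0m` in the C# comparison, so this is the single knob you'd sweep per dataset.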

Thanks for the offer to collaborate! I can give pointers here and there, and I'd love to hear your progress whenever you can pen something down on this thread! I still hold a lot of interest in this volatility risk premium study. For my primary work, though, I will be continuing to work on the coarse universe stuff, as I think that holds a ton of potential.

I wish you luck on the Python front, too; I'm glad someone is working to code things up in Python. It really highlights how diverse the coding languages here at QC are. :-)


I just wanted to post a website I've been using to gain more insight into the VIX term structure and to get a better idea of what the front months look like relative to the current spot price:

Very useful website, especially since CBOE (the official source of said data) requires people to create an account and log in to actually see the VIX data.


This is very simple and helpful:

Taming Inverse Volatility with a Simple Ratio





