A Quantitative Approach to Tactical Asset Allocation: GTAA(5) with SimpleMovingAverage

GTAA consists of five global asset classes: US stocks, foreign stocks, bonds, real estate and commodities.

In my previous post we had a look at the "Base" case of simply buying and holding, in equal 20% weightings, the asset allocation described in the paper here. In this post we will compare that "Base" performance to the implementation of "timing" described within the paper.

To begin, let's revisit the "Base" case. From the backtest attached below we can see that this yielded a CAGR=4.2%, Drawdown=26.5% and SharpeRatio=0.44 over nearly the last decade... nothing to get excited about. For comparison, buying and holding SPY over the same period yielded a CAGR=13%, Drawdown=33% and Sharpe=0.79. (It's probably not fair to compare a basket of diversified assets to the outperformer of the last decade, but I think it's nice to know what an investor could have achieved had they remained 100% invested in stocks.)
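For readers who want to reproduce these summary statistics, here is a minimal plain-Python sketch (my own illustrative code, not the QuantConnect report's implementation) of how CAGR and maximum drawdown are typically computed from an equity curve:

```python
# Sketch of the two summary statistics quoted above, computed from a
# list of portfolio values. Illustrative only; not QuantConnect code.

def cagr(equity, years):
    """Compound annual growth rate from first to last portfolio value."""
    return (equity[-1] / equity[0]) ** (1.0 / years) - 1.0

def max_drawdown(equity):
    """Largest peak-to-trough decline, as a fraction of the running peak."""
    peak, worst = equity[0], 0.0
    for value in equity:
        peak = max(peak, value)
        worst = max(worst, (peak - value) / peak)
    return worst

# Example: $100 grows to $121 over 2 years -> 10% CAGR;
# a dip from 120 to 90 along the way is a 25% drawdown.
curve = [100.0, 120.0, 90.0, 121.0]
```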

 

In the paper, Meb Faber examines introducing a "timing" element to his asset allocation (which is where the "Tactical" part of GlobalTacticalAssetAllocation gets its name) as a risk mitigation strategy: a 10-month (roughly 200-day) SimpleMovingAverage filter is applied to the risky assets within the portfolio, and an asset's allocation is moved to cash (or the mid-duration bonds examined below) whenever its price falls below its moving average. High volatility diminishes compound returns, and this "timing" factor is an attempt to reduce that volatility.
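The point that volatility diminishes compound returns can be seen with a tiny worked example (illustrative numbers of my own, not figures from the paper): two return streams with the same 5% average return compound very differently once volatility enters.

```python
# Volatility drag: equal arithmetic-mean returns, unequal compound growth.

def compound(returns):
    """Growth multiple from a list of periodic fractional returns."""
    total = 1.0
    for r in returns:
        total *= 1.0 + r
    return total

smooth = [0.05, 0.05]    # +5%, +5%   -> arithmetic mean 5%
choppy = [0.30, -0.20]   # +30%, -20% -> arithmetic mean also 5%

# smooth compounds to 1.1025, while choppy compounds to only 1.04,
# despite the identical average return per period.
```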

His claim: "Timing results in a reduction of volatility to single-digit levels, as well as a single-digit maximum drawdown. Drawdown is reduced from 46% to less than 10%". 

Note: The filter is disregarded intra-month and only acted upon if the signal still exists at month's end, so large drawdowns can still be experienced between monthly rebalancing periods.
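As a plain-Python illustration of the rule described above (a sketch of my own, not the paper's or the attached algorithm's code): at month's end, stay invested only if the latest monthly close is above its 10-month simple moving average.

```python
# Hypothetical sketch of the month-end 10-month SMA timing rule.

def invested(monthly_closes, window=10):
    """True -> hold the asset next month; False -> move to cash/bonds.

    Evaluated once per month on closing prices, so intra-month moves
    never affect the signal (matching the note above).
    """
    if len(monthly_closes) < window:
        return True  # assumption: remain invested until enough history exists
    sma = sum(monthly_closes[-window:]) / window
    return monthly_closes[-1] > sma

# A steadily rising series stays invested; a steadily falling one moves to cash.
```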

 

So with that in mind, how did this SMA implementation perform over the same period in the backtest below...?


The material on this website is provided for informational purposes only and does not constitute an offer to sell, a solicitation to buy, or a recommendation or endorsement for any security or strategy, nor does it constitute an offer to provide investment advisory services by QuantConnect. In addition, the material offers no opinion with respect to the suitability of any security or specific investment. QuantConnect makes no guarantees as to the accuracy or completeness of the views expressed in the website. The views are subject to change, and may have become unreliable for various reasons, including changes in market conditions or economic circumstances. All investments involve risk, including loss of principal. You should consult with an investment professional before making any investment decisions.


... From the backtest attached below we can see that this yielded a CAGR=4.5%, Drawdown=8.1% and a SharpeRatio=0.66. This is a slight improvement in returns but a MASSIVE reduction in drawdowns and volatility. By smoothing volatility, the strategy now lends itself to possibly being run with some leverage to dial up the returns, something that was definitely not possible in the "Base" case.

In my next post we will have a look at how expanding our diversified universe of assets affects performance with GTAA(13).

 

This is my second attempt (as a non-programmer) to build a strategy within the QC AlgorithmFramework. It might not be pretty or clean, so all comments and improvements are welcome! I know it could use some more explanatory commenting, so apologies; I promise my next post will be better.



Hi Mark,

Great start on this algorithm!

One thing we can do to improve it is to remove some of the `History` calls used for calculating the SMA. We don't actually need to reset the SMA and recalculate it for each security every time `Update` is called. Instead, we can create and warm up the SMA inside the SymbolData constructor and set up a consolidator to update it automatically.

```python
class SymbolData:

    def __init__(self, security, algorithm, smaLength, resolution):
        self.Security = security
        self.Symbol = security.Symbol
        self.MovingAverage = SimpleMovingAverage(smaLength)
        self.algorithm = algorithm

        # Warm up the moving average with historical closes
        history = algorithm.History([self.Symbol], smaLength, resolution).loc[self.Symbol]
        for time, row in history.iterrows():
            self.MovingAverage.Update(time, row["close"])

        # Set up a consolidator to feed the indicator automatically
        self.consolidator = TradeBarConsolidator(resolution)
        self.consolidator.DataConsolidated += self.CustomDailyHandler
        algorithm.SubscriptionManager.AddConsolidator(self.Symbol, self.consolidator)

    def CustomDailyHandler(self, sender, consolidated):
        self.MovingAverage.Update(consolidated.Time, consolidated.Close)

    def dispose(self):
        self.algorithm.SubscriptionManager.RemoveConsolidator(self.Symbol, self.consolidator)
```

Then in OnSecuritiesChanged, we just remove the consolidator when a security leaves the universe.

```python
def OnSecuritiesChanged(self, algorithm, changes):
    for added in changes.AddedSecurities:
        self.symbolDataBySymbol[added.Symbol] = SymbolData(added, algorithm, self.smaLength, self.resolution)

    for removed in changes.RemovedSecurities:
        symbolData = self.symbolDataBySymbol.pop(removed.Symbol, None)
        if symbolData:
            # Remove the consolidator when a security leaves the universe
            symbolData.dispose()
```

We can also replace the `predictionInterval` parameter of the alpha model with the built-in `Expiry.EndOfMonth`. This ensures we don't have overlapping insight durations, and it changes the insight generation line slightly to look like

```python
insights.append(Insight.Price(symbol, Expiry.EndOfMonth, InsightDirection.Up, None, None, None, riskOnWeight))
```

In the attached backtest, both of these changes have been made. See the full code files for reference.

Best,
Derek Melchin



Thank you so much Derek! 

I really appreciate you taking the time to look over these. Re: Expiry.xxx, I have searched the docs but can't find it anywhere. Where should I go to find the details and methods available behind this function?

Also, it looks like a consolidator by nature does NOT need to be updated manually... so, once created, it will automatically have data pumped into it at the nominated security's Resolution? Is this correct?


Another thing I noticed... the performance is slightly different, with around double the drawdown in the new implementation. When I looked into it, the new one seems to be putting orders in over the first TWO trading days of each month. This was reflected in the fees being 50% higher with your implementation. I found:

```python
self.Settings.RebalancePortfolioOnInsightChanges = True
```

I believe having it set to True meant the algorithm was rebalancing once when the insight expired (Expiry.EndOfMonth) and again when the Execution model was triggered (month start). I set it to False, which fixed the fees and the multiple-orders-split-over-two-days issue.

Even after fixing this, the discrepancy was much the same. Our insights are slightly different: same quantity, slightly different signals. I am guessing it has to do with how (or when) the moving average is calculated? I was calculating it after a new month BEGINS in Update(). I can't tell when the new algorithm is calculating it due to my lack of knowledge around consolidators and when they are updated... my best guess is that's where the difference lies?


Hi Mark,

The Expiry class has yet to be added to the LEAN documentation, but we can view the source code for it here.

Yes, after setting up the consolidator with

```python
algorithm.SubscriptionManager.AddConsolidator(self.Symbol, self.consolidator)
```

the consolidated TradeBars are sent to the `CustomDailyHandler` method automatically and the SMA is updated. To learn more about consolidators, I recommend reviewing our documentation.

Regarding the RebalancePortfolioOnInsightChanges setting, my apologies: I switched it to True but did not discuss the reasoning above. With this setting set to False, if the algorithm's start date isn't the first day of the month, trading doesn't begin until the following month. Setting it to True and utilizing the `Expiry.EndOfMonth` insight duration equips the algorithm to begin trading immediately, regardless of the backtest start date. However, as you discovered, this causes the algorithm to rebalance at both the end and the beginning of each month, resulting in two days of orders. In the attached backtest, I've switched this property back to False.

The reason our insights were slightly different was in fact an issue with the SMA. When we set up the consolidator with `Resolution.Daily`, the bars are consolidated over 24 clock hours instead of 1 trading day. After changing the consolidator period to `timedelta(1)`, the insights are identical.

See the attached algorithm for reference.

Best,
Derek Melchin




This discussion is closed