As I said the other day when posting Part 1, I was tired. I also don’t wish to delete older posts that may be of little value; rather, I want to keep my workings and progress. Maybe that is a symbolic way of dealing with my “Shadow”. For those readers not into geekish financial maths, give this post a miss.

Let us have another look at the results, this time with some tweaks to the approach, and then go into more depth trying to establish whether this method truly adds value over the Buy and Hold approach.

We start by downloading S&P500 and US 10yr Treasury data from Quandl going back to 1962, giving us 53 years of history. I would actually like to use a lot more data, as I fear that the secular bond bull cycle may be distorting our results. We create a ratio by dividing the S&P500 by the bond yield (working with the nominal index rather than log returns may be an issue worth looking into). We then smooth the ratio with 50- and 200-day moving averages. The signal trigger in our model is to buy the market whenever the 50-day moving average of the ratio is greater than the 200-day. It is quite hard to see on this chart, but you can at least make out the shape of the ratio relative to the S&P500 (black):
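To make the mechanics concrete before the full script below, here is a minimal standalone sketch of the crossover rule on made-up data, so no Quandl download is needed. The `sma`, `short`, `long` and `signal` names are illustrative, not taken from the post's code:

```r
set.seed(42)
ratio <- cumsum(rnorm(300)) + 100               # stand-in for S&P500 / bond yield
sma   <- function(x, k) stats::filter(x, rep(1/k, k), sides = 1)
short <- sma(ratio, 50)                         # 50-day moving average
long  <- sma(ratio, 200)                        # 200-day moving average
# long the market while the 50-day average sits above the 200-day average;
# shift by one day so today's signal trades tomorrow
signal <- c(NA, head(ifelse(short > long, 1, 0), -1))
table(signal, useNA = "ifany")
```

The one-day shift matters: without it the backtest would trade on information it only has at the close.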

So let’s look at the results:

According to the results from 1962, the Sharpe Ratio of the Ratio Cross beats Buy&Hold: 0.50 vs 0.32. You can also see a superior maximum drawdown. We do not focus on absolute return, where buy and hold outperforms, as our stated objective is always risk-adjusted returns. I include tail(sig_ratio) so that you can see the system is invested at the moment.
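For readers who want to sanity-check the table, the annualized Sharpe figure is roughly the mean daily return over its standard deviation, scaled by the square root of 252. A toy calculation (note this simple arithmetic scaling is an approximation; PerformanceAnalytics’ SharpeRatio.annualized annualizes the return geometrically by default, so its numbers can differ slightly):

```r
# back-of-envelope annualized Sharpe, assuming 252 trading days and Rf = 0
daily  <- rep(c(0.01, -0.005), 126)             # made-up daily returns, one year
sharpe <- mean(daily) / sd(daily) * sqrt(252)
round(sharpe, 2)
```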

We now try a different study: compute the Sharpe Ratio one year at a time and take the mean across the 53 years. Here I get a confusing result that I don’t know how to explain: on this measure the Ratio Cross has a Sharpe Ratio of 0.60 vs 0.70 for Buy&Hold, the opposite of the full-period result. Hmmm. To take this one step further, I tested the market timing ability of my Bond Timer with the Treynor-Mazuy quadratic regression approach, and you can see below that the system produces alpha, which seems to suggest that the Bond Timer is providing market-timing value-add. By the way, the MarketTiming() function isn’t in the PerformanceAnalytics package on CRAN, so I have included the function in my code below.
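For reference, the Treynor-Mazuy model regresses the portfolio’s excess return on the market’s excess return and its square; the coefficient on the squared term (Gamma in the function below) captures convexity, which is the signature of timing skill. A toy fit on synthetic data shows the idea in a single lm() call. The numbers and variable names here are made up for illustration and are not the post’s results:

```r
set.seed(7)
xRb <- rnorm(1000, 0, 0.01)                          # market excess returns
xRa <- 0.5 * xRb + 2 * xRb^2 + rnorm(1000, 0, 0.002) # a "timer" with built-in convexity
fit <- lm(xRa ~ xRb + I(xRb^2))
round(coef(fit), 3)  # (Intercept) = Alpha, xRb = Beta, I(xRb^2) = Gamma
```

A Gamma estimate well above zero is what a genuine market timer should produce; a stock picker with no timing skill shows up in Alpha instead.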

**Conclusion:**

This is still very much a work in progress, but I believe the ratio timing method adds value over a buy and hold approach. However, I still need to explain the conflicting result I get when looking at Sharpe Ratios one year at a time.

```r
require(quantmod)
require(PerformanceAnalytics)
require(Quandl)

start <- "1962-01-02"

# get the data (S&P500 and US 10yr Treasury)
spy  <- Quandl("YAHOO/INDEX_GSPC", authcode="kvYEqCqKCTyL4anWz5Zv", type="xts")
bond <- Quandl("FRED/DGS10", authcode="kvYEqCqKCTyL4anWz5Zv", type="xts")
data <- merge(spy[,6], bond[,1])
data <- data[paste0(start, "::")]   # subset from our 1962 start date

# This is the magic signal, or is it? It is the nominal price of the S&P500 / bond yield.
# The fact that it is nominal worries me.
ratio <- data[,1] / data[,2]
ratio <- na.omit(ratio)

# moving averages of the ratio
ratio_short <- rollmean(ratio, k=50,  align="right")
ratio_long  <- rollmean(ratio, k=200, align="right")

# I like visual references; ignore the red errors in the output console,
# I am too lazy to fix up the axis.
plot(data[,1], main="Mike's Bond Timer")
par(new=TRUE)
plot(ratio_long, col='red')
axis(4)
par(new=TRUE)
plot(ratio_short, col='green')
par(new=TRUE)
plot(ratio, col='blue')

# our baseline, unfiltered results
ret <- ROC(data[,1])

# our comparison, the filtered result. The idea is to be long while the
# short ratio is above the long ratio.
sig_ratio <- Lag(ifelse(ratio_short > ratio_long, 1, 0))
sig_ret   <- ret * sig_ratio

btimer <- cbind(sig_ret, ret)
colnames(btimer) = c('Ratio Cross', 'Buy&Hold')
table.AnnualizedReturns(btimer, Rf=0)
charts.PerformanceSummary(btimer, Rf=0, main="Bond Timer", geometric=FALSE)
maxDrawdown(btimer)
tail(sig_ratio)

# This looks at the Sharpe Ratio on an annual basis as opposed to the whole period.
years <- apply.yearly(btimer, SharpeRatio.annualized)

# To get rid of NaNs, this helper from Stack Overflow converts them to zero.
is.nan.data.frame <- function(x) do.call(cbind, lapply(x, is.nan))
years[is.nan(years)] <- 0
years
sapply(years, mean)

# Market Timing function, which is not part of the CRAN release.
```
```r
MarketTiming <- function (Ra, Rb, Rf = 0, method = c("TM", "HM"))
{ # @author Andrii Babii, Peter Carl

  # FUNCTION
  Ra = checkData(Ra)
  Rb = checkData(Rb)
  if (!is.null(dim(Rf)))
    Rf = checkData(Rf)
  Ra.ncols = NCOL(Ra)
  Rb.ncols = NCOL(Rb)
  pairs = expand.grid(1:Ra.ncols, 1)
  method = method[1]
  xRa = Return.excess(Ra, Rf)
  xRb = Return.excess(Rb, Rf)

  mt <- function (xRa, xRb)
  {
    switch(method,
      "HM" = { S = xRb > 0 },
      "TM" = { S = xRb }
    )
    R = merge(xRa, xRb, xRb*S)
    R.df = as.data.frame(R)
    model = lm(R.df[, 1] ~ 1 + ., data = R.df[, -1])
    return(coef(model))
  }

  result = apply(pairs, 1, FUN = function(n, xRa, xRb)
                 mt(xRa[, n[1]], xRb[, 1]), xRa = xRa, xRb = xRb)
  result = t(result)

  if (ncol(Rb) > 1) {
    for (i in 2:ncol(xRb)) {
      res = apply(pairs, 1, FUN = function(n, xRa, xRb)
                  mt(xRa[, n[1]], xRb[, i]), xRa = xRa, xRb = xRb)
      res = t(res)
      result = rbind(result, res)
    }
  }

  rownames(result) = paste(rep(colnames(Ra), ncol(Rb)), "to",
                           rep(colnames(Rb), each = ncol(Ra)))
  colnames(result) = c("Alpha", "Beta", "Gamma")
  return(result)
}

print(MarketTiming(years[,1], years[,2], Rf=0))
```