Python Progress

A quick update on my Python coding progress.

I am proud to say that I am actually writing Python code that actually does something. Boy, starting something new is time consuming: the most mundane tasks, things one could do in Excel in a flash, take absolute hours in a new language.

I also constantly ask myself why I am doing this when I already know how to code many of these functions in R.

After a couple of weeks of learning I have now accumulated some knowledge, further enhanced by the arrival of a few more Python textbooks last week. In summary, I am fast falling in love with the pandas data library, which is like learning a whole new language on its own. Pandas offers some of the most comprehensive data manipulation functions a data scientist could dream of. I love it, and I am sure that in the weeks and months to come, if I stay with the learning, I will be doing really cool things with the data I come across.

So far I haven’t come across a Python library like R’s PerformanceAnalytics; once I come into contact with that type of library in Python I will feel complete. I believe there is a way to call R functionality from a Python project via a wrapper (rpy2, for example). This may be something I research later today; however, my first prize is to stay completely Python native, so if anyone has insights into a PerformanceAnalytics-type library in Python, please let me know.
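In the meantime, two of the PerformanceAnalytics-style metrics I reach for most, maximum drawdown and annualized return, are easy enough to sketch in pandas. This is a minimal illustration with made-up return numbers, not a replacement for the R package:

```python
import pandas as pd

def max_drawdown(returns):
    """Largest peak-to-trough fall of the cumulative equity curve."""
    equity = (1 + returns).cumprod()
    drawdown = equity / equity.cummax() - 1
    return drawdown.min()

def annualized_return(returns, periods_per_year=252):
    """Geometric mean return scaled to one year."""
    growth = (1 + returns).prod()
    return growth ** (periods_per_year / len(returns)) - 1

# Made-up daily returns, purely for illustration
rets = pd.Series([0.01, -0.02, 0.005, 0.03, -0.01])
print(max_drawdown(rets))       # worst peak-to-trough loss
print(annualized_return(rets))
```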

Massive Python Weekend

I thought I would just summarize some of my early Python experiences.

Let me start by saying it hasn’t been as easy as many of the reviews would have you believe. To begin with, the current version in production is 3.4.3, and version 3 introduced breaking changes relative to the version 2 line. So right away you encounter a debate about whether to start learning with version 2 or 3, and the issue crops up in many of the learning tutorials and videos.
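To give a flavour of why the split matters, here are two of the headline differences, written for Python 3:

```python
# Two headline differences between Python 2 and 3
import sys

print(sys.version_info.major)  # which major version is running
print(3 / 2)    # true division in Python 3: 1.5 (Python 2 gave 1)
print(3 // 2)   # floor division now has its own operator
```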

Once you have chosen a version and downloaded Python, you are faced with the choice of an IDE versus sticking with the raw “Shell” console. This reminded me of my early days with R, where you also got a simple console, very similar to the DOS command prompt. The next big hurdle comes with downloading and installing libraries that are not shipped with the standard Python installation.

With R the IDE question is largely settled, as RStudio is now accepted as the default, and installing new libraries is simple, with the CRAN distribution hub doing all the heavy lifting through RStudio’s built-in installation tools. Python libraries, by contrast, are often shipped as source packages that need to be compiled against the base Python installation. These are all big obstacles to getting going as a novice. Fortunately there is a great piece of software called Anaconda, which packages and updates most of the libraries in the broader ecosystem, making it much easier to work with.

So far I have chosen PyCharm as my IDE and code editor of choice. I have also found IPython Notebooks to be an awesome way to share code and explain each step in markdown text. So there are great editing solutions; it just takes some time to get a grip on them because there is so much choice.

I have to say I have been struggling a little to unlearn some R syntax for simple data manipulation and plotting. The literature seems to say that R syntax is very non-intuitive compared to Python’s easier, more natural syntax; as R was the first language I learned, I am finding things to be the complete opposite. As a quant it seems essential to work with the NumPy and pandas libraries, as they do all the heavy lifting in terms of standard functions for our industry. Because there is so much buy-in to these libraries, I plan on focusing a lot of my learning on them rather than learning to do from scratch what a single pandas function call can handle.
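As a small taste of that heavy lifting, here is the kind of one-liner work pandas does on a toy price series (the numbers are invented for illustration):

```python
import pandas as pd

# Invented prices, just to show the one-liners
prices = pd.Series([100.0, 101.5, 99.8, 102.2, 103.0],
                   index=pd.date_range("2015-01-01", periods=5))

daily_returns = prices.pct_change()        # day-on-day percentage moves
rolling_mean = prices.rolling(3).mean()    # 3-day moving average
print(daily_returns.iloc[1], rolling_mean.iloc[2])
```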

One of my biggest weaknesses, and strengths, is that by nature I don’t like to ask for instructions; I am the kind of guy who likes to learn by myself, on my own terms – excuse the arrogance. I almost never read instruction manuals when purchasing new appliances. I simply dive in like the proverbial bull in a china shop and try to work it out. That is how I learned my way around R, and that is kind of how I started with Python.

Then it dawned on me I am not a computer science graduate with a good understanding of high level coding, and the different language environments, etc. To this end I have started to go down the more passive and slower approach of going through video tutorials by experts. This was indeed a good move as it has filled in many blanks and given me a far greater context to the subject and stronger building blocks.

The big find for me was Udemy, an online “university” where teaching is broken into tiny, roughly 5-minute modules. I have been peppered with discount offerings and bought some excellent courses for $19 apiece, some of which typically go for $279. So while this goes against my nature of diving in head first, I have spent many hours over the last week “attending” lectures. To keep myself sane, though, every couple of hours I try to solve a real-life problem that I have already mastered in R. This is still the best way to learn, as it engraves the steps into one’s memory far better than passively sitting and watching a lecture.

In conclusion, I am starting to feel like I am getting somewhere, but I know I have a very long road ahead. As a further point of mention, the reason I want to learn a language like Python is that I know I will never be the sharpest quant in the house, as many of the complex math problems are simply too difficult for me to master. However, Python is far more than just a quant tool; it has excellent web-development and general-purpose capabilities that one cannot replicate well in R, and it is these more general capabilities that I am hoping to add to my toolkit.


Last night I was running some code in R doing quite an involved VaR calculation over 1 year of daily data. The calculation was not running on the time series of the equity curve but on the open positions at each point in time, and it was taking my computer about an hour to run. I am very interested to know how long it would take in Python. I suspect it would be quicker; however, as Python and R are interpreted rather than compiled languages, they will still run much slower than, say, C++ or other compiled languages. All of this for another day – I would be happy to simply achieve “Hello World” in Python for now 😉
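For the record, the plain historical flavour of VaR vectorizes very naturally in NumPy. My position-level calculation is more involved than this, but a one-day 95% VaR on a simulated P&L series (random numbers standing in for real data) looks like:

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated daily P&L standing in for a year of real position data
pnl = rng.normal(loc=0.0, scale=10_000, size=252)

# Historical 95% VaR: the loss exceeded only on the worst 5% of days
var_95 = -np.percentile(pnl, 5)
print(var_95)
```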


The US Dollar after the Fed Hikes

I think this is a terrific chart presented by Jawad Mian. I am so tired of everyone’s postulating about whether the USD is going up or down, and whether the Fed is going to hike or not. Who the hell knows when the Fed will hike, but looking back to 1970 and the last 17 Fed hiking cycles (from the start of each), the USD has actually weakened.

I find this kind of empirical research informative.

Source: Nordea Markets




Sleeping Beauty

I have been wanting to write something inspiring about the markets, but I have very little to say, as frankly I feel like my emotional tank is running on reserve. I want to write something personal before making a simple but fundamental comment about the way to successful investing.

Yesterday I saw this post on Facebook, which couldn’t have been more appropriate, as we were admitting my 15-year-old daughter into hospital for chronic fatigue syndrome and fibromyalgia. For those not familiar with how debilitating this condition can be: over the last 5 months my daughter has been sleeping 20 hours a day, fighting pain 24/7. Her life has simply been put on hold. No school, no socializing, no fun – frankly, no life. For a parent to watch helplessly from the sidelines is one of life’s real challenges.


The message I wish to draw from this difficult experience is simply that in life some of the most important principles are so basic that we tend to ignore their importance.

Watching my Sleeping Beauty sleep hour after hour, I am realizing how precious every moment and every experience is. When someone is denied something as basic as going to school, going out with friends, or walking around without pain, we realize how blessed we are when we can do the basic things in life, and how much we take these blessings for granted.

It almost feels inappropriate to discuss the markets in the same context as this message, but I feel we have lost our way when it comes to investing. If you think you are investing when you buy and sell shares on the market, you are probably following some misguided belief. At the end of the day, investing is about buying the market when it’s cheap and selling when it’s expensive. Does it get any simpler than that?

Norbert Keimling from StarCapital put some nice charts together using the CAPE (cyclically adjusted price-earnings) ratio, otherwise known as the Shiller PE ratio, to measure whether a market is cheap or expensive.
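For anyone new to the metric, the CAPE calculation itself is simple: the current price divided by the 10-year average of inflation-adjusted earnings. A sketch with invented index and earnings figures:

```python
import numpy as np

def cape(price, real_earnings_10y):
    """Shiller CAPE: price over the 10-year average of
    inflation-adjusted earnings."""
    return price / np.mean(real_earnings_10y)

# Hypothetical index level and real earnings, for illustration only
earnings = [60, 62, 58, 65, 70, 72, 68, 75, 80, 78]
print(cape(2000.0, earnings))
```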

What you see is that the cheaper the market is when you invest, the better your future returns will be. The markets are breaking to new nominal highs, and on CAPE the US is at its second most expensive point in history, so you decide whether now is a good time to be investing in the markets.


Alpha Stable Distributions

I am doing some work with return distributions. In a previous gig we did a lot of work on this subject, and I was really encouraged by the track of work we were following.

A quick recap: we know that trading returns do not typically follow a normal (Gaussian) distribution, yet most models in modern-day finance still use these less-than-perfect assumptions. It’s typical human nature: the solutions give us a nice, elegant answer most of the time, and because a more accurate solution is harder to work out, we dismiss the fact that bad things happen more often than we anticipate – all in the name of progress.

We were determined to find a more realistic solution, in keeping with my previous post, keeping it real 🙂 .

In the two charts below I illustrate how our maximum likelihood estimate (MLE) model using alpha-stable (Lévy) distributions does a much better job than the normal distribution model shown as the blue line.
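To see why the alpha-stable family fits trading returns better, it helps to look at the tails. The snippet below draws symmetric alpha-stable samples using the Chambers–Mallows–Stuck transform (a generic textbook method, not our actual MLE fitting code) and counts how many extreme moves show up compared with a Gaussian:

```python
import numpy as np

def stable_symmetric(alpha, size, rng):
    """Symmetric alpha-stable samples (beta = 0) via the
    Chambers-Mallows-Stuck transform; requires alpha != 1."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)
    w = rng.exponential(1.0, size)
    return (np.sin(alpha * u) / np.cos(u) ** (1 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1 - alpha) / alpha))

rng = np.random.default_rng(0)
stable = stable_symmetric(1.7, 100_000, rng)
normal = rng.standard_normal(100_000)

# The stable draw produces far more extreme moves than the Gaussian
print((np.abs(stable) > 5).sum(), (np.abs(normal) > 5).sum())
```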

[Figure: fitted distribution]

[Figure: fitted distribution 2]

Shiller PE Market Timing Model

As old readers will know by now, I am a great fan of the Shiller PE as a valuation metric for deciphering whether the market is cheap or expensive. I am a value investor at heart, and also a bit of an idealist.

The market is expensive if one looks at the Shiller PE from a historical perspective. Here is a look at the Shiller PE with a slight nuance, the Crestmont P/E ratio (for now let’s assume they are the same).

One of the big challenges for value investors is that they can get into or out of a market way too early. This is something I have done all too often, and I have the emotional scars to bear testimony to it.

Somehow my trusted Shiller PE market timer seems to do a better job of eliminating my torturous contrarian suffering, and it still outperforms. Here the model is still long the market, despite the Shiller PE being its only indicator for generating buy and sell signals. This model has performed going all the way back to 1870.
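The general shape of such a timer is easy to sketch. The thresholds below are hypothetical placeholders, not the ones my model actually uses: go long when CAPE looks cheap, move to cash when it looks expensive, and hold the previous position in between:

```python
import numpy as np

def timing_signal(cape_series, buy_below=15.0, sell_above=25.0):
    """Long/flat signal from CAPE alone: long when cheap, cash when
    expensive, hold the previous state in between. The thresholds
    are hypothetical placeholders."""
    position = np.zeros(len(cape_series))
    state = 1  # start invested
    for i, c in enumerate(cape_series):
        if c < buy_below:
            state = 1
        elif c > sell_above:
            state = 0
        position[i] = state
    return position

print(timing_signal([12, 18, 27, 20, 14]))  # [1. 1. 0. 0. 1.]
```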


Some extra findings

I think it is very important to work with daily data where it is available. In the charts below I show the histogram of 20 years of daily returns on the S&P 500 index. What I also notice is that there is some negative autocorrelation, which I find quite interesting.
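Lag-1 autocorrelation is quick to check directly from the return series. The toy numbers below (an exaggerated alternating up/down pattern) show how a mean-reverting series produces a strongly negative value:

```python
import numpy as np

def lag1_autocorr(returns):
    """Correlation between each day's return and the next day's."""
    r = np.asarray(returns)
    return np.corrcoef(r[:-1], r[1:])[0, 1]

# Exaggerated alternating up/down days -> strongly negative value
print(lag1_autocorr([0.01, -0.008, 0.012, -0.01, 0.009, -0.011]))
```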