Portfolio Analysis

Greetings again. Took a deep dive into portfolio analysis for a colleague.

Portfolio analysis, of course, has been deeply influenced by Modern Portfolio Theory (MPT) and the work of Harry Markowitz and Robert Merton, to name a couple of the giants in this field.

Conventionally, investment risk is associated with the standard deviation of returns. So one might visualize the dispersion of actual returns for investments around expected returns, as in the following chart.

[Figure: investmentrisk – dispersion of actual returns around expected returns for two investments]

Here, two investments have the same expected rate of return, but different standard deviations. Viewed in isolation, the green curve indicates the safer investment.

More directly relevant for portfolios are curves depicting the distribution of typical returns for stocks and bonds, which can be portrayed as follows.

[Figure: stocksbonds – distributions of typical returns for stocks and bonds]

Now the classic portfolio is composed of 60 percent stocks and 40 percent bonds.

Where would its expected return be? Well, the expected value of a sum of random variables is the sum of their expected values, and multiplying a random variable by a constant scales its expectation by that same constant. There is an algebra of expectations to express this around the operator E(.). So we have E(.6S+.4B)=.6E(S)+.4E(B), where, of course, S stands for “stocks” and B for “bonds.”
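As a quick numeric check of this expectation algebra, here is the calculation with hypothetical expected annual returns of 10 percent for stocks and 4 percent for bonds (illustrative figures only, not estimates):

```python
# Hypothetical expected annual returns, for illustration only.
e_stocks = 0.10   # assumed E(S)
e_bonds = 0.04    # assumed E(B)

# Linearity of expectation: E(.6S + .4B) = .6E(S) + .4E(B)
e_portfolio = 0.6 * e_stocks + 0.4 * e_bonds
print(e_portfolio)  # about 0.076, i.e. 7.6 percent
```

The blended return lands between the expected returns of the two assets, closer to the asset with the larger weight.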

Thus, the expected return for the classic 60/40 portfolio is less than the returns that could be expected from stocks alone.

But the benefit here is that the risks have been reduced, too.

Thus, the variance of the 60/40 portfolio usually is less than the variance of a portfolio composed strictly of stocks.

This is especially true when the correlation, or covariance, of stocks and bonds is negative, as it has been in many periods over the last century. High interest rates, for example, tend to mean slow or negative economic growth, but can be associated with high returns on bonds.

Analytically, this is because the variance of the sum of two random variables is the sum of their variances, plus their covariance multiplied by a factor of 2. For a weighted sum, each variance is scaled by the squared weight, and the covariance term by twice the product of the weights.
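In code, with hypothetical standard deviations and an assumed negative stock-bond correlation, the weighted version of this variance formula can be sketched as:

```python
import math

# Hypothetical annualized figures, for illustration only.
sd_stocks, sd_bonds = 0.18, 0.06
rho = -0.2                              # assumed stock-bond correlation
cov = rho * sd_stocks * sd_bonds        # Cov(S, B)

# Var(aS + bB) = a^2 Var(S) + b^2 Var(B) + 2ab Cov(S, B)
a, b = 0.6, 0.4
var_port = a**2 * sd_stocks**2 + b**2 * sd_bonds**2 + 2 * a * b * cov
sd_port = math.sqrt(var_port)
print(sd_port)  # well below sd_stocks = 0.18
```

With the assumed negative correlation, the portfolio standard deviation comes out below even the weighted average of the two standard deviations, which is the diversification benefit in miniature.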

Thus, algebra and probability facts underpin arguments for investment diversification. Pick investments which are not perfectly correlated in their reaction to events, and your chances of avoiding poor returns and disastrous losses can be improved.

Implementing MPT

When there are more than two assets, you need computational help to implement MPT portfolio allocations.

For a general discussion of developing optimal portfolios and the efficient frontier see http://faculty.washington.edu/ezivot/econ424/portfoliotheorymatrixslides.pdf

There are associated R programs and a guide to using Excel’s Solver with this University of Washington course.

Also see Package ‘Portfolio’.

These programs help you identify the minimum variance portfolio, based on a group of assets and histories of their returns. It is also possible to find the minimum variance combination from a designated group of assets which meets a target rate of return, if, in fact, that is feasible with the assets in question. You also can trace out the efficient frontier – combinations of assets mapped in a space of returns and variances. Each portfolio on this frontier has minimum variance compared with all other combinations (from your designated group of assets) that generate the same expected return.
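As a sketch of what such programs do under the hood, the global minimum variance weights have the closed form w = Σ⁻¹1 / (1ᵀΣ⁻¹1). Here it is applied to a toy covariance matrix whose numbers are invented for illustration:

```python
import numpy as np

# Toy covariance matrix for three assets; invented numbers, not estimates.
Sigma = np.array([[0.0324, 0.0012, 0.0006],
                  [0.0012, 0.0036, 0.0004],
                  [0.0006, 0.0004, 0.0016]])

# Global minimum variance weights: w = Sigma^{-1} 1 / (1' Sigma^{-1} 1)
ones = np.ones(3)
raw = np.linalg.solve(Sigma, ones)
w_minvar = raw / raw.sum()          # fully invested: weights sum to one

port_var = w_minvar @ Sigma @ w_minvar
print(w_minvar, port_var)
```

Any other fully invested combination of these three assets, equal weights included, has at least this much variance.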

One of the governing ideas is that this efficient frontier is something an individual investor might travel along over a lifetime – going from higher risk portfolios when they are younger to more secure, lower risk portfolios as they age.

Issues

As someone who believes you don’t really know something until you can compute it, it interests me that there are computational issues with implementing MPT.

I find, for example, that the allocations are quite sensitive to small changes in expected returns, variances, and the underlying covariances.
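This sensitivity is easy to demonstrate. The sketch below, with invented inputs, bumps one assumed expected return by half a percentage point and watches the unconstrained mean-variance weights (proportional to Σ⁻¹μ) move by several percentage points:

```python
import numpy as np

# Invented inputs for three assets, for illustration only.
Sigma = np.array([[0.040, 0.018, 0.016],
                  [0.018, 0.090, 0.020],
                  [0.016, 0.020, 0.0225]])
mu = np.array([0.08, 0.10, 0.06])       # assumed expected returns

def mv_weights(mu, Sigma):
    """Unconstrained mean-variance weights, proportional to Sigma^{-1} mu."""
    raw = np.linalg.solve(Sigma, mu)
    return raw / raw.sum()

w_base = mv_weights(mu, Sigma)
w_bumped = mv_weights(mu + np.array([0.005, 0.0, 0.0]), Sigma)  # +50 bp on asset 1
shift = np.abs(w_bumped - w_base).max()
print(w_base.round(3), w_bumped.round(3), shift.round(3))
```

A half-point change in a single expected return moves the largest allocation by several percentage points, which is the kind of instability the "fixes" in the literature try to tame.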

One of the more intelligent, recent discussions with suggested “fixes” can be found in An Improved Estimation to Make Markowitz’s Portfolio Optimization Theory Users Friendly and Estimation Accurate with Application on the US Stock Market Investment.

The more fundamental issue, however, is that MPT appears to assume that stock returns are normally distributed, when everyone after Mandelbrot should know this is hardly the case.

Again, there is a vast literature, but a useful approach seems to be outlined in Modelling in the spirit of Markowitz portfolio theory in a non-Gaussian world. These authors use MPT algorithms as the start of a search for portfolios which minimize value-at-risk, instead of variances.

Finally, if you want to cool off and still stay on point, check out the 2014 Annual Report of Berkshire Hathaway, and, especially, the Chairman’s Letter. That’s Warren Buffett who has truly mastered an old American form which I believe used to be called “cracker barrel philosophy.” Good stuff.

Peer-to-Peer Lending – Disruptive Innovation

Today, I chatted with Emmanuel Marot, CEO and Co-founder at LendingRobot.

We were talking about stock market forecasting, for the most part, but Marot’s peer-to-peer (P2P) lending venture is fascinating.

[Figure: LRspread]

According to Gilad Golan, another co-founder of LendingRobot, interviewed in GeekWire Startup Spotlight May of last year,

With over $4 billion in loans issued already, and about $500 million issued every month, the peer lending market is experiencing phenomenal growth. But that’s nothing compared to where it’s going. The market is doubling every nine months. Yet it is still only 0.2 percent of the overall consumer credit market today.

And, yes, P2P lending is definitely an option for folks with less-than-perfect credit.

In addition to lending to persons with credit scores lower than currently acceptable to banks (700 or so), P2P lending can offer lower interest rates and larger loans, because of lower overhead costs and other efficiencies.

LendIt USA is scheduled for April 13-15, 2015 in New York City, and features luminaries such as Lawrence Summers, former head of the US Treasury, as well as executives in some leading P2P lending companies (only a selection shown).

[Figure: Speakers – selected speakers at LendIt USA]

Lending Club and OnDeck went public last year and boast valuations of $9.5 and $1.5 billion, respectively.

Topics at the Lendit USA Conference include:

◾ State of the Industry: Today and Beyond

◾ Lending to Small Business

◾ Buy Now! Pay Later! – Purchase Finance meets P2P

◾ Working Capital for Companies through invoice financing

◾ Real Estate Investing: Equity, Debt and In-Between

◾ Big Money Talks: the institutional investor panel

◾ Around the World in 40 minutes: the Global Lending Landscape

◾ The Giant Overseas: Chinese P2P Lending

◾ The Support Network: Service Providers for a Healthy Ecosystem

Peer-to-peer lending is small in comparison to the conventional banking sector, but has the potential to significantly disrupt conventional banking with its marble pillars, spacious empty floors, and often somewhat decorative bank officers.

By eliminating the need for traditional banks, P2P lending is designed to improve efficiency and reduce unnecessary frictions in the lending and borrowing processes. P2P lending has been recognised as being successful in reducing the time it takes to process these transactions as compared to the traditional banking sector, and also in many cases costs are reduced to borrowers. Furthermore, in the current extremely low interest-rate environment that we are facing across the globe, P2P lending provides investors with easy access to alternative venues for their capital so that their returns may be boosted significantly by the much higher rates of return available on the P2P projects on offer. The P2P lending and investing business is therefore disrupting, albeit moderately for the moment, the traditional banking sector at its very core.

Peer-to-Peer Lending—Disruption for the Banking Sector?

Top photo of LendingRobot team from GeekWire.

Stock Trading – Volume and Volatility

What about the relationship between the volume of trades and stock prices? And while we are on the topic, how about linkages between volume, volatility, and stock prices?

These questions have absorbed researchers for decades, recently drawing forth very sophisticated analysis based on intraday data.

I highlight big picture and key findings, and, of course, cannot resolve everything. My concern is not to be blindsided by obvious facts.

Relation Between Stock Transactions and Volatility

One thing is clear.

From a “macrofinancial” perspective, stock volumes, as measured by transactions, and volatility, as measured by the VIX volatility index, are essentially the same thing.

This is highlighted in the following chart, based on NYSE transactions data obtained from the Facts and Figures resource maintained by the Exchange Group.

[Figure: VIXandNYSETrans – VIX closing values and NYSE daily transactions]

Now eyeballing this chart, it is possible, given this is daily data, that there could be slight lags or leads between these variables. However, the greatest correlation between these series is contemporaneous. Daily transactions and the closing value of the VIX move together trading day by trading day.
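A lead-lag check like this can be sketched with a simple cross-correlation function. The series below are synthetic stand-ins driven by a common factor, since the NYSE and VIX data are not reproduced here; the point is only the mechanics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: one common factor drives both series, so the
# contemporaneous correlation should dominate any lead or lag.
n = 500
factor = rng.standard_normal(n)
transactions = factor + 0.3 * rng.standard_normal(n)
vix = factor + 0.3 * rng.standard_normal(n)

def xcorr(x, y, lag):
    """Correlation of x[t] with y[t + lag]."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    return np.corrcoef(x, y)[0, 1]

corrs = {lag: xcorr(transactions, vix, lag) for lag in range(-3, 4)}
best_lag = max(corrs, key=corrs.get)
print(best_lag)  # the contemporaneous correlation (lag 0) comes out largest
```

Run on the real daily series, the same loop over lags is what supports the claim that the relationship is contemporaneous rather than leading or lagging.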

And just to bookmark what the VIX is, it is maintained by the Chicago Board Options Exchange (CBOE) and

The CBOE Volatility Index® (VIX®) is a key measure of market expectations of near-term volatility conveyed by S&P 500 stock index option prices. Since its introduction in 1993, VIX has been considered by many to be the world’s premier barometer of investor sentiment and market volatility. Several investors expressed interest in trading instruments related to the market’s expectation of future volatility, and so VIX futures were introduced in 2004, and VIX options were introduced in 2006.

Although the CBOE develops the VIX via options information, volatility in conventional terms is a price-based measure, being variously calculated with absolute or squared returns on closing prices.
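Both price-based measures can be computed in a few lines. The prices below are simulated, not market data; the one substantive point illustrated is that the squared-return (root-mean-square) measure always weakly exceeds the absolute-return measure over the same window:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated daily closing prices (geometric random walk), illustration only.
prices = 100 * np.exp(np.cumsum(0.01 * rng.standard_normal(252)))
returns = np.diff(np.log(prices))

# Rolling 21-day volatility, two conventional price-based measures:
window = 21
kernel = np.ones(window) / window
rms_vol = np.sqrt(np.convolve(returns**2, kernel, mode="valid"))  # squared returns
abs_vol = np.convolve(np.abs(returns), kernel, mode="valid")      # absolute returns
print(rms_vol[-1], abs_vol[-1])
```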

Relation Between Stock Prices and Volume of Transactions

As you might expect, the relation between stock prices and the volume of stock transactions is controversial.

It seems reasonable there should be a positive relationship between changes in transactions and price changes. However, shifts to the downside can trigger or be associated with surges in selling and higher volume. So, at the minimum, the relationship probably is asymmetric and conditional on other factors.

The NYSE data in the graph above – and discussed more extensively in the previous post – is valuable, when it comes to testing generalizations.

Here is a chart showing the rate of change in the volume of daily transactions sorted or ranked by the rate of change in the average prices of stocks sold each day on the New York Stock Exchange (click to enlarge).

[Figure: delPdelT – rate of change in daily transactions, ranked by rate of change in average prices]

So, in other words, array the daily transactions and the daily average price of stocks sold side-by-side. Then, calculate the day-over-day growth (which can be negative of course) or rate of change in these variables. Finally, sort the two columns of data, based on the size and sign of the rate of change of prices – indicated by the blue line in the above chart.

This chart indicates the largest negative rates of daily change in NYSE average prices are associated with the largest positive changes in daily transactions, although the data is noisy. The trendline for the transactions data is shown in red dots.

The relationship, furthermore, is slightly nonlinear, and weak.

There may be more frequent or intense surges to unusual levels in transactions associated with the positive side of the price change chart. But, if you remove “outliers” by some criteria, you could find that the average level of transactions tends to be higher for price drops than for price increases, except perhaps for the highest price increases.

As you might expect from the similarity of the stock transactions volume and VIX series, a similar graph can be cooked up showing the rates of change for the VIX, ranked by rates of change in daily average prices of stock on the NYSE.

[Figure: delPdelVIX – rates of change in the VIX, ranked by rates of change in average prices]

Here the trendline more clearly delineates a negative relationship between rates of change in the VIX and rates of change of prices – as, indeed, the CBOE site suggests, at one point.

It’s interesting that a high profile feature of the NYSE and, presumably, other exchanges – volume of stock transactions – has, by some measures, only a tentative relationship with price change.

I’d recommend several articles on this topic:

The relation between price changes and trading volume: a survey (from the 1980’s, no less)

Causality between Returns and Traded Volumes (from the late 1990s)

The bivariate GARCH approach to investigating the relation between stock returns, trading volume, and return volatility (from 2011)

The plan is to move on to predictability issues for stock prices and other relevant market variables in coming posts.

Trading Volume- Trends, Forecasts, Predictive Role

The New York Stock Exchange (NYSE) maintains a data library with historic numbers on trading volumes. Three charts built with some of this data tell an intriguing story about trends and predictability of volumes of transactions and dollars on the NYSE.

First, the number of daily transactions peaked during the financial troubles of 2008, only showing some resurgence lately.

[Figure: transvol – NYSE daily transactions]

This falloff in the number of transactions is paralleled by the volume of dollars spent in these transactions.

[Figure: dollartrans – NYSE daily dollar volume of transactions]

These charts are instructive, since both highlight the existence of “spikes” in transaction and dollar volume that would seem to defy almost any run-of-the-mill forecasting algorithm. This is especially true for the transactions time series, since the spikes are more irregularly spaced. The dollar volume time series suggests some type of periodicity is possible for these spikes, particularly in recent years.

But lower trading volume has not impacted stock prices, which, as everyone knows, surged past 2008 levels some time ago.

A raw ratio between the value of trades and NYSE stock transactions gives the average daily price per transaction.
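The arithmetic is a one-liner; the figures below are hypothetical placeholders, not NYSE data:

```python
# Average daily value per transaction as a raw ratio. Hypothetical inputs.
dollar_volume = 40.5e9    # assumed daily dollar volume of trades
transactions = 1.8e6      # assumed daily number of transactions
avg_value_per_transaction = dollar_volume / transactions
print(avg_value_per_transaction)  # 22500.0 dollars per transaction
```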

[Figure: vluepershare – average daily price per transaction on the NYSE]

So stock prices have rebounded, for the most part, to 2008 levels. Note here that the S&P 500 index stocks have done much better than this average for all stocks.

Why has trading volume declined on the NYSE? Here are some reasons gleaned from the commentariat:

  1. Mom and Pop traders largely exited the market after the crash of 2008.
  2. Some claim that program trading or high frequency trading peaked a few years back, and is currently in something of a decline in terms of its proportion of total stock transactions. This is, however, not confirmed by the NYSE Facts and Figures, which shows program trading pretty consistently at around 30 percent of total trading transactions.
  3. Interest has shifted to options and futures, where trading volumes are rising.
  4. Exchange Traded Funds (ETFs) make up a larger portion of the market, and they, of course, do not actively trade.
  5. Banks have reduced their speculation in equities, in anticipation of Federal regulations.

See especially Market Watch and Barry Ritholtz on these trends.

But what about the impact of trading volume on price? That’s the real zinger of a question I hope to address in coming posts this week.

Future Scenarios

An item from ETF Daily News caught my eye. It’s a post from Tyler Durden, Lord Rothschild Warns Investors: Geopolitical Situation Most Dangerous Since WWII.

Lord Rothschild is concerned about the growing military conflict in eastern Europe and the mid-east, deflation and economic challenge in Europe, stock market prices moving above valuations, zero interest rates, and other risk prospects.

Durden has access to some advisory document associated with Rothschild which features two interesting exhibits.

There is this interesting graphic highlighting four scenarios for the future.

[Figure: R2 – four scenarios for the future]

And there are details, as follows, for each scenario (click to enlarge).

[Figure: RSheet – details for each scenario]

If I am not mistaken, these exhibits originate from last year at this time.

Think of them then as forecasts, and what has actually happened since they were released, as the actual trajectory of events.

For example, we have been in the “Muddling through” scenario. Monetary policy has remained “very loose,” and real interest rates have remained negative. We have even seen negative nominal interest rates being explored by, for example, the European Central Bank (ECB) – charging banks for maintaining excess reserves, rather than putting them into circulation. Emerging markets certainly are mixed, with confusing signals coming out of China. Growth has been choppy – witness quarterly GDP growth in the US recently – weak and then strong. And one could argue that stagnation has become more or less endemic in Europe with signs of real deflation.

It is useful to decode “structural reform” in the above exhibit. I believe this refers to eliminating protections and rules governing labor – in effect, a policy of general wage reduction, in the idea that European production could then again become competitive with China.

One thing is clear to me pertaining to these scenarios. Infrastructure investment at virtually zero interest rates is a no-brainer in this economic context, especially for Europe. Also, there is quite a bit of infrastructure investment which can be justified as a response to, say, rising sea levels or other climate change prospects.

This looks to be on track to becoming a very challenging time. The uproar over Iranian nuclear ambitions is probably a sideshow compared to the emerging conflict between nuclear powers shaping up in Ukraine. A fragile government in Pakistan, also, it must be remembered, has nuclear capability. For more on the growing nuclear threat, see the recent Economist article cited in Business Insider.

In terms of forecasting, the type of scenario formulation we see Rothschild doing is going to become a mainstay of our outlook for 2015-16. There are many balls in the air.

Time-Varying Coefficients and the Risk Environment for Investing

My research provides strong support for variation of key forecasting parameters over time, probably reflecting the underlying risk environment facing investors. This type of variation is suggested by Lo (2005).

So I find evidence for time varying coefficients for “proximity variables” that predict the high or low of a stock in a period, based on the spread between the opening price and the high or low price of the previous period.

Figure 1 charts the coefficients associated with explanatory variables that I call OPHt and OPLt. These coefficients are estimated in rolling regressions estimated with five years of history on trading day data for the S&P 500 stock index. The chart is generated with more than 3000 separate regressions.

Here OPHt is the difference between the opening price and the high of the previous period, scaled by the high of the previous period. Similarly, OPLt is the difference between the opening price and the low of the previous period, scaled by the low of the previous period. Such rolling regressions sometimes are called “adaptive regressions.”
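These definitions, and the rolling (“adaptive”) regression, can be sketched as follows. The OHLC bars are simulated, and the 120-day window is arbitrary (the study described here uses five years of daily history), so this shows the mechanics rather than reproducing the results:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated OHLC bars, illustration only.
n = 400
close = 100 * np.exp(np.cumsum(0.01 * rng.standard_normal(n)))
high = close * (1 + 0.01 * rng.random(n))
low = close * (1 - 0.01 * rng.random(n))
open_ = low + (high - low) * rng.random(n)

# Proximity variables, as defined in the text:
#   OPH_t = (Open_t - High_{t-1}) / High_{t-1}
#   OPL_t = (Open_t - Low_{t-1})  / Low_{t-1}
oph = (open_[1:] - high[:-1]) / high[:-1]
opl = (open_[1:] - low[:-1]) / low[:-1]
growth_high = (high[1:] - high[:-1]) / high[:-1]   # dependent variable

def rolling_ols(y, x1, x2, window):
    """Coefficients from OLS of y on [1, x1, x2] over a rolling window."""
    coefs = []
    for t in range(window, len(y) + 1):
        X = np.column_stack([np.ones(window), x1[t - window:t], x2[t - window:t]])
        beta, *_ = np.linalg.lstsq(X, y[t - window:t], rcond=None)
        coefs.append(beta)
    return np.array(coefs)

coefs = rolling_ols(growth_high, oph, opl, window=120)
print(coefs.shape)  # one (intercept, OPH, OPL) triple per rolling window
```

Plotting the second and third columns of coefs against time is exactly the kind of trajectory Figure 1 displays for the real data.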

Figure 1 Evidence for Time Varying Coefficients – Estimated Coefficients of OPHt and OPLt Over Study Sample

[Figure: TvaryCoeff – estimated coefficients of OPHt and OPLt over the study sample]

Note the abrupt changes in the values of the coefficients of OPHt and OPLt in 2008.

These plausibly reflect stock market volatility in the Great Recession. After 2010 the value of both coefficients tends to move back to levels seen at the beginning of the study period.

This suggests trajectories influenced by the general climate of risk for investors and their risk preferences.

I am increasingly convinced the influence of these so-called proximity variables is based on heuristics such as “buy when the opening price is greater than the previous period high” or “sell, if the opening price is lower than the previous period low.”

Recall, for example, that the coefficient of OPHt measures the influence of the spread between the opening price and the previous period high on the growth in the daily high price.

The trajectory, shown in the narrow, black line, trends up in the approach to 2007. This may reflect investors’ greater inclination to buy the underlying stocks when the opening price is above the previous period high. But then the market experiences the crisis of 2008, and investors abruptly back off from their eagerness to respond to this “buy” signal. With the onset of the Great Recession, investors become increasingly risk averse to such “buy” signals, only starting to recover their nerve after 2013.

A parallel interpretation of the trajectory of the coefficient of OPLt can be developed based on developments 2008-2009.

Time variation of these coefficients also has implications for out-of-sample forecast errors.

Thus, late 2008, when values of the coefficients of both OPH and OPL make almost vertical movements in opposite directions, is the period of maximum out-of-sample forecast errors. Forecast errors for daily highs, for example, reach a maximum of 8 percent in October 2008. This can be compared with typical errors of less than 0.4 percent for out-of-sample forecasts of daily highs with the proximity variable regressions.

Heuristics

Finally, I recall a German forecasting expert discussing heuristics with an example from baseball. I will try to find his name and give him proper credit. But the idea is that an outfielder trying to catch a flyball does not run calculations involving mass, angle, velocity, acceleration, windspeed, and so forth. Instead, basically, an outfielder runs toward the flyball, keeping it at a constant angle in his vision, so that it falls into his glove at the last second. If the ball starts descending in his vision as he approaches it, it may fall on the ground before him. If it starts to float higher in his perspective as he runs to get under it, it may soar over him, landing further back in the field.

I wonder whether similar arguments can be advanced for the strategy of buying or selling based on spreads between the opening price in a period and the high and low prices of the previous period.

How Did My Forecast of the SPY High and Low Issued January 22 Do?

A couple of months ago, I applied the stock market forecasting approach based on what I call “proximity variables” to forward-looking forecasts – as opposed to “backcasts” testing against history.

I’m surprised now that I look back at this, because I offered a forecast for 40 trading days (a little foolhardy?).

In any case, I offered forecasts for the high and low of the exchange traded fund SPY, as follows:

What about the coming period of 40 trading days, starting from this morning’s (January 22, 2015) opening price for the SPY – $203.99?

Well, subject to qualifications I will state further on here, my estimates suggest the high for the period will be in the range of $215 and the period low will be around $194. Cents attached to these forecasts would be, of course, largely spurious precision.

In my opinion, these predictions are solid enough to suggest that no stock market crash is in the cards over the next 40 trading days, nor will there be a huge correction. Things look to trade within a range not too distant from the current situation, with some likelihood of higher highs.

It sounds a little like weather forecasting.

Well, 27 trading days have transpired since January 22, 2015 – more than half the proposed 40 associated with the forecast.

How did I do?

Here is a screenshot of the Yahoo Finance table showing opening, high, low, and closing prices since January 22, 2015.

[Figure: SPYJan22etpassim – SPY daily opening, high, low, and closing prices since January 22, 2015]

The bottom line – so far, so good. Neither the high nor the low of any trading day has breached my proposed forecasts of $194 for the low and $215 for the high.
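Mechanically, the check is simple. The (low, high) pairs below are hypothetical stand-ins for the Yahoo Finance table, not actual SPY quotes:

```python
# Forecast band from the January 22 post.
forecast_low, forecast_high = 194.0, 215.0

# Hypothetical daily (low, high) pairs standing in for the quote table.
daily_ranges = [
    (199.42, 203.05), (200.81, 206.10), (202.33, 205.50),
    (198.55, 202.94), (201.10, 210.25), (205.00, 211.90),
]

low_breached = any(lo < forecast_low for lo, hi in daily_ranges)
high_breached = any(hi > forecast_high for lo, hi in daily_ranges)
print(low_breached, high_breached)  # False False
```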

Now, I am pleased – a win just out of the gates with the new modeling approach.

However, I would caution readers seeking to use this for investment purposes. This approach recommends shorter term forecasts to focus on the remaining days of the original forecast period. So, while I am encouraged the $215 high has not been breached, despite the hoopla about recent gains in the market, I don’t recommend taking $215 as an actual forecast at this point for the remaining 13 trading days – two or three weeks. Better forecasts are available from the model now.

“What are they?”

Well, there are a lot of moving parts in the computer programs to make these types of updates.

Still, it is interesting and relevant to forecasting practice – just how well do the models perform in real time?

So I am planning a new feature, a periodic update of stock market forecasts, with a look at how well these did. Give me a few days to get this up and running.