Links – February 2015

I buy into the “hedgehog/fox” story when it comes to forecasting: you have to be dedicated to the numbers, but still cast a wide net. Here are some fun stories, relevant facts, positive developments, and concerns – the first Links post of 2015.

Cool Facts and Projections

How the world’s population has changed – we all need to keep track of this: 9.6 billion souls by 2050, and Nigeria’s population projected to outstrip that of the US.


What does the world eat for breakfast?

Follow a Real New York Taxi’s Daily Slog – 30 days, 30 random cabbie journeys based on actual location data

Information Technology

Could Microsoft’s HoloLens Be The Real Deal?


I’ll Be Back: The Return of Artificial Intelligence



Why tomorrow’s technology needs a regulatory revolution Fascinating article. References genome sequencing and frontier biotech, such as,

Jennifer Doudna, for instance, is at the forefront of one of the most exciting biomedical advances in living memory: engineering the genomes not of plants, but of people. Her cheap and easy Crispr technology holds out the promise that anybody with a gene defect could get that problem fixed, on an individual, bespoke basis. No more one-size-fits all disease cures: everything can now be personalized. The dystopian potential here, of course, is obvious: while Doudna’s name isn’t Frankenstein, you can be sure that if and when her science gains widespread adoption, the parallels will be hammered home ad nauseam.

Doudna is particularly interesting because she doesn’t dismiss fearmongers as anti-science trolls. While she has a certain amount of control over what her own labs do, her scientific breakthrough is in the public domain, now, and already more than 700 papers have been published in the past two years on various aspects of genome engineering. In one high-profile example, a team of researchers found a way of using Doudna’s breakthrough to efficiently and predictably cause lung cancer in mice.

There is more on Doudna’s Innovative Genomics Initiative here, but the initially linked article on the need for a regulatory breakthrough goes on to make some interesting observations about Uber and Airbnb, both of which have thrived by ignoring regulations in various cities, or even flagrantly breaking the law.


Is China Preparing for Currency War? Provocative header for Bloomberg piece with some real nuggets, such as,

Any significant drop in the yuan would prompt Japan to unleash another quantitative-easing blitz. The same goes for South Korea, whose exports are already hurting. Singapore might feel compelled to expand upon last week’s move to weaken its dollar. Before long, officials in Bangkok, Hanoi, Jakarta, Manila, Taipei and even Latin America might act to protect their economies’ competitiveness…

There’s obvious danger in so many economies engaging in this race to the bottom. It will create unprecedented levels of volatility in markets and set in motion flows of hot money that overwhelm developing economies, inflating asset bubbles and pushing down bond rates irrationally low. Consider that Germany’s 10-year debt yields briefly fell below Japan’s (they’re both now in the 0.35 percent to 0.36 percent range). In a world in which the Bank of Japan, the European Central Bank and the Federal Reserve are running competing QE programs, the task of pricing risk can get mighty fuzzy.

Early Look: Deflation Clouds Loom Over China’s Economy

The [Chinese] consumer-price index, a main gauge of inflation, likely rose only 0.9% from a year earlier, according to a median forecast of 13 economists surveyed by the Wall Street Journal

China’s Air Pollution: The Tipping Point


Energy and Renewables

Good News About How America Uses Energy A lot more solar and renewables, increasing energy efficiency – all probably contributors to the Saudi move to push oil prices back toward historic lows and wean consumers from green energy and conservation.

Nuclear will die. Solar will live Companion piece to the above. Noah Smith curates Noahpinion, one of the best and quirkiest economics blogs out there. Here’s Smith on the reason nuclear is toast (in his opinion) –

There are three basic reasons conventional nuclear is dead: cost, safety risk, and obsolescence risk. These factors all interact.            

First, cost. Unlike solar, which can be installed in small or large batches, a nuclear plant requires an absolutely huge investment. A single nuclear plant can cost on the order of $10 billion U.S. That is a big chunk of change to plunk down on one plant. Only very large companies, like General Electric or Hitachi, can afford to make that kind of investment, and it often relies on huge loans from governments or from giant megabanks. Where solar is being installed by nimble, gritty entrepreneurs, nuclear is still forced to follow the gigantic corporatist model of the 1950s.

Second, safety risk. In 1945, the U.S. military used nuclear weapons to destroy Hiroshima and Nagasaki, but a decade later, these were thriving, bustling cities again. Contrast that with Fukushima, site of the 2011 Japanese nuclear meltdown, where whole towns are still abandoned. Or look at Chernobyl, almost three decades after its meltdown. It will be many decades before anyone lives in those places again. Nuclear accidents are very rare, but they are also very catastrophic – if one happens, you lose an entire geographical region to human habitation.

Finally, there is the risk of obsolescence. Uranium fission is a mature technology – its costs are not going to change much in the future. Alternatives, like solar, are young technologies – the continued staggering drops in the cost of solar prove it. So if you plunk down $10 billion to build a nuclear plant, thinking that solar is too expensive to compete, the situation can easily reverse in a couple of years, before you’ve recouped your massive fixed costs.
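Smith’s obsolescence argument is essentially a learning-curve (Wright’s law) argument: solar costs fall a roughly fixed fraction with every doubling of cumulative installed capacity, while mature nuclear costs stay flat. The sketch below is illustrative only – the 20% learning rate and the capacity figures are assumptions in the range often cited for solar PV, not data from the article.

```python
import math

# Illustrative Wright's-law sketch: cost falls a fixed fraction each time
# cumulative installed capacity doubles. All numbers here are assumptions.

def learning_curve_cost(initial_cost, initial_capacity, capacity, learning_rate=0.20):
    """Cost per watt after cumulative capacity grows from initial_capacity to capacity."""
    doublings = math.log2(capacity / initial_capacity)
    return initial_cost * (1 - learning_rate) ** doublings

# Suppose modules cost $1.00/W at 100 GW cumulative capacity.
# Three doublings later (800 GW), cost is 0.8^3 of the starting point:
cost = learning_curve_cost(1.00, 100, 800)
print(round(cost, 3))  # 0.512
```

On these (hypothetical) numbers, solar cost roughly halves over three capacity doublings – which is exactly the reversal risk Smith says a $10 billion fixed-cost nuclear plant cannot hedge against.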

Owners of the wind Greenpeace blog post on Denmark’s extraordinary and successful embrace of wind power.

What’s driving the price of oil down? Econbrowser is always a good read on energy topics, and this post is no exception. Demand factors tend to be downplayed in favor of stories about Saudi production quotas.

Forecasting Controversy Swirling Around Computer Models and Forecasts

I am intrigued by Fabius Maximus’ We must rely on forecasts by computer models. Are they reliable?

This is a broad, but deeply relevant, question.

With the increasing prominence of science in public policy debates, the public’s beliefs about theories also have effects. Playing to this larger audience, scientists have developed an effective tool: computer models making bold forecasts about the distant future. Many fields have been affected, such as health care, ecology, astronomy, and climate science. With their conclusions amplified by activists, long-term forecasts have become a powerful lever to change public opinion.

It’s true. Large scale computer models are vulnerable to confirmation bias in their construction and selection – the testing of drugs being one example. There are issues of measuring their reliability and – more fundamentally – of validation (e.g., falsification).

Peer-review has proven quite inadequate to cope with these issues (which lie beyond the concerns about peer-review’s ability to cope with even standard research). A review or audit of a large model often requires a man-year or more of work by a multidisciplinary team of experts – the kind of audit seldom done even on projects of great public concern.

Of course, FM is sort of famous, in my mind, for their critical attitude toward global warming and climate change.

And they don’t lose an opportunity to score points about climate science, citing the Georgia Institute of Technology scientist Judith Curry.

Dr. Curry is author of a recent WSJ piece The Global Warming Statistical Meltdown

At the recent United Nations Climate Summit, Secretary-General Ban Ki-moon warned that “Without significant cuts in emissions by all countries, and in key sectors, the window of opportunity to stay within less than 2 degrees [of warming] will soon close forever.” Actually, this window of opportunity may remain open for quite some time. A growing body of evidence suggests that the climate is less sensitive to increases in carbon-dioxide emissions than policy makers generally assume—and that the need for reductions in such emissions is less urgent.

A key issue in this furious and emotionally-charged debate is discussed in my September blogpost CO2 Concentrations Spiral Up, Global Temperature Stabilizes – Was Gibst?

…carbon dioxide (CO2) concentrations continue to skyrocket, while global temperature has stabilized since around 2000.

The scientific consensus (excluding Professor Curry and the climate change denial community) is that the oceans currently are absorbing the excess heat, but this cannot continue forever.

If my memory serves me (and I don’t have time this morning to run down the link), backtesting of the global climate models (GCMs) in a recent IPCC methodology publication basically crashed and burned – yet the authors blithely moved on to reiterate the “consensus.”

At the same time, the real science behind climate change – the ice cores, for example, retrieved from long-standing glacial and snow-and-ice deposits – does show abrupt change may be possible: within a decade or two, there might be regime shifts in global climate.

I am not going to draw conclusions at this point, wishing to carry on this thread with some discussion of macroeconomic models and forecasting.

But I leave you today with my favorite viewing of Balog’s “Chasing Ice.”

High Frequency Trading – 2

High Frequency Trading (HFT) occurs faster than human response times – often quoted as 750 milliseconds. It is machine or algorithmic trading, as Sean Gourley’s “High Frequency Trading and the New Algorithmic Ecosystem” highlights.

This is a useful introductory video.

It mentions Fixnetix’s field programmable array chip and new undersea cables designed to shave milliseconds off trading speeds from Europe to the US and elsewhere.
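The latency stakes behind those cables are simple physics: light in optical fiber travels at roughly two-thirds of its vacuum speed, so a straighter route saves milliseconds. The route lengths below are rough assumptions for illustration, not the specifications of any actual cable.

```python
# Back-of-envelope one-way latency through optical fiber.
# Signal speed in fiber ≈ c divided by the glass refractive index (~1.47).

C_KM_PER_MS = 299_792.458 / 1000  # speed of light: ≈ 299.8 km per millisecond
FIBER_INDEX = 1.47                # typical refractive index of optical fiber

def one_way_latency_ms(route_km):
    """Milliseconds for light to traverse route_km of fiber, one way."""
    return route_km * FIBER_INDEX / C_KM_PER_MS

# Hypothetical New York–London routes: an older, longer path vs. a
# straighter great-circle path (distances are illustrative assumptions).
old_route_ms = one_way_latency_ms(6_500)
new_route_ms = one_way_latency_ms(5_600)
print(round(old_route_ms - new_route_ms, 2), "ms saved one-way")
```

On these assumed distances the straighter cable saves a few milliseconds each way – several lifetimes at HFT timescales, which is why such projects attract hundreds of millions of dollars of investment.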

Also, Gourley refers to dark pool pinging, which tries to determine the state of large institutional orders by “sniffing them out,” using this knowledge to capture (almost) risk-free arbitrage by trading on different exchanges in milliseconds or faster. Institutional investors using slower, not-so-smart algorithms lose.

Other HFT tactics include “quote stuffing”, “smoking”, and “spoofing.” Of these, stuffing may be the most damaging. It limits access of slower traders by submitting large numbers of orders and then canceling them very quickly. This leads to order congestion, which may create technical trouble and lagging quotes.

Smoking and spoofing strategies, on the other hand, try to manipulate other traders to participate in trading at unfavorable moments, such as just before the arrival of relevant news.
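The quote-stuffing pattern described above – bursts of orders cancelled almost immediately – can be flagged crudely by watching the cancel-to-add ratio in short time windows. This sketch is purely illustrative: the window size, thresholds, and message format are my assumptions, not any exchange’s actual surveillance rule.

```python
from collections import Counter

# Each message is (timestamp_ms, action), where action is "add" or "cancel".
def flag_stuffing_windows(messages, window_ms=100, min_msgs=50, min_cancel_to_add=0.9):
    """Return time windows with a heavy message burst where nearly every
    added order is also cancelled (a crude quote-stuffing signature)."""
    totals, adds, cancels = Counter(), Counter(), Counter()
    for ts, action in messages:
        w = ts // window_ms
        totals[w] += 1
        if action == "add":
            adds[w] += 1
        elif action == "cancel":
            cancels[w] += 1
    return sorted(w for w in totals
                  if totals[w] >= min_msgs
                  and adds[w] > 0
                  and cancels[w] / adds[w] >= min_cancel_to_add)

# Synthetic burst: 30 orders added and immediately cancelled inside one
# 100 ms window gets flagged; a quiet stream does not.
burst = [(5, "add")] * 30 + [(6, "cancel")] * 30
print(flag_stuffing_windows(burst))  # [0]
```

Real surveillance is far subtler – stuffers randomize timing and spread activity across venues – but the order-congestion mechanism is the same one this toy detector keys on.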

Here are some more useful links on this important development and the technological arms race that has unfolded around it.

Financial black swans driven by ultrafast machine ecology Key research on ultrafast black swan events

Nanosecond Trading Could Make Markets Go Haywire Excellent Wired article

High-Frequency Trading and Price Discovery

A defense of HFT, on the basis that HFTs trade (buy or sell) in the direction of permanent price changes and against transitory pricing errors, arguing that the benefits outweigh the adverse selection faced by HFT liquidity-supplying (non-marketable) limit orders.

The Good, the Bad, and the Ugly of Automated High-Frequency Trading tries to strike a balance, but tilts toward a critique

Has HFT seen its heyday? I read at one and the same time that HFT profits per trade are dropping, that some high frequency trading companies report lower profits or are shutting their doors, and yet that 70 percent of the trades on the New York Stock Exchange are the result of high frequency trading.

My guess is that HFT is a force to be dealt with, and if financial regulators are put under restraint by the new US Congress, we may see exotic new forms flourishing in this area.