The Limitations of Econometric Computer Models


By: Thomas E. Brewton

Computer models were the tools used to inflate the subprime mortgage bubble that precipitated the collapse of our financial markets. And the same sorts of inherently faulty computer models are now generating proclamations that President Obama’s near-trillion-dollar stimulus and mortgage protection programs will create millions of new jobs and restore the economy to prosperity.

The old saw is “garbage in, garbage out.” When a computer model is too complex and attempts to cover too wide a scope, its output is worse than useless, and disaster is just around the corner.

Econometric computer models were at the root of the 1998 collapse of John Meriwether’s hedge fund Long-Term Capital Management (LTCM). That event led the Federal Reserve to pull out all the stops and flood the market with liquidity to head off a possible worldwide financial meltdown. LTCM had too many huge financial commitments to and from the world’s major financial institutions. Allowing it to fail, so the Fed reasoned, would risk system-wide disaster.

Mr. Meriwether’s computer models built in every risk that humans could then foresee in constructing very complex trades involving very complex financial instruments. He and his colleagues were as intelligent and as well informed as any group of securities traders in the world. They were former “masters of the universe” at Salomon Brothers when, in the 1980s, it was the world’s largest non-governmental trader of securities. And Mr. Meriwether sat at the top of the Salomon Brothers trading heap.

Yet an unanticipated confluence of market conditions effected a swift collapse of LTCM’s house of cards.

A nearly identical pattern emerged in our present financial collapse, beginning with the subprime mortgage meltdown. Some of the world’s most sophisticated computer jocks modeled some of the most complex financial instruments in history. A vast network of risk swaps among the world’s major financial players was constructed to hedge against all foreseeable risks. The modelers, however, failed to foresee combinations of circumstances that would quickly undermine the risk protections built into their models.

Once again, unanticipated confluences of market conditions kicked the slats from under this vast financial structure.

To dig out of this economic debris, the nation is rolling the dice with the largest financial wager ever made: President Obama’s stimulus bill.

Why are we once again placing our bets on the same school of econometric-model economists who gave us the stagflation of the 1970s, economists who are using the same sorts of computer models that led to disaster in 1998 with Long-Term Capital Management and in 2008 with the subprime meltdown?

In a free-market economy, hundreds of millions of individuals can make better and faster economic decisions, drawing on far more of the available information, than can any group of intellectual planners working with the world’s fastest supercomputers. Planners cannot possibly gather all available information in real time, nor can they effectively integrate it into policy enactments that must apply uniformly to the entire population.



Thomas E. Brewton is a staff writer for the New Media Alliance, Inc. The New Media Alliance is a non-profit (501c3) national coalition of writers, journalists and grass-roots media outlets.

His weblog is THE VIEW FROM 1776
http://www.thomasbrewton.com/

Email comments to viewfrom1776@thomasbrewton.com

