
Greenspan, Cox tell Congress that bad data hurt Wall Street's computer models

October 23, 2008 12:00 PM ET

Computerworld - WASHINGTON — Using nature to describe a man-made financial disaster, Alan Greenspan, former chairman of the Federal Reserve, today called the financial-market meltdown a "once in a century tsunami" and explained to a U.S. House committee what he thought went wrong. One of the causes he pointed to was insufficient data.

Greenspan has long praised computer technology as a tool that can be used to limit risks in financial markets. For instance, in 2005, he credited improved computing power and risk-scoring models with making it possible for lenders to extend credit to subprime mortgage borrowers.

But at a hearing held today by the House Committee on Oversight and Government Reform, Greenspan acknowledged that the data fed into financial systems was often a case of garbage in, garbage out.

Business decisions by financial services firms were based on "the best insights of mathematicians and finance experts, supported by major advances in computer and communications technology," Greenspan told the committee. "The whole intellectual edifice, however, collapsed in the summer of last year because the data inputted into the risk management models generally covered only the past two decades — a period of euphoria."

He added that if the risk models had also been built to include "historic periods of stress, capital requirements would have been much higher and the financial world would be in far better shape today, in my judgment."
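Greenspan's testimony did not name a particular model, but a minimal sketch of a historical Value-at-Risk calculation illustrates his point about the data window. The return figures below are made up for illustration only; the comparison simply shows how an estimate drawn from a calm "euphoric" sample understates losses relative to one that also includes a stress regime.

    import numpy as np

    def historical_var(returns, confidence=0.99):
        """Value at Risk taken from the empirical distribution of daily returns."""
        return -np.percentile(returns, 100 * (1 - confidence))

    rng = np.random.default_rng(0)

    # Hypothetical data: a calm, low-volatility sample versus the same
    # sample with a short high-volatility stress period appended.
    calm_only = rng.normal(loc=0.0005, scale=0.01, size=5000)
    stress = rng.normal(loc=-0.002, scale=0.04, size=500)
    with_stress = np.concatenate([calm_only, stress])

    print(f"99% VaR, calm data only:  {historical_var(calm_only):.3%}")
    print(f"99% VaR, stress included: {historical_var(with_stress):.3%}")

A model calibrated only on the calm sample reports a much smaller worst-case daily loss, which in turn feeds into lower capital requirements — the dynamic Greenspan described.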

And that wasn't the only bad data being crunched by IT systems. Christopher Cox, chairman of the Securities and Exchange Commission, told the committee that credit rating agencies gave AAA ratings to mortgage-backed securities that didn't deserve them. "These ratings not only gave false comfort to investors, but also skewed the computer risk models and regulatory capital computations," Cox said in written testimony.

Financial services firms are increasingly using high-performance computing systems to calculate investment risks. In his testimony, Cox also pointed to a 2004 decision by the SEC to loosen its capital rules and instead rely on the firms' own computer models for assessing risks — a decision that essentially outsourced regulatory duties to the Wall Street firms themselves.

It was unclear from Cox's testimony just what sort of regulatory changes he was suggesting. But he said that the SEC is now engaged in "aggressive law enforcement." As part of that effort, Cox said, the agency's IT department is working with its enforcement division "to create a common database of trading information, audit trail data and credit-default-swaps clearing data."
