The mortgage services industry does not lack state-of-the-art analytics tools. From automated underwriting software used in loan origination to the statistical modelling programs that value mortgage-backed securities, analytics played a central role in keeping the recent housing boom alive. How ironic that those tools did not assess the risks that mattered most.

Improved algorithms and greater computing power have made risk analysis more accurate by allowing lenders to crunch more data and process it faster than ever. So why did those programs not help lenders predict the current subprime loan crisis? Inadequate models, bad assumptions and poor judgment all contributed to the market's collapse.

On the front end, analytics facilitated the boom in non-traditional mortgage lending by helping lenders create new loans faster, approving them in minutes. Such software was well established in the prime lending markets through institutions such as Freddie Mac but was relatively new to the non-conventional mortgage markets. It vastly expanded the number of loans a lender could originate per month, including loans to the subprime market, which serves higher-risk borrowers. Unfortunately, the risk-scoring models used to approve such loans did not include macroeconomic factors, such as the effect of flat or negative home price growth.
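
To see what was missing, consider a toy scorecard in Python (all inputs, weights and thresholds here are hypothetical, for illustration only): the borrower-level model scores credit history, debt load and equity, but has no term at all for the housing market itself.

```python
# Hypothetical risk scorecard, for illustration -- not any lender's actual model.

def borrower_risk_score(fico: int, debt_to_income: float, loan_to_value: float) -> float:
    """Toy borrower-level default-risk score (higher = riskier).
    Like the boom-era models, it uses loan-level inputs only."""
    score = 0.0
    score += max(0, 680 - fico) * 0.01        # penalise weak credit history
    score += max(0.0, debt_to_income - 0.36)  # penalise stretched budgets
    score += max(0.0, loan_to_value - 0.80)   # penalise thin equity
    return score

def risk_score_with_macro(fico: int, dti: float, ltv: float,
                          annual_home_price_growth: float) -> float:
    """The same score with the missing macroeconomic term added:
    flat or falling prices raise risk across the whole book."""
    macro_penalty = max(0.0, 0.03 - annual_home_price_growth) * 5.0
    return borrower_risk_score(fico, dti, ltv) + macro_penalty

# The same borrower looks very different once the market stalls:
print(risk_score_with_macro(640, 0.42, 0.95, annual_home_price_growth=0.10))   # boom: 0.61
print(risk_score_with_macro(640, 0.42, 0.95, annual_home_price_growth=-0.05))  # bust: 1.01
```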

Granted, no lender was going to reject an individual loan based on the possibility that home price growth might flatten or go negative in the future. But the risk of sliding home prices should have been factored into overall lending policies, especially as prices rose to the point where lending standards had to be lowered to qualify buyers and keep the bubble going. "It should be an important strategic concern embedded in credit policy and lending operations," says Mike Beardsell, director of risk model analytics at Loan Performance.

It was not.

Lenders also found a way around waiting for professional appraisals, which during the boom often held up deals. Instead, many used automated valuation models (AVMs), appraisal-emulation programs that assess a property's value based on prior sales and other variables, including in some cases the current asking prices for similar properties. At the time, housing prices were rising at unprecedented rates. This created a pricing feedback loop in which ever-higher asking prices validated ever-higher assessments.
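
In rough outline, a comparable-sales valuation works something like the sketch below (a deliberate simplification; real AVMs fit statistical models over far more variables). Note how admitting current asking prices as comparables bakes the feedback loop in: sellers anchor asking prices to recent valuations, which are in turn fed by asking prices.

```python
from statistics import median

def toy_avm(subject_sqft: float, comps: list[dict]) -> float:
    """Estimate a property's value from comparable properties.
    Each comp is {'price': ..., 'sqft': ...}; 'price' may be a recent
    sale *or* a current asking price -- the door through which the
    feedback loop entered."""
    price_per_sqft = median(c["price"] / c["sqft"] for c in comps)
    return subject_sqft * price_per_sqft

comps = [
    {"price": 425_000, "sqft": 1_700},  # recent sale
    {"price": 450_000, "sqft": 1_750},  # recent sale
    {"price": 489_000, "sqft": 1_800},  # current asking price
]
print(f"Estimated value: ${toy_avm(1_760, comps):,.0f}")  # ~$452,571
```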

The main benefit of AVMs was speed. "AVM makes underwriting faster. You can get approvals within 15 minutes," says Christopher Cagan, director of research and analytics at American CoreLogic. However, AVMs did not capture everything. Without a live inspection, lenders could not know the true condition, or true value, of an individual property.

As housing prices went out of sight, lenders began dropping their standards, creating new products designed to keep borrowers from being priced out of the market and to keep the lending boom going. As competition for customers increased, getting deals closed fast became paramount. Borrowers were offered new, riskier loan products with features such as no money down, no income verification, interest-only payments and even negative amortisation loans.

Lenders ignored the risks because in a rising market they could not lose. "Home prices were growing so fast that even if the borrower got into trouble, it would rarely lead to a loss," says Beardsell. In some cases, lenders relied on only the applicant's credit score to approve the loan. Brokers then sold bundles of such mortgages into the secondary market.

"If you had to pinpoint one area where analytics really was a factor, it would be the rating agencies" that rated mortgage-backed securities, says Beardsell. These "innovative" mortgage products presented a challenge. The agencies' statistical models, which relied on historical records of how traditional mortgage loans performed, had to be adjusted. But "the market was changing faster than the models could keep up," argues Beardsell. And the users of those models made a fatal assumption.

In the subprime market, many borrowers could not afford the payments on adjustable rate loans two or three years later, once the low teaser rates expired. The reset rates on those loans created large rises in monthly payments, and some borrowers are now seeing increases of $1,000 or more a month. Lenders were fully aware that borrowers could not afford those payments. But in a booming housing market where values were increasing at double-digit rates, lenders just assumed that borrowers would always have more than enough equity to refinance later.
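
The arithmetic of a reset is straightforward to reconstruct. The figures below are illustrative, not drawn from any specific loan: a $400,000 two-year ARM amortised over 30 years, with a 4% teaser rate resetting to 8%, produces roughly the $1,000 monthly jump described above.

```python
def monthly_payment(balance: float, annual_rate: float, months: int) -> float:
    """Standard fully amortising payment: B * r / (1 - (1 + r)**-n)."""
    r = annual_rate / 12
    return balance * r / (1 - (1 + r) ** -months)

def balance_after(balance: float, annual_rate: float, payment: float, months: int) -> float:
    """Principal still owed after `months` of payments."""
    r = annual_rate / 12
    return balance * (1 + r) ** months - payment * ((1 + r) ** months - 1) / r

loan, teaser_rate, reset_rate = 400_000, 0.04, 0.08  # illustrative terms

teaser_pmt = monthly_payment(loan, teaser_rate, 360)          # ~$1,910/month
remaining = balance_after(loan, teaser_rate, teaser_pmt, 24)  # ~$385,600 still owed
reset_pmt = monthly_payment(remaining, reset_rate, 360 - 24)  # ~$2,880/month

print(f"Teaser payment: ${teaser_pmt:,.0f}")
print(f"Reset payment:  ${reset_pmt:,.0f} (+${reset_pmt - teaser_pmt:,.0f}/month)")
```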

Those mortgages and others were mixed together into pools that were "securitised," rated and distributed to investors, who often borrowed money to purchase them. The value was determined by statistical models. And those models assumed that home values would continue to rise. In fact, such an assumption was critical to the marketing of the structured securities based on those mortgages. "The economics of the securitisation would be negatively impacted by a zero growth assumption," Beardsell says, never mind a negative one.
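
A stripped-down version of that sensitivity is easy to demonstrate. In the hypothetical calculation below (pool size, loan-to-value, default rate and sale costs are all invented for illustration), expected loss depends on the home-price-appreciation assumption only through the collateral's projected value: with prices assumed to keep rising, defaulted loans recover in full and the pool looks nearly riskless; at zero or negative growth, losses appear.

```python
def expected_pool_loss(pool_balance: float, ltv: float, default_rate: float,
                       annual_hpa: float, years: int = 2,
                       sale_costs: float = 0.06) -> float:
    """Toy expected loss on a mortgage pool: a defaulted loan loses only
    the shortfall between the balance owed and the projected net sale
    price of the house backing it."""
    home_value = pool_balance / ltv                     # collateral value today
    projected = home_value * (1 + annual_hpa) ** years  # value at default
    recovery = projected * (1 - sale_costs)             # net of sale costs
    loss_per_dollar = max(0.0, pool_balance - recovery) / pool_balance
    return pool_balance * default_rate * loss_per_dollar

# The same $100m pool under three home-price assumptions:
for hpa in (0.10, 0.00, -0.05):
    loss = expected_pool_loss(100_000_000, ltv=0.95, default_rate=0.10, annual_hpa=hpa)
    print(f"HPA {hpa:+.0%}: expected loss ${loss:,.0f}")
```

Under the +10% assumption the expected loss is zero; at 0% it is small; at -5% it is an order of magnitude larger, which is why a zero-growth assumption was so unwelcome in the deal economics.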

But even if the models had factored in the risk of declining property values, it would not have mattered. Business decision-makers were not going to restrict lending in a hot market.

"People said, 'We've got to do it because if we don't, someone else will, and they'll take the money off the table,'" says Dennis Santiago, CEO at Institutional Risk Analytics. Once the loan was sold and off the books, it was not the loan initiator's problem anymore.

Then the boom ended, property values declined, teaser rates reset to higher levels, and delinquencies hit record levels. "It was the perfect storm," says Beardsell.

Earlier this year, Cagan used one of the tools that helped to facilitate the boom to predict its fallout. While AVMs can make mistakes on individual properties, they do a better job when applied across groups of properties, such as those that serve as collateral behind the securities now held by nervous banks and investment firms.

Cagan ran the AVM model against a database of properties backing $326bn (£162bn) in loans made from 2004 to 2006 and used it to predict foreclosures on adjustable rate mortgages as the payments on those loans reset to higher, current market rates. He estimates that in the next six to seven years, investors will lose $112bn (£56bn) of value, as 7% of adjustable rate mortgages, 32% of teaser loans and 12% of subprime loans default.
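
Cagan's actual methodology is proprietary; the sketch below only gestures at the general shape of such a portfolio run: value each property, compare the value with the balance owed, and roll projected losses up by loan type. The loan records and loss logic are invented; only the per-type default rates echo the figures quoted above.

```python
# Hypothetical portfolio roll-up in the spirit of the analysis described
# above; the loan data are invented for illustration.

loans = [
    {"type": "teaser",   "balance": 300_000, "avm_value": 270_000},
    {"type": "subprime", "balance": 250_000, "avm_value": 260_000},
    {"type": "arm",      "balance": 400_000, "avm_value": 430_000},
]

# Default rates per loan type, echoing the study's figures.
DEFAULT_RATE = {"arm": 0.07, "teaser": 0.32, "subprime": 0.12}

projected_loss = 0.0
for loan in loans:
    # A lender's loss is driven by negative equity at the time of default.
    shortfall = max(0.0, loan["balance"] - loan["avm_value"])
    projected_loss += DEFAULT_RATE[loan["type"]] * shortfall

print(f"Projected loss on this (tiny, invented) pool: ${projected_loss:,.0f}")
```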

Could analytics have done more to prevent the crisis? "The tools to assess the risks and trade-offs were never better," says Larry Rosenberger, vice president of research and development at Fair Isaac, or FICO, whose credit-score services are used by many lenders. "It was the lending practices that failed. They put too much emphasis on growth over risk control."

In the future, underwriters could include risk factors for interest rate and property value changes. Ultimately, however, there is a limit to the role analytics can play. "Analytics are a tool for the human mind," says Cagan. "They cannot replace human judgement."