Greed, analytics and the mortgage lending crisis

Reliance on analytics and competitive pressure may have contributed significantly to the mortgage crisis. Now that the boom has broken, what next?


The mortgage services industry does not lack state-of-the-art analytics tools. From automated underwriting software used in loan origination to the statistical modelling programs that value mortgage-backed securities, analytics played a central role in keeping the recent housing boom alive. How ironic that those tools did not assess the risks that mattered most.

Improved algorithms and greater computing power have made risk analysis more accurate by allowing lenders to crunch more data and process it faster than ever. So why did those programs not help lenders predict the current subprime loan crisis? Inadequate models, bad assumptions and poor judgment all contributed to the market's collapse.

On the front end, analytics facilitated the lending boom in non-traditional mortgages by helping lenders originate new loans faster, approving them in minutes. Such software was well established in the prime lending markets through institutions such as Freddie Mac but was relatively new to the non-conventional mortgage markets. It vastly expanded the number of loans a lender could originate per month, including loans to the subprime market, which serves higher-risk borrowers. Unfortunately, the risk scoring models used to grant such loans did not include macroeconomic factors, such as the effect of flat or negative home price growth.
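To make that gap concrete, here is a minimal, hypothetical sketch of a borrower-level scoring rule of the kind described above. The function name, cut-offs and point weights are invented for illustration, not any vendor's actual model; the point is simply that nothing in such a rule responds to flat or falling home prices.

```python
# Hypothetical borrower-level scoring rule -- cut-offs and weights are
# invented for illustration, not taken from any real underwriting system.
# Note what never appears: expected home price growth or any other
# macroeconomic input, so falling prices cannot affect the decision.

def approve_loan(credit_score: int, ltv: float, dti: float) -> bool:
    """Approve when enough borrower-level criteria are met."""
    points = 0
    points += 2 if credit_score >= 620 else 0   # credit history
    points += 1 if ltv <= 0.95 else 0           # loan-to-value ratio
    points += 1 if dti <= 0.45 else 0           # debt-to-income ratio
    return points >= 3

print(approve_loan(credit_score=640, ltv=0.90, dti=0.40))  # True
print(approve_loan(credit_score=640, ltv=1.00, dti=0.50))  # False
```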

Granted, no lender was going to reject an individual loan on the possibility that home price growth might flatten or turn negative in the future. But the risk of sliding home prices should have been factored into overall lending policies, especially as prices rose to the point where lending standards had to be lowered to qualify buyers and keep the bubble going. Mike Beardsell, director of risk model analytics at Loan Performance, states: "It should be an important strategic concern embedded in credit policy and lending operations."

It was not.

Lenders also found a way around waiting for professional appraisals, which during the boom often held up deals. Instead, many used automated valuation models (AVMs), appraisal-emulation programs that assess a property's value based on prior sales and other variables, including in some cases the current asking prices for similar properties. At the time, housing prices were rising at unprecedented rates. This created a pricing feedback loop in which ever-higher asking prices validated ever-higher assessments.
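The loop can be illustrated with a toy model. The `avm_estimate` function below, its blend weight and the prices fed into it are all assumptions made for the example, not a description of any real AVM; the sketch only shows how listings priced off the latest estimates push each new assessment higher.

```python
# Toy illustration of the pricing feedback loop described above. The
# blend weight, prices and 5% mark-up are assumptions for the example,
# not parameters of any real automated valuation model.

def avm_estimate(comps, asks, ask_weight=0.3):
    """Blend recent comparable sale prices with current asking prices."""
    comps_avg = sum(comps) / len(comps)
    asks_avg = sum(asks) / len(asks)
    return (1 - ask_weight) * comps_avg + ask_weight * asks_avg

comps = [300_000, 310_000, 320_000]   # recent sales of similar properties
asks = [340_000, 350_000]             # current asking prices nearby

for year in range(3):
    estimate = avm_estimate(comps, asks)
    print(f"year {year}: AVM estimate = {estimate:,.0f}")
    comps = comps[1:] + [estimate]    # a home sells near the assessed value and becomes the newest comp
    asks = [a * 1.05 for a in asks]   # new listings ask a little more each round
```

Each pass through the loop produces a higher estimate than the last, because the estimates themselves become the comparables that justify the next round of asking prices.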

The main benefit of AVMs was speed. "AVM makes underwriting faster. You can get approvals within 15 minutes," claims Christopher Cagan, director of research and analytics at American CoreLogic. However, AVMs did not capture everything. Without a live inspection, lenders could not know the true condition, or true value, of an individual property.

As housing prices went out of sight, lenders began dropping their standards, creating new products designed to keep borrowers from being priced out of the market and to keep the lending boom going. As competition for customers increased, getting deals closed fast became paramount. Borrowers were offered new, riskier loan products with features such as no money down, no income verification, interest-only payments and even negative amortisation loans.
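Negative amortisation is the starkest of those features, and a few lines of arithmetic show why: when the minimum payment is smaller than the interest accruing each month, the unpaid interest is added to the principal and the balance grows. The rate and payment below are hypothetical figures chosen only to show the mechanism.

```python
# Hypothetical negative amortisation example: the rate and minimum
# payment are invented figures, used only to show how the balance grows.

balance = 300_000.0     # amount borrowed
annual_rate = 0.07      # note rate
min_payment = 1_200.0   # teaser minimum payment, below the monthly interest due

for month in range(12):
    interest = balance * annual_rate / 12   # interest accrued this month
    balance += interest - min_payment       # unpaid interest is capitalised
print(f"Balance after one year: {balance:,.0f}")   # more than the 300,000 borrowed
```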

Lenders ignored the risks because in a rising market they could not lose. "Home prices were growing so fast that even if the borrower got into trouble, it would rarely lead to a loss," says Beardsell. In some cases, lenders relied on only the applicant's credit score to approve the loan. Brokers then sold bundles of such mortgages into the secondary market.
