By David Lykken
Special to MPA
I've been thinking a lot lately about credit standards, and whether it is a good idea to lower them again. On the one hand, the economy seems to be recovering, and the industry could use an injection of new homeowners. On the other hand, we aren't too far removed from the housing crisis, and we should be careful not to find ourselves in the same mess all over again.
A few weeks ago, I was speaking with my business partner Andy Schell, and he brought up an interesting argument for why we should lower credit standards: big data. The statistical capabilities and access to information we have now vastly exceed what we had before the recession. With the business intelligence we possess, we can more precisely and effectively evaluate who will be able to bear a mortgage and who won't.
Of course, all of us use some level of data when deciding who should be approved for loans and who shouldn't. But, many times, it still comes down to a judgment call. There is more relevant data we aren't using, and there are more sophisticated techniques available for interpreting it.
Why aren't we adopting these concepts in the mortgage industry to gain a greater understanding of who should qualify for loans? In the future, I think it's quite possible that we can harness the available data to increase approvals and, at the same time, decrease defaults. We just need to place a little more focus on the numbers, and a little less on our intuition.
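To make the idea concrete, here is one way to picture the kind of data-driven scoring being described: a simple logistic model that turns borrower attributes into an estimated default probability. Everything in this sketch — the features, the coefficient values, the function name — is a hypothetical illustration, not an actual underwriting model; a real lender would fit these weights to historical loan-performance data.

```python
import math

def default_probability(dti, late_payments, years_history, weights=None):
    """Estimate a borrower's probability of default with a logistic model.

    The feature set and coefficients are hypothetical, chosen only to
    illustrate the technique; a production model would be fit to data.
    """
    if weights is None:
        # Illustrative coefficients: higher debt-to-income and more late
        # payments raise estimated risk; a longer credit history lowers it.
        weights = {
            "intercept": -3.0,
            "dti": 4.0,            # debt-to-income ratio (0.0 to 1.0)
            "late_payments": 0.8,  # count of late payments on record
            "years_history": -0.1, # years of credit history
        }
    z = (weights["intercept"]
         + weights["dti"] * dti
         + weights["late_payments"] * late_payments
         + weights["years_history"] * years_history)
    # Logistic function maps the linear score to a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

# A borrower with low debt-to-income and a clean record scores as low risk,
# while a heavy debt load and several late payments score as high risk.
low_risk = default_probability(dti=0.20, late_payments=0, years_history=12)
high_risk = default_probability(dti=0.55, late_payments=4, years_history=2)
```

The point of a model like this is exactly the trade-off described above: instead of a yes/no judgment call, every applicant gets a continuous risk estimate, and the lender can set the approval cutoff wherever the data says defaults become unacceptable.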