How one firm is overcoming racial bias in the mortgage industry

It's using a tool many of us disparage to achieve its aim

For many lay observers, it’s hard to muster genuine enthusiasm for the growing use of artificial intelligence (AI) in tasks once assigned to humans.

Yet at San Francisco-based HouseCanary – a self-described technology and data-forward national real estate brokerage – the technology is being applauded for removing bias in the appraisal process. Traditional appraisals have come under fire in recent years for providing insufficient property information at best and undervaluing minority-owned homes relative to white-owned homes at worst. 

To test the efficacy of its own technology, HouseCanary officials conducted a statistical study using a recent Freddie Mac report on appraisal bias as a guide. The aim: To test the accuracy of the brokerage firm’s in-house, automated valuation tools in appraising homes in minority neighborhoods.

The upshot: “No evidence of racial bias exists in HouseCanary’s automated comp and AVM (automated valuation models) tools,” company officials concluded in their study, a copy of which was provided to Mortgage Professional America. “This stands in stark contrast to the results of Freddie Mac’s examination of traditional appraisals, which found that ‘Black and Latino applicants receive lower appraisal values than the contract price more often than white applicants.’”

To hear CEO Jeremy Sicklick (pictured) tell it, it’s pretty straightforward stuff: “Our software is focused on accurately valuing residential real estate in the $36 trillion real estate market,” he said during a telephone interview. “Our software is focused on bringing together as much data around a property with comparables to local markets to get to an accurate property value with what we call an automated valuation model, or AVM for short. This AVM technology can be used in lieu of an appraisal or to support an appraisal or basically to quality control or check an appraisal as well.”

The proprietary software is used by the company’s clients, including major lenders, major single-family rental investors, buyers and others, Sicklick said. “So we provide the software and the underlying content out to our customers,” he added. 

Interest has grown in providing bias-free appraisals, an effort jump-started with the September publication of a Freddie Mac report titled “Racial and Ethnic Valuation Gaps in Home Purchase Appraisals.” The study laid bare severe bias in appraisal data, with appraised values coming in below the contract prices for which properties were sold.  

While the findings of the Freddie Mac report might be seen only in the abstract, real-world examples emerge virtually every day illustrating the scourge of undervaluation in traditional (non-AI) analysis. An extreme example emerged in Marin City, Calif.

In its Dec. 8 edition, Black Enterprise magazine highlighted the Black couple in California now suing a real estate appraiser and her employer after their home’s value was underestimated by $500,000. The couple had spent more than $400,000 on renovations – adding a floor, expanding by 1,000 square feet, constructing a deck, and building a fireplace – but the resulting $995,000 appraisal came in significantly lower than previous ones, according to the report. 

To test their suspicions of having been lowballed, the couple invited a white friend to pose as their home’s owner before securing another appraisal. As part of the ruse, the couple painstakingly removed family photos and African American artwork adorning the interior of their home to remove evidence as to who lived there, as the magazine reported. Ultimately, the resulting appraisal came in at $1.48 million. 

“These articles now, unfortunately, come out every other day,” Sicklick said. “They’re coming out all the time. These stories are troubling and alarming.” 

Beyond any dark motives that may lie in the heart of an appraiser, a main antagonist in such narratives is so-called “comp selection.” When evaluating a property, appraisers must select comparable properties in adherence to myriad rules (time of sale, distance from the target property, and the like) upon which to base their valuation figure. In its own damning report, Fannie Mae found that the three most frequent issues in traditional appraisals – accounting for 62% of overall appraisal defects – all relate to the selection of comparable properties.
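To make the idea of rule-based comp selection concrete, here is a minimal sketch of how such filters work in principle. The field names, thresholds, and rules are purely illustrative assumptions for this example – they are not HouseCanary’s actual criteria or software.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Sale:
    address: str
    sale_price: int
    sale_date: date
    distance_miles: float  # distance from the subject property

def select_comps(sales, as_of, max_age_days=180, max_distance=1.0):
    """Keep only sales that are recent enough and close enough to the
    subject property -- two of the typical comp-selection rules."""
    return [
        s for s in sales
        if (as_of - s.sale_date).days <= max_age_days
        and s.distance_miles <= max_distance
    ]

sales = [
    Sale("12 Oak St", 900_000, date(2021, 10, 1), 0.4),
    Sale("98 Elm St", 850_000, date(2020, 1, 15), 0.3),     # too old
    Sale("7 Pine Ave", 1_100_000, date(2021, 11, 5), 2.5),  # too far
]
comps = select_comps(sales, as_of=date(2021, 12, 1))
print([s.address for s in comps])  # -> ['12 Oak St']
```

The point of automating such rules is that every candidate sale is judged against the same objective thresholds, leaving no room for discretionary inclusion or exclusion of comparables.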

Given such system flaws and personal bias, HouseCanary clients have applauded the company’s artificial intelligence offering, the CEO said. “When you can use innovation and technology to not only get to something that is more accurate but it’s cheaper, it’s faster and doesn’t have racial bias built in, you get a lot of people moving forward with more innovation.”

Brandon Lwowski (pictured immediately above), the senior research manager at HouseCanary, chimed in with more AI attributes: “Our AVM models do not take in or receive any racial information,” he said. “We don’t use crime data or school data, and we don’t use income level – none of these protected attributes that can eventually lead someone to assume an ethnicity. All that information is removed during training, and none of it is used as inputs into our model. That’s kind of step one to reduce and eliminate the amount of bias that’s present in home valuations. Our models don’t go into a home, look around and see a family. Our models use big data to estimate the value of a property.” 
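The practice Lwowski describes – stripping protected attributes and their proxies from training data – can be sketched in a few lines. The feature names below are hypothetical examples, not HouseCanary’s actual inputs.

```python
# Features excluded before training: protected attributes and common
# proxies that could let a model infer ethnicity (illustrative only).
EXCLUDED_FEATURES = {
    "race", "ethnicity",                              # protected
    "crime_rate", "school_rating", "median_income",   # potential proxies
}

def strip_protected(record: dict) -> dict:
    """Return a training record with excluded features removed."""
    return {k: v for k, v in record.items() if k not in EXCLUDED_FEATURES}

raw = {
    "square_feet": 2400,
    "bedrooms": 4,
    "year_built": 1987,
    "median_income": 85_000,   # dropped before training
    "school_rating": 9,        # dropped before training
}
print(strip_protected(raw))
# -> {'square_feet': 2400, 'bedrooms': 4, 'year_built': 1987}
```

Removing inputs in this way is a necessary first step, though – as fairness researchers generally note – it does not by itself guarantee bias-free outputs, which is why HouseCanary also ran its statistical study against the Freddie Mac findings.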

Blame HAL 9000, the fictional artificial intelligence that serves as the main antagonist in the movie “2001: A Space Odyssey,” for any lingering misgivings to which some of us stubbornly adhere. “It’s a brave new world,” some of us will continue muttering at the thought of real-life AI. But maybe, just maybe, there are some good AI characters out there as well.