
Zero One: Can a Machine Acquire Human Intuition?

Zillow believes AI can replicate the human eye and sense the value of homes just by looking at pictures.

What is the fair market value of your house? That is the question Zillow has been trying to answer for more than a decade. Now the company is on the verge of a breakthrough in the way it estimates home values in its online tool, called Zestimate.

Nearly three years in the making, a deep learning feature that analyzes hundreds of millions of images of homes will be infused into Zestimate and rolled out later this year. The idea is to bring human intuition into the equation of home value estimation.

“An untrained human eye can look at a lot of those photos and be like, ‘Wow, that’s a super-nice home,’ and yet we undervalued it,” says Stan Humphries, chief analytics officer and creator of Zestimate. “If I’d only seen that picture, I might have actually nudged that estimate up a little bit. It was that very simple intuition that took us down a path of deep learning.”

The human act of window shopping and sensing the value of things is about to be done by a machine. This breakthrough raises the question: Can a human appraiser conducting a walk-through and looking at the condition of a home be replaced by deep learning? It doesn’t sound far-fetched anymore.

For Zillow, deep learning is the future of the company, mainly because the current way of estimating home value has hit a wall.

Since Zillow’s start in 2006, Humphries has been optimizing Zestimate. By accessing both homeowner-provided data and publicly available data from assessor files and listing databases, making sweeping improvements to algorithms, and tapping into the power of cloud computing, he was able to drive down Zestimate’s national median error rate from about 14 percent on 43 million homes to 4.3 percent on 110 million homes.

(The median error rate means that half of the estimates nationwide were within 4.3 percent of the final selling price, and half were off by more than 4.3 percent.)
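The median error rate described above is straightforward to compute. A minimal sketch, using made-up estimates and sale prices for illustration:

```python
# Sketch of how a national median error rate like Zestimate's could be
# computed. The estimates and sale prices below are made-up examples.
estimates = [310_000, 505_000, 198_000, 745_000, 120_000]
sold_for  = [300_000, 520_000, 210_000, 740_000, 130_000]

# Absolute percentage error of each estimate relative to the sale price.
errors = sorted(abs(e - s) / s for e, s in zip(estimates, sold_for))

# Median: half the estimates land within this error, half outside it.
n = len(errors)
median_error = errors[n // 2] if n % 2 else (errors[n // 2 - 1] + errors[n // 2]) / 2
print(f"{median_error:.1%}")  # → 3.3%
```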

But it hasn’t been enough, and Humphries, who had optimized beyond even his own expectations, didn’t know where the next round of gains was going to come from.

Zillow is clearly on a quest to make Zestimate more accurate. This summer, Zillow fended off a federal lawsuit brought by homeowners claiming that Zestimate undervalued their homes by, in some cases, hundreds of thousands of dollars. Around this time, Zillow launched a competition for data scientists and engineers to improve Zestimate, with a $1 million prize going to the person or team with the best solution.

One of the main reasons for undervaluations and overvaluations, says Humphries, is the lack of a human eye. For instance, two homes might look identical from a data standpoint of location, square footage, and number of bedrooms, but differ in how the kitchen looks or what real estate agents call “curb appeal.” In other words, the je ne sais quoi of a home.

And so Humphries began the enormous task of integrating deep learning, not merely programming, into Zestimate, in order to awaken the human eye. “How do we teach a computer to add that same sense of curb appeal and that this photo of the kitchen is nicer than that other photo of the kitchen?” Humphries says.

On the upside, as a digital native company, Zillow aggressively funds innovation without fear of failure. Humphries didn’t even have to make a business case for researching and developing deep learning. “We’re set up institutionally to allow this to happen,” he says.

The research path went from parsing photos to pattern recognition with imagery to deep learning to convolutional neural networks, Humphries says. The goal is to ask the computer to discern between two photos that have a difference in valuation. A major thrust is defining pixel structures and patterns that correlate to the differences.

“That in itself is no mean feat,” Humphries says, adding, “That’s the merging of art and science that really gets you the gains.”
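One way to frame “this kitchen photo is nicer than that other photo of the kitchen” is pairwise ranking: learn a scoring function so the photo from the higher-valued home scores higher. The sketch below is a simplified illustration of that idea, not Zillow’s method; the feature vectors are made up and stand in for what a convolutional network would extract from real photos.

```python
# Minimal pairwise-ranking sketch. A real system would use CNN
# activations as features; here each "photo" is a toy feature vector,
# and the model is a linear scorer trained with a hinge-style step.

def score(weights, features):
    return sum(w * f for w, f in zip(weights, features))

def ranking_update(weights, better, worse, lr=0.1, margin=1.0):
    # If the nicer photo doesn't outscore the other by the margin,
    # nudge the weights toward its features (perceptron-style update).
    if score(weights, better) - score(weights, worse) < margin:
        weights = [w + lr * (b - l) for w, b, l in zip(weights, better, worse)]
    return weights

# Toy pairs: (features of higher-valued home's photo, the other photo).
pairs = [([1.0, 0.2], [0.3, 0.9]), ([0.8, 0.1], [0.2, 0.7])]
weights = [0.0, 0.0]
for _ in range(20):
    for better, worse in pairs:
        weights = ranking_update(weights, better, worse)
# After training, the nicer photo in each pair scores higher.
```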

Then there are the millions of images that need to be labeled correctly so that they can be analyzed. The machine needs to be able to compare, say, two kitchen photos rather than a photo of a kitchen and a photo of the front of the house.

“No one gives us photos with such fantastic metadata attached to it,” Humphries says. “Your first problem is trying to figure out what you’re looking at and putting that into a discrete set of categories that can be compared across homes.”
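That categorization step can be pictured as routing each photo through a classifier and then only comparing like with like. In the hypothetical sketch below, `classify()` stands in for a trained image model; the filenames and labels are invented for illustration:

```python
# Hypothetical sketch: once each listing photo has been classified into
# a room category, comparisons can be restricted to like-for-like pairs.
from collections import defaultdict

def classify(photo):
    # Placeholder: a real system would run a convolutional neural
    # network here; we fake it with a lookup table for illustration.
    fake_labels = {"img1.jpg": "kitchen", "img2.jpg": "exterior",
                   "img3.jpg": "kitchen", "img4.jpg": "bathroom"}
    return fake_labels[photo]

def group_by_category(photos):
    groups = defaultdict(list)
    for photo in photos:
        groups[classify(photo)].append(photo)
    return dict(groups)

grouped = group_by_category(["img1.jpg", "img2.jpg", "img3.jpg", "img4.jpg"])
# Kitchen photos now compare against kitchens, not exterior shots.
```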

The engineers have their work cut out for them, too. Harnessing and processing so much image data is a computational challenge, to say the least. It’s a costly undertaking, since convolutional neural networks don’t run efficiently on standard central processing units (CPUs), instead requiring expensive graphics processing units (GPUs).

What will all of this effort yield? How low can deep learning take the currently 4.3 percent national median error rate? Humphries doesn’t know, but he thinks the very best achievable outcome is 1.5 percent. A zero error rate is impossible.

“The price of a home is, of course, a random variable,” Humphries says. “If you were to sell that home to a different buyer, a hundred different times, you’d get a hundred different price points.”

That’s not to say machine learning can’t disrupt the real estate speculation market or replace appraisers. It’s already starting to happen. Fannie Mae and Freddie Mac are piloting computer evaluations for home equity lines of credit, Humphries says. For some homes on the market, they’re using automated computer evaluation in lieu of a human appraiser or at least to do most of the appraising.

However, Humphries doesn’t expect a human appraiser apocalypse in the foreseeable future. Human appraisers are deeply embedded in the mortgage process, he says, and banks want someone to put a stamp of approval on a home’s value before issuing a loan.

What about the unforeseeable future? That’s a different matter.

“We’re now pushing into people’s brains, figuring out how they make a decision and then trying to teach the computers to do that same thing,” Humphries says. “Once you’re inside someone’s head, that’s a whole new frontier. It’s unbounded.”

Tom Kaneshige writes the Zero One blog covering digital transformation, AI, marketing tech and the Internet of Things for line-of-business executives. He is based in Silicon Valley. You can reach him at [email protected].
