Trulia tries harder, Alon Chaver speaks up

by Victor Lund on January 30, 2013

WAV Group works with brokers and MLSs who are fed up and frustrated with the data quality on third-party listing websites like Trulia, Zillow, and others. We study and measure the problem in detail, and have published countless articles, papers, and reports on the topic. Frankly, the discussion about the ills of listing syndication is getting tired and laborious. It is the same stuff regurgitated over and over again. But today, a shining light appeared in an article on RE Technology: Trulia has launched a dedicated effort to make a difference.

Today, Trulia announced that they are moving to a data management framework that enables continuous feeds of listing information. As many of you know, Alon Chaver of iHomefinder moved over to Trulia a few months ago. I have been waiting and watching to see how that would impact Trulia. Perhaps this is the first step of many. To be clear, the continuous data feed program appears to be limited to feeds from the MLS. Hence, this does not help brokers at all. Let me explain why.

Trulia, like most third-party websites, has a schema for its Authoritative Feed Source. Everyone’s schema is different and changes frequently, so I will not detail it here. But suffice it to say, Trulia gets many data feeds of the same listing – from agents, from brokers, from franchises, from MLSs, from newspapers, from virtual tour companies, from Listhub, from Point2, etc. WAV Group has a project about once a week involving the removal of duplicate feeds. It is a painful mess to unwind, but it must be done. Every feed has different business rules regarding display, so if the wrong feed trumps the right feed, competitors appear on broker listings and leads leak out to competing agents. In addition, the link back to view more information may be diverted away from the broker site to the franchise website, to a Listhub landing page, or to some other godforsaken place.
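The duplicate-feed problem above can be sketched in a few lines of code. This is a minimal illustration only: the source names, the priority order, and the record fields are my own assumptions, not Trulia's actual Authoritative Feed Source schema or any publisher's real business rules.

```python
# Illustrative sketch: when the same listing arrives from several feeds,
# keep only the copy from the most authoritative source, so the "view
# more" link points back where the broker intends.

# Hypothetical precedence order, most authoritative first (an assumption,
# not any publisher's actual ranking).
FEED_PRIORITY = ["mls", "broker", "franchise", "agent", "syndicator"]

def deduplicate(records):
    """Keep one record per listing_id, preferring the highest-priority feed.

    `records` is an iterable of dicts with at least `listing_id` and
    `source` keys; a lower index in FEED_PRIORITY wins.
    """
    best = {}  # listing_id -> (rank, record)
    for rec in records:
        try:
            rank = FEED_PRIORITY.index(rec["source"])
        except ValueError:
            rank = len(FEED_PRIORITY)  # unknown sources rank last
        key = rec["listing_id"]
        if key not in best or rank < best[key][0]:
            best[key] = (rank, rec)
    return [rec for _, rec in best.values()]

# The same listing (123) arrives from both an agent feed and the MLS;
# the MLS copy should win, preserving the broker's link-back.
feeds = [
    {"listing_id": "123", "source": "agent",  "link": "agent-site.example"},
    {"listing_id": "123", "source": "mls",    "link": "broker-site.example"},
    {"listing_id": "456", "source": "broker", "link": "broker-site.example"},
]
winners = deduplicate(feeds)
```

The real unwinding work is, of course, far messier than this, because each publisher's schema differs and the precedence rules themselves change frequently.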

My hope is that continuous data feeds will move through the publisher industry as a widely adopted strategy. This hope is grounded in improving the industry's reputation with the consumer and eliminating pain for the agent. Consumers are frustrated by bad data. Agents, brokers, and MLSs are tired of wasting time managing customer issues related to bad data. Bad data is simply and irrevocably BAD for everyone.

Well done, Trulia. I hope everyone follows your lead.


Janet Choynowski January 31, 2013 at 6:17 am

A further complication in life: many MLSes have now signed up with one or both of the two major feed distributors, and do not want to give direct feeds. This leaves publishers in a “no win” situation where we receive listings whose quality and freshness are entirely outside our control, but are left to make explanations for why new listings have not surfaced or old listings have not been removed.
Also, the data collected and passed on by the feed distributors has been seriously “dumbed down,” so we have less information to display. Add to that the practice in many areas of replacing MLS numbers with “listing numbers,” and it becomes impossible to weed out duplicate listings.


Alon Chaver February 1, 2013 at 7:22 am

Thanks for the kind words, Victor – the credit for this massive effort belongs to Trulia’s hard-working engineering staff!

This is indeed only the first step on the path to providing the best consumer experience and helping real estate professionals connect with qualified buyers and sellers.

Our data quality initiative also includes working closely with brokers to help them benefit from the data quality improvements delivered by the new program in conjunction with their MLSs – initial broker response to the program is very exciting and we’re looking forward to executing the next phases of this initiative.

Alon


