A Brief History of the Local Review Space

Part of what we do at allLocal is help national and regional businesses understand how their brand is viewed at the local level. Based on this work, we have developed a rich set of data around the local review space, and we thought it was time to shape that data into a bit of a retrospective. The earliest local review we have analyzed for our customers is from May 2002 and comes from Citysearch. In the almost 9 years since that review was written, the local review space has grown…but it has not really changed. The only real innovation in the space is the community / vanity aspect that Yelp brought to the table in 2004. While the number of reviews and their pervasiveness in the local search experience have grown, there has been no dramatic shift in the space. As for the future, I think our best bet lies with the social networks and their ability to passively collect this information as part of a conversation, as opposed to the explicit ‘user writes a review’ model we have today. After all, local reviews are valuable precisely because they take information that typically does not exist in digital form and digitize it…making it crunchable by an algorithm. Who says this digitization has to be done actively and can’t just be something that happens as a user lives their digital life? Enough prognosticating though, let’s get down to the data:

About The Data

  • ~2,200 locations distributed across 4 general verticals: 45% travel, 40% retail, 13% services and 2% restaurants
  • Analyzed the number of reviews written at Yelp, Google, Yahoo, Citysearch and Yellowpages.com. We specifically focused on review sites that aim for broad appeal, which is why we left sites like TripAdvisor out of the mix. We will evaluate them in the future when we take a more vertical-specific look at reviews
  • For each engine we looked at the number of reviews written for any of the 2,200 locations on each day between January 1, 2005 and January 30, 2011. The numbers tracked here are the 14-day moving average of the total number of reviews we saw written each day, divided by 2,200 to get daily reviews per location, and then multiplied by 100 to get a more readable number. Basically, we took the 14-day moving average of this: New Reviews / 2,200 * 100 (a quick sketch of the computation follows this list)
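For those who like to see the math as code, here is a minimal sketch of that computation in pandas. The names and data are illustrative placeholders, not our actual pipeline:

```python
import pandas as pd

# Placeholder series: total new reviews counted across all 2,200 locations per day
dates = pd.date_range("2005-01-01", "2011-01-30", freq="D")
daily_new_reviews = pd.Series(1, index=dates)  # illustrative data only

N_LOCATIONS = 2200

# New Reviews / 2,200 * 100, then smoothed with a 14-day moving average
metric = (daily_new_reviews / N_LOCATIONS * 100).rolling(window=14).mean()
```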



The 5 Year Retrospective

reviews-5-years
Some observations based on this:

  • The number of new reviews created each day across the 5 engines as a whole has grown by ~10x over the last 6 years
  • Yahoo! had the early lead in the space but did not capitalize on it (I was able to copy and paste that line from other articles in which I have mentioned Yahoo!…)
  • Yelp had a slow burn in the beginning and then really broke out in the latter half of 2008



The 2 Year Retrospective

reviews-2-years

When we zoom into a 2 year range, some other interesting observations emerge:

  • The trend is Yelp’s friend
  • Google permanently passed Yahoo! based on new local review growth in early 2010
  • Citysearch and Yellowpages continue to struggle in getting user participation on their sites (i.e., getting people to write reviews)
  • The pop in Google reviews in the middle of November 2010 is interesting…



The 8 Month Retrospective

reviews-8-month


Looking at the last 8 months gives us a few more things to consider:

  • In a relative sense, Citysearch had a good holiday season and generated a bunch of new reviews from their users…though they are still far behind in an absolute sense
  • Yahoo! had two peaks in review activity in late 2010…and based on a quick review, the bursts seem to be very methodical and almost robotic in nature. Activity like this stands out when you are dealing with what is still a relatively small data set at 2,200 locations
  • That Google pop is worth discussing further…so let’s add some commentary to the graph

reviews-8-month-marked

Ahh yes, Google Hotpot went live on November 15th. Looks like they got a nice pop from the launch but then quickly settled back into their (flat) trend line. Ouch…but to be fair, this analysis is not restaurant centric and Hotpot, while available across all types of businesses, is certainly angled towards the restaurant industry. But then the same could be said for Yelp. And speaking of Yelp, their trend line either got a nice boost from the holidays or from the Hotpot launch reminding everyone to go write some more reviews on Yelp.

This trend needs to be watched, as the success of Hotpot is critical to Google. Their review aggregation approach has crumbled under the weight of Yelp, and now TripAdvisor, not always playing nice with Google. However, even if they do play nice, a review scraped off some 3rd party site will never provide the insights that a review written directly on Google will. All the meta-data that comes along with a direct review (e.g., the history of the person who wrote it) allows Google to place much more (or less) trust in the authenticity of the review. For scraped reviews, Google either needs to rely on the 3rd party sites to police their reviews effectively or has to make that call itself based on much less data. We continue to see the importance of review text rise in Google Places ranking, but access to fresh and trusted reviews is needed to make that model work. Without Hotpot, Google may lose the local digitization race, which, in my opinion, will ultimately decide who wins the local search race.

Thanks for taking the time to review some local reviews with us. There is lots more to look at in this space, and our next post about reviews will dive into either a vertical-specific or a geography-specific look at the review space, as both provide some interesting data.

Google Local Search Stats for Black Friday

As promised in our post about the impacts of the October 27th Google update, we have put together a quick look at the traffic trends we saw for retail outlets on Black Friday and Cyber Monday.

About The Data

  • Sample (hundreds) of retail chain locations across the United States (other verticals responded differently and this analysis focuses only on retail stores)
  • Click and ‘click action’ data from Google Places for November 12th through December 12th
  • All locations have clean, complete and verified listings in Google Places

Our first step was to take a look at the number of clicks generated per day across the sample locations.  We took this data and indexed the performance to the click totals on November 12th:

Click Activity Around Black Friday


A few things stand out on this graph:

  • Not all that surprisingly, Black Friday was the best day in terms of the number of clicks generated
  • Local traffic popped on Cyber Monday as well
  • Big slowdown going into the Thanksgiving holiday
  • Black Friday is just a launching pad for the season with local traffic accelerating strongly starting in early December
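As a quick aside, the indexing behind this graph is simple: each day’s click total divided by the November 12th total. A minimal sketch with placeholder data:

```python
import pandas as pd

# Placeholder daily click totals across the sample locations
dates = pd.date_range("2010-11-12", "2010-12-12", freq="D")
clicks = pd.Series(100, index=dates)  # illustrative data only

# Index each day to the November 12th baseline (baseline day = 100)
indexed = clicks / clicks.loc["2010-11-12"] * 100
```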

Now we wanted to understand a bit more about the Black Friday burst versus the Cyber Monday burst – specifically, whether Black Friday is truly a ‘physical’ and in-store event while Cyber Monday is an online event. To do this, we simply looked at the click actions (as reported by Google) for clicks to the location’s website versus clicks for directions to the location. We used this data as a proxy for what people intended to do once they found the local listing:

Going to the Website or to the Store?


The data showed a material difference in click behavior, with a full 10% of user clicks shifting from directions to website on Cyber Monday. While this data is not perfect (for example, some people going to the website may have just used the on-site functionality to get directions), it does provide a reasonable proxy for the behavior of shoppers on these two days.
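For clarity, the metric behind the graph is just each click action’s share of the day’s total click actions. A quick sketch with made-up numbers:

```python
# Made-up click actions for a single day (not the real data)
actions = {"website": 600, "directions": 400}

total = sum(actions.values())
shares = {name: count / total * 100 for name, count in actions.items()}
print(shares)  # {'website': 60.0, 'directions': 40.0}
```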

In early 2011 we will be posting an analysis of the entire 2010 holiday shopping season.  Follow us on Twitter or Facebook to get the latest updates as they become available.  Thanks!

Google Local Updates: Performance Impact Analysis

Right around October 27th Google rolled out an update to both how they rank their local listings and how they integrate those local listings into the main search result pages.  While the 3-pack and 7-pack still exist, this layout is starting to become the norm for local listings on the main Google results page:

Example Google Results Page with Local Listings


To help understand what these changes mean to local businesses listed on Google, we have put together an analytical summary of what we saw during and after this update.

About The Data

  • Sample of retail chain locations across the United States (other verticals responded differently to the change and these numbers apply only to retail stores)
  • Click and Impression data for October 18th through November 14th, as reported in Google Places
  • Removed outliers based on total number of clicks and/or impressions (one possible filter is sketched after this list)
  • All the locations have clean, complete and verified listings in Google Places
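We won’t detail the exact outlier rule here, but for readers who want a concrete example, one common approach is an interquartile-range filter. The sketch below is purely illustrative (placeholder data, not necessarily our exact rule):

```python
import pandas as pd

# Placeholder per-location click totals (purely illustrative)
clicks = pd.Series([120, 95, 110, 4000, 105, 98, 2])

# Interquartile-range rule: drop points far outside the middle 50%
q1, q3 = clicks.quantile(0.25), clicks.quantile(0.75)
iqr = q3 - q1
kept = clicks[clicks.between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)]
```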

Summary of Results

  • Significant CTR improvements
  • Some of the CTR improvement is driven by increased impressions for branded traffic
  • The number of reviews for a location seems to have impacted overall traffic growth for the location
  • The average review rating does not seem to be connected to overall traffic

Details and Graphs

The impact of the change is clear when looking at the clicks and impressions:

Clicks and Impressions (Indexed to 10/27)


Clicks grew over 50% in the ~2 weeks following the update while impressions grew only ~15%, implying that a majority of the click benefit came from increased CTR (roughly 1.50 / 1.15 ≈ 1.30, or a ~30% relative CTR lift). This leads to the question: did the query stream change? To help answer that, we took a look at the growth in clicks and the growth in the percentage of branded traffic:

Clicks and Percentage of Branded Traffic (Indexed to 10/27)


While the update did drive a noticeable and material change in the percentage of branded traffic (up over 10%), that change alone did not account for a majority of the increase in overall traffic to the locations.

So, good clean listings seem to have benefited from the update…but how do reviews play into the performance equation? To help answer this we took a look at the click growth for locations that had received 3 or more reviews in the last 6 months versus those that had received 2 or fewer:

Clicks by Number of Reviews (Indexed to 10/27)


The locations with 3 or more reviews performed better after the update. Now, this does not mean more reviews = better ranking. It could simply be that review count is a good proxy for the overall number of citations a location has. That said, this data does provide additional evidence that the more times Google can find matching information about your business location on the web, the better it is for your ranking.
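The bucketing itself is straightforward. Here is a minimal sketch with made-up numbers (the column names are ours for illustration, not a Google Places export format):

```python
import pandas as pd

# Made-up per-location data: recent review counts and post-update click growth
locations = pd.DataFrame({
    "reviews_6mo": [0, 1, 4, 7, 2, 5],
    "click_growth": [1.25, 1.30, 1.62, 1.71, 1.28, 1.55],  # indexed to 10/27
})

# Average click growth: "3 or more reviews" bucket versus "2 or fewer"
buckets = locations.groupby(locations["reviews_6mo"] >= 3)["click_growth"].mean()
print(buckets)  # False = 2 or fewer reviews, True = 3 or more
```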

With the recent discussions around how rating could/should/does impact ranking, we also took a look at the click growth based on the average rating for the sample locations. We split the locations into two buckets, one for those with a rating over 3.5 and the other for those with a rating of 3.5 or lower. Each bucket had an average right around 5.5 reviews per location (to help control for the impact that the total number of reviews could have on the analysis). Here is what we saw:

Click Performance by Average Rating (Indexed to 10/27)


Based on this data, the poorly rated locations actually saw better click growth than the strongly rated locations. While not definitive, it certainly points to the average rating being ignored by Google at this point (but not by your potential customers!). And this actually makes a lot of sense given the issues we have seen with Google importing 3rd party reviews correctly (which we will take a look at another time) and the lack of credibility of the ratings/reviews on many 3rd party sites. However, as Google (tries to) gather its own rating data via Hotpot, they will be able to build a data set that they can trust, and can then start to use it more actively in the ranking process.

If you are still reading…thanks and we will be posting an update on the holiday shopping traffic in the near future.

Mt. Everest Makes Moves to Address Online Complaints

Image via Engadget


As we discussed a few weeks ago, Mt. Everest had some online review issues. One of the most common complaints was the lack of cellphone service at the top. Well, the mountain has listened and that problem has been solved. Ncell has completed the installation of a 3G-capable cell station at 17K feet that will apparently provide service all the way up to the summit.

In other news, there is no news on smartphone-friendly mittens that are rated for Everest’s summit temperatures.

New Google Results Page for Local Searches

Looks like Google is using their new results page layout in much wider (or complete) distribution now.  Greg Sterling has a good overview at Search Engine Land.

Mt. Everest Getting Not So Rave Reviews on Google Maps

It appears that the Online Reputation Management Agency for Mt. Everest has been sleeping on the job. The Highest Mountain in the World’s Place Page on Google Maps is receiving a mediocre 3-star rating:

Mt Everest Places Page

Closer inspection of the reviews reveals some very unhappy customers.

Mt Everest Reviews

Obviously, these are tongue-in-cheek reviews à la Amazon’s Three Wolf Moon T-Shirt meme. However, they clearly display yet another reason to closely monitor online reviews for your business. In the Three Wolf Moon case, the fake humorous reviews hit the front page of every social media site, including Digg and Reddit. The huge exposure catapulted sales 2,300%, making it the top selling item in Amazon’s Clothing Store.

While review content going viral can be a boon to business, both of these instances shine a light on the fact that a small group of reviewers can hijack reviews and drastically sway the outward appearance of your product or service. Who’s to say this can’t be done by a competitor or former employee to sway public perception? This should be a concern for any business, from a national chain to a small local shop. For a small local restaurant, poor local reviews can be crippling. Everest’s tourism industry won’t likely feel the effect of a few fraudulent reviews:

More Mt Everest Reviews

But consider if Everest was a new, up-and-coming restaurant in Chicago. How many of the people behind the 165,000 searches for “chicago restaurants” last month would bother looking twice at a location with only 3 stars?

– Gavin

The Google Brand Will Never be Social

All the talk about how Google ‘must’ buy Twitter just does not make sense to me, at least not for the reasons being discussed. Twitter is a fine business and may mint money some day, and that alone may be a reason to buy it. However, Twitter will not solve Google’s so-called ‘social problem’.

(1) Google’s ‘social problem’ was created by Facebook, but Twitter does not help Google compete with Facebook. The social graph created by Facebook is completely different from the social graph created by Twitter. The reasons people use these two applications are completely different, and the types and amount of information shared are not comparable

(2) Google cannot be a social brand. I entered a relationship with Google long ago with an expectation that they would provide me with search results and email. The relationship I entered into with Facebook was based on me sharing information with others. These are two very different relationships, and I don’t want Google changing the nature of our relationship, especially when they use data I shared while operating under our ‘search and email only’ relationship. This is the fundamental challenge for Google. Our relationship with them is not a social one

(3) Users do not want to mix search and social. Many of the searches we do are private in nature. We ask Google the most private and embarrassing questions we have. A social aspect to Google removes the (real or perceived) privacy users feel when they share this information with Google. Based on this, you could also argue that Facebook will never be the primary search destination for users

In my opinion, if Google wants to be successful in social they need to make a clean break with their existing services (specifically search and email). Create a new brand that uses all the powerful Google technology and brains, but is not built on an integration with existing Google tools and data, which are tied to my current Google relationship. Establish a new relationship with me and then let me decide if and how I want to bring my old Google relationship into the mix.

Retail or Online Marketing: Who ‘Owns’ Local Search?

allLocal is a dream product for a salesperson. It fills a gap in the market and it provides significant value for local businesses. I have had the pleasure of presenting the product to advertising agencies and Fortune 500 companies, most of which have had overwhelmingly positive feedback about allLocal, but inevitably the same question comes up: “whose budget should pay for this product?”

Local listings drive foot traffic to brick-and-mortar locations, but they also drive traffic online, as the business name will typically link to the website homepage. This gives the user the option of converting offline or online. The business owner should be happy to get that conversion whether it happens in a store or via the website, but this is where the line gets blurry about whose budget should be paying for a local product. The retail team gets credit for in-store sales and the online marketing team is tasked with driving acquisitions online. In theory, two teams within the same company want that conversion to take place; they just want it to take place in their own revenue channel.

In a perfect world both of those teams would be working for the greater good of the company and agree to split the low costs of a platform like allLocal because it makes sense for the customers that are trying to find their business.  The reality is that it takes just one sale from each local listing, whether it’s offline or online, to more than cover the monthly costs of allLocal.

We (Finally) Launched Our New Website

In any company, prioritization is key. Where do you spend your marketing dollars, where do you spend your development time and what comes first on each of the product roadmaps? This is especially true in a small company like ours. While we have been working hard to make local search marketing and reputation management easier for local businesses, we have for the most part ignored our website…until now! The new www.alllocal.com is live, complete with some online demos and a local search marketing learning center. Now back to work…

Google Tags Now Available Nationwide

In a post on the Google Maps blog last night, Google announced that Tags will begin rolling out nationwide. The latest list of available areas can be found here.

For the same $25 a month, your Tags are now included in mobile search results as well.

The post still used the word ‘trial’ to describe the program, so it is still possible that Tags get retired at some point. However, the nationwide expansion means the limited trial must have cleared whatever internal relevancy, performance or revenue metrics Google had in place.