Penguin Data Update Yesterday

There appears to have been a Penguin data update either yesterday or early today.

What makes me so sure?

Well, I set up a little Penguin trap. I built a one-page site, tuned for a nonsense term, and built about a dozen links to that page with the nonsense term as anchor text.

Then, I waited until it ranked #1 for that term…that took just a couple of days. The pages that linked TO that page all rank for that term, as well.

So, once my nonsense site ranked #1 for that term, I disavowed all links.
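(In case you haven’t used the disavow tool before: the file you upload is just plain text, one URL or domain per line, with # lines as comments. The domains below are made up, but the format is the real one.)

```
# Links built to the nonsense-term test page (example domains only)
domain:example-link-site-1.com
domain:example-link-site-2.net
http://example-link-site-3.org/nonsense-term-post.html
```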

And waited. Till today.

BOOM. After not moving a single spot for the past month, it suddenly dropped 9 places.

Say Goodbye to Not Provided

Google Webmaster Tools’ newly announced Search Analytics brings with it a rescue from the dreaded [not provided] debacle of the past 3 1/2 years.

Google’s talking about “more precise data” in the new Search Analytics report, a reference to the massive data issues in the old Top Queries and Top Pages reports in Webmaster Tools, where the numbers were often totally hallucinogenic: tens of thousands of clicks at a time attributed to the wrong landing pages, among many other errors.

While it’s really nice that these data issues have apparently been fixed, what’s REALLY going to be a help to webmasters everywhere is that you once again have access to all of the query data on a per-landing-page basis.

I took a quick look at the data for one of my landing pages, and while the data does seem to include the [not provided] searches, the numbers still don’t quite add up.

As an example:

[Screenshot: Search Analytics report for one of my landing pages, showing the total clicks figure]
They’ve provided a very helpful Download button at the bottom, which lets you dump the data into an Excel spreadsheet. This is great, but it’s still not without its problems. The total clicks in the downloaded spreadsheet for the data you’re looking at in the screenshot above? 5,653. Oops. That’s off by 35% from the total the report itself shows.
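If you want to run the same sanity check yourself, something like this will do it. It assumes you’ve saved the download as queries.csv with a “Clicks” column, and the report total is a placeholder; adjust all of those to match your own export.

```python
# Sum the clicks in the Search Analytics download and compare against the
# total clicks figure shown in the report UI.
# Assumptions: the export is saved as "queries.csv" and has a "Clicks"
# column; adjust the filename and column name to match your own download.
import pandas as pd

ui_total = 10000  # replace with the total clicks figure shown in the report

export = pd.read_csv("queries.csv")
export_total = int(export["Clicks"].sum())

diff = ui_total - export_total
print(f"Report total: {ui_total}")
print(f"Export total: {export_total}")
print(f"Off by:       {diff} ({diff / ui_total:.0%})")
```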

But, it’s still officially in beta (according to the page title), so I expect they’ll work some of these bugs out.

The really good news, however, is that you can get FAR more query data now than ever before. And I’m happy about that.

UPDATE: for a much more detailed analysis of this, see Dan Kern’s excellent post here.

Yext Claim Listing Bug

In trying to help a client of mine fix some nutty listing info on Citysearch, I was instructed by Citysearch support people to click the “Unlock listing” link on the page.

OK, I can do that…

But that takes you to Yext, which is apparently attempting to manage this process. I say “attempting,” because here’s what you get:

[Screenshot: Yext’s claim-listing page, with its “Claim It” button]
So what’s the problem?

Well…clicking on Claim It does nothing. As a nerdy developer type, I looked at the source behind that button. Nope: no link, and no JavaScript onclick().
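Here’s roughly the same check done programmatically, if you’re curious. The URL and the “claim” text match are placeholders rather than Yext’s actual markup, and this only catches inline href/onclick wiring; handlers attached from an external script wouldn’t show up this way.

```python
# Fetch the claim page and see whether anything resembling the "Claim It"
# button has an href or an inline onclick attached to it.
# The URL below is a placeholder; point it at the page you're debugging.
import requests
from bs4 import BeautifulSoup

page = requests.get("https://example.com/claim-listing", timeout=10)
soup = BeautifulSoup(page.text, "html.parser")

for el in soup.find_all(["a", "button"]):
    if "claim" in el.get_text(strip=True).lower():
        print(el.name, "| href:", el.get("href"), "| onclick:", el.get("onclick"))
```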

I guess there’s really no need for Yext to have a Quality Assurance department test something like this, when they can get their users to test it for free.


UPDATE: Yext fixed that bug within a couple of hours of my reporting it (great job, guys!). However, I’m disappointed to have to report that business owners no longer have any way to add or even fix a listing on Citysearch for free; you’re forced into one of the pay-per-month packages from Yext.

The Problem with Google’s Monopoly

[Screenshot: a business’s Google+ page on my phone, with no option to leave a review]

If Google+ reviews are so important, how come it’s so difficult to leave one on a mobile device?

I’m not one of those who think Google is inherently evil, and that everything they do is a devious attempt to trick us into spending more on AdWords. OK, some of what they do is…but they’re a for-profit company, and need to do their best to make a buck for their shareholders…and that’s OK.

The problem with having a virtual monopoly on any given space is that there’s no real incentive to be better than the competition. If you release a version of your software and it’s got problems, customers won’t leave you in droves because, well, you’re really the only game in town.

So innovation tends to be driven by the desire to produce something new and cool, rather than making the basic product work. Let’s take a look at Google Maps search, for example.

I woke up this morning in sunny Manzanita, Oregon. Hungry…where to have breakfast? I do a search for “breakfast restaurants” on my iPhone, and get a nice selection of restaurants in Google’s little carousel. For Newport. That’s where I was yesterday. 2 hours away.

So I scroll way down to the bottom of the page, and find a little link that says “use precise location”. OK, I click that. It finds me in Manzanita, all right, but it doesn’t change the search results. I have to search AGAIN…OK, now I’ve got a list of restaurants near me.

Luckily there isn’t a restaurant named “Breakfast Restaurant”, or else Maps would return just that single business, with no option for the list of all restaurants :-/

My son Ben and I toddle off down the road and have a marvelous breakfast (pan-fried razor clams, eggs, toast…and great coffee!) at The Big Wave. I want to write a glowing review of this lovely little cafe, so I search for it on my iPhone, find its Google+ page, and see all the existing reviews. But there’s no link anywhere for me to leave one of my own (and yes, I’m logged in to Google).

This is all a pretty basic, ordinary, common use case for Google local search. Probably one of the most common use cases, in fact. And yet there are FOUR fairly important usability bugs in that process.

I’m IN the search business, and have a very technical software background. Google, if I’m struggling to figure your interface out, it’s not me, it’s YOU.

So Google, STOP messing with cutesy little changes to the interface. No more building what you think is clever and cool UNLESS you do proper usability testing and get the basic, common use cases to actually work.

And yes, I realize that you might not have the best pool of people in-house to test Maps stuff, when some of your people never leave your campus :-).

MozLocal Update

I’ve been a huge fan of the MozLocal tools for submitting and managing location information for my clients. David and team have just launched a major update to the tools, one I had the privilege of beta testing (if only Google’s released tools had anywhere NEAR the quality of Moz’s beta products! #donerantingfornow). Highlights of the update:


  • Ability to enter a single listing’s information through forms, instead of creating and uploading a CSV.
  • Tools to detect and resolve duplicate listings, listings for closed locations, etc.
  • Better integration with Factual, via Factual’s Trusted Data Contributor program.

There’s a price increase to $84 per location, starting Oct. 1st, but that’s still way cheap considering the time it saves vs. trying to submit and manage these listings manually. I’d guess I spent 3 or 4 hours per client location in the past doing the submissions and verifications by hand.
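To put the math in perspective: even if you value your time at a fairly modest $50/hour (your rate will obviously vary), 3 hours of manual submission work comes to $150 per location, well above the $84 fee.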
