Google’s talking about “more precise data” etc. in the new Search Analytics report, referring to the massive data issues in the old Top Queries and Top Pages reports in Webmaster Tools, where the numbers were often wildly wrong, attributing tens of thousands of clicks at a time to the wrong landing pages, among many other errors.
While it’s really nice that these data issues have apparently been fixed, what’s REALLY going to be a help to webmasters everywhere is that you once again have access to all of the query data on a per-landing-page basis.
I took a quick look at the data for one of my landing pages, and while the data does seem to include the [not provided] searches, the numbers still don’t quite add up.
As an example:
They’ve provided a very helpful Download button at the bottom, which lets you dump the data into an Excel spreadsheet. This is great, but it’s still not without its problems. The total clicks in the downloaded spreadsheet for the data shown in the screenshot above? 5,653. Oops. That’s off by 35%.
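If you want to sanity-check a download yourself, summing the clicks column takes only a few lines. This is a minimal sketch: the filename and the “Clicks” column header are assumptions based on a typical Search Analytics export, so adjust them to match your file.

```python
import csv

def total_clicks(path, column="Clicks"):
    """Sum the clicks column from a downloaded Search Analytics CSV."""
    total = 0
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Some exports format numbers with thousands separators, e.g. "1,234"
            total += int(row[column].replace(",", ""))
    return total

# Compare the result against the total shown in the web UI:
# print(total_clicks("search-analytics-export.csv"))
```

If the number this prints doesn’t match what the report shows on screen, you’ve reproduced the discrepancy described above.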
But, it’s still officially in beta (according to the page title), so I expect they’ll work some of these bugs out.
The really good news, however, is that you can get FAR more query data now than ever before. And I’m happy about that.
UPDATE: for a much more detailed analysis of this, see Dan Kern’s excellent post here.
In trying to help a client of mine fix some nutty listing info on Citysearch, I was instructed by Citysearch support people to click the “Unlock listing” link on the page.
OK, I can do that…
But that takes you to Yext, which is apparently attempting to manage this process. I say “attempting”, because here’s what you get:
So what’s the problem?
I guess there’s really no need for Yext to have a Quality Assurance department test something like this, when they can get their users to test it for free.
UPDATE: Yext fixed that bug within a couple of hours of me reporting it (great job, guys!). However, I’m disappointed to have to report that business owners no longer have any way to add or even fix a listing on CitySearch for free–you’re forced into one of the pay-per-month packages from Yext.
Rand Fishkin at Moz did this White Board Friday presentation on a high-level approach for structuring how you tackle search marketing. I think it’s a great perspective on SEO, and I think describing what we do in these simple steps will be very helpful for many of my clients.
Moz published my blog post on techniques for leveraging Panda to make your product feed-based website stand out, both to Google and to end users.
I had a great discussion with Nathalie Nahai on establishing trust and making new customers over the internet.
I’m not one of those who think Google is inherently evil, and that everything they do is a devious attempt to trick us into spending more on AdWords. OK, some of what they do is…but they’re a for-profit company, and need to do their best to make a buck for their shareholders…and that’s OK.
The problem with having a virtual monopoly on any given space is that there’s no real incentive to be better than the competition. If you release a version of your software and it’s got problems, the customers won’t leave you in droves because–well, you’re really the only game in town.
So innovation tends to be driven by the desire to produce something new and cool, rather than making the basic product work. Let’s take a look at Google Maps search, for example.
I woke up this morning in sunny Manzanita, Oregon. Hungry…where to have breakfast? I do a search for “breakfast restaurants” on my iPhone, and get a nice selection of restaurants in Google’s little carousel. For Newport. That’s where I was yesterday. 2 hours away.
So I scroll way down to the bottom of the page, and find a little link that says “use precise location”. OK, I click that. It finds me in Manzanita, all right. But doesn’t change the search results. I have to search AGAIN…ok, now I’ve got a list of restaurants near me.
Luckily there isn’t a restaurant named “Breakfast Restaurant”, or else Maps would return just that single business, with no option for the list of all restaurants :-/
My son Ben and I toddle off down the road and have a marvelous breakfast (pan-fried razor clams, eggs, toast…and great coffee!) at The Big Wave, and I want to write a glowing review of this lovely little cafe, so I search for it on my iPhone, find the Google+ page for it, see all the existing reviews. But there’s no link anywhere for me to review it (and yes, I’m logged in to Google).
This is all a pretty basic, ordinary, common use case for Google local search. Probably one of the most common use cases, in fact. And yet there are FOUR fairly important usability bugs in that process.
I’m IN the search business, and have a very technical software background. Google, if I’m struggling to figure your interface out, it’s not me, it’s YOU.
So Google, STOP messing with cutesy little changes to the interface. No more just building what you think is clever and cool, UNLESS you do proper usability testing with this, and get the basic, common use cases to actually work.
And yes, I realize that you might not have the best pool of people in-house to test Maps stuff, when some of your people never leave your campus :-).
I think this makes it 7 out of 7 for my clients with Google manual penalties. Some I’ve gotten out in 1 request, a bunch took 2, and one took 7 rounds.
I’ve been a huge fan of the MozLocal tools for submitting and managing location information for my clients. David and team have just launched a major update to the tools–one I had the privilege of beta testing (if only Google’s released tools had anywhere near the quality of Moz’s beta products! #donerantingfornow).
- Ability to enter a single listing’s information through forms, instead of creating and uploading a CSV.
- Tools to detect and resolve duplicate listings, listings for closed locations, etc.
- Better integration with Factual, via Factual’s Trusted Data Contributor program.
There’s a price increase to $84 per location, starting Oct. 1st–but that’s still way cheap, considering the time it saves vs. trying to submit and manage these all manually. I’d guess that I probably spent 3 or 4 hours per client location in the past, doing the submissions and verifications manually.
And while I’ll admit I get all hot and sweaty looking at pictures of Ferraris, and have been known to utter all sorts of nasty words while working on an old car, reading from the manual to “attach part X as shown” when it’s bloody well NOT shown…shop manuals and collector guides really aren’t porn in anybody’s world.