Google Webmaster Tools Releases Manual Webspam Reports


Webspam is a menace. Unfortunately, not only did it work to attract readers from the search engines in the past, it still works today. Because of that, many people have employed webspam to get more visitors to their websites. Two years ago, Google finally drew a line in the sand and started to take action. That didn't stop the webspammers, and it didn't stop the non-guru gurus who still sell courses, systems and links that get unwary webmasters into trouble.

Google continually updates its search-related algorithms, with new releases coming out as often as most of us eat breakfast. Each day, hundreds of webmasters who thought they had a thriving or expanding business find themselves crushed into the ground by Google's heavy jackboot.

The crushing blow is often called "the Google slap," and that slap might be due to a change in one or more of their algorithms, or it may be a manual penalty that Google calls "a manual action."

The reasons for the crushing blow vary widely, but they are generally grouped under the single-word heading "quality," or more correctly, "lack of quality." Webmasters are caught in a spider's web of problems including, but not limited to, poor-quality content, no useful content, scraped content, duplicate content, stolen content, hacked sites, floods of ads, poor site structure, poor technical aspects, and "unnatural" links.

Also caught in this web are webmasters who didn't do any of those bad things, whose sites have been accepted by their visitors for years, but which Google now says are low quality. These are often the hardest cases to work out.

Although Google says it always sends a message to inform webmasters about a manual action, sometimes that notification is tardy, and sometimes it is not seen. Other than receiving that message, until today, there has only been one way to discover whether the search engine giant had taken a manual action against a site.

That way, triggered by a drop in visitors, was to guess what the problem might be, do some research, make changes, and then ask Google if you had actually fixed it. Then comes the wait for a response, which, when it eventually arrives, is as often as not obfuscated and completely bewildering – but that is a story for another day.

The response could be "no manual actions were found." This means you guessed incorrectly, and the problem is likely to be in a different area that you didn't even consider.

Last night, Google released a new feature in the Webmaster Tools section of its site that should go some way towards telling webmasters whether any perceived change in traffic is a manual slap or something else. Webmaster Tools, in case you don't already know, is a special membership site maintained by Google, in which everyone who owns a website should have a presence. It is a sometimes frustrating system that provides information about your websites, and it can be as confusing as it is helpful.

Today’s story is only about the new feature.

The new Manual Webspam Reports can be found inside your Webmaster Tools account, in the revamped "Search Traffic" section, under the "Manual Actions" submenu.

[Image: Google Webmaster Tools report showing no manual actions have been taken against the website]

To recap, Google sometimes takes a manual action against a site when the site breaches certain quality standards that Google documents in its quality guidelines. A manual action is taken when the automated algorithms that determine search ranking cannot take account of some egregious quality issue that can only be discovered or adjudicated manually.

This new Manual Actions report tells the webmaster whether Google has actually taken a manual action against the site. Such a manual action would have caused the website's ranking in search results to drop, and by extension, traffic to the website to drop too. The website may also be affected by an algorithm, but there is no way to know that for sure.

According to the official Webmaster Tools blog, in a post issued overnight, there are currently two types of action that could be taken – or at least two that Google is willing to share with us. The two types are labeled "Site-wide Match" and "Partial Match."

These two variations tell you whether the quality problem is with your whole site or with one particular area, such as a forum you run, a blog, or perhaps the ecommerce part of your site. According to the explanation in the blog post, in the case of a partial match, only that part of your site has a problem you need to fix. What isn't clear is whether a problem with your forum can cause the ranking of your whole site to drop, or just the ranking of the forum pages.

Assuming you can identify something that needs to be corrected, and you do correct it, or at least try to correct it, the next step is to send a “reconsideration request.” Google has also worked to improve the reconsideration request system.

Here is what the official blog says about the improved reconsideration request:

Once you’ve corrected any violations of Google’s quality guidelines, the next step is to request reconsideration. With this new feature, you’ll find a simpler and more streamlined reconsideration request process. Now, when you visit the reconsideration request page, you’ll be able to check your site for manual actions, and then request reconsideration only if there’s a manual action applied to your site. If you do have a webspam issue to address, you can do so directly from the Manual Actions page by clicking “Request a review.”

This feature has been a long time coming, and it should reduce the number of requests Google receives, saving them time. Many webmasters, faced with a drop in visitors, file reconsideration requests without knowing whether there was a manual action against them. That must have wasted a lot of time for Google employees, and it may not stop immediately.

Sadly, the previous lack of this report has probably caused hundreds of thousands of webmasters to waste weeks, months and even years investigating and making unnecessary changes, sometimes causing irreparable damage to their websites, for the wrong reasons.

There are two important things I want to add about reconsideration requests.

First, don't make a small change and fire off a reconsideration request. From what I have seen and heard, that rarely works. If you only make a minimal effort, it won't be enough, and you will just prolong your own agony. Second, if you paid for or built hundreds of what Google calls "unnatural links," you need to have them removed so they are no longer pointing to your site, because if you don't expend any effort, then Google won't either.

I think the new Manual Action report in Webmaster Tools is a good thing, and I hope it stands the test of time and doesn’t get pulled.

Alan Gray is the Publisher and Editor-in-Chief of NewsBlaze Daily News and other online newspapers. He prefers to edit, rather than write, but sometimes an issue rears its head and makes him start pounding the keyboard. Alan has a fascination with making video and video editing, so watch out if he points his Canon 7D in your direction.