Did @mattcutts Endorse Rap Genius Link Spam?

On TWiG (This Week in Google), Matt Cutts spoke about the importance of defunding spammers & breaking their spirits.

If you want to stop spam, the most straightforward way to do it is to deny people money, because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do is break their spirits. You want to make them frustrated and angry. There are parts of Google’s algorithms specifically designed to frustrate spammers and mystify them and make them frustrated. And some of the stuff we do gives people a hint their site is going to drop, and then a week or two later their site actually does drop, so they get a little bit more frustrated. And so hopefully, and we’ve seen this happen, people step away from the dark side and say “you know, that was so much pain and anguish and frustration, let’s just stay on the high road from now on.” Some of the stuff I like best is when people say “you know, this SEO stuff is too unpredictable, I’m just going to write some apps. I’m going to go off and do something productive for society.” And that’s great, because all that energy is channeled at something good.

What was less covered was that in the same video Matt Cutts made it sound like anything beyond information architecture, duplicate content cleanup & clean URLs was quickly approaching scamming – especially anything to do with links. So over time more and more behaviors get reclassified as black hat spam as Google gains greater control over the ecosystem.

there’s the kind of SEO that is better architecture, cleaner URLs, not duplicate content … that’s just like making sure your resume doesn’t have any typos on it. that’s just clever stuff. and then there’s the type of SEO that is sort of cheating. trying to get a lot of bad backlinks or scamming, and that’s more like lying on your resume. when you get caught sometimes there’s repercussions. and it definitely helps to personalize because now anywhere you search for plumbers there’s local results and they are not the same across the world. we’ve done a diligent job of trying to crack down on black hat spam. so we had an algorithm named Penguin that launched that kind of had a really big impact. we had a more recent launch just a few months ago. and if you go and patrol the black hat SEO forums where the guys talk about the techniques that work, now it’s more people trying to sell other people scams rather than just trading tips. a lot of the life has gone out of those forums. and even the smaller networks that they’re trying to promote “oh buy my anglo rank or whatever” we’re in the process of tackling a lot of those link networks as well. the good part is if you want to create a real site you don’t have to worry as much about these bad guys jumping ahead of you. the playing field is a lot more level now. panda was for low quality. penguin was for spam – actual cheating.

The Matt Cutts BDSM School of SEO

As part of the ongoing campaign to “break their spirits” we get increasing obfuscation, greater time delays between certain algorithmic updates, algorithmic features built explicitly with the goal of frustrating people, greater brand bias, and more outrageous selective enforcement of the guidelines.

Those who were hit by either Panda or Penguin in some cases took a year or more to recover. Far more common is no recovery — ever. How long & how much do you invest in a dying project when the recovery timeline is unknown?

You Don’t Get to Fascism Without 2-Tier Enforcement

While success in and of itself may make one a “spammer” in the biased eyes of a search engineer (especially if you are neither VC funded nor part of a large corporation), many who are considered “spammers” self-regulate in a way that makes them far more conservative than the allegedly “clean” sites.

Pretend you are Ask.com and watch yourself get slaughtered without warning.

Build a big brand & you will have advance notification & free customer support inside the GooglePlex:

In my experience with large brand penalties, (ie, LARGE global brands) Google have reached out in advance of the ban every single time. – Martin Macdonald

Launching a Viral Linkspam Sitemap Campaign

RapGenius was penalized because they were broadly, openly & publicly offering to promote bloggers who would dump a list of keyword-rich deep links into their blog posts. They were basically turning boatloads of blogs into mini-sitemaps for popular new albums.

Remember reading dozens (hundreds?) of blog posts last year about how guest posts are spam & Google should kill them? Well, these posts from RapGenius were like guest posts on steroids. The post “buyer” didn’t have to pay a single cent for the content, didn’t care at all about relevancy, AND a sitemap full of keyword-rich deep-link spam was included in EACH AND EVERY post.

Most “spammers” would never attempt such a campaign because they would view it as being far too spammy. They would have a zero percent chance of recovery as Google effectively deletes their site from the web.

And while RG is quick to distance itself from scraper sites, for almost the entirety of their history virtually none of the lyrics posted on their site were even licensed.

In the past I’ve mentioned that Google is known to time the news cycle. It comes as no surprise that on a Saturday, barely a week after the penalty, Google restored RapGenius’s rankings.

How to Gain Over 400% More Links While Allegedly Losing

While the following graph may look scary in isolation, if you know the penalty is only a week or two then there’s virtually no downside.

Since being penalized, RapGenius has gained links from over 1,000* domains

  • December 25th: 129
  • December 26th: 85
  • December 27th: 87
  • December 28th: 54
  • December 29th: 61
  • December 30th: 105
  • December 31st: 182
  • January 1st: 142
  • January 2nd: 112
  • January 3rd: 122

The above add up to 1,079, & RapGenius has only built a total of 11,930 unique linking domains in their lifetime. They grew about 10% in 10 days!
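For what it’s worth, the arithmetic checks out. A quick sanity check in Python (the daily counts are the Ahrefs figures listed above; whether you call it ~9% or ~10% depends on whether the 11,930 lifetime total already includes the new domains):

```python
# Daily new referring domains reported by Ahrefs, Dec 25th - Jan 3rd (see above)
daily_new_domains = [129, 85, 87, 54, 61, 105, 182, 142, 112, 122]

gained = sum(daily_new_domains)  # 1,079 new referring domains
lifetime_total = 11_930          # unique linking domains per Ahrefs

print(gained)                                       # 1079
print(f"{gained / lifetime_total:.1%}")             # 9.0% of the lifetime total
print(f"{gained / (lifetime_total - gained):.1%}")  # 9.9% growth over the prior base
```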

On every single day the number of new referring domains VASTLY exceeded the number of referring domains that disappeared. And many of these new referring domains are the mainstream media and tech press sites, which are both vastly over-represented in importance/authority on the link graph. They not only gained far more links than they lost, but they also gained far higher quality links that will be nearly impossible for their (less spammy) competitors to duplicate.

They not only got links, but the press coverage acted as a branded advertising campaign for RapGenius.

Here are some quotes from RapGenius on their quick recovery:

  • “we owe a big thanks to Google for being fair and transparent and allowing us back onto their results pages” <– Not the least bit true. RapGenius was not treated fairly; rather, they were given a free ride compared to the death sentence hundreds of thousands of small businesses have been handed over the past couple years.
  • “On guest posts, we appended lists of song links (often tracklists of popular new albums) that were sometimes completely unrelated to the music that was the subject of the post.” <– and yet others are afraid of writing relevant, on-topic posts due to Google’s ramped-up fearmongering campaigns
  • “we compiled a list of 100 “potentially problematic domains”” <– so their initial list of domains to inspect was less than 10% of the number of links they gained while penalized
  • “Generally Google doesn’t hold you responsible for unnatural inbound links outside of your control” <– another lie
  • “of the 286 potentially problematic URLs that we manually identified, 217 (more than 75 percent!) have already had all unnatural links purged.” <– even the “all in” removal of pages was less than 25% of the number of unique linking domains generated during the penalty period

And Google allowed the above bullshit during a period when they were sending out messages telling other people WHO DID THINGS FAR LESS EGREGIOUS that they are required to remove more links & Google won’t even look at their review requests for at least a couple weeks – A TIME PERIOD GREATER THAN THE ENTIRE TIME RAPGENIUS WAS PENALIZED FOR.

Failed reconsideration requests are now coming with this email that tells site owners they must remove more links: pic.twitter.com/tiyXtPvY32 – Marie Haynes (@Marie_Haynes), January 2, 2014

In Conclusion…

If you tell people what works and why, you are a spammer with no morals. But if you are VC funded, Matt Cutts has made it clear that you should spam the crap out of Google. Just make sure you hire a PR firm to trump up press coverage of the “unexpected” event & have a faux apology saved in advance. So long as you lie to others and spread Google’s propaganda, you are behaving in an ethical, white hat manner.

Google & @mattcutts didn’t ACTUALLY care about Rap Genius’ link scheme, they just didn’t want to miss a propaganda opportunity. – Ben Cook (@Skitzzo), January 4, 2014

Notes

* These stats are from Ahrefs. A few of these links may have been in place before the penalty and only recently crawled. However, it is also worth mentioning that all third-party link databases are limited in size & refresh rate by optimizing their capital spend, so there are likely hundreds more links which have not yet been crawled by Ahrefs. One should also note that the story is still ongoing & they keep generating more links every day. By the time the story is done spreading, they are likely to see roughly 30% growth in unique linking domains in about 6 weeks.


Gray Hat Search Engineering

Almost anyone in internet marketing who has spent a couple months in the game has seen some “shocking” case study where changing the color of a button increased sales 183% or such. In many cases such gains only happen because the original site had no focus on conversion at all.

Google, on the other hand, has billions of daily searches and is constantly testing ways to increase yield:

The company was considering adding another sponsored link to its search results, and they were going to do a 30-day A/B test to see what the resulting change would be. As it turns out, the change brought massive returns. Advertising revenues from those users who saw more ads doubled in the first 30 days.

By the end of the second month, 80 percent of the people in the cohort that was being served an extra ad had started using search engines other than Google as their primary search engine.

One of the reasons traditional media outlets struggle with the web is the perception that ads and content must be separated. When they had regional monopolies they could make large demands of advertisers – sort of like how Google may increase branded CPCs on AdWords by 500% if you add sitelinks. You not only start paying for clicks that you were getting for free, but you also pay more for the other paid clicks you were getting cheaper in the past.
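To make the economics concrete, here is a small sketch; only the 500% multiplier comes from the claim above, while the click volumes and prices are invented for illustration:

```python
# Hypothetical illustration of the branded-CPC claim above; every number
# except the 500% increase is invented for the sake of the example.
brand_clicks_paid = 1_000  # branded clicks per month you already paid for
brand_clicks_free = 2_000  # branded clicks you used to get free from organic
old_cpc = 0.10             # old branded CPC, in dollars
new_cpc = old_cpc * 6      # a 500% increase means 6x the old price

old_spend = brand_clicks_paid * old_cpc
# once sitelinks & result displacement push organic below the fold, the
# formerly free clicks have to be bought too - at the higher price:
new_spend = (brand_clicks_paid + brand_clicks_free) * new_cpc

print(f"old monthly spend: ${old_spend:,.2f}")  # $100.00
print(f"new monthly spend: ${new_spend:,.2f}")  # $1,800.00
```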

That’s how monopolies work – according to Eric Schmidt they are immune from market forces.

Search itself is the original “native ad.” The blend confuses many searchers as the background colors fade into white.

Google tests colors & can control the flow of traffic based not only on result displacement, but also the link colors.

It was reported last month that Google tested adding ads to the knowledge graph. The advertisement link is blue, while the ad disclosure is far to the right, out of view, & gray.

I was searching for a video game yesterday & noticed that now the entire Knowledge Graph unit itself is becoming an ad unit. Once again, gray disclosure & blue ad links.

Where Google gets paid for the link, the link is blue.

Where Google scrapes third party content & shows excerpts, the link is gray.

The primary goal of such a knowledge block is result displacement – shifting more clicks to the ads and away from the organic results.

When those blocks appear in the search results, even when Google manages to rank the Mayo Clinic highly, it’s below the fold.

What’s so bad about this practice in health?

  • Context Matters: Many issues have overlapping symptoms, where a quick glance at a few out-of-context symptoms causes a person to misdiagnose themselves. Flu-like symptoms from a few months ago turned out to be an indication of a kidney stone. That level of nuance will *never* be in the knowledge graph. Google’s remote rater documents discuss “your money or your life” (YMYL) topics & talk up the importance of knowing exactly who is behind content, but when Google uses a gray font on the source link for their scrape job they are doing just the opposite.
  • Hidden Costs: Many of the heavily advertised solutions appearing above the knowledge graph have hidden costs yet to be discovered. You can’t find a pharmaceutical company worth tens of billions of dollars that hasn’t pleaded guilty to numerous felonies associated with deceptive marketing and/or massaging research.
  • Artificially Driving Up Prices: In-patent drugs often cost 100x as much as the associated generic drugs, & thus the affordable solutions are priced out of the ad auctions, where the price of a click can vastly exceed the profit from selling a generic prescription drug.

Where’s the business model for publishers when they bear real editorial costs & must fact-check and regularly update their content, and their content is good enough to be featured front & center on Google, yet attribution is nearly invisible (and thus traffic flow is cut off)? As the knowledge graph expands, what does that publishing business model look like in the future?

Does the knowledge graph eventually contain sponsored self-assessment medical quizzes? How far does this cancer spread?

Where do you place your chips?

Google believes it can ultimately fulfill people’s data needs by sending results directly to microchips implanted into its users’ brains.


What’s on your SEO wishlist for 2014?

From an SEO’s perspective, which of Google’s changes in 2013 have you least appreciated?

Julia Logan, Irish Wonder

Where Google is going with the Knowledge base. It is simply becoming a scraper, and on top of that it cynically advises website owners to produce more great content.

Yeah right, so Google has something to scrape.

Dr Pete Meyers, Marketing Scientist at Moz

Again, [not provided]. Easily.

Andrew Girdwood, Media Innovations Director at LBi

I dislike the 100% not provided and the communication around it. Google’s claims of ‘for privacy’ are too easily dismissed until you begin to speculate what other information Google might want to include in search results.

For a company that once claimed to want to help with the world’s information it seems a pleasant move would be to pass keyword data through their improved privacy.

It could investigate a redirect link with a keyword-loaded query string on it, for example. Google has established, with its PPC policy, that keyword data is not a privacy breach.

Will Critchlow, Founder and CMO at Distilled

For immediate impact, (not provided) is my least favourite change. Keywords aren’t everything by any means, but they are useful and the public explanations given are just so disingenuous.

If it really meant the explanation given about privacy, they not only wouldn’t be giving up paid keyword data, but also could have found a sensible middle ground of what to share instead of removing it all.

Directionally, I’m also very much not crazy about the UX changes to image search which seek actively to prevent searchers from visiting the sites that contain the images.

I see this as breaking the implied agreement whereby sites allow Google to crawl their sites in exchange for getting traffic when that crawl discovers a good result.

Richard Baxter, CEO at SEOGadget

Well, I think Panda’s increasing aggressiveness has largely been ignored since Penguin came along. I’ll choose that update, but I actually appreciate it.

It’s an interesting update to work with – when you get really deep into technical work, particularly into the log files of affected sites, you can really see why there’s a problem. The trick is to compare before and after.

What you find with log files is that they tend to confirm what most SEOs just say because they think it’s best practice. Very ‘thin’ pages with little unique content tend to encourage weird crawl behaviours.

For example, much less of the page (total size vs content downloaded) is downloaded by Googlebot than for, say, a content-rich, really developed page. So, as much as I don’t appreciate my job being harder, I certainly appreciate it being more interesting!
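A minimal sketch of the kind of log comparison Baxter describes, assuming Apache/Nginx combined-format access logs and a naive user-agent check (the file name and the top-20 cutoff are placeholders):

```python
import re
from collections import defaultdict

# Rough sketch: average bytes Googlebot downloads per URL, from an access log.
# Assumes combined log format; adjust the regex & path for your server setup.
LINE = re.compile(r'"(?:GET|HEAD) (?P<url>\S+) HTTP/[\d.]+" \d{3} (?P<size>\d+)')

totals = defaultdict(lambda: [0, 0])  # url -> [hits, bytes]

with open("access.log") as log:
    for line in log:
        if "Googlebot" not in line:  # naive bot check; user agents are spoofable
            continue
        m = LINE.search(line)
        if m:
            stats = totals[m.group("url")]
            stats[0] += 1
            stats[1] += int(m.group("size"))

# thin pages tend to show unusually small average transfers per crawl hit
by_avg = sorted(totals.items(), key=lambda kv: kv[1][1] / kv[1][0])
for url, (hits, size) in by_avg[:20]:
    print(f"{size / hits:>10.0f} bytes/hit  {hits:>5} hits  {url}")
```

Run it against logs from before and after an update and compare which URLs Googlebot still bothers to fetch, and how much of each it downloads.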

What do you expect/hope to see in 2014?

Julia Logan:

I dream of a strong competitor rising so we all have some choice, as searchers, site owners and SEOs. This probably won’t happen very soon, maybe not in 2014, but can I dream?

Dr Pete Meyers:

I suspect a strong shift to a more card-based search result, akin to Google Now, Google+, and mobile search.

Google wants to mix and match information seamlessly, regardless of how you consume it. I believe we’ll see a Knowledge Graph expansion based on Google’s index – in other words, it’s going to extract ‘knowledge’ directly from websites more and more (and not just a small set of big databases).

I hope it’ll open some data back up and become more transparent, but I don’t expect it.

Google paid placement

Kevin Gibbons, UK MD at Blueglass:

Bigger and better marketing campaigns. Less focus on tactics, and more integrated strategy across multiple channels.

We’ve certainly seen a shift ourselves towards a more consumer-led and customer-centric strategy, looking to improve the overall user experience across multiple channels and devices.

Focusing much more on the bigger picture and being rewarded by Google as a result – as opposed to more tactical bursts of campaigns.

Andrew Girdwood:

Hope and expect are very different. I hope to see keyword data made better in Webmaster Console along with easier data extraction from the console. I doubt we’ll get that. 

I expect to see more chat around Google+ and for Google to fuel that. I predict more SEO teams will spend more time talking about ‘signals’ rather than just ‘links’.

I fear we’ll see trouble when it comes to the difference between editorial and advertorial. The difference does not seem to be well understood by many bloggers and digital publishers.

Whether it’s in-house teams or SEO agencies doing the outreach doesn’t seem to matter, but too many brands seem too happy to either play to those misunderstandings or actively encourage them.

Will Critchlow:

I expect to see some innovative YouTube ad formats that could set it on the way to becoming a real brand-building platform for the web and see it claim a significant chunk of brand advertising spend.

[Coupled with this, I think we will see a subtle shift away from UGC and towards professional content on YouTube – it could become the equivalent of free-to-air TV versus Netflix’s cable equivalent].

I think Dr. Pete is spot on in his predictions of what we will see on the UI front.

I expect to see some live experimenting with more social ranking factors, particularly in the fresh results.

Teddie Cowell, Director of SEO, Mediacom:  

I expect to see a lot happening around interaction, with search results appearing in more places where we haven’t previously seen them – think of the Android 4.4 KitKat contacts list as a current example of this.

I hope to see some control mechanisms in place for the Knowledge Graph. There have been a few too many factual inaccuracies, and some highly embarrassing ones, so currently it feels like Google is playing with fire in regards to what the Knowledge Graph says.

With particular regard to brands, which by their nature as recognised entities are more likely to trigger the Knowledge Graph, it’s very dangerous territory, because coincidentally they are also some of Google’s most valued advertisers.

Richard Baxter:

Spammy SEO to be gone. Google keeps saying it’s getting better at tackling the bad stuff, but there are still plenty of examples around – it’s a case of ‘do what you said you’d do’ – and of not giving very poor quality SEO agencies any more fuel by way of case studies on their bad, temporary tactics.

It would make it much easier to get the message across that the good guys do a good job and that SEO is a credible, technical and content marketing discipline that is very much here to stay.

Jimmy McCann, Head of SEO at Search Laboratory:

Improved accuracy in the Webmaster Tools link examples that are given under manual action. These have been automated and incorrect in the past – which is a pain.

It would be better if the examples were more explicit and told you exactly what was required to sort out the penalty.

Adam Skalak, Head of SEO at iCrossing

The most significant shift for SEO is that we are no longer limited by keyword phrases. Google’s 2014 expansion of its semantic-search offering, the Knowledge Graph, offers both opportunities and challenges for brands.

Established brands will benefit from greater exposure in more prominent parts of the results pages. But with Google providing answers directly at the top of the page, brands may struggle to increase their organic traffic as this could remove the need for people to click through to their site at all if they don’t need too much detail.