Google’s Effective ‘White Hat’ Marketing Case Study
There’s the safe way & the high-risk approach. The shortcut takers & those who win through hard work & a superior offering.
One is white hat and the other is black hat.
With the increasing search ecosystem instability over the past couple of years, some see these labels constantly sliding, sometimes on an ex post facto basis, turning thousands of white hats into black hats arbitrarily overnight.
Are you a white hat SEO? Or a black hat SEO?
Do you even know?
Before you answer, please have a quick read of this Washington Post article highlighting how Google manipulated & undermined the US political system.
.
.
.
.
.
.
.
Seriously, go read it now.
It’s fantastic journalism & an important read for anyone who considers themselves an SEO.
.
.
.
.
.
.
.
.
######
Take the offline analog of Google’s search “quality” guidelines, and in spirit Google has repeatedly violated every single one of them.
Advertorials
“Creating links that weren’t editorially placed or vouched for by the site’s owner on a page, otherwise known as unnatural links, can be considered a violation of our guidelines… advertorials or native advertising where payment is received for articles that include links that pass PageRank.”
Advertorials are spam, except when they are not: “the staff and professors at GMU’s law center were in regular contact with Google executives, who supplied them with the company’s arguments against antitrust action and helped them get favorable op-ed pieces published”
Deception
Don’t deceive your users.
Ads should be clearly labeled, except when they are not: “GMU officials later told Dellarocas they were planning to have him participate from the audience,” which is just like an infomercial that must be labeled as an advertisement!
Preventing Money from Manipulating Editorial
Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google’s AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.
Money influencing outcomes is wrong, except when it’s not: “Google’s lobbying corps — now numbering more than 100 — is split equally, like its campaign donations, among Democrats and Republicans. … Google became the second-largest corporate spender on lobbying in the United States in 2012.”
Content Quality
The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.
Payment should be disclosed, except when it shouldn’t: “The school and Google staffers worked to organize a second academic conference focused on search. This time, however, Google’s involvement was not publicly disclosed.”
Cloaking
Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.
Cloaking is evil, except when it’s not: even as Google executives peppered the GMU staff with suggestions of speakers and guests to invite to the event, the company asked the school not to broadcast its involvement. “We will certainly limit who we announce publicly from Google”
…and on and on and on…
It’s not safe to assume that just because a specific deceptive technique isn’t included on this page, Google approves of it.
And while they may not approve of something, that doesn’t mean they avoid the strategy when mapping out their own approach.
There’s a lesson & it isn’t a particularly subtle one.
More and more, it looks like that invisible hand shaping the market actually belongs to Google. http://t.co/fFigz7lMSY — Matt Pearce (@mattdpearce) April 13, 2014
Free markets aren’t free. Who could have known?
The Positive Negative SEO Strategy
There’s a case study on Moz on how to get your site back following a link penalty. An SEO working on a client’s site describes what happened when their client got hit with a link penalty. Even though the link penalty didn’t appear to be the client’s fault, it still took months to get their rankings back.
Some sites aren’t that lucky. Some sites don’t get their rankings back at all.
The penalty was due to a false positive. A dubious site had linked out to a number of credible sites in order to disguise its true link target. The client’s site was one of those credible sites, mistaken by Google for a bad actor. It just goes to show how easily credible sites can get hit by negative SEO, and variations thereof.
There’s a tactic in there, of course.
Take Out Your Competitors
Tired of trying to rank better? Need a quicker way? Have we got a deal for you!
Simply build a dubious link site, point some rogue links at sites positioned above yours and wait for Google’s algorithm to do the rest. If you want to get a bit tricky, link out to other legitimate sites, too. Like Wikipedia. Google, even. This will likely confuse the algorithm for a sufficient length of time, giving your tactic time to work.
Those competitors who get hit, and who are smart enough to work out what’s going on, may report your link site, but, hey, there are plenty more link sites where that came from. Roll another one out, and repeat. So long as your link site can’t be connected with you – different PC, different IP address, etc – then what have you got to lose? Nothing much. What have your competitors got to lose? Rank, a lot of time, effort, and the very real risk they won’t get back into Google’s good books. And that’s assuming they work out why they lost rankings.
I’m not advocating this tactic, of course. But we all know it’s out there. It is being used. And the real-world example above shows how easy it is to do. One day, it might be used against you, or your clients.
Grossly unfair, but what can you do about it?
Defensive Traffic Strategy
Pleading to Google is not much of a strategy. Apart from anything else, it’s an acknowledgement that the power is not in your hands, but in the hands of an unregulated arbiter who likely views you as a bit of an annoyance. It’s no wonder SEO has become so neurotic.
It used to be the case that competitors could not take you out by pointing unwanted links at you. No longer. So even more control has been taken away from the webmaster.
The way to manage this risk is the same way risk is managed in finance. Risk can be reduced using diversification. You could invest all your money in one company, or you could split it between multiple companies, banks, bonds and other investment classes. If you’re invested in one company, and they go belly up, you lose everything. If you invest in multiple companies and investment classes, then you’re not as affected if one company gets taken out. In other words, don’t put all your eggs in one basket.
It’s the same with web traffic.
1. Multiple Traffic Streams
If you only run one site, try to ensure your traffic is balanced. Some traffic from organic search, some from PPC, some from other sites, some from advertisements, some from offline advertising, some from email lists, some from social media, and so on. If you get taken out in organic search, it won’t kill you. Alternative traffic streams buy you time to get your rankings back.
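One way to keep yourself honest about that balance is to actually measure it. Here’s a minimal Python sketch (the channel names and visit counts are made up) that computes a Herfindahl-style concentration score for your traffic mix:

```python
# Sketch: measure how concentrated your traffic is across channels.
# The channel names and visit counts below are hypothetical.

def concentration_index(visits_by_channel):
    """Herfindahl-style index: 1.0 = all traffic from one channel,
    1/n = a perfectly even split across n channels."""
    total = sum(visits_by_channel.values())
    shares = [v / total for v in visits_by_channel.values()]
    return sum(s * s for s in shares)

traffic = {
    "organic_search": 7000,
    "ppc": 1200,
    "email": 800,
    "social": 600,
    "referral": 400,
}

hhi = concentration_index(traffic)
print(f"Concentration index: {hhi:.2f}")
if traffic["organic_search"] / sum(traffic.values()) > 0.5:
    print("Warning: more than half your traffic depends on organic search.")
```

The closer the score gets to 1.0, the more a single algorithm update can hurt you.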
2. Multiple Pages And Sites
A “web site” is a construct. Is it a construct applicable to a web that mostly orients around individual pages? If you think in terms of pages, as opposed to a site, then it opens up more opportunities for diversification.
Pages can, of course, be located anywhere, not just on your site. These may take the form of well written, evergreen articles published on other popular sites. Take a look at the top sites in closely related niches and see if there are any opportunities to publish your content on them. Not only does this make your link graph look good, so long as it’s not overt, but you’ll also achieve more diversity.
Consider Barnacle SEO.
Will creatively defines the concept of barnacle SEO as follows:
“Attaching oneself to a large fixed object and waiting for the customers to float by in the current.
Directly applied to local search, this means optimizing your profiles or business pages on a well-trusted, high-ranking directory and working to promote those profiles instead of — or in tandem with — your own website.”
You could also build multiple sites. Why have just one site when you can have five? Sure, there’s more overhead, and it won’t be appropriate in all cases, but again, the multiple site strategy is making a comeback due to Google escalating the risk of having only one site. This strategy also helps get your eggs into multiple baskets.
3. Prepare For the Worst
If you’ve got most of your traffic coming from organic search, then you’re taking a high risk approach. You should manage that risk down with diversification strategies first. Part of the strategy for dealing with negative SEO is not to make yourself so vulnerable to it in the first place.
If you do get hit, have a plan ready to go to limit the time you’re out of the game. The cynical might suggest you have a name big enough to make Google look bad if they don’t show your site.
Lyrics site Rap Genius says that it is no longer penalized within Google after taking action to correct “unnatural links” that it helped create. The site was hit with a penalty for 10 days, which meant people seeking it by name couldn’t find it.
For everyone else, here’s a pretty thorough guide about how to get back in.
Have your “plead with Google” gambit ready to go at a moment’s notice. The lead time to get back into Google can be long, so the sooner you get onto it, the better. Of course, this is really the last course of action. It’s preferable not to make yourself that vulnerable in the first place.
By diversifying.
Bing Lists ‘Alternatives’ In Search Results
Bing recently started testing listing ‘alternatives’ near their local search results.
I wasn’t able to replicate these in other search verticals like flight search, or on an iPhone search, but the format of these alternatives looks similar to the forma…
Flip Guest Blogging on its Head, With Steroids
Guest blogging was once considered a widely recommended white hat technique.
Today our monopoly-led marketplace arbitrarily decided this is no longer so.
Stick a fork in it. Torch it. Etc.
It looks like MyBlogGuest was the “winner” – not appearing on branded terms RT @mattcutts Today we took action on a large guest blog network— Rae Hoffman (@sugarrae) March 19, 2014
Now that rules have changed ex post facto, we can expect to deal with a near endless stream of “unnatural” link penalties for doing what was seen at the time as being:
- natural
- widespread
- common
- low risk
- best practice
Google turns your past client investments into new cost centers & penalties. This ought to be a great thing for the SEO industry. Or maybe not.
As Google scares & expunges smaller players from participating in the SEO market, larger companies keep chugging along.
Today a friend received the following unsolicited email:
Curious about their background, he looked up their past coverage: “Written then offers a number of different content licenses that help the advertiser reach this audience, either by re-branding the existing page, moving the content to the advertiser’s website and re-directing traffic there, or just re-publishing the post on the brand’s blog.”
So that’s basically guest blogging at scale.
And it’s not only guest blogging at scale, but it is guest blogging at scale based on keyword performance:
“You give us your gold keywords. Written finds high-performing, gold content with a built-in, engaged audience. Our various license options can bring the audience to you or your brand to the audience through great content.”
What’s worse is how they pitch this to the people they license content from:
I’m sorry, but taking your most valuable content & turning it into duplicate content by syndicating it onto a Fortune 500 website will not increase your traffic. The Fortune 500 site will outrank you (especially if visitors are redirected to their site!). And when visitors are not redirected, they will still typically outrank you due to their huge domain authority, leading the content on your site to get filtered out of the search results as duplicate content.
And if Google were to come down on anyone in the above sort of situation it would be the smaller independent bloggers who get hit.
This is how SEO works.
Smaller independent players innovate & prove the model.
Google punishes them for being innovative.
And as they are getting punished, a vanilla corporate tweak of the same model rolls out and is white hat.
In SEO it’s not what you do that matters – it’s who your client is.
If you’re not working for a big brand, you’re doing it wrong.
Handling Objections From SEO Clients
If Google’s current war on SEOs wasn’t bad enough when you own the site you work on, it is doubly so for the SEO working for a client. When the SEO doesn’t have sufficient control over the strategy and technology, it can be difficult to get and maintain rankings.
In this post, we’ll take a look at the challenges and common objections the SEO faces when working on a client site, particularly a client who is engaging an SEO for the first time. The SEO will need to fit in with developers, designers and managers who may not understand the role of SEOs. Here are common objections you can expect, and some ideas on how to counter them.
1. Forget About SEO
The objection is that SEO gets in the way. It’s too hard.
It’s true. SEO is complicated. It can often compromise design and site architecture. To managers and other web technicians, SEO can look like a dark art. Or possibly a con. There are no fixed rules as there are in, say, coding, and results are unpredictable.
So why spend time and money on SEO?
One appropriate response is “because your competitors are”.
Building a website is the equivalent of taking the starting line in a race. Some site owners think that’s all they need to do. However, the real race starts after the site is built. Every other competitor has a website, and they’re already off and running in terms of site awareness. Without SEO, visitors may still find a site, but if the site owner is not using the SEO channel, and their competitors are, then the competitors have an advantage in terms of reach.
2. Can’t SEOs Do Their Thing After The Site Is Built?
SEOs can do their thing after the site is built, but it’s more difficult. As a result, it’s likely to be more expensive. Baking SEO into the mix when the site is conceived and built is an easier route.
Just as copywriters require space to display their copy, SEOs require room to manoeuvre. They’ll likely contribute to information architecture, copy, copy markup and internal linking structures. So start talking about SEO as early as possible, particularly during information architecture.
There are three key areas where SEO needs to integrate with design. First, text must be machine readable. Search engines “think” mostly in terms of words, so topics and copy need to relate to search terms visitors may use.
Second, linking architecture and information hierarchies. If pages are buried deep in the site, but deemed important in terms of search, they will likely be elevated in the hierarchy to a position closer to the home page.
Third, crawlability. A search engine sends out a spider, which grabs the source code of your website and dumps it back into the search engine’s database. The spider skips from page to page, following links. If a page doesn’t have a crawlable link pointing to it, it will be invisible to search engines. There are various means of making a site easy to crawl, but one straightforward way is a site map, linked to from each page on the site. The SEO may also want to ensure the site navigation is crawlable.
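To make crawlability concrete, here’s a toy Python sketch. The link graph below is hypothetical; in practice you’d build it from the links a crawler actually finds on your pages. Any page with no crawlable path from the home page is effectively invisible to search engines:

```python
from collections import deque

# Sketch: find pages with no crawlable path from the home page.
# The link graph is hypothetical; keys are pages, values are the
# pages they link out to.

links = {
    "/": ["/about", "/products", "/sitemap"],
    "/about": ["/"],
    "/products": ["/products/widgets"],
    "/products/widgets": ["/"],
    "/sitemap": ["/", "/about", "/products", "/products/widgets"],
    "/old-landing-page": ["/"],  # nothing links TO this page
}

def reachable(start, graph):
    """Breadth-first search: every page a spider can reach from start."""
    seen, queue = {start}, deque([start])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

orphans = set(links) - reachable("/", links)
print("Pages invisible to a crawler following links:", orphans)
```

Note that /old-landing-page links out to the home page but nothing links back to it, so the spider never finds it; a site map page that links to everything fixes exactly this class of problem.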
3. We Don’t Want The SEO To Interfere With Code
SEOs do need to tweak code; however, the mark-up changes required are minor and shouldn’t disrupt the build.
SEOs need to specify title tags and some meta tags. These tags need to be unique for each page on the site, as each page is a possible entry page. A search visitor will not necessarily arrive at the home page first.
The title tag appears in search results as a clickable link, so serves a valuable marketing function. When search visitors consider which link on a search results page to click, the title tag and snippet will influence their decision. The title tag should, therefore, closely match the content of each page.
The second aspect concerns URLs. Ideally, a URL should contain descriptive words, as opposed to numbers and random letters. For example, acme.com/widgets/red-widgets.htm is good, whilst acme.com/w/12345678&tnr.php is less so.
The more often the keyword appears in the URL, the more likely it is to be bolded on a search results page, and therefore the more likely it is to attract a click. It’s also easier for the search engine to determine meaning if a URL is descriptive as opposed to cryptic.
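These two checks — unique title tags and descriptive URLs — are easy to automate. A rough Python sketch, using made-up page data and a deliberately crude definition of “cryptic”:

```python
import re
from collections import Counter

# Sketch: flag duplicate title tags and non-descriptive URLs.
# The page data below is hypothetical.

pages = {
    "acme.com/widgets/red-widgets.htm": "Red Widgets | Acme",
    "acme.com/widgets/blue-widgets.htm": "Blue Widgets | Acme",
    "acme.com/w/12345678": "Acme",
    "acme.com/contact": "Acme",
}

# Each page is a possible entry page, so every title should be unique.
dupes = [title for title, n in Counter(pages.values()).items() if n > 1]
print("Duplicate titles:", dupes)

def looks_cryptic(url):
    """Crude heuristic: the last path segment should be words, not an ID."""
    last = url.rstrip("/").split("/")[-1]
    return bool(re.search(r"\d{4,}", last)) or not re.search(r"[a-z]{3,}", last)

for url in pages:
    if looks_cryptic(url):
        print("Cryptic URL:", url)
```

A real audit would pull this data from a crawl, but the principle is the same: duplicates and ID-style URLs are mechanical problems you can find mechanically.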
4. I’ve Got An SEO PlugIn. That’s All I Need
SEO Plugins cover the on-site basics. But ranking well involves more than covering the basics.
In order to rank well, a page needs to have links from external sites. The higher quality those sites, the more chances your pages have of ranking well. The SEO will look to identify linking possibilities, and point these links to various internal pages on the site.
It can be difficult, near impossible, to get high quality links to brochure-style advertising pages. Links tend to be directed at pages that have unique value.
So, the type and quality of content has more to do with SEO than the way that content is marked up by a generic plugin. The content must attract links and generate engagement. The visitor needs to see a title on a search result, click through, not click back, and, preferably take some action on that page. That action may be a click deeper into the site, a bookmark, a tweet, or some other measurable form of response.
Content that lends itself to this type of interaction includes blog posts, news feeds, and content intended for social network engagement. In this way, SEO-friendly content can be functionally separated from other types of content. Not every page needs to be SEO’d, so SEO can be sectioned off, if necessary.
5. The SEO Is Just Another Technician
If your aim, or your client’s aim, is to attract as much targeted traffic as possible, then SEO integration must be taken just as seriously as design, development, copy and other media. SEO is more than a technical exercise; it’s a strategic marketing exercise, much like Public Relations.
SEO considerations may influence your choice of CMS. It may influence your strategic approach in terms of what type of information you publish. It may change the way you engage visitors. Whilst SEO can be bolted-on afterwards, this is a costly and less-effective way of doing SEO, much like re-designing a site is costly and less effective than getting it right in the planning stage.
6. Why Have Our Rankings Disappeared?
The reality of any marketing endeavour is that it will have a shelf-life. Sometimes, that shelf life is short. Other times, it can run for years.
SEO is vulnerable to the changes made by search engines. These changes aren’t advertised in advance, nor are they easily pinned down even after they have occurred. This is why SEO is strategic, just as Public Relations is strategic. The Public Relations campaign you were using a few years ago may not be the same one you use now, and the same goes for SEO.
The core of SEO hasn’t changed much. If you produce content visitors find relevant, and that content is linked to, and people engage with that content, then it has a good chance of doing well in search engines. However, the search engines constantly tweak their settings, and when they do, a lot of previous work – especially if that work was at the margins of the algorithms – can come undone.
So, ranking should never be taken for granted. The value the SEO brings is that they are across underlying changes in the way the search engines work and can adapt your strategy, and site, to the new changes.
Remember, whatever problems you may have with the search engines, the same goes for your competitors. They may have dropped rankings, too. Or they may do so soon. The SEO will try to figure out why the new top ranking sites are ranked well, then adapt your site and strategy so that it matches those criteria.
7. Why Don’t We Just Use PPC Instead?
PPC has many advantages. The biggest advantage is that you can get top positioning, and immediate traffic, almost instantly. The downside is, of course, you pay per click. Whilst this might be affordable today, keep in mind that the search engine has a business objective that demands they reward the top bidders who are most relevant. Their auction model forces prices higher and higher, and only those sites with deep pockets will remain in the game. If you don’t have deep pockets, or don’t want to be beholden to the PPC channel, a long-term SEO strategy works well in tandem.
SEO and PPC complement one another, and lulls and challenges in one channel can be made up for by the other. Also, you can feed the keyword data from PPC to SEO to gain a deeper understanding of search visitor behaviour.
8. Does SEO Provide Value For Money?
This is the reason for undertaking any marketing strategy.
An SEO should be able to demonstrate value. One way is to measure the visits from search engines before the SEO strategy starts, and see if these increase significantly post-implementation. The value of each search click changes depending on your business case, but can be approximated using PPC bid prices. Keep in mind that the visits from an SEO campaign may be maintained, and increased, over considerable time, driving down their cost relative to PPC and other channels.
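A back-of-the-envelope version of that calculation, with placeholder numbers (every figure below is hypothetical; substitute your own analytics and bid data):

```python
# Sketch: approximate the value of organic visits using PPC bid prices.
# All numbers are hypothetical placeholders.

monthly_organic_visits_before = 4000
monthly_organic_visits_after = 6500
avg_ppc_cpc = 1.80            # what those clicks would cost via PPC
monthly_seo_retainer = 2000.00

extra_visits = monthly_organic_visits_after - monthly_organic_visits_before
equivalent_ppc_spend = extra_visits * avg_ppc_cpc

print(f"Extra organic visits/month: {extra_visits}")
print(f"Equivalent PPC spend: ${equivalent_ppc_spend:,.2f}")
print(f"Value over retainer: ${equivalent_ppc_spend - monthly_seo_retainer:,.2f}")
```

It’s an approximation, not an attribution model, but it gives the client a defensible number to weigh against the retainer.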
Keep Visitors Coming Back
Facebook. A mobile phone. Email. How often do you check them? Many of us have developed habits around these services.
The triggers that help create these habits can be baked into the design of websites. The obvious benefit of doing so is that if you create habits in your users, you’re less reliant on new search visitors for traffic.
How To Build Habit Forming Products
I recently read a book called “Hooked: How To Build Habit Forming Products” by Nir Eyal. Eyal is an entrepreneur who has built and sold two start-ups, including a platform to place advertising within online social games. He also writes for Forbes, TechCrunch, and Psychology Today about the intersection of psychology, technology, and business. This latest book is about how technology shapes behaviour.
If usability is about engineering a site to make things easier, then forming habits is engineering user behaviour so they keep coming back. Forming habits in the user base is a marketer’s dream, yet a lot of search marketing theory is built around targeting the new visitor. As competition rises on the web, traffic becomes more valuable, and its price rises.
Clicks are likely more profitable the less you have to pay for them. If visitors keep returning because the visitor has formed a habit, then that’s a much more lucrative proposition than having to continually find new visitors. Facebook is a habit. Email is a habit. Google is a habit. Amazon is a habit. We keep returning for that fix.
What techniques can we use to help build habits?
Techniques
The book is well worth a read if you’re interested in the psychology of repeat engagement. There are a lot of familiar topics presented in the book, with cross-over into other marketing territory, such as e-mail and social media marketing, but I found it useful to think of engagement in terms of habit formation. Here’s a taste of what Eyal has discovered about habit-forming services.
1. Have A Trigger
A trigger is something that grabs your attention and forces you to react to it. A trigger might be a photo of you that appears in a friend’s Facebook feed. It might be the ping of an email. It might be a notification that someone has reacted to a comment you made on a forum. These triggers help condition a user to take an action.
2. Inspire Action
Action is taken when a user anticipates a reward. An example might be clicking on a link for a free copy of a book. There are two conditions needed for a reward to work. It must be easy and there must be a strong motivation. The investment required – the click and attention – is typically a lower “cost” than the reward – the book. On social sites, like Facebook, the reward of the “like” click is the presumption of a social reward.
3. Variable Reward
The reward in response to the action must be variable. Something different should happen as the result of taking an action. The author gives the example of a slot machine. The reward might occur as the result of an action, or it might not. A slot machine would be boring if you got the exact same result each time you pulled the handle and spun the dials. The fact the slot machine only pays out sometimes is what keeps people coming back. All sports and games work on the basis of variable reward.
An online equivalent is a Twitter or Facebook feed. We keep looking at them because they keep changing. Some days, there isn’t much of interest. Other days, there is. Watching that river of news go past can be an addictive habit, in part, because the reward changes.
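The mechanics are easy to simulate. Here’s a small Python sketch contrasting a fixed reward schedule with a variable one; the 30% payout rate and twenty “checks” are arbitrary choices for illustration:

```python
import random

# Sketch: fixed vs variable reward schedules over 20 "checks" of a feed.
# The 30% payout probability is an arbitrary illustrative value.

random.seed(42)  # make the run repeatable

fixed = [1] * 20                                        # same payoff every check
variable = [1 if random.random() < 0.3 else 0 for _ in range(20)]

# The totals may be comparable, but the variable stream is unpredictable,
# which is the property slot machines (and feeds) exploit.
print("fixed   :", fixed)
print("variable:", variable)
print("variable payout rate:", sum(variable) / len(variable))
```

The fixed schedule is the static brochure site: you know exactly what you’ll get, so there is no reason to check back. The variable one is the feed.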
4. Investment
The user must invest some time and do some work. Each time they invest some time and work, they add something that improves the service. They may add friends in Facebook. They add follows in Twitter. They build up reputation in forums. By adding to it, the service becomes more valuable both to the owner of the service and to the user. The bigger and deeper the network grows, the more valuable it becomes. If all your friends are on it, it’s valuable. This builds ever more triggers, makes actions easier and likely more frequent, and the reward more exciting.
The circle is complete. A habit is formed.
Applying Habit Theory To Websites
Habits create unprompted user engagement. The value is pretty obvious. There’s likely a higher lifetime value per customer than a one-off visit, or on-going visits we have to pay per click. We can spend less time acquiring new customers and more time growing the value to those we already have. If we create an easy mechanism by which that occurs, and spreads, then we’re not as vulnerable to search engines.
If this all sounds very function- and product-oriented, well, it is. So how does this apply to a publishing website? Or a product website that aims for a one-off sale?
Think In Terms Of Habit Formation
For one-off sales, there aren’t opportunities for habit formation in the same way as there might be for, say, Facebook.
Someone who sells big, one-off purchases may not see much point in having customers check in every day. However, when we think in terms of users’ habits, we’d likely better understand why we need to be on Facebook in the first place, or why email marketing is still valuable. If the users are there, that’s where we need to be, too. We need to align ourselves with users’ existing habits.
Developers often give away free apps, but bill for continued use. Once the user gets in the habit of doing something, price becomes less of an issue. Price is much more of an issue before they form a habit, because they wonder if they will get value. Angry Birds, WhatsApp, et al. created a habit first, then cashed in once it was established.
A call-to-action is a trigger. If we look at calls-to-action in social media and mobile applications, they tend to be big, bold and explicit. If users are in the habit of clicking big, bold buttons in other media, then try testing such buttons against your current calls-to-action on web pages. Look to mimic habits and routines your visitors might use in other applications.
Habits can be a defensive strategy. It’s hard for a user to leave a company around which they’ve formed a habit. On the surface, there is a low switching cost between Google and, say, Bing, but how many people really do switch? Google has locked in users’ habits by layering on services such as Gmail, or just by having people get used to its interfaces. Users’ habits increase their switching costs.
There’s a great line in the book:
“Many innovations fail because consumers irrationally overvalue the old while companies irrationally overvalue the new” – John Gourville
Changing user habits is very difficult. Even Google couldn’t do it with Google Video vs the established YouTube. If you’re thinking of getting into an established market, think about how you’re going to break existing habits. A few new features probably isn’t enough. If breaking established habits seems too difficult, you may decide to pick an entirely new niche and try to get users forming a habit around your offering before other early movers show up.
Eyal also discusses emotional triggers. He uses the example of Instagram where users form a habit for emotional reasons, namely the fear of missing out. The fear of missing out is a more passive, internal trigger.
Make It Easy For The User To Take Action
After the trigger comes action. Usability is all about making it easy for the user to take action. Are you putting unnecessary sign-up stages in the way of a user taking action? Does the user really need to sign up before they take action? If you must have a sign up, how about making that process easier by letting people sign in with Facebook logins, or other shared services, where appropriate? Any barrier to action may lessen the chance of a user forming a habit.
Evan Williams, Blogger & Twitter:
Take a human desire, preferably one that has been around for a really long time…identify that desire, then take out steps
The technologies and sites that go big tend to mirror something people already do and have done for a long time. They just make the process easier and more efficient. Email is easier than writing and posting a letter. Creating a blog is easier than seeking a publishing deal or landing a journalism job at a newspaper. Sharing photos with Facebook is easier than doing so offline.
Apple worked on similar principles:
The most obvious thing is that Jobs wanted his products to be simple above all else. But Jobs realized early on that for them to be simple and easy to use, they had to be based on things that people already understood. (Design geeks have since given this idea a clunky name: so-called skeuomorphic user interfaces.) What was true of the first Macintosh graphical interface is true of the iPhone and iPad – the range of physical metaphors, and, eventually, the physical gestures that control them, map directly with what we already do in the real world. That’s the true key to creating an intuitive interface, and Jobs realized it before computers could really even render the real world with much fidelity at all. [An example of “imputing” Apple’s values on the smallest decisions: Jobs spent hours honing the window borders of the first Macintosh GUI. When his designers complained, he pointed out that users would look at those details for hours, so they had to be good.]
Reducing things to the essentials fosters engagement by making an action easier to take. If in doubt, take steps out, and see what happens.
Vary The Reward
Look for ways to reward the user when they take action. Forums use social rewards, such as reputation and status titles. Facebook has “Like” buttons. Inherent in this reward system is the thrill of pursuit. When a visitor purchases from you, or signs up for a newsletter, do you make the visitor feel like they’ve “won”?
Placing feeds on your site is another example of variable reward. The feed content is unpredictable, but that very unpredictability may be enough to keep people coming back. The same goes for blog posts. Compare this with a static brochure site where the “reward” will always be the same.
Can you break a process down into steps where the user is rewarded for taking each little step towards a goal? The reward should match the desires of the visitor. Perhaps the reward is monetary, perhaps it’s social. Gamification is becoming big business and it’s based around the idea of varying reward, action and triggers in order to foster engagement.
Gamification has also been used as a tool for customer engagement, and for encouraging desirable website usage behaviour. Additionally, gamification is readily applicable to increasing engagement on sites built on social network services. For example, in August 2010, one site, DevHub, announced that they had increased the number of users who completed their online tasks from 10% to 80% after adding gamification elements. On the programming question-and-answer site Stack Overflow, users receive points and/or badges for performing a variety of actions, including spreading links to questions and answers via Facebook and Twitter. A large number of different badges are available, and when a user’s reputation points exceed various thresholds, he or she gains additional privileges, including, at the higher end, the privilege of helping to moderate the site.
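The threshold mechanic described above is simple to sketch in code. Here’s a minimal illustration of the idea – note the reputation tiers and privilege names below are invented for the example, not Stack Overflow’s actual values:

```python
# Map a user's reputation score to the privileges it unlocks.
# Tier thresholds and privilege names are hypothetical.
PRIVILEGE_TIERS = [
    (0, "ask and answer questions"),
    (50, "leave comments"),
    (500, "vote to close"),
    (2000, "edit others' posts"),
    (10000, "moderate the site"),
]

def privileges_for(reputation: int) -> list:
    """Return every privilege whose threshold the user has reached."""
    return [name for threshold, name in PRIVILEGE_TIERS if reputation >= threshold]

# A user at 600 reputation has unlocked the first three tiers.
print(privileges_for(600))
```

The engagement hook is that each small action nudges the score toward the next visible threshold, which is the “little steps towards a goal” pattern discussed below.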
Summary
This is “checking” behaviour. We check for something new. We get a variable reward for checking something new. If we help create this behaviour in our visitors, we get higher engagement signals, and we’re less reliant on new visitors from search engines.
Checking habits may change in the near future as more and more informational “rewards” are added to smartphones. The paper argues that novel informational rewards can lead to habitual behaviors if they are very quickly accessible. In a field experiment, when the phone’s contact book application was augmented with real-time information about contacts’ whereabouts and doings, users started regularly checking the application. The researchers also observed that habit-formation for one application may increase habit-formation for related applications.
Optimizing Against Competitors
You’ve got to feel a little sorry for anyone new to the search marketing field.
On one side, they’ve got to deal with the cryptic black box that is Google: often inconsistent, always vague, and sometimes unfair in its dealings with webmasters. On the other side, webmasters must operate in competitive landscapes that often favour incumbent sites, especially if those incumbents are household names.
Sadly, much of the low-hanging search fruit is gone. However, there are a number of approaches to optimization that don’t involve link placement and keyword targeting.
Competitive Advantage
Like any highly active and lucrative market sector, the web business can be challenging, but complaining about the nature of the environment will do little good. The only real option is to grab some boxing gloves, jump in the ring and compete.
In the last post, we talked about measurement. We need to make sure we’re measuring the right things in order to win. This post is about measuring our competitors to see if we enjoy a competitive advantage. If not, we need to rethink our approach.
Underlying Advantages
One of the problems with counting links, and other popular SEO metrics, is that they can be reductive. High link counts and pumped-up Google juice do not guarantee rankings, more traffic, or business success. For example, we might determine our competitor has X links from sites A, B and C, so we should do likewise. If we do likewise, plus a little more, then we win.
But often we don’t.
We often don’t win because there are multiple factors in play. Our competitor’s site might rank for reasons that are difficult to determine, and even more difficult to emulate. They may have brand, engagement metrics or historical advantages. But most challenging of all, they could have some underlying competitive advantage that no amount of link building or ranking for keyword X by a new site will counter. They may just have a better offer.
Winning The Search War Against Your Competitors
There’s an old joke about two guys out walking on the African savannah. They come across a hungry lion. The lion eyes them up, then charges. One man turns and runs. The other yells at him, “You fool, you can’t outrun a lion!” The first man yells back, “That’s true, but I don’t have to outrun the lion. I only have to outrun you!”
Once we figure out what Google wants, we then need to outrun other sites in our niche in order to win. Those sites have to deal with Google’s whims, just like we do.
Typically, webmasters will reverse engineer competitor sites, using web metrics as scores to target and beat. Who is linking to this page? How old are the links? What are their most popular keywords? Where are they getting traffic from? That’s part of the puzzle. However, we also need to evaluate non-technical factors that may be underpinning their business.
Competitive Analysis
Competitive intelligence is an ongoing, systematic analysis of our competitors.
The goal of a competitor analysis is to develop a profile of the strategy changes each competitor might make, each competitor’s possible response to the range of likely strategic moves other firms could make, and each competitor’s likely reaction to industry changes and environmental shifts. Competitive intelligence should have a single-minded objective: to develop the strategies and tactics necessary to transfer market share profitably and consistently from specific competitors to the company. We should look at the sites positioned around and above us and analyse what they do in terms of business.
Do they understand the target market a little better than we do? Are their goals different from ours? If so, how are they different, and why? How are they pricing their products and services? How do their services differ from our own? In other words, do they know something we don’t?
We can optimize for competitive advantage. It’s about identifying what market your competitors capture, and where that market is heading in the future. Once you’ve figured that out, you might be able to discover opportunities your competitors have missed.
How To Undertake Competitive Analysis
It would be great if we could call up our competitors and ask them exactly what they’re doing, how they’re doing it, and where they are heading – and they’d tell us. But we all know that’s not going to happen.
So we have to dig. We don’t want to do too much digging, as it is time consuming, expensive and, truth be told, somewhat tedious. Thankfully, a lot of the answers we need are sitting right in front of us and readily available.
To undertake a competitive analysis, try asking these questions:
- What is the nature of competition?
- Where does the competitor compete?
- Who does the competitor compete against?
- How does the competitor compete?
1. The Nature Of The Competition
The little guy used to prosper in search just by being clever. If you knew the tricks, and the big companies didn’t – and typically, they didn’t – you could beat them easily. This is now harder to do. These days, traditional power structures play a greater role in search results, so it is often the case that big brands can dominate SERPs by virtue of their offline market position. Their market position is creating the signals Google tends to look for, such as regular major press mentions, resulting links and direct search volume, often with little direct SEO effort on the part of the brand.
So, if you’re the little guy coming up against big, entrenched competition, that’s going to be a hard road.
We saw what happened with Adwords, and now the same thing is happening in the main search results. Those with the deepest pockets could run Adwords campaigns that appear to make absolutely no fiscal sense, either because they’re getting their revenue from elsewhere to subsidise the Adwords spend, or, as is often the case, they’re prepared to wage a defensive war of attrition to prevent new competitors entering or dominating their space.
I think these long-term trends are mostly due to increasing competition. As more and more companies bid on Adwords for a finite number of clicks, it inevitably drives up the cost of clicks (simple supply and demand). It also doesn’t help that a lot of Adwords users are not actively managing their campaigns or measuring their ROI, and are consequently bidding at unprofitably high levels. Google also does its best to drive up CPC values in various ways (suggesting ridiculously high default bids, goading you to bid more to get on page 1, not showing your ad at all if you bid too low – even if no other ads appear etc).
Of course, this is just my data for one product in one small market. But the law of shitty clickthrus predicts that all advertising mediums become less and less profitable over time. So I would be surprised if it isn’t a general trend.
In the main search results, a large company’s position will be influenced by the spend it makes elsewhere. Big PR and media campaigns, and the resulting press, links, and mentions in other channels, all result in a big data footprint of attention and interest that Google is unlikely to miss.
However, the little guy still has one advantage that the big businesses seldom have. The little guy is like the speedboat compared to an ocean liner. They may be small, they may be easily swamped in a storm, but they can change direction very quickly. The ocean liner takes a long time to turn around.
The little guy can change direction and get into new markets quickly – “pivot”, in Silicon Valley parlance. The little guy can twist existing markets slightly and invent entirely new ones, whilst bigger businesses tend to sail pre-set courses along known routes. This is how the once-nimble Google trounced their search competitors. They didn’t take the competitors head on; they took a different tack (focusing on the user, not advertisers), made strategic alignments (Yahoo), a few twists and turns (Overture), and eventually worked themselves into the center of the search market. Had they just built another Yahoo, they wouldn’t have got very far.
If you’re a small business or new to a market, then it’s not a great idea to take on a big, entrenched business directly. Rather, look for ways you can outmanoeuvre them. Are there changes in the market they aren’t responding to? Are the markets about to change due to innovations coming over the horizon that you can spot, but they can’t? Look for areas of abrupt change. The little guy is typically well placed to take advantage of rapid change in markets, and of new, fast-developing markets.
Choose your market space carefully.
So, how do you become the next Picasso? The same way you build a powerful brand. Create a new category you can be first in.
…
The best way to become a world-famous artist is to create paintings that are recognized as a new category of art. – Al Ries
2. Where Does The Competitor Compete?
For example, are they limited to a certain geography? Culture? Language? Do they have an offline presence?
You could take their business model to a geographic location they don’t serve. Is there something that succeeds in the US, but has yet to reach Australia? Or Europe? Are your competitors targeting nationally, when you could target locally?
3. Who Do You Compete Against?
Make a list of the top ten competitors in a niche. Compare and contrast their approaches and offerings. Compare their use of language and their relative place in the market. Who is entrenched? Who is up-and-coming?
The up-and-coming sites are interesting. If they’re new, but making headway, it pays to ask why that’s happening. Is it just because they’re getting more links, or is it because they’re doing something new that the market likes? Bit of both?
I think the most interesting opportunities in search are found by watching the sites that aren’t doing much in the way of SEO, but they are rising fast. If they’re not playing hard at “rigging the search vote” in their favour, then their positioning is likely due to genuine interest out in the market.
4. How Does The Competitor Compete?
What are the specifics of the products and services they are offering? Lower prices? High service levels? Do they provide information that can’t be obtained elsewhere? Do they have longevity? Money, staff and resources? Are they building brand? What are they doing besides search?
What prevents you doing likewise?
5. Are They More Engaging?
Google talks about engagement a lot, and we saw engagement metrics become more important after the Panda and Penguin updates.
Panda is really the public face of a much deeper switch towards user engagement. While the Panda score is sitewide, the engagement “penalty” or weighting effect also occurs at the individual page level. The pages or content areas that were hurt less by Panda seem to be the ones that were not also being hurt by the engagement issue.
Engagement is a measure of how interesting visitors find a site. Do people search for your competitors by name, do they click through rather than back to the SERPs, and do they talk about that site to others?
The click-back, or lack thereof, is a hard one to spot if you don’t have access to a website’s data. Take a look at your competitors’ usability. Is their site easy to navigate? Is it obvious where visitors need to click? Are they easy to order from? Is their offer clear? Do they have fast site response times? Of course, we view these things as fundamental; however, many sites still overlook the basics. If you can optimize in these areas, do so. If the competitors ranking above you have good engagement design and content, then you need to match them.
One baseline to look at is branded search volumes. If people are specifically & repeatedly looking for something that typically means they are satisfied with it.
Matt Cutts has recently mentioned that incumbent sites may not enjoy the previous “aged” advantages they’ve had in the past.
This may well be the next big Google shift. It makes sense that Google would reward sites that have higher user utility scores, all other factors being equal. Older sites may have built up a lot of links and positive SEO signals over time, but if their information is outdated and their site cumbersome, the site will likely have low utility. Given the rise of social media, which is all about immediacy and relevance (high utility as perceived by the user), Google would be foolish to reward incumbency at the expense of utility. It’s an area we’re watching closely as it may swing back some advantage to the smaller, nimble players.
6. Do They Have A Good Defensive Position?
Is it hard to enter their market? Competitors may have a lot of revenue to throw around, and considerable historical advantages. Taking on the likes of TripAdvisor would be difficult and expensive, no matter how good the SEO.
If they have a strong defensible position, and you have limited resources, try creating your own unique space. For example, in SEO, you could compete with other SEOs for clients (crowded), or you could become a local trainer who trains existing SEOs in-house (less crowded). You could move from selling widgets to hiring out widgets. You could repackage your widgets with other widgets to create a new product: individual kitchen utensils, packaged together, become a picnic kit.
Look for ways to create slightly different markets that you can make your own.
7. What’s In Their Marketing?
What does their advertising look like? Scanning competitor’s ads can reveal much about what that competitor believes about marketing and their target market.
Are they changing their message? Offering new products? Rebranding? Positioning differently? This is not absolute, of course, but it could offer up some valuable clues. There’s even a Society of Competitive Intelligence Professionals devoted to this very task.
Big Topic
Whilst competitive analysis is a huge topic, the value of even a basic competitive analysis can be considerable.
By doing so, we can adjust our own offering to compete better, or decide that competing directly is not a great idea and that we would be better off entering a closely-related market instead. We may create a whole new niche and have no competition. At least, not for a while. We might make a list of all the things we need to do to match and overtake a fast-rising new challenger who isn’t doing much in the way of SEO.
There’s much more to search competition than algo watching, keywords and links, and many ways to compete and optimize.
Measure For Business Benefit
Matt Cutts is just toying with SEOs these days.
Going by some comments, many SEOs still miss the big picture. Google is not in the business of enabling SEOs. So he may as well have a little fun – Matt has “called it” on guest posting.
Okay, I’m calling it: if you’re using guest blogging as a way to gain links in 2014, you should probably stop. Why? Because over time it’s become a more and more spammy practice, and if you’re doing a lot of guest blogging then you’re hanging out with really bad company.
The hen-house erupted.
The hens should know better by now. If a guest post is good for the audience and site, then do it. If it’s being done for no other reason than to boost rank in Google, then that’s a sign a publishing strategy is weak, high risk, and vulnerable to Google’s whims. Change the publishing strategy.
Measuring What Is Important
Although far from perfect, Google is geared towards recognizing utility. If Google doesn’t recognize utility, then Google will become weaker and someone else will take their place. Only a few people remember AltaVista. They didn’t provide much in the way of utility, and Google ate their lunch.
Which brings me onto the importance of measurement.
It’s important we measure the right things. If people get upset because guest posting is called out, are they upset because they are counting the number of inbound links as if that were the only benefit? Why are they counting inbound links? To get a ranking boost? So, why are some people getting upset? They know Google doesn’t like marketing practices that serve no other purpose than to boost rank. Or are people concerned Google might confuse a post of genuine utility with link spam?
A publishing strategy based on nothing more than Google rankings is not a publishing strategy, it’s a tactic. Given the changes Google has made recently, it’s not a good tactic, because if they can isolate and eliminate SEO tactics, they will. Those who guest post on other sites, and offer guest post placement in order to provide utility, should continue to do so. They are unlikely to eliminate genuine utility, regardless of links, and at worst, they’ll likely ignore the site it appears on.
Interesting
To prosper, we need to be more interesting than the next guy. We need to focus on delivering “interestingness”.
The buzzword term is “visitor engagement”, but that really means “be interesting”. If we provide interesting material, people will read it, and if we provide it on a regular basis, they might come back, remember our brand name, search on that brand name, and link to it – and all that activity combined helps us rank. Ranking is a side effect of being genuinely interesting.
This is not to say that measuring links or page views is unimportant, but such metrics can be an oversimplification when taken in isolation.
Demand Media’s eHow focused on pageviews rather than engagement, which is a big part of the reason why the guys who sold them eHow were able to beat them with wikiHow.
Success depends on achieving the underlying business goal. Perhaps high page views are not important if a site is targeting a very specific audience. Perhaps rankings aren’t all that important if most of the audience is on social media or repeat business. Sometimes, focusing on the wrong metrics leads to the wrong marketing tactics.
What else can we measure? Some common stuff….
- Page views
- Subscriptions
- Comments
- Quality of comments
- Syndication
- Time on site
- Videos watched
- Unique visitors
- Traffic sent to partner sites
- Bookmarking activity
- Search engine exposure
- Brand searches
- Offline mentions
- Online mentions
- Customer satisfaction
- Conversion rates
- Number of inquiries
- Relationships
- Sales
- Reduced costs
The choice of what we measure depends on what we’re trying to achieve. The SEO may say they are trying to achieve a high rank, but why? To get more traffic, perhaps. Why do we want more traffic? In the hope more people will buy our widget.
So, if selling more widgets is the goal, then perhaps more energy needs to be placed into converting the traffic we already have, rather than spending the same energy getting more. Perhaps more time needs to be spent on conversion optimization, or on refining the offer. Or listening to customers, hearing their objections, and writing Q&A that addresses those objections. Guest posting somewhere else and addressing industry-wide objections. Thinking up products to sell to previous customers, and making them aware of changes via an email list. Optimize the interest factor of your site to make it more interesting than your competitors’, then treat the rankings as a bonus. Link building starts with “being interesting”.
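The trade-off between chasing more traffic and converting the traffic you already have is simple arithmetic. A back-of-the-envelope sketch, using purely hypothetical numbers:

```python
def monthly_sales(visitors: int, conversion_rate: float) -> float:
    """Sales are simply traffic multiplied by conversion rate."""
    return visitors * conversion_rate

# Hypothetical baseline: 10,000 visitors/month converting at 1%.
baseline = monthly_sales(10_000, 0.01)

# Option A: double the traffic (often a large SEO/ad spend).
more_traffic = monthly_sales(20_000, 0.01)

# Option B: lift conversion from 1% to 2% (on-site optimization).
better_cro = monthly_sales(10_000, 0.02)

print(baseline, more_traffic, better_cro)
```

Both options produce the same number of sales, but the conversion-rate lift is usually cheaper, within your control, and immune to Google’s whims.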
When it comes to the guest post, if you’re only doing it to get a link, then you’re almost certainly selling yourself short. A guest post should serve a number of functions, such as building awareness, increasing reach and building brand, and should serve your underlying marketing objective. Pick where you post carefully. Deliver real value. If you do guest post, always try to extract far more benefit than just the link.
There was a time when people could put low-quality posts on low-quality sites and enjoy a benefit. But that practice is really just selling a serious web business short.
How Do We Know If We’re Interesting?
There are a couple of different types of measurement marketers use. One is emotional response, where the visitor becomes “positively interested”. This is measured by recall studies, association techniques, customer surveys and questionnaires. However, the type of response online marketers focus on, which is somewhat easier to measure, is behavioural interest. When people are really interested, they do something in response.
So, to measure the effectiveness of a guest post, we might look for increased name or brand searches, or more LinkedIn views. We might look at how many people travel down the links, what they do when they land on the site, and – the most important bit – whether they do whatever it is that translates to the bottom line. Was it subscribing? Commenting? Downloading a white paper? Watching a video? Getting in contact? Tweeting? Bookmarking? What was the thing you wanted them to do in order to serve your bottom line?
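One simple behavioural measure along these lines is brand-search lift: compare branded query volume in the weeks before and after the post went live. A minimal sketch, with entirely made-up weekly counts:

```python
def mean(values):
    """Average of a list of numbers."""
    return sum(values) / len(values)

def brand_search_lift(before, after):
    """Percent change in average weekly branded-search volume."""
    return (mean(after) - mean(before)) / mean(before) * 100

# Hypothetical weekly branded-search counts around a guest post.
weeks_before = [120, 110, 130, 125]
weeks_after = [150, 160, 145, 155]

lift = brand_search_lift(weeks_before, weeks_after)
print(f"{lift:.1f}% lift in branded searches")
```

This won’t prove causation by itself, but a sustained lift that coincides with the post is a far better indicator of genuine interest than the link alone.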
Measurement should be flexible and will be geared towards achieving business goals. SEOs may worry that if they don’t show rankings and links, then the customer will be dissatisfied. I’d wager the customer will be a lot more dissatisfied if they do get a lot of links and a rankings boost, yet no improvement in the bottom line. We could liken this to companies that have a lot of meetings. There is an air of busyness, but are they achieving anything worthwhile? Maybe. Maybe not. We should be careful not to mistake frenzy for productivity.
Measuring links, like measuring the number of meetings, is reductive. So is measuring engagement just by looking at clicks. The picture needs to be broad and strategic. So, if guest posts help you build your business, measured by business metrics, keep doing them. Don’t worry about what Google may or may not do, because it’s beyond your control, regardless.
Control what you can. Control the quality of information you provide.
Disavow & Link Removal: Understanding Google
Fear Sells
Few SEOs took notice when Matt Cutts mentioned on TWIG that “breaking their spirits” was essential to stopping spammers. But that single piece of information adds layers of insight to things like:
- duplicity on user privacy on organic versus AdWords
- benefit of the doubt for big brands versus absolute apathy toward smaller entities
- the importance of identity versus total wipeouts of those who are clipped
- mixed messaging on how to use disavow & the general fear around links
From Growth to No Growth
Some people internalize failure when growth slows or stops. One can’t raise venture capital and keep selling the dream of the growth story unless the blame is internalized. If one understands that another dominant entity (a monopoly) is intentionally subverting the market, then the feel-good belief in the story of unlimited growth flames out.
Most of the growth in the search channel is being absorbed by Google. In RKG’s Q4 report they mentioned that mobile ad clicks were up over 100% for the year & mobile organic clicks were only up 28%.
Investing in Fear
There’s a saying in investing that “genius is declining interest rates” but when the rates reverse the cost of that additional leverage surfaces. Risks from years ago that didn’t really matter suddenly do.
The same is true with SEO. A buddy of mine mentioned getting a bad-link example from Google where the link had been in place longer than Google has existed. Risk can be arbitrarily added after the fact to any SEO activity. Over time, Google can keep shifting the norms of what is acceptable. So long as they are fighting off WordPress hackers and other major issues they are kept busy, but when they catch up on that stuff they can then focus on efforts to shift white to gray and gray to black – forcing people to abandon techniques which offered a predictable positive ROI.
Defunding SEO is an essential & virtuous goal.
Hiding data (and then giving crumbs of it back to profile webmasters) is one way of doing it; adding layers of risk is another. What Panda did to content was add a latent risk where the cost of that risk in many cases vastly exceeded the cost of the content itself. What Penguin did to links was the same thing: make the latent risk much larger than the upfront cost.
As Google dials up their weighting on domain authority, many smaller sites which competed on legacy relevancy metrics like anchor text slide down the result set. When they fall, many of those site owners think they were penalized (even if their slide was primarily driven by a reweighting of factors rather than an actual penalty). Since there is such rampant fearmongering around links, they start there. Nearly every widely used form of link building has at some point been labeled spam by Google engineers:
- Paid links? Spam.
- Reciprocal links? Spam.
- Blog comments? Spam.
- Forum profile links? Spam.
- Integrated newspaper ads? Spam.
- Article databases? Spam.
- Designed by credit links? Spam.
- Press releases? Spam.
- Web 2.0 profile & social links? Spam.
- Web directories? Spam.
- Widgets? Spam.
- Infographics? Spam.
- Guest posts? Spam.
It doesn’t make things any easier when Google sends out examples of spam links which are sites the webmaster has already disavowed or sites which Google explicitly recommended in their webmaster guidelines, like DMOZ.
It is quite the contradiction: Google suggests we should be aggressive marketers everywhere EXCEPT in SEO, where basically any form of link building is far too risky.
It’s a strange world where when it comes to social media, Google is all promote promote promote. Or even in paid search, buy ads, buy ads, buy ads. But when it comes to organic listings, it’s just sit back and hope it works, and really don’t actively go out and build links, even though those are so important. – Danny Sullivan
Google is in no way a passive observer of the web. Rather they actively seek to distribute fear and propaganda in order to take advantage of the experiment effect.
They can find and discredit the obvious, but most on their “spam list” done “well” are ones they can’t detect. So, it’s easier to have webmasters provide you a list (disavows), scare the ones that aren’t crap sites providing the links into submission and damn those building the links as “examples” – dragging them into town square for a public hanging to serve as a warning to anyone who dare disobey the dictatorship. – Sugarrae
This propaganda is so effective that email spammers promoting “SEO solutions” are now shifting their pitches from “grow your business with SEO” to “recover your lost traffic”.
Where Do Profits Come From?
I saw Rand tweet this out a few days ago…
Google’s making manual link building less scalable, and that’s good. It means SEO is more valuable and less of a commodity.— Rand Fishkin (@randfish) January 22, 2014
… and thought “wow, that couldn’t possibly be any less correct.”
When ecosystems are stable you can create processes which are profitable & pay for themselves over the longer term.
I very frequently get the question: ‘what’s going to change in the next 10 years?’ And that is a very interesting question; it’s a very common one. I almost never get the question: ‘what’s not going to change in the next 10 years?’ And I submit to you that that second question is actually the more important of the two – because you can build a business strategy around the things that are stable in time….in our retail business, we know that customers want low prices and I know that’s going to be true 10 years from now. They want fast delivery, they want vast selection. It’s impossible to imagine a future 10 years from now where a customer comes up and says, ‘Jeff I love Amazon, I just wish the prices were a little higher [or] I love Amazon, I just wish you’d deliver a little more slowly.’ Impossible. And so the effort we put into those things, spinning those things up, we know the energy we put into it today will still be paying off dividends for our customers 10 years from now. When you have something that you know is true, even over the long-term, you can afford to put a lot of energy into it. – Jeff Bezos at re: Invent, November, 2012
When ecosystems are unstable, anything approaching boilerplate has an outsized risk added by the dominant market participant. The quicker your strategy can be done at scale or in the third world, the quicker Google shifts it from a positive to a negative ranking signal. It becomes much harder to train entry level employees on the basics when some of the starter work they did in years past now causes penalties. It becomes much harder to manage client relationships when their traffic spikes up and down, especially if Google sends out rounds of warnings they later semi-retract.
What’s more, anything that is vastly beyond boilerplate tends to require a deeper integration and a higher level of investment – making it take longer to pay back. But the budgets for such engagement dry up when the ecosystem itself is less stable. Imagine the sales pitch, “I realize we are off 35% this year, but if we increase the budget 500% we should be in a good spot a half-decade from now.”
All great consultants aim to do more than the bare minimum in order to give their clients a sustainable competitive advantage, but by removing things which are scalable and low risk Google basically prices out the bottom 90% to 95% of the market. Small businesses which hire an SEO are almost guaranteed to get screwed because Google has made delivering said services unprofitable, particularly on a risk-adjusted basis.
Being an entrepreneur is hard. Today Google & Amazon are giants, but it wasn’t always that way. Add enough risk and those streams of investment in innovation disappear. Tomorrow’s Amazon or Google of other markets may die a premature death. You can’t see what isn’t there until you look back from the future – just like the answering machine AT&T held back from public view for decades.
Meanwhile, the Google Venture backed companies keep on keeping on – they are protected.
Gotta love when G Ventures is lead investor in a company that is doing paid blog posts with anchor-friendly links.— valentine (@veezy) January 20, 2014
…and comment spam links, lol. classic.— valentine (@veezy) January 20, 2014
When ad agencies complain about the talent gap, what they are really complaining about is paying people what they are worth. But as the barrier to entry in search increases, independent players die, leaving more SEOs to chase fewer corporate jobs at lower wages. Even companies servicing fortune 500s are struggling.
On an individual basis, the ability to create value and the ability to be fairly compensated for that value are not the same thing. Look no further than companies like Google & Apple, which engaged in flagrantly illegal anti-employee cartel agreements. These companies “partnered” with their direct competitors to screw their own employees. Even if you are on a winning team, it does not mean you will be a winner after you back out higher living costs and such illegal employer agreements.
“This is called now the winner-take-all society. In other words the rewards go overwhelmingly to just the thinnest crust of folks. The winner-take-all society creates incredibly perverse incentives to become a cheater-take-all society. Cause my chances of winning an honest competition are very poor. Why would I be the one guy or gal who would be the absolute best in the world? Why not cheat instead?” – William K Black
Meanwhile, complaints about the above sorts of inequality or other forms of asset stripping are pitched as being aligned with Nazi Germany’s treatment of Jews. Obviously we need more H-1B visas to further drive down wages even as graduates are underemployed with a mountain of debt.
A Disavow For Any (& Every) Problem
Removing links is perhaps the single biggest growth area in SEO.
Just this week I got an unsolicited email from an SEO listing directory:
We feel you may qualify for a Top position among our soon to be launched Link Cleaning Services Category and we would like to learn more about Search Marketing Info. Due to the demand for link cleaning services we’re poised to launch the link cleaning category. I took a few minutes to review your profile and felt you may qualify. Do you have time to talk this Monday or Tuesday?
Most of the people I interact with tend to skew toward the more experienced end of the market. Some of the folks who join our site do so after their traffic falls off. In some cases the issues look intimately tied to Panda & the sites with hundreds of thousands of pages may have only a couple dozen inbound links. In spite of having few inbound links & us telling them the problem looks clearly aligned with Panda, some people presume the issue is links & that they still need to submit a disavow file.
Why do they make that presumption? It’s the fear message Google has been selling nonstop for years.
Punishing people is much different, and dramatic, from not rewarding. And it feeds into the increasing fear that people might get punished for anything. – Danny Sullivan
What happens when Google hands out free all-you-can-eat gummy bear laxatives to children at the public swimming pool? A tragedy of the commons.
Rather than questioning or countering the fear stuff, the role of the SEO industry has largely been to act as lap dogs, syndicating & amplifying the fear.
- link tool vendors want to sell proprietary clean up data
- SEO consultants want to tell you that they are the best and if you work with someone else there is a high risk hidden in the low price
- marketers who crap on SEO to promote other relabeled terms want to sell you on the new term and paint the picture that SEO is a self-limiting label & a backward looking view of marketing
- paid search consultants want to enhance the perception that SEO is unreliable and not worthy of your attention or investment
Even entities with a 9 figure valuation (and thus plenty of resources to invest in a competent consultant) may be incorrectly attributing SEO performance problems to links.
A friend recently sent me a link removal request from Buy Domains referring to a post which linked to them.
On the face of it, this is pretty absurd, no? A company whose entire business is trading in domain names asks that a reference to it be removed from a fairly credible webpage recommending them.
The big problem for Buy Domains is not backlinks. They may have had an issue with some of the backlinks from PPC park pages in the past, but now those run through a redirect and are nofollowed.
Their big issue is that they have less than great engagement metrics (as do most marketplace sites other than eBay & Amazon which are not tied to physical stores). That typically won’t work if the entity has limited brand awareness coupled with having nearly 5 million pages in Google’s index.
They not only have pages for each individual domain name, but they link to their internal search results from their blog posts & those search pages are indexed. Here’s part of a recent blog post:
And here are examples of the thin listing sorts of pages which Panda was designed in part to whack. These pages were among the millions indexed in Google.
A marketplace with millions of pages that doesn’t have broad consumer awareness is likely to get nailed by Panda. And the websites linking to it are likely to end up in disavow files, not because they did anything wrong but because Google is excellent at nurturing fear.
What a Manual Penalty Looks Like
Expedia saw a 25% decline in search visibility due to an unnatural links penalty, causing their stock to fall 6.4%. Both Google & Expedia declined to comment. Expedia’s undoing appears to have stemmed from Hacker News feedback & coverage of an outing story on an SEO blog that sounded like the product of an extortion attempt. USA Today asked if the Expedia campaign was a negative SEO attack.
While Expedia’s stock drop was anything but trivial, they will likely recover within a week to a month.
Smaller players can wait and wait and wait and wait … and wait.
Manual penalties are no joke, especially if you are a small entity with no political influence. The impact of them can be absolutely devastating. Such penalties are widespread too.
In Google’s busting bad advertising practices post they highlighted having zero tolerance, banning more than 270,000 advertisers, removing more than 250,000 publisher accounts, and disapproving more than 3,000,000 applications to join their ad network. All that was in 2013 & Susan Wojcicki mentioned Google having 2,000,000 sites in their display ad network. That would mean something like 12% of their business partners were churned last year alone.
If Google’s churn is that aggressive with their own partners (where Google has an economic incentive for the relationship), imagine how much higher the churn is across the broader web. In this video Matt Cutts mentioned that Google takes over 400,000 manual actions each month & they get about 5,000 reconsideration request messages each week, so over 95% of the sites which receive notification never reply. Many of those who do reply are wasting their time.
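The percentages above follow directly from the disclosed figures; here is the arithmetic made explicit (the input numbers are the ones cited in this post, the calculation is just back-of-the-envelope division):

```python
# Back-of-the-envelope math on Google's own disclosed numbers (2013).
# All inputs come from the figures cited above; only the arithmetic is new.

def pct(part: float, whole: float) -> float:
    """Return part as a percentage of whole, rounded to one decimal place."""
    return round(part / whole * 100, 1)

publishers_removed = 250_000        # publisher accounts removed in 2013
display_network_sites = 2_000_000   # sites in the display ad network
churn = pct(publishers_removed, display_network_sites)

manual_actions_per_month = 400_000
recon_requests_per_week = 5_000
recon_requests_per_month = recon_requests_per_week * 52 / 12  # ~21,667
reply_rate = pct(recon_requests_per_month, manual_actions_per_month)

print(f"partner churn: ~{churn}%")                 # ~12.5%
print(f"reply rate: ~{reply_rate}%")               # ~5.4%, i.e. ~95% never reply
```

So the “something like 12%” churn and “over 95% never reply” figures both check out against the disclosed totals.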
The Disavow Threat
Originally when disavow was launched it was pitched as something to be used with extreme caution:
This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.
Recently Matt Cutts has encouraged broader usage. He has one video which discusses proactively disavowing bad links as they come in & another where he mentioned how a large company disavowed 100% of the backlinks that came in for a year.
The idea of proactively monitoring your backlink profile is quickly becoming mainstream – yet another recurring fixed cost center in SEO with no upside to the client (unless you can convince the client SEO is unstable and they should be afraid – which would ultimately retard their long-term investment in SEO).
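For reference, the disavow file itself is nothing exotic: a plain UTF-8 text file with one full URL or one `domain:example.com` entry per line, and `#` marking comment lines. A minimal sketch of generating one follows; the `build_disavow` helper and the domain names are made up for illustration, not any particular vendor's tool:

```python
# Minimal sketch of building a Google-style disavow file: plain UTF-8 text,
# one URL or "domain:" entry per line, "#" for comments.
# The domains below are hypothetical examples only.

from urllib.parse import urlparse

def build_disavow(bad_urls, bad_domains):
    """Return disavow-file text: whole domains first, then individual URLs."""
    lines = ["# disavow file - example entries only"]
    for domain in sorted(set(bad_domains)):
        lines.append(f"domain:{domain}")
    for url in sorted(set(bad_urls)):
        # Google expects full URLs; skip anything that isn't http(s)
        if urlparse(url).scheme in ("http", "https"):
            lines.append(url)
    return "\n".join(lines) + "\n"

print(build_disavow(
    bad_urls=["http://spam-widgets.example/page1"],
    bad_domains=["link-network.example"],
))
```

The resulting file is uploaded through the disavow tool in Webmaster Tools; a `domain:` entry disavows every link from that site, which is why sloppy bulk disavows sweep up innocent linkers so easily.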
Given the harshness of manual actions & algorithms like Penguin, they drive companies to desperation, acting irrationally based on fear.
People are investing to undo past investments. It’s sort of like riding a stock down 60%, locking in the losses by selling it, and then using the remaining 40% of the money to buy put options or short sell the very same stock. :D
Some companies are so desperate to get links removed that they “subscribe” sites which linked to them organically to streams of spam emails asking that the links be removed.
Some go so far that they not only email you again and again, but they also create dedicated pages on their sites claiming that the emails are real.
What’s so risky about the above is that many webmasters will remove links sight unseen, even at the request of an anonymous Gmail account. Mix in the above sort of “this message is real” stuff and how easy would it be for a competitor to target all your quality backlinks with a “please remove my links” message? Further, how easy would it be for a competitor aware of such a campaign to drop a few hundred dollars on Fiverr or XRumer or other similar link sources, building up your spam links while removing your quality links?
A lot of the “remove my link” messages are based around lying to the people who are linking & telling them that the outbound link is harming them as well: “As these links are harmful to both yours and our business after penguin2.0 update, we would greatly appreciate it if you would delete these backlinks from your website.”
Here’s the problem though. Even if you spend your resources and remove the links, people will still likely add your site to their disavow file. I saw a YouTube video recording of an SEO conference where 4 well known SEO consultants mentioned that even if they remove the links “go ahead and disavow anyhow,” so there is absolutely no upside for publishers in removing links.
How Aggregate Disavow Data Could Be Used
Recovery is by no means guaranteed. In fact, of the people who go to the trouble to remove many links & create a disavow file, only 15% claim to have seen any benefit.
The other 85% who weren’t sure of any benefit may not have only wasted their time, but they may have moved some of their other projects closer toward being penalized.
Let’s look at the process:
- For the disavow to work you also have to have some links removed.
- Some of the links that are removed may not have been the ones that hurt you in Google, thus removing them could further lower your rank.
- Some of the links you have removed may be the ones that hurt you in Google, while also being ones that helped you in Bing.
- The Bing & Yahoo! Search traffic hit comes immediately, whereas the Google recovery only comes later (if at all).
- Many forms of profit (from client services or running a network of sites) come from systematization. If you view everything that is systematized or scalable as spam, then you are not only disavowing to try to recover your penalized site, but you are sending co-citation disavow data to Google which could lead them to torch other sites connected to those same sources.
- If you run a network of sites & use the same sources across your network and/or cross link around your network, you may be torching your own network.
- If you primarily do client services & disavow the same links you previously built for past clients, what happens to the reputation of your firm when dozens or hundreds of past clients get penalized? What happens if a discussion forum thread on Google Groups or elsewhere starts up where your company gets named & then a tsunami of pile on stuff fills out in the thread? Might that be brand destroying?
The disavow and review process is not about recovery, but is about collecting data and distributing pain in a game of one-way transparency. Matt has warned that people shouldn’t lie to Google…
Don’t lie in a reconsideration request. Just don’t go there at all. Makes Matt *angry*! Matt smash spam!— Matt Cutts (@mattcutts) June 16, 2008
…however Google routinely offers useless non-information in their responses.
Some Google webmaster messages leave a bit to be desired.
Hey @mattcutts, have you seen the new Google manual action revoked messages? http://t.co/0NiYG0A3CB. Perhaps outsourcing too far ;-)…?— Dan Sharp (@screamingfrog) January 6, 2014
Recovery is uncommon. Your first response from Google might take a month or more. If you work for a week or two on clean up and then the response takes a month, the penalty has already lasted at least 6 weeks. And that first response might be something like this:
Reconsideration request for site.com: Site violates Google’s quality guidelines
We received a reconsideration request from a site owner for site.com/.
We’ve reviewed your site and we believe that site.com/ still violates our quality guidelines. In order to preserve the quality of our search engine, pages from site.com/ may not appear or may not rank as highly in Google’s search results, or may otherwise be considered to be less trustworthy than sites which follow the quality guidelines.
For more specific information about the status of your site, visit the Manual Actions page in Webmaster Tools. From there, you may request reconsideration of your site again when you believe your site no longer violates the quality guidelines.
If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum.
Absolutely useless.
Zero useful information whatsoever.
As people are unsuccessful in the recovery process they cut deeper and deeper. Some people have removed over 90% of their link profile without recovering & have been nearly a half-year into the (12-step) “recovery” process before even getting a single example of a bad link from Google. In some cases the bad links Google identified were obviously created by third-party scraper sites & were not in Google’s original sample of links to look at (so even if you looked at every single link they showed you & cleaned up 100% of the issues, you would still be screwed).
Another issue with aggregate disavow data is there is a lot of ignorance in the SEO industry in general, and people who try to do things cheap (essentially free) at scale have an outsized footprint in the aggregate data. For instance, our site’s profile links are nofollowed & our profiles are not indexed by Google. In spite of this, examples like the one below are associated with not 1 but 3 separate profiles for a single site.
Our site only has about 20,000 to 25,000 unique linking domains. However over the years we have had well over a million registered user profiles. If only 2% of the registered user profiles were ignorant spammers who spammed our profile pages and then later added our site to a disavow file, we would have more people voting *against* our site than we have voting for it. And that wouldn’t be because we did anything wrong, but rather because Google is fostering an environment of mixed messaging, fear & widespread ignorance.
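To make that concrete, here is the profile-spam scenario as bare arithmetic (all the figures are the ones cited above):

```python
# The scenario above as plain arithmetic: how few ignorant spammers it takes
# for disavow "votes against" a site to outnumber its real linking domains.

registered_profiles = 1_000_000   # well over a million registered profiles
spammer_share = 0.02              # suppose just 2% spam their profile pages
votes_against = int(registered_profiles * spammer_share)

unique_linking_domains = 20_000   # low end of the 20k-25k range cited above

print(f"disavow votes against: {votes_against}")
print(votes_against >= unique_linking_domains)   # True
```

A 2% bad-actor rate among profiles is enough to outweigh the entire organic link profile, which is the whole problem with treating aggregate disavow data as a trustworthy signal.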
Another factor, with Google saying “you haven’t cut out enough bone marrow yet” while suggesting that virtually any/every type of link is spam, is that there are going to be many other forms of false positives in the aggregate data.
I know some companies specializing in link recovery which in part base aspects of their disavows on the linking site’s ranking footprint. If you get a manual penalty, a Panda penalty, or your site gets hacked, those sorts of services may re-confirm that your site deserves to be penalized (on a nearly automated basis, with little to no thought) based on the fact that it is already penalized. Good luck recovering from that as Google folds aggregate disavow data into justifications for further penalties.
Responsibility
All large ecosystems are gamed. We see it with app ratings & reviews, stealth video marketing, advertising, malware installs, and of course paid links.
Historically in search there has been the view that you are responsible for what you have done, but not the actions of others. The alternate roadmap would lead to this sort of insanity:
Our system has noticed that in the last week you received 240 spam emails. As a result, your email account was temporarily suspended. Please contact the spammers, and once you have proof they unsubscribed you from their spam databases, we will reconsider reopening your email account.
As Google has closed down their own ecosystem, they allow their own $0 editorial to rank front & center even if it is pure spam, but third parties are now held to a higher standard – you could be held liable for the actions of others.
At the extreme, one of Google’s self-promotional automated email spam messages sent a guy to jail. In spite of such issues, Google remains unfazed, adding a setting which allows anyone on Google+ to email other members.
Ask Google if they should be held liable for the actions of third parties and they will tell you to go to hell. Their approach to copyright remains fuzzy, they keep hosting more third party content on their own sites, and even when that content has been deemed illegal they scream that it undermines their first amendment rights if they are made to proactively filter:
Finally, they claimed they were defending free speech. But it’s the courts which said the pictures were illegal and should not be shown, so the issue is the rule of law, not freedom of speech.
…
the non-technical management, particularly in the legal department, seems to be irrational to the point of becoming adolescent. It’s almost as if they refuse to do something entirely sensible, and which would save them and others time and trouble, for no better reason than that someone asked them to.
Monopolies with nearly unlimited resources shall be held liable for nothing.
Individuals with limited resources shall be liable for the behavior of third parties.
Google Duplicity (beta).
Torching a Competitor
As people have become more acclimated toward link penalties, a variety of tools have been created to help make sorting through the bad ones easier.
“There have been a few tools coming out on the market since the first Penguin – but I have to say that LinkRisk wins right now for me on ease of use and intuitive accuracy. They can cut the time it takes to analyse and root out your bad links from days to minutes…” – Dixon Jones
But as there have been more tools created for sorting out bad links & more tools created to automate sending link removal emails, two things have happened:
- Google is demanding more links be removed to allow for recovery
Failed reconsideration requests are now coming with this email that tells site owners they must remove more links: pic.twitter.com/tiyXtPvY32— Marie Haynes (@Marie_Haynes) January 2, 2014
- people are becoming less responsive to link removal requests as they get bombarded with them
- Some of these tools keep bombarding people over and over again weekly until the link is removed or the emails go to the spam bin
- to many people the link removal emails are the new link request emails ;)
- one highly trusted publisher who participates in our forums stated they filtered the word “disavow” to automatically go to their trash bin
- on WebmasterWorld a member decided it was easier to delete their site than deal with the deluge of link removal spam emails
The problem with Google rewarding negative signals is there are false positives and it is far cheaper to kill a business than it is to build one. The technically savvy teenager who created the original version of the software used in the Target PoS attack sold the code for only $2,000.
There have been some idiotic articles like this one on The Awl suggesting that comment spamming is now dead as spammers run for the hills, but that couldn’t be further from the truth. Some (not particularly popular) blogs are getting hundreds to thousands of spam comments daily & WordPress can have trouble even backing up the database (unless the comment spam is regularly deleted) as the database can quickly accumulate a million records.
The spam continues but the targets change. A lot of these comments are now pointed at YouTube videos rather than ordinary websites.
As Google keeps leaning into negative signals, one can expect a greater share of spam links to be created for negative SEO purposes.
Maybe this maternity jeans comment spam is tied to the site owner, but if they didn’t do it, how do they prove it?
With WMT admission that linkspam MUST be removed we are past tipping point; it’s now a risk to not engage in neg SEO against all comp. #sad— Cygnus SEO (@CygnusSEO) January 2, 2014
Once again, I’ll reiterate Bill Black
“This is called now the winner-take-all society. In other words the rewards go overwhelmingly to just the thinnest crust of folks. The winner-take-all society creates incredibly perverse incentives to become a cheater-take-all society. Cause my chances of winning an honest competition are very poor. Why would I be the one guy or gal who would be the absolute best in the world? Why not cheat instead?” – William K Black
The cost of “an academic test” can be as low as $5. You know you might be in trouble when you see fiverr.com/conversations/theirusername in your referrers:
Our site was hit with negative SEO. We have manually collected about 24,000 bad links for our disavow file (so far). It probably cost the perp $5 on Fiverr to point these links at our site. Do you want to know how bad that sucks? I’ll tell you. A LOT!! Google should be sued enmass by web masters for wasting our time with this “bad link” nonsense. For a company with so many Ph.D’s on staff, I can’t believe how utterly stupid they are
Or, worse yet, you might see SAPE in your referrers
And if the attempt to get you torched fails, they can try & try again. The cost of failure is essentially zero. They can keep pouring on the fuel until the fire erupts.
Even Matt Cutts complains about website hacking, but that doesn’t mean you are free of risk if someone else links to your site from hacked blogs. I’ve been forwarded unnatural link messages from Google which came about after a person’s site was added to a SAPE hack by a third party in an attempt to conceal who the beneficial target was. When in doubt, Google may choose to blame all parties in a scorched-earth strategy.
If you get one of those manual penalties, you’re screwed.
Even if you are not responsible for such links, and even if you respond on the same day, and even if Google believes you, you are still likely penalized for AT LEAST a month. Most likely Google will presume you are a liar and you have at least a second month in the penalty box. To recover you might have to waste days (weeks?) of your life & remove some of your organic links to show that you have gone through sufficient pain to appease the abusive market monopoly.
As bad as the above is, it is just the tip of the iceberg.
- People can redirect torched websites.
- People can link to you from spam link networks which rotate links across sites, so you can’t possibly remove or even disavow all the link sources.
- People can order you a subscription of those rotating spam links from hacked sites, where new spam links appear daily. Google mentioned discovering 9,500 malicious sites daily & surely the number has only increased from there.
- People can tie any/all of the above with cloaking links or rel=canonical messages to GoogleBot & then potentially chain that through further redirects cloaked to GoogleBot.
- And on and on … the possibilities are endless.
Extortion
Another thing this link removal fiasco subsidizes is various layers of extortion.
Not only are there harassing emails threatening to add sites to disavow lists if they don’t remove the links, but some companies quickly escalate things from there. I’ve seen hosting abuse, lawyer threat letters, and one friend was actually sued in court (and the people who sued him were the ones who actually had the link placed!).
Google created a URL removal tool which allows webmasters to remove pages from third party websites. How long until that is coupled with DDoS attacks? Once effective with removing one page, a competitor might decide to remove another.
Another approach to get links removed is to offer payment. But payment itself might encourage the creation of further spammy links as link networks look to replace their old cashflow with new sources.
The recent Expedia fiasco started as an extortion attempt: “If I wanted him to not publish it, he would ‘sell the post to the highest bidder.’”
Another nasty issue here is articles like this one on Link Research Tools, where they not only highlight client lists of particular firms, but then state which URLs have not yet been penalized followed by “most likely not yet visible.” So long as that sort of “publishing” is acceptable in the SEO industry, you can bet that some people will hire the SEOs nearly guaranteeing a penalty to work on their competitor’s sites, while having an employee write a “case study” for Link Research Tools. Is this the sort of bullshit we really want to promote?
Some folks are now engaging in overt extortion:
I had a client phone me today and say he had a call from a guy with an Indian accent who told him that he will destroy his website rankings if he doesn’t pay him £10 per month to NOT do this.
Branding / Rebranding / Starting Over
Sites that are overly literal in branding likely have no chance at redemption. That triple hyphenated domain name in a market that is seen as spammy has zero chance of recovery.
Even being a generic unbranded site in a YMYL category can make you be seen as spam. The remote rater documents stated that the following site was spam…
… even though the spammiest thing on it was the stuff advertised in the AdSense ads:
For many (most?) people who receive a manual link penalty or are hit by Penguin it is going to be cheaper to start over than to clean up.
At the very minimum it can make sense to lay groundwork for a new project immediately just in case the old site can’t recover or takes nearly a year to recover. However, even if you figure out the technical bits, as soon as you have any level of success (or as soon as you connect your projects together in any way) you once again become a target.
And you can’t really invest in higher level branding functions unless you think the site is going to be around for many years to earn off the sunk cost.
Succeeding at SEO is not only about building rank while managing cashflow and staying unpenalized, but it is also about participating in markets where you are not marginalized due to Google inserting their own vertical search properties.
Even companies which are large and well funded may not succeed with a rebrand if Google comes after their vertical from the top down.
Hope & Despair
If you are a large partner affiliated with Google, hope is on your side & you can monetize the link graph: “By ensuring that our clients are pointing their links to maximize their revenue, we’re not only helping them earn more money, but we’re also stimulating the link economy.”
You have every reason to be Excited, as old projects like Excite or Merchant Circle can be relaunched again and again.
Even smaller players with the right employer or investor connections are exempt from these arbitrary risks.
@badams @seobook They claim to have removed just over 200 links! Here we have people removing hundreds of thousands and still going nowhere!— Rohan Ayyar (@searchrook) January 6, 2014
You can even be an SEO and start a vertical directory knowing you will do well if you can get that Google Ventures investment, even as other similar vertical directories were torched by Panda.
Hehhe just saw a guest post complete with commercial anchor written by an “SEO Guru” whose also a Google employee— Tony Spencer (@notsleepy) January 7, 2014
For most other players in that same ecosystem, the above tailwind is a headwind. Don’t expect much 1 on 1 help in webmaster tools.
As noted above, Matt Cutts mentioned that Google takes over 400,000 manual actions each month & gets about 5,000 reconsideration request messages each week, so over 95% of the sites which receive notification never reply. Many of those who do reply are wasting their time. How many confirmed Penguin 1.0 recoveries are you aware of?
Even if a recovery is deserved, it does not mean one will happen, as errors do happen. And on the off chance recovery happens, recovery does not mean a full restoration of rankings.
There are many things we can learn from Google’s messages, but probably the most important is this:
It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to heaven, we were all going direct the other way – in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only. – Charles Dickens, A Tale of Two Cities
Yahoo! Secured Search Rolls Out
Yahoo! is currently rolling out secured search, which prevents sending referrers to unsecured sites. The roll out is ongoing, but currently they do pass data to secured sites. Unlike Google’s secured search roll out:
rather than showing a referrer wit…
Google’s Chris DiBona On Search Ecosystem Diversity
It’s hard to deny that some folks working at Google are geniuses. It’s also hard to deny the disconnect in their messaging.
As Google locked down their “open” ecosystem (compatibility as a club, abandonware, deleting privacy settings, extensions required to be installed from store, extensions required to be single-purpose, forced Google+ integration, knowledge graph scrape-n-displace, “We could either sue him or hire him,” etc.), I thought an interview of one of their open source evangelists would be entertaining.
Chris DiBona delivered:
Can you imagine if you didn’t have the malware protection and the process isolation of Chrome, that Chrome brought to other browsers? Can you imagine surfing the web the way it is right now? It’s pretty grim. There’s a lot of malware. You end up basically funnelling people into fewer and fewer sites, and therefore fewer and fewer viewpoints and all the rest.
Sometimes large sites serve malware through their ads. Ad networks are one of the best ways to distribute malware. The super networks core to the web ecosystem are home to much of the malware – even GoogleBot was tricked into performing MySQL injection attacks. But even if we ignored that bit, it doesn’t take much insight to realize that Google is achieving that same “kill diversity” goal through other means…
This Mozcast chart always looks the same. As the web grows, Google allows less and less of it to appear in search. http://t.co/PRX4NGHZfH— Dan Thies (@danthies) September 4, 2013
…as they roll out many algorithmic filters, manual penalties, selectively enforce these issues on smaller players (while giving more exploitative entities a free pass), insert their own vertical search services, dial up their weighting on domain authority, and require smaller players to proactively police the rest of the web while Google thinks the n-word 85 times is totally reasonable on their own tier-1 properties.
We have another post coming on the craziness of disavows and link removals, but it has no doubt gone beyond absurd at this point.
Failed reconsideration requests are now coming with this email that tells site owners they must remove more links: pic.twitter.com/tiyXtPvY32— Marie Haynes (@Marie_Haynes) January 2, 2014
With WMT admission that linkspam MUST be removed we are past tipping point; it’s now a risk to not engage in neg SEO against all comp. #sad— Cygnus SEO (@CygnusSEO) January 2, 2014
Why is diversity so important?
Dissent evolves markets. The status quo doesn’t get changed by agreeing & aligning with existing power structures. Anyone who cares to debate this need only look at Google’s ongoing endless string of lawsuits. Most of those lawsuits are associated with Google (rightly or wrongly) taking power from what they view as legacy entities.
Even on a more personal level, one’s investment returns are likely to be better when things are out of favor:
“Investors should remember that excitement and expenses are their enemies. And if they insist on trying to time their participation in equities, they should try to be fearful when others are greedy and greedy only when others are fearful.” – Warren Buffett
In many markets, returns and popularity are inversely proportional.
Investing in Internet stocks in 1999 was popular, but for those who stayed too long at the party it was a train wreck. Domain name speculators who bought into the carnage a couple years later did well.
Society is so complex & inter-connected that it’s very easy to think things run far more smoothly than they do & thus buy into too many fibs that seem self-evident until the moment they are not.
Popularity is backward looking, enabling the sheep to be sheared.
Unfortunately depth & diversity are being sacrificed to promote pablum from well known entities in formats that are easy to disintermediate & monetize.
Think about it: an actual scientist who produces actual knowledge should be more like a journalist who recycles fake insights! This is beyond popularisation. This is taking something with value and substance and coring it out so that it can be swallowed without chewing. This is not the solution to our most frightening problems – rather this is one of our most frightening problems.
– Benjamin Bratton
Innovative knowledge creation and thought reading tattoos — the singularity is near.
SEO 2014
We’re at the start of 2014.
SEO is finished.
Well, what we had come to know as the practical execution of “whitehat SEO” is finished. Google has defined it out of existence. Research a keyword. Write a page targeting the keyword. Build links with that keyword as the anchor text. Google cares not for this approach.
SEO, as a concept, is now an integral part of digital marketing. To do SEO in 2014 – Google-compliant, whitehat SEO – digital marketers must seamlessly integrate search strategy into other aspects of digital marketing. It’s a more complicated business than traditional SEO, but offers a more varied and interesting challenge, too.
Here are a few things to think about for 2014.
1. Focus On Brand
Big brands not only get a free pass, they can get extra promotion. By being banned. Take a look at Rap Genius. Aggressive link-building strategy leads to de-indexing. A big mea culpa follows and what happens? Not only do they get reinstated, they’ve earned themselves a wave of legitimate links.
Now that’s genius.
Google would look deficient if they didn’t show that site as visitors would expect to find it – enough people know the brand. To not show a site that has brand awareness would make Google look bad.
Expedia’s link profile was similarly outed for appearing to be at odds with Google’s published standards. Could a no-name site pass a hand inspection if they use aggressive linking? Unlikely.
What this shows is that if you have a brand important enough so that Google would look deficient by excluding it, then you will have advantages that no-name sites don’t enjoy. You will more likely pass manual inspections, and you’re probably more likely to get penalties lifted.
What is a brand?
In terms of search, it’s a site that visitors can find using a brand search. Just how much search volume you require is open to debate, but you don’t need to be a big brand like Apple, Trip Advisor or Microsoft. Rap Genius aren’t. Ask “would Google look deficient if this site didn’t show up?” You can usually tell by looking for evidence of search volume on the site’s name.
In advertising, brands have been used to secure a unique identity. That identity is associated with a product or service by the customer. Search used to be about matching a keyword term. But as keyword areas become saturated, and Google returns fewer purely keyword-focused pages anyway, developing a unique identity is a good way forward.
If you haven’t already, put some work into developing a cohesive, unique brand. If you have a brand, then think about generating more awareness. This may mean higher spends on brand-related advertising than you’ve allocated in previous years. The success metric is an increase in brand searches i.e. the name of the site.
2. Be Everywhere
The idea of a stand-alone site is becoming redundant. In 2014, you need to be everywhere your potential visitors reside. If your potential visitors are spending all day in Facebook, or YouTube, that’s where you need to be, too. It’s less about them coming to you, which is the traditional search metaphor, and a lot more about you going to them.
You draw visitors back to your site, of course, but look at every platform and other site as a potential extension of your own site. Pages or content you place on those platforms are yet another front door to your site, and can be found in Google searches. If you’re not where your potential visitors are, you can be sure your competitors will be, especially if they’re investing in social media strategies.
A reminder to see all channels as potential places to be found.
Mix in cross-channel marketing with remarketing and consumers get the perception that your brand is bigger. Google ran the following display ad before they broadly enabled retargeting ads. Retargeting only further increases that lift in brand searches.
3. Advertise Everywhere
Are you finding it difficult to get top ten in some areas? Consider advertising with AdWords and on the sites that already hold those positions. Do some brand advertising on them to raise awareness and generate some brand searches. An advert placed on a site that offers a complementary good or service might be cheaper than going to the expense and effort needed to rank. It might also help insulate you from Google’s whims.
The same goes for guest posts and content placement, although obviously you need to be a little careful as Google can take a dim view of it. The safest way is to make sure the content you place is unique, valuable and has utility in its own right. Ask yourself if the content would be equally at home on your own site if you were to host it for someone else. If so, it’s likely okay.
4. Valuable Content
Google does an okay job of identifying good content. It could do better. They’ve lost their way a bit in terms of encouraging production of good content. It’s getting harder and harder to make the numbers work in order to cover the production cost.
However, it remains Google’s mission to provide users with answers they deem relevant and useful. The utility of Google relies on it. Any strategy aligned with providing genuine visitor utility will align with Google’s long-term goals.
Review your content creation strategies. Content that is of low utility is unlikely to prosper. While it’s still a good idea to use keyword research as a guide to content creation, it’s a better idea to focus on topic areas and creating engagement through high utility. What utility is the user expecting from your chosen topic area? If it’s rap lyrics for song X, then only the rap lyrics for song X will do. If it is plans for a garden, then only plans for a garden will do. See being “relevant” as “providing utility”, not keyword matching.
Go back to the basic principles of classifying the search term as either Navigational, Informational, or Transactional. If the keyword is one of those types, make sure the content offers the utility expected of that type. Be careful when dealing with informational queries that Google could use in its Knowledge Graph. If your pages deal with established facts that anyone else can state, then you have no differentiation, and that type of information is more likely to end up as part of Google’s Knowledge Graph. Instead, go deep on informational queries. Expand the information. Associate it with other information. Incorporate opinion.
BTW, Bill has some interesting reading on the methods by which Google might be identifying different types of queries.
Methods, systems, and apparatus, including computer program products, for identifying navigational resources for queries. In an aspect, a candidate query in a query sequence is selected, and a revised query subsequent to the candidate query in the query sequence is selected. If a quality score for the revised query is greater than a quality score threshold and a navigation score for the revised query is greater than a navigation score threshold, then a navigational resource for the revised query is identified and associated with the candidate query. The association specifies the navigational resource as being relevant to the candidate query in a search operation.
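The patent abstract above boils down to a pair of threshold checks over a query-refinement sequence. Here is a minimal sketch of that logic; the score values, thresholds, and lookup tables are invented for illustration and are not Google's actual signals:

```python
# Sketch of the two-threshold test described in the patent abstract.
# Scores and thresholds here are hypothetical illustrations only.

QUALITY_THRESHOLD = 0.7
NAVIGATION_THRESHOLD = 0.8

def navigational_resource_for(candidate_query, revised_query, scores, resources):
    """Return the navigational resource to associate with candidate_query,
    or None if the revised query fails either threshold."""
    quality, navigation = scores[revised_query]
    if quality > QUALITY_THRESHOLD and navigation > NAVIGATION_THRESHOLD:
        # The revised query's resource is treated as navigational and
        # associated back to the earlier (candidate) query in the sequence.
        return resources[revised_query]
    return None

# A user types a misspelled brand query, then refines it.
scores = {"facebok login": (0.9, 0.95)}
resources = {"facebok login": "facebook.com/login"}
print(navigational_resource_for("facebok", "facebok login", scores, resources))
# → facebook.com/login
```

The interesting part is the association step: the resource is attached not to the refined query but to the earlier, sloppier one, which is how a navigational intent can follow a brand even through typos.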
5. Solve Real Problems
This is a follow-on from “ensuring you provide content with utility”. Go beyond keyword and topical relevance. Ask “what problem is the user trying to solve?” Is it an entertainment problem? A “How To” problem? What would their ideal solution look like? What would a great solution look like?
There is no shortcut to determining what a user finds most useful. You must understand the user. This understanding can be gleaned from interviews, questionnaires, third-party research, chat sessions, and monitoring discussion forums and social channels. Forget about the keyword for the time being. Get inside a visitor’s head. What is their problem? Write a page addressing that problem by providing a solution.
6. Maximise Engagement
Google are watching for the click-back to the SERP results, an action characteristic of a visitor who clicked through to a site and didn’t deem what they found to be relevant to the search query in terms of utility. Relevance in terms of subject match is now a given.
Big blocks of dense text, even if relevant, can be off-putting. Add images and videos to pages that have low engagement and see if this fixes the problem. Where appropriate, make sure the user takes an action of some kind. Encourage the user to click deeper into the site following an obvious, well placed link. Perhaps they watch a video. Answer a question. Click a button. Anything that isn’t an immediate click back.
If you’ve focused on utility, and genuinely solving a user’s problem, as opposed to just matching a keyword, then your engagement metrics should be better than the guy who is still just chasing keywords and only matching in terms of relevance to a keyword term.
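If you have analytics access, the click-back signal described above is roughly a “short click” rate per landing page. A minimal sketch, with a made-up click log and an arbitrary 30-second threshold (nobody outside Google knows the real cutoff):

```python
# Hypothetical SERP click log: (landing_url, seconds_on_page).
# A quick return to the results page ("short click") is the kind of
# signal the post describes; the 30s threshold is arbitrary.
SHORT_CLICK_SECONDS = 30

clicks = [
    ("/lyrics/song-x", 8),
    ("/lyrics/song-x", 240),
    ("/garden-plans", 12),
    ("/garden-plans", 95),
    ("/garden-plans", 20),
]

def short_click_rate(clicks, url):
    """Fraction of visits to url that bounced back within the threshold."""
    visits = [t for u, t in clicks if u == url]
    short = sum(1 for t in visits if t < SHORT_CLICK_SECONDS)
    return short / len(visits)

print(short_click_rate(clicks, "/garden-plans"))  # 2 of 3 visits bounced quickly
```

Pages with a high rate on this metric are the obvious candidates for the engagement fixes above: richer media, clearer next actions, deeper internal links.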
7. Think Like A PPCer
Treat every click like you were paying for it directly. Once that visitor has arrived, what is the one thing you want them to do next? Is it obvious what they have to do next? Always think about how to engage that visitor once they land. Get them to take an action, where possible.
8. Think Like A Conversion Optimizer
Conversion optimization tries to reduce the bounce rate by re-crafting the page to ensure it meets the user’s needs. They do this by split testing different designs, phrases, copy and other elements on the page.
It’s pretty difficult to test these things in SEO, but it’s good to keep this process in mind. What pages of your site are working well and which pages aren’t? Is it anything to do with different designs or element placement? What happens if you change things around? What do the three top ranking sites in your niche look like? If their link patterns are similar to yours, what is it about those sites that might lead to higher engagement and relevancy scores?
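If you do run split tests, the standard sanity check before acting on a result is a two-proportion z-test on the conversion counts. A minimal sketch; the visitor and conversion numbers are made up:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-statistic comparing conversion rates of
    variant A vs variant B in a split test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: 5,000 visitors per variant.
z = two_proportion_z(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(round(z, 2))  # → 2.86, and |z| > 1.96 is significant at the 5% level
```

The point is less the statistics than the discipline: don’t redesign pages off a handful of conversions, in SEO or anywhere else.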
9. Rock Solid Strategic Marketing Advantage
SEO is really hard to do on generic me-too sites. It’s hard to get links. It’s hard to get anyone to talk about them. People don’t share them with their friends. These sites don’t generate brand searches. The SEO option for these sites is typically what Google would describe as blackhat, namely link buying.
Look for a marketing angle. Find a story to tell. Find something unique and remarkable about the offering. If a site doesn’t have a clearly articulated point of differentiation, it’s much harder to get value from organic search while staying within the guidelines.
10. Links
There’s a reason Google hammer links. It’s because they work. Else, surely Google wouldn’t make a big deal about them.
Links count. It doesn’t matter if they are no-follow, scripted, within social networks, or wherever, they are still paths down which people travel. It comes back to a clear point of differentiation, genuine utility and a coherent brand. It’s a lot easier, and safer, to link build when you’ve got all the other bases covered first.
Did @mattcutts Endorse Rap Genius Link Spam?
On TWIG Matt Cutts spoke about the importance of defunding spammers & breaking their spirits.
If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is break their spirits. You want to make them frustrated and angry. There are parts of Google’s algorithms specifically designed to frustrate spammers and mystify them and make them frustrated. And some of the stuff we do gives people a hint their site is going to drop and then a week or two later their site actually does drop so they get a little bit more frustrated. And so hopefully, and we’ve seen this happen, people step away from the dark side and say “you know, that was so much pain and anguish and frustration, let’s just stay on the high road from now on” some of the stuff I like best is when people say “you know this SEO stuff is too unpredictable, I’m just going to write some apps. I’m going to go off and do something productive for society.” And that’s great because all that energy is channeled at something good.
What was less covered was that in the same video Matt Cutts made it sound like anything beyond information architecture, duplicate content cleanup & clean URLs was quickly approaching scamming – especially anything to do with links. So over time more and more behaviors get reclassified as black hat spam as Google gains greater control over the ecosystem.
there’s the kind of SEO that is better architecture, cleaner URLs, not duplicate content … that’s just like making sure your resume doesn’t have any typos on it. that’s just clever stuff. and then there’s the type of SEO that is sort of cheating. trying to get a lot of bad backlinks or scamming, and that’s more like lying on your resume. when you get caught sometimes there’s repercussions. and it definitely helps to personalize because now anywhere you search for plumbers there’s local results and they are not the same across the world. we’ve done a diligent job of trying to crack down on black hat spam. so we had an algorithm named Penguin that launched that kind of had a really big impact. we had a more recent launch just a few months ago. and if you go and patrol the black hat SEO forums where the guys talk about the techniques that work, now it’s more people trying to sell other people scams rather than just trading tips. a lot of the life has gone out of those forums. and even the smaller networks that they’re trying to promote “oh buy my anglo rank or whatever” we’re in the process of tackling a lot of those link networks as well. the good part is if you want to create a real site you don’t have to worry as much about these bad guys jumping ahead of you. the playing ground is a lot more level now. panda was for low quality. penguin was for spam – actual cheating.
The Matt Cutts BDSM School of SEO
As part of the ongoing campaign to “break their spirits” we get increasing obfuscation, greater time delays between certain algorithmic updates, algorithmic features built explicitly with the goal of frustrating people, greater brand bias, and more outrageous selective enforcement of the guidelines.
Those who were hit by either Panda or Penguin in some cases took a year or more to recover. Far more common is no recovery — ever. How long, and how much, do you keep investing in a dying project when the recovery timeline is unknown?
You Don’t Get to Fascism Without 2-Tier Enforcement
While success in and of itself may make one a “spammer” to the biased eyes of a search engineer (especially if you are not VC funded nor part of a large corporation), many who are considered “spammers” self-regulate in a way that makes them far more conservative than the allegedly “clean” sites are.
Pretend you are Ask.com and watch yourself get slaughtered without warning.
Build a big brand & you will have advanced notification & free customer support inside the GooglePlex:
In my experience with large brand penalties, (ie, LARGE global brands) Google have reached out in advance of the ban every single time. – Martin Macdonald
Launching a Viral Linkspam Sitemap Campaign
When RapGenius was penalized, it was because they were broadly, openly & publicly offering to promote bloggers who would dump a list of keyword-rich deeplinks into their blog posts. They were basically turning boatloads of blogs into mini-sitemaps for popular new albums.
Remember reading dozens (hundreds?) of blog posts last year about how guest posts are spam & Google should kill them? Well these posts from RapGenius were like a guest post on steroids. The post “buyer” didn’t have to pay a single cent for the content, didn’t care at all about relevancy, AND a sitemap full of keyword rich deep linking spam was included in EACH AND EVERY post.
Most “spammers” would never attempt such a campaign because they would view it as being far too spammy. They would have a zero percent chance of recovery as Google effectively deletes their site from the web.
And while RG is quick to distance itself from scraper sites, for almost the entirety of their history virtually none of the lyrics posted on their site were even licensed.
In the past I’ve mentioned Google is known to time the news cycle. It comes as no surprise that on a Saturday, barely a week after being penalized, Google restored RapGenius’s rankings.
How to Gain Over 400% More Links, While Allegedly Losing
While the following graph may look scary in isolation, if you know the penalty is only a week or two then there’s virtually no downside.
Since being penalized, RapGenius has gained links from over 1,000* domains:
- December 25th: 129
- December 26th: 85
- December 27th: 87
- December 28th: 54
- December 29th: 61
- December 30th: 105
- December 31st: 182
- January 1st: 142
- January 2nd: 112
- January 3rd: 122
The above add up to 1,079, & RapGenius has built a total of only 11,930 unique linking domains in their lifetime. They grew about 10% in 10 days!
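The arithmetic is easy to verify yourself; a quick sketch using the daily counts from the list above and the Ahrefs lifetime total as reported:

```python
# Daily new referring domains reported during the penalty window
# (Dec 25 through Jan 3).
daily_new_domains = [129, 85, 87, 54, 61, 105, 182, 142, 112, 122]

gained = sum(daily_new_domains)
lifetime_total = 11_930  # unique linking domains per Ahrefs, as reported

print(gained)  # → 1079
# Growth relative to the pre-penalty base (lifetime total minus the gains):
print(round(gained / (lifetime_total - gained) * 100, 1))  # → 9.9 (percent)
```

Roughly 10% domain growth in 10 days, during a “penalty,” is the whole story in two lines.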
On every single day the number of new referring domains VASTLY exceeded the number of referring domains that disappeared. And many of these new referring domains are the mainstream media and tech press sites, which are both vastly over-represented in importance/authority on the link graph. They not only gained far more links than they lost, but they also gained far higher quality links that will be nearly impossible for their (less spammy) competitors to duplicate.
They not only got links, but the press coverage acted as a branded advertising campaign for RapGenius.
Here’s some quotes from RapGenius on their quick recovery:
- “we owe a big thanks to Google for being fair and transparent and allowing us back onto their results pages” <– Not the least bit true. RapGenius was not treated fairly; rather, they were given a free ride compared to the death sentence hundreds of thousands of small businesses have been handed over the past couple years.
- “On guest posts, we appended lists of song links (often tracklists of popular new albums) that were sometimes completely unrelated to the music that was the subject of the post.” <– and yet others are afraid of writing relevant, on-topic posts due to Google’s ramped-up fearmongering campaigns
- “we compiled a list of 100 “potentially problematic domains”” <– so their initial list of domains to inspect was less than 10% the number of links they gained while being penalized
- “Generally Google doesn’t hold you responsible for unnatural inbound links outside of your control” <– another lie
- “of the 286 potentially problematic URLs that we manually identified, 217 (more than 75 percent!) have already had all unnatural links purged.” <– even the “all in” removal of pages was less than 25% of the number of unique linking domains generated during the penalty period
And Google allowed the above bullshit during a period when they were sending out messages telling other people WHO DID THINGS FAR LESS EGREGIOUS that they are required to remove more links & Google won’t even look at their review requests for at least a couple weeks – A TIME PERIOD GREATER THAN THE ENTIRE TIME RAPGENIUS WAS PENALIZED FOR.
Failed reconsideration requests are now coming with this email that tells site owners they must remove more links: pic.twitter.com/tiyXtPvY32— Marie Haynes (@Marie_Haynes) January 2, 2014
In Conclusion…
If you tell people what works and why you are a spammer with no morals. But if you are VC funded, Matt Cutts has made it clear that you should spam the crap out of Google. Just make sure you hire a PR firm to trump up press coverage of the “unexpected” event & then have a faux apology saved in advance. So long as you lie to others and spread Google’s propaganda you are behaving in an ethical white hat manner.
Google & @mattcutts didn’t ACTUALLY care about Rap Genius’ link scheme, they just didn’t want to miss a propaganda opportunity.— Ben Cook (@Skitzzo) January 4, 2014
Notes
* These stats are from Ahrefs. A few of these links may have been in place before the penalty and only recently crawled. It is also worth mentioning that all third-party link databases are limited in size & refresh rate by optimizing their capital spend, so there are likely hundreds more links which have not yet been crawled by Ahrefs. One should also note that the story is still ongoing & they keep generating more links every day. By the time the story is done spreading they are likely to see roughly 30% growth in unique linking domains in about 6 weeks.
Gray Hat Search Engineering
Almost anyone in internet marketing who has spent a couple months in the game has seen some “shocking” case study where changing the color of a button increased sales 183% or such. In many cases such changes only happen when the original site had not had any focus on conversion at all.
Google, on the other hand, has billions of daily searches and is constantly testing ways to increase yield:
The company was considering adding another sponsored link to its search results, and they were going to do a 30-day A/B test to see what the resulting change would be. As it turns out, the change brought massive returns. Advertising revenues from those users who saw more ads doubled in the first 30 days.
…
By the end of the second month, 80 percent of the people in the cohort that was being served an extra ad had started using search engines other than Google as their primary search engine.
One of the reasons traditional media outlets struggle with the web is the perception that ads and content must be separated. When they had regional monopolies they could make large demands to advertisers – sort of like how Google may increase branded CPCs on AdWords by 500% if you add sitelinks. You not only pay more for clicks that you were getting for free, but you also pay more for the other paid clicks you were getting cheaper in the past.
That’s how monopolies work – according to Eric Schmidt they are immune from market forces.
Search itself is the original “native ad.” The blend confuses many searchers as the background colors fade into white.
Google tests colors & can control the flow of traffic based not only on result displacement, but also the link colors.
It was reported last month that Google tested adding ads to the knowledge graph. The advertisement link is blue, while the ad disclosure is to the far right out of view & gray.
I was searching for a video game yesterday & noticed that now the entire Knowledge Graph unit itself is becoming an ad unit. Once again, gray disclosure & blue ad links.
Where Google gets paid for the link, the link is blue.
Where Google scrapes third party content & shows excerpts, the link is gray.
The primary goal of such a knowledge block is result displacement – shifting more clicks to the ads and away from the organic results.
When those blocks appear in the search results, even when Google manages to rank the Mayo Clinic highly, it’s below the fold.
What’s so bad about this practice in health
- Context Matters: Many issues have overlapping symptoms, where a quick glance at a few out-of-context symptoms causes a person to misdiagnose themselves. Flu-like symptoms from a few months ago turned out to be an indication of a kidney stone. That level of nuance will *never* be in the knowledge graph. Google’s remote rater documents discuss your-money-your-life (YMYL) topics & talk up the importance of knowing who exactly is behind content, but when they use gray font on the source link for their scrape job they are doing just the opposite.
- Hidden Costs: Many of the heavily advertised solutions appearing above the knowledge graph have hidden costs yet to be discovered. You can’t find a pharmaceutical company worth tens of billions of dollars that hasn’t pleaded guilty to numerous felonies associated with deceptive marketing and/or massaging research.
- Artificially Driving Up Prices: in-patent drugs often cost 100x as much as the associated generic drugs, & thus the affordable solutions are priced out of the ad auctions, where the price for a click can vastly exceed the profit from selling a generic prescription drug.
Where’s the business model for publishers when they have real editorial costs & must fact-check and regularly update their content, when their content is good enough to be featured front & center on Google, but attribution is nearly invisible (and thus traffic flow is cut off)? As the knowledge graph expands, what does that publishing business model look like in the future?
Does the knowledge graph eventually contain sponsored self-assessment medical quizzes? How far does this cancer spread?
Where do you place your chips?
Google believes it can ultimately fulfil people’s data needs by sending results directly to microchips implanted in its users’ brains.
Quickly Reversing the Fat Webmaster Curse
Long story short, -38 pounds in about 2 months or so. Felt great the entire time and felt way more focused day to day. Maybe you don’t have a lot of weight to lose but this whole approach can significantly help you cognitively.
In fact, the diet piece was originally formed for cognitive enhancements rather than weight loss.
Before I get into this post I just want to explicitly state that I am not a doctor, medical professional, medical researcher, or any type of medical/health anything. This is not advice, I am just sharing my experience.
You should consult a healthcare professional before doing any of this stuff.
Unhealthy Work Habits
The work habits associated with being an online marketer lend themselves to packing on some weight. Many of us are in front of a screen for large chunks of the day while also being able to take breaks whenever and wherever we want.
Sometimes those two things add up to a rather sedentary lifestyle with poor eating habits (I’m leaving travel out at the moment but that doesn’t help either).
In addition to the mechanics of our jobs being an issue we also tend to work longer/odd hours because all we really need to get a chunk of our work done is a computer or simply access to the internet. If you take all of those things and add them into the large amount of opportunity that exists on the web you have the perfect recipe for unhealthy, stressful work habits.
These habits tend to carry over into offline areas as well. Think about the things we touch or access every day:
- Computers
- Tablets
- Smartphones
- Search Engines
- Online Tools
- Instant Messaging
- Social Networks
What do many of these have in common? Instant “something”. Instant communication, results, gratification, and on and on. This is what we live in every day. We expect and probably prefer fast, instant, and quick. With that mindset, who has time to cook a healthy meal 3x per day on a regular basis? Some do, for sure. However, much like the office work environment this environment can be one that translates into lots of unhealthy habits and poor health.
I got to the point where I was about 40 pounds overweight with poor physicals and lackluster lipid profiles (high cholesterol, blood pressure, etc). I tried many things, many times but what ultimately turned the corner for me were 3 different investments.
Investment #1 – Standup/Sitdown Desk
Sitting down all day is no bueno. I bought a really high quality standup desk with an electrical motor so I can periodically sit down for a few minutes in between longer periods of standing.
It has a nice, wide table component and is quite sturdy. It also allows for different height adjustments via a simple up and down control:
A couple of tips here:
- Wear comfy shoes
- Take periodic breaks (I do so hourly) to go walk around the house or office or yard
- I also like to look away from the screen every 20-30 minutes or so; sometimes I get eyestrain, but I bought these glasses from Gunnar and they’ve relieved those symptoms
Investment #2 – Treaddesk
The reason I didn’t buy the full-on treadmill desk is because I wanted a bigger desk with more options. I bought the Treaddesk, which is essentially the bottom part of the treadmill, and I move it around during the week based on my workflow needs:
They have packages available as well (see the above referenced link).
I have a second, much cheaper standup desk that I hacked together from IKEA:
This desk acts as a holder for my coffee stuff but also allows me to put my laptop on it (which is paired with an external keyboard and trackpad) in case I want to do some lighter work (I have a hard time doing deeper work when doing the treadmill part while working).
I move the Treaddesk back and forth sometimes, but mostly it stays with this IKEA desk. If I have a week where the work is not as deeply analytical and more administrative then I’ll walk at a lower speed on the main computer for a longer period of time.
I tend to walk about 5-7 miles a day on this thing, usually in a block of time where I do that lighter-type work (Quickbooks, cobbling reports together, email triage, very light research/writing, reading, and so on).
Investment #3 – Bulletproof Coffee and the Bulletproof Diet
I’m a big fan of Joe Rogan in general, and I enjoy his podcast. I heard about Dave Asprey on the JRE podcast so I eventually ended up on his site, bulletproofexec.com. I purchased some private coaching with the relevant products and I was off to the races.
I did my own research on some of the stuff and came away confident that “good” fat had been erroneously hated on for years. I highly encourage you to conduct your own research based on your own personal situation, again this is not advice.
I really wanted to drop about 50 pounds so I went all in with Bulletproof Intermittent Fasting. A few quick points:
- I felt great the entire time
- In rare moments where I was hungry at night I just had a tablespoon of organic honey
- I certainly felt a cognitive benefit
- I was never hungry
- I was much more patient with things
- I felt way more focused
So yeah, butter in the coffee and a mostly meat/veggie diet. I cheated from time to time, certainly over the holiday. I lost 38 pounds in slightly over 60 days. Here’s a before and after:
Fat Eric
Not So Fat Eric
I kept this post kind of short and to the point because my desire is not to argue or fight about whether carbs are good or bad, whether fat is good or bad, whether X is right, or whether Y is wrong. This is what worked for me and I was amazed by it, totally amazed by the outcome.
I also do things like cycling and martial arts but I’ve been doing those for awhile, along with running, and while I’ve lost weight I’ve never had it melt away like this.
I’ve stopped the fasting portion and none of the weight has piled back on. Lipid tests have been very positive as well, best in years.
Even if you don’t have a ton of weight to lose, seriously think about the standup desk and treadmill.
Happy New Year!
My Must Have Tools of 2014
There are a lot of tools in the SEO space (sorry, couldn’t resist :D) and over the years we’ve seen tools fall into 2 broad categories. Tools that aim to do just about everything and tools that focus on one discipline of online marketing.
As we contin…
Should Venture Backed Startups Engage in Spammy SEO?
Here’s a recent video of the founders of RapGenius talking at TechCrunch disrupt.
Oops, wrong video. Here’s the right one. Same difference.
Recently a thread on Hacker News highlighted a blog post which pointed out how RapGenius was engaging in reciprocal promotional arrangements: they would promote blogs on their Facebook or Twitter accounts if those bloggers posted a laundry list of keyword-rich deeplinks to RapGenius.
Matt Cutts quickly chimed in on Hacker News “we’re investigating this now.”
A friend of mine and I were chatting yesterday about what would happen. My prediction was that absolutely nothing would happen to RapGenius: they would issue a faux apology, put no effort into cleaning up the existing links, and the apology alone would be sufficient evidence of good faith that the issue would die there.
Today RapGenius published a mea culpa where ultimately they defended their own spam by complaining about how spammy other lyrics websites are. The self-serving jackasses went so far as including this in their post: “With limited tools (Open Site Explorer), we found some suspicious backlinks to some of our competitors”
It’s one thing to complain in private about dealing in a frustrating area, but it’s another thing to publicly throw your direct competitors under the bus with a table of link types and paint them as black hat spammers.
Google can’t afford to penalize Rap Genius, because if they do Google Ventures will lose deal flow on the start ups Google co-invests in.
In the past some of Google’s other investments were into companies that were pretty overtly spamming. RetailMeNot held multiple giveaways where if you embedded a spammy sidebar set of deeplinks to their various pages they gave you a free t-shirt:
Google’s behavior on such arrangements has usually been to hit the smaller players while looking the other way on the bigger site on the other end of the transaction.
That free t-shirt for links post was from 2010 – the same year that Google invested in RetailMeNot. They ran those promotions multiple times & long enough that they ran out of t-shirts! Now that RTM is a publicly traded billion-dollar company which Google already endorsed by investing in it, there’s a zero percent chance of it getting penalized.
To recap, if you are VC-backed you can: spam away, wait until you are outed, when outed reply with a combined “we didn’t know” and a “our competitors are spammers” deflective response.
For the sake of clarity, let’s compare that string of events (spam, warning but no penalty, no effort needed to clean up, insincere mea culpa) to how websites are treated when not VC backed. For smaller sites it is “shoot on sight” first and ask questions later, perhaps coupled with a friendly recommendation to start over.
Here’s a post from today highlighting a quote from Google’s John Mueller:
My personal & direct recommendation here would be to treat this site as a learning experience from a technical point of view, and then to find something that you’re absolutely passionate & knowledgeable about and create a website for that instead.
Growth hack inbound content marketing, but just don’t call it SEO.
Growth hacking = using 2005-era spam tactics. http://t.co/5ISCPmMEkp cc @samfbiddle @nitashatiku
— Max Woolf (@minimaxir) December 23, 2013
What’s worse, is with the new fearmongering disavow promotional stuff, not only are some folks being penalized for the efforts of others, but some are being penalized for links that were in place BEFORE Google even launched as a company.
Google wants me to disavow links that existed back when backrub was foreplay and not an algo. Hubris much?
— Cygnus SEO (@CygnusSEO) December 21, 2013
Given that money allegedly shouldn’t impact rankings, it’s sad to note that as everything effective gets labeled as spam, capital and connections have become the key SEO innovations in the current Google ecosystem.
Beware Of SEO Truthiness
When SEO started, many people routinely used black-box testing to try and figure out which pages the search engines rewarded.
Black box testing is terminology used in IT. It’s a style of testing that doesn’t assume knowledge of the internal workings of a machine or computer program. Rather, you can only test how the system responds to inputs.
So, for many years, SEO was about trying things out and watching how the search engine responded. If rankings went up, SEOs assumed correlation meant causation, so they did a lot more of whatever it was they thought was responsible for the boost. If the trick was repeatable, they could draw some firmer conclusions about causation, at least until the search engine introduced some new algorithmic code and sent everyone back to their black-box testing again.
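The black-box approach described above can be sketched in a few lines of Python. Everything here is hypothetical – the `opaque_rank` function stands in for the search engine, whose internals the tester cannot see; all the tester can do is vary one input and observe the response:

```python
# Toy illustration of black-box testing: the tester can only observe how an
# opaque system responds to inputs, never inspect its internal workings.

def opaque_rank(page):
    # Stand-in for the search engine. In a real black-box test the tester
    # has NO access to this formula; it is shown here only so the sketch runs.
    return page["links"] * 2 + page["words"] / 100

def run_experiment(baseline, variant):
    """Change one input at a time and measure the change in the response."""
    return opaque_rank(variant) - opaque_rank(baseline)

baseline = {"links": 10, "words": 500}
variant = dict(baseline, links=15)  # add five links, hold content constant

delta = run_experiment(baseline, variant)
print("ranking delta after adding links:", delta)
```

The catch the article goes on to describe is exactly the weakness of this method: if the hidden function changes mid-experiment, the observed delta tells you nothing about causation.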
Well, it sent some people back to testing. Some SEOs don’t do much, if any, testing of their own, and so rely on the strategies articulated by other people. As a result, the SEO echo chamber can be a pretty misleading place as “truthiness” – and a lot of false information – gets repeated far and wide, until it’s considered gospel. One example of truthiness is that paid placement will hurt you. Well, it may, but not having it may hurt you more, because it all really… depends.
Another problem is that SEO testing can seldom be conclusive, because you can’t be sure of the state of the thing you’re testing. The thing you’re testing may not be constant. For example, you throw up some more links and your rankings rise, but the rise could be due to other factors, such as a new engagement algorithm that Google implemented in the middle of your testing that you simply didn’t know about.
It used to be a lot easier to conduct this testing. Updates were periodic. Between updates, you could reasonably assume the algorithms were static, so cause and effect were more obvious than they are today. Danny Sullivan gave a good overview of search history at Moz earlier in the year:
That history shows why SEO testing is getting harder. There are a lot more variables to isolate than there used to be. The search engines have also been clever. A good way to thwart SEO black box testing is to keep moving the target. Continuously roll out code changes and don’t tell people you’re doing it. Or send people on a wild goose chase by arm-waving about a subtle code change made over here, when the real change has been made over there.
That’s the state of play in 2013.
However….(Ranting Time :)
Some SEO punditry is bordering on the ridiculous!
I’m not going to link to one particular article I’ve seen recently, as, ironically, that would mean rewarding them for spreading FUD. Also, calling out people isn’t really the point. Suffice to say, the advice was about specifics, such as how many links you can “safely” get from one type of site, that sort of thing….
The problem comes when we can easily find evidence to the contrary. In this case, a quick look through the SERPs and you’ll find evidence of top ranking sites that have more than X links from Site Type Y, so this suggests….what? Perhaps these sites are being “unsafe”, whatever that means. A lot of SEO punditry is well meaning, and often a rewording of Google’s official recommendations, but can lead people up the garden path if evidence in the wild suggests otherwise.
If one term defined SEO in 2013, it is surely “link paranoia”.
What’s Happening In The Wild
When it comes to what actually works, there are few hard and fast rules regarding links. Look at the backlink profiles for top ranked sites across various categories and you’ll see one thing that is constant….
Nothing is constant.
Some sites have links coming from obviously automated campaigns, and it seemingly doesn’t affect their rankings. Other sites have credible link patterns, and rank nowhere. What counts? What doesn’t? What other factors are in play? We can only really get a better picture by asking questions.
Google allegedly took out a few major link networks over the weekend. Anglo Rank came in for special mention from Matt Cutts.
So, why are Google making a point of taking out link networks if link networks don’t work? Well, it’s because link networks work. How do we know? Look at the back link profiles in any SERP area where there is a lot of money to be made, and the area isn’t overly corporate i.e. not dominated by major brands, and it won’t be long before you spot aggressive link networks, and few “legitimate” links, in the backlink profiles.
Sure, you wouldn’t want aggressive link networks pointing at brand sites, as there are better approaches brand sites can take when it comes to digital marketing, but such evidence makes a mockery of the tips some people are freely handing out. Are such tips the result of conjecture, repeating Google’s recommendations, or actual testing in the wild? Either the link networks work, or they don’t work but don’t affect rankings, or these sites shouldn’t be ranking.
There’s a good reason some of those tips are free, I guess.
Risk Management
Really, it’s a question of risk.
Could these sites get hit eventually? Maybe. However, those using a “disposable domain” approach will do anything that works as far as linking goes, as their main risk is not being ranked. Being penalised is an occupational hazard, not game-over. These sites will continue so long as Google’s algorithmic treatment rewards them with higher ranking.
If your domain is crucial to your brand, then you might choose to stay away from SEO entirely, depending on how you define “SEO”. A lot of digital marketing isn’t really SEO in the traditional sense i.e. optimizing hard against an algorithm in order to gain higher rankings, a lot of digital marketing is based on optimization for people, treating SEO as a side benefit. There’s nothing wrong with this, of course, and it’s a great approach for many sites, and something we advocate. Most sites end up somewhere along that continuum, but no matter where you are on that scale, there’s always a marketing risk to be managed, with perhaps “non-performance” being a risk that is often glossed over.
So, if there’s a take-away, it’s this: check out what actually happens in the wild, and then evaluate your risk before emulating it. When pundits suggest a rule, check to see if you can spot times it appears to work, and perhaps more interestingly, when it doesn’t. It’s in those areas of personal inquiry and testing where gems of SEO insight are found.
SEO has always been a mix of art and science. You can test, but only so far. The art part is dealing with the unknown past the testing point. Performing that art well is to know how to pick truthiness from reality.
And that takes experience.
But mainly a little fact checking :)
SEO Discussions That Need to Die
Sometimes the SEO industry feels like one huge Groundhog Day. No matter how many times you have discussions with people on the same old topics, these issues seem to pop back into blogs/social media streams with almost regular periodicity. And every time they do, only the authors are new; the arguments and the counter-arguments are all the same.
Due to this sad situation, I have decided to make a short list of such issues/discussions. If it inspires even one of you to stay out of such a debate, then it was worth writing.
So here are SEO’s most annoying discussion topics, in no particular order:
Blackhat vs. Whitehat
This topic has been chewed over and over again so many times, yet people still jump into it with both feet, with the righteous feeling that their argument, and no one else’s, is going to change someone’s mind. This discussion becomes particularly tiresome when people start claiming the moral high ground because they are using one over the other. Let’s face it once and for all: there are no generally moral (white) and generally immoral (black) SEO tactics.
This is where people usually pull out the argument about harming clients’ sites, an argument which is usually moot. Firstly, there is a heated debate about what is even considered whitehat and what blackhat. The definition of these two concepts is highly fluid and changes over time. One of the main reasons for this fluidity is Google moving the goalposts all the time. What was once considered a purely whitehat technique, highly recommended by all SEOs (PR submissions, directories, guest posts, etc.) may as of tomorrow become “blackhat”, “immoral” and what not. Also, some people consider “blackhat” anything that dares not adhere to the Google Webmaster Guidelines, as if they were carved on stone tablets by some angry deity.
Just to illustrate how absurd this concept is, imagine some other company, say eBay, creates a list of rules, one of which prohibits anyone who wants to sell an item on their site from also trying to sell it on Gumtree or Craigslist. How many of you would voluntarily reduce the number of people your product effectively reaches because some other commercial entity is trying to prevent competition? If you are not making money off search, Google is, and vice versa.
It is not about the morals, it is not about criminal negligence of your clients. It is about taking risks and as long as you are being truthful with your clients and yourself and aware of all the risks involved in undertaking this or some other activity, no one has the right to pontificate about “morality” of a competing marketing strategy. If it is not for you, don’t do it, but you can’t both decide that the risk is too high for you while pseudo-criminalizing those who are willing to take that risk.
The same goes for “blackhatters” pointing and laughing at “whitehatters”. Some people do not enjoy rebuilding their business every 2 million comment spam links. That is OK. Maybe they will not climb the ranks as fast as your sites do, but maybe when they get there, they will stay there longer? These are two different and completely legitimate strategies. Actually, every ecosystem has representatives of these two strategies: the “r strategy”, which prefers quantity over quality, and the “K strategy”, which puts more investment in a smaller number of offspring.
You don’t see elephants calling mice immoral, do you?
Rank Checking is Useless/Wrong/Misleading
This one has been going around for years and keeps raising its ugly head every once in a while, particularly after Google forces another SaaS provider to give up part of its services because it either checked rankings itself or bought ranking data from a third-party provider. Then we get all the holier-than-thou folks mounting their soapboxes and preaching fire and brimstone at SEOs who report rankings as the main or even only KPI. So firstly, again, just like with black vs. white hat: horses for courses. If you think your way of reporting to clients is the best, stick with it and preach it positively, as in “this is what I do and the clients like it”, but stop telling other people what to do!
More importantly, the vast majority of these arguments are based on a totally imaginary situation in which SEOs use rankings as their only or main KPI. In all of my 12 years in SEO, I have never seen any marketer worth their salt report “increase in rankings for 1000s of keywords”. As far back as 2002, I remember people writing reports for clients with a separate chapter for keywords that had been defined as optimization targets and reached top rankings, yet produced no significant increase in traffic/conversions. Those keywords were then dropped from the marketing plan altogether.
It really isn’t a big leap to understand that ranking isn’t important if it doesn’t result in increased conversions in the end. I am not going to argue here why I do think reporting and monitoring rankings is important. The point is that if you need to make your argument against a straw man, you should probably rethink whether you have a good argument at all.
PageRank is Dead/it Doesn’t Matter
Another strawman argument. Show me a linkbuilder who today thinks that getting links based solely on toolbar PageRank is going to get them to rank and I will show you a guy who has probably not engaged in active SEO since 2002. And not a small amount of irony can be found in the fact that the same people who decry the use of PageRank, the closest thing to an actual Google ranking factor they can see, freely use proprietary metrics created by other marketing companies and treat them as a perfectly reliable proxy for esoteric concepts which even Google finds hard to define, such as relevance and authority. Furthermore, all other things being equal, show me the SEO who will pass on a PR6 link for the sake of a PR3 one.
Blogging on “How Does XXX Google Update Change Your SEO” – 5 Seconds After it is Announced
Matt hasn’t turned off his video camera to switch his t-shirt for the next Webmaster Central video and there are already dozens of blog posts discussing, in the most intricate detail, how the new algorithm update/penalty/infrastructure change/random-monochromatic-animal will impact everyone’s daily routine and how we should all run for the hills.
Best-case scenario, these prolific writers only know the name of the update and they are already suggesting strategies on how to avoid being slapped or, even better, get out of the doghouse. This was painfully obvious in the early days of Panda, when people were writing their “experiences” on how to recover from the algorithm update even before the second update was rolled out, making any testimony of recovery, in the worst case, a lie or (given a massive benefit of the doubt) a misinterpretation of ranking changes (rank checking anyone).
Put down your feather and your ink bottle, skippy; wait for the dust to settle, and unless you have a human source who was involved in the development or implementation of the algorithm, just sit tight and observe for the first week or two. After that, you can write up those observations and it will be considered legitimate, even interesting reporting on the new algorithm. Anything earlier will paint you as a clueless pageview chaser, looking to ride the wave of interest with blog posts that often close with “we will probably not even know what the XXX update is all about until we give it some time to get implemented”. Captain Obvious to the rescue.
Adwords Can Help Your Organic Rankings
This one is like the mythological Hydra – you cut one head off, two new ones spring out. This question has been answered so many times by so many people, both from within the search engines and from the SEO community, that if you are addressing it today, I suspect you are actually trying to refrain from talking about something else and are using this topic as a smoke screen. Yes, I am looking at you, Google Webmaster Central videos. Is that *really* the most interesting question you found in your pile? What, no one asked about <not provided>, or about social signals, or about the role authorship plays in non-personalized rankings, or whether it flows through links, or a million other questions that are much more relevant, interesting and, more importantly, still unanswered?
Infographics/Directories/Commenting/Forum Profile Links Don’t Work
This is very similar to the blackhat/whitehat argument and it is usually supported by a statement that looks something like “do you really think that Google, with hundreds of PhDs, hasn’t already discounted that in their algorithm?”. This is a typical “argument from incredulity” by a person who glorifies postgraduate degrees as a litmus test of intelligence and ingenuity. My claim is that these people have neither looked at the backlink profiles of many sites in many competitive niches nor do they know a lot of people doing or holding a PhD. They highly underrate the former and overrate the latter.
A link is a link is a link and the only difference is between link profiles and percentages that each type of link occupies in a specific link profile. Funnily enough, the same people who claim that X type of links don’t work are the same people who will ask for link removal from totally legitimate, authoritative sources who gave them a totally organic, earned link. Go figure.
“But Matt/John/Moultano/anyone with a brother-in-law who once visited Mountain View” said…
Hello. Did you order “not provided will be maximum 10% of your referral data”? Or did you have “I would be surprised if there was a PR update this year”? How about “You should never use nofollow on-site links that you don’t want crawled. But it won’t hurt you. Unless something.”?
People keep thinking that people at Google sit around all day long thinking about how they can help SEOs do their job. How can you build your business on advice given out by an entity that is actively trying to keep visitors from coming to your site? Can you imagine that happening in any other business environment? Can you imagine Nike’s marketing department going for a one-day training session at Adidas HQ to learn how to sell their sneakers better?
Repeat after me THEY ARE NOT YOUR FRIENDS. Use your own head. Even better, use your own experience. Test. Believe your own eyes.
We Didn’t Need Keyword Data Anyway
This is my absolute favourite. People who were as of yesterday basing their reporting, link building, landing page optimization, ranking reports, conversion rate optimization and about every other aspect of their online campaigns on referring keywords all of a sudden feel the need to tell the world how they never thought keywords were an important metric. That’s right, buster, we are so much better off flying blind, doing iteration upon iteration of a derivation of data based on past trends, future trends, landing pages, third party data, etc.
It is OK every once in a while to say “crap, Google has really shafted us with this one; this is seriously going to affect the way I track progress”. Nothing bad will happen if you do. You will not lose face over it. Yes, there were other metrics that were ALSO useful for different aspects of SEO, but it is not as if, when you are driving a car and your brakes die on you, you say “pfffftt, stopping is for losers anyway; who wants to stop the car when you can enjoy the ride? I never really used those brakes in the past anyway. What really matters in a car is that your headlights are working”.
Does this mean we can’t do SEO anymore? Of course not. Adaptability is one of the top required traits of an SEO and we will adapt to this situation as we did to all the others in the past. But don’t bullshit yourself and everyone else that 100% <not provided> didn’t hurt you.
Responses to SEO is Dead Stories
It is crystal clear why the “SEO is dead” stories themselves deserve to die a slow and painful death. I am talking here about the hordes of SEOs who rise to the occasion every freaking time some 5th-rate journalist decides to poke the SEO industry through the cage bars, and who try to convince them, nay, prove to them, that SEO is not only not dying but is alive and kicking and bigger than ever. And I am not innocent of this myself; I have also dignified this idiotic topic with a response (albeit a short one). But how many times can we rise to the same occasion and repeat the same points? What original angle can you give this story after 16 years of responding to the same old claims? And if you can’t give an original angle, how in the world are you increasing our collective knowledge by re-warming and serving the same old dish that wasn’t very good the first time it was served? Don’t you have rankings to check instead?
There is No #10.
But that’s what everyone does: writes a “Top 10 ways…” article where they force the examples until they get to a linkbaity number. No one wants to read a “Top 13…” or a “Top 23…” article. This needs to die too. Write what you have to say, not what you think will get the most traction. Marketing is makeup, but the face needs to be pretty before you apply it. Unless you like putting lipstick on pigs.
Branko Rihtman has been optimizing sites for search engines since 2001 for clients and own web properties in a variety of competitive niches. Over that time, Branko realized the importance of properly done research and experimentation and started publishing findings and experiments at SEO Scientist, with some additional updates at @neyne. He currently consults a number of international clients, helping them improve their organic traffic and conversions while questioning old approaches to SEO and trying some new ones.
Value Based SEO Strategy
One approach to search marketing is to treat the search traffic as a side-effect of a digital marketing strategy. I’m sure Google would love SEOs to think this way, although possibly not when it comes to PPC! Even if you’re taking a more direct, rankings-driven approach, the engagement and relevancy scores that come from delivering what the customer values should serve you well, too.
In this article, we’ll look at a content strategy based on value based marketing. Many of these concepts may be familiar, but bundled together, they provide an alternative search provider model to one based on technical quick fixes and rank. If you want to broaden the value of your SEO offering beyond that first click, and get a few ideas on talking about value, then this post is for you.
In any case, the days of being able to rank well without providing value beyond the click are numbered. Search is becoming more about providing meaning to visitors and less about providing keyword relevance to search engines.
What Is Value Based Marketing?
Value based marketing is customer, as opposed to search engine, centric. In Values Based Marketing For Bottom Line Success, the authors focus on five areas:
- Discover and quantify your customers’ wants and needs
- Commit to the most important things that will impact your customers
- Create customer value that is meaningful and understandable
- Assess how you did at creating true customer value
- Improve your value package to keep your customers coming back
Customers compare your offer against those of competitors, and divide the benefits by the cost to arrive at value. Marketing determines and communicates that value.
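The benefits-divided-by-cost comparison above can be made concrete with a minimal sketch. The benefit categories, scores, and prices here are entirely hypothetical, purely to illustrate the ratio:

```python
# Minimal sketch of the value comparison described above: perceived
# benefits divided by total cost to the customer. All numbers are invented.

def perceived_value(benefits, cost):
    """Value ratio = sum of perceived benefit scores / total cost."""
    return sum(benefits.values()) / cost

# Hypothetical scoring of two competing gym offers
our_offer = perceived_value({"convenience": 8, "pampering": 9}, cost=100)
competitor = perceived_value({"convenience": 5, "pampering": 7}, cost=80)

# Customers pick whichever offer has the higher value ratio
print(our_offer > competitor)
```

Note that a cheaper offer can still lose on value if its perceived benefits are proportionally lower, which is the point the paragraph above is making: marketing's job is to determine and communicate that ratio, not just the price.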
This is the step beyond keyword matching. When we use keyword matching, we’re trying to determine intent. We’re doing a little demographic breakdown. This next step is to find out what the customer values. If we give the customer what they value, they’re more likely to engage and less likely to click back.
What Does The Customer Value?
A key question of marketing is “which customers does this business serve”? Seems like an obvious question, but it can be difficult to answer. Does a gym serve people who want to get fit? Yes, but then all gyms do that, so how would they be differentiated?
Obviously, a gym serves people who live in a certain area. So, if our gym is in Manhattan, our customer becomes “someone who wants to get fit in Manhattan”. Perhaps our gym is upmarket and expensive. So, our customer becomes “people who want to get fit in Manhattan and be pampered and are prepared to pay more for it”. And so on, and so on. They’re really questions and statements about the value proposition as perceived by the customer, and then delivered by the business.
So, value based marketing is about delivering value to a customer. This syncs with Google’s proclaimed goal in search, which is to put users first by delivering results they deem to have value, and not just pages that match a keyword term. Keywords need to be seen in a wider context, and that context is pretty difficult to establish if you’re standing outside the search engine looking in, so thinking in terms of concepts related to the value proposition might be a good way to go.
Value Based SEO Strategy
The common SEO approach, for many years, has started with keywords. It should start with customers and the business.
The first question is “who is the target market” and then ask what they value.
Relate what they value to the business. What is the value proposition of the business? Is it aligned? What would make a customer value this business offering over those of competitors? It might be price. It might be convenience. It’s probably a mix of various things, but be sure to nail down the specific value propositions.
Then think of some customer questions around these value propositions. What would be the likely customer objections to buying this product? What would be points that need clarifying? How does this offer differ from other similar offers? What is better about this product or service? What are the perceived problems in this industry? What are the perceived problems with this product or service? What is difficult or confusing about it? What could go wrong with it? What risks are involved? What aspects have turned off previous customers? What complaints did they make?
Make a list of such questions. These are your article topics.
You can glean this information by either interviewing customers or the business owner. Each of these questions, and accompanying answer, becomes an article topic on your site, although not necessarily in Q&A format. The idea is to create a list of topics as a basis for articles that address specific points, and objections, relating to the value proposition.
For example, buying SEO services is a risk. Customers want to know if the money they spend is going to give them a return. So, a valuable article might be a case study on how the company provided return on spend in the past, and the process by which it will achieve similar results in future. Another example might be a buyer concerned about the reliability of a make of car. A page dedicated to reliability comparisons, and another page outlining the customer care after-sale plan would provide value. Note how these articles aren’t keyword driven, but value driven.
Ever come across a FAQ that isn’t really a FAQ? Dreamed-up questions? They’re frustrating, and of little value if the information doesn’t directly relate to the value we seek. Information should be relevant and specific so when people land on the site, there’s more chance they will perceive value, at least in terms of addressing the questions already on their mind.
Compare this approach with generic copy around a keyword term. A page talking about “SEO” in response to the keyword term “SEO” might closely match the keyword, so that’s a relevance match, but unless it’s tied into providing customers the value they seek, it’s probably not of much use. Finding relevance matches is no longer a problem for users. Finding value matches often is. Even if you’re keyword focused, adding these articles provides semantic variation that may capture keyword searches that aren’t appearing in keyword tools.
Keyword relevance was a strategy devised at a time when information was less readily available and search engines weren’t as powerful. Finding something relevant was more hit and miss than it is today. These days, there are likely thousands, if not millions, of pages that will meet relevance criteria in terms of keyword matching, so the next step is to meet value criteria. Providing value is less likely to result in a click back to the search results, and more likely to create engagement, than mere on-topic matching.
The Value Chain
Deliver value. Once people perceive value, then we have to deliver it. Marketing, and SEO in particular, used to be about getting people over the threshold. Today, businesses have to work harder to differentiate themselves and a sound way of doing this is to deliver on promises made.
So the value is in the experience. Why do we return to Amazon? It’s likely due to the end-to-end experience in terms of delivering value. Any online e-commerce store can deliver relevance; where competition is fierce, Google can afford to be selective about which ones it shows.
In the long term, delivering value should drive down the cost of marketing as the site is more likely to enjoy repeat custom. As Google pushes more and more results beneath the fold, the cost of acquisition is increasing, so we need to treat each click like gold.
Monitor value. Does the firm keep delivering value? To the same level? Because people talk. They talk on Twitter and Facebook and the rest. We want them talking in a good way, but even if they talk in a negative way, it can still be useful. Their complaints can be used as topics for articles. They can be used to monitor value, refine the offer and correct problems as they arise. Those social signals, whilst not a guaranteed ranking boost, are still signals. We need to adopt strategies whereby we listen to all the signals, so as to better understand our customers, in order to provide more value, and hopefully enjoy a search traffic boost as a welcome side-effect, so long as Google is also trying to determine what users value.
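The complaint-mining idea above can be sketched minimally. This is an assumed, toy pipeline: the mention list and complaint-marker words are invented, and a real version would pull mentions from the social platforms the article names rather than a hard-coded list.

```python
# Illustrative sketch: filtering social mentions for complaint language,
# so complaints can seed new article topics. Markers and mentions are
# hypothetical examples.
import re

COMPLAINT_MARKERS = {"broken", "slow", "refund", "disappointed", "confusing"}

def complaint_topics(mentions):
    """Return mentions containing complaint markers, as article-topic seeds."""
    seeds = []
    for text in mentions:
        words = set(re.findall(r"[a-z']+", text.lower()))  # tokenize, drop punctuation
        if words & COMPLAINT_MARKERS:
            seeds.append(text)
    return seeds

mentions = [
    "Love the new checkout flow!",
    "Shipping was slow and the tracking page is confusing",
    "Asked for a refund, still waiting",
]
seeds = complaint_topics(mentions)
print(seeds)
```

Each flagged mention is both a signal about where value is slipping and a candidate topic for an article that addresses the complaint directly.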
Not sounding like SEO? Well, it’s not optimizing for search engines, but for people. If Google is to provide value, then it needs to ensure results are not just relevant, but offer genuine value to end users. Does Google do this? In many cases, not yet, but all their rhetoric and technical changes suggest that providing value is at the ideological heart of what they do. So the search results will most likely, in time, reflect the value people seek, and not just relevance.
In technical terms, this provides some interesting further reading:
Today, signals such as keyword co-occurrence, user behavior, and previous searches do in fact inform context around search queries, which impact the SERP landscape. Note I didn’t say the signals “impact rankings,” even though rank changes can, in some cases, be involved. That’s because there’s a difference. Google can make a change to the SERP landscape to impact 90 percent of queries and not actually cause any noticeable impact on rankings.
The way to get the context right, earn positive user-behavior signals, and align with previous searches is to first understand what people value.