Publisher Blocking: How the Web Was Lost
Streaming Apps
Google recently announced app streaming, where they can showcase & deep link into apps in the search results even if users do not have those apps installed. Rather than requiring users to install the app, Google runs the app on a computer in their cloud & streams users a video of the app. Click targets, ads, etc. remain the same.
In writing about the new feature, Danny Sullivan wrote a section on “How The Web Could Have Been Lost”:
Imagine if, in order to use the web, you had to download an app for each website you wanted to visit. To find news from the New York Times, you had to install an app that let you access the site through your web browser. To purchase from Amazon, you first needed to install an Amazon app for your browser. To share on Facebook, installation of the Facebook app for your browser would be required. That would be a nightmare.
…
The web put an end to this. More specifically, the web browser did. The web browser became a universal app that let anyone open anything on the web.
To meaningfully participate on those sorts of sites you still need an account. You are not going to be able to buy on Amazon without registration. Any popular social network which allows third party IDs to take the place of first party IDs will quickly become a den of spam until they close that loophole.
In short, you still have to register with sites to get real value out of them if you are doing much beyond reading an article. Without registration it is hard for them to personalize your experience & recommend relevant content.
Desktop Friendly Design
App indexing & deep linking of apps is a step in the opposite direction of the open web. It is supporting proprietary non-web channels which don’t link out. Further, if you thought keyword (not provided) heavily obfuscated user data, how much will data be obfuscated if the user isn’t even using your site or app, but rather is interacting via a Google cloud computer?
- Who visited your app? Not sure. It was a Google cloud computer.
- Where were they located? Not sure. It was a Google cloud computer.
- Did they have problems using your app? Not sure. It was a Google cloud computer.
- What did they look at? Can you retarget them? Not sure. It was a Google cloud computer.
Is an app maker too lazy to create a web equivalent version of their content? If so, let them be at a strategic disadvantage to everyone who put in the extra effort to publish their content online.
If Google has their remote quality raters consider a site as not meeting users’ needs because it doesn’t publish a “mobile friendly” version, how can one consider a publisher who creates “app only” content as an entity trying hard to meet end user needs?
We know Google hates app install interstitials (unless they are sold by Google), thus the only reason Google would have for wanting to promote these sorts of services would be to justify owning, controlling & monetizing the user experience.
App-solutely Not The Answer
Apps are sold as a way to lower channel risk & gain direct access to users, but the companies owning the app stores are firmly in control.
- Google is being investigated by regulators in multiple markets over their Android bundling contracts. And while bundling is a core feature of the OS, others who bundle are given the boot.
- Google Now on Tap embeds Google search in third party apps.
- The low price points for consumer apps in app stores make it hard for businesses to justify selling B2B apps at a high enough price to offset the smaller addressable audience.
- It has become harder to sell consumer apps as the app stores have saturated with competition.
2008 I’ll sell apps for $2.99 & make millions
2010 At $0.99 I’ll make $1000s
2012 Ads might cover my rent
2014 Kickstart my app
2015 Hire me— Nick Lockwood (@nicklockwood) August 3, 2015
- Exceptionally popular apps are disabled for interfering with the business models of the platforms. Apps and extensions can be disabled at any time, even after the fact, when rule changes turn what was once fine into a guideline violation. In some cases they are disabled with no option to re-enable.
- The Amazon app on iOS allows you to buy physical goods, but good luck buying an ebook in it. Even Amazon was removed from Google’s Play store after allowing digital purchases. Apple TV doesn’t support Amazon Prime Video. In turn, Amazon has stopped selling some streaming devices from Apple and Google.
Everyone wants to “own” the user, but none of the platforms bother to ask if the user wants to be owned:
We’re rapidly moving from an internet where computers are ‘peers’ (equals) to one where there are consumers and ‘data owners’, silos of end user data that work as hard as they can to stop you from communicating with other, similar silos.
…
If the current trend persists we’re heading straight for AOL 2.0, only now with a slick user interface, a couple more features and more users.
You’ve Got AOL
The AOL analogy is widely used:
Katz of Gogobot says that “SEO is a dying field” as Google uses its “monopoly” power to turn the field of search into Google’s own walled garden like AOL did in the age of dial-up modems.
Almost 4 years ago a Google engineer described SEO as a bug. He suggested one shouldn’t be able to rank highly without paying.
It looks like he was right. Google’s aggressive ad placement on mobile SERPs “has broken the will of users who would have clicked on an organic link if they could find one at the top of the page but are instead just clicking ads because they don’t want to scroll down.”
In the years since then we’ve learned Google’s “algorithm” has concurrent ranking signals & other forms of home cooking which guarantee success for Google’s vertical search offerings. The “reasonable” barrier to entry which applies to third parties does not apply to any new Google offerings.
And “bugs” keep appearing in those “algorithms,” which deliver a steady stream of harm to competing businesses.
From Indy to Brand
The waves of algorithm updates have in effect increased the barrier to entry, along with the cost needed to maintain rankings. The stress and financial impact this puts on small businesses make many of them not worth running. Look no further than MetaFilter’s founder seeing a psychologist, then quitting because he couldn’t handle the process.
When Google engineers are not focused on “breaking spirits” they emphasize the importance of happiness.
The ecosystem instability has made smaller sites effectively disappear while delivering a bland and soulless result set which is heavy on brand:
there’s no reason why the internet couldn’t keep on its present course for years to come. Under those circumstances, it would shed most of the features that make it popular with today’s avant-garde, and become one more centralized, regulated, vacuous mass medium, packed to the bursting point with corporate advertising and lowest-common-denominator content, with dissenting voices and alternative culture shut out or shoved into corners where nobody ever looks. That’s the normal trajectory of an information technology in today’s industrial civilization, after all; it’s what happened with radio and television in their day, as the gaudy and grandiose claims of the early years gave way to the crass commercial realities of the mature forms of each medium.
If you participate on the web daily, the change washes over you slowly, and the cumulative effects can be imperceptible. But if you were locked in an Iranian jail for years the change is hard to miss.
These sorts of problems not only impact search, but have an impact on all the major tech channels.
iPhone autocorrect inserted “showgirl” for “shows” and “POV” for “PPC”. This crowd sourcing of autocorrect is not welcomed.— john andrews (@searchsleuth998) November 10, 2015
If you live in Goole, these issues strike close to home.
And there are almost no counter-forces to the well established trend:
Eventually they might even symbolically close their websites, finishing the job they started when they all stopped paying attention to what their front pages looked like. Then, they will do a whole lot of what they already do, according to the demands of their new venues. They will report news and tell stories and post garbage and make mistakes. They will be given new metrics that are both more shallow and more urgent than ever before; they will adapt to them, all the while avoiding, as is tradition, honest discussions about the relationship between success and quality and self-respect.
…
If in five years I’m just watching NFL-endorsed ESPN clips through a syndication deal with a messaging app, and Vice is just an age-skewed Viacom with better audience data, and I’m looking up the same trivia on Genius instead of Wikipedia, and “publications” are just content agencies that solve temporary optimization issues for much larger platforms, what will have been the point of the last twenty years of creating things for the web?
A Deal With the Devil
As ad blocking has grown more pervasive, some publishers believe the solution to the problem is through gaining distribution through the channels which are exempt from the impacts of ad blocking. However those channels have no incentive to offer exceptional payouts. They make more by showing fewer ads within featured content from partners (where they must share ad revenues) and showing more ads elsewhere (where they keep all the ad revenues).
So far publishers have been underwhelmed with both Facebook Instant Articles and Apple News. The former for stringent ad restrictions, and the latter for providing limited user data. Google Now is also increasing the number of news stories they show. And next year Google will roll out their accelerated mobile pages offering.
The problem is if you don’t control the publishing you don’t control the monetization and you don’t control the data flow.
Your website helps make the knowledge graph (and other forms of vertical search) possible. But you are paid nothing when your content appears in the knowledge graph. And the knowledge graph now has a number of ad units embedded in it.
A decade ago, when Google pushed autolink to automatically insert links in publisher’s content, webmasters had enough leverage to “just say no.” But now? Not so much. Google considers in-text ad networks spam & embeds their own search in third party apps. As the terms of deals change, and what is considered “best for users” changes, content creators quietly accept, or quit.
Many video sites lost their rich snippets, while YouTube got larger snippets in the search results. Google pays YouTube content creators a far lower revenue share than even the default AdSense agreement offers. And those creators face restrictions which prevent them from using some forms of monetization while forcing them to accept other types of bundling.
The most recent leaked Google rater documents suggested the justification for featured answers was to make mobile search quick, but if that were the extent of it, featured answers wouldn’t also appear in desktop search results. It also doesn’t explain why the publisher credit links were originally a light gray.
With Google everything comes down to speed, speed, speed. But then they offer interstitial ad units, lock content behind surveys, and transform the user intent behind queries in a way that leads them astray.
As Google obfuscates more data & increasingly redirects and monetizes user intent, they promise to offer advertisers better integration of online to offline conversion data.
At the same time, as Google “speeds up” your site for you, they may break it with GoogleWebLight.
If you don’t host & control the user experience you are at the whim of (at best, morally agnostic) self-serving platforms which couldn’t care less if any individual publication dies.
It’s White Hat or Bust…
What was that old white hat SEO adage? I forget the precise wording, but I think it went something like…
Don’t buy links, it is too risky & too uncertain. Guarantee strong returns like Google does, by investing directly into undermining the political process by hiring lobbyists, heavy political donations, skirting political donation rules, regularly setting policy, inserting your agents in government, and sponsoring bogus “academic research” without disclosing the payments.
Focus on the user. Put them first. Right behind money.
Ad Network Ménage à Trois: Bing, Yahoo!, Google
Yahoo! Tests Google Again
Back in July we noticed Yahoo! was testing Google-powered search results. From that post…
When Yahoo! recently renewed their search deal with Microsoft, Yahoo! was once again allowed to sell their own desktop search ads & they are only required to give 51% of the search volume to Bing. There has been significant speculation as to what Yahoo! would do with the carve out. Would they build their own search technology? Would they outsource to Google to increase search ad revenues? It appears they are doing a bit of everything – some Bing ads, some Yahoo! ads, some Google ads.
The Growth of Gemini
Since then Gemini has grown significantly:
Yahoo has moved quickly to bring search ad traffic under Gemini for advertisers that have adopted the platform. For some perspective, in September 2015, Yahoo.com produced a little over 50 percent of the clicks that took place across the Bing Ads and Gemini platforms. For advertisers adopting Gemini, Gemini produced 22 percent of combined Bing and Gemini clicks. Given the device breakdown of Yahoo’s traffic, this amounts to about two-thirds of the traffic it is able to control under the renegotiated agreement.
That growth has come at the expense of Bing ad clicks, which have fallen significantly:
Shared Scale to Compete
Years ago Microsoft was partnered into the Yahoo!/Overture ad network to compete against Google. The idea was the companies together would have better scale to compete against Google in search & ads. Greater scale would lead to a more efficient marketplace, which would lead to better ad matching, higher advertiser bids, etc. This didn’t work as well as anticipated. Originally under-monetization was blamed on poor ad matching. Yahoo! Panama was a major rewrite of their ad system which was supposed to fix the problem, but it didn’t.
Even if issues like bid jamming were fixed & ad matching was more relevant, it still didn’t fix issues with lower ad depth in emerging markets & arbitrage lowering the value of expensive keywords in the United States.
Understanding the Value of Search Clicks
When a person types a keyword into a search box they are expressing significant intent. When a person clicks a link to land on a page they may still have significant interest, but generally there is at least some level of fall off. If I search for a keyword the value of my click is $x, but if I click a link in a “top searches” box, the value of that click may only be 5% or 10% of the value of a hand-typed search. There is less intent.
Here is a picture of the sort of “trending now” box which appears on the Yahoo! homepage.
Typically those sorts of searches include a bunch of female celebrities, but then in any such box there will be one or two money terms added, like [lower blood pressure] or [iPhone 6s]. People who search for those terms might have $5 or $10 of intent, but people who click those links might only have a quarter or 50 cents of intent.
That difference in value can utterly screw an advertiser who gets their high-value keyword featured while they are sleeping or not actively monitoring & managing their ad campaign.
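The damage from that intent fall-off can be sketched with a couple of lines of arithmetic. The dollar figures here are hypothetical, chosen to match the ballpark ranges described above:

```python
# Hypothetical numbers illustrating the intent fall-off described above.
# A hand-typed search for a money term might be worth ~$7.00 per click
# to an advertiser, while a curiosity click from a "trending now" box
# might be worth ~$0.40.

def blended_click_value(search_share: float,
                        search_value: float = 7.00,
                        trending_value: float = 0.40) -> float:
    """Average value per click when trending-box clicks mix into the stream."""
    trending_share = 1.0 - search_share
    return search_share * search_value + trending_share * trending_value

# If a keyword gets featured in a trending box overnight, the advertiser's
# traffic mix can flip from nearly all hand-typed searches to mostly
# low-intent trending clicks, gutting the average value per click:
normal_mix = blended_click_value(search_share=0.95)    # ~ $6.67 per click
featured_mix = blended_click_value(search_share=0.20)  # ~ $1.72 per click
```

An advertiser bidding as if every click were worth the normal mix loses money on every click during the featured period, which is exactly the scenario of the sleeping campaign manager above.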
For what it is worth, even Google tested some of these sorts of “search” traffic generation approaches during the last recession. On the Google AdSense network Google was buying banner ads telling people to search for [credit cards], & if people clicked on those banner ads they ended up on a search result page for [credit cards].
To this day many companies run contextual ads that drive search volume, but the difference between today & the Yahoo! which failed to monetize search is there is (at least currently) a greater focus on traffic quality.
Under-performance Due to Shady Traffic Partners
Yahoo! continued to under-perform in large part because they had many “search” partners with lower quality traffic sources mixed into the traffic stream, & they didn’t even allow advertisers to opt out of the partner network until after Yahoo! decided to exit the search market. As bad as that sounds, it was actually worse: some larger partners had access to advertiser information in a way that allowed them to aggressively arbitrage away the value of high advertiser bids wherever and whenever an advertiser overbid.
So you would bid thinking you were buying primarily search traffic based on the user intent of a person searching for something, but you might have been getting various layers of arbitrage of lower quality traffic, traffic from domain lander pages, or even some mix of robotic traffic from clickbots. Those $30 search ad clicks are a sure money loser if it is a clickbot software program doing the click.
And not only were some of Yahoo!’s partners driving down the value of clicks on Yahoo! itself, but Yahoo! was paying some of the larger partners in the high 80s to low 90s percent of revenue. Here is a (made up) example chart for illustration purposes, where the (made up) partner is getting a 90% TAC:
| Scenario | Advertiser Bid | Y! Search Clicks | Partner Clicks | Total Clicks | Total Revs | TAC | Rev after TAC |
| --- | --- | --- | --- | --- | --- | --- | --- |
| No Partners | $30 | 3,000 | 0 | 3,000 | $90,000 | $0 | $90,000 |
| Bit of Arb | $25 | 3,000 | 1,000 | 4,000 | $100,000 | $22,500 | $77,500 |
| Heavy Arb | $10 | 3,000 | 6,000 | 9,000 | $90,000 | $54,000 | $36,000 |
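The table’s arithmetic can be sketched in a few lines. The 90% revenue share and the click counts are the made-up figures from the illustration above, not real Yahoo! data:

```python
def network_revenue(bid: float, yahoo_clicks: int, partner_clicks: int,
                    partner_tac: float = 0.90):
    """Gross revenue, TAC paid out, and revenue retained after TAC.

    partner_tac is the share of partner-click revenue paid back to the
    traffic partner (90% in the made-up example)."""
    total_clicks = yahoo_clicks + partner_clicks
    gross = bid * total_clicks
    tac = bid * partner_clicks * partner_tac
    return gross, tac, gross - tac

# The three rows from the table above. Note how heavy arbitrage grows
# total clicks 3x while the revenue Yahoo! keeps falls by more than half:
no_partners = network_revenue(30, 3000, 0)     # gross $90k, TAC $0, keeps $90k
bit_of_arb = network_revenue(25, 3000, 1000)   # gross $100k, TAC $22.5k, keeps $77.5k
heavy_arb = network_revenue(10, 3000, 6000)    # gross $90k, TAC $54k, keeps $36k
```

The point of the sketch: topline (gross) revenue looks flat or even up while retained revenue collapses, which is how the quarterly numbers could look healthy while the ecosystem rotted.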
Why did Yahoo! allow the above sort of behavior to go on? It is hard to believe they were completely unaware of what was going on, particularly when it was so obvious to outside observers. More likely it was that they were rapidly losing search share & wanted the topline revenue growth to make their quarterly number. By the time they realized what damage they had already done to their ecosystem, they were already too far down the path to correct it & were afraid to do anything which significantly hit revenues.
The rapid rise and fall of a large Yahoo! search partner named Geosign was detailed by the Canadian Financial Post, in an article which is now offline, but available via the Internet Archive Wayback Machine:
Companies fail all the time. Sometimes with little warning. But companies that are highly profitable and only weeks removed from a record-setting venture capital investment? Not so much. Yet in Geosign’s case, the cuts that began last May continued through the summer. Late last year, fewer than 100 employees remained. Today, Geosign itself no longer exists, its still-functioning website an empty reminder of its former promise. And while the national business media has, until now, overlooked the story – surprising, given the size of the investment and the fact that Google played a direct role in the outcome – within Canada’s technology and venture-capital communities, the $160-million investment is known as the deal “that didn’t go well.” When the collapse happened, even jaded industry watchers accustomed to financial debacles in the tech sector were stunned. “I’ve seen a lot of meltdowns,” says Duncan Stewart, a technology and investment analyst in Toronto. “But something happening like this, over just a few weeks, that’s unprecedented in my experience.”
Other traffic sources like domain parking have also sharply declined, due to a variety of factors: web browsers replacing address bars with multi-purpose search boxes, the shift of consumer internet traffic to mobile devices (which increases reliance on search over direct navigation, while apps replace some segment of direct navigation), increased smart pricing, lower revenue sharing percentages, and Yahoo! no longer being able to offer a competitive bid against Google.
When Yahoo! shifted their search ads to Microsoft, Microsoft allowed advertisers to opt out of the partner network. Microsoft also clamped down on some of the lower quality traffic sources with smart pricing, which hit some of the arbitrage businesses hard & even forced Yahoo! to seek refunds from some of their partners for delivering low quality traffic.
Shared Scale to Compete
Microsoft launched their own algorithmic search results on Live Search & their own Microsoft adCenter search ads. Microsoft continued to lose share in search at least until they gave their search engine a memorable name in Bing. The Yahoo! Bing ad network seemed to be gaining momentum when Yahoo! signed a deal with Mozilla to become the default search provider for Firefox, but it appears Yahoo! overpaid for the deal as Yahoo! search revenues ex-TAC were off $60 million YoY in the most recent quarter.
In spite of using an ad-heavy search interface Yahoo! has not grown search ad revenues as quickly as the search market has grown. Yahoo! had continually lost marketshare for years (up until the Mozilla Firefox deal). And even as Microsoft has followed Google in broadening its ad matching, many of the other “search” traffic partners Yahoo! once relied on to make their numbers are no longer in the marketplace to augment their data.
The Bing / Yahoo! network search traffic is now much cleaner than the Yahoo! “search” traffic quality of many years ago, but Yahoo! hasn’t replaced some of the old search partners which have died off.
Shared Scale No Longer Important?
Yahoo! increasing the share of their ad clicks which are powered by Gemini lowers the network efficiency of the Yahoo!/Bing ad network. All the talk of “synergy” driving value sort of goes up in smoke when Yahoo! shifts a significant share of their ad clicks away from the original network.
Yahoo! announced a new search deal with Google. Here’s the Tweet version…
$YHOO has signed a 3 year partnership with Google to bolster our search capabilities. This is in addition to our relationship with Microsoft— Yahoo Inc. (@YahooInc) October 20, 2015
…the underlying ethos…
“If you love something, set it free; if it comes back it’s yours, if it doesn’t, it never was.”
…and the long version…
On October 19, 2015, Yahoo! Inc., a Delaware corporation (“Yahoo”), and Google Inc., a Delaware corporation (“Google”), entered into a Google Services Agreement (the “Services Agreement”). The Services Agreement is effective as of October 1, 2015 and expires on December 31, 2018. Pursuant to the Services Agreement, Google will provide Yahoo with search advertisements through Google’s AdSense for Search service (“AFS”), web algorithmic search services through Google’s Websearch Service, and image search services. The results provided by Google for these services will be available to Yahoo for display on both desktop and mobile platforms. Yahoo may use Google’s services on Yahoo’s owned and operated properties (“Yahoo Properties”) and on certain syndication partner properties (“Affiliate Sites”) in the United States (U.S.), Canada, Hong Kong, Taiwan, Singapore, Thailand, Vietnam, Philippines, Indonesia, Malaysia, India, Middle East, Africa, Mexico, Argentina, Brazil, Colombia, Chile, Venezuela, Peru, Australia and New Zealand.
Under the Services Agreement, Yahoo has discretion to select which search queries to send to Google and is not obligated to send any minimum number of search queries. The Services Agreement is non-exclusive and expressly permits Yahoo to use any other search advertising services, including its own service, the services of Microsoft Corporation or other third parties.
Google will pay Yahoo a percentage of the gross revenues from AFS ads displayed on Yahoo Properties or Affiliate Sites. The percentage will vary depending on whether the ads are displayed on U.S. desktop sites, non-U.S. desktop sites or on the tablet or mobile phone versions of the Yahoo Properties or its Affiliate Sites. Yahoo will pay Google fees for requests for image search results or web algorithmic search results.
Either party may terminate the Services Agreement (1) upon a material breach subject to certain limitations; (2) in the event of a change in control (as defined in the Services Agreement); (3) after first discussing with the other party in good faith its concerns and potential alternatives to termination (a) in its entirety or in the U.S. only, if it reasonably anticipates litigation or a regulatory proceeding brought by any U.S. federal or state agency to enjoin the parties from consummating, implementing or otherwise performing the Services Agreement, (b) in part, in a country other than the U.S., if either party reasonably anticipates litigation or a regulatory proceeding or reasonably anticipates that the continued performance under the Services Agreement in such country would have a material adverse impact on any ongoing antitrust proceeding in such country, (c) in its entirety if either party reasonably anticipates a filing by the European Commission to enjoin it from performing the Services Agreement or that continued performance of the Services Agreement would have a material adverse impact on any ongoing antitrust proceeding involving either party in Europe or India, or (d) in its entirety, on 60 days notice if the other party’s exercise of these termination rights in this clause (3) has collectively and materially diminished the economic value of the Services Agreement. Each party agrees to defend or settle any lawsuits or similar actions related to the Services Agreement unless doing so is not commercially reasonable (taking all factors into account, including without limitation effects on a party’s brand or business outside of the scope of the Services Agreement).
In addition, Google may suspend Yahoo’s use of services upon certain events and may terminate the Services Agreement if such events are not cured. Yahoo may terminate the Services Agreement if Google breaches certain service level and server latency specified in the Services Agreement.
In connection with the Services Agreement, Yahoo and Google have agreed to certain procedures with the Antitrust Division of the United States Department of Justice (the “DOJ”) to facilitate review of the Services Agreement by the DOJ, including delaying the implementation of the Services Agreement in the U.S. in order to provide the DOJ with a reasonable period of review.
Where Are We Headed?
Danny Sullivan mentioned that the 51% of search share Yahoo! is required to deliver to Bing applies only to desktop traffic & that Yahoo! has no such limit on mobile searches. In theory this could mean Yahoo! could quickly become a Google shop, with Microsoft as a backfill partner.
When asked about the future of Gemini on today’s investor conference call Marissa Mayer stated she expected Gemini to continue scaling more on mobile. She also stated she felt the Google deal would help Yahoo! refine their ad mix & give them additional opportunities in international markets. Yahoo! is increasingly reliant on the US & is unable to bid to win marketshare in foreign markets.
(Myopic) Learning Systems
Marissa Mayer sounded both insightful and myopic on today’s conference call. She mentioned how, as they scale up Gemini, the cost of that scaling is reflected in forgone revenues while they optimize their learning systems and improve their ad relevancy. On its face, that sort of comment sounds totally reasonable.
An unsophisticated or utterly ignorant market participant might even cheer it on, without realizing the additional complexity, management cost & risk they are promoting.
Where the myopic quick win view falls flat is on the other side of the market.
Sure, a large web platform can use big data to optimize its performance and squeeze out additional pennies of yield, but for an advertiser these blended networks can be a real struggle. How do they budget for any given network when a single company is arbitrarily mixing between 3 parallel networks? A small shift in Google AdWords ad spend might not be hard to manage, but what happens if an advertiser suddenly gets a bunch of [trending topic] search ad clicks? Or maybe they get a huge slug of mobile clicks which don’t work very well for their business. Do they disable the associated keyword in Yahoo! Gemini? Or Bing Ads? Or Google AdWords? All 3?
Do they find that when they pause their ads in one network, the second (or third) network quickly carries their ads across?
Even if you can track and manage it on a granular basis, the additional management time is non-trivial. One of the fundamental keys to a solid online advertising strategy is granular control, so you can quickly alter distribution. But if you turn your ads off in one network only to find your ads from the second network get carried across, that creates a bit of chaos. The more networks there are running in parallel and bleeding together, the blurrier things get.
This sort of “overlap = bad” mindset is precisely why search engines suggest creating tight ad campaigns and ad groups. But you lose that control when things arbitrarily shift about.
To appreciate how expensive those sorts of costs can be, consider what has happened with programmatic ads:
Platforms that facilitate automated sales for media companies typically take 10% to 20% of the revenue that passes through their hands, according to the IAB report. Networks that service programmatic buys typically mark up inventory, citing the value that they add, by 30% to 50%. And then there are the essential data-management platforms, which take 10% to 15% of a buy, industry executives said.
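Those quoted fees compound. The sketch below assumes the fees stack in a particular order (data management first, then network markup, then platform cut); that composition is a simplifying assumption for illustration, while the fee ranges come from the IAB figures quoted above:

```python
# Rough sketch of how stacked programmatic fees compound. The fee ranges
# are from the quoted IAB report; which fees stack, and in what order,
# is a simplifying assumption for illustration only.

def publisher_share(platform_fee: float, network_markup: float,
                    dmp_fee: float, spend: float = 1.00) -> float:
    """Fraction of a $1 programmatic buy reaching the publisher, assuming
    each intermediary's cut applies to what is left after the previous one."""
    after_dmp = spend * (1 - dmp_fee)                 # data-management platform cut
    after_markup = after_dmp / (1 + network_markup)   # back out the network's markup
    return after_markup * (1 - platform_fee)          # sell-side platform's cut

low_fees = publisher_share(platform_fee=0.10, network_markup=0.30, dmp_fee=0.10)
high_fees = publisher_share(platform_fee=0.20, network_markup=0.50, dmp_fee=0.15)
# Roughly 62 cents vs 45 cents of each advertiser dollar reaches the
# publisher under these assumptions.
```

Even at the friendly end of the ranges, more than a third of the spend evaporates before it reaches the publisher, which is the scale of cost the paragraph above is gesturing at.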
If you are managing a client budget for paid search, how do you determine a pre-approved budget for each network when the traffic mix & quality might rapidly oscillate across the networks?
Don’t take my word for it though; read the Yahoo! Ads Twitter account:
“Consumers and advertisers are overwhelmed by choice. Our industry needs solutions that eliminate fragmentation” @andrew_snyder #videonomics— YahooAds (@YahooAds) October 21, 2015
When Yahoo! tries to manage their yield they will not only be choosing among 3 parallel networks on their end, but they will also have individual advertisers making a wide variety of changes on the other end. And some of those advertisers will not only be influenced by the ad networks, but also the organic rankings which come with the ads.
If one search engine is ranking you well in the organic search results for an important keyword and another is not, then you should bid more aggressively on your ads on the search engine which is ranking your site, because by voting with your budget you may well be voting on which underlying relevancy algorithm is chosen to deliver the associated organic search results accompanying the ads.
That last point was important & I haven’t seen it mentioned anywhere yet, so it is worth repeating: your PPC ad bids may determine which search relevancy algorithm drives Yahoo! Search organic results.
Time to Quit Digging & Drop The Shovel
The other (BIG) issue is that as they give Google more search marketshare they give Google more granular data, which in turn means they:
- make buying on their own network less worthy of the management cost & complexity
- make Google more of a “must buy”
- will never close the monetization gap with Google
Even today Google announced a new tool for offering advertisers granular localized search data. Search partners won’t directly benefit from those tools.
The old problem with Yahoo! was they were heavily reliant on search partners who drove down the traffic value. The future problem may well be if the marginally profitable Bing leaves the search market, Google will drive down the amount of revenue they share with Yahoo!.
If the Yahoo! Google search deal gets approved, Bing might shift back to losing money unless Microsoft buys Yahoo! after the Alibaba share spin out.
Ever track how Google’s TAC has shifted over the past decade?
It has only been a decade so far, but MAYBE THIS TIME IS DIFFERENT.
Ad Network Ménage à Trois: Bing, Yahoo!, Google
Yahoo! Tests Google Again
Back in July we noticed Yahoo! was testing Google-powered search results. From that post…
When Yahoo! recently renewed their search deal with Microsoft, Yahoo! was once again allowed to sell their own desktop search ads & they are only required to give 51% of the search volume to Bing. There has been significant speculation as to what Yahoo! would do with the carve out. Would they build their own search technology? Would they outsource to Google to increase search ad revenues? It appears they are doing a bit of everything – some Bing ads, some Yahoo! ads, some Google ads.
The Growth of Gemini
Since then Gemini has grown significantly:
Yahoo has moved quickly to bring search ad traffic under Gemini for advertisers that have adopted the platform. For some perspective, in September 2015, Yahoo.com produced a little over 50 percent of the clicks that took place across the Bing Ads and Gemini platforms. For advertisers adopting Gemini, Gemini produced 22 percent of combined Bing and Gemini clicks. Given the device breakdown of Yahoo’s traffic, this amounts to about two-thirds of the traffic it is able to control under the renegotiated agreement.
That growth has come at the expense of Bing ad clicks, which have fallen significantly:
Shared Scale to Compete
Years ago Microsoft was partnered into the Yahoo!/Overture ad network to compete against Google. The idea was that together the companies would have better scale to compete against Google in search & ads. Greater scale would lead to a more efficient marketplace, which would lead to better ad matching, higher advertiser bids, etc. This didn’t work as well as anticipated. Originally under-monetization was blamed on poor ad matching. Yahoo! Panama was a major rewrite of their ad system which was supposed to fix the problem, but it didn’t.
Even if issues like bid jamming had been fixed & ad matching made more relevant, that still wouldn’t have addressed lower ad depth in emerging markets & arbitrage lowering the value of expensive keywords in the United States.
Understanding the Value of Search Clicks
When a person types a keyword into a search box they are expressing significant intent. When a person clicks a link to land on a page they may still have significant interest, but generally there is at least some level of fall off. If I search for a keyword the value of my click is $x, but if I click a link in a “top searches” box, the value of that click may be only 5% or 10% of the value of a hand-typed search. There is less intent.
Here is a picture of the sort of “trending now” box which appears on the Yahoo! homepage.
Typically those sorts of searches include a bunch of female celebrities, but then in any such box there will be one or two money terms added, like [lower blood pressure] or [iPhone 6s]. People who search for those terms might have $5 or $10 of intent, but people who click those links might only have a quarter or 50 cents of intent.
That difference in value can utterly screw an advertiser who gets their high-value keyword featured while they are sleeping or not actively monitoring & managing their ad campaign.
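To put rough numbers on that mismatch, here is a minimal sketch (the figures are invented for illustration, using the 5% intent discount estimated above) of what a flat bid pays versus what the clicks are actually worth:

```python
# Illustrative only: compares what a batch of clicks is worth against what
# a flat keyword bid pays for them. All dollar figures are invented.

def expected_value(click_value, clicks):
    """Total value delivered by a batch of clicks."""
    return click_value * clicks

typed_value = 10.00                  # a hand-typed search for a money term
trending_value = typed_value * 0.05  # ~5% of the intent, per the estimate above

bid = 10.00                          # advertiser bids as if every click were typed
typed_clicks, trending_clicks = 100, 900  # trending box floods the keyword overnight

value = (expected_value(typed_value, typed_clicks)
         + expected_value(trending_value, trending_clicks))
cost = bid * (typed_clicks + trending_clicks)

print(f"value delivered: ${value:,.2f}")  # $1,450.00
print(f"amount paid:     ${cost:,.2f}")   # $10,000.00
```

The advertiser pays full price for every click while nine out of ten carry a fraction of the intent, which is exactly the overnight scenario described above.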
For what it is worth, even Google tested some of these sorts of “search” traffic generation approaches during the last recession. On the Google AdSense network Google was buying banner ads telling people to search for [credit cards] & if they clicked on those banner ads they ended up on a search result page for [credit cards].
To this day many companies run contextual ads that drive search volume, but the difference between today & the Yahoo! which failed to monetize search is there is (at least currently) a greater focus on traffic quality.
Under-performance Due to Shady Traffic Partners
Yahoo! continued to under-perform in large part because Yahoo! had a lot of “search” partners with many lower quality traffic sources mixed in their traffic stream & they didn’t even allow advertisers to opt out of the partner network until after Yahoo! decided to exit the search market. As bad as the above sounds, it is actually worse, as some larger partners had access to advertiser information in a way that allowed them to aggressively arbitrage away the value of high advertiser bids wherever and whenever an advertiser overbid.
So you would bid thinking you were buying primarily search traffic based on the user intent of a person searching for something, but you might have been getting various layers of arbitrage of lower quality traffic, traffic from domain lander pages, or even some mix of robotic traffic from clickbots. Those $30 search ad clicks are a sure money loser if it is a clickbot software program doing the click.
And not only were some of Yahoo!’s partners driving down the value of clicks on Yahoo! itself, but Yahoo! was paying some of the larger partners in the high 80s to low 90s percent of revenue. Here is a (made up) example chart for illustration purposes, where the (made up) partner is getting a 90% TAC:

| | Advertiser Bid | Y! Search Clicks | Partner Clicks | Total Clicks | Total Revs | TAC | Rev after TAC |
|---|---|---|---|---|---|---|---|
| No Partners | $30 | 3,000 | 0 | 3,000 | $90,000 | $0 | $90,000 |
| Bit of Arb | $25 | 3,000 | 1,000 | 4,000 | $100,000 | $22,500 | $77,500 |
| Heavy Arb | $10 | 3,000 | 6,000 | 9,000 | $90,000 | $54,000 | $36,000 |
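The rows above follow from simple arithmetic; a short sketch reproducing the (made up) table, assuming the 90% revenue share applies to clicks on partner traffic:

```python
# Reproduces the (made up) table above: a 90% revenue share (TAC) paid
# out on partner clicks.
TAC_RATE = 0.90

scenarios = [
    # (label, advertiser bid, Y! search clicks, partner clicks)
    ("No Partners", 30, 3000,    0),
    ("Bit of Arb",  25, 3000, 1000),
    ("Heavy Arb",   10, 3000, 6000),
]

rows = []
for label, bid, search_clicks, partner_clicks in scenarios:
    total_revs = bid * (search_clicks + partner_clicks)
    tac = bid * partner_clicks * TAC_RATE  # revenue paid out to the partner
    rows.append((label, total_revs, tac, total_revs - tac))
    print(f"{label:12} revs ${total_revs:>8,}  TAC ${tac:>7,.0f}  after TAC ${total_revs - tac:>7,.0f}")
```

Note how the heavy-arbitrage row grows total clicks 3x while topline revenue stays flat & revenue after TAC falls by more than half.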
Why did Yahoo! allow the above sort of behavior to go on? It is hard to believe they were completely unaware of what was going on, particularly when it was so obvious to outside observers. More likely it was that they were rapidly losing search share & wanted the topline revenue growth to make their quarterly number. By the time they realized what damage they had already done to their ecosystem, they were already too far down the path to correct it & were afraid to do anything which significantly hit revenues.
The rapid rise and fall of a large Yahoo! search partner named Geosign was detailed by the Canadian Financial Post, in an article which is now offline, but available via the Internet Archive Wayback Machine:
Companies fail all the time. Sometimes with little warning. But companies that are highly profitable and only weeks removed from a record-setting venture capital investment? Not so much. Yet in Geosign’s case, the cuts that began last May continued through the summer. Late last year, fewer than 100 employees remained. Today, Geosign itself no longer exists, its still-functioning website an empty reminder of its former promise. And while the national business media has, until now, overlooked the story – surprising, given the size of the investment and the fact that Google played a direct role in the outcome – within Canada’s technology and venture-capital communities, the $160-million investment is known as the deal “that didn’t go well.” When the collapse happened, even jaded industry watchers accustomed to financial debacles in the tech sector were stunned. “I’ve seen a lot of meltdowns,” says Duncan Stewart, a technology and investment analyst in Toronto. “But something happening like this, over just a few weeks, that’s unprecedented in my experience.”
Other traffic sources like domain parking have also sharply declined, due to a variety of factors: web browsers replacing address bars with multi-purpose search boxes, the shift of consumer internet traffic to mobile devices (which increases reliance on search over direct navigation, with apps replacing some segment of direct navigation), increased smart pricing, lower revenue sharing percentages, and Yahoo! no longer being able to offer a competitive bid against Google.
When Yahoo! shifted their search ads to Microsoft, Microsoft allowed advertisers to opt out of the partner network. Microsoft also clamped down on some of the lower quality traffic sources with smart pricing, which hit some of the arbitrage businesses hard & even forced Yahoo! to seek refunds from some of their partners for delivering low quality traffic.
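Smart pricing, in rough terms, discounts what an advertiser pays for a click from a low-quality source in proportion to how poorly that source converts relative to core search traffic. A simplified sketch of the idea (the formula and numbers here are illustrative, not Microsoft’s actual implementation):

```python
# Simplified smart-pricing sketch (not Microsoft's actual formula): scale
# the price of a partner click by the ratio of that source's conversion
# rate to the conversion rate of core search traffic.

def smart_price(bid, source_conv_rate, baseline_conv_rate):
    quality = min(1.0, source_conv_rate / baseline_conv_rate)
    return bid * quality

bid = 2.00  # advertiser's max CPC, in dollars
print(smart_price(bid, 0.040, 0.040))  # core search traffic pays full price
print(smart_price(bid, 0.004, 0.040))  # a 10x-worse source pays ~10% of the bid
```

Under a scheme like this, an arbitrage partner sending clicks that convert at a tenth of the baseline rate earns a tenth of the revenue, which is why smart pricing hit those businesses hard.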
Shared Scale to Compete
Microsoft launched their own algorithmic search results on Live Search & their own Microsoft adCenter search ads. Microsoft continued to lose share in search at least until they gave their search engine a memorable name: Bing. The Yahoo! Bing ad network seemed to be gaining momentum when Yahoo! signed a deal with Mozilla to become the default search provider in Firefox, but it appears Yahoo! overpaid for the deal, as Yahoo! search revenues ex-TAC were off $60 million YoY in the most recent quarter.
In spite of using an ad-heavy search interface, Yahoo! has not grown search ad revenues as quickly as the search market has grown. Yahoo! continually lost marketshare for years (up until the Mozilla Firefox deal). And even as Microsoft has followed Google in broadening their ad matching, many of the other “search” traffic partners Yahoo! once relied on to make their numbers are no longer in the marketplace to augment their data.
The Bing / Yahoo! network search traffic is now much cleaner than the Yahoo! “search” traffic quality of many years ago, but Yahoo! hasn’t replaced some of the old search partners which have died off.
Shared Scale No Longer Important?
Yahoo! increasing the share of their ad clicks which are powered by Gemini lowers the network efficiency of the Yahoo!/Bing ad network. All the talk of “synergy” driving value sort of goes up in smoke when Yahoo! shifts a significant share of their ad clicks away from the original network.
Yahoo! announced a new search deal with Google. Here’s the Tweet version…
$YHOO has signed a 3 year partnership with Google to bolster our search capabilities. This is in addition to our relationship with Microsoft— Yahoo Inc. (@YahooInc) October 20, 2015
…the underlying ethos…
“If you love something, set it free; if it comes back it’s yours, if it doesn’t, it never was.”
…and the long version…
On October 19, 2015, Yahoo! Inc., a Delaware corporation (“Yahoo”), and Google Inc., a Delaware corporation (“Google”), entered into a Google Services Agreement (the “Services Agreement”). The Services Agreement is effective as of October 1, 2015 and expires on December 31, 2018. Pursuant to the Services Agreement, Google will provide Yahoo with search advertisements through Google’s AdSense for Search service (“AFS”), web algorithmic search services through Google’s Websearch Service, and image search services. The results provided by Google for these services will be available to Yahoo for display on both desktop and mobile platforms. Yahoo may use Google’s services on Yahoo’s owned and operated properties (“Yahoo Properties”) and on certain syndication partner properties (“Affiliate Sites”) in the United States (U.S.), Canada, Hong Kong, Taiwan, Singapore, Thailand, Vietnam, Philippines, Indonesia, Malaysia, India, Middle East, Africa, Mexico, Argentina, Brazil, Colombia, Chile, Venezuela, Peru, Australia and New Zealand.
Under the Services Agreement, Yahoo has discretion to select which search queries to send to Google and is not obligated to send any minimum number of search queries. The Services Agreement is non-exclusive and expressly permits Yahoo to use any other search advertising services, including its own service, the services of Microsoft Corporation or other third parties.
Google will pay Yahoo a percentage of the gross revenues from AFS ads displayed on Yahoo Properties or Affiliate Sites. The percentage will vary depending on whether the ads are displayed on U.S. desktop sites, non-U.S. desktop sites or on the tablet or mobile phone versions of the Yahoo Properties or its Affiliate Sites. Yahoo will pay Google fees for requests for image search results or web algorithmic search results.
Either party may terminate the Services Agreement (1) upon a material breach subject to certain limitations; (2) in the event of a change in control (as defined in the Services Agreement); (3) after first discussing with the other party in good faith its concerns and potential alternatives to termination (a) in its entirety or in the U.S. only, if it reasonably anticipates litigation or a regulatory proceeding brought by any U.S. federal or state agency to enjoin the parties from consummating, implementing or otherwise performing the Services Agreement, (b) in part, in a country other than the U.S., if either party reasonably anticipates litigation or a regulatory proceeding or reasonably anticipates that the continued performance under the Services Agreement in such country would have a material adverse impact on any ongoing antitrust proceeding in such country, (c) in its entirety if either party reasonably anticipates a filing by the European Commission to enjoin it from performing the Services Agreement or that continued performance of the Services Agreement would have a material adverse impact on any ongoing antitrust proceeding involving either party in Europe or India, or (d) in its entirety, on 60 days notice if the other party’s exercise of these termination rights in this clause (3) has collectively and materially diminished the economic value of the Services Agreement. Each party agrees to defend or settle any lawsuits or similar actions related to the Services Agreement unless doing so is not commercially reasonable (taking all factors into account, including without limitation effects on a party’s brand or business outside of the scope of the Services Agreement).
In addition, Google may suspend Yahoo’s use of services upon certain events and may terminate the Services Agreement if such events are not cured. Yahoo may terminate the Services Agreement if Google breaches certain service level and server latency specified in the Services Agreement.
In connection with the Services Agreement, Yahoo and Google have agreed to certain procedures with the Antitrust Division of the United States Department of Justice (the “DOJ”) to facilitate review of the Services Agreement by the DOJ, including delaying the implementation of the Services Agreement in the U.S. in order to provide the DOJ with a reasonable period of review.
Where Are We Headed?
Danny Sullivan mentioned the 51% of search share Yahoo! is required to deliver to Bing applies only to desktop traffic & Yahoo! has no such limit on mobile searches. In theory, Yahoo! could quickly become a Google shop, with Microsoft as a backfill partner.
When asked about the future of Gemini on today’s investor conference call Marissa Mayer stated she expected Gemini to continue scaling more on mobile. She also stated she felt the Google deal would help Yahoo! refine their ad mix & give them additional opportunities in international markets. Yahoo! is increasingly reliant on the US & is unable to bid to win marketshare in foreign markets.
(Myopic) Learning Systems
Marissa Mayer sounded both insightful and myopic on today’s conference call. She mentioned that as they scale up Gemini, the cost shows up as foregone revenue while they optimize their learning systems and improve their ad relevancy. On its face, that sort of comment sounds totally reasonable.
An unsophisticated or utterly ignorant market participant might even cheer it on, without realizing the additional complexity, management cost & risk they are promoting.
Where the myopic quick win view falls flat is on the other side of the market.
Sure a large web platform can use big data to optimize their performance and squeeze out additional pennies of yield, but for an advertiser these blended networks can be a real struggle. How do they budget for any given network when a single company is arbitrarily mixing between 3 parallel networks? A small shift in Google AdWords ad spend might not be hard to manage, but what happens if an advertiser suddenly gets a bunch of [trending topic] search ad clicks? Or maybe they get a huge slug of mobile clicks which don’t work very well for their business. Do they disable the associated keyword in Yahoo! Gemini? Or Bing Ads? Or Google AdWords? All 3?
Do they find that pausing their ads in one network quickly leads the second (or third) network to carry their ads across?
Even if you can track and manage it on a granular basis, the additional management time is non-trivial. One of the fundamental keys to a solid online advertising strategy is having granular control so you can quickly alter distribution. But if you turn your ads off in one network only to find your ads from the second network get carried across, that creates a bit of chaos. The more networks there are bleeding together in parallel, the blurrier things get.
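As a sketch of the bookkeeping involved, here is a hypothetical monitor that flags keywords whose click mix suddenly shifts between parallel networks (the network names, click counts & threshold are all invented for illustration):

```python
# Hypothetical monitoring sketch: flag keywords whose share of clicks
# moved sharply between parallel ad networks from one day to the next.
# Network names, click counts & the threshold are all invented.

def click_shares(clicks_by_network):
    total = sum(clicks_by_network.values())
    return {net: clicks / total for net, clicks in clicks_by_network.items()}

def flag_shifts(yesterday, today, threshold=0.25):
    """Return {network: (old share, new share)} for big day-over-day moves."""
    old, new = click_shares(yesterday), click_shares(today)
    return {net: (old.get(net, 0.0), new.get(net, 0.0))
            for net in set(old) | set(new)
            if abs(new.get(net, 0.0) - old.get(net, 0.0)) > threshold}

yesterday = {"Bing Ads": 800, "Gemini": 150, "AdWords": 50}
today = {"Bing Ads": 300, "Gemini": 650, "AdWords": 50}
flagged = flag_shifts(yesterday, today)
print(flagged)  # Bing Ads & Gemini traded roughly half the keyword's clicks
```

And once a shift is flagged, the advertiser still has to decide which network’s controls to touch, which is the chaos described above.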
This sort of “overlap = bad” mindset is precisely why search engines suggest creating tight ad campaigns and ad groups. But you lose that control when things arbitrarily shift about.
To appreciate how expensive those sorts of costs can be, consider what has happened with programmatic ads:
Platforms that facilitate automated sales for media companies typically take 10% to 20% of the revenue that passes through their hands, according to the IAB report. Networks that service programmatic buys typically mark up inventory, citing the value that they add, by 30% to 50%. And then there are the essential data-management platforms, which take 10% to 15% of a buy, industry executives said.
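Compounding those quoted fee ranges shows how little of an ad dollar may survive to the publisher. A rough calculation, treating each layer as a straight percentage take (an approximation, since the 30% to 50% figure is a price markup rather than a revenue share):

```python
# Compounds the programmatic fee layers quoted above: platform 10-20%,
# network markup 30-50% (approximated here as a straight take), and
# data management 10-15%.

def publisher_share(fee_layers):
    share = 1.0
    for fee in fee_layers:
        share *= (1.0 - fee)
    return share

best = publisher_share([0.10, 0.30, 0.10])   # low end of every quoted range
worst = publisher_share([0.20, 0.50, 0.15])  # high end of every quoted range

print(f"publisher keeps between {worst:.0%} and {best:.0%} of each ad dollar")
# → publisher keeps between 34% and 57% of each ad dollar
```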
If you are managing a client budget for paid search, how do you determine a pre-approved budget for each network when the traffic mix & quality might rapidly oscillate across the networks?
Don’t take my word for it though, read the Yahoo! Ads Twitter account
“Consumers and advertisers are overwhelmed by choice. Our industry needs solutions that eliminate fragmentation” @andrew_snyder #videonomics— YahooAds (@YahooAds) October 21, 2015
When Yahoo! tries to manage their yield they will not only be choosing among 3 parallel networks on their end, but they will also have individual advertisers making a wide variety of changes on the other end. And some of those advertisers will not only be influenced by the ad networks, but also the organic rankings which come with the ads.
If one search engine is ranking you well in the organic search results for an important keyword and another is not, then you should bid more aggressively on your ads on the search engine which is ranking your site, because by voting with your budget you may well be voting on which underlying relevancy algorithm is chosen to deliver the associated organic search results accompanying the ads.
That last point was important & I haven’t seen it mentioned anywhere yet, so it is worth repeating: your PPC ad bids may determine which search relevancy algorithm drives Yahoo! Search organic results.
Time to Quit Digging & Drop The Shovel
The other (BIG) issue is that as they give Google more search marketshare they give Google more granular data, which in turn means they
- make buying on their own network less worthy of the management cost & complexity
- make Google more of a “must buy”
- will never close the monetization gap with Google
Even today Google announced a new tool for offering advertisers granular localized search data. Search partners won’t directly benefit from those tools.
The old problem with Yahoo! was they were heavily reliant on search partners who drove down the traffic value. The future problem may well be that if the marginally profitable Bing leaves the search market, Google will drive down the amount of revenue it shares with Yahoo!.
If the Yahoo! Google search deal gets approved, Bing might shift back to losing money unless Microsoft buys Yahoo! after the Alibaba share spin out.
Ever track how Google’s TAC has shifted over the past decade?
It has only been a decade so far, but MAYBE THIS TIME IS DIFFERENT.
Virtual Real Estate Virtually Disappears
Back in 2009 Google executives were scared of not being able to retain talent with stock options after Google’s stock price cratered with the rest of the market & Google’s ad revenue growth rate slid to zero. That led them to reprice employee stock options. That is as close as Google has come to a “near death” experience since their IPO. They’ve consistently grown & become more dominant.
In November of 2009 I cringed when I saw the future of SEO in Google SERPs where the organic results were outright displaced & even some of the featured map listings had their phone numbers removed.
Investing in Search
In 2012 a Googler named Jon Rockway was more candid than Googlers are typically known for being: “SEO isn’t good for users or the Internet at large. … It’s a bug that you could rank highly in Google without buying ads, and Google is trying to fix the bug.”
It isn’t surprising Google greatly devalued keyword domain names & hit sites like eHow hard. And it isn’t surprising Demand Media is laying off staff and is rumored to be exploring selling their sites. If deleting millions of articles from eHow doesn’t drive a recovery, how much money can they lose on the rehab project before they should just let it go?
“If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is break their spirits.” – Matt Cutts
Through a constant ex-post-facto redefinition of “what is spam” to include most anything which is profitable, predictable & accessible, Google engineers work hard to “deny people money.”
Over time SEO became harder & less predictable. The exception being Google investments like Thumbtack, in which case others’ headwind became your tailwind & a list of techniques declared off limits became a strategy guidebook.
Communications got worse, Google stopped even pretending to help the ecosystem, and they went so far as to claim that even asking for a link was spam. All the while, as they were curbing third party investment into the ecosystem (“deny them money”), they worked on PR for their various investments & renamed the company from Google to Alphabet so they could expand their scope of investments.
“We also like that it means alpha‑bet (Alpha is investment return above benchmark), which we strive for!” – Larry Page
From Do/Know/Go to Scrape/Displace/Monetize
It takes a lot of effort & most people are probably too lazy to do it, but if you look at the arc of Google’s patents related to search quality, many of the early ones revolved around links. Then many focused on engagement related signals. Chrome & Android changed the pool of signals Google had access to. Things like Project Fi, Google Fiber, Nest, and Google’s new OnHub router give them more of that juicy user data. Many of their recently approved patents revolve around expanding the knowledge graph so that they may outright displace the idea of a neutral third party result set for an increasing share of the overall search pie.
Searchers can instead get bits of “knowledge” dressed in various flavors of ads.
This sort of displacement is having a significant impact on a variety of sites. But for most it is a slow bleed rather than an overnight sudden shift. In that sort of environment, even volunteer run sites will eventually atrophy. They will have fewer new users, and as some of the senior people leave, eventually fewer will rise through the ranks. Or perhaps a greater share of the overall ranks will be driven by money.
Jimmy Wales stated: “It is also false that ‘Wikipedia thrives on clicks,’ at least as compared to ad-revenue driven sites… The relationship between ‘clicks’ and the things we care about: community health and encyclopedia quality is not nothing, but it’s not as direct as some think.”
Most likely the relationship *is* quite direct, but there is a lagging impact. Today’s major editors didn’t join the site yesterday & take time to rise through the ranks.
As the big sites become more closed off the independent voices are pushed aside or outright disappear.
If Google works hard enough at prioritizing “deny people money” as a primary goal, then they will eventually get an index quality that reflects that lack of payment. Plenty of good looking & well-formatted content, but a mix of content which:
- is monetized indirectly & in ways which are not clearly disclosed
- has interstitial ads and slideshows where the ads look like the “next” button & the “next” button is colored the same color as the site’s background
- is done as “me too” micro-reporting with no incremental analysis
- is algorithmically generated
Celebrating Search “Innovation”
There has been a general pattern in search “innovation.” Google introduces a new feature, pitches it as the next big thing, gets people to adopt it, collects data on the feature’s impact, clamps down & only selectively allows it, perhaps removes the feature outright from organic search results, then permanently adds the feature to their ad units.
This sort of pattern has happened so many times it is hard to count.
Google put faces in search results for authorship & to promote Google+. Google realized Google+ was a total loser & disconnected it. Then new ad units for local services showed faces in the search results. What was distracting noise was removed, then re-introduced as part of an ad unit.
I’m confused didn’t Google pull authorship cuz faces in the SERPS was a bad experience for the users? pic.twitter.com/mI2NdyGzd7— Michael Gray (@graywolf) July 31, 2015
The same sort of deal exists elsewhere. Google acquires YouTube, launches universal search, offers video snippets, increases the size of video snippets. Then video snippets get removed from most listings “because noise.” YouTube gets enlarged video snippets. And, after removing the “noise” of video stills, Google is exploring video ads in the search results.
Some sites which bundle software got penalized in organic search and are not even allowed to buy AdWords ads. At an extreme, sites which bundled no software, but simply didn’t link to an End User License Agreement (EULA) from the download page, were penalized. Which leads to uncomfortable conversations like this one:
Google Support: I looked through this, and it seemed that one of the issues was a lack of an End User Agreement (EULA)
Simtec: An EULA is displayed by the setup program before installing starts. Also, the end user license agreements are linked to from here http://www.httpwatch.com/buy/orderingfaq.aspx#licensetypes
Google Support: Hmm, They do want it on the download page itself
Simtec: How come there isn’t one here? google.co.uk/chrome/browser/desktop/
Google Support: LOL
Simtec: No really?
Google Support: That’s a great question
Of course, it goes without saying that much of the Google Chrome install base came from negative option software bundling on Adobe Flash security updates.
Google claimed helpful hotel affiliate sites should be rated as spam, then they put their own affiliate ads in hotel search results & even recommended hotel searches in the knowledge graph on city name searches.
Google created a penalty for sites which have an ad heavy interface. Many of Google’s search results are nothing but ads for the entire first screen.
Google search engineers have recently started complaining about interstitial ads & suggested they might create a “relevancy” signal based on users not liking those. At the same time, an increasing number of YouTube videos have unskippable pre-roll ads. And the volume of YouTube ad views is so large that it is heavily driving down Google’s aggregate ad click price. On top of this, Google also offers a survey tool which publishers can lock content behind & requires users to answer a question before they can see the full article they just saw ranking in the search results.
“Everything is possible, but nothing is real.” – Living Colour
Blue Ocean Opportunity
Amid the growing ecosystem instability & increasing hypocrisy, there have perhaps been only a couple “blue ocean” areas left in organic search: local search & brand.
And it appears Google might be well on their way in trying to take those away.
For years brand has been the solution to almost any SEO problem.
I wonder how many SEOs working for big brands have done absolutely nothing of value since 2012 yet still look like geniuses to executives.— Ross Hudgens (@RossHudgens) August 7, 2015
But Google has been increasing the cost of owning a brand. They are testing ways to drive branded search clicks through more expensive ad formats like PLAs & they have been drastically increasing brand CPCs on text ads. And while that second topic has recently gained broader awareness, it has been a trend for years now: “Over the last 12 months, Brand CPCs on Google have increased 80%” – George Michie, July 30, 2013.
There are other subtle ways Google has attacked brand, including:
- penalties on many of the affiliates of those brands
- launching their own vertical search ad offerings in key big-money verticals
- investing billions in “disruptive” start ups which are exempt from the algorithmic risks other players must deal with
- allowing competitors to target competing brands not only within the search results, but also as custom affinity audiences
- linking to competing businesses in the knowledge graph
Google has recently dialed up monetization of local search quite aggressively as well. I’ve long highlighted how mobile search results are ad heavy & have grown increasingly so over time. Google has recently announced call only ad formats, a buy button for mobile ads, local service provider ads, appointment scheduling in the SERPs, direct hotel booking, etc.
And, in addition to all the above new ad formats, recently it was noticed Google is now showing 3 ads on mobile devices even for terms without much commercial intent, like [craft beer].
I like how this new-ish search box takes up a ton of space and puts all these ads right in the prime viewing area pic.twitter.com/CcYdW118gf— Jared McKiernan (@jaredmckiernan) August 18, 2015
Now that the mobile search interface is literally nothing but ads above the fold, early data shows a significant increase in mobile ad clicks. Of course it doesn’t matter if there are 2 or 3 ads, if Google shows ad extensions on SERPs with only 2 ads to ensure they drive the organic results “out of sight, out of mind.”
Earlier this month it was also noticed Google replaced 7-pack local results with 3-pack local results for many more search queries, even on desktop search results. On some of these results they only show a call button, on others they show links to sites. It is a stark contrast to the vast array of arbitrary (and even automated) ad extensions in AdWords.
Why would they determine users want to see links to the websites & the phone numbers, then decide overnight users don’t want those?
Why would Google determine for many years that 7 is a good number of results to show, and then overnight shift to showing 3?
If Google listed 7 ads in a row people might notice the absurdity of it and complain. But if Google only shows 3 results, then they can quickly convert it into an ad unit with little blowback.
You don’t have to be a country music fan to know the Austin SEO limits in a search result where the local results are now payola.
Try not to hurt your back while looking down for the organic search results!
Here are two tips to ensure any SEO success isn’t ethereal: don’t be nearby, and don’t be a business. :D
Virtual Real Estate Virtually Disappears
Back in 2009 Google executives were scared of not being able to retain talent with stock options after Google’s stock price cratered with the rest of the market & Google’s ad revenue growth rate slid to zero. That led them to reprice employee stock options. That is as close as Google has come to a “near death” experience since their IPO. They’ve consistently grown & become more dominant.
In November of 2009 I cringed when I saw the future of SEO in Google SERPs where the organic results were outright displaced & even some of the featured map listings had their phone numbers removed.
Investing in Search
In 2012 a Googler named Jon Rockway was more candid than Googlers are typically known for being: “SEO isn’t good for users or the Internet at large. … It’s a bug that you could rank highly in Google without buying ads, and Google is trying to fix the bug.”
It isn’t surprising Google greatly devalued keyword domain names & hit sites like eHow hard. And it isn’t surprising Demand Media is laying off staff and is rumored to be exploring selling their sites. If deleting millions of articles from eHow doesn’t drive a recovery, how much money can they lose on the rehab project before they should just let it go?
“If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is break their spirits.” – Matt Cutts
Through a constant ex-post-facto redefinition of “what is spam” to include most anything which is profitable, predictable & accessible, Google engineers work hard to “deny people money.”
Over time SEO became harder & less predictable. The exception was Google investments like Thumbtack, in which case others’ headwind became your tailwind & a list of techniques declared off limits became a strategy guidebook.
Communications got worse, Google stopped even pretending to help the ecosystem, and they went so far as claiming that even asking for a link was spam. All the while, as they were curbing third party investment into the ecosystem (“deny them money”), they worked on PR for their various investments & renamed the company from Google to Alphabet so they could expand their scope of investments.
“We also like that it means alpha‑bet (Alpha is investment return above benchmark), which we strive for!” – Larry Page
From Do/Know/Go to Scrape/Displace/Monetize
It takes a lot of effort & most people are probably too lazy to do it, but if you look at the arc of Google’s patents related to search quality, many of the early ones revolved around links. Then many focused on engagement related signals. Chrome & Android changed the pool of signals Google had access to. Things like Project Fi, Google Fiber, Nest, and Google’s new OnHub router give them more of that juicy user data. Many of their recently approved patents revolve around expanding the knowledge graph so that they may outright displace the idea of having a neutral third party result set for an increasing share of the overall search pie.
Searchers can instead get bits of “knowledge” dressed in various flavors of ads.
This sort of displacement is having a significant impact on a variety of sites. But for most it is a slow bleed rather than an overnight sudden shift. In that sort of environment, even volunteer run sites will eventually atrophy. They will have fewer new users, and as some of the senior people leave, eventually fewer will rise through the ranks. Or perhaps a greater share of the overall ranks will be driven by money.
Jimmy Wales stated: “It is also false that ‘Wikipedia thrives on clicks,’ at least as compared to ad-revenue driven sites… The relationship between ‘clicks’ and the things we care about: community health and encyclopedia quality is not nothing, but it’s not as direct as some think.”
Most likely the relationship *is* quite direct, but there is a lagging impact. Today’s major editors didn’t join the site yesterday & take time to rise through the ranks.
As the big sites become more closed off the independent voices are pushed aside or outright disappear.
If Google works hard enough at prioritizing “deny people money” as a primary goal, then they will eventually get an index quality that reflects that lack of payment. Plenty of good looking & well-formatted content, but a mix of content which:
- is monetized indirectly & in ways which are not clearly disclosed
- has interstitial ads and slideshows where the ads look like the “next” button & the “next” button is colored the same color as the site’s background
- is done as “me too” micro-reporting with no incremental analysis
- is algorithmically generated
Celebrating Search “Innovation”
There has been a general pattern in search innovation. Google introduces a new feature, pitches it as being the next big thing, gets people to adopt it, collects data on the impact of the feature, clamps down on selectively allowing it, perhaps removes the feature outright from organic search results, permanently adds the feature to their ad units.
This sort of pattern has happened so many times it is hard to count.
Google put faces in the search results for authorship & to promote Google+. Then Google realized Google+ was a total loser & disconnected it, and now new ad units for local services show faces in the search results. What was distracting noise was removed, then re-introduced as part of an ad unit.
I’m confused didn’t Google pull authorship cuz faces in the SERPS was a bad experience for the users? pic.twitter.com/mI2NdyGzd7— Michael Gray (@graywolf) July 31, 2015
The same sort of deal exists elsewhere. Google acquires YouTube, launches universal search, offers video snippets, increases size of video snippets. Then video snippets get removed from most listings “because noise.” YouTube gets enlarged video snippets. And, after removing the “noise” of video stills in the search results Google is exploring testing video ads in the search results.
Some sites which bundle software got penalized in organic search and are not even allowed to buy AdWords ads. At an extreme, sites which bundled no software, but simply didn’t link to an End User License Agreement (EULA) from the download page, were penalized. Which leads to uncomfortable conversations like this one:
Google Support: I looked through this, and it seemed that one of the issues was a lack of an End User Agreement (EULA)
Simtec: An EULA is displayed by the setup program before installing starts. Also, the end user license agreements are linked to from here http://www.httpwatch.com/buy/orderingfaq.aspx#licensetypes
Google Support: Hmm, They do want it on the download page itself
Simtec: How come there isn’t one here? google.co.uk/chrome/browser/desktop/
Google Support: LOL
Simtec: No really?
Google Support: That’s a great question
Of course, it goes without saying that much of the Google Chrome install base came from negative option software bundling on Adobe Flash security updates.
Google claimed helpful hotel affiliate sites should be rated as spam, then they put their own affiliate ads in hotel search results & even recommended hotel searches in the knowledge graph on city name searches.
Google created a penalty for sites which have an ad heavy interface. Many of Google’s search results are nothing but ads for the entire first screen.
Google search engineers have recently started complaining about interstitial ads & suggested they might create a “relevancy” signal based on users not liking those. At the same time, an increasing number of YouTube videos have unskippable pre-roll ads, and the volume of YouTube ad views is so large that it is heavily driving down Google’s aggregate ad click price. On top of this, Google also offers a survey tool which publishers can lock content behind, requiring users to answer a question before they can see the full article they just saw ranking in the search results.
“Everything is possible, but nothing is real.” – Living Colour
Blue Ocean Opportunity
Amid the growing ecosystem instability & increasing hypocrisy, there have perhaps been only a couple “blue ocean” areas left in organic search: local search & brand.
And it appears Google might be well on their way in trying to take those away.
For years brand has been the solution to almost any SEO problem.
I wonder how many SEOs working for big brands have done absolutely nothing of value since 2012 yet still look like geniuses to executives.— Ross Hudgens (@RossHudgens) August 7, 2015
But Google has been increasing the cost of owning a brand. They are testing other ad formats to drive branded search clicks through more expensive ad formats like PLAs & they have been drastically increasing brand CPCs on text ads. And while that second topic has recently gained broader awareness, it has been a trend for years now: “Over the last 12 months, Brand CPCs on Google have increased 80%” – George Michie, July 30, 2013.
There are other subtle ways Google has attacked brand, including:
- penalties on many of the affiliates of those brands
- launching their own vertical search ad offerings in key big-money verticals
- investing billions in “disruptive” startups which are exempt from the algorithmic risks other players must deal with
- allowing competitors to target competing brands not only within the search results, but also as custom affinity audiences
- linking to competing businesses in the knowledge graph
Google has recently dialed up monetization of local search quite aggressively as well. I’ve long highlighted how mobile search results are ad heavy & have grown increasingly so over time. Google has recently announced call only ad formats, a buy button for mobile ads, local service provider ads, appointment scheduling in the SERPs, direct hotel booking, etc.
And, in addition to all the above new ad formats, recently it was noticed Google is now showing 3 ads on mobile devices even for terms without much commercial intent, like [craft beer].
I like how this new-ish search box takes up a ton of space and puts all these ads right in the prime viewing area pic.twitter.com/CcYdW118gf— Jared McKiernan (@jaredmckiernan) August 18, 2015
Now that the mobile search interface is literally nothing but ads above the fold, early data shows a significant increase in mobile ad clicks. Of course, it matters little whether there are 2 or 3 ads if Google shows ad extensions on SERPs with only 2 ads to ensure they drive the organic results “out of sight, out of mind.”
Earlier this month it was also noticed Google replaced 7-pack local results with 3-pack local results for many more search queries, even on desktop search results. On some of these results they only show a call button, on others they show links to sites. It is a stark contrast to the vast array of arbitrary (and even automated) ad extensions in AdWords.
Why would they determine users want to see links to the websites & the phone numbers, then decide overnight users don’t want those?
Why would Google determine for many years that 7 is a good number of results to show, and then overnight shift to showing 3?
If Google listed 7 ads in a row people might notice the absurdity of it and complain. But if Google only shows 3 results, then they can quickly convert it into an ad unit with little blowback.
You don’t have to be a country music fan to know the Austin SEO limits in a search result where the local results are now payola.
Try not to hurt your back while looking down for the organic search results!
Here are two tips to ensure any SEO success isn’t ethereal: don’t be nearby, and don’t be a business. :D
Web Design Resources for Non-Designers
Most of you are too busy monitoring Google’s latest algorithm updates, examining web analytics, and building links and content to stay up to date on the design world.
Usually, creative people who excel at design aren’t very good at the left-brain thinking required to succeed in the highly-technical search engine optimization industry. Likewise, very few people with the analytical mindset required for search engine optimization would do well in the free-spirited design industry.
Unfortunately, in the real world, you’re often expected to do exactly that. And while most people understand that it would be ludicrous to expect their doctor to also troubleshoot their plumbing, they don’t seem to understand why they shouldn’t expect the person responsible for their SEO to also handle their design needs from time to time.
So you’re often forced to design things for your clients. Or sometimes, you just need to whip up something for yourself instead of trying to find someone who can deliver what you need on Fiverr.
Since you probably won’t start sporting a black turtleneck and talking about crop marks, press checks, or CMYK colors anytime soon, it seems silly to shell out thousands of dollars on software you’ll only use occasionally, so I’ve compiled a list of design resources for non-designers.
The resources in this list are every bit as powerful as any of the professional-grade software, but they are free. (Some do offer premium versions with more options.) The only downside is that it might be a little bit tougher to find tutorials for some of these programs compared to the industry standard software like Adobe Photoshop or Illustrator.
Image editing
We all need to edit and create images from time to time, but if you only do it occasionally, software like Adobe Photoshop and Illustrator works out to be pretty expensive. Fortunately, there are several feature-rich image editing programs available.
- Gimp – Anything you can do with Photoshop can be done with Gimp, and it runs on Windows, Mac, and Linux. The learning curve can be steep, but it’s worth the time.
- Pixlr – If you’re used to Photoshop, this program has a very similar interface, and it even opens native .psd files with the original layers intact.
- Canva – The drag-and-drop interface of this web-based design program makes graphic design quick and simple, plus it comes with a library of over one million professional stock images.
- Inkscape – Easily create illustrations, logos, technical drawings, and vector images with this free alternative to Illustrator.
- SVG Editor – If you’re obsessed with website speed, you probably love SVGs (scalable vector graphics), and this handy tool from Google makes it easy to create and edit them.
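Part of why SVGs are so friendly to page speed is that under the hood they are just XML text, which also means you can generate or tweak them in a few lines of code. A minimal sketch in Python (the shapes and values here are purely illustrative):

```python
import xml.etree.ElementTree as ET

# A minimal SVG is plain XML: a 100x100 canvas containing one blue circle.
svg = (
    '<svg xmlns="http://www.w3.org/2000/svg" width="100" height="100">'
    '<circle cx="50" cy="50" r="40" fill="#3366cc"/>'
    '</svg>'
)

# Because it is XML, it can be parsed and edited programmatically --
# here we bump the circle's radius, the kind of tweak a visual SVG
# editor automates behind the scenes.
root = ET.fromstring(svg)
circle = root.find('{http://www.w3.org/2000/svg}circle')
circle.set('r', '45')

print(ET.tostring(root, encoding='unicode'))
```

Saved with an `.svg` extension, the printed markup is a complete image file any modern browser can render at any size without quality loss.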
3D
OK, so you’re not going to compete with Pixar anytime soon, but 3D capabilities do come in handy for designing mockups of books and DVDs, creating characters, and even complete photorealistic animations.
- Online 3d Package – This tool lets you quickly and easily create photorealistic mockups of books, boxes, DVDs, and CDs.
- Blender – If you occasionally need to create 3D renderings but can’t justify spending big bucks for professional-grade software that you’ll only use a few times, Blender is the perfect (and free) alternative.
Web design
Designing a website requires a blend of creative and technical skills. Fortunately, there are plenty of tools available to efficiently complete both. From the pretty parts, to the nuts and bolts, to the little details, here is everything you’ll need:
Palette generator – Upload an image and this tool will generate the perfect color palette to complement it, which you can download as a CSS file.
Subtle Patterns – Creating seamless backgrounds can be a pain, so instead of starting from scratch, just download from over 400 high-quality seamless background images, including textures and patterns.
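The core idea behind a palette generator is simple: tally the most common colors among an image’s pixels and emit them in a reusable format such as CSS. A hedged sketch in Python — real tools decode the image file first, while here the pixel data and the `--palette-*` variable names are made up for illustration:

```python
from collections import Counter

# Stand-in pixel data; a real palette tool would read these from an image.
pixels = [
    (250, 250, 250), (250, 250, 250), (250, 250, 250),  # near-white
    (40, 40, 40), (40, 40, 40),                          # dark gray
    (200, 30, 30),                                       # red accent
]

def palette_css(pixels, n=3):
    """Return the n most common colors as CSS custom properties."""
    top = [color for color, _ in Counter(pixels).most_common(n)]
    lines = [":root {"]
    for i, (r, g, b) in enumerate(top, start=1):
        lines.append(f"  --palette-{i}: #{r:02x}{g:02x}{b:02x};")
    lines.append("}")
    return "\n".join(lines)

print(palette_css(pixels))
```

The output is a ready-to-include stylesheet fragment, which is essentially what the downloadable CSS file from such a tool contains.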
Web page editors
Whether you’re building a website from scratch with a WYSIWYG editor or fine-tuning the code on an existing website with an HTML editor, web design software will probably get a lot of use in your hands. If you have the technical chops to hand code your websites, that’s ideal, but if not, or if you just don’t want to, here are several options:
- Kompozer – With a WYSIWYG editor in one tab and raw HTML in the other, on-the-fly editing, and built-in FTP, Kompozer makes creating and editing web pages a breeze.
- Google Web Designer – Build HTML5-based designs and motion graphics that can run on any device without writing any code! (If you want to get your hands dirty, you can edit all HTML and CSS by hand.)
- Expression Web – Microsoft offers another free web page editor which has made significant improvements since that abomination called FrontPage.
Favicon Generator – A truly polished website needs consistent branding throughout, and that means all the little details, including a favicon—that tiny little image that sits in the tab or bookmarks. Just upload an image file, such as your logo, and this handy tool will spit out the .ico files you need.
Web Developer Toolbar – This browser toolbar is available for Firefox and Chrome, and helps you troubleshoot your website and even test it at various screen sizes.
Infographics
Infographics are still an effective method to earn social shares and links, and they are a great way to present a lot of data-rich information, but they can be a pain to create. Here are several tools to simplify the process that might even be better (and easier) than traditional design software.
- Infogram – Build beautiful data-driven infographics in just three steps with this free tool.
- Piktochart – With a simple point and click editor and over 4,000 graphics, icons, and template files, Piktochart makes it easy to create infographics that look exactly the way you want.
- Easel.ly – Loaded with tons of creative templates and an easy-to-use interface, this is another powerful tool to create your own stunning infographics.
- Venngage – This drag and drop interface provides all the charts, maps, icons and templates you’ll need to design attention-grabbing infographics.
- Vizualize.me – Turn your boring resume into a unique visual expression of your skills and experience to stand out from the crowd.
If you are in a saturated market or have a great idea you are certain will be a success, then it may make sense to splash out for a custom designed graphic, but in less competitive markets some of the above quick-n-easy tools can still be remarkably effective.
Data visualization
Google Charts is a great way to create all sorts of charts, and the best part is that you can create them on the fly by passing variables in the URL.
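Building a chart by passing variables in the URL amounts to assembling a query string: the chart type, size, data, and labels each live in their own parameter. A sketch in Python, using parameter names in the style of Google’s legacy image chart endpoint (treat the exact endpoint and parameter set as an assumption to verify against current documentation):

```python
from urllib.parse import urlencode

# Each chart attribute is just a query-string variable.
params = {
    "cht": "p3",        # chart type: 3D pie
    "chs": "250x100",   # size in pixels (width x height)
    "chd": "t:60,40",   # data series, text encoding
    "chl": "Yes|No",    # slice labels
}
url = "https://chart.googleapis.com/chart?" + urlencode(params)
print(url)
```

Dropping the resulting URL into an `<img>` tag is all it takes to render the chart, which is why this approach is so convenient for quick, on-the-fly visualizations.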
Typography
Today you have plenty of options when it comes to font choices, so please stop using Arial, and for the love of all that is good, never use Comic Sans or I will hunt you down. You can choose from thousands of free fonts, so it’s easy to pick one that fits your project perfectly.
- Typegenius – Choosing the perfect font combo can be tough, but Typegenius makes it easy. Just pick a starter font from the drop-down list and the site will recommend fonts that pair well with it.
- Google Fonts – I recommend embedding Google fonts instead of hosting them on your own server because they load more quickly and there is a chance they’re already cached on visitors’ computers.
- Font Awesome – This is an awesome (hence the name) way to add all sorts of scalable icons without a load of extra HTTP requests. Simply load one font for access to 519 icons that can be colored, scaled, and styled with CSS.
- DaFont – Download and install these fonts (.ttf or .otf formats) for designing documents or images on your computer.
- What the Font – If you’ve ever experienced the rage-inducing task of figuring out what font was used when your client only has a 72dpi jpg and no idea how to track down their previous designer, then this is the tool for you. Just upload your image and it goes to work figuring out what font it is.
Social media
Social media can multiply your website’s exposure exponentially, but it takes a lot of work. From branding profiles on each network to crafting engaging visual content your fans will share, you’ll have to create a lot of graphics to feed the beast. Doing that manually, the old-fashioned way is tedious and slow, so I recommend these tools to speed up your workflow.
Easy Cover Maker – Stop wasting time trying to position your cover and profile photo for your Facebook and Google+ page. This tool lets you drag everything into position in one handy interactive window, then download the image files.
Quote creators
- Quotes Cover – Just select a quote or enter your own text, apply various effects for your own unique style, and download eye catching pictures perfect for social media. It even creates the perfect dimensions based on how you intend to use it.
- Chisel – This tool has the most user-friendly interface and tons of great images and fonts to create the exact message you want to share.
- Recite This – There are plenty of images and fonts available, but the downside is you have to scroll through images one at a time, and fonts are selected randomly.
Jing – From the makers of Camtasia, this free program gives you the ability to capture images or video (up to 5 minutes long) of your computer screen, then share it with the click of a button.
Social Kit – Create cover images, profile pictures, and ad banners for Facebook, Twitter, Google+, and YouTube with this free, up-to-date Photoshop plugin.
Social media image size guide – The folks over at Sprout Social created (and maintain) this handy and comprehensive Google doc listing the image sizes for all major social media networks, and since it’s a shared document, you can have Google notify you anytime it’s updated!
Meme creators
Instead of wasting time searching for the perfect meme, why not just create your own?
Stock photos
Powerful photos can mean the difference between a dry post that visitors ignore and one that entices them to read more. The good news is you don’t have to take your own photos or spend a fortune on stock photos because there are several free and low-cost options available.
- Unsplash – These are not your typical cheesy stock photos; they lean more towards the artistic side. New photos are uploaded every day and they’re all 100% free.
- StockVault – With over 54 thousand free images available, both artistic and corporate-style, you should be able to find the perfect photo for just about any project.
- Dreamstime & iStockPhoto – Both of these sites give you the option of a subscription model or pay-as-you-go credits. Many images on one are available on the other, but I’ve found great images that were only on one of the two sites, so it’s worthwhile to check both.
Inspiration
Even the best designers hit a wall, creatively speaking, so it helps to look for inspiration. These sites curate the best designs around and are updated regularly, so you’ll find plenty of fresh ideas for your project.
Tutorials
Since your days are filled with keyword research, content development, link building, and other SEO-related tasks, you probably don’t have time to stay up-to-date on the latest design trends and techniques. No worries—with these websites, you’ll be able to find a tutorial to walk you through just about any design challenge.
- CSS-Tricks – Whenever I have a CSS question, I always slap “css tricks” on the end of my search because Chris Coyier has the most detailed, yet easy-to-understand tutorials on damn near every scenario you could imagine.
- Tuts+ – Learn everything about graphic design, web design, programming, and more with a growing library of articles and tutorials.
- Smashing Magazine – This is probably one of the most comprehensive web design resources you’ll find anywhere, going wide and deep on every aspect of web design.
About the Author
Jeremy Knauff is the founder of Spartan Media, a proud father, husband, and US Marine Corps veteran. He has spent over 15 years helping businesses from small shops up to Fortune 500 companies make their mark online, and now he’s busy building his own media empire. You can follow Spartan Media on Twitter and Facebook.
Web Design Resources for Non-Designers
Most of you are too busy monitoring Google’s latest algorithm updates, examining web analytics, and building links and content to stay up to date on the design world.
Usually, creative people who excel at design aren’t very good at the left-brain thinking required to succeed in the highly-technical search engine optimization industry. Likewise, very few people with the analytical mindset required for search engine optimization would do well in the free-spirited design industry.
Unfortunately, in the real world, you’re often expected to do exactly that. And while most people understand that it would be ludicrous to expect their doctor to also troubleshoot their plumbing, they don’t seem to understand why they shouldn’t expect the person responsible for their SEO to also handle their design needs from time to time.
So you’re often forced to design things for your clients from time to time. Or sometimes, you just need to whip up something for yourself instead of trying to find someone who can deliver what you need on Fiverr.
Since you probably won’t start sporting a black turtleneck and talking about crop marks, press checks, or CMYK colors anytime soon, it seems silly to shell out thousands of dollars on software you’ll only use occasionally, so I’ve compiled a list of design resources for non-designers.
The resources in this list are every bit as powerful as any of the professional-grade software, but they are free. (Some do offer premium versions with more options.) The only downside is that it might be a little bit tougher to find tutorials for some of these programs compared to the industry standard software like Adobe Photoshop or Illustrator.
Image editing
We all need to edit and create images from time to time, but if you only do it occasionally, software like Adobe Photoshop and Illustrator works out to be pretty expensive. Fortunately, there are several feature-rich image editing programs available.
- Gimp – Anything you can do with Photoshop can be done with Gimp, and it runs on Windows, Mac, and Linux. The learning curve can be steep, but it’s worth the time.
- Pixlr – If you’re used to Photoshop, this program has a very similar interface, and it even opens native .psd files with the original layers intact.
- Canva – The drag-and-drop interface of this web-based design program make graphic design quick and simple, plus it comes with a library of over one million professional stock images.
- Inkscape – Easily create illustrations, logos, technical drawings, and vector images with this free alternative to Illustrator.
- SVG Editor – If you’re obsessed about website speed, you probably love SVGs (scalable vector graphics) and this handy tool from Google make it easy to create and edit them.
3D
OK, so you’re not going to compete with Pixar anytime soon, but 3D capabilities do come in handy for designing mockups of books and DVDs, creating characters, and even complete photorealistic animations.
- Online 3d Package – This tool lets you quickly and easily create photorealistic mockups of books, boxes, DVDs, and CDs.
- Blender – If you occasionally need to create 3D renderings but can’t justify spending big bucks for professional-grade software that you’ll only use a few times, Blender is the perfect (and free) alternative.
Web design
Designing a website requires a blend of creative and technical skills. Fortunately, there are plenty of tools available to efficiently complete both. From the pretty parts, to the nuts and bolts, to the little details, here is everything you’ll need:
Palette generator – Upload an image and this tool will generate the perfect color palette to compliment it that you can download as a CSS file.
Subtle Paterns – Creating seamless backgrounds can be a pain, so instead of starting from scratch, just download from over 400 high-quality seamless background images, including textures and patterns.
Web page editors
Whether you’re building a website from scratch with a WYSIWYG editor or fine-tuning the code on an existing website with an HTML editor, web design software will probably get a lot of use in your hands. If you have the technical chops to hand code your websites, that’s ideal, but if not, or if you just don’t want to, here are several options:
- Kompozer – With a WYSIWYG editor in one tab and raw HTML in the other, on-the-fly editing with built-in FTP, Kompozer will make creating and editing web page a breeze.
- Google Webdesigner – Build HTML5-based designs and motion graphics that can run on any device without writing any code! (If you want to get your hands dirty, you can edit all HTML and CSS by hand.
- Expression Web – Microsoft offers another free web page editor which has made significant improvements since that abomination called Frontpage.
Favicon Generator – A truly polished website needs consistent branding throughout, and that means all the little details, including a favicon—that tiny little image that sits in the tab or bookmarks. Just upload an image file, such as your logo, and this handy tool will spit out the .ico files you need.
Web Developer Toolbar – This browser toolbar is available for Firefox and Chrome, and helps you troubleshoot your website and even test it at various screen sizes.
Infographics
Infographics are still an effective method to earn social shares and links, and they are a great way to present a lot of data-rich information, but they can be a pain to create. Here are several tools to simplify the process that might even be better (and easier) than traditional design software.
- Infogram – Build beautiful data-driven infographics in just three steps with this free tool.
- Piktochart – With a simple point and click editor and over 4,000 graphics, icons, and template files, Piktochart makes it easy to create infographics that look exactly the way you want.
- Easel.ly – Loaded with tons of creative templates and an east-to-use interface, this is another powerful tool to create your own stunning infographics.
- Venngage – This drag and drop interface provides all the charts, maps, icons and templates you’ll need to design attention-grabbing infographics.
- Vizualize.me – Turn your boring resume into a unique visual expression of your skills and experience to stand out from the crowd.
If you are in a saturated market or have a great idea you are certain will be a success then it may make sense to splash out for a custom designed graphic, but in less competitive market some of the above quick-n-easy tools can still be remarkably effective.
Data visualization
Google Charts is a great way to create all sorts of charts, and the best part is that you can create them on the fly by passing variables in the URL.
Typography
Today you have plenty of options when it comes to font choices, so please stop using Arial, and for the love of all that is good, never use Comic Sans or I will hunt you down. You can choose from thousands of free fonts, so it’s easy to pick one that fits your project perfectly.
- Typegenius – Choosing the perfect font combo can be tough, but Typegenious makes it easy. Just pick a starter font from the drop down list and the site will recommend fonts that pair well with it.
- Google Fonts – I recommend embedding Google fonts instead hosting them on your own server because they load more quickly and there is a chance they’re already cached on visitors’ computers.
- Font Awesome – This is an awesome (hence the name) way to add all sorts of scalable icons without a load of extra http requests. Simply load one font for access to 519 icons that colored, scaled, and styled with CSS.
- DaFont – Download and instal these fonts (.ttf or .otf formats) for designing documents or images on your computer.
- What the Font – If you’ve ever experienced the rage-inducing task of figuring out what font was used when your client only has a 72dpi jpg and no idea how to track down their previous designer, then this is the tool for you. Just upload your image and it goes to work figuring what font it is.
Social media
Social media can multiply your website’s exposure exponentially, but it takes a lot of work. From branding profiles on each network to crafting engaging visual content your fans will share, you’ll have to create a lot of graphics to feed the beast. Doing that manually, the old-fashioned way, is tedious and slow, so I recommend these tools to speed up your workflow.
- Easy Cover Maker – Stop wasting time trying to position your cover and profile photo for your Facebook and Google+ pages. This tool lets you drag everything into position in one handy interactive window, then download the image files.
Quote creators
- Quotes Cover – Just select a quote or enter your own text, apply various effects for your own unique style, and download eye catching pictures perfect for social media. It even creates the perfect dimensions based on how you intend to use it.
- Chisel – This tool has the most user-friendly interface and tons of great images and fonts to create the exact message you want to share.
- Recite This – There are plenty of images and fonts available, but the downside is you have to scroll through images one at a time, and fonts are selected randomly.
- Jing – From the makers of Camtasia, this free program gives you the ability to capture images or video (up to 5 minutes long) of your computer screen, then share it with the click of a button.
- Social Kit – Create cover images, profile pictures, and ad banners for Facebook, Twitter, Google+, and YouTube with this free, up-to-date Photoshop plugin.
- Social media image size guide – The folks over at Sprout Social created (and maintain) this handy and comprehensive Google doc listing the image sizes for all major social media networks, and since it’s a shared document, you can have Google notify you anytime it’s updated!
Meme creators
Instead of wasting time searching for the perfect meme, why not just create your own?
Stock photos
Powerful photos can mean the difference between a dry post that visitors ignore and one that entices them to read more. The good news is you don’t have to take your own photos or spend a fortune on stock photos because there are several free and low-cost options available.
- Unsplash – These are not your typical cheesy stock photos; they lean more towards the artistic side. New photos are uploaded every day and they’re all 100% free.
- StockVault – With over 54 thousand free images available, both artistic and corporate-style, you should be able to find the perfect photo for just about any project.
- Dreamstime & iStockPhoto – Both of these sites give you the option of a subscription model or pay-as-you-go credits. Many images on one are available on the other, but I’ve found great images that were only on one of the two sites, so it’s worthwhile to check both.
Inspiration
Even the best designers hit a wall, creatively speaking, so it helps to look for inspiration. These sites curate the best designs around and are updated regularly, so you’ll find plenty of fresh ideas for your project.
Tutorials
Since your days are filled with keyword research, content development, link building, and other SEO-related tasks, you probably don’t have time to stay up-to-date on the latest design trends and techniques. No worries—with these websites, you’ll be able to find a tutorial to walk you through just about any design challenge.
- CSS-Tricks – Whenever I have a CSS question, I always slap “css tricks” on the end of my search because Chris Coyier has the most detailed, yet easy-to-understand tutorials on damn near every scenario you could imagine.
- Tuts+ – Learn everything about graphic design, web design, programming, and more with a growing library of articles and tutorials.
- Smashing Magazine – This is probably one of the most comprehensive web design resources you’ll find anywhere, going wide and deep on every aspect of web design.
About the Author
Jeremy Knauff is the founder of Spartan Media, a proud father, husband, and US Marine Corps veteran. He has spent over 15 years helping everyone from small businesses to Fortune 500 companies make their mark online, and now he’s busy building his own media empire. You can follow Spartan Media on Twitter and Facebook.
But First, A Word From Our Sponsors…
Yesterday Google shared they see greater mobile than desktop search volumes in 10 countries including Japan and the United States.
3 years ago RKG shared CTR data which highlighted how mobile search ads were getting over double the CTR as desktop search ads.
The basic formula: less screen real estate = higher proportion of user clicks on ads.
Google made a big deal of their “mobilepocalypse” update to scare other webmasters into making their sites mobile friendly. Part of the goal of making sites “mobile friendly” is to ensure they aren’t too ad dense (which in turn lowers accidental ad clicks & lowers monetization). Not only does Google have an “ad heavy” relevancy algorithm which demotes ad-heavy sites, but they also explicitly claim even using a moderate-sized ad unit on mobile devices above the fold is against their policy guidelines:
Is placing a 300×250 ad unit on top of a high-end mobile optimized page considered a policy violation?
Yes, this would be considered a policy violation as it falls under our ad placement policies for site layout that pushes content below the fold. This implementation would take up too much space on a mobile optimized site’s first view screen with ads and provides a poor experience to users. Always try to think of the users experience on your site – this will help ensure that users continue to visit.
So if you make your site mobile friendly you can’t run Google ads above the fold unless you are a large enough publisher that the guidelines don’t actually matter.
Update: Looks like Google updated the (utterly absurd) above policy on May 2, 2017 to now allow ads above the fold on mobile.
If you spend the extra money to make your site mobile friendly, you then must also go out of your way to lower your income.
What is the goal of the above sort of scenario? Defunding content publishers to ensure most of the ad revenues flow to Google.
If you think otherwise, consider the layout of the auto ads & hotel ads Google announced yesterday. Top of the search results, larger than 300×250.
If you do X, you are a spammer. If Google does X, they are improving the user experience.
@aaronwall they will personally do everything they penalize others for doing; penalties are just another way to weaken the market.— Cygnus SEO (@CygnusSEO) May 5, 2015
The above sort of contrast is something noticed by non-SEOs. The WSJ article about Google’s new ad units had a user response stating:
With this strategy, Google has made the mistake of an egregious use of precious mobile screen space in search results. This entails much extra fingering/scrolling to acquire useful results and bypass often not-needed coincident advertising. Perhaps a moneymaker by brute force; not a good idea for utility’s sake.
That content displacement with ads is both against Google’s guidelines and algorithmically targeted for demotion – unless you are Google.
How is that working for Google partners?
According to eMarketer, by 2019 mobile will account for 72% of US digital ad spend. Almost all that growth in ad spend flows into the big ad networks while other online publishers struggle to monetize their audiences:
Facebook and Google accounted for a majority of mobile ad market growth worldwide last year. Combined, the two companies saw net mobile ad revenues increase by $6.92 billion, claiming 75.2% of the additional $9.2 billion that went toward mobile in 2013.
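The percentage in that quote is easy to verify from the two dollar figures it cites:

```python
# Sanity check of the eMarketer figures quoted above: Facebook + Google's
# combined growth in net mobile ad revenue as a share of the total growth
# in mobile ad spend.
combined_growth = 6.92  # $ billions (Facebook + Google, net increase)
total_growth = 9.2      # $ billions (total additional mobile ad spend)
share_pct = round(combined_growth / total_growth * 100, 1)
# share_pct reproduces the 75.2% share claimed in the quote
```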
Back to the data RKG shared. Mobile is where the growth is…
…and the smaller the screen size the more partners are squeezed out of the ecosystem…
The high-intent, high-value search traffic is siphoned off by ads.
What does that leave for the rest of the ecosystem?
It is hard to build a sustainable business when you have to rely almost exclusively on traffic with no commercial intent.
One of the few areas that perhaps still works well is evergreen content with little maintenance cost, but even many of those pockets of opportunity are disappearing due to the combination of the Panda algorithm and Google’s scrape-n-displace knowledge graph.
.@mattcutts I think I have spotted one, Matt. Note the similarities in the content text: pic.twitter.com/uHux3rK57f— dan barker (@danbarker) February 27, 2014
Even companies with direct ad sales teams struggle to monetize mobile:
At The New York Times, for instance, more than half its digital audience comes from mobile, yet just 10% of its digital-ad revenue is attributed to these devices.
Other news websites also get the majority of their search traffic from mobile.
Why do news sites get so much mobile search traffic? A lot of it is navigational & beyond that most of it is on informational search queries which are hard to monetize (and thus have few search ads) and hard to structure into the knowledge graph (because they are about news items which only just recently happened).
If you run a site which isn’t a news site and look at the organic search traffic breakdown in your analytics account, you will likely see a far lower share of search traffic from mobile. This goes back to Google dominating the mobile search interface with ads.
Mobile search ecosystem breakdown
- traffic with commercial intent = heavy ads
- limited commercial intent but easy answer = knowledge graph
- limited commercial intent & hard to answer = traffic flows to news sites
Not only is Google monetizing a far higher share of mobile search traffic, but they are also aggressively increasing minimum bids.
As Google continues to gut the broader web publishing ecosystem, they can afford to throw a few hundred million in “innovation” bribery kickback slush funds. That will earn them some praise in the short term with some of the bigger publishers, but it will make those publishers more beholden to Google. And it is even worse for smaller publishers. It means the smaller publishers are not only competing against algorithmic brand bias, confirmation bias expressed in the remote rater documents, & wholesale result set displacement, but some of their bigger publishing competitors are also subsidized directly by Google.
Ignore the broader ecosystem shifts.
Ignore the hypocrisy.
Focus on the user.
Until you are eating cat food.
Google Mobilepocalypse Update
A day after the alleged major update, I thought it would make sense to highlight where we are at in the cycle.
Yesterday Google suggested their fear messaging caused 4.7% of webmasters to move over to mobile friendly design since the update was origin…
Consensus Bias
The Truth About Subjective Truths
A few months ago there was an article in New Scientist about Google’s research paper on potentially ranking sites based on how factual their content is. The idea is generally and genuinely absurd.
- You can’t copyright facts, which means that if this were a primary ranking signal & people focused on it then they would be optimizing their site to be scraped-n-displaced into the knowledge graph. Some people may sugar coat the knowledge graph and rich answers as opportunity, but it is Google outsourcing the cost of editorial labor while reaping the rewards.
.@mattcutts I think I have spotted one, Matt. Note the similarities in the content text: pic.twitter.com/uHux3rK57f— dan barker (@danbarker) February 27, 2014
- If Google is going to scrape, displace & monetize data sets, then the only ways to really profit are:
- focus on creating the types of content which can’t be easily scraped-n-displaced, or
- create proprietary metrics of your own, such that if they scrape them (and don’t cheat by hiding the source) they are marketing you
- In some areas (especially religion and politics) certain facts are verboten & people prefer things which provide confirmation bias of their pre-existing beliefs. End user usage data creates a “relevancy” signal out of comfortable false facts and personalization reinforces it.
- In some areas well known “facts” are sponsored falsehoods. In other areas some things slip through the cracks.
- In some areas Google changes what is considered fact based on where you are located.
How Google keeps everyone happy pic.twitter.com/KmBBzpfzdf— Amazing Maps (@Amazing_Maps) March 31, 2015
- Those who have enough money can create their own facts. It might be painting the perception of a landscape, hiring thousands of low waged workers to manipulate public perception on key issues and new technologies, or more sophisticated forms of social network analysis and manipulation to manipulate public perceptions.
- The previously mentioned links were governmental efforts. However such strategies are more common in the commercial market. Consider how Google has sponsored academic conferences while explicitly telling the people who put them on to hide the sponsorship as part of their lobbying efforts.
- Then there is the blurry area where government and commerce fuse, like when Google put about a half-dozen home team players in key governmental positions during the FTC investigation of Google. Google claimed lobbying was disgusting until they experienced the ROI firsthand.
- In some areas “facts” are backward looking views of the market which are framed, distorted & intentionally incomplete. There was a significant gap between internal voices and external messaging in the run up to the recent financial crisis. Even large & generally trustworthy organizations have some serious skeletons in their closets.
@mattcutts I wonder, what sort of impact does http://t.co/vdg3ARGSz2 have on their E-A-T? expertise +1, authority +1, trustworthiness -_?— aaron wall (@aaronwall) April 6, 2015
- In other areas the inconvenient facts get washed away over time by money.
For a search engine to be driven primarily by group think (see unity100’s posts here) is the death of diversity.
Less Diversity, More Consolidation
The problem is rarely attributed to Google, but as ecosystem diversity has declined (and entire segments of the ecosystem are unprofitable to service), more people are writing things like: “The market for helping small businesses maintain a home online isn’t one with growing profits – or, for the most part, any profits. It’s one that’s heading for a bloody period of consolidation.”
As companies grow in power the power gets monetized. If you can manipulate people without appearing to do so you can make a lot of money.
If you don’t think Google wants to disrupt you out of a job, you’ve been asleep at the wheel for the past decade— Michael Gray (@graywolf) March 13, 2015
We Just Listen to the Data (Ish)
As Google sucks up more data, aggregates intent, and scrapes-n-displaces the ecosystem they get air cover for some of their gray area behaviors by claiming things are driven by the data & putting the user first.
Those “data” and altruism claims from Google recently fell flat on their face when the Wall Street Journal published a number of articles about a leaked FTC document.
- How Google Skewed Search Results
- Inside the U.S. Antitrust Probe of Google
- Key quotes from the document from the WSJ & more from Danny Sullivan
- The PDF document is located here.
That PDF has all sorts of goodies in it about things like blocking competition, signing a low margin deal with AOL to keep monopoly marketshare (while also noting the general philosophy outside of a few key deals was to squeeze down on partners), scraping content and ratings from competing sites, Google force inserting itself in certain verticals anytime select competitors ranked in the organic result set, etc.
As damning as the above evidence is, more will soon be brought to light as the EU ramps up its formal statement of objections, since Google is less politically connected in Europe than in the United States:
“On Nov. 6, 2012, the night of Mr. Obama’s re-election, Mr. Schmidt was personally overseeing a voter-turnout software system for Mr. Obama. A few weeks later, Ms. Shelton and a senior antitrust lawyer at Google went to the White House to meet with one of Mr. Obama’s technology advisers. … By the end of the month, the FTC had decided not to file an antitrust lawsuit against the company, according to the agency’s internal emails.”
What is wild about the above leaked FTC document is it goes to great lengths to show an anti-competitive pattern of conduct toward the larger players in the ecosystem. Even if you ignore the distasteful political aspects of the FTC non-decision, the other potential out was:
“The distinction between harm to competitors and harm to competition is an important one: according to the modern interpretation of antitrust law, even if a business hurts individual competitors, it isn’t seen as breaking antitrust law unless it has also hurt the competitive process—that is, that it has taken actions that, for instance, raised prices or reduced choices, over all, for consumers.” – Vauhini Vara
Part of the reason the data set was incomplete on that front was that, for the most part, only larger ecosystem players were consulted. Google engineers have gone on record stating they aim to break people’s spirits in a game of psychological warfare. If that doesn’t hinder consumer choice, what does?
@aaronwall rofl. Feed the dragon Honestly these G investigations need solid long term SEOs to testify as well as brands.— Rishi Lakhani (@rishil) April 2, 2015
When the EU published their statement of objections Google’s response showed charts with the growth of Amazon and eBay as proof of a healthy ecosystem.
The market has been consolidated down into a few big winners which are still growing, but that in and of itself does not indicate a healthy nor neutral overall ecosystem.
The long tail of smaller e-commerce sites which have been scrubbed from the search results is nowhere to be seen in such charts / graphs / metrics.
The other obvious “untruth” hidden in the above Google chart is that there is no way product searches on Google.com are included in Google’s aggregate metrics. They are only counting some subset of them which click through a second vertical ad type while ignoring Google’s broader impact via the combination of PLAs along with text-based AdWords ads and the knowledge graph, or even the recently rolled out rich product answer results.
Who could look at the following search result (during anti-trust competitive review no less) and say “yeah, that looks totally reasonable?”
Google has allegedly spent the last couple years removing “visual clutter” from the search results & yet they manage to produce SERPs looking like that – so long as the eye candy leads to clicks monetized directly by Google or other Google hosted pages.
The Search Results Become a Closed App Store
Search was an integral piece of the web which (in the past) put small companies on a level playing field with larger players.
It no longer is.
WOW. RT @aimclear: 89% of domains that ranked over the last 7 years are now invisible, #SEO extinction. SRSLY, @marcustober #SEJSummit— Jonah Stein (@Jonahstein) April 15, 2015
“What kind of a system do you have when existing, large players are given a head start and other advantages over insurgents? I don’t know. But I do know it’s not the Internet.” – Dave Pell
The above quote was about app stores, but it certainly parallels a rater system which enforces the broken window fallacy against smaller players while looking the other way on larger players, unless they are in a specific vertical Google itself decides to enter.
“That actually proves my point that they use Raters to rate search results. aka: it *is* operated manually in many (how high?) cases. There is a growing body of consensus that a major portion of Googles current “algo” consists of thousands of raters that score results for ranking purposes. The “algorithm” by machine, on the majority of results seen by a high percentage of people, is almost non-existent.” … “what is being implied by the FTC is that Googles criteria was: GoogleBot +10 all Yelp content (strip mine all Yelp reviews to build their database). GoogleSerps -10 all yelp content (downgrade them in the rankings and claim they aren’t showing serps in serps). That is anticompetitive criteria that was manually set.” – Brett Tabke
The remote rater guides were even more explicitly anti-competitive than what was detailed in the FTC report. For instance, they required hotel affiliate sites to be rated as spam even if they are helpful, for no reason other than being affiliate sites.
Is Brand the Answer?
About 3 years ago I wrote a blog post about how branding plays into SEO & why it might peak. As much as I have been accused of having a cynical view, the biggest problem with my post was that it was naively optimistic. I presumed Google’s consolidation of markets would end up leading Google to alter their ranking approach when they were unable to overcome the established consensus bias which was subsidizing their competitors. The problem with my presumption is that Google’s reliance on “data” was a chimera. When convenient (and profitable), data is discarded on an as-needed basis.
Or, put another way, the visual layout of the search result page trumps the underlying ranking algorithms.
Google has still highly disintermediated brand value, but they did it via vertical search, larger AdWords ad units & allowing competitive bidding on trademark terms.
If Not Illegal, then Scraping is Certainly Morally Deplorable…
As Google scraped Yelp & TripAdvisor reviews & gave them an ultimatum, Google was also scraping Amazon sales rank data and using it to power Google Shopping product rankings.
Around this same time Google pushed through a black PR smear job of Bing for committing a similar, lesser offense against Google on rare, made-up longtail searches which were not used by the general public.
While Google was outright stealing third party content and putting it front & center on core keyword searches, they had to use “about 100 “synthetic queries”—queries that you would never expect a user to type” to smear Bing & even many of these queries did not show the alleged signal.
Here are some representative views of that incident:
- “We look forward to competing with genuinely new search algorithms out there—algorithms built on core innovation, and not on recycled search results from a competitor. So to all the users out there looking for the most authentic, relevant search results, we encourage you to come directly to Google. And to those who have asked what we want out of all this, the answer is simple: we’d like for this practice to stop.” – Google’s Amit Singhal
- “It’s cheating to me because we work incredibly hard and have done so for years but they just get there based on our hard work. I don’t know how else to call it but plain and simple cheating. Another analogy is that it’s like running a marathon and carrying someone else on your back, who jumps off just before the finish line.” Amit Singhal, more explicitly.
- “One comment that I’ve heard is that “it’s whiny for Google to complain about this.” I agree that’s a risk, but at the same time I think it’s important to go on the record about this.” – Matt Cutts
- “I’ve got some sympathy for Google’s view that Bing is doing something it shouldn’t.” – Danny Sullivan
What is so crazy about the above quotes is Google engineers knew at the time what Google was doing with Google’s scraping. I mentioned that contrast shortly after the above PR fiasco happened:
when popular vertical websites (that have invested a decade and millions of Dollars into building a community) complain about Google disintermediating them by scraping their reviews, Google responds by telling those webmasters to go pound sand & that if they don’t want Google scraping them then they should just block Googlebot & kill their search rankings
Learning the Rules of the Road
If you get a sense “the rules” are arbitrary, hypocritical & selectively enforced – you may be on to something:
- “The bizrate/nextag/epinions pages are decently good results. They are usually well-format[t]ed, rarely broken, load quickly and usually on-topic. Raters tend to like them” … which is why … “Google repeatedly changed the instructions for raters until raters assessed Google’s services favorably”
- and while clamping down on those services (“business models to avoid“) … “Google elected to show its product search OneBox “regardless of the quality” of that result and despite “pretty terribly embarrassing failures” ”
- and since Google knew their offerings were vastly inferior, “most of us on geo [Google Local] think we won’t win unless we can inject a lot more of local directly into google results” … thus they added “a ‘concurring sites’ signal to bias ourselves toward triggering [display of a Google local service] when a local-oriented aggregator site (i.e. Citysearch) shows up in the web results””
Google’s justification for not being transparent is that “spammers” would take advantage of transparency to put inferior results front and center – the exact same thing Google does when it benefits the bottom line!
Around the same time Google hard-codes the self-promotion of their own vertical offerings, they may choose to ban competing business models through “quality” score updates and other similar changes:
The following types of websites are likely to merit low landing page quality scores and may be difficult to advertise affordably. In addition, it’s important for advertisers of these types of websites to adhere to our landing page quality guidelines regarding unique content.
- eBook sites that show frequent ads
- ‘Get rich quick’ sites
- Comparison shopping sites
- Travel aggregators
- Affiliates that don’t comply with our affiliate guidelines
The anti-competitive conspiracy theory is no longer conspiracy, nor theory.
Key points highlighted by the European Commission:
- Google systematically positions and prominently displays its comparison shopping service in its general search results pages, irrespective of its merits. This conduct started in 2008.
- Google does not apply to its own comparison shopping service the system of penalties, which it applies to other comparison shopping services on the basis of defined parameters, and which can lead to the lowering of the rank in which they appear in Google’s general search results pages.
- Froogle, Google’s first comparison shopping service, did not benefit from any favourable treatment, and performed poorly.
- As a result of Google’s systematic favouring of its subsequent comparison shopping services “Google Product Search” and “Google Shopping”, both experienced higher rates of growth, to the detriment of rival comparison shopping services.
- Google’s conduct has a negative impact on consumers and innovation. It means that users do not necessarily see the most relevant comparison shopping results in response to their queries, and that incentives to innovate from rivals are lowered as they know that however good their product, they will not benefit from the same prominence as Google’s product.
Overcoming Consensus Bias
Consensus bias is set to an absurdly high level to block out competition, slow innovation, and make the search ecosystem easier to police. This acts as a tax on newer and lesser-known players and a subsidy toward larger players.
Eventually that subsidy would be a problem to Google if the algorithm was the only thing that matters, however if the entire result set itself can be displaced then that subsidy doesn’t really matter, as it can be retracted overnight.
Whenever Google has a competing offering ready, they put it up top even if they are embarrassed by it and 100% certain it is a vastly inferior option to other options in the marketplace.
That is how Google reinforces, then manages to overcome consensus bias.
How do you overcome consensus bias?
Consensus Bias
The Truth About Subjective Truths
A few months ago there was an article in New Scientist about Google’s research paper on potentially ranking sites based on how factual their content is. The idea is genuinely absurd.
- You can’t copyright facts, which means that if this were a primary ranking signal & people focused on it then they would be optimizing their site to be scraped-n-displaced into the knowledge graph. Some people may sugar coat the knowledge graph and rich answers as opportunity, but it is Google outsourcing the cost of editorial labor while reaping the rewards.
.@mattcutts I think I have spotted one, Matt. Note the similarities in the content text: pic.twitter.com/uHux3rK57f— dan barker (@danbarker) February 27, 2014
- If Google is going to scrape, displace & monetize data sets, then the only ways to really profit are:
- focus on creating the types of content which can’t be easily scraped-n-displaced, or
- create proprietary metrics of your own, such that if they scrape them (and don’t cheat by hiding the source) they are marketing you
- In some areas (especially religion and politics) certain facts are verboten & people prefer things which provide confirmation bias of their pre-existing beliefs. End user usage data creates a “relevancy” signal out of comfortable false facts and personalization reinforces it.
- In some areas well known “facts” are sponsored falsehoods. In other areas some things slip through the cracks.
- In some areas Google changes what is considered fact based on where you are located.
How Google keeps everyone happy pic.twitter.com/KmBBzpfzdf— Amazing Maps (@Amazing_Maps) March 31, 2015
- Those who have enough money can create their own facts. It might be painting the perception of a landscape, hiring thousands of low-waged workers to manipulate public perception on key issues and new technologies, or more sophisticated forms of social network analysis used to shape public opinion.
- The previously mentioned links were governmental efforts. However such strategies are more common in the commercial market. Consider how Google has sponsored academic conferences while explicitly telling the people who put them on to hide the sponsorship as part of their lobbying efforts.
- Then there is the blurry area where government and commerce fuse, like when Google put about a half-dozen home team players in key governmental positions during the FTC investigation of Google. Google claimed lobbying was disgusting until they experienced the ROI firsthand.
- In some areas “facts” are backward looking views of the market which are framed, distorted & intentionally incomplete. There was a significant gap between internal voices and external messaging in the run up to the recent financial crisis. Even large & generally trustworthy organizations have some serious skeletons in their closets.
@mattcutts I wonder, what sort of impact does http://t.co/vdg3ARGSz2 have on their E-A-T? expertise +1, authority +1, trustworthiness -_?— aaron wall (@aaronwall) April 6, 2015
- In other areas the inconvenient facts get washed away over time by money.
For a search engine to be driven primarily by groupthink (see unity100’s posts here) is the death of diversity.
Less Diversity, More Consolidation
The problem is rarely attributed to Google, but as ecosystem diversity has declined (and entire segments of the ecosystem are unprofitable to service), more people are writing things like: “The market for helping small businesses maintain a home online isn’t one with growing profits – or, for the most part, any profits. It’s one that’s heading for a bloody period of consolidation.”
As companies grow in power the power gets monetized. If you can manipulate people without appearing to do so you can make a lot of money.
If you don’t think Google wants to disrupt you out of a job, you’ve been asleep at the wheel for the past decade— Michael Gray (@graywolf) March 13, 2015
We Just Listen to the Data (Ish)
As Google sucks up more data, aggregates intent, and scrapes-n-displaces the ecosystem they get air cover for some of their gray area behaviors by claiming things are driven by the data & putting the user first.
Those “data” and altruism claims from Google recently fell flat on their face when the Wall Street Journal published a number of articles about a leaked FTC document.
- How Google Skewed Search Results
- Inside the U.S. Antitrust Probe of Google
- Key quotes from the document from the WSJ & more from Danny Sullivan
- The PDF document is located here.
That PDF has all sorts of goodies in it about things like blocking competition, signing a low-margin deal with AOL to keep monopoly marketshare (while also noting the general philosophy outside of a few key deals was to squeeze down on partners), scraping content and ratings from competing sites, Google forcibly inserting itself in certain verticals anytime select competitors ranked in the organic result set, etc.
As damning as the above evidence is, more will soon be brought to light as the EU ramps up their formal statement of objections, as Google is less politically connected in Europe than they are in the United States:
“On Nov. 6, 2012, the night of Mr. Obama’s re-election, Mr. Schmidt was personally overseeing a voter-turnout software system for Mr. Obama. A few weeks later, Ms. Shelton and a senior antitrust lawyer at Google went to the White House to meet with one of Mr. Obama’s technology advisers. … By the end of the month, the FTC had decided not to file an antitrust lawsuit against the company, according to the agency’s internal emails.”
What is wild about the above leaked FTC document is it goes to great lengths to show an anti-competitive pattern of conduct toward the larger players in the ecosystem. Even if you ignore the distasteful political aspects of the FTC non-decision, the other potential out was:
“The distinction between harm to competitors and harm to competition is an important one: according to the modern interpretation of antitrust law, even if a business hurts individual competitors, it isn’t seen as breaking antitrust law unless it has also hurt the competitive process—that is, that it has taken actions that, for instance, raised prices or reduced choices, over all, for consumers.” – Vauhini Vara
Part of the reason the data set was incomplete on that front was that, for the most part, only larger ecosystem players were consulted. Google engineers have gone on record stating they aim to break people’s spirits in a game of psychological warfare. If that doesn’t hinder consumer choice, what does?
@aaronwall rofl. Feed the dragon Honestly these G investigations need solid long term SEOs to testify as well as brands.— Rishi Lakhani (@rishil) April 2, 2015
When the EU published their statement of objections Google’s response showed charts with the growth of Amazon and eBay as proof of a healthy ecosystem.
The market has been consolidated down into a few big winners which are still growing, but that in and of itself does not indicate a healthy nor neutral overall ecosystem.
The long tail of smaller e-commerce sites which have been scrubbed from the search results is nowhere to be seen in such charts / graphs / metrics.
The other obvious “untruth” hidden in the above Google chart is there is no way product searches on Google.com are included in Google’s aggregate metrics. They are only counting some subset of them which click through a second vertical ad type while ignoring Google’s broader impact via the combination of PLAs along with text-based AdWords ads and the knowledge graph, or even the recently rolled out rich product answer results.
Who could look at the following search result (during anti-trust competitive review no less) and say “yeah, that looks totally reasonable?”
Google has allegedly spent the last couple years removing “visual clutter” from the search results & yet they manage to produce SERPs looking like that – so long as the eye candy leads to clicks monetized directly by Google or other Google hosted pages.
The Search Results Become a Closed App Store
Search was an integral piece of the web which (in the past) put small companies on a level playing field with larger players.
It no longer is.
WOW. RT @aimclear: 89% of domains that ranked over the last 7 years are now invisible, #SEO extinction. SRSLY, @marcustober #SEJSummit— Jonah Stein (@Jonahstein) April 15, 2015
“What kind of a system do you have when existing, large players are given a head start and other advantages over insurgents? I don’t know. But I do know it’s not the Internet.” – Dave Pell
The above quote was about app stores, but it certainly parallels a rater system which enforces the broken window fallacy against smaller players while looking the other way on larger players, unless they are in a specific vertical Google itself decides to enter.
“That actually proves my point that they use Raters to rate search results. aka: it *is* operated manually in many (how high?) cases. There is a growing body of consensus that a major portion of Googles current “algo” consists of thousands of raters that score results for ranking purposes. The “algorithm” by machine, on the majority of results seen by a high percentage of people, is almost non-existent.” … “what is being implied by the FTC is that Googles criteria was: GoogleBot +10 all Yelp content (strip mine all Yelp reviews to build their database). GoogleSerps -10 all yelp content (downgrade them in the rankings and claim they aren’t showing serps in serps). That is anticompetitive criteria that was manually set.” – Brett Tabke
The remote rater guides were even more explicitly anti-competitive than what was detailed in the FTC report. For instance, they required that hotel affiliate sites be rated as spam even if they were helpful, for no reason other than being affiliate sites.
Is Brand the Answer?
About 3 years ago I wrote a blog post about how branding plays into SEO & why it might peak. As much as I have been accused of having a cynical view, the biggest problem with my post was that it was naively optimistic. I presumed Google’s consolidation of markets would end up leading Google to alter their ranking approach when they were unable to overcome the established consensus bias which was subsidizing their competitors. The problem with my presumption is that Google’s reliance on “data” was a chimera. When convenient (and profitable), data is discarded on an as-needed basis.
Or, put another way, the visual layout of the search result page trumps the underlying ranking algorithms.
Google has still highly disintermediated brand value, but they did it via vertical search, larger AdWords ad units & allowing competitive bidding on trademark terms.
If Not Illegal, then Scraping is Certainly Morally Deplorable…
As Google scraped Yelp & TripAdvisor reviews & gave them an ultimatum, Google was also scraping Amazon sales rank data and using it to power Google Shopping product rankings.
Around this same time, Google pushed through a black PR smear job on Bing for committing a similar but lesser offense against Google, on rare, made-up longtail searches which were not used by the general public.
While Google was outright stealing third party content and putting it front & center on core keyword searches, they had to use “about 100 “synthetic queries”—queries that you would never expect a user to type” to smear Bing & even many of those queries did not show the alleged signal.
Here are some representative views of that incident:
- “We look forward to competing with genuinely new search algorithms out there—algorithms built on core innovation, and not on recycled search results from a competitor. So to all the users out there looking for the most authentic, relevant search results, we encourage you to come directly to Google. And to those who have asked what we want out of all this, the answer is simple: we’d like for this practice to stop.” – Google’s Amit Singhal
- “It’s cheating to me because we work incredibly hard and have done so for years but they just get there based on our hard work. I don’t know how else to call it but plain and simple cheating. Another analogy is that it’s like running a marathon and carrying someone else on your back, who jumps off just before the finish line.” Amit Singhal, more explicitly.
- “One comment that I’ve heard is that “it’s whiny for Google to complain about this.” I agree that’s a risk, but at the same time I think it’s important to go on the record about this.” – Matt Cutts
- “I’ve got some sympathy for Google’s view that Bing is doing something it shouldn’t.” – Danny Sullivan
What is so crazy about the above quotes is that Google engineers knew at the time what Google itself was doing with its own scraping. I mentioned that contrast shortly after the above PR fiasco happened:
when popular vertical websites (that have invested a decade and millions of Dollars into building a community) complain about Google disintermediating them by scraping their reviews, Google responds by telling those webmasters to go pound sand & that if they don’t want Google scraping them then they should just block Googlebot & kill their search rankings
Learning the Rules of the Road
If you get a sense “the rules” are arbitrary, hypocritical & selectively enforced – you may be on to something:
- “The bizrate/nextag/epinions pages are decently good results. They are usually well-format[t]ed, rarely broken, load quickly and usually on-topic. Raters tend to like them” … which is why … “Google repeatedly changed the instructions for raters until raters assessed Google’s services favorably”
- and while clamping down on those services (“business models to avoid“) … “Google elected to show its product search OneBox “regardless of the quality” of that result and despite “pretty terribly embarrassing failures” ”
- and since Google knew their offerings were vastly inferior, “most of us on geo [Google Local] think we won’t win unless we can inject a lot more of local directly into google results” … thus they added “a ‘concurring sites’ signal to bias ourselves toward triggering [display of a Google local service] when a local-oriented aggregator site (i.e. Citysearch) shows up in the web results”
Google’s justification for not being transparent is that “spammers” would take advantage of transparency to put inferior results front and center – the exact same thing Google does when it benefits the bottom line!
Around the same time Google hard-codes the self-promotion of their own vertical offerings, they may choose to ban competing business models through “quality” score updates and other similar changes:
The following types of websites are likely to merit low landing page quality scores and may be difficult to advertise affordably. In addition, it’s important for advertisers of these types of websites to adhere to our landing page quality guidelines regarding unique content.
- eBook sites that show frequent ads
- ‘Get rich quick’ sites
- Comparison shopping sites
- Travel aggregators
- Affiliates that don’t comply with our affiliate guidelines
The anti-competitive conspiracy theory is no longer conspiracy, nor theory.
Key points highlighted by the European Commission:
- Google systematically positions and prominently displays its comparison shopping service in its general search results pages, irrespective of its merits. This conduct started in 2008.
- Google does not apply to its own comparison shopping service the system of penalties, which it applies to other comparison shopping services on the basis of defined parameters, and which can lead to the lowering of the rank in which they appear in Google’s general search results pages.
- Froogle, Google’s first comparison shopping service, did not benefit from any favourable treatment, and performed poorly.
- As a result of Google’s systematic favouring of its subsequent comparison shopping services “Google Product Search” and “Google Shopping”, both experienced higher rates of growth, to the detriment of rival comparison shopping services.
- Google’s conduct has a negative impact on consumers and innovation. It means that users do not necessarily see the most relevant comparison shopping results in response to their queries, and that incentives to innovate from rivals are lowered as they know that however good their product, they will not benefit from the same prominence as Google’s product.
Overcoming Consensus Bias
Consensus bias is set to an absurdly high level to block out competition, slow innovation, and make the search ecosystem easier to police. This acts as a tax on newer and lesser-known players and a subsidy toward larger players.
Eventually that subsidy would be a problem to Google if the algorithm was the only thing that matters, however if the entire result set itself can be displaced then that subsidy doesn’t really matter, as it can be retracted overnight.
Whenever Google has a competing offering ready, they put it up top even if they are embarrassed by it and 100% certain it is vastly inferior to other options in the marketplace.
That is how Google reinforces, then manages to overcome consensus bias.
How do you overcome consensus bias?
Designing for Privacy
Information is a commodity. Corporations are passing around consumer behavioral profiles like brokers with stocks, and the vast majority of the American public is none the wiser about this market’s scope. Very few people actually check the permissions portion of the Google Play store page before downloading a new app, and who has time to pore over the tedious 48-page monstrosity that is the iTunes terms and conditions contract?
With the advent of wearables, ubiquitous computing, and widespread mobile usage, the individual’s share of their own information is shrinking at an alarming rate. In response, a growing (and vocal) group of consumers is voicing its concerns about the impact of the effective end of privacy online. And guess what? It’s up to designers to address those concerns in meaningful ways and assuage consumer fears.
But how can such a Sisyphean feat be managed? In a world that demands personalized service at the cost of privacy, how can you create and manage a product that strikes the right balance between the two?
That’s a million dollar question, so let’s break it into more affordable chunks.
Transparency
The big problem with informed consent is the information. It’s your responsibility to be up front with your users as to what exactly they’re trading you in return for your product/service. Not just the cash flow, but the data stream as well. Where’s it going? What’s it being used for?
99.99% of all smartphone apps ask for permission to modify and delete the contents of a phone’s data storage. 99.9999% of the time that doesn’t mean the app is going to copy and paste contact info, photos, or personal correspondence. But that .0001% is mighty worrisome.
Let your users know exactly what you’re asking from them, and what you’ll do with their data. Advertise the fact that you’re not sharing it with corporate interests to line your pockets. And if you are, well, stop that. It’s annoying and you’re ruining the future.
How can you advertise the key points of your privacy policies? Well, you could take a cue from noted online retailer Zappos.com. Their “PROTECTING YOUR PERSONAL INFORMATION” page serves as a decent template for transparency.
They have clearly defined policies about what they will and won’t do to safeguard shopper information. For one, they promise never to “rent, sell or share” user data with anyone, and immediately below, they link to their privacy policy, which weighs in a bit heavy at over 2,500 words, yet is dwarfed by other more convoluted policies.
They also describe their efforts to safeguard user data from malicious hacking threats through the use of SSL tech and firewalls. Then they have an FAQ addressing commonly expressed security concerns. Finally, they have a 24/7 contact line to assure users of personal attention to their privacy queries.
Now it should be noted that this is a template for good transparency practices, not precisely a great example of them. The content and intention are there, so what’s missing?
Good UX.
The fine print is indeed a little too fine, the text is a bit too dense (at least where the actual privacy policy is concerned), and the page itself is buried within the fat footer on the main page.
So who does a better job?
CodePen actually produces an attractively progressive solution.
As you can see, CodePen has taken the time to produce two different versions of their ToS: a typical, lengthy bit of legalese on the left, and an easily readable layman’s version on the right. Providing these as a side-by-side comparison shows appreciation for users and an emphasis on providing a positive UX.
This is all well and good for the traditional web browsing environment, but most of the problems with privacy these days stem from mobile usage. Let’s take a look at how mobile applications are taking advantage of the lag between common knowledge and current technology to make a profit off of private data.
Mobile Permissions
In the mobile space, the Google Play store does a decent job of letting users know what permissions they’re giving, whenever they download an app with its “Permission details” tab:
As you can see, Instagram is awfully nosy, but that’s no surprise. Instagram has come under fire for their privacy policies before. What’s perhaps more surprising is the unbelievable ubiquity with which invasive data gathering is practiced in the mobile space. Compare Instagram’s permissions to another popular application you might have added to your smartphone’s repertoire:
Why, pray tell, does a flashlight have any need for your location, photos/media/files, device ID and/or call information? I’ll give you a clue: it doesn’t.
“Brightest Flashlight Free” scoops up personal data and sells it to advertisers. The developer was actually sued in 2013 for having a poorly written privacy policy, one that did not disclose the app’s intention to sell user data.
Now the policy is up to date, but the insidious data gathering and selling continues. Unfortunately, it isn’t the only flashlight application to engage in the same sort of dirty data tactics. The fact is, you have to do a surprising amount of research to find any application that doesn’t grab a bit more data than advertised, especially when the global market for mobile-user data approaches nearly $10 billion.
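That gap between what an app needs and what it requests can be reasoned about as a simple set difference. Here is a minimal sketch; both permission sets below are illustrative assumptions for the example, not a real manifest audit:

```python
# What a flashlight functionally needs vs. what one actually requests.
# Both sets are illustrative assumptions, not pulled from a real app.
needed = {"android.permission.CAMERA"}  # controlling the camera's LED flash
requested = {
    "android.permission.CAMERA",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.READ_EXTERNAL_STORAGE",
    "android.permission.READ_PHONE_STATE",
}

# Anything requested beyond the functional baseline deserves a hard question.
excessive = sorted(requested - needed)
for perm in excessive:
    print("no flashlight-related reason to request:", perm)
```

Against these assumed sets, the audit flags location, storage, and phone-state access, none of which are needed to turn a light on.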
For your peace of mind, there is at least one example of an aptly named flashlight application which doesn’t sell your personal info to the highest bidder.
But don’t get too enthusiastic just yet. This is just one application. How many do you have downloaded on your smartphone? Chances are pretty good that you’re harboring a corporate spy on your mobile device.
Hell, even the Holy Bible takes your data:
Is nothing sacred? To the App developer’s credit, they’ve expressed publicly that they’ll never sell user data to third party interests, but it’s still a wakeup call.
Privacy and UX
What then, are some UX-friendly solutions? Designers are forced to strike a balance. Apps need data to run more efficiently and to better serve users, yet users are only beginning to grasp the concerns associated with the wholesale data permissions required by most applications. What kind of design patterns can be implemented to bring in a bit of harmony?
First and foremost, it’s important to be utilitarian in your data gathering. Offering informed consent is important, letting your users know what permissions they’re granting and why, but doing so in performant user flows is paramount.
For example, iOS has at least one up on Android with their “dynamic permissions.” This means iOS users have the option of switching up their permissions in-app, rather than having to decide all or nothing upon installation as with Android apps.
http://techcrunch.com/2014/04/04/the-right-way-to-ask-users-for-ios-permissions/
Note how the Cluster application prompts the user to give access to their photos as they’re interacting with the application, and reassures them of exactly what the app will do. The user is fully informed, and offers their consent as a result of being asked for a certain level of trust.
All of this is accomplished while they’re aiming to achieve a goal within the app. This effectively pushes permission-grant rates toward 100% because the developers have created a sense of comfort with the application’s inner workings. That’s what designing for privacy is all about: slowly introducing a user to the concept of shared data, and never taking undue advantage of an uninformed user.
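The underlying pattern is platform-agnostic: tie the prompt to the action that needs it, attach a reason, and degrade gracefully on refusal. A rough sketch of that flow, where every function and permission name is hypothetical rather than any real platform API:

```python
# Sketch of "just-in-time" permission UX: the prompt is tied to the user
# action that needs it, with a reason attached. All names are hypothetical.
granted = set()

def request_permission(name, reason, user_says_yes):
    """Prompt only when the feature is first used, never at install time."""
    if name in granted:
        return True
    print(f"Allow access to {name}? {reason}")
    if user_says_yes:
        granted.add(name)
        return True
    return False

def share_photo(user_says_yes=True):
    """The feature that actually needs the permission."""
    if request_permission("photos", "So you can pick a photo to share.", user_says_yes):
        return "photo picker opened"
    return "sharing skipped; app keeps working"
```

Note the refusal path: if the user declines, the app continues to function rather than blocking them, which is the graceful degradation the Cluster example demonstrates.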
Of course, this is just one facet of the privacy/UX conversation. Informing a user of what they’re allowing is important, but reassuring them that their data is secure is even more so.
Safeguarding User Data
Asking a user to trust your brand is essential to a modern business model; you’re trying to engender a trust-based relationship with all of your visitors, after all. The real trick, however, is convincing users that their data is safe in your hands—in other words, that it won’t be sold to or stolen by third parties, be they legitimate corporations or malicious hackers.
We touched on this earlier with the Zappos example. Zappos reassures its shoppers with SSL, firewalls, and a personalized promise never to share or sell data. All of which should be adopted as industry standards and blatantly advertised to assuage privacy concerns.
Building these safeguards into your service/application/website/what-have-you is extremely important. To gain consumer trust is to first provide transparency in your own practices, and then to protect your users from the wolves at the gate.
Fortunately, data protection is a booming business, with a myriad of effective cloud-based solutions currently in play.
Whatever security solutions you choose, the priorities remain the same. Build trust, and more importantly: actually deserve whatever trust you build.
It hardly needs to be stated, but the real key to a future where personal privacy still exists is to actually be better people: the kind that can be trusted to hold sensitive data.
Is such a future still possible? Let us know what you think in the comment section.
Kyle Sanders is a member of SEOBook and founder of Complete Web Resources, an Austin-based SEO and digital marketing agency.
Google Mobile Search Result Highlights
Google recently added highlights at the bottom of various sections of their mobile search results. The highlights appear on ads, organic results, and other various vertical search insertion types. The colors vary arbitrarily by section and are pattern…
Responsive Design for Mobile SEO
Why Is Mobile So Important?
If you look just at your revenue numbers as a publisher, it is easy to believe mobile is of limited importance. In our last post I mentioned an AdAge article highlighting how the New York Times was generating over half their traffic from mobile with it accounting for about 10% of their online ad revenues.
Large ad networks (Google, Bing, Facebook, Twitter, Yahoo!, etc.) can monetize mobile *much* better than other publishers can because they aggressively blend the ads right into the social stream or search results, causing them to have a much higher CTR than ads on the desktop. Bing recently confirmed the same trend RKG has highlighted about Google’s mobile ad clicks:
While mobile continues to be an area of rapid and steady growth, we are pleased to report that the Yahoo Bing Network’s search and click volumes from smart phone users have more than doubled year-over-year. Click volumes have generally outpaced growth in search volume
Those ad networks want other publishers to make their sites mobile friendly for a couple reasons…
- If the downstream sites are mobile friendly, then users are more likely to go back to the central ad / search / social networks more often & be more willing to click out on the ads from them.
- If mobile is emphasized in importance, then those who are critical of the value of the channel may eat some of the blame for relative poor performance, particularly if they haven’t spent resources optimizing user experience on the channel.
Further Elevating the Importance of Mobile
Modern Love, by Banksy. @SachinKalbag pic.twitter.com/Xzcxnkmmnx— Anand Ranganathan (@ARangarajan1972) November 29, 2014
Google has hinted at the importance of having a mobile friendly design, labeling friendly sites, testing labeling slow sites & offering tools to test how mobile friendly a site design is.
Today Google put out an APB warning they are going to increase the importance of mobile friendly website design:
Starting April 21, we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results.
In the past Google would hint that they were working to clean up link spam or content farms or website hacking and so on. In some cases announcing such efforts was done to try to discourage investment in the associated strategies, but it is quite rare that Google pre-announces an algorithmic shift which they state will be significant & they put an exact date on it.
I wouldn’t recommend waiting until the last day to implement the design changes, as it will take Google time to re-crawl your site & recognize if the design is mobile friendly.
Those who ignore the warning might be in for significant pain.
Some sites which were hit by Panda saw a devastating 50% to 70% decline in search traffic, but given how small mobile phone screen sizes are, even ranking just a couple spots lower could cause an 80% or 90% decline in mobile search traffic.
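To see why, here is a back-of-the-envelope sketch. The click-through rates per mobile position are assumed figures for illustration, not published data:

```python
# Assumed CTR curve by mobile ranking position (illustrative only).
# Small screens concentrate clicks heavily on the top result.
ctr_by_position = {1: 0.30, 2: 0.12, 3: 0.07, 4: 0.04, 5: 0.03}

def traffic_decline(old_pos, new_pos, ctr=ctr_by_position):
    """Percent decline in clicks when a page drops from old_pos to new_pos."""
    return (1 - ctr[new_pos] / ctr[old_pos]) * 100

print(f"#1 -> #3: {traffic_decline(1, 3):.0f}% fewer clicks")
print(f"#1 -> #5: {traffic_decline(1, 5):.0f}% fewer clicks")
```

With these assumed numbers, dropping from #1 to #3 costs roughly three quarters of the clicks, and dropping to #5 costs about 90%, in line with the scenario above.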
Another related issue referenced in the above post was tying in-app content to mobile search personalization:
Starting today, we will begin to use information from indexed apps as a factor in ranking for signed-in users who have the app installed. As a result, we may now surface content from indexed apps more prominently in search. To find out how to implement App Indexing, which allows us to surface this information in search results, have a look at our step-by-step guide on the developer site.
Google also announced today they are extending AdWords-styled ads to their Google Play search results, so they now have a direct economic incentive to allow app activity to bleed into their organic ranking factors.
m. Versus Responsive Design
Some sites have a separate m. version for mobile visitors, while other sites keep consistent URLs & employ responsive design. The m. setup works like this: on the regular version of the site (say www.seobook.com) a webmaster adds an alternate reference to the mobile version in the head section of the document:
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.seobook.com/">
…and then on the mobile version, they would rel=canonical it back to the desktop version, like so…
<link rel="canonical" href="http://www.seobook.com/">
With the above sort of code in place, Google would rank the full version of the site on desktop searches & the m. version in mobile search results.
3 or 4 years ago it was a toss-up as to which of these 2 options would win, but over time it appears responsive design is more likely to win out.
Here are a couple reasons responsive is likely to win out as a better solution:
- If people share a mobile-friendly URL on Twitter, Facebook or other social networks & the URL differs by device, then when someone on a desktop computer clicks the shared m. version of the page (which has fewer ad units & less content) the publisher is providing a worse user experience & is losing out on the incremental monetization they would have achieved with the additional ad units.
- While some search engines and social networks might be good at consolidating the performance of the same piece of content across multiple URL versions, some of them will periodically mess it up. That in turn will lead in some cases to lower rankings in search results or lower virality of content on social networks.
- Over time there is an increasing blur between phones and tablets with phablets. Some high pixel density screens on crossover devices may appear large in terms of pixel count, but still not have particularly large screens, making it easy for users to misclick on the interface.
- When Bing gave their best practices for mobile, they stated: “Ideally, there shouldn’t be a difference between the “mobile-friendly” URL and the “desktop” URL: the site would automatically adjust to the device — content, layout, and all.” In that post Bing shows some examples of m. versions of sites ranking in their mobile search results, however for smaller & lesser known sites they may not rank the m. version the way they do for Yelp or Wikipedia, which means that even if you optimize the m. version of the site to a great degree, that isn’t the version all mobile searchers will see. Back in 2012 Bing also stated their preference for a single version of a URL, in part based on lowering crawl traffic & consolidation of ranking signals.
In addition to responsive web design & separate mobile friendly URLs, a third configuration option is dynamic serving, which uses the Vary HTTP header to detect the user-agent & use that to drive the layout.
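As a rough sketch of what dynamic serving boils down to (the function and token names here are illustrative, not from Google's documentation or any particular framework): one URL returns different HTML per device class, and the `Vary: User-Agent` header tells caches & crawlers that the response depends on the user-agent.

```python
# Minimal sketch of dynamic serving: same URL, different HTML per device,
# with a Vary header so caches & crawlers know the response depends on
# the User-Agent. Token list & markup are placeholders.

MOBILE_TOKENS = ("iphone", "ipod", "android", "mobile")

def is_mobile(user_agent):
    # Crude user-agent sniff; production sites use a maintained detection library.
    ua = (user_agent or "").lower()
    return any(token in ua for token in MOBILE_TOKENS)

def render_page(user_agent):
    # Return (headers, body) for the same URL based on the device class.
    headers = {
        "Content-Type": "text/html",
        "Vary": "User-Agent",  # signal that content varies by User-Agent
    }
    if is_mobile(user_agent):
        body = "<html><body>mobile layout</body></html>"
    else:
        body = "<html><body>desktop layout</body></html>"
    return headers, body
```

The key design point is the header: without `Vary: User-Agent`, an intermediate cache could serve the desktop page to phones (or vice versa), and Googlebot may not realize it needs to crawl with both user-agents.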
Solutions for Quickly Implementing Responsive Design
New Theme / Design
If your site hasn’t been updated in years you might be surprised at what you find available on sites like ThemeForest for quite reasonable prices. Many of the options are responsive out of the gate & look good with a day or two of customization. Theme subscription services like WooThemes and Elegant Themes also have responsive options.
Child Themes
Some of the default Wordpress themes are responsive. Creating a child theme is quite easy. The popular Thesis and Studiopress frameworks also offer responsive skins.
PSD to HTML & HTML to Responsive HTML
Eeek! … 11% Of Americans Think #HTML Is A Sexually Transmitted Disease http://t.co/np0irmI1DW via @broderick— L2Code HTML (@L2CodeHTML) January 10, 2015
Some of the PSD to HTML conversion services like PSD 2 HTML, HTML Slicemate & XHTML Chop offer responsive design conversion of existing HTML sites in as little as a day or two, though you will likely need to make at least a few minor changes when you put the designs live to compensate for issues like third party ad units.
If you have an existing Wordpress theme, you might want to zip it up and send it to them; otherwise they may build your new theme as a child theme of 2015 or such. If you are struggling to get them to convert your Wordpress theme directly, another option would be to have them do a static HTML file conversion (instead of a Wordpress conversion) and then feed that through a theme creation tool like Themespress.
Other Things to Look Out For
Third Party Plug-ins & Ad Code Gotchas
Google allows webmasters to alter the ad calls on their mobile responsive AdSense ad units to show different sized ad units to different screen sizes & skip showing some ad units on smaller screens. Here is an example AdSense snippet:
<style type="text/css">
.adslot_1 { display: inline-block; width: 320px; height: 50px; }
@media (max-width: 400px) { .adslot_1 { display: none; } }
@media (min-width: 500px) { .adslot_1 { width: 468px; height: 60px; } }
@media (min-width: 800px) { .adslot_1 { width: 728px; height: 90px; } }
</style>
<ins class="adsbygoogle adslot_1"
data-ad-client="ca-pub-1234"
data-ad-slot="5678"></ins>
<script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script>
<script>(adsbygoogle = window.adsbygoogle || []).push({});</script>
For other ads which perhaps don’t have a "mobile friendly" option you could use CSS to either set the ad unit to display: none or to let the ad unit overflow, using code like either of the following.
hide it:
@media only screen and (max-width: ___px) {
.bannerad {
display: none;
}
}
overflow it:
@media only screen and (max-width: ___px) {
.ad-unit {
max-width: ___px;
overflow: scroll;
}
}
Before Putting Your New Responsive Site Live…
Back up your old site before putting the new site live.
For static HTML sites or sites with PHP or SHTML includes & such…
- Download a copy of your existing site to local.
- Rename that folder to something like sitename.com-OLDVERSION
- Upload the sitename.com-OLDVERSION folder to your server. If anything goes drastically wrong during your conversion process you can rename the new site design to something like sitename.com-HOSED & then rename the sitename.com-OLDVERSION folder to sitename.com to quickly restore the site.
- Download your site to local again.
- Ensure your new site design is using a different CSS folder or CSS filename such that the old and new versions of the design can be live at the same time while you are editing the site.
- Create a test file with the responsive design on your site & test that page until things work well enough.
- Once that page works well enough, test changing your homepage over to the new design & then save and upload it to verify it works properly. In addition to using your cell phone you could see how it looks on a variety of devices via the mobile browser testing emulation tool in Chrome, or a wide array of third party tools like: MobileTest.me, iPadPeek, Mobile Phone Emulator, Browshot, Matt Kersley’s responsive web design testing tool, BrowserStack, Cross Browser Testing, & the W3C mobileOK Checker. Paid services like Keynote offer manual testing rather than emulation on a wide variety of devices. There is also paid downloadable desktop emulation software like Multi-browser view.
- Once you have the general "what needs to change in each file" down, use find & replace to bulk edit the remaining files to make them responsive.
- Use a tool like FileZilla to quickly bulk upload the files.
- Look through key pages and if there are only a few minor errors then fix them and re-upload. If things are majorly screwed up then revert to the old design being live and schedule a do-over on the upgrade.
- If you have a decently high traffic site, it might make sense to schedule the above process for late Friday night or an off hour on the weekend, such that if anything goes astray you have fewer people seeing the errors while you frantically rush to fix them. :)
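The bulk find & replace step above can be scripted. Here is a hedged Python sketch (the replacements mapping and file extensions are placeholders for whatever your own conversion requires) which applies the same edits to every HTML file in a local copy of the site before re-uploading:

```python
# Sketch: apply the same find & replace edits across a folder of HTML
# files. The replacements dict is illustrative; fill it in with whatever
# "what needs to change in each file" turned out to be for your site.
import os

def bulk_replace(root_dir, replacements, extensions=(".html", ".htm")):
    # Walk root_dir, rewrite matching files in place, return files changed.
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root_dir):
        for name in filenames:
            if not name.lower().endswith(extensions):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8") as f:
                text = f.read()
            new_text = text
            for old, new in replacements.items():
                new_text = new_text.replace(old, new)
            if new_text != text:  # only rewrite files that actually changed
                with open(path, "w", encoding="utf-8") as f:
                    f.write(new_text)
                changed.append(path)
    return changed
```

Run it against the local copy (never directly against the live server) so the OLDVERSION backup described above remains your safety net.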
If you have little faith in the above test-it-live "methodology" & would prefer a slower & lower stress approach, you could create a test site on another domain name for testing purposes. Just be sure to block crawlers via a Disallow rule in the robots.txt file or password protect access to the site while testing. When you get things worked out, make sure your internal links are referencing the correct domain name, and that you have removed the robots.txt block or password protection.
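One way to keep the test domain out of search while it is under construction is a blanket Disallow in its robots.txt:

```
# robots.txt on the temporary test domain only -- remove before launch
User-agent: *
Disallow: /
```

Note that a Disallow only stops well-behaved crawlers from fetching the pages; password protection is the stronger guarantee, since a disallowed URL can still end up indexed from external links.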
For a site with a CMS the above process is basically the same, except for how you might need to create a different backup. If you are uploading a Wordpress or Drupal theme, then change the name at least slightly so you can keep the old and new designs live at the same time so you can quickly switch back to the old design if you need to.
If you have a mixed site with Wordpress & static files or such then it might make sense to test changing the static files first, get those to work well & then create a Wordpress theme after that.
You Can’t Copyright Facts
The Facts of Life
When Google introduced the knowledge graph one of their underlying messages behind it was “you can’t copyright facts.”
Facts are like domain names or links or pictures or anything else in terms of being a layer of information which can be highly valued or devalued through commoditization.
When you search for love quotes, Google pulls one into their site & then provides another “try again” link.
Since quotes mostly come from third parties they are not owned by BrainyQuote and other similar sites. But here is the thing: if those other sites which pay to organize and verify such collections have their economics sufficiently undermined, then they go away & then Google isn’t able to pull them into the search results either.
The same is true with song lyrics. If you are one of the few sites paying to license the lyrics & then Google puts lyrics above the search results, then the economics which justified the investment in licensing might not back out & you will likely go bankrupt. That bankruptcy wouldn’t be the result of being a spammer trying to work an angle, but rather because you had a higher cost structure from trying to do things the right way.
Never trust a corporation to do a librarian’s job.
What’s Behind Door Number One?
Google has also done the above quote-like "action item" types of onebox listings in other areas, like software downloads.
Where there are multiple versions of the software available, Google is arbitrarily selecting the download page, even though a software publisher might have a parallel SAAS option or other complex funnels based on a person’s location or status as a student or such.
Mix in Google allowing advertisers to advertise bundled adware, and it becomes quite easy for Google to gum up the sales process and undermine existing brand equity by sending users to the wrong location. Here’s a blog post from Malwarebytes referencing:
- AdWords ads on their brand term in Google which engaged in trademark infringement & promoted their software bundled with adware
- numerous user complaints they received about the bundleware
- required legal actions they took to take the bundler offline
Brands are forced to buy their own brand equity before, during & after the purchase, or Google partners with parasites to monetize the brand equity:
The company used this cash to build more business, spending more than $1 million through at least seven separate advertising accounts with Google.
…
The ads themselves said things like “McAfee Support – Call +1-855-[redacted US phone number]” and pointed to domains like mcafee-support.pccare247.com.
…
One PCCare247 ad account with Google produced 71.7 million impressions; another generated 12.4 million more. According to records obtained by the FTC, these combined campaigns generated 1.5 million clicks
Since Google requires that Chrome extensions be installed from their own website it makes it hard (for anyone other than Google) to monetize them, which in turn makes it appealing for people to sell the add-ons to malware bundlers. Android apps in the Google Play store are yet another "open" malware ecosystem.
FACT: This Isn’t About Facts
Google started the knowledge graph & onebox listings on some utterly banal topics because they were low-risk and easy for a computer to get right, though their ambitions vastly exceed that starting point.
When Google’s evolving search technology was recently covered on Medium by Steven Levy he shared that today the Knowledge Graph appears on roughly 25% of search queries and that…
Google is also trying to figure out how to deliver more complex results — to go beyond quick facts and deliver more subjective, fuzzier associations. “People aren’t interested in just facts,” she says. “They are interested in subjective things like whether or not the television shows are well-written. Things that could really help take the Knowledge Graph to the next level.”
Even as the people who routinely shill for Google parrot the “you can’t copyright facts” mantra, Google is telling you they have every intent of expanding far beyond it. “I see search as the interface to all computing,” says Singhal.
Even if You Have Copyright…
What makes the “you can’t copyright facts” line so particularly disingenuous was Google’s support of piracy when they purchased YouTube:
cofounder Jawed Karim favored “lax” copyright policy to make YouTube “huge” and hence “an excellent acquisition target.” YouTube at one point added a “report copyrighted content” button to let users report infringements, but removed the button when it realized how many users were reporting unauthorized videos. Meanwhile, YouTube managers intentionally retained infringing videos they knew were on the site, remarking “we should KEEP …. comedy clips (Conan, Leno, etc.) [and] music videos” despite having licenses for none of these. (In an email rebuke, cofounder Steve Chen admonished: “Jawed, please stop putting stolen videos on the site. We’re going to have a tough time defending the fact that we’re not liable for the copyrighted material on the site because we didn’t put it up when one of the co-founders is blatantly stealing content from other sites and trying to get everyone to see it.”)
To some, the separation of branding makes YouTube distinct and separate from Google search, but that wasn’t so much the case when many sites lost their video thumbnails and YouTube saw larger thumbnails on many of their listings in Google. In the above Steven Levy article he wrote: “one of the highest ranked general categories was a desire to know “how to” perform certain tasks. So Google made it easier to surface how-to videos from YouTube and other sources, featuring them more prominently in search.”
Altruism vs Disruption for the Sake of it
Whenever Google implements a new feature they can choose not to monetize it so as to claim they are benevolent and doing it for users without commercial interest. But that same unmonetized "for users" claim was also made about their shopping search vertical until one day it went paid. Google claimed paid inclusion was evil right up until the day it claimed paid inclusion was a necessity to improve user experience.
There was literally no transition period.
Many of the “informational” knowledge block listings contain affiliate links pointing into Google Play or other sites. Those affiliate ads were only labeled as advertisements after the FTC complained about inconsistent ad labeling in search results.
Health is Wealth
Google recently went big on the knowledge graph by jumping head first into the health vertical:
starting in the next few days, when you ask Google about common health conditions, you’ll start getting relevant medical facts right up front from the Knowledge Graph. We’ll show you typical symptoms and treatments, as well as details on how common the condition is—whether it’s critical, if it’s contagious, what ages it affects, and more. For some conditions you’ll also see high-quality illustrations from licensed medical illustrators. Once you get this basic info from Google, you should find it easier to do more research on other sites around the web, or know what questions to ask your doctor.
Google’s links to the Mayo Clinic in their knowledge graph are, once again, a light gray font.
In case you didn’t find enough background in Google’s announcement article, Greg Sterling shared more of Google’s views here. A couple notable quotes from Greg…
Cynics might say that Google is moving into yet another vertical content area and usurping third-party publishers. I don’t believe this is the case. Google isn’t going to be monetizing these queries; it appears to be genuinely motivated by a desire to show higher-quality health information and educate users accordingly.
- Google doesn’t need to directly monetize it to impact the economics of the industry. If they shift a greater share of clicks through AdWords then that will increase competition and ad prices in that category while lowering investment in SEO.
- If this is done out of benevolence, it will appear *above* the AdWords ads on the search results — unlike almost every type of onebox or knowledge graph result Google offers.
- If it is fair for him to label everyone who disagrees with his thesis as a cynic then it is of course fair for those “cynics” to label Greg Sterling as a shill.
Google told me that it hopes this initiative will help motivate the improvement of health content across the internet.
By defunding and displacing something they don’t improve its quality. Rather they force the associated entities to cut their costs to try to make the numbers work.
If their traffic drops and they don’t do more with less, then…
- their margins will fall
- growth slows (or they may even shrink)
- their stock price will tank
- management will get fired & replaced and/or they will get taken private by private equity investors and/or they will need to make some "bet the company" moves to find growth elsewhere (and hope Google doesn’t enter that parallel area anytime soon)
When the numbers don’t work, publishers need to cut back or cut corners.
Things get monetized directly, monetized indirectly, or they disappear.
Some of the more hated aspects of online publishing (headline bait, idiotic correlations out of context, pagination, slideshows, popups, fly in ad units, auto play videos, full page ad wraps, huge ads eating most of the above-the-fold real estate, integration of terrible native ad units promoting junk offers with shocking headline bait, content scraping answer farms, blending unvetted user generated content with house editorial, partnering with content farms to create subdomains on trusted blue chip sites, using Narrative Science or Automated Insights to auto-generate content, etc.) are not done because online publishers want to be jackasses, but because it is hard to make the numbers work in a competitive environment.
Publishers who were facing an “oh crap” moment when seeing print Dollars turn into digital dimes are having round number 2 when they see those digital dimes turn into mobile pennies:
At The New York Times, for instance, more than half its digital audience comes from mobile, yet just 10% of its digital-ad revenue is attributed to these devices.
If we lose some diversity in news it isn’t great, though it isn’t the end of the world. But what makes health such an important area is it is literally a matter of life & death.
Its importance & the amount of money flowing through the market ensures there is heavy investment in misinforming the general population. The corruption is so bad some people (who should know better) instead fault science.
@johnandrews @aaronwall it must be nice to say, you know what we’re keeping that traffic for ourselves, and nobody says a damn thing— Michael Gray (@graywolf) February 10, 2015
… and, only those who hate free speech, democracy & the country could possibly have anything negative to say about it. :D
Not to worry though. Any user trust built through the health knowledge graph can be monetized through a variety of other fantastic benevolent offers.
Once again, Google puts the user first.
Mozilla Firefox Dumps Google in Favor of Yahoo! Search
Firefox users conduct over 100 billion searches per year, & starting in December Yahoo! will be the default search choice in the US under a new 5-year agreement.
Google has been the Firefox global search default since 2004. Our agreement came up for renewal this year, and we took this as an opportunity to review our competitive strategy and explore our options.
In evaluating our search partnerships, our primary consideration was to ensure our strategy aligned with our values of choice and independence, and positions us to innovate and advance our mission in ways that best serve our users and the Web. In the end, each of the partnership options available to us had strong, improved economic terms reflecting the significant value that Firefox brings to the ecosystem. But one strategy stood out from the rest.
In Russia they’ll default to Yandex & in China they’ll default to Baidu.
One weird thing about that announcement is there is no mention of Europe, even though Google’s dominance is far greater there. I wonder if there was a quiet deal with Google in Europe, if they still don’t have their European strategy in place, or what their strategy is.
Google paid Firefox roughly $300 million per year for the default search placement. Yahoo!’s annual search revenue is on the order of $1.8 billion, so if Yahoo! came close to paying $300 million a year, they have to presume they will gain at least a few percentage points of search marketshare for the deal to pay for itself.
It also makes sense that Yahoo! would be a more natural partner fit for Mozilla than Bing would. If Mozilla partnered with Bing they would risk developer blowback from pent up rage about anti-competitive Internet Explorer business practices from 10 or 15 years ago.
It is also worth mentioning our recent post about how Yahoo! boosts search RPM by doing about a half dozen different tricks to preference paid search results while blending in the organic results.
| | Yahoo Ads | Yahoo Organic Results |
| --- | --- | --- |
| Placement | top of the page | below the ads |
| Background color | none / totally blended | none |
| Ad label | small gray text to right of advertiser URL | n/a |
| Sitelinks | often 5 or 6 | usually none, unless branded query |
| Extensions | star ratings, etc. | typically none |
| Keyword bolding | on for title, description, URL & sitelinks | off |
| Underlines | ad title & sitelinks, URL on scroll over | off |
| Click target | entire background of ad area is clickable | only the listing title is clickable |
The revenue juicing tricks from above were not present in the screenshot Mozilla shared of the new clean search layout Yahoo! will offer Firefox users. That layout shows red ad labels to the left of the ads and bolding on both the ads & organics.
Here is Marissa Mayer’s take:
At Yahoo, we believe deeply in search – it’s an area of investment and opportunity for us. It’s also a key growth area for us – we’ve now seen 11 consecutive quarters of growth in our search revenue on an ex-TAC basis. This partnership helps to expand our reach in search and gives us an opportunity to work even more closely with Mozilla to find ways to innovate in search, communications, and digital content. I’m also excited about the long-term framework we developed with Mozilla for future product integrations and expansion into international markets.
Our teams worked closely with Mozilla to build a clean, modern, and immersive search experience that will launch first to Firefox’s U.S. users in December and then to all Yahoo users in early 2015.
Even if Microsoft is only getting a slice of the revenues, this makes the Bing organic & ad ecosystem stronger while hurting Google (unless of course this is a step 1 before Marissa finds a way to nix the Bing deal and partner back up with Google on search). Yahoo! already has a partnership to run Google contextual ads, though a potential Yahoo! Google search partnership was blocked back in 2008. Yahoo! also syndicates Bing search ads in a contextual format to other sites through Media.net. Their Gemini Stream Ads product, which powers some of their search ads on mobile devices and on content sites, is a native ad alternative to Outbrain and Taboola; when they syndicate the native ads to other sites, the ads are called Yahoo! Recommends.
Both Amazon and eBay have recently defected (at least partially) from the Google ad ecosystem. Amazon has also been pushing to extend their ad network out to other sites.
Greg Sterling worries this might be a revenue risk for Firefox: “there may be some monetary risk for Firefox in leaving Google.” Missing from that perspective:
- How much less Google paid Mozilla before the most recent contract, which was lifted by a competitive bid from Microsoft
- If Bing goes away, Google will drastically claw down on the revenue share offered to other search partners.
- Google takes 45% from YouTube publishers
- Google took over a half-decade (and a lawsuit) to even share what their AdSense revenue share was
- look at eHow’s stock performance
- While Google’s search ad revenue has grown about 20% per year their partner ad network revenues have stagnated as their traffic acquisition costs as a percent of revenue have dropped
The good thing about all the Google defections is that the more networks there are, the more opportunities there are to find one which works well / is a good fit for whatever you are selling, particularly as Google adds various force-purchased junk to their ad network, be it mobile "Enhanced" campaigns or destroying exact match keyword targeting.