Beware Of SEO Truthiness

When SEO started, many people routinely used black-box testing to try to figure out which pages the search engines rewarded.

Black box testing is terminology used in IT. It’s a style of testing that doesn’t assume knowledge of the internal workings of a machine or computer program. Rather, you can only test how the system responds to inputs.

So, for many years, SEO was about trying things out and watching how the search engine responded. If rankings went up, SEOs assumed correlation meant causation, so they did a lot more of whatever it was they thought was responsible for the boost. If the trick was repeatable, they could draw some firmer conclusions about causation, at least until the search engine introduced some new algorithmic code and sent everyone back to their black-box testing again.

Well, it sent some people back to testing. Some SEOs don’t do much, if any, testing of their own, and so rely on the strategies articulated by other people. As a result, the SEO echo chamber can be a pretty misleading place, as “truthiness” – and a lot of false information – gets repeated far and wide until it’s considered gospel. One example of truthiness is that paid placement will hurt you. Well, it may, but not having it may hurt you more, because it all really… depends.

Another problem is that SEO testing can seldom be conclusive, because you can’t be sure of the state of the thing you’re testing; it may not be constant. For example, you throw up some more links and your rankings rise, but the rise could be due to other factors, such as a new engagement algorithm that Google implemented in the middle of your testing without your knowledge.

It used to be a lot easier to conduct this testing. Updates were periodic, and between updates you could reasonably assume the algorithms were static, so cause and effect were more obvious than they are today. Danny Sullivan gave a good overview of search history at Moz earlier in the year.

That history shows why SEO testing is getting harder. There are a lot more variables to isolate than there used to be. The search engines have also been clever: a good way to thwart SEO black-box testing is to keep moving the target. Continuously roll out code changes and don’t tell people you’re doing it. Or send people on a wild goose chase by arm-waving about a subtle code change made over here, when the real change has been made over there.

That’s the state of play in 2013.

However… (Ranting Time :)

Some SEO punditry is bordering on the ridiculous!

I’m not going to link to one particular article I’ve seen recently, as, ironically, that would mean rewarding them for spreading FUD. Also, calling out people isn’t really the point. Suffice to say, the advice was about specifics, such as how many links you can “safely” get from one type of site, that sort of thing….

The problem comes when we can easily find evidence to the contrary. In this case, a quick look through the SERPs will turn up top-ranking sites that have more than X links from Site Type Y, so what does this suggest? Perhaps these sites are being “unsafe”, whatever that means. A lot of SEO punditry is well-meaning, and often a rewording of Google’s official recommendations, but it can lead people up the garden path if evidence in the wild suggests otherwise.

If one term defined SEO in 2013, it is surely “link paranoia”.

What’s Happening In The Wild

When it comes to what actually works, there are few hard and fast rules regarding links. Look at the backlink profiles for top ranked sites across various categories and you’ll see one thing that is constant….

Nothing is constant.

Some sites have links coming from obviously automated campaigns, and it seemingly doesn’t affect their rankings. Other sites have credible link patterns, and rank nowhere. What counts? What doesn’t? What other factors are in play? We can only really get a better picture by asking questions.

Google allegedly took out a few major link networks over the weekend. Anglo Rank came in for special mention from Matt Cutts.

So, why is Google making a point of taking out link networks if link networks don’t work? Well, it’s because link networks work. How do we know? Look at the backlink profiles in any SERP area where there is a lot of money to be made and the area isn’t overly corporate (i.e. not dominated by major brands), and it won’t be long before you spot aggressive link networks, and few “legitimate” links, in the backlink profiles.

Sure, you wouldn’t want aggressive link networks pointing at brand sites, as there are better approaches brand sites can take when it comes to digital marketing, but such evidence makes a mockery of the tips some people are freely handing out. Are such tips the result of conjecture, repeating Google’s recommendations, or actual testing in the wild? Either the link networks work, or they don’t work but don’t affect rankings, or these sites shouldn’t be ranking.

There’s a good reason some of those tips are free, I guess.

Risk Management

Really, it’s a question of risk.

Could these sites get hit eventually? Maybe. However, those using a “disposable domain” approach will do anything that works as far as linking goes; their main risk is not being ranked. Being penalised is an occupational hazard, not game over. These sites will continue so long as Google’s algorithmic treatment rewards them with higher rankings.

If your domain is crucial to your brand, then you might choose to stay away from SEO entirely, depending on how you define “SEO”. A lot of digital marketing isn’t really SEO in the traditional sense, i.e. optimizing hard against an algorithm in order to gain higher rankings; it’s optimization for people, with SEO as a side benefit. There’s nothing wrong with this, of course; it’s a great approach for many sites, and something we advocate. Most sites end up somewhere along that continuum, but no matter where you are on that scale, there’s always a marketing risk to be managed, and “non-performance” is a risk that is often glossed over.

So, if there’s a take-away, it’s this: check out what actually happens in the wild, and then evaluate your risk before emulating it. When pundits suggest a rule, check to see if you can spot times it appears to work, and perhaps more interestingly, when it doesn’t. It’s in those areas of personal inquiry and testing where gems of SEO insight are found.

SEO has always been a mix of art and science. You can test, but only so far. The art part is dealing with the unknown past the testing point. Performing that art well is to know how to pick truthiness from reality.

And that takes experience.

But mainly a little fact checking :)


Search Market Share Frozen, Overall Query Volume Down

It’s very cold in many parts of the US right now. And just like much of the country, the relative market share positions of the major search engines are essentially frozen. November search market share data from comScore reflects virtually no change from last month. Google lost a fraction of…

Please visit Search Engine Land for the full article.

Mobile App Metrics that Matter – Whiteboard Friday

Posted by adamsinger

Releasing a mobile app to the public is certainly an accomplishment, but launch day is nowhere near the end of the process. It’s just as vital to measure people’s interaction with your apps as it is to measure their interaction with your web properties.

In today’s Whiteboard Friday, Adam Singer—Google’s Product Marketing Manager on Google Analytics—walks us through some of the most important metrics to watch to make sure your app is as successful as possible.

Adam Singer – Mobile App Metrics (that Matter) – Whiteboard Friday

For reference, here’s a still of this week’s whiteboard!

Video Transcription

Howdy, Moz fans. I am Adam Singer (Twitter, Google+), Product Marketing Manager on Google Analytics, as well as blogger at TheFutureBuzz.com, and I happen to be up here in Seattle and the Moz folks asked me if I’d be willing to do a Whiteboard Friday. So I’ve actually been watching Whiteboard Fridays for probably the last six or seven years. It feels like that long. I don’t know if you guys have been doing them that long, but it feels like a long time.

So I’m excited to come in today and chat with you about a subject I’ve been talking about at conferences all over the world and sharing on our blog and on ClickZ—I write a monthly column at ClickZ—mobile app analytics. So app analytics are really important. Pew just did research: more than half of Americans now own a smartphone. We’ve also seen a lot of really interesting pieces of research showing that some retailers are actually getting more conversions on mobile, via apps and mobile sites, than on desktop.

So, obviously, apps are really important, and via our own research that we did on the Analytics Team, last year we found that around 87% of marketers are actually planning to increase their emphasis on mobile app analytics and app measurement into 2013. We also found out that around half of marketers were either completely new or novice at app analytics, so they didn’t have much experience.

So this is an area as a marketer, if you’ve never measured a mobile app before, it’s an area you’re going to need to get into, because in the future I think pretty much every company that is interested in maintaining a relationship with their users in a location-agnostic setting, not just in front of their desktop, but wherever they go, will have a mobile app.

So I want to talk about some important mobile app metrics that matter. So, thank you to Jennifer on the Moz team—sorry, Moz, not SEOmoz anymore—who drew my little diagram for me. So the buckets for app metrics that matter are really three: acquisition, engagement, and outcomes. So let’s go through these metrics, and it’s slightly different than web. So if you’ve only measured on web, this will be different, but at the same time there’s a sort of one-to-one with different metrics, for example pages and screens per session.

So let’s take a look. For acquisition metrics, app downloads are really important. So when you’re acquiring new users, you definitely want to look at who’s actually downloading your app, what channels are most effective at acquisition, what channels are actually bringing you high quality users.

You also want to look at new users and active users. So this is important. You want to make sure you’re not just acquiring a whole bunch of new users, but you want to make sure that you actually have a steady stream of people actively launching your app. So when we talk about engagement in just a second, we’ll show you why that’s important. But I think a lot of marketers make the mistake of doing a good job bringing people to their app download page, getting people to install the app, and then they’re really not concerned with if that user sticks around. For apps it’s really important. If people download your app, use it once and then never use it again, you’ve kind of failed.

Also for acquisition, demographics are really important for apps. So you especially want to look at where people are coming from (which on apps is really interesting, because they might not be at home), as well as acquisition channels. So whether you have an Android or an iOS app, the channels that your users come from are going to be pretty important, and if you’re already looking at web analytics, these will be familiar to you. You’ll see acquisition sources from search, hopefully from email campaigns. If you’re marketing your app via email, make sure you tag those links. And look at how people are coming to your page in the Play Store. In the iOS marketplace, it’s a little bit more of a black box, but certainly you’ll still want to take a look.

Next up under engagement, so engagement metrics are really important for apps. I’d actually say engagements are the most important metrics to look at, because, again, if people install your app once and never launch it again, you’ve kind of failed. So engagement flow is important for apps. These are reports we have in Google Analytics mobile app analytics, but certainly no matter what app analytics platform that you’re using, there will be a visualization tool to actually look at how people move through your app, as well, app screens, so what screens people look at. App screens is an interesting one because you could have a lot of people viewing multiple screens on your app. Is this a good thing? Maybe.

You want to take a look at whether they’re actually accomplishing what you want, because you might have too many screens. What we’ve seen for apps is that by reducing the number of screens, and perhaps putting more content on one screen that someone can slide through, get an overview of quickly, and then drill down into a more specific feature or screen, you can increase engagement with your app significantly, rather than creating frustration if someone has to keep clicking through different screens to get to what they want. So I think you’ll notice a lot of the apps that are most sticky for you, at least I find, actually have fewer screens.

Loyalty and retention is really important. So whatever app analytics tool you’re using, you want to be looking at your loyalty reports to determine who’s launching your app, not just one or two times, but you want to see in a given month people launching your app 10 times, 11 times, 20 times, even 50 times.

So if your app is really sticky, people will be using it more consistently. So really, if you have a lot of people downloading your app, but then you notice those same users aren’t very loyal, they’re not launching your app a lot of times throughout the month, you want to reevaluate your app before you go out and do more acquisition, because there’s nothing worse than spending more money in online advertising and mobile app advertising to get more users if they’re not engaging with your app.

So figure that out soon. Make sure that your app is sticky. This is even more important than web because what you want ideally is you want to be using your analytics to make your app better, and you want it to be so good that it’s on the home screen of your user’s device. It’s not buried on a second or third screen that they never actually launch on their iPhone or on their Android.
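To make that loyalty idea concrete, here’s a minimal sketch, not tied to any particular analytics platform. The launch log format, user names, and bucket boundaries are all assumptions for illustration; a real tool like Google Analytics gives you these loyalty reports out of the box.

```python
from collections import Counter
from datetime import datetime

# Hypothetical launch log: (user_id, launch timestamp) pairs,
# the kind of raw event data a loyalty report is built from.
launches = [
    ("alice", datetime(2013, 11, 2)),
    ("alice", datetime(2013, 11, 5)),
    ("alice", datetime(2013, 11, 9)),
    ("bob",   datetime(2013, 11, 3)),
    ("carol", datetime(2013, 10, 30)),  # falls outside the month we report on
]

def loyalty_buckets(launches, year, month):
    """Count launches per user within one month, then bucket users by count."""
    per_user = Counter(
        user for user, ts in launches
        if ts.year == year and ts.month == month
    )
    buckets = {"1 launch": 0, "2-9 launches": 0, "10+ launches": 0}
    for count in per_user.values():
        if count == 1:
            buckets["1 launch"] += 1
        elif count < 10:
            buckets["2-9 launches"] += 1
        else:
            buckets["10+ launches"] += 1
    return buckets

print(loyalty_buckets(launches, 2013, 11))
```

If most of your users land in the single-launch bucket, that’s the signal to fix stickiness before spending more on acquisition.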

So that gets us to outcomes, everyone’s favorite report. So if you’re kicking butt with acquisition and you have a really sticky app that people are using all the time, you’ll want to next focus on outcomes. So outcomes, similar to web, are really conversion areas for our app, where we’re actually making money; metrics that have economic impact for our business.

So, things like app sales, if people are actually buying your app, that would show up in outcome reports. Ad monetization, if you have in-app monetization for ads, that’s a great way to monetize your app. Especially if you have a game, it’s a great way to make money from your app using a tool like AdMob. You want to determine how you can maximize ad revenue without being intrusive, because you definitely don’t want to have an ad experience in an app that’s going to detract from the app.

You want to make sure that it’s a balance. If you’re a news site, you want to make sure that there are no ads coming over your content and causing users to accidentally click them. You want to make sure that the ads are relevant and useful, and that they’re not disruptive to the experience.

You also want to consider in-app purchases. So if you’re a game app, for example, a lot of game apps are really successful at charging users to unlock secret features or extra things inside your app. Maybe it’s a way to get an advantage over the other players in the game. In-app purchases is a great way to do that. You want to measure those and determine which in-app purchases are sticky. I have a few friends that are app developers, and that’s the bread and butter of their monetization for their apps.

You’ll also want to look at goal conversion. So if you actually don’t sell anything in your app, if you’re, for example, E*Trade – and I have an E*Trade account, I’m a big fan of theirs – you would want to track goal conversions, such as maybe to them a goal conversion is me looking at the trade screen or me looking at my portfolio or some other action in the app. Because what you don’t want is to not know what success looks like in your app.

You want to understand what you want your users doing, and that way you can actually have some goals to measure against. If you’re not selling anything in your app, just like on web, assign a value to those goals. Because once you do that, all of these other buckets become more interesting when you can do segmentation and you want to look at, “Hey, what users on the acquisition side of the equation are actually coming through to purchase?” Or, “Which users are engaging really well, but aren’t necessarily making me more revenue?”
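As a rough illustration of that segmentation idea (the channel names and goal values below are made up), you can roll up assigned goal values by acquisition channel to see which channels bring users who actually convert:

```python
# Hypothetical per-user records: acquisition channel plus the total goal
# value assigned to the conversions that user completed in-app.
users = [
    {"channel": "search", "goal_value": 5.0},
    {"channel": "search", "goal_value": 0.0},
    {"channel": "email",  "goal_value": 12.5},
    {"channel": "email",  "goal_value": 0.0},
    {"channel": "email",  "goal_value": 0.0},
]

def value_per_channel(users):
    """Average assigned goal value per acquired user, by channel."""
    totals = {}
    for u in users:
        ch = u["channel"]
        value, count = totals.get(ch, (0.0, 0))
        totals[ch] = (value + u["goal_value"], count + 1)
    return {ch: round(value / count, 2) for ch, (value, count) in totals.items()}

print(value_per_channel(users))
```

A channel that sends fewer users but a higher value per user is often the one worth spending more on, which is exactly the question segmentation is meant to answer.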

So you’ll want to segment that data, and you’ll want to look at which users are completing your desired goals. So that’s just a surface-level overview.

Some other things that I didn’t go through were the developer reports, like crashes and exceptions. Certainly, if you have an app, those are important as well. If you’re a marketer, look at those reports too, because you want to push your development team to eliminate any of the crashes in your app. Those aren’t good things. You can suffer attrition, certainly, unless your app is really, really sticky. People might launch it once, and if it crashes enough, they might never come back. So those are important reports to look at too.

But I just wanted to provide an overview to you guys today. Hopefully, you are measuring apps right now. We have a free app analytics tool at Google.

But no matter what app tool you use, you definitely want to be measuring. Data is really important for apps. If you have any questions, feel free to tweet at me @AdamSinger. Always happy to help out with app measurement, and have an awesome weekend Mozzers.

Video transcription by Speechpad.com


Bing’s Knowledge Repository, Satori, Adds More Interactive Content

Dr. Richard Qian from the Microsoft Bing Index & Knowledge Team announced some major updates to Satori, Bing’s version of Google’s knowledge graph. The updates are vast and incredibly smart, making the Bing knowledge repository a lot more useful and interactive. The new features…

Please visit Search Engine Land for the full article.

Google Adds “Now” Content, Features To iOS Maps Apps

In October, Google started adding Google Now style cards to its PC Maps. Now, Google is adding some of those same features to its iOS Maps apps. If you update Google Maps for iOS and sign in, you can now see flight, hotel, and restaurant reservations from Gmail in Maps — and then search or…

Please visit Search Engine Land for the full article.

Survey: Majority Use Mobile Search, Find It “Harder Than PC”

In the cross-platform, post-PC era, the big question the search industry faces is whether and how PC search behaviors are extending into mobile. The short answer is they are. Google has sponsored research that shows majorities of smartphone owners use search engines (74 percent). Yet mobile search,…

Please visit Search Engine Land for the full article.

4 Recent Changes To SEO That Are Vital To Holiday Retail Strategy

It’s the holiday season again, and while certain sectors may be winding down for a relaxing month of festivities, online sellers are revving up their retail strategy. It’s a crucial, make-or-break time of year for retailers to excel in customer service, gift-wrapping services, and, of…

Please visit Search Engine Land for the full article.