Using Universal Analytics to Measure Interaction with Assets that Live on Different Sites
This is a technical post that requires an understanding of JavaScript or at least a technical understanding of the web. The new version of Google Analytics, Universal Analytics, has a little gem in it called the Measurement Protocol. The Measurement Protocol lets you send data to your GA account using HTTP POST requests. In other […]
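The excerpt cuts off, but the mechanics are simple to sketch. Below is a minimal, hypothetical Measurement Protocol pageview hit built with Python's standard library; the property ID and client ID are placeholders, and the parameter names (`v`, `tid`, `cid`, `t`, `dh`, `dp`) follow the public Measurement Protocol reference, not code from the post itself.

```python
# Sketch of a single Measurement Protocol "pageview" hit.
# Placeholder IDs throughout; not production code.
import urllib.parse
import urllib.request

COLLECT_URL = "https://www.google-analytics.com/collect"

def build_pageview_payload(tracking_id, client_id, hostname, page_path):
    """Assemble the form-encoded body for one pageview hit."""
    params = {
        "v": "1",            # protocol version
        "tid": tracking_id,  # your UA property ID, e.g. "UA-XXXXX-Y"
        "cid": client_id,    # anonymous client identifier (e.g. a UUID)
        "t": "pageview",     # hit type
        "dh": hostname,      # document hostname
        "dp": page_path,     # document path
    }
    return urllib.parse.urlencode(params)

def send_hit(payload):
    """POST the payload to Google Analytics (network call, not invoked here)."""
    req = urllib.request.Request(COLLECT_URL, data=payload.encode("utf-8"))
    return urllib.request.urlopen(req)

payload = build_pageview_payload(
    "UA-XXXXX-Y", "555", "example.com", "/widgets/blue")
print(payload)
```

Because it is just an HTTP POST, the same hit can be sent from a server, a cron job, or anything else that can make a web request, which is the point of the Measurement Protocol.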
How Google Understands (Named) Entities: Google Trends Update Visualizes That!
Recently I’ve shared an in-depth look into how Google Now Topics gives you insight into the Hummingbird update, by @ajkohn.
David Brent, Meet Clippy the Paperclip (How to create a mascot)
Mascots are all the rage these days. A large gang of fluffy characters is slowly taking over the internet, popping up on our screens with their goggling eyes and goofy grins.
Apart from a mild distaste for overly cutesy things, I applaud this trend.
A major update to our Local SEO plugin
Today we’re releasing a major update to our Local SEO plugin, bringing it to version 1.2. This new version has functionality that quite a few of you requested, the most important of which is a new store locator option. Store locator When you’re a brand that’s sold in several stores throughout a country, continent or even…
This post first appeared on Yoast. Whoopity Doo!
After the Penalty, What Do You Do for Links?
From February 2005 through May 2005 Google went through what I have sometimes called the Google Awful Update. Their search results were characterized by many URL-only listings and they often displayed 2-year-old data in place of contemporary information that their crawlers should have been picking up. Danny Sullivan seemed unaware of the issue at the time when I mentioned it to him but I confirmed it was happening to many Websites. The only way you could improve your Google performance at the time (if your site was affected by the Awful Update) was to publish new content, which could not be replaced by 2-year-old data and was not shown in URL-only format.

What I took away from that experience was that “Google remembers everything”. I have been telling people that Google remembers everything for years.

At the SMX Advanced 2007 conference in Seattle, during the “You-and-A” session with Matt Cutts, this memory thing came up again with an example where Matt mentioned if you own 200 spammy sites your 201st site may be flagged for review. This was not the first time the issue had come up. At an earlier conference during a Website review session someone asked Matt why […]
Gianluca Fiorelli’s Super Search Update – November 2013 edition
The end of the year is coming, and with it the time of looking back at what happened in 2013 and start trying to guess what 2014 will offer.
Usually this translates into boring lists and weird previews that really don’t capture the real meaning of the past year’s events.
But if someone like Dr. Pete presents his own preview, well… I trust him, because very few people have such a constant and attentive view of the changes Google commits daily to its own SERPs…
Click and discover the best about Search and Digital Marketing shared during the last weeks.
Post from Gianluca Fiorelli on State of Digital
Hummingbird’s Unsung Impact on Local Search
Posted by David-Mihm
Though I no longer actively consult for clients, I’ve noticed a significant qualitative shift in local results since Google’s release of Hummingbird that I haven’t seen reported on search engine blogs and media outlets. The columns I have seen have generally espoused advice on taking advantage of what Hummingbird was designed to do, rather than looking at the outcome of the update.
From where I sit, the outcome has been a slightly lower overall quality in Google’s local results, possibly due in part to a “purer” ranking algorithm in local packs. While these kinds of egregious results reported soon after Hummingbird’s release have mostly disappeared, it’s the secondary Hummingbird flutter, which may have coincided with the November 14th “update,” that seems to have caused the most noticeable changes.
I’ll be working with Dr. Pete to put together more quantitative local components of Mozcast in the coming months, but for the time being, I’ll just have to describe what I’m seeing today with a fairly simplistic analysis.
To do the analysis, I performed manual searches for five keywords, both geo-modified and generic, in five diverse markets around the country. I selected these keywords based on terms that I knew Google considered to have “local intent” across as broad a range of industries as I could think of. After performing the searches, I took note of the top position and number of occurrences of four types of sites, as well as position and number of results in each “pack.”
| Keywords | Markets | Result Type Taxonomy |
|---|---|---|
| personal injury lawyer | Chicago | national directory (e.g., Yelp) |
| assisted living facility | Portland | regional directory (e.g., ArizonaGolf.com) |
| wedding photographer | Tampa | local business website (e.g., AcmeElectric.com) |
| electrician | Burlington | barnacle webpage (e.g., facebook.com/acmeelectric) |
| pet store | Flagstaff | national brand (e.g., Petsmart.com) |
I also performed an even smaller analysis using three keywords that returned carousel results (thanks to SIM Partners for this sample list of keywords): “golf course,” “restaurant,” and “dance club.”
Again, a very simple analysis that is by no means intended to be a statistically significant study. I fully realize that these results may be skewed by my Portland IP address (even though I geo-located each time I searched for each market), data center, time of day, etc.
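For illustration only, the bookkeeping behind a manual audit like this could look something like the sketch below. The SERP data, type labels, and function names are my own invention for the sake of the example, not the author's actual workflow.

```python
# Hypothetical sketch of the tallying step: for each (keyword, market) SERP,
# record the type of each organic result and where the pack appeared.
from collections import Counter

# One manually-recorded SERP: an ordered list of result types, plus the
# position of the local pack (None if no pack was shown).
serp = {
    "keyword": "pet store portland",
    "results": ["national_directory", "national_brand", "local_business",
                "national_directory", "barnacle"],
    "pack_position": 4,
}

def summarize(serp):
    """Count occurrences of each result type and note its top position."""
    counts = Counter(serp["results"])
    top_positions = {}
    for pos, rtype in enumerate(serp["results"], start=1):
        top_positions.setdefault(rtype, pos)  # keep first (highest) position
    return counts, top_positions

counts, top_positions = summarize(serp)
print(counts["national_directory"])   # 2
print(top_positions["barnacle"])      # 5
```

Aggregating these per-SERP summaries across keywords and markets is all that is needed to produce the averages and percentages quoted later in the post.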
I’ll share with you some interim takeaways that I found interesting, though, as I work on a more complete version with Dr. Pete over the winter.
1. Search results in search results have made a comeback in a big way
If anything, Hummingbird, or the November 14th update, seems to have accelerated the trend that started with the Venice update: more and more localized organic results for generic (un-geo-modified) keywords.
But the winners of this update haven’t necessarily been small businesses. Google is now returning specific metro-level pages from national directories like Yelp, TripAdvisor, Findlaw, and others for these generic keywords.
This trend is even more pronounced for keywords that do include geo-modifiers, as the example below for “pet store portland” demonstrates.

Results like the one above call into question Google’s longstanding practice of minimizing the frequency with which these pages occur in Google search results. While the Yelp example above is one of the more blatant instances that I came across, plenty of directories (including WeddingWire, below) are benefitting from similar algorithmic behavior. In many cases the pages that are ranking are content-thin directory pages—the kind of content whose visibility Panda, and to some extent Penguin, were supposed to minimize.

Overall, national directories were the most frequently-occurring type of organic result for the phrases I looked at—a performance amplified when considering geo-modified keywords alone.

National brands are underrepresented as a result type due to the ‘personal injury lawyer,’ ‘electrician,’ and ‘wedding photographer’ keyword choices. For the keywords where relevant national brands exist (‘assisted living facility’ and ‘pet store’), they performed quite well.
2. Well-optimized regional-vertical directories accompanied by content still perform well
While a number of thriving directories were wiped out by the initial Panda update, here’s an area where the Penguin and Hummingbird updates have been effective. There are plenty of examples of high-quality regionally focused content rewarded with a first-page position—in some cases above the fold. I don’t remember seeing as many of these kinds of sites over the last 18 months as I do now.
Especially if the keywords these sites are targeting return carousels instead of packs, there’s still plenty of opportunity to rank: in my limited sample, an average of 2.3 first-page results below carousels were for regional directory-style sites.


3. There’s little-to-no blending going on in local search anymore
While Mike Blumenthal and Darren Shaw have theorized that the organic algorithm still carries weight in terms of ranking Place results, visually, authorship has been separated from place in post-Hummingbird SERPs.
Numerous “lucky” small businesses (read: well-optimized small businesses) earned both organic and map results across all industries and geographies I looked at.

4. When it comes to packs, position 4 is the new 1
The overwhelming majority of packs seem to be displaying in position 4 these days, especially for “generic” local intent searches. Geo-modified searches seem slightly more likely to show packs in position #1, which makes sense since the local intent is explicitly stronger for those searches.

Together with point #3 in this post, this is yet another factor that is helping national and regional directories compete in local results where they couldn’t before—additional spots appear to have opened up above the fold, with authorship-enabled small business sites typically shown below rather than above or inside the pack. 82% of the searches in my little mini-experiment returned a national directory in the top three organic results.

5. The number of pack results seems now more dependent on industry than geography
This is REALLY hypothetical, but prior to this summer, the number of Place-related results on a page (whether blended or in packs) seemed to depend largely on the quality of Google’s structured local business data in a given geographic area. The more Place-related signals Google had about businesses in a given region, and the more confidence Google had in those signals, the more local results they’d show on a page. In smaller metro areas for example, it was commonplace to find 2- and 3-packs across a wide range of industries.
At least from this admittedly small sample size, Google increasingly seems to show a consistent number of pack results by industry, regardless of the size of the market.
| Keyword | Avg. # in Pack | Reason for Variance |
|---|---|---|
| assisted living facility | 6.9 | 6-pack in Burlington |
| electrician | 6.9 | 6-pack in Portland |
| personal injury lawyer | 6.4 | Authoritative OneBox / Bug in Chicago |
| pet store | 3.0 | |
| wedding photographer | 7.0 | |
This change may have more to do with the advent of the carousel than with Hummingbird, however. Since the ranking of carousel results doesn’t reliably differ from that of (former) packs, it stands to reason that visual display of all local results might now be controlled by a single back-end mechanism.
6. Small businesses are still missing a big opportunity with basic geographic keyword optimization
This is more of an observational bullet point than the others. While there were plenty of localized organic results featuring small business websites, these tended to rank lower than well-optimized national directories (like Yelp, Angie’s List, Yellowpages.com, and others) for small-market geo-modified phrases (such as “electrician burlington”).

For non-competitive phrases like this, even a simple website with no incoming links of note can rank on the first page (#7) just by including “Burlington, VT” in its homepage Title Tag. With just a little TLC—maybe a link to a contact page that says “contact our Burlington electricians”—sites like this one might be able to displace those national directories in positions 1-2-3.
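As a concrete illustration (hypothetical markup, with the business name borrowed from the result-type examples earlier in the post), that basic geographic optimization is a one-line change:

```html
<title>Acme Electric | Electrician in Burlington, VT</title>
```

A matching H1 and a sentence or two of Burlington-specific copy on the page would reinforce the same signal.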
7. The Barnacle SEO strategy is underutilized in a lot of industries
Look at the number of times Facebook and Yelp show up in last year’s citation study I co-authored with Whitespark’s Darren Shaw. Clearly these are major “fixed objects” to which small businesses should be attaching their exoskeletons.
Yet 74% of searches I conducted as part of this experiment returned no Barnacle results.

This result for “pet store chicago” is one of the few barnacles that I came across—and it’s a darn good result! Not only is Liz (unintentionally?) leveraging the power of the Yelp domain, but she gets five schema’d stars right on the main Google SERP—which has to increase her clickthrough rate relative to her neighbors.

Interestingly, the club industry is one outlier where small businesses are making the most of power profiles. This might have been my favorite result—the surprisingly competitive “dance club flagstaff” where Jax is absolutely crushing it on Facebook despite no presence in the carousel.

What does all this mean?
I have to admit, I don’t really know the answer to this question yet. Why would Google downgrade the visibility of its Place-related results just as the quality of its Places backend has finally come up to par in the last year? Why favor search-results-in-local-search-results, something Google has actively and successfully fought to keep out of other types of searches for ages? Why minimize the impact of authorship profiles just as they are starting to gain widespread adoption by small business owners and webmasters?
One possible reason might be in preparation for more card-style layouts on mobile phones and wearable technology. But why force these (I believe slightly inferior) results on users of desktop computers, and so far in advance of when cards will be the norm?
At any rate, here are five takeaways from my qualitative review of local results in the last couple of months.
- Reports of directories’ demise have been greatly exaggerated. For whatever reason (?), Google seems to be giving directories a renewed lease on life. With packs overwhelmingly in the fourth position, they can now compete for above-the-fold visibility in positions 1-2-3, especially in smaller and mid-size metro areas.
- Less-successful horizontal directories (non-Yelps and TripAdvisors, e.g.) should consider the economics of their situation. Their ship has largely sailed in larger metro areas like Chicago and Portland. But they still have the opportunity to dominate smaller markets. I realize you probably can’t charge a personal injury lawyer in Burlington what you charge his colleague in downtown Chicago. But, in terms of the lifetime value of who will actually get business from your advertising packages, the happy Burlington attorney probably exceeds the furious one from Chicago (if she is even able to stay in business through the end of her contract with you).
- The Barnacle opportunity is huge, for independent and national businesses alike. With Google’s new weighting towards directories in organic results and the unblending of packs, barnacle listings present an opportunity for savvy businesses to earn three first-page positions for the same keyword—one pack listing, one web listing, and one (or more) barnacle listing.
- National brands who haven’t taken my advice to put in a decent store locator yet should surely do so now. Well-structured regional pages, and easily-crawled store-level pages, can get great visibility pretty easily. (If you’re a MozCon attendee or have purchased access, you can learn more about this advice in my MozCon 2013 presentation.)
- Andrew Shotland already said it in the last section of his Search Engine Land column, but regionally-focused sites—whether directories or businesses—should absolutely invest in great content. With Penguin and Hummingbird combined, thin-content websites of all sizes are having a harder time ranking relative to slightly thicker content directories.
Well, that’s my take on what’s happening in local search these days…is the Moz community seeing the same things? Do you think the quality of local results has improved or declined since Hummingbird? Have you perceived a shift since November 14th? I’d be particularly interested to hear comments from SEOs in non-U.S. markets, as I don’t get the chance to dive into those results nearly as often as I’d like.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
Smartphone crawl errors in Webmaster Tools
Webmaster level: all
Some smartphone-optimized websites are misconfigured in that they don’t show searchers the information they were seeking. For example, smartphone users are shown an error page or get redirected to an irrelevant page, but desktop users are shown the content they wanted. Some of these problems, detected by Googlebot as crawl errors, significantly hurt your website’s user experience and are the basis of some of our recently-announced ranking changes for smartphone search results.
Starting today, you can use the expanded Crawl Errors feature in Webmaster Tools to help identify pages on your sites that show these types of problems. We’re introducing a new Smartphone errors tab where we share pages we’ve identified with errors only found with Googlebot for smartphones.

Some of the errors we share include:
- Server errors: A server error is when Googlebot got an HTTP error status code when it crawled the page.
- Not found errors and soft 404s: A page can show a “not found” message to Googlebot, either by returning an HTTP 404 status code or when the page is detected as a soft error page.
- Faulty redirects: A faulty redirect is a smartphone-specific error that occurs when a desktop page redirects smartphone users to a page that is not relevant to their query. A typical example is when all pages on the desktop site redirect smartphone users to the homepage of the smartphone-optimized site.
- Blocked URLs: A blocked URL is when the site’s robots.txt explicitly disallows crawling by Googlebot for smartphones. Typically, such smartphone-specific robots.txt disallow directives are erroneous. You should investigate your server configuration if you see blocked URLs reported in Webmaster Tools.
Fixing any issues shown in Webmaster Tools can make your site better for users and help our algorithms better index your content. You can learn more about how to build smartphone websites and how to fix common errors. As always, please ask in our forums if you have any questions.
Posted by Pierre Far, Webmaster Trends Analyst
11 Marketing Survival Lessons Learned from Accidentally Enraging an Island City-State
Posted by DannyDover
My initial response to the massive traffic increase was not exactly professional.
“HOLY FREAKING CRAP BALLS!“, I blurted out. I searched the room for a fellow nerd to share my e-thusiasm with, but only found a room full of strangers eating sandwiches.


Over the course of the next few days, the post received more than 600,000 unique visitors. If you segment the traffic to only include visits from Singapore, the number of unique visitors is equivalent to 10% of the entire population of the country (although admittedly this metric is a bit inflated due to people reading the post on multiple devices.)
Some context
I support myself financially as a storytelling consultant. On a day-to-day level this means I work on marketing strategy, creative writing, and web development. Admittedly it is a weird mix, but I enjoy the lifestyle.
I am currently living in Vietnam, but recently spent two months living in Singapore.
Like I do with all of my travels, I penned a blog post about my experience living in Singapore and hit publish. You can read the entire post here, but the quick summary is:
- Singapore has accomplished a lot in a short amount of time.
- I am deeply concerned that the societal and cultural costs of these accomplishments are harming the populace (I cited concerning data points related to stress).
- I have limited time and resources, and will not be returning to Singapore.
My blog is fairly well read, so I was surprised that this post started out as one of my least-read posts. After a few days the post was, for all intents and purposes, just another link in the archive.
Last Wednesday, I grabbed my normal Vietnamese breakfast (a local sandwich called a Bánh mì and a coconut milk-based smoothie) and went into my co-working office to start on my to-do list for the day.
I have been trying to convert bad online habits into good ones, so when I found myself craving a peek at Facebook, I clicked on my Google Analytics shortcut instead. It opened up my real-time report, and I practically dropped my meal.
Marketing lessons learned
The next few days were the craziest marketing adventure that I have ever had. The following are the key lessons I learned from this experience:
1. Honesty is power
I think the key reason that this post resonated with people was that it was uncommonly honest. (This is a trait I picked up from Rand when I worked at Moz. It isn’t a marketing trait, it is a life trait.) This post was published on my personal blog where I don’t have any ads or up-sells. I write posts there solely because I enjoy writing. In this case, I thought I had some interesting insights about Singapore and wanted to share my honest thoughts. The power in this was that when people read it, they too wanted to share my thoughts (along with their own!) with their online friends.
2. Be conscious of the clickstream
In the post I cited some suicide statistics that were quite alarming. As the thousands of comments about the post came in (mostly via Facebook), I continually received the criticism that my data was incorrect. I triple-checked my sources (they checked out) and tried to reply to as many of the false claims of bad data as possible. It wasn’t until two days later that I realized that people Googling the statistics were taken straight to a Wikipedia article that listed outdated data. After I updated the Wikipedia article to include the most recent data, the data criticism comments immediately stopped. I could have saved myself a giant headache if I had just viewed the situation from the readers’ perspective and found the misinformation on Wikipedia earlier.
3. Be a first-responder
As the comments came in, I was alerted (rudely and repeatedly) that I had erroneously cited a date as 2011 rather than 2001. My first thought was just to subtly update the number but was worried this might start a backlash. For this reason, I called Jessica Dover. Jessica has worked on social media strategy for many of the world’s most well known celebrities and has solved more social media problems than I have followers. (Disclaimer: She also happens to be my sister, but I honestly think that has hindered her more than helped her :-p. Her success is hard-earned and her own.) Without hesitation, she told me exactly what to do.
- Publicly thank the readers for all of their feedback.
- Acknowledge that you are listening to them.
- Acknowledge the error and then actually fix it.
This strategy worked wonders. I fixed my mistake and the number of comments on the blog post quadrupled (after the audience was reassured that I was listening and responding). Huge win!
If you don’t have your own social media mentor like Jessica, Moz’s Q&A can be a great source of information.
4. Patch the holes in your net
At the onset, I was receiving a lot of traffic but none of it was converting (my conversion events were email captures and social follows). When I couldn’t fix this myself, I called another member of my marketing SWAT team, Joe Chura. Joe runs an agency called Launch Digital Marketing. I think they are the most underrated team in the industry. In no time, they had a plan. Following their advice I installed two WordPress plugins:
- MailChimp for WordPress Lite: There are lots of plugins that add MailChimp to a WordPress site but this is the only one that I know of that adds an opt-out check box below your comment reply box. If your readers are already entering their e-mail address in order to leave a comment, they might as well be asked if they want to sign up for your newsletter. For the text box, I used the text “I want to be kept up-to-date on Life Listed and receive free resources!”
- Flare: This is my favorite social media sharing plugin (there are countless other options). This version is technically no longer under active development (they are building a new version to replace it), so I had disabled it on my site. Launch convinced me to re-add it.
After I added these plugins, it doubled the size of my mailing list and started what eventually became a viral spread of the blog post on Twitter. These were huge wins. (Hat tip to Dan Andrews for being at the forefront of that Twitter storm.)
Again, if you don’t have your own marketing SWAT team, Moz’s Q&A can be a great resource.
5. If you have to think about server optimization, it is too late
Throughout the entire process my server never went down. I credit this to two things:
First, props to WPengine (my host) for being seamless. They handled the spike without any hiccups or annoying interruptions. I will likely have to pay an overage fee but that is a MUCH better option than having a site outage.
Second, I credit preparation. I have long been using a tool called http://gtmetrix.com/ to diagnose speed problems on my site. (Hat tip to Jon over at Raven for introducing this tool to me). I love this tool because it combines the Google Page Speed tool and Yahoo’s YSlow into one convenient and easy to understand interface. Luckily, I had implemented all of the recommended fixes well before this traffic spike. I am kind of a speed optimization nerd. :-p
6. Take comfort in the negativity slope
When I first posted the blog post, no one cared. When it started to gain some traction, I was immediately told how stupid it and I were. As it gained momentum the number of naysayers grew. It wasn’t until the post reached full velocity that the supporters started to outnumber the naysayers. This has been a trend that I have observed with all of my successful content. I now take comfort in knowing that it is going to get worse until it suddenly gets better. Negativity online is a slope, and luckily it does have a peak.
7. Facebook’s walled garden is much worse than it was before
Facebook once offered a tool called Facebook Insights for Domains. This tool allowed you to get valuable information on any traffic that was referred to your verified domain from Facebook. Unfortunately, Facebook has killed it off. When my post went viral on Facebook, I had no visibility other than that the traffic was coming from Facebook and Facebook mobile. I had no idea what pages or groups the applicable conversations were happening on, and thus had no way to respond to conversations happening behind the wall. This was a huge frustration throughout the whole process.
8. A rising tide…
When people came to my website to read the Singapore post, many of them checked out my other posts as well (this is to be expected). In response to this, I published a post that I thought would also be applicable to the new readers. Due to the increased visibility, this post (on useful money philosophies) subsequently went mildly viral. This in turn drove even more conversions.
9. Be aware of parallel universes
Stories exist in parallel universes:
- What the storyteller experiences
- The story the storyteller shares
- The story as the audience members understand it
These are all very different stories!
Many of the comments, compliments and criticism that I received about the Singapore post had absolutely nothing to do with the words written in my article. For many, it was their personal experiences, not my blog post, that drove their responses. At first, this was a major frustration point for me. It wasn’t until I mapped out the perspectives in the above list that I calmed down and started to appreciate the storytelling experience.
10. Listen first, then wait, then react
When the responses came in, I was vastly outnumbered (it was literally 500,000 to 1)! The only way I was able to deal with that amount of volume was to listen, learn from an expert (see lesson 3), collect data, process that data, and then react. I let the first several dozen comments come in before I started to respond. I think this was critical in me being able to follow and supplement the large-scale discussion.
11. Titles are 60% of the battle
The click-worthiness of the blog post title was a major contributing factor to its success (second only to its honesty). Admittedly it was an attention-grabbing title, but at the same time it was true. I actually will never be returning to Singapore. I didn’t perform any keyword research or A/B tests when picking the blog post title. Instead, I just picked something that I figured I would want to click. The best titles are always that simple.
When I look back on this marketing adventure, I feel thankful. The world, not just Singapore, is in an amazing state of change right now. I am glad that my little voice was able to contribute a little bit to the global discussion.
If you would like to hear about other marketing adventures, feel free to connect with me on Google+.
Google Wants All Of Your User Names And Passwords
Google is currently developing a new security platform called U2F (Universal 2nd Factor).
About Your Conference Presentation…
A client recently asked for notes on their draft presentation for an upcoming conference. I have attended a number of conferences lately and, like the rest of you, have had to sit through some pretty poor presentations. While I don’t claim to be God’s gift to Powerpoint, or even Greg Gifford, I am at that […]
The post About Your Conference Presentation… appeared first on Local SEO Guide.
Clean Up 404 Errors Automatically With Chrome?
I just came across this very interesting page on Google Chrome’s Privacy Policy.
The Year in Search – A Round Up of 2013
As 2013 draws to a close come with me, dear reader, as we review this year in search.
We saw yet more animal-shaped Google updates, spent a horrifying amount of time watching Harlem Shake videos and there was some noise about GIFs being pronounced JIF.
Deepcrawl.co.uk – The Crawler of Choice for LARGE Websites
We were approached by Matt at DeepCrawl.co.uk to review their relatively young but capable cloud-based site crawl platform. When we first received the request, I was a little unsure as to how useful this tool would be when compared to well-known & comprehensive tools like Screaming Frog and IIS SEO Toolkit, both of which […]
The post Deepcrawl.co.uk – The Crawler of Choice for LARGE Websites appeared first on SEOgadget.
4 Tools To Gather User Feedback
As search marketers we invest a lot of time and effort in driving traffic to our sites.
Why Create a Microsite About Yourself?
Someone found SEO Theory through an interesting question that I haven’t really addressed in the past. And, in fact, if I had addressed it in the past the topic would probably need a refresh by now anyway. The question concerns creating microsites for one’s self. Why do that?

Search engine optimization is about obtaining the best possible performance from search referral traffic. That means you want the traffic to be interested in your content and you want the people searching for your content to be able to easily find your content. So when YOU are the topic of the query, having a microsite does sometimes make sense. In fact, every Web searchable social media profile you create is a microsite; your Twitter account, your Linkedin profile, your Pinterest board, your Facebook page — these are all little sites that tell people something about you. Google+ gives you the ability — through its (now drifting) Authorship markup — to create a hub for your microsites so that people can see where your content may be found. You can link to social media profiles or blogs where you contribute content.

But Who Really Needs a Microsite?

So, with all these social media […]
Demystifying Viewing Patterns
Lately I’ve been intrigued by something called the ‘viewing pattern’: the pattern in which people view, in this case, websites. There are a lot of ideas about this out there. Now I’m wondering: is there one right pattern? In other words, is there one pattern we should follow when designing our product…
This post first appeared on Yoast. Whoopity Doo!
How to Build Your Own Mass Keyword Difficulty Tool
Posted by MartinMacDonald
Despite keywords being slightly out of fashion, thanks to the whole (not provided) debacle, it remains the case that a large part of an SEO’s work revolves around discovering opportunity and filling that same opportunity …
Don’t get stuck in a content rut, this week’s DistilledLive video
When you realise something works in the online world, it’s easy to keep at it and get stuck in a content rut.
For this week’s DistilledLive, Jess and Britt maximise the perks of being in the same office (and time zone) by taking a look beyond competitors’ borders when it comes to approaching your content.
Value Based SEO Strategy
One approach to search marketing is to treat the search traffic as a side-effect of a digital marketing strategy. I’m sure Google would love SEOs to think this way, although possibly not when it comes to PPC! Even if you’re taking a more direct, rankings-driven approach, the engagement and relevancy scores that come from delivering what the customer values should serve you well, too.
In this article, we’ll look at a content strategy based on value based marketing. Many of these concepts may be familiar, but bundled together, they provide an alternative search provider model to one based on technical quick fixes and rank. If you want to broaden the value of your SEO offering beyond that first click, and get a few ideas on talking about value, then this post is for you.
In any case, the days of being able to rank well without providing value beyond the click are numbered. Search is becoming more about providing meaning to visitors and less about providing keyword relevance to search engines.
What Is Value Based Marketing?
Value based marketing is customer-centric, as opposed to search-engine-centric. In Values Based Marketing For Bottom Line Success, the authors focus on five areas:
- Discover and quantify your customers’ wants and needs
- Commit to the most important things that will impact your customers
- Create customer value that is meaningful and understandable
- Assess how you did at creating true customer value
- Improve your value package to keep your customers coming back
Customers compare your offer against those of competitors, and divide the benefits by the cost to arrive at value. Marketing determines and communicates that value.
This is the step beyond keyword matching. When we use keyword matching, we’re trying to determine intent. We’re doing a little demographic breakdown. This next step is to find out what the customer values. If we give the customer what they value, they’re more likely to engage and less likely to click back.
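The "benefits divided by cost" comparison above can be made concrete with a small sketch. To be clear, this is a hypothetical illustration, not something from the book or this article: the offer names, benefit scores, and prices are all invented, loosely echoing the gym example discussed in this piece.

```python
# Hypothetical sketch of "benefits divided by cost" as a value score.
# Offer names, benefit scores, and prices are invented for illustration.

def value_score(benefits, cost):
    """Perceived value: sum of benefit scores divided by cost."""
    return sum(benefits.values()) / cost

offers = {
    # name: (perceived benefits on a 1-10 scale, monthly cost)
    "upmarket_manhattan_gym": ({"location": 8, "pampering": 9}, 200.0),
    "budget_gym":             ({"location": 8, "pampering": 2}, 80.0),
}

# Customers pick the offer whose perceived value score is highest.
best = max(offers, key=lambda name: value_score(*offers[name]))
```

On these made-up numbers the budget gym wins on raw value score (0.125 vs 0.085), which is exactly why the upmarket gym must communicate benefits, like pampering, that the budget gym cannot match.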
What Does The Customer Value?
A key question of marketing is: “Which customers does this business serve?” It seems like an obvious question, but it can be difficult to answer. Does a gym serve people who want to get fit? Yes, but then all gyms do that, so how would they be differentiated?
Obviously, a gym serves people who live in a certain area. So, if our gym is in Manhattan, our customer becomes “someone who wants to get fit in Manhattan”. Perhaps our gym is upmarket and expensive. So, our customer becomes “people who want to get fit in Manhattan and be pampered and are prepared to pay more for it”. And so on, and so on. They’re really questions and statements about the value proposition as perceived by the customer, and then delivered by the business.
So, value based marketing is about delivering value to a customer. This syncs with Google’s proclaimed goal in search, which is to put users first by delivering results they deem to have value, and not just pages that match a keyword term. Keywords need to be seen in a wider context, and that context is pretty difficult to establish if you’re standing outside the search engine looking in, so thinking in terms of concepts related to the value proposition might be a good way to go.
Value Based SEO Strategy
The common SEO approach, for many years, has started with keywords. It should start with customers and the business.
The first question is “who is the target market?” The second is what that market values.
Relate what they value to the business. What is the value proposition of the business? Is it aligned? What would make a customer value this business offering over those of competitors? It might be price. It might be convenience. It’s probably a mix of various things, but be sure to nail down the specific value propositions.
Then think of some customer questions around these value propositions. What would be the likely customer objections to buying this product? What would be points that need clarifying? How does this offer differ from other similar offers? What is better about this product or service? What are the perceived problems in this industry? What are the perceived problems with this product or service? What is difficult or confusing about it? What could go wrong with it? What risks are involved? What aspects have turned off previous customers? What complaints did they make?
Make a list of such questions. These are your article topics.
You can glean this information by interviewing either customers or the business owner. Each of these questions, and its accompanying answer, becomes an article topic on your site, although not necessarily in Q&A format. The idea is to create a list of topics as a basis for articles that address specific points, and objections, relating to the value proposition.
For example, buying SEO services is a risk. Customers want to know if the money they spend is going to give them a return. So, a valuable article might be a case study on how the company provided return on spend in the past, and the process by which it will achieve similar results in future. Another example might be a buyer concerned about the reliability of a make of car. A page dedicated to reliability comparisons, and another page outlining the customer care after-sale plan would provide value. Note how these articles aren’t keyword driven, but value driven.
Ever come across a FAQ that isn’t really a FAQ? Dreamed-up questions? They’re frustrating, and of little value if the information doesn’t directly relate to the value we seek. Information should be relevant and specific so when people land on the site, there’s more chance they will perceive value, at least in terms of addressing the questions already on their mind.
Compare this approach with generic copy around a keyword term. A page talking about “SEO” in response to the keyword term “SEO” might closely match a keyword term, so that’s a relevance match, but unless it’s tied into providing a customer the value they seek, it’s probably not of much use. Finding relevance matches is no longer a problem for users. Finding value matches often is. Even if you’re keyword focused, adding these articles provides you semantic variation that may capture keyword searches that aren’t appearing in keyword tools.
Keyword relevance was a strategy devised at a time when information was less readily available and search engines weren’t as powerful. Finding something relevant was more hit and miss than it is today. These days, there are likely thousands, if not millions, of pages that will meet relevance criteria in terms of keyword matching, so the next step is to meet value criteria. Providing value is less likely to earn a click back and more likely to create engagement than mere on-topic matching.
The Value Chain
Deliver value. Once people perceive value, then we have to deliver it. Marketing, and SEO in particular, used to be about getting people over the threshold. Today, businesses have to work harder to differentiate themselves and a sound way of doing this is to deliver on promises made.
So the value is in the experience. Why do we return to Amazon? It’s likely due to the end-to-end experience in terms of delivering value. Any online e-commerce store can deliver relevance. Where competition is fierce, Google can afford to be selective.
In the long term, delivering value should drive down the cost of marketing as the site is more likely to enjoy repeat custom. As Google pushes more and more results beneath the fold, the cost of acquisition is increasing, so we need to treat each click like gold.
Monitor value. Does the firm keep delivering value? To the same level? Because people talk. They talk on Twitter and Facebook and the rest. We want them talking in a good way, but even if they talk in a negative way, it can still be useful. Their complaints can be used as topics for articles. They can be used to monitor value, refine the offer and correct problems as they arise. Those social signals, whilst not a guaranteed ranking boost, are still signals. We need to adopt strategies whereby we listen to all the signals, so as to better understand our customers, in order to provide more value, and hopefully enjoy a search traffic boost as a welcome side-effect, so long as Google is also trying to determine what users value.
Not sounding like SEO? Well, it’s not optimizing for search engines, but for people. If Google is to provide value, then it needs to ensure results are not just relevant, but offer genuine value to end users. Does Google do this? In many cases, not yet, but all their rhetoric and technical changes suggest that providing value is at the ideological heart of what they do. So the search results will most likely, in time, reflect the value people seek, and not just relevance.
In technical terms, this provides some interesting further reading:
Today, signals such as keyword co-occurrence, user behavior, and previous searches do in fact inform context around search queries, which impact the SERP landscape. Note I didn’t say the signals “impact rankings,” even though rank changes can, in some cases, be involved. That’s because there’s a difference. Google can make a change to the SERP landscape to impact 90 percent of queries and not actually cause any noticeable impact on rankings.
The way to get the context right, and get positive user behaviour signals, and align with their previous searches, is to first understand what people value.
