Bing: Here’s 7 Ways Our Image Search Is Better Than Google

Microsoft’s Meenaz Merchant, Senior Program Manager at Bing, posted a behind-the-scenes look at why Bing is a better image search engine than Google. Here are the seven example scenarios where Bing beats Google on image search, as described on the Bing blog. (1) Entity…

Please visit Search Engine Land for the full article.

Search In Pics: Google Analytics Hippo, Google Shop & New Belgium Brewery

In this week’s Search In Pictures, here are the latest images culled from the Web, showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have, and more. New Belgium Brewing Android Sign: Source: Google+ Android Backpacks:…

Please visit Search Engine Land for the full article.

Google’s new paid and organic report: An expert view


Google’s blog article highlights three areas the new report can help.

  • Discover additional paid search terms – identify keywords you are ranking for organically but not bidding on.
  • Optimise presence on high value queries – monitor your high value queries for organic results.
  • Measure changes – website improvements or AdWords changes are more easily monitored across paid, organic, and combined traffic.
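The first bullet boils down to a set difference: organic queries you rank for minus the keywords you already bid on. A minimal sketch, with purely hypothetical query lists (in practice these would come from Webmaster Tools and AdWords exports):

```python
# Hypothetical query lists; real data would come from the linked
# Webmaster Tools (organic) and AdWords (paid) accounts.
organic_queries = {"running shoes", "trail shoes", "cheap trainers"}
paid_keywords = {"running shoes", "discount sneakers"}

# Organic terms with no matching paid keyword: candidates for new bids.
unbid_candidates = organic_queries - paid_keywords
print(sorted(unbid_candidates))  # ['cheap trainers', 'trail shoes']
```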

We asked Rishi Lakhani, Dan Barker and Matt Holland for their thoughts on this new report.

Why is Google doing this?

Rishi Lakhani:

Fairly simple: to show people, using the very flawed Webmaster Tools click-through data, how much more money they could be spending.

Dan Barker:

It’s a sensible thing to do from a ‘user’ point of view, and from a business point of view.

From a user point of view: Marketing teams will often look at this data across four or five different reports in Google Analytics (or an alternative), or will manually join together this information in spreadsheets to understand performance.

Adding it into AdWords makes it a little bit easier, and frankly gives a more useful, ‘actionable’ view of the data than most other tools. The one caveat is that it will be interesting to see how accurate the ‘organic’ data turns out to be.

To get the data you have to link your AdWords and Webmaster Tools accounts. The Webmaster Tools ‘organic search’ data has been notoriously inaccurate, and rounds that inaccurate data to the nearest thousand/hundred/ten. The screengrab in the announcement seems to suggest this data is more accurate (it’s not rounded), but without actually using it and checking it, it’s tough to tell.
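To make Dan's rounding point concrete: when clicks and impressions are each rounded before a CTR is derived, the resulting figure can drift from the true one. The sketch below assumes a two-significant-figure bucketing; the actual Webmaster Tools rounding rule is not public, so this is an illustration, not the real formula.

```python
import math

def wmt_round(n):
    # Round to two significant figures; an assumed approximation of how
    # Webmaster Tools buckets counts, NOT the documented rule.
    if n == 0:
        return 0
    magnitude = 10 ** (int(math.floor(math.log10(abs(n)))) - 1)
    return int(round(n / magnitude) * magnitude)

true_clicks, true_impressions = 1842, 96314
ctr_true = true_clicks / true_impressions                           # ~0.0191
ctr_rounded = wmt_round(true_clicks) / wmt_round(true_impressions)  # 1800 / 96000 = 0.01875
print(f"true CTR {ctr_true:.4f} vs rounded CTR {ctr_rounded:.4f}")
```

Even a modest rounding step shifts the derived CTR, which is why unrounded data in the new report would matter.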

From a business point of view this benefits Google too: Showing users “hey, you’re getting all this organic search traffic, why not pay us a little bit of money for even more traffic?” is a really useful thing to be able to do.

Had they presented this information in Google Analytics, you’d still be two steps away from actually spending the money to increase PPC traffic; here in AdWords you can bump up bids instantly, add extra keywords to campaigns, etc.

Matt Holland:

The new report will enable search marketers to understand the opportunity when paid and natural search are combined, and consequently amend their bidding strategies for paid keywords. There will be additional analysis required, but search marketers will have a more integrated view of performance, which will influence their strategies for both paid and natural search moving forward.


How will this affect search marketers, day-to-day? 

Rishi Lakhani:

Too early to tell, but PPC agencies will have a ball. Google’s case study cites an 18% uplift in click-through rate (CTR) with combined SEO and PPC. So they are trying to get businesses to pay more by running more ads. It’s really that simple.

There may be other implications, but it’s early days.

Dan Barker:

The big question here is around ‘not provided’ search data. Google have been hiding a lot of keywords from users over the last couple of years. At present, there is no accurate (or even semi-accurate) way to get that data. If these reports are better than the usual Webmaster Tools data, it may become a standard place for search marketers to go to check organic performance.

Aside from the ‘not provided’ issue, and the ‘accuracy’ issue (both of which are pretty big), the data here is mostly available elsewhere, but most users would have to do some manual work to join it together and analyse. From that point of view, if users have been joining that data together, this will save them time; if they haven’t been using that data this is another tool to help them understand where their site’s traffic is coming from, how it fits together, and how they can improve it.

What impact, if any, will this have on search agency client relations?

Rishi Lakhani:

More money for paid search agencies, so better relations? ;( 

Dan Barker:

From a search agency perspective it’s interesting. If you’re an agency managing both SEO & PPC, it helps you understand where you could steal budget from one area to bump up another, or where you have gaps between your different channels that can easily be plugged.

Where companies use separate PPC and SEO agencies, this report pits them against each other a tiny bit more. From an agency/client point of view, it’s another area where clients will be asking for reports and, hopefully, more testing around the effects of PPC vs SEO vs both together. Two of the numbers here that you don’t often see elsewhere are ‘listings per query’ and ‘total share of search clicks’. Those are both useful for judging yourself against the market, and (for clients) pushing agencies to do more. 

Matt Holland:

Clients who have separate agencies managing their paid and natural search will need to ensure that they link their AdWords and Webmaster Tools accounts together. However, issues may arise where the paid search agency has the natural search data in AdWords and is suggesting joined-up strategies based on data that the natural agency will not have access to.

The paid search agency could give read-only access to the natural agency, in order to see this data, but it remains to be seen if this will be necessary.


Positives/drawbacks?

Rishi Lakhani:

The major drawback I see is for SEO. By using data from Webmaster Tools, you are using flawed data to overvalue PPC. CTR in Webmaster Tools is a joke, in my opinion; however, it is just an opinion, and maybe this tool will help confirm or disprove it.

Either way, I assure you PPC data will look better, which means more money will be ploughed directly into the Goog, and that may leave even less investment in SEO.

On the other hand, it may be interesting to finally be able to judge position/keyword/CTR for SEO, if the data is correct. That may actually grow the need to fine tune top tier rankings. 

Dan Barker:

If the data is anywhere near accurate, this is really useful for clever marketers. There are lots of nice, simple things you can test here. One of the big messages Google have wheeled out over the years is “when you’re present for both PPC and SEO traffic, 1+1=3”. This puts that data in the hands of AdWords users.

The only major drawbacks are:

Drawback A) it’s yet another report to look at, yet another source of data to try to align with everything else, and yet another thing to learn about, decide whether it’s useful, and fit into your process. It adds a little bit of complexity.

Drawback B) is slightly more theoretical, and is related to encouraging PPC spend. If everyone starts using this, and takes the nudge to spend extra on PPC wherever they’re having success via organic search, it means the PPC market gets a little bit more competitive, bids bump up a little bit more, etc.

Matt Holland:

A drawback is that the data in Webmaster Tools currently only goes back 90 days. It would help to be able to look back further when advertisers are trying to analyse data to plan ahead for seasonal events.

Shatner’s Tweet Hints That He May Like Bing Just As Much As Bing Likes Captain Kirk

In a brief exchange on Twitter yesterday evening, the original Captain Kirk gave Bing a shout-out, suggesting the Microsoft search engine could possibly enlighten a fellow Twitter user. William Shatner responded “I hear Google or Bing helps with that problem” after @Andypops1 tweeted…

Please visit Search Engine Land for the full article.

Report: Google’s Not Provided Reached 49% & Much Higher In Technology Industry

BrightEdge has released a report showing that for the 8,400 brands they tracked over the last quarter, 49% of the queries collected did not provide (i.e. “not provided”) search query data due to Google’s secure search. 49% was the average; it is, in fact, higher for the technology industry…

Please visit Search Engine Land for the full article.

Penalized & Sad: When To Abandon The Sinking Ship

Sometimes the captain can’t afford to go down with the ship, no matter the temptation or emotional investment. Every few weeks, I receive another hopeless phone call from another desperate webmaster. Since Penguin, I’ve seen more ships go down than I care to count. Websites that have delved…

Please visit Search Engine Land for the full article.

Solving the Pogo-Stick Problem – Whiteboard Friday

Posted by randfish

Getting your site to display at the top of a SERP is quite an accomplishment, but it also takes quite a bit of effort to keep it there. If people click through to your site only to click their back buttons and look for another result, the search engines are going to catch on, and you could fall in the rankings.

In today’s Whiteboard Friday, Rand helps us broaden our thinking to satisfy the searchers and keep them from pogo-sticking back to the SERP.

Whiteboard Friday – Solving the Pogo-Stick Problem

Pro tip: Learn more about on-page optimization for content and UX at Moz Academy.

For reference, here’s a still image of this week’s whiteboard:

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. Today I want to talk to you about the pogo-sticking problem.

So here’s the story. Basically search engines, Google included, use a lot of different kinds of data for their ranking algorithms, but one of the pieces that’s in there, we don’t know exactly how big it might be, but it’s certainly possible that it’s sizeable, is what’s called pogo sticking. They measure this feature or this occurrence where someone performs a search. I performed a search here for IT consultants, and there are a few listings that come up. I click on “IT Boston.” It takes me to IT Boston’s website, and then I decide, maybe in the first five or ten seconds, “You know what? This site is not solving my problem. This isn’t really what I wanted,” and I go right back to the same search result.

Either I click back or I search for it again or I search for something different, and then I go and click on other results. Maybe I click on this “Is IT Consulting Dead?” It’s sort of a link bait article from some news source, BuzzFeed maybe, click on that, go to that page, and I stay on it and I don’t come back to the search result.

Google measures these kinds of things. So does Bing. They measure this pogo-sticking, and they come up with, essentially (this is a very simplistic representation of what actually happens), figures like: X% of people pogo-stick away from IT Boston in their first 5 seconds of visiting the site, Y% do it for this BuzzFeed page, and Z% do it for IT 101. They then calculate some average pogo-sticking rate, weighted by ranking position, for this particular search result.
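The weighted average Rand describes can be sketched in a few lines. Both the pogo-stick rates and the position weights below are made-up numbers; the transcript doesn't specify how the engines weight by rank, so a click-through-style decay is assumed here:

```python
# (result, pogo-stick rate, position weight) per SERP listing.
# All figures are hypothetical; weights decay with ranking position.
results = [
    ("IT Boston",              0.20, 0.50),  # rank 1
    ("Is IT Consulting Dead?", 0.10, 0.30),  # rank 2
    ("IT 101",                 0.40, 0.20),  # rank 3
]

# Position-weighted average pogo-stick rate across the SERP.
weighted = sum(rate * w for _, rate, w in results) / sum(w for _, _, w in results)
print(round(weighted, 3))  # 0.21
```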

Here’s the problem. For every search result, there’s some different pogo-sticking rate. But great pages and sites tend to have the trait that they’ve got really low pogo-sticking rates. If IT Boston is a great result, people click it and they stay. Their search query has been satisfied. Google likes that. That means that a searcher is made happy, and they’re not coming back and doing other searches and clicking other results. Sometimes this might be okay. Maybe there are some sorts of searches where Google says, “Oh, lots of people do click multiple times, and lots of people do bounce back and forth and it’s fine.” But for the vast majority of searches this is really important to get right. So I have some tactical tips for you.

If you’ve got a pogo-sticking problem, a high bounce rate, people are going back to the search results, clicking on your competitors’ links, that kind of thing, the number one thing you can do is get in the searcher’s head. This is different, or at least might be different, from getting in your customer’s head. You might say, “Hey, we’ve designed this excellent landing page. It’s really focused. If the 10% of people who search, who are our kind of customers, come to this page, they’re going to convert.”

The challenge there is you’ve got to think bigger. You have to think about all the searchers, the 90% of the searchers who may not be your customer and how do you answer their query, because otherwise you’re probably going to be falling in those search results. What questions do those people have? What makes them engage versus leave? What is it, when this person performs a search, that they want to know? And if you don’t know, you can ask.

One of my top recommendations for people who have just kind of a crummy page is, “I want you to go out and survey people in your office, people who work with you, people who are long-time customers, people who are in your network. I want you to survey them, and I want you to ask them, ‘Imagine you have performed a search for X. Tell me the first, most important thing you’re looking for. Now tell me the second thing that you’d probably be interested in, and now tell me the third thing.’ ” People will just free-form leave a couple phrases or sentences in those boxes, send it back to you. Boom. Now you know what people want. If you don’t have that sort of searcher empathy built into your head already, you can do it this way, through the surveying system, and then you can make a page that people are going to love. You can answer those questions.

Number two, I see a lot of search results out there that are missing design and UX elements that are critical to success. If you’ve got this crappy, crummy 1990s design aesthetic going on or even a more updated thing, but it’s just not a very usable website, the navigation’s poor, the images are poor, the content quality is poor, you’ve got to work on that. If you can’t say with conviction that you have the highest quality, most usable, beautiful, high visual-quality page in the results, get to work man. Get to work. This stuff is really important.

If you’re looking, by the way, one of my top suggestions is to check out Dribbble.com. That’s D-r-i-b-b-b-l-e.com. Wonderful designers are available on there. Some of them are very expensive. Some of them are less expensive. Great resource to check out.

Number three, the last thing I’ll mention on tactical tips for this is load speed and device support. A lot of times I do see this problem where someone goes to a page and then after two or three seconds if something hasn’t loaded, they go back. You can work on this. Even if you have a relatively robust page, you can get elements to load in those critical first second, second and a half time frames. Check out developers.google.com/speed/pagespeed. They’ve got an analysis tool and a system you can walk through to make sure that that works.
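A crude way to sanity-check the load-speed point is to time the initial HTML fetch. Note this only measures server response time, not full render time, so it's a floor, not the whole picture; the URL in the example is a placeholder:

```python
import time
import urllib.request

def response_time(url, timeout=10):
    # Seconds to fetch the raw HTML; NOT a full render-time measurement.
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
    return time.monotonic() - start

# Placeholder URL: flag anything slower than Rand's ~1.5-second budget.
# elapsed = response_time("https://example.com/")
# print("too slow" if elapsed > 1.5 else "ok")
```

For a real audit, the PageSpeed tool Rand mentions goes much further, covering render-blocking resources, image compression, and the like.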

You should also be multi-device compliant. Make sure that if you don’t have responsive design, you at least have a mobile-friendly site, an iPad-friendly site. I do love responsive design. I recommend it. But this becomes a challenge too, because remember, if lots of people are searching on mobile and they’re bouncing back because your page is slow or it doesn’t work with a mobile device, you’re in trouble. Those stats are going to hurt you in the results.

All right, everyone. I hope you’ve enjoyed this edition of Whiteboard Friday. We’ll see you again next week. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!