Buying A Business

As Google makes life more difficult for SEOs, pure-play SEO business models, such as affiliate and Adsense, can start to lose their shine. Google can remove you from Adsense without warning, and the affiliate model has always had hooks.

One of the problems with affiliate and Adsense has always been that it is difficult to lock in and build value using these models. If the customer is “owned” by someone else, then a lot of the value of the affiliate/Adsense middle-man lies in the SERP placement. When it comes time to sell, apart from possible type-in domain value, how much intrinsic value does such a site have? Rankings are by no means assured.

So, if these areas are no longer earning you what they once did, it makes sense to explore other options, including vertical integration. Valuable online marketing skills can be readily bolted onto an existing business, preferably to a business operating in an area that hasn’t taken full advantage of search marketing in the past.

Even if you plan on building a business as opposed to buying, looking at businesses for sale in the market you intend to enter can supply you with great information. You can gauge potential income and the level of competition, and a thorough business analysis can help you discover the hidden traps before you experience them yourself. If there are a lot of businesses for sale in the market you’re looking to enter, and their figures aren’t that flash, then that’s obviously a warning sign.

Such analysis can also help you formulate your own exit strategy. What would make the business you’re building look attractive to a buyer further down the track? It can be useful to envision the criteria for a business you’d like to buy, and then analyse backwards to give you ideas on how to get there.

In this article, we’ll take the 3,000 ft view and look at the main considerations and the type of advice you’ll need. We’ll also take a look at the specifics of buying an existing SEO business.

Build Or Buy?

There are a number of pros and cons for either option and a lot depends on your current circumstances.

You might be an existing owner-operator who wants to scale up, or perhaps add another revenue stream. Can you get there faster and more profitably by taking over a competitor, rather than scaling up your own business?

If you’re an employee thinking of striking out on your own and becoming your own boss, can you afford the time it takes to build revenue from scratch, or would you prefer instant cashflow?

The Advantages Of Building From Scratch

Starting your own business is low cost. Many online businesses cost next to nothing to start. Register the business. Open a bank account. Fill out a few forms and get a business card. You’re in business.

You don’t need to pay for existing assets or a customer base, and you won’t get stuck with any of the negatives an existing business may have built up, like poor contracts, bad debts and a tainted reputation. You can design the business specifically for the market opportunity you’ve spotted. It’s yours. It will reflect you and no one else, at least to start with. The decisions are yours. You don’t have to honor existing contracts, or deal with clients and suppliers you had no part in choosing.

In short, you don’t have legacy issues.

What’s not to like?

There is more risk. You don’t yet know if your business will work, so it’s going to take time and money to find out. There are no guarantees. It can be difficult to get funding, as banks like to see a trading history before they’ll lend. It can be very difficult to hire the right employees, especially early on, as highly skilled people don’t tend to favor uncertain startups unless they’re getting an equity share. You have to build a structure from scratch. Is the structure appropriate? How will you know? You need to make a myriad of decisions, from advertising to premises to wages to pricing, with little to go on apart from well-meaning advice and a series of hunches and experiments. Getting the numbers right typically involves a lot of trial and error, usually error. You have no cashflow. You have no customers. No systems. No location.

Not that the downsides should stop anyone from starting their own business. If it was easy, everyone would do it, but ask anyone who has started a business, and they’ll likely tell you that sure, it’s hard, but also fun, and they wouldn’t go back to being an employee.

There is another option.

Buy It

On the plus side, you have cash flow from day one. A lack of cash flow is the killer of any business. You can have customers signed up. People may be saying great things about you. You may have a great idea, and other people may see that it is, indeed, a great idea.

But if the cash flow doesn’t turn up on time, the lights go out.

If you buy an existing business with sound cashflow, you not only keep the lights on, you’re more likely to raise finance. In many cases, the seller can finance you. If that’s the case, then for a small deposit you get the cashflow you need, based on the total business value, from day one.

You’ve got a structure in place. If the business is profitable and running well, then you don’t need to experiment to find out what works. You know your costs, how much you need to spend, and how much to allocate to which areas. You can then optimize it. You have customers, likely assistance from the vendor, and the knowledge from existing suppliers and employees. There is a reduced risk of failure. Of course, you pay a price for such benefits.

To buy a business, you need money. What’s more, you’re betting that money on someone else’s idea, not your own, and it can be difficult to spot the traps. You can, of course, reshape and respin the business in your own image, but you can get stuck with a structure that wasn’t built to your specifications, and you might not like some of the legacy issues, including suppliers, existing contracts or employees.

If you decide buying a business is the right thing for you, then you’ll need good advice.

Advice

According to a survey conducted by businessforsale.com, businesses can take an average of nine months to sell:

  • 28% of brokers said within 6 months
  • 31% of brokers said within 9 months
  • 21% of brokers said within 12 months
  • 10.5% of brokers said that more than 12 months was required to sell a business
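
The survey doesn’t say how the nine-month average was derived, but we can sanity-check it by weighting each bucket by an assumed midpoint. The midpoints below are illustrative guesses, not survey figures, and the result is normalised over the roughly 90.5% of brokers the buckets cover:

```python
# Broker survey buckets: (share of brokers, assumed midpoint in months).
# The midpoints are illustrative guesses; the survey only gives ranges.
buckets = [
    (0.28, 4.0),    # "within 6 months"
    (0.31, 7.5),    # "within 9 months" (roughly 6-9)
    (0.21, 10.5),   # "within 12 months" (roughly 9-12)
    (0.105, 15.0),  # "more than 12 months"
]

covered = sum(share for share, _ in buckets)            # ~0.905
weighted = sum(share * months for share, months in buckets)

# Normalise by the covered share, since the buckets don't sum to 100%.
estimate = weighted / covered
print(f"Implied average time to sell: {estimate:.1f} months")
```

With these assumed midpoints the estimate lands around eight months, in the same ballpark as the quoted nine-month average; different midpoint assumptions shift it a month or so either way.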

Buying a business is more complicated than buying an asset, such as a website. You could buy only the assets of a business – more on that shortly – but often a business is sold as a going concern, which means you may take on all of its potential liabilities, too.

Hence the need for sound advice in three main areas. Assemble a team to cover legal, accounting and business advisory.

Legal

Buying a business, like buying a house, is a legal transaction involving a number of legal issues. The key issues are that you want to know exactly what you’re buying, and that you won’t be left with any unexpected liabilities. You also want to make sure the seller won’t compete with you by re-entering the market after you buy.

One of the first things I do with clients is make sure they understand what they are buying. They need to be able to tell me if they are buying assets, such as a customer list and equipment, or the business, with the warts and ugliness that come with it.

There are a number of potential traps:

Among the things to worry about when you buy an existing business: undisclosed debts, overstated earnings, poor employee relations, overvalued inventory and pending lawsuits, to name a few. Hidden liabilities can exist in all sorts of areas – from land contaminated with toxic chemicals, to accounts receivable that look solid but prove to be uncollectible, to inventory that’s defective or dated.

There’s an important distinction between buying the assets of a business and buying a business. Buyers typically want to buy the assets, such as a customer list, supply contracts, or plant. Sellers typically want to sell the entire business entity.

If you buy only a corporation’s assets, you don’t assume its liabilities, including taxes. If you buy a corporation’s shares of stock, however, you end up with both its assets and liabilities – including known and unknown taxes. An example of an unknown tax debt would be one that resulted from an IRS audit that has not yet begun. The seller of the corporate shares is released from all corporate debts unless he personally guarantees them or agrees to be liable for them after the transfer.

That is an important distinction. However, most smaller business sales are likely to be asset sales, as the businesses are often sole proprietorships or partnerships.

There are also financial implications in terms of tax write-offs.

Accountant

There are two main areas accountants look at when evaluating a business: the financial history, and the tax ramifications.

Advisors often recommend looking at more than just the last year’s books:

In order to know whether or not the asking price for the business is fair, it is very important that you look through the books of the company over a number of financial periods. Don’t make the mistake of asking for just last year’s accounts. You should have at least three and preferably five years of records for the business. If it is halfway through the financial year, ask for an interim set of accounts for this year. You need to be assured that trading conditions have not deteriorated from the last financial year. If you are looking to put your hard-earned money (and others’ equally hard-earned money) into a business, you want to make sure that the business is not going backwards. You need to look for evidence of year on year growth at acceptable margins. Remember, any company can show regular growth, but it must be profitable. Fire sales can increase revenue with little or no impact on margin or, worse, the revenue can be unprofitable.
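
The year-on-year check the advisor describes is easy to mechanise once you have several years of accounts in front of you. A minimal sketch, with entirely hypothetical revenue and profit figures:

```python
# Hypothetical five years of revenue and net profit (most recent last).
revenue = [420_000, 460_000, 510_000, 540_000, 600_000]
profit  = [ 55_000,  60_000,  68_000,  70_000,  81_000]

# Year-on-year growth should be positive, and margins should hold up --
# growing revenue at collapsing margins can signal a fire sale.
for year in range(1, len(revenue)):
    growth = (revenue[year] - revenue[year - 1]) / revenue[year - 1]
    margin = profit[year] / revenue[year]
    print(f"Year {year}: revenue growth {growth:6.1%}, net margin {margin:5.1%}")
```

In this invented example, revenue grows every year while the net margin stays around 13%, which is the pattern the quote says to look for; flat or falling margins alongside rising revenue would be the warning sign.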

The other main area is tax.

Again, this is where the difference between assets and equity is important. There are tax advantages in buying assets, as you can depreciate based on the purchase price:

Property acquired by purchase. The depreciable basis is equal to the asset’s purchase price, minus any discounts, and plus any sales taxes, delivery charges, and installation fees. For real estate, you can also include costs of legal and accounting fees, revenue stamps, recording fees, title abstracts/insurance, surveys, and real estate taxes assumed for the seller.
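
That basis rule is simple arithmetic. A minimal sketch with hypothetical figures (check the actual IRS rules, and your accountant, before relying on any of this):

```python
def depreciable_basis(purchase_price, discounts=0.0, sales_tax=0.0,
                      delivery=0.0, installation=0.0):
    """Basis = purchase price, minus discounts, plus sales taxes,
    delivery charges and installation fees (per the rule quoted above)."""
    return purchase_price - discounts + sales_tax + delivery + installation

# Hypothetical equipment purchase:
basis = depreciable_basis(20_000, discounts=1_000, sales_tax=1_500,
                          delivery=400, installation=600)
print(basis)  # prints 21500
```

The real-estate extras the quote mentions (legal fees, recording fees, surveys, and so on) would simply be further additive terms.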

We’ve barely scratched the surface, and your financial advice will be considerably more detailed, taking into account multipliers, profit and revenue, and more. Business valuation is a specialist area, and if you want to read more on this topic, I found The Small Business Valuation Book a good resource.

Business Advisor

After lawyers and accountants, the third member of your evaluation team should be a qualified business advisor who is familiar with businesses in your area of interest.

A thorough competitive analysis should be a first step. Where does this business sit in relation to existing competition? How easy is it for new competitors to enter the market? How much risk is involved?

Whenever you buy an existing business and look at its records, you’re looking at the past. There’s no guarantee things won’t change going forward. If you’re negotiating to buy a business and you think the seller is giving you a great deal, be very suspicious – there’s probably something heading down the road at 90 miles an hour that will blow this business apart when it hits.

The same applies if you plan to build a business from scratch, the difference being you probably won’t have to risk as much up front.

It can also pay to go through a broker acting on your behalf, as opposed to the seller. Brokers can:

  • Prescreen businesses for you. Good brokers turn down many of the businesses they are asked to sell, whether because the seller won’t provide full financial disclosures or because the business is overpriced. Going through a broker helps you avoid these bad risks.
  • Help you pinpoint your interest. A good broker starts by finding out about your skills and interests, then helps you select the right business for you. With the help of a broker, you may discover that an industry you had never considered is the ideal one for you.
  • Negotiate. The negotiating process is really when brokers earn their keep. They help both parties stay focused on the ultimate goal and smooth over any problems that may arise.
  • Assist with paperwork. Brokers know the latest laws and regulations affecting everything from licenses and permits to financing and escrow. They also know the most efficient ways to cut through red tape, which can slash months off the purchase process. Working with a broker reduces the risk that you’ll neglect some crucial form, fee or step in the process.

Buy An Existing SEO Business

If you want to build an SEO business, here’s a good idea of what’s involved in building one up to scale:

When you are building your agency, you need to focus on getting clients that pay you 6 figures a year. It’s hard to build a profitable agency and provide great results when someone only pays you a few grand a month.

There’s a lot of competition in this market because there are no real barriers to entry. Anyone can call themselves an SEO and anyone can advertise such services. The result is that it can be pretty difficult to differentiate yourself.

The advantages of buying an SEO business are the same as for buying any other type of business: you get instant cashflow, a client list, and a reputation. The standard analysis, as outlined in this article, applies. Evaluate financials, legal issues and position in the market, the same as for any other business.

If you’re considering buying an SEO business, you need to pay particular attention to reputation. It’s a market where, I think it’s fair to say, there is a significant level of hype. Customers are often oversold on benefits that don’t eventuate, i.e. a focus on rankings that don’t result in leads or customers.

Reputable SEO businesses are unlikely to have a high level of customer churn. Look for customer lists where the customers have been with the agency for a good length of time and are ordering more services. Look for locked-in forward contracts. It’s pretty easy for other SEOs to poach customers by offering them lower prices. Again, this is why reputation and evidence of high service levels are important.
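
Churn itself is easy to quantify once you have the client list in front of you. A minimal sketch, with invented client names and tenures:

```python
# Hypothetical client list: (client, months retained, still active?)
clients = [
    ("Acme Plumbing",  36, True),
    ("Harbour Dental", 28, True),
    ("CityWide Legal",  6, False),
    ("GreenGrocer",    14, True),
    ("QuickFit Gyms",   4, False),
]

started = len(clients)
active = sum(1 for _, _, live in clients if live)
churn = (started - active) / started            # crude: lost / started
avg_tenure = sum(months for _, months, _ in clients) / started

print(f"Churn: {churn:.0%}, average tenure: {avg_tenure:.1f} months")
```

A real due-diligence pass would look at churn per year and revenue-weighted churn (losing one large client matters more than several small ones), but even this crude cut quickly separates a sticky client base from a leaky one.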

One valuable aspect, as Neil alludes to in his article, is relationships:

In the short run you will lose money from business development, but in the long run you’ll be able to make it up. The quickest way for you to increase your revenue is to be the outsourced arm of bigger agencies. As an SEO company, look for ad agencies to partner with, as there are way bigger ad agencies than SEO agencies. Feel free to cold call them, offer to help them for free with their own website, and if you do well they’ll drive a lot of clients to you.

Look at how the agency gets work. If it comes from established, larger advertising agencies, then these relationships are valuable. They typically result in a steady flow of new work without the need for new advertising spend. Look at the promises that have been made to clients. For example, ongoing payment may rely on performance metrics, such as ongoing rankings.

Further Resources:

Hopefully this article has given you some food for thought. If you’re capital rich and time poor, then buying an established business can be an attractive proposition. Here are some of the sources used in this article, and further reading:


Keyword Research Plus

If we’re targeting keywords, getting good traffic as a result, but not converting as much traffic as we’d like, then it might be due to a market validation problem.

Basic keyword research typically involves looking at the nature of the web site, creating a list of terms that describe the offers being made, expanding the keyword list out using keyword research tools, and then targeting those keyword terms.

However, if that’s all a search marketer does, and fails to get conversions and engagement as a result, then they might be asking the wrong questions.

Asking The Right Questions

Consider Coca-Cola.

Coca-Cola undertook extensive market testing and research before introducing “New Coke”, yet New Coke failed miserably. Its competitor, Pepsi, used a blind taste test, asking people whether they preferred Coke or Pepsi. Coca-Cola ran its own testing, and the results were not good: the majority preferred Pepsi.

However, asking people to take just one sip and compare was asking the wrong question. People may have preferred the first sip of Pepsi, but they preferred the less sweet Coke when they consumed an entire glass. In “Inside Coca-Cola: A CEO’s Life Story of Building the World’s Most Popular Brand”, Neville Isdell also postulates that New Coke failed because original Coke was iconic. It was linked to history. It wasn’t just about the taste of the first sip; it was also about the place of Coke in culture. There was a lot more to it than the first sugary hit.

Coca-Cola asked the wrong questions. Getting the context right was essential to understanding the answers.

If you’ve designed relevant landing pages but aren’t getting the conversion rate you desire, no matter how much split/run testing you do, or if you’ve managed to rank #1 for your chosen term and written some great copy, but the traffic just keeps bouncing away, then it might be a problem with positioning in the market.

These market validation ideas apply mostly to search marketers who build their own sites, but they’re also applicable to marketers working on client sites if those client sites have poor targeting. Bolting on search marketing won’t do much good if a site is making substandard or redundant offers.

Market Validation

“Market validation” is a concept defined by Rob Adams in his book “If You Build It They Will Come”. It’s the process of figuring out whether a market exists before you go to the expense and time of serving that market. Market validation is typically used by entrepreneurs to determine whether they should enter a market; however, the more general aspects can also be applied to search marketing.

Two aspects are particularly useful for search marketers, especially those who care what happens after the click: a market analysis – to determine what stage of the market the business is at – and a competitive analysis. Armed with this information, they’ll know how best to pitch the offer, which, combined with effective copywriting and calls to action, should increase engagement and conversion.

Market Stage

Entrepreneurs are concerned with the growth rate of a market sector.

Typically, entrepreneurs want to get into fast-rising new markets, as opposed to mature or sunset markets. It’s difficult for new entrants to compete with incumbents, as doing so involves high costs: it is estimated that taking a customer off a competitor typically costs three to ten times as much as acquiring a new customer.

Try to figure out the market’s stage of growth. If the site operates in a mature market with multiple competitors, then aspects such as price and features are important. In a mature market, the site you’re working with should be competitive on these aspects, otherwise a top ranking position and compelling copy won’t help much, as the buyer will likely be comparing offers.

Similarly, if the client is competitive in these areas, then it pays to push these aspects hard in your copy and calls to action. For example, if a mobile phone site focuses, first and foremost, on buyer education, it probably won’t do as well as a site that focuses on price and features. Generally speaking, buyers in this mature market sector don’t need to be educated on the merits of a mobile phone. They’re probably mainly interested in looks, availability, price and features.

If your client is in a fast growing new market, then there’s typically a lot more buyer education involved. People need to be convinced of new offers, so consider making your copy more education focused in these niches.

For example, when the iPhone came out, it didn’t have any direct competition. Apple didn’t need to push hard on price or features – there were cheaper phones, and there were phones that could do some things better, but there was nothing directly comparable in the smartphone market. Only recently, now that the market has matured, has Apple focused on price with the introduction of lower-priced entry-level phones. This is a characteristic of more mature markets with high levels of competition and price pressure.

Here’s an example of mobile phone makers targeting a submarket of a mature market, differentiated by age:

Since mobile phone penetration has reached almost saturation levels in Europe and the United Kingdom, mobile service providers are focusing attention on the 55–65 and 65-plus segment to improve usage and penetration. Their high disposable incomes and their ability to devote time to new habits are seen as a lucrative market opportunity. At the other end of the demographic scale, Red Bull has built a following among youth worldwide.

Identify what stage the business is at, and adjust your approach based on the strengths or weaknesses of that market.

Market Segment

The more specific the keyword, the more the keyword is likely to identify subcategories within broader markets. For example, a travel agent could target a general term like “hotels in Spain”, or the more specific “luxury hotels in Marbella”.

Look for competitive strengths a business may have in a submarket, and consider focusing search marketing efforts on these areas first. An easy win builds confidence. Is this submarket fast-growing? Even better: you build both confidence and revenue. It may lead to more of the business being refocused around these submarkets.

Are there submarkets with decent keyword volume that are nonetheless mature? Ensure you have some competitive advantage in terms of pricing and features before devoting too much time to targeting them.

Even if the traffic isn’t particularly high in some submarkets, at the very least you’ll have earned the engagement metrics Google likes to see, and likely built some brand reputation in these submarkets that can then be leveraged into others.

Lifecycles

Determine where your audience sits in terms of the product lifecycle.

Are you targeting keyword areas relating to new products? If so, you’re most likely talking to early adopters. Therefore, the pitch is likely to involve aspects such as education, being first, desirability, being forward-thinking, and standing out from the crowd. The pitch is less likely to focus on negating buyer risk.

If you’re dealing with a business later in the lifecycle, then you’ll likely be talking more about price and comparing and differentiating features.

Competitive Analysis

Competitive analysis is perhaps the most important, yet often overlooked, aspect of SEO/PPC.

Top rankings can be a waste of time if direct competitors are more competitive on features, price, service and brand recognition. Buyers will compare these aspects, clicking from link to link, or will use third-party comparison sites, a sure signal of a mature market.

Find out what competitors are doing. And what they’re not doing. Try creating a competitive matrix:

A competitive matrix is an analysis tool that helps you establish your company’s competitive advantage. It provides an easy-to-read portrait of your competitive landscape and your position in the marketplace. The matrix can be just a simple chart. In the left column, you list the main features and benefits of your product or service. On the top row, you list your company and the names of your competitors. Then fill in the chart with the appropriate information for each company. For example, if you own a dry cleaning service, you might list the different services you offer or the quick turnaround you provide on items (24 hours), and then note how your competitors fail at these features.
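
The matrix described in the quote can be mocked up in a few lines. This sketch follows the dry-cleaning example; the features and competitor names are invented:

```python
# Hypothetical competitive matrix for the dry-cleaning example above.
features = ["24-hour turnaround", "Free pickup", "Eco-friendly solvents"]
companies = {
    "Us":           [True, True, True],
    "Competitor A": [True, False, False],
    "Competitor B": [False, True, False],
}

# Print features down the left column, companies across the top row.
header = f"{'Feature':<22}" + "".join(f"{name:>14}" for name in companies)
print(header)
for i, feature in enumerate(features):
    row = f"{feature:<22}"
    row += "".join(f"{('yes' if companies[n][i] else 'no'):>14}" for n in companies)
    print(row)
```

The same structure works for keyword-level analysis: swap the feature rows for keyword areas and mark which competitors are targeting each one.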

If there are competitors, then obviously a market exists. Compare your competitors against as many keyword terms as possible, and see how well they’re doing in each keyword area – not just in terms of ranking, but in terms of their offer and the maturity of the market. If there are numerous competitors gunning for the same keyword terms, then determine whether your offer is strong enough that, should you beat their rankings, it can still stand up to a side-by-side feature, service and price comparison. Is there a submarket in which they are weak? Would you be better off devoting your time and energy to this submarket?

Examine their pitch. In any competitive niche, the pitch made by those occupying the top three spots in AdWords over time is likely to be the most relevant to the audience. If their pitch weren’t relevant, it’s unlikely they could remain in those positions, given quality score metrics and the financial strength needed to keep outbidding competitors. There are exceptions, e.g. competitors running at a loss for some reason, but generally it’s safe to assume they’re doing something right.

Are they talking price? Features? Are they using video? Are they using long copy? Are they using people in their photographs? How big is their text? What’s the path to ordering? Do they highlight a phone number, or do they bury it? Pull their offer, and presentation of that offer, apart.

Make a note of everything the top three or four AdWords sites are doing, and then emulate the commonalities. This gives you a strong baseline for further experimentation, testing and positioning on the SEO side. Keep in mind it’s not good enough to beat these competitors by a small margin. Incumbents often have brand awareness and established customer bases (high trust levels), so to counter that, yours should be considerably better. A “better offer” can mean superior price or features, but it can also be better service levels, a more specific solution, or a fresh new angle on an existing solution.

Also consider substitutions.

If a buyer can substitute a product or service, then this offers a potential opportunity. For example, let’s say a buyer has a transportation problem. They could buy a car to solve that problem. Or they could lease a car on a pay-per-drive model. The pay-per-drive model is a substitution threat for car sellers. If you take a step back and determine what problem the visitor is trying to solve, as opposed to leaping to conclusions about the obvious keyword that describes that solution, then you might find rich, unmined substitution keywords. Perhaps your offer can be repackaged slightly differently in order to mine a substitution keyword stream.

Of course, people don’t always buy on price and features, even if the market is mature, but they still need a compelling value proposition. One example is organic produce. It’s typically more expensive, and the “features” are the same, but the context is different. The produce is sold on environmental values.

So look for value propositions that customers might respond to but competitors aren’t taking advantage of. Or extend the ones they use: now that Google is coming from behind with its own Motorola phones, it is countering Apple’s “Designed in California” with “Made in America”.

Summary

There are many links on the page a searcher can click. The more mature the market, the more relevant the search results they’re likely to encounter, and those results, both PPC and natural search, are likely to match their intent. At that point, getting the offer right is important. If you can’t compete in terms of offer, try looking for submarkets and position there instead.

I hope this article has given you some new angles to explore. A good reference book on the topic of market validation, and the inspiration for this article, is “If You Build It They Will Come”, by Rob Adams.


Why Webmasters Pass Their Margins Onto the Googleplex

In previous articles, we’ve looked at the one-sided deal that has emerged between search engines and publishers. Whilst there is no question that search engines provide value to end users, it’s clear that they are taking the lion’s share of the value when it comes to web publishing.

That isn’t sustainable.

The more value stripped from publishing, the less money will be spent on publishing in future. In this respect, the search engines’ current business model undermines their own long-term value to end users.

In this ecosystem, the incentive is to publish content that is cheap to produce. Content might also be loss-leader content that serves as a funnel leading to a transaction. Some of the content might be advertorial, the result of direct sponsorship, and may well include paid links. Curiously, it has been suggested by a Google rep that “…you blur the lines between advertising and content. That’s really what we’ve been advocating our advertisers to do”. Some of it might be “the right kind of native”, courtesy of Google DoubleClick. Some of the higher value content tends to be a by-product of the education sector; however, the education sector may be the next in line to suffer a commodification of value.

There is little return to be had in producing high value content and making it publicly available for free, with no strings attached, so naturally such content is disappearing behind paywalls and taking other forms.

YouTube

Some YouTube producers are rebelling.

In a recent post, Jason Calacanis outlines the problem for video content producers. He maintains that Google’s cut of the rewards amounts to 45%, and that this cut simply isn’t sustainable for video producers as their margins aren’t that high.

Successful media businesses today have margins in the 20% to 50% range–if they hit profitability. That means if you give a partner 45% off the top, you have no chance of breaking even (emphasis mine). In fact, this absurd revenue is so bad that people have made amazingly clever strategies to skirt them, like VICE producing the Snoop Lion documentary and Grace Helbig becoming the face of Lowe’s Hardware. A full 100% of that money goes to the content creator — boxing out YouTube. More on this later.

Sure, it can *feel* like you’re making money, but when you look across the landscape of YouTube businesses — and I won’t call anyone out here — it’s very, very clear they are losing millions and millions of dollars a year.

YouTube doesn’t have to worry because they simply lop off 45% of the revenue from the top for providing video hosting. Hosting for them is, essentially, free since they have a huge — and growing — network of fiber (see ‘Google’s Fiber Takeover Plan Expands: Will Kill Cable & Carriers’).

Since YouTube doesn’t have to create any content, just aggregate it, they don’t need to worry about the individual profitability of any one brand… With YouTube, as with their AdSense product, Google is trying to insert itself between publishers and advertisers and extract a massive tax. In the case of YouTube, it’s a 45% tax.

In a subsequent post, Calacanis laments that whilst a lot of publishers got back to him in support of his views, he received no contact from YouTube, even though he is supposedly a high value “partner”.

And what does YouTube do for this 45% cut? Hosting? They’ve pretty much outsourced support and liability to the MCNs for no money down. I imagine running a video network is pretty expensive, although I wonder about the true costs for Google. Calacanis obviously doesn’t think those costs are great enough to justify the cut.

PPC Not Immune

Paid search also extracts a high tax.

Let’s run the numbers. A site has an average order value of $100 and converts at 1%, i.e. it makes a sale to one in every hundred visitors. Revenue is $1 per visitor. If the total cost of fulfilling the order is $50, then the profit is 50 cents per visitor. The site can pay the search engine up to 49 cents per click and still make a profit.

Let’s say the site invests heavily in conversion optimization to raise the conversion rate. They redesign their site, they refine their offer to give users exactly what they want, they optimize the sales funnel, and they manage to double their conversion rate to 2%. Now they make $2 in sales per visitor and, at $50 cost per order, $1 in profit per visitor. They can now bid up to 99 cents and still make a profit.

Great, right?

But along comes the competition. They also invest heavily in conversion optimization, and copy, and process, and they double their conversion rates, too. These sites must then keep upping their bids to stay on top in the auction process. Who benefits?

The search engine does.

The search engine benefits from this content improvement in the form of higher bid prices. The producer improves the value of their sites to users, but whilst the competition is doing the same thing, the real winner is the search engine.

This is one reason the search engine spokespeople will advise you to focus on delivering value to customers. The more value you create, the more value you’re going to end up passing to a search engine. As publishing becomes easier, the more gets published, yet the amount of attention remains relatively static. The competition increases, and it is likely that those with the deepest pockets eventually win high value and/or mature verticals.

How To Deal With It

Whilst we’re waiting for a new paradigm to come along that swings the pendulum back in favor of publishers – and we may be waiting some time – we need to think about how to extract more value from each visitor. This is not meant as a beat-up on the search engines – I’m glad they exist and enjoy most of what they do – rather this is about trying to get a handle on the ecosystem as it stands and how to thrive in it, rather than be crushed by it. In long tail markets – and web content is a l-o-n-g tail market – most of the value flows to the person organizing the market.

The key to prospering in this environment – if you don’t have the deepest pockets and you don’t organize the market – is to build relationships.

SEO is built largely on the premise that a relationship doesn’t exist between searcher and publisher. If a relationship already existed, the searcher would go direct to the publisher site, or conduct a brand search. I’m sure that’s how most people reading this article arrived on SEOBook.

So, try to make the most of every search visitor by turning them into non-search visitors. The search engine gets to extract a lot of value on first visit, especially if they arrive via PPC, but if you can then establish an on-going relationship with that visitor, then you get to retain value.

1. Encourage Subscriptions

Subscriptions can take the form of bookmarks, Twitter follows, Facebook fans, email subscriptions, RSS, and forum memberships. Encourage users to find you, in future, via channels over which you have more control. If you’ve buried these subscription calls to action, make them overt.

2. Form Alliances

Share exit traffic with like-minded but non-competitive sites. Swap advertising. Make guest posts and allow others to do likewise. Interview each other. If appropriate, instigate affiliate programs. Invest in and grow your personal networks.

3. Invest In Brand

Define a unique brand. Push your URL and brand everywhere. Take it offline. Even down to the basics like business cards, pens, whatever, emblazoned with your logo and URL. If you don’t have a definitive brand in your space, pivot and build one. Own your brand search, at very least.

4. Widen Distribution Channels

Publish ebooks. Build apps. Publish white papers. Make videos. Think of every medium and channel in which you can replicate your web publishing efforts.

Once you establish a relationship, give people reasons to come back. Think of what you do in terms of a platform, destination or place. How would this change your current approach? Ensure your business is positioned correctly so that people perceive a unique value.

You can then treat search engine traffic as a bonus, as opposed to the be all and end all of your business.

Specialization Strategy

Last week, I reviewed “Who Owns The Future?” by Jaron Lanier. It’s a book about the impact of technology on the middle class.

I think the reality Lanier describes in that book is self-evident – that the middle class is being gouged out by large data aggregators – but it’s hard, having read it and accepted his thesis, not to feel the future of the web might be a little bleak. Lanier’s solution of distributing value back into the chain via reverse linking is elegant, but is probably unlikely to happen, and even if it does, unlikely to happen in a time frame that is of benefit to people in business now.

So, let’s take a look at what can be done.

There are two options open to someone who has recognized the problem. Either figure out how to jump ahead of it, or stay still and get flattened by it.

Getting Ahead Of The SEO Pack

If your business model relies on the whims of a large data aggregator – and I hope you realize it really, really shouldn’t if at all humanly possible – then, you need to get a few steps ahead of it, or out of its path.

There’s a lot of good advice in Matt Cutts’ latest video:

It could be argued that the video has a subtext hinting at things to come, but even at face value, Cutts’ advice is sound. You should make something compelling, provide utility, and provide a good user experience. Make design a fundamental piece of your approach. In so doing, you’ll keep people coming back to your site. Too much focus on isolated SEO tactics, such as link building, may lead to a loss of focus on the bigger picture.

In the emerging environment, the big picture partly means “avoid getting crushed by a siren server”, although that’s my characterization, and unlikely to be Cutts’! Remember, creating quality, relevant content didn’t prevent people from being stomped by Panda and Penguin. All the link building you’re doing today won’t help you when a search engine makes a significant change to the way they count and value links.

And that day is coming.

Are You Flying A Helicopter?

Johnon articulately poses part of the problem:

Fast forward and we’re all spending our days flying these things (computers). But are we doing any heavy lifting? Are we getting the job done, saving the day, enabling the team? Or are we just “flying around” like one of those toy indoor helicopters, putzing around the room dodging lamps and co-workers’ monitors until we run out of battery power and drop to the floor? And we call it work…

More than ever, we have ways to keep “busy” with SEO. The old stand-bys “keyword research” and “competitive analysis” and “SERP analysis” can keep us busy day after day. With TRILLIONS of links in place on the world wide web, we could link analyze for weeks if left alone to our cockpits. And I suppose every one of you SEOs out there could rationalize and justify the effort and expense (and many of you agency types do just that… for a living).

The helicopter is now cheap, fast, and mobile. The fuel is cheap as well, but it turns out there are two kinds of fuel for SEO helicopters. The kind the machine needs to fly (basic software and electricity), and the kind we need to actually do any work with it (seo data sets, seo tools, and accurate and effective information). The latter fuel is not cheap at all. And it’s been getting more and more expensive. Knowing how to fly one of these things is not worth much any more. Knowing how to get the work done is.

A lot of SEO work falls into this category.

There is a lot of busy-ness. A lot of people do things that appear to make a difference. Some people spend entire days appearing to make a difference. Then they appear to make a difference again tomorrow.

But the question should always be asked “are they achieving anything in business terms?”

It doesn’t matter if we call it SEO, inbound marketing, social media marketing, or whatever the new name for it is next week, it is the business results that count. Is this activity growing a business and positioning it well for the future?

If it’s an activity that isn’t getting results, then it’s a waste of time. In fact, it’s worse than a waste of time. It presents an opportunity cost. Those people could have been doing something productive. They could have helped solve real problems. They could have been building something that endures. All the link building, content creation, keyword research and tweets with the sole intention of manipulating a search engine to produce higher rankings isn’t going to mean much when the search engine shifts their algorithms significantly.

And that day is coming.

Pivot

To avoid getting crushed by a search engine, you could take one of two paths.

You could spread the risk. Reverse-engineer the shifting algorithms, with multiple sites, and hope to stay ahead of them that way. Become the gang of moles – actually, a “labour” of moles, in proppa Enlush – they can’t whack. Or, at least, a labour of moles they can’t whack all at the same time! This is a war of attrition approach and it is best suited to aggressive, pure-play search marketing where the domains are disposable.

However, if you are building a web presence that must endure, and aggressive tactics don’t suit your model, then SEO, or inbound, or whatever it is called next week, should only ever be one tactic within a much wider business strategy. To rely on SEO means being vulnerable to the whims of a search engine, a provider over which you have no control. When a marketing tactic gets diminished, or no longer works, it pays to have a model that allows you to shrug it off as an inconvenience, not a major disaster.

The key is to foster durable and valuable relationships, as opposed to providing information that can be commodified.

There are a number of ways to achieve this, but one good way is to offer something unique, as opposed to being one provider among many very similar providers. Beyond very basic SEO, the value proposition of SEO is to rank higher than similar competitors, and thereby gain more visibility. This value proposition is highly dependent on a supplier over which we have no control. Another way of looking at it is to reduce the competition to none by focusing on specialization.

Specialize, Not Generalize

Specialization involves working in a singular, narrowly defined niche. It is sustainable because it involves maintaining a superior, unique position relative to competitors.

Specialization is a great strategy for the web, because the web has made markets global. Doing something highly niche can be done at scale by extending the market globally, a strategy that can be difficult to achieve at a local market level. Previously, generalists could prosper by virtue of geographic limits. Department stores, for example. These days, those department stores need to belong to massive chains, and enjoy significant economies of scale, in order to prosper.

Specialization is also defensive. The more specialized you are, the less likely the large data aggregators will be interested in screwing you. Niche markets are too small for them to bother with. If your niche is defined too widely – travel, education, or photography, for example – you may face threats from large aggregators, but this can be countered, in part, by design, which we’ll look at over the coming week.

If you don’t have a high degree of specialization, and your business relies solely on beating similar business by doing more/better SEO, then you’re vulnerable to the upstream traffic provider – the search engine. By solving a niche problem in a unique way, you change the supply/demand equation. The number of competing suppliers becomes “one” or “a few”. If you build up sufficient demand for your unique service, then the search engines must show you, else they look deficient.

Of course, it’s difficult to find a unique niche. If it’s profitable, then you can be sure you’ll soon have competition. However, consider that many big companies started out as niche offerings. Dell, for example. They were unique because they sold cheap PCs, built from components and made to order. Dell started in a campus dormitory room.

What’s the alternative? Entering a crowded market of me-too offerings? A lot of SEO falls into this category and it can be a flawed approach in terms of strategy if the underlying business isn’t positioned correctly. When the search engines have shifted their algorithms in the past, many of these businesses have gone up in smoke as a direct result because the only thing differentiating them was their SERP position.

By taking a step back, focusing on relationships and specific, unique value propositions, businesses can avoid this problem.

Advantages Of Specialization

Specialization makes it easier to know and deeply understand a customer’s needs. The data you collect by doing so would be difficult for a large data aggregator to obtain, as it is nuanced and specific. It’s less likely to be part of an easily identified big-data pattern, so the information is less likely to be commodified. This also helps foster a durable relationship.

Once you start finely segmenting markets, especially new and rising markets, you’ll gain unique insights and acquire unique data. You gain a high degree of focus. Check out “Business Lessons From Pumpkin Hackers”. You may be capable of doing a lot of different things, and many opportunities will come up that fall slightly outside your specialization, but there are considerable benefits in ignoring them and focusing on growing the one, significant opportunity.

Respin

Are you having trouble competing against other consultants? Consider respinning your offer so it serves a specific niche.

To specialize, an SEO might build a site all about dentistry, then offer leads and advertising to dentists, dental suppliers, dental schools, and so on. Such a site would build up a lot of unique and actionable data about the traffic in this niche. They might then use this platform as a springboard to offering SEO services to pre-qualified dentists in different regions – dentistry being a location-dependent activity, it is easy for the SEO to get around potential conflicts of interest.

By specializing in this way, the SEO will likely understand their customer better than the generalist does. By understanding the customer better, and gaining a track record with a specific type of customer, the SEO gains an advantage when competing with other firms for dentists’ SEO work. If you were a dentist wanting SEO services, whose pitch stands out? The generalist SEO agency, or the SEO who specializes in web marketing for dentists?

Similarly, you could be a generalist web developer, or you could be the guy who specializes in payment gateways for mobile. Instead of being a web designer, how about being someone who specializes in themes for Oxwall? And so on. Think about ways you can re-spin a general thing you do into a specific thing for which there is demand, but little supply.

One way of getting a feel for areas to specialize in is to use Adwords as a research tool. For example, “oxwall themes” has almost no Adwords competition and around 1,300 searches per month. Let’s say 10% of that figure are willing to pay for themes. That’s 130 potential customers. Let’s say a specialist designer converts 10% of those, that’s 13 projects per month. Let’s say those numbers are only half right. That’s still 6-7 projects per month.
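That back-of-envelope funnel is easy to make explicit. A minimal sketch using the article’s figures – the 10% rates and the 50% "only half right" discount are the text’s guesses, not measured data:

```python
def monthly_projects(searches, willing_to_pay, close_rate, discount=1.0):
    """Rough demand estimate: monthly searches -> paying prospects
    -> closed projects, optionally discounted for over-optimism."""
    return searches * willing_to_pay * close_rate * discount

# ~1,300 "oxwall themes" searches/month, 10% willing to pay,
# 10% of those close as projects:
optimistic = monthly_projects(1300, 0.10, 0.10)      # 13 projects/month
cautious = monthly_projects(1300, 0.10, 0.10, 0.5)   # 6.5, i.e. the "6-7"
```

The value of writing the estimate down like this is that each assumption becomes a named parameter you can sanity-check or replace with real conversion data later.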

Having decided to specialize in a clearly defined, narrow market segment, and having good product or service knowledge and clear focus, you are much more likely to spot the emerging pain points of your customers. Having this information will help you stand out from the crowd. Your pitches, your website copy, and your problem identification and solutions will make more generalist competitors sound like they don’t know what they are talking about. This is the unique selling proposition (USP), of course. It’s based on the notion of quality. Reputation then spreads. It’s difficult for a siren server to insert itself into the word of mouth that a good reputation generates.

Differentiation is the aim of all businesses, no matter what the size. So, if one of your problems is being too reliant on search results, take a step back and determine if your offer is specialized enough. If you’re offering the same as your competitors, then you’re highly vulnerable to algorithm shifts. It’s hard to “own” generalist keyword terms, and it’s a weak strategic position if your entire business success relies upon doing so.

Specialization lowers the cost of doing business. An obvious example can be seen in PPC/SEO. If you target a general term, it can be expensive to maintain position. In some cases, it’s simply impossible unless you’re already a major player. If you specialize, your marketing focus can be narrower, which means your marketing cost is lower. You also gain supply-side advantages, as you don’t need to source a wide range of goods, or hire as many people with different skillsets, as the generalist must do.

Once you’re delivering clear and unique value, you can justify higher prices. It’s difficult for buyers to make direct comparisons, because, if you have a high degree of specialization, there should be few available to them. If you are delivering that much more value, you deserve to be paid for it. The less direct competition you have, the less price sensitive your offering. If you offer the same price as other offerings, and your only advantage is SERP positioning, then that’s a vulnerable business positioning strategy.

If you properly execute a specialization strategy, you tend to become more lean and agile. You may be able to compete with larger competitors as you can react quicker than they can. Chances are, your processes are more streamlined as they are geared towards doing one specific thing. The small, specialized business is unlikely to have the chain of command and management structure that can slow decision making down in organizations that have a broader focus.

Specialized businesses tend to be more productive than their generalist counterparts, as their detailed knowledge of a narrow range of processes and markets means they can produce more with less. The more bases you cover, the more organisational aspects come into play, and the slower the process becomes.

In Summary

There are benefits in being a generalist, of course, however, if you’re a small operator and find yourself highly vulnerable to the whims of search engines, then it can pay to take a step back, tighten your focus, and try to dominate more specialist niches. The more general you go, the more competition you tend to encounter. The more competition you encounter in the SERPs, the harder you have to fight, and the more vulnerable you are to big data aggregators. The highly specialized are far more likely to fly under the radar, and are less vulnerable to big-brand bias in major verticals. The key to not being overly dependent on search engines is to develop enduring relationships, and specialization based on a strong, unique value proposition is one way of doing so.

Next article, we’ll look at differentiation by UX design and user experience.

SEO: Dirty Rotten Scoundrels

SEO is a dirty word.

PPC isn’t a dirty word.

Actually, they’re not words they’re acronyms, but you get my drift, I’m sure :)

It must be difficult for SEO providers to stay on the “good and pure” side of SEO when the definitions are constantly shifting. Recently we’ve seen one prominent SEO tool provider rebrand as an “inbound marketing” tools provider and it’s not difficult to appreciate the reasons why.

SEO, to a lot of people, means spam. The term SEO is lumbered, rightly or wrongly, with negative connotations.

Email Optimization

Consider email marketing.

Is all email marketing spam? Many would consider it annoying, but obviously not all email marketing is spam.

There is legitimate email marketing, whereby people opt-in to receive email messages they consider valuable. It is an industry worth around $2.468 billion. There are legitimate agencies providing campaign services, reputable tools vendors providing tools, and it can achieve measurable marketing results where everyone wins.

Yet, most email marketing is spam. Most of it is annoying. Most of it is irrelevant. According to a Microsoft security report, 97% of all email circulating is spam.

So, only around 3% of all email is legitimate. 3% of email is wanted. Relevant. Requested.

One wonders how much SEO is legitimate? I guess it depends what we mean by legitimate, but if we accept the definition I’ve used – “something relevant wanted by the user” – then, at a guess, I’d say most SEO these days is legitimate, simply because being off-topic is not rewarded. Most SEOs provide on-topic content, and encourage businesses to publish it – free – on the web. If anything, SEOs could be accused of being too on-topic.

The proof can be found in the SERPs. The user requests a site via their query, and if a listed site matches that query, then the user probably deems it to be relevant. They might find that degree of relevance, personally, to be somewhat lacking, in which case they’ll click back, but we don’t have a situation where search results are rendered irrelevant by the presence of SEO.

Generally speaking, search appears to work well in terms of delivering relevance. SEO could be considered cleaner than email marketing in that SEOs are obsessed with being relevant to a user. The majority of email marketers, on the other hand, couldn’t seem to care less about what is relevant, just so long as they get something, anything, in front of you. In search, if a site matches the search query, and the visitor likes it enough to register positive quality metrics, then what does it matter how it got there?

It probably depends whose business case we’re talking about.

Advertorials

Matt Cutts has released a new video on Advertorials and Native Advertising.

Matt makes a good case. He reminds us of the idea on which Google was founded, namely citation. If people think a document is important, or interesting, they link to it.

This idea came from academia. The more an academic document is cited, and cited by those with authority, the more relevant that document is likely to be. Nothing wrong with that idea, however some of the time, it doesn’t work. In academic circles, citation is prone to corruption. One example is self-citation.

But really, excessive self-citation is for amateurs: the real thing is forming a “citation cartel”, as Phil Davis from The Scholarly Kitchen puts it. In April this year, after receiving a “tip from a concerned scientist”, Davis did some detective work using the JCR data and found that several journals published reviews citing an unusually high number of articles fitting the JIF window from other journals. In one case, the Medical Science Monitor published a 2010 review citing 490 articles, 445 of them published in 2008-09 in the journal Cell Transplantation (44 of the other 45 were for articles from the Medical Science Monitor published in 2008-09 as well). Three of the authors were Cell Transplantation editors.

So, even in academia, self-serving linking gets pumped and manipulated. When this idea is applied to the unregulated web, where there are vast sums of money at stake, you can see how citation very quickly changes into something else.

There is no way linking is going to stay “pure” in such an environment.

The debate around “paid links” and “paid placement” has been done over and over again, but in summary, the definition of “paid” is inherently problematic. For example, some sites invite guest posting, pay the writers nothing in monetary terms, but the payment is a link back to the writer’s site. The article is a form of paid placement, it’s just that no money changes hands. Is the article truly editorial?

It’s a bit grey.

A lot of the time, such articles pump the writer’s business interests. Is that paid content, and does it need to be disclosed? Does it need to be disclosed to both readers and search engines? I think Matt’s video suggests it isn’t a problem, as utility is provided, but a link from said article may need to be no-followed in order to stay within Google’s guidelines.

Matt wants to see clear and conspicuous disclosure of advertorial content. Paid links, likewise. The disclosure should be made both to search engines and readers.

Which is interesting.

Why would a disclosure need to be made to a search engine spider? Granted, it makes Google’s job easier, but I’m not sure why publishers would want to make Google’s job easier, especially if there’s nothing in it for the publishers.

But here comes the stick, and not just from the web spam team.

Google News has stated that if a publication takes money for paid content and doesn’t adequately disclose that fact – in Google’s view – to both readers and search engines, then that publication may be removed from Google News. In so doing, Google increases the risk to the publisher, and therefore the cost, of accepting paid links or paid placement.

So, that’s why a publisher will want to make Google’s job easier. If they don’t, they run the risk of invisibility.

Now, on one level, this sounds fair and reasonable. The most “merit worthy” content should be at the top. A ranking should not depend on how deep your pockets are i.e. the more links you can buy, the more merit you have.

However, one of the problems is that the search results already work this way. Big brands often do well in the SERPs due to reputation gained, in no small part, from massive advertising spend that has the side effect, or sometimes direct effect, of inbound links. Do these large brands therefore have “more merit” by virtue of their deeper pockets?

Google might also want to consider why a news organization would blur advertorial lines when they never used to. Could it be because their advertising is no longer paying them enough to survive?

SEO Rebalances The Game

SEO has helped level the playing field for small businesses, in particular. The little guy didn’t have deep pockets, but he could play the game smarter by figuring out what the search engines wanted, algorithmically speaking, and giving it to them.

I can understand Google’s point of view. If I were Google, I’d probably think the same way. I’d love a situation where editorial was editorial, and business was PPC. SEO, to me, would mean making a site crawlable and understandable to both visitors and bots, but that’s the end of it. Anything outside that would be search engine spam. It’s neat. It’s got nice rounded edges. It would fit my business plan.

But real life is messier.

If a publisher doesn’t have the promotion budget of a major brand, and they don’t have enough money to outbid big brands on PPC, then they risk being invisible on search engines. Google search is pervasive, and if you’re not visible in Google search, then it’s a lot harder to make a living on the web. The risk of being banned for not following the guidelines is the same as the risk of playing the game within the guidelines, but not ranking. That risk is invisibility.

Is the fact that a small business plays a game that is already stacked against it, by using SEO, “bad”? If they have to play harder than the big brands just to compete, and perhaps become a big brand themselves one day, then who can really blame them? Can a result that is relevant, as far as the user is concerned, still really be labelled “spam”? Is that more to do with the search engine’s business case than actual end-user dissatisfaction?

Publishers and SEOs should think carefully before buying into the construct that SEO, beyond Google’s narrow definition, is spam. Also consider that the more people who can be convinced to switch to PPC and/or stick to just making sites more crawlable, then the more spoils for those who couldn’t care less how SEO is labelled.

It would be great if quality content succeeded in the SERPs on merit, alone. This would encourage people to create quality content. But when other aspects are rewarded, then those aspects will be played.

Perhaps if the search engines could be explicit about what they want, and reward it when it’s delivered, then everyone would be happy.

I guess the algorithms just aren’t that clever yet.

Inbound, Outbound, Outhouse

Jon Henshaw put the hammer down on inbound marketing, highlighting how the purveyors of “the message” often do the opposite of what they preach. So much of the marketing I see around that phrase is either of the “clueless newb” variety, or paid push marketing of some stripe.

@seobook why don’t you follow more of your followers?

— Randy Milanovic (@kayak360) May 19, 2013

One of the clueless newb examples smacked me in the face last week on Twitter, where some “HubSpot certified partner” (according to his Twitter profile) complained to me about me not following enough of our followers, then sent a follow-up spam asking if I saw his article about SEO.

@seobook Have you seen: socialmediatoday.com/randy-milanovi…

— Randy Milanovic (@kayak360) May 19, 2013

The SEO article was worse than useless. It suggested that you shouldn’t be “obvious” & that you should “naturally attract links.” Yet the article itself was a thin guest post containing the anchor text search engine optimization deep linking to his own site. The same guy has a “book” titled Findability: Why Search Engine Optimization is Dying.

Why not promote the word findability with the deep link if he wants to claim that SEO is dying? Who writes about how something is dying, yet still targets it instead of the alleged solution they have in hand?

If a person wants to claim that anchor text is effective, or that push marketing is key to success, it is hard to refute those assertions. But if you are pushy & aggressive with anchor text, then the message of “being natural” and “just let things flow” is at best inauthentic, which is why sites like Shitbound.org exist. ;)

Some of the people who wanted to lose the SEO label suggested their reasoning was that the acronym SEO was stigmatized. And yet, only a day after rebranding, these same folks that claim they will hold SEO near and dear forever are already outing SEOs.

Sad but fact: Rand Fishkin outs another site that just happens to be competing with Distilled twitter.com/randfish/statu…

— john andrews (@johnandrews) May 31, 2013

The people who want to promote the view that “traditional” SEO is black hat and/or ineffective have no problems with dumping on & spamming real people. It takes an alleged “black hat” to display any concern with how actual human beings are treated.

If the above wasn’t bad enough, SEO is getting a bad name due to the behavior of inbound tool vendors. Look at the summary on a blog post from today titled Lies The SEO Publicity Machine Tells About PPC (When It Thinks No One’s Looking)

Then he told me he wasn’t seeing any results from following all the high-flown rhetoric of the “inbound marketing, content marketing” tool vendor. “Last month, I was around 520 visitors. This month, we’re at 587.”

Want to get to 1,000? Work and wait and believe for another year or two. Want to get to 10,000? Forget it.

You could grow old waiting for the inbound marketing fairy tale to come true.

Of course I commented on the above post & asked Andrew if he could put “inbound marketer” in the post title, since that’s who was apparently selling hammed up SEO solutions.

In response to Henshaw’s post (& some critical comments) calling inbound marketing incomplete marketing, Dharmesh Shah wrote:

When we talk about marketing, we position classical outbound techniques as generally being less effective (and more expensive) over time. Not that they’re completely useless — just that they don’t work as well as they once did, and that this trend would continue.

Hugh MacLeod is brilliant with words. He doesn’t lose things in translation. His job is distilling messages to their core. And what did his commissions for HubSpot state?

  • thankfully consigning traditional marketing to the dustbin of history since 2006
  • traditional marketing is easy. all you have to do is pretend it works
  • the good news is, your customers are just as sick of traditional marketing as you are
  • hey, remember when traditional marketing used to work? neither do we
  • traditional marketing doesn’t work. it never did


Claiming that “traditional marketing” doesn’t work – and never did, would indeed be claiming that classical marketing techniques are ineffective / useless.

If something “doesn’t work” it is thus “useless.”

You never hear a person say “my hammer works great, it’s useless!”

As always, watch what people do rather than what they say.

When prescription and behavior are not aligned, it is the behavior that is worth emulating.

That’s equally true for a keyword-rich deep link in a post telling you to let SEO happen naturally, and for people who relabel things while telling you not to do what they are doing.

If “traditional marketing” doesn’t work AND they are preaching against it, why do they keep doing it?

Follow the money.


Growing An SEO Business By Removing Constraints

If you run an SEO business, or any service business, you’ll know how hard it can be to scale up operations. There are many constraints that need to be overcome in order to progress.

We’ll take a look at a way to remove barriers to growth and optimize service provision using the Theory Of Constraints. This approach proposes a method to identify the key constraints to performance which hinder growth and expansion.

The Theory Of Constraints has long been used for optimizing manufacturing…

We had no legs to stand on to maintain our current customer base, let alone acquire and keep new business. This was not an ideal position to be in, particularly in a down economy when we couldn’t afford to have sales reduce further.

… but more recently, it’s been applied to services, too.

The results were striking. The number of days to decide food stamp eligibility dropped from 15 to 11; phone wait times were reduced from 23 minutes to nine minutes. Budgetary savings have exceeded the $9 million originally cut

It’s one way of thinking about how to improve performance by focusing on bottlenecks. If you’re experiencing problems such as being overworked and not having enough time, it could offer a solution.

First we’ll take a look at the theory, then apply it to an SEO agency. It can be applied to any type of business, of course.

Theory Of Constraints

The Theory of Constraints views any manageable system as being limited in achieving more of its goals by a very small number of constraints. There is always at least one constraint, and TOC uses a focusing process to identify the constraint and restructure the rest of the organization around it

If there weren’t constraints, you could grow your business as large and as fast as you wanted.

You can probably think of numerous constraints that prevent you from growing your business. However, the theory holds that most constraints are really side issues, and that organizations are constrained by only one constraint at any one time.

A constraint is characterized as the “weakest link”.

The weakest link holds everything else up. Once this constraint has been eliminated or managed, another “weakest link” may well emerge, so the process is repeated until the business is optimized. Constraints can be people, procedures, supplies, management, and systems.
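The “weakest link” idea can be made concrete with a small sketch. In a chain of dependent stages, overall throughput is capped by the stage with the least capacity, so finding the constraint is just finding the minimum. (The stage names and capacities below are invented for illustration.)

```python
# Hypothetical pipeline: how many projects each stage can process per month.
stages = {
    "sales": 12,
    "seo_delivery": 5,
    "reporting": 9,
    "billing": 20,
}

# The constraint is the stage with the smallest capacity;
# the whole pipeline can't move faster than it does.
constraint = min(stages, key=stages.get)
throughput = stages[constraint]

print(constraint, throughput)  # seo_delivery 5
```

However much capacity sales or billing has, only five projects a month get delivered until the delivery constraint is addressed.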

In Dr. Eli Goldratt’s book, “The Goal“, Goldratt identifies five steps to identify and address the constraint:

  • Identify the constraint
  • Exploit the constraint
  • Subordinate everything else to the constraint
  • Elevate the constraint
  • Go back to step 1
    1. Identify The Constraint

    What is the biggest bottleneck that holds back company performance? What activity always seems to fall behind schedule, or take the most time? This activity might not be the main activity of the company. It could be administrative. It could be managerial.

    If you’re not sure, try the “Five Whys” technique to help determine the root cause:

    By repeatedly asking the question “Why” (five is a good rule of thumb), you can peel away the layers of symptoms which can lead to the root cause of a problem. Very often the ostensible reason for a problem will lead you to another question. Although this technique is called “5 Whys,” you may find that you will need to ask the question fewer or more times than five before you find the issue related to a problem

    2. Exploit The Constraint

    Once the constraint is identified, you then utilize the constraint to its fullest, i.e. you try to make sure the constraint is working at maximum performance. What is preventing the constraint from working at maximum performance?

    If the constraint is staff, you might look at ways for people to produce more work, perhaps by automating some of their workload, or allocating less-essential work to someone else. It could involve more training. It could involve adopting different processes.

    3. Subordinate Everything Else To The Constraint

    Identify all the non-constraints that may prevent the constraint from working at maximum performance. These might be activities or processes the constraint has to undertake but aren’t directly related to the constraint.

    For example, a staff member who is identified as a constraint might have a billing task that could either be automated or allocated to someone else.

    The constraint should not be limited by anything outside its control. The constraint can’t do any more than it possibly can, i.e. if your constraint is time, you can’t have someone work more than 24 hours in a day! More practically, eight hours.

    Avoid focusing on non-constraints. Optimizing non-constraints might feel good, but they won’t do much to affect overall productivity.
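That point is worth demonstrating. Continuing the earlier sketch (capacities are again invented), doubling a non-constraint’s capacity leaves overall throughput untouched, while elevating the constraint lifts it:

```python
def throughput(stages):
    # Pipeline throughput is capped by the weakest stage.
    return min(stages.values())

stages = {"sales": 12, "seo_delivery": 5, "reporting": 9}

base = throughput(stages)                 # 5: delivery is the constraint

# Optimizing a non-constraint: doubling reporting capacity changes nothing.
stages["reporting"] = 18
after_non_constraint = throughput(stages)  # still 5

# Elevating the constraint itself is what moves the needle.
stages["seo_delivery"] = 8
after_constraint = throughput(stages)      # 8
```

The effort spent on reporting felt productive, but only the change to the constrained stage improved the whole system.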

    4. Elevate The Constraint

    Improve the productivity of the company by lifting the performance of the constraint. Once you’ve identified the constraint, and what is limiting its performance, you typically find spare capacity emerges. You then increase the workload. The productivity of the entire company is now lifted. Only then would you hire an additional person, if necessary.

    5. Repeat

    The final step is to repeat the process.

    The process is repeated because the weakest link may now move to another area of the business. For example, if more key workers have been hired to maximize throughput, then the constraint may have shifted to a management level, because the supervisory workload has increased.

    If so, this new constraint gets addressed via the same process.

    Applying The Theory Of Constraints To An SEO Agency

    Imagine Acme SEO Inc.

    Acme SEO are steadily growing their client base and have been meeting their clients’ demands. However, they’ve noticed projects are taking longer and longer to finish. They’re reluctant to take on new work, as it appears they’re operating at full capacity.

    When they sit down to look at the business in terms of constraints, they find that they’re getting the work, they’re starting the work on time, but the projects slow down after that point. They frequently rush to meet deadlines, or miss them. SEO staff appear overworked. If the agency can’t get through more projects, then they can’t grow. Everything else in the business, from the reception to sales, depends on it. Do they just hire more people?

    They apply the five steps to define the bottleneck and figure out ways to optimize performance.

    Step One

    Identify the constraint. What is the weakest link? What limits the SEO business from doing more work? Is it the employees? Are they skilled enough? How about the systems they are using? Is there anything getting in the way of them completing their jobs?

    Try asking the Five Whys to get to the root of the problem:

    1. Why is this process taking so long? Because there is a lot of work involved.
    2. Why is there a lot of work involved? Because it’s complex.
    3. Why is it complex? Because there is a lot of interaction with the client.
    4. Why is there a lot of interaction with the client? Because they keep changing their minds.
    5. Why do they keep changing their minds? Because they’re not clear about what they want.

    Step Two

    Exploiting the constraint. How can the SEO work at maximum load?

    If an SEO isn’t doing as much as they could be, is it due to project management and training issues? Do people need more direct management? More detailed processes? More training?

    It sounds a bit ruthless, especially when talking about people, but really it’s about constructively dealing with the identified bottlenecks, as opposed to apportioning blame.

    In our example, the SEOs have the skills necessary, and work hard, but the clients keep changing scope, which leads to a lot of rework and administrative overhead.

    Once that constraint has been identified, changes are made to project management, eliminating the potential for scope creep after the project has been signed off, thus helping maximize the throughput of the worker.

    Step Three

    Subordinate the constraint. So, the process has been identified as the cause of a constraint. By redesigning the process to control scope creep before the SEO starts, say at the sales level, they free up more time. When the SEO works on the project, they’re not having to deal with administrative overhead that has a high time cost, therefore their utility is maximized.

    The SEO is now delivering more forward momentum.

    Step Four

    Elevate the performance of the constraint. They monitor the performance of the SEO. Does the SEO now have spare capacity? Is the throughput increasing? Have they done everything possible to maximize the output? Are there any other processes holding up the SEO? Should the SEO be handling billing when someone else could be doing that work? Is the SEO engaged in pre-sales when that work could be handled by sales people?

    Look for work being done that takes a long time, but doesn’t contribute to output. Can these tasks be handed to someone else – someone who isn’t a constraint?

    If the worker is working at maximum utility, then adding another worker might solve the bottleneck. Once the bottleneck is removed, performance improves.

    Adding bodies is the common way service-based industries, like SEO, scale up. A consultancy bills hours, and the more bodies, the more hours they can bill. However, if the SEO role is optimized to start with, then they might find they have spare capacity opening up, so don’t need as many new hires.

    Step Five

    Repeat.

    Goldratt stressed that using the Theory Of Constraints to optimize business is an on-going task. You identify the constraint – which may not necessarily be the most important aspect of the business i.e. it could be office space – which then likely shifts the weakest link to another point. You then optimize that point, and so on. Fixing the bottleneck is just the beginning of a process.

    It’s also about getting down to the root of the problem, which is why the Five Whys technique can be so useful. Eliminating a bottleneck sounds simple, and a quick fix, but the root of the problem might not be immediately obvious.

    In our example, it appeared as though the staff were the problem, so the root cause could be misdiagnosed as “we need more staff”. In reality, the root cause of the bottleneck was a process problem.

    Likewise, some problems aligned with an employee on a specific project might be tied to the specific client rather than anything internal to your company. Some people are never happy & will never be satisfied no matter what you do. Probably the best way to deal with people who are never satisfied is to end those engagements early, before they have much of an impact on your business. The best way to avoid such relationships in the first place is to have some friction upfront, so that those who contact you are serious about working with you.

    It can also be beneficial to have some of your own internal sites to fall back on, such that when consulting inquiries are light you do not chase revenue at the expense of lower margins from clients who are not a good fit. These internal projects also give you flexibility to deal with large updates by being able to push some of your sites off into the background while putting out any fires that emerge from the update. And those sorts of sites give you a testing platform to further inform your strategy with client sites.

    How have you addressed business optimization problems? What techniques have you found useful, and how well did they work?

    Further Resources:

    I’ve skimmed across the surface, but there’s a lot more to it. Here are some references used in the article, and further reading…


    GoogleMart

    It was hard to spot, at first.

    It started with one store on the outskirts of town. It was big. Monolithic. It amalgamated a lot of cheap, largely imported stuff and sold the stuff on. The workers were paid very little. The suppliers were squeezed tight on their margins.

    And so it grew.

    And as it grew, it hollowed out the high street. The high street could not compete with the monolith’s sheer power. They couldn’t compete with the monolith’s influence on markets. They couldn’t compete with the monolith’s unique insights gained from clever number crunching of big data sets.

    I’m talking about Wal Mart, of course.

    Love ‘em or loathe ‘em, Walmart gave people what they wanted, but in so doing, hollowed out a chunk of America’s middle class. It displaced a lot of shop keepers. It displaced small business owners on Main Street. It displaced the small family retail chain that provided a nice little middle class steady earner.

    Where did all those people go?

    It was not only the small, independent retail businesses and local manufacturers who were fewer in number. Their closure triggered flow-on effects. There was less demand for the services they used, such as local small business accountants, the local lawyer, small advertising companies, local finance companies, and the host of service providers that make up the middle class ecosystem.

    Where did they all go?

    Some would have taken up jobs at WalMart, of course. Some would become unemployed. Some would close their doors and take early retirement. Some would change occupations, and some would move away to where prospects were better.

    What does any of this have to do with the internet?

    The same thing is happening on the internet.

    And if you’re a small business owner, located on the web-equivalent of the high street, or your business relies on those same small business owners, then this post is for you.

    Is Technology Gutting The Middle Class?

    I’ve just read “Who Owns The Future”, by Jaron Lanier. Everyone who has anything to do with the internet – and anyone who is even remotely middle class – will find it asks some pretty compelling questions about our present and future.

    Consider this.

    At the height of its power, the photography company Kodak employed more than 140,000 people and was worth $28 billion. They even invented the first digital camera. But today Kodak is bankrupt, and the new face of digital photography has become Instagram. When it was sold to Facebook for a billion dollars in 2012, Instagram only employed 13 people

    Great for Instagram. Bad for Kodak. And bad for the people who worked for Kodak. But, hey. That’s progress, right? Kodak had an outdated business model. Technology overtook them.

    That’s true. It is progress. It’s also true that all actions have consequences. The consequence of transformative technology is that, according to Lanier, it may well end up destroying the middle class if too much of the value is retained in the large technology companies.

    Lanier suggests that the advance of technology is not replacing as many jobs as it destroys, and those jobs that are destroyed are increasingly middle class.

    Not Political (Kinda)

    I don’t wish to make this post political, although all change is inherently political. I’m not taking political sides. This issue cuts across political boundaries. I have a lot of sympathy for technological utopian ideas and the benefits technology brings, and have little time for luddism.

    However, it’s interesting to focus on the consequences of this shift in wealth and power brought about by technology, and whether enough people in the internet value chain receive adequate value for their efforts.

    If the value doesn’t flow through, as capitalism requires in order to function well, then few people win. Are children living at home longer than they used to? Are people working longer hours than they used to in order to have the same amount of stuff? Has the value chain been broken, Lanier asks? And, if so, what can be done to fix it?

    What Made Instagram Worth One Billion Dollars?

    Lanier points out that Instagram wasn’t worth a billion dollars because it had extraordinary employees doing amazing things.

    The value of Instagram came from network effects.

    Millions of people using Instagram gave the Instagram network value. Without the user base, Instagram is just another photo app.

    Who got paid in the end? Not the people who gave the network value. The people who got paid were the small group at the top who organized the network. The owners of the “Siren Servers“:

    The power rests in what Lanier calls the “Siren Servers”: giant corporate repositories of information about our lives that we have given freely and often without consent, now being used for huge financial benefit by a super-rich few

    The value is created by all the people who make up the network, but they only receive a small sliver of that value in the form of a digital processing tool. To really benefit, you have to own, or get close to, a Siren Server.

    Likewise, most of Google’s value resides in the network of users. These users feed value into Google simply by using it and thereby provide Google with a constant stream of data. This makes Google valuable. There isn’t much difference between Google and Bing in terms of service offering, but one is infinitely more valuable than the other purely by virtue of the size of the audience. Same goes for Facebook over Orkut.

    You Provide Value

    Google are provided raw materials by people. Web publishers allow Google to take their work, at no charge, and for Google to use that work and add value to Google’s network. Google then charges advertisers to place their advertising next to the aggregated information.

    Why do web publishers do this?

    Publishers create and give away their work in the hope they’ll get traffic back, from which they may derive benefit. Some publishers make money, so they can then pay real-world expenses, like housing, food and clothing. The majority of internet publishers make little, or nothing, from this informal deal. A few publishers make a lot. The long tail, when it comes to internet publishing, is pretty long. The majority of wealth, and power, is centralized at the head.

    Similarly, Google’s users are giving away their personal information.

    Every time someone uses Google, they are giving Google personal information of value. Their search queries. Their browsing patterns. Their email conversations. Their personal network of contacts. Aggregate that information together, and it becomes valuable information, indeed. Google records this information, crunches it looking for patterns, then packages it up and sells it to advertisers.

    What does Google give back in return?

    Web services.

    Is it a fair exchange of value?

    Lanier argues it isn’t. What’s more, it’s an exchange of value so one-sided that it’s likely to destroy the very ecosystem on which companies like Google are based – the work output, and the spending choices, of the middle class. If few of the people who publish can make a reasonable living doing so, then the quality of what gets published must decrease, or cease to exist.

    People could make their money in other ways, including offline. However, consider that the web is affecting a lot of offline business already. The music industry is a faint shadow of what it was even a decade ago. There are a lot fewer middle class careers in the music industry now. Small retailers are losing out to the web. Fewer jobs there. The news industry is barely making any money. Same goes for book publishers. All these industries are struggling as online aggregators carve up their value chains.

    Now, factor in all the support industries of these verticals. Then think about all the industries likely to be affected in the near future – like health, or libraries, or education, for example. Many businesses that used to hire predominantly middle class people are going out of business, downsizing their operations, or soon to have chunks of their value displaced.

    It’s not Google’s aim to gut the middle class, of course. This post is not an anti-Google rant, either, simply a look at action and consequence. What is the effect of technology and, in particular, the effect of big technology companies on the web, most of whom seem obsessed with keeping you in their private, proprietary environments for as long as possible?

    Google’s aim is to index all the world’s information and make it available. That’s a good aim. It’s a useful, free service. But Lanier argues that gutting the middle class is a side-effect of re-contextualising, and thereby devaluing, information. Information may want to be free, but the consequence of free information is that those creating the information may not get paid. Many of those who do get paid may be weaker organizations more willing to sacrifice editorial quality in order to stay in business. We already see major news sites with MFA-styled formatting on unvetted syndicated press releases. What next?

    You may notice that everyone is encouraged to “share” – meaning “give away” – but sharing doesn’t seem to extend to the big tech companies, themselves.

    They charge per click.

    Robots.txt

    One argument is that if someone doesn’t like Google, or any search engine, they should simply block that search engine via robots.txt. The problem with that argument is it’s like saying if you don’t like aspects of your city, you should move to the middle of the wilderness. You could, but really you’d just like to make the city a better place to be, and to see it thrive and prosper, and be able to thrive within it.

    Google provides useful things. I use Google, just like I use my iPhone. I know the deal. I get the utility in exchange for information, and this exchange is largely on their terms. What Lanier proposes is a solution that not only benefits the individual, and the little guy, but ultimately the big information companies, themselves.

    Money Go Round

    Technology improvements have created much prosperity and the development of a strong middle class. But the big difference today is that what is being commoditized is information itself. In a world increasingly controlled by software that acts as our interface to information, if we commoditize information then we commoditize everything else.

    If those creating the information don’t get paid, quality must decrease, or become less available than it otherwise would be. They can buy less stuff in the real world. If they can’t buy as much stuff in the real world, then Google and Facebook’s advertisers have fewer people to talk to than they otherwise would.

    It was all a social construct to begin with, so what changed, to get to your question, is that at the turn of the [21st] century it was really Sergey Brin at Google who just had the thought of, well, if we give away all the information services, but we make money from advertising, we can make information free and still have capitalism. But the problem with that is it reneges on the social contract where people still participate in the formal economy. And it’s a kind of capitalism that’s totally self-defeating because it’s so narrow. It’s a winner-take-all capitalism that’s not sustaining

    That isn’t a sustainable situation long-term. A winner-takes-all system centralizes wealth and power at the top, whilst everyone else occupies the long tail. Google has deals in place with large publishers, such as AP, AFP and various European agencies, but this doesn’t extend to smaller publishers. It’s the same in sports. The very top get paid ridiculous amounts of money whilst those only a few levels down are unlikely to make rent on their earnings.

    But doesn’t technology create new jobs? People who were employed at Kodak just go do something else?

    The latest waves of high tech innovation have not created jobs like the old ones did. Iconic new ventures like Facebook employ vastly fewer people than big older companies like, say, General Motors. Put another way, the new schemes… channel much of the productivity of ordinary people into an informal economy of barter and reputation, while concentrating the extracted old-fashioned wealth for themselves. All activity that takes place over digital networks becomes subject to arbitrage, in the sense that risk is routed to whoever suffers lesser computation resources

    The people who will do well in such an environment will likely be employees of those who own the big data networks, like Google. Or they will be the entrepreneurial and adaptable types who manage to get close to them – the companies that serve WalMart or Google, or Facebook, or large financial institutions, or leverage off them – but Lanier argues there simply aren’t enough of those roles to sustain society in a way that gave rise to these companies in the first place.

    He argues this situation disenfranchises too many people, too quickly. And when that happens, the costs spread to everyone, including the successful owners of the networks. They become poorer than they would otherwise be by not returning enough of the value that enables the very information they need to thrive. Or another way of looking at it – who’s going to buy all the stuff if only a few people have the money?

    The network, whether it be a search engine, a social network, an insurance company, or an investment fund, uses information to concentrate power. Lanier argues they are all the same, as they operate in pretty much the same way. They use network effects to mine and crunch big data, and this, in turn, grows their position at the expense of smaller competitors, and the ecosystem that surrounds them.

    It doesn’t really matter what the intent was. The result is that the technology can prevent the middle class from prospering and when that happens, everyone ultimately loses.

    So What Does He Propose Can Be Done?

    A few days ago, Matt Cutts released a video about what site owners can expect from the next round of Google changes.

    Google have announced a web spam change, called Penguin 2.0. They’ll be “looking at” advertorials, and native advertising. They’ll be taking a “stronger line” on this form of publishing. They’ll also be “going upstream” to make link spammers less effective.

    Of course, whenever Google release these videos, the webmaster community goes nuts. Google will be making changes, and these changes may either make your day, or send you to the wall.

    The most interesting aspect of this, I think, is the power relationship. If you want to do well in Google’s search results then there is no room for negotiation. You either do what they want or you lose out. Or you may do what they want and still lose out. Does the wealth and power sit with the publisher?

    Nope.

    In other news, Google just zapped another link network.

    Cutts warns they’ll be going after a lot more of this. Does wealth and power sit with the link buyer or seller?

    Nope.

    Now, Google are right to eliminate or devalue sites that they feel devalues their search engine. Google have made search work. Search was all but dead twelve years ago due to the ease with which publishers could manipulate the results, typically with off-topic junk. The spoils of solving this problem have flowed to Google.

    The question is has too much wealth flowed to companies like Google, and is this situation going to kill off large chunks of the ecosystem on which it was built? Google isn’t just a player in this game, they’re so pervasive they may as well be the central planner. Cutts is running product quality control. The customers aren’t the publishers, they’re the advertisers.

    It’s also interesting to note what these videos do not say. Cutts video was not about how your business could be more prosperous. It was all about your business doing what Google wants in order for Google to be more prosperous. It’s irrelevant if you disagree or not, as you don’t get to dictate terms to Google.

    That’s the deal.

    Google’s concern lies not with webmasters, just as WalMart’s concern lies not with small town retailers. Their concern is to meet company goals and enhance shareholder value. The effects aren’t Google’s or WalMart’s fault. They are just that – effects.

    The effect of Google pursuing those objectives might be to gouge out the value of publishing, and in so doing, gouge out a lot of the value of the middle class. The Google self-drive cars project is fascinating from a technical point of view – the view Google tends to focus on – but perhaps even more fascinating when looked at from a position they seldom seem to consider, at least, not in public, namely what happens to all those taxi drivers, and delivery drivers, who get their first break in society doing this work? Typically, these people are immigrants. Typically, they are poor but upwardly mobile.

    That societal effect doesn’t appear to be Google’s concern.

    So who’s concern should it be?

    Well, perhaps it really should be Google’s concern, as it’s in their own long-term best interest:

    Today, a guitar manufacturer might advertise through Google. But when guitars are someday spun out of 3D printers, there will be no one to buy an ad if guitar designs are “free”. Yet Google’s lifeblood is information put online for free. That is what Google’s servers organize. Thus Google’s current business model is a trap in the longterm

    Lanier’s suggestion is that everyone gets paid, via micro-payments, linked back to the value they helped create. These payments continue so long as people are using their stuff, be it a line of code, a photograph, a piece of music, or an article.

    For example, if you wrote a blog post, and someone quoted a paragraph of it, you would receive a tiny payment. The more often you’re quoted, the more relevant you are, therefore the more payment you receive. If a search engine indexes your pages, then you receive a micro-payment in return. If people view your pages, you receive a micro-payment. Likewise, when you consume, you pay. If you conduct a search, then you run Google’s code, and Google gets paid. The payments are tiny, as far as the individual is concerned, but they all add up.
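As a toy illustration of that accounting (the creator names, content IDs, and per-use rate are all invented for the sketch, and Lanier does not specify an implementation), the core of the idea is just a provenance record plus a ledger that credits the creator on every recorded use:

```python
from collections import defaultdict

RATE_PER_USE = 0.001  # hypothetical credit per quote, index, or view

ledger = defaultdict(float)   # creator -> accumulated credit
provenance = {                # content id -> creator (the two-way link)
    "post-123": "alice",
    "photo-9": "bob",
}

def record_use(content_id):
    """Credit the creator each time their content is quoted, indexed, or viewed."""
    ledger[provenance[content_id]] += RATE_PER_USE

# Alice's post is quoted often; Bob's photo is viewed once.
for _ in range(500):
    record_use("post-123")
record_use("photo-9")

print(round(ledger["alice"], 3))  # 0.5
print(round(ledger["bob"], 3))    # 0.001
```

Individually the payments are tiny, but the more a piece of content circulates, the more its creator accumulates, which is exactly the incentive reversal described above.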

    Mammoth technical issues of doing this aside, the effect would be to take money from the head and pump it back into the tail. It would be harder to build empires off the previously free content others produce. It would send money back to producers.

    It also eliminates the piracy question. Producers would want people to copy, remix and redistribute their content, as the more people that use it, the more money they make. Also, with the integration of two-way linking, the mechanism Lanier proposes to keep track of ownership and credit, you’d always know who is using your content.

    Information would no longer be free. It would be affordable, in the broadest sense of the word. There would also be a mechanism to reward the production, and a mechanism to reward the most relevant information the most. The more you contribute to the net, and the more people use it, the more you make. Tiny payments. Incremental.
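    Lanier’s idea can be pictured as a simple two-sided ledger: every use of a piece of content debits the consumer and credits the recorded creator. This is only an illustrative sketch – the class names and the flat per-use rate are made up here, not anything the book specifies:

    ```python
    # Toy sketch of Lanier-style micro-payments: every use of a piece of
    # content (a quote, an index crawl, a page view) sends a tiny payment
    # back to its recorded creator. All names and rates are hypothetical.

    from collections import defaultdict

    RATE_PER_USE = 0.0001  # hypothetical flat micro-payment per use


    class Ledger:
        def __init__(self):
            self.balances = defaultdict(float)
            self.owners = {}  # content_id -> creator (the "two-way link")

        def publish(self, content_id, creator):
            self.owners[content_id] = creator

        def use(self, content_id, consumer):
            """One use of the content: consumer pays, creator earns."""
            creator = self.owners[content_id]
            self.balances[consumer] -= RATE_PER_USE
            self.balances[creator] += RATE_PER_USE


    ledger = Ledger()
    ledger.publish("blog-post-42", "alice")

    # The post gets quoted, indexed and viewed 5,000 times over its life
    for _ in range(5000):
        ledger.use("blog-post-42", "search-engine")

    print(round(ledger.balances["alice"], 2))          # tiny payments add up
    print(round(ledger.balances["search-engine"], 2))  # the consumer pays
    ```

    The payments are trivial individually, but because the ownership record persists, the creator keeps earning for as long as the content keeps being used – which is the mechanism that distinguishes this from a one-off sale.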

    Interesting Questions

    So, if these questions are of interest to you, I’d encourage you to read “Who Owns The Future” by Jaron Lanier. It’s often rambling and heads off on wild tangents – in a good way – and you can tell there is a very intelligent and thoughtful guy behind it all. He’s asking some pretty big, relevant questions. His answers are sketches that should be challenged, argued, debated and enlarged.

    And if big tech companies want a challenge that really will change the world, perhaps they could direct all that intellect, wealth and power towards enriching the ecosystem at a pace faster than they potentially gouge it.


    LarryWorld

    It’s hard to disagree with Larry Page.

    In his recent speech at Google I/O, Page talked about privacy and how it impairs Google. “Why are people so focused on keeping their medical history private?” If only people would share more, then Google could do more.

    Well, quite.

    We look forward to Google taking the lead in this area and opening up their systems to public inspection. Perhaps they could start with the search algorithms. If Google would share more, publishers could do more.

    What’s not to like? :)

    But perhaps that’s comparing apples with oranges. The two areas may not be directly comparable as the consequences of opening up the algorithm would likely destroy Google’s value. Google’s argument against doing so has been that the results would suffer quality issues.

    Google would not win.

    TechnoUtopia

    If Page’s vision sounds somewhat utopian, then perhaps we should consider where Google came from.

    In a paper entitled “The Politics Of Search: A Decade Retrospective”, Laura Granka points out that when Google started out, the web was a more utopian place.

    A decade ago, the Internet was frequently viewed through a utopian lens, with scholars predicting that this increased ability to share, access, and produce content would reduce barriers to information access…Underlying most of this work is a desire to prevent online information from merely mimicking the power structure of the conglomerates that dominate the media landscape. The search engine, subsequently, is seen as an idealized vehicle that can differentiate the Web from the consolidation that has plagued ownership and content in traditional print and broadcast media

    At the time, researchers Introna and Nissenbaum felt that online information was too important to be shaped by market forces alone. They correctly predicted this would lead to a loss of information quality, and a lack of diversity, as information would pander to popular tastes.

    They advocated, perhaps somewhat naively in retrospect, public oversight of search engines and algorithm transparency to correct these weaknesses. They argued that doing so would empower site owners and users.

    Fast forward to 2013, and there is now more skepticism about such utopian values. Search engines are seen as the gatekeepers of information, yet they remain secretive about how they determine what information we see. Sure, they talk about their editorial process in general terms, but the details of the algorithms remain a closely guarded secret.

    In the past decade, we’ve seen a considerable shift in power away from publishers and towards the owners of big data aggregators, like Google. Information publishers are expected to be transparent – so that a crawler can easily gather information, or a social network can be, well, social – and this has advantaged Google and Facebook. It would be hard to run a search engine or a social network if publishers didn’t buy into this utopian vision of transparency.

    Yet, Google aren’t quite as transparent with their own operation. If you own a siren server, then you want other people to share and be open. But the same rule doesn’t apply to the siren server owner.

    Opening Up Health

    Larry is concerned about constraints in healthcare, particularly around access to private data.

    “Why are people so focused on keeping their medical history private?” Page thinks it’s because people are worried about their insurance. This wouldn’t happen if there was universal care, he reasons.

    I don’t think that’s correct.

    People who live in areas where there is universal healthcare, like the UK, Australia and New Zealand, are still very concerned about the privacy of their data. People are concerned that their information might be used against them, not just by insurance companies, but by any company, not to mention government agencies and their employees.

    People just don’t like the idea of surveillance, and they especially don’t like the idea of surveillance by advertising companies who operate inscrutable black boxes.

    Not that good can’t come from crunching the big data linked to health. Page is correct in saying there is a lot of opportunity to do good by applying technology to the health sector. But first companies like Google need to be a lot more transparent about their own data collection and usage in order to earn trust. What data are they collecting? Why? What is it used for? How long is it kept? Who can access it? What protections are in place? Who is watching the watchers?

    Google goes some way towards providing transparency with their privacy policy. A lesser-known facility, called Data Liberation, allows you to move your data out of Google, if you wish.

    I’d argue that for people to trust Google to the level Page demands would require a lot more rigor and transparency, including third-party audit. There are also considerable issues to overcome in terms of government legislation, such as privacy acts. Perhaps the most important question is “how does this shift power balances?” No turkey votes for an early Christmas. If your job relies on being a gatekeeper of health information, you’re hardly going to hand that responsibility over to Google.

    So, it’s not a technology problem. And it’s not just because people are afraid of insurance companies. And it’s not because people aren’t on board with the whole Burning-Man-TechnoUtopia vision. It’s to do with trust. People would like to know what they’re giving up, to whom, and what they’re getting in return. And it’s about power and money.

    Page has answered some of these questions, but not nearly enough of them. Something might be good for Google, and it might be good for others, but people want a lot more than just his word on it.

    Sean Gallagher writes in ArsTechnica:

    The changes Page wants require more than money. They require a change of culture, both political and national. The massively optimistic view that technology can solve all of what ails America—and the accompanying ideas on immigration, patent reform, and privacy—are not going to be so easy to force into the brains of the masses.

    The biggest reason is trust. Most people trust the government because it’s the government—a 226-year old institution that behaves relatively predictably, remains accountable to its citizens, and is governed by source code (the Constitution) that is hard to change. Google, on the other hand, is a 15-year old institution that is constantly shifting in nature, is accountable to its stockholders, and is governed by source code that is updated daily. You can call your Congressman and watch what happens in Washington on C-SPAN every day. Google is, to most people, a black box that turns searches and personal data into cash.

    And it may do so at their expense, not benefit.


    Why the Yahoo! Search Revenue Gap Won’t Close

    In spite of Yahoo! accepting revenue guarantees for another year from Microsoft, recently there has been speculation that Yahoo! might want to get out of their search ad deal with Microsoft. I am uncertain if the back channeled story is used as leverage to secure ongoing minimum revenue agreements, or if Yahoo! is trying to set the pretext narrative to later be able to push through a Google deal that might otherwise get blocked by regulators.

    When mentioning Yahoo!’s relative under-performance on search, it would be helpful to point out the absurd amount of their “search” traffic from the golden years that was various forms of arbitrage. Part of the reason (likely the primary reason) Yahoo! took such a sharp nose dive in terms of search revenues (from $551 million per quarter to as low as $357 million per quarter) was that Microsoft used quality scores to price down the non-search arbitrage traffic streams, and a lot of that incremental “search” volume Yahoo! had went away.

    There were all sorts of issues in place that are rarely discussed. Exit traffic, unclosable windows, forcing numerous clicks, iframes in email spam, raw bot clicks, etc. … and some of this was tied to valuable keyword lists or specific juicy keywords. I am not saying that Google has outright avoided all arbitrage (Ask does boatloads of it in paid + organic & Google at one point tested doing some themselves on credit card keywords) but it has generally been a sideshow at Google, whereas it was the main attraction at Yahoo!.

    And that is what drove down Yahoo!’s click prices.

    Yahoo! went from almost an “anything goes” approach to their ad feed syndication, to the point where they made a single syndication partner Cyberplex’s Tsavo Media pay them $4.8 million for low quality traffic. There were a number of other clawbacks that were not made public.

    Given that we are talking $4.8 million for a single partner & this alleged overall revenue gap between Google AdWords & Bing Ads is somewhere in the $100 million or so range, these traffic quality issues & Microsoft cleaning up the whoring of the ad feed that Yahoo! partners were doing is a big deal. It had a big enough impact that it caused some of the biggest domain portfolios to shift from Yahoo! to Google. I am a bit surprised to see it so rarely mentioned in these discussions.

    Few appreciate how absurd the abuses were. For years Yahoo! not only required you to buy syndication (they didn’t have a Yahoo!-only targeting option until 2010 & that only came about as a result of a lawsuit) but even when you blocked a scammy source of traffic, if that scammy source was redirecting through another URL you would have no way of blocking the actual source, as mentioned by Sean Turner:

    To break it down, Yahoo! gives you a feed for seobook.com & you give me a feed for turner.com. But all links that are clicked on turner.com redirect through seobook.com, so that it shows up in customer logs as seobook.com. If you block seobook.com, it will block ads from seobook.com, but not turner.com. The blocked domain tool works on what domains display, not on where the feed is redirected through. So if you are a customer, there is no way to know that turner.com is sending traffic (since it’s redirecting through seobook.com) and no way to block it through seobook.com since that tool only works on the domain that is actually displaying it.

    I found it because we kept getting traffic from gogogo.com. We had blocked it over and over and couldn’t figure out why they kept sending us traffic. We couldn’t find our ad on their site. I went to live.com and ran a site:gogogo.com search and found that it indexed some of those landing pages that use gogogo.com as a monetization service.
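    The mismatch Turner describes is easy to model: the advertiser’s logs show the redirect domain, while the blocking tool matches the displaying domain, so the two never line up. A toy sketch of that flaw, with made-up field names:

    ```python
    # Toy model of the blocking flaw described above: logs record the
    # domain the click redirected through, but the block tool matches the
    # domain that actually displayed the ad. Field names are hypothetical.

    def log_entry(click):
        # What the advertiser sees in their logs
        return click["redirects_through"] or click["display_domain"]

    def is_served(click, block_list):
        # The block tool only matches the *displaying* domain
        return click["display_domain"] not in block_list

    clicks = [
        {"display_domain": "seobook.com", "redirects_through": None},
        {"display_domain": "turner.com", "redirects_through": "seobook.com"},
    ]

    # The advertiser blocks the only domain their logs ever show them
    block_list = {"seobook.com"}

    for click in clicks:
        status = "blocked" if not is_served(click, block_list) else "served"
        print(log_entry(click), "->", status)
    ```

    The second click is still served: it displays on turner.com, which the advertiser cannot block because their logs only ever show seobook.com. That is precisely why the gogogo.com traffic in the example above kept arriving despite being “blocked” over and over.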

    The other thing that isn’t mentioned is the longterm impact of a Yahoo! tie up with Google. Microsoft pays Yahoo! an 88% revenue share (and further guarantees on top of that), provides the organic listings free, manages all the technology, and allows Yahoo! to insert their own ads in the organic results.

    If Bing were to exit the online ad market, maybe Yahoo! could make an extra $100 million in the first year of an ad deal with Google, but if there is little to no competition a few years down the road, then when it comes time for Yahoo! to negotiate revenue share rates with Google, you know Google would cram down a bigger rake.

    This isn’t blind speculation or theory, but aligned with Google’s current practices. Look no further than Google’s current practices with YouTube, where “partners” are paid different rates & are forbidden to mention their rates publicly: “The Partner Program forbids participants to reveal specifics about their ad-share revenue.”

    Transparency is a one way street.

    Google further dips into leveraging that “home team always wins” mode of negotiating rates by directly investing in some of the aggregators/networks which offer sketchy confidential contracts soaking the original content creators (http://obviouslybenhughes.com/post/13933948148/before-you-sign-that-machinima-contract-updated):

    As I said, the three images were posted on yfrog. They were screenshots of an apparently confidential conversation had between MrWonAnother and a partner support representative from Machinima, in which the representative explained that the partner was locked indefinitely into being a Machinima partner for the rest of eternity, as per signed contract. I found this relevant, informative and honestly shocking information and decided to repost the images to obviouslybenhughes.com in hopes that more people would become aware of the darker side of YouTube partnership networks.

    Negotiating with a monopoly that controls the supply chain isn’t often a winning proposition over the long run.

    Competition (or at least the credible risk of it) is required to shift the balance of power.

    The flip side of the above situation – where competition does help market participants to get a better revenue share – can be seen in the performance of AOL in their ad negotiation in 2005. AOL’s credible threat to switch to Microsoft had Google invest a billion dollars into AOL, of which Google later had to write down $726 million. If there had been no competition from Microsoft, AOL wouldn’t have received that $726 million (and would likely have had a lower revenue-sharing rate and missed out on some of the promotional AdWords credits they received).

    The same sort of “shifted balance of power” was seen in the Mozilla search renewal with Google, where Google paid Mozilla 3X as much due to a strong bid from Microsoft.

    The iPad search results are becoming more like phone search results, where ads dominate the interface & a single organic result is above the fold. And Google pushed their “enhanced” ad campaigns to try to push advertisers into paying higher ad rates on those clicks. It would be a boon for Google if they can force advertisers to pay the same CPC as desktop & couple it with that high mobile ad CTR.

    Google owning Chrome + Android & doing deals with Apple + Mozilla means that it will be hard for either Microsoft or Yahoo! to substantially grow search marketshare. But if they partner with Google it will be a short term lift in revenues and dark clouds on the horizon.

    I am not claiming that Microsoft is great for Yahoo!, or that they are somehow far better than Google, only that Yahoo! is in a far better position when they have multiple entities competing for their business (as highlighted in the above Mozilla & AOL examples).

    Link Madness

    Link paranoia is off the scale. As the “unnatural link notifications” fly, the normally jittery SEO industry has moved deep into new territory, of late.

    I have started to wonder if some of these links (there are hundreds since the site is large) may be hurting my site in the Google Algo. I am considering changing most of my outbound links to rel=”nofollow”. It is not something I want to do but…

    We’ve got site owners falling to their knees, confessing to being link spammers, and begging for forgiveness. Even when they do, many sites don’t return. Some sites have been returned, but their rankings, and traffic, haven’t recovered. Many sites carry similar links, but get a free pass.

    That’s the downside of letting Google dictate the game, I guess.

    Link Removal

    When site owners are being told by Google that their linking is “a problem,” they tend to hit the forums and spread the message, so the effect is multiplied.

    Why does Google bother with the charade of “unnatural link notifications,” anyway?

    If Google has found a problem with links to a site, then they can simply ignore or discount them, rather than send reports prompting webmasters to remove them. Alternatively, they could send a report saying they’ve already discounted them.

    So one assumes Google’s strategy is a PR – as in public relations – exercise to plant a bomb between link buyers and link sellers. Why do that? Well, a link is a link, and one could conclude that Google must still have problems nailing down the links they don’t like.

    So they get some help.

    The disavow links tool, combined with a re-inclusion request, is pretty clever. If you wanted a way to get site owners to admit to being link buyers, and to point out the places from which they buy links, or you want to build a database of low quality links, for no money down, then you could hardly imagine a better system of outsourced labour.

    If you’re a site owner, getting hundreds, if not thousands, of links removed is hardly straightforward. It’s difficult, takes a long time, and is ultimately futile.

    Many site owners inundated with link removal requests have moved to charging removal fees, which in many cases is understandable, given it takes some time and effort to verify the true owner of a link, locate the link, remove it, and update the site.

    As one rather fed-up sounding directory owner put it:

    Blackmail? Google’s blackmailing you, not some company you paid to be listed forever. And here’s a newsflash for you. If you ask me to do work, then I demand to be paid. If the work’s not worth anything to you, then screw off and quit emailing me asking me to do it for free.

    Find your link, remove it, confirm it’s removed, email you a confirmation, that’s 5 minutes. And that’s $29US. Don’t like it? Then don’t email me. I have no obligation to work for you for free, not even for a minute. …. I think the best email I got was someone telling me that $29 was extortion. I then had to explain that $29 wasn’t extortion – but his new price of $109 to have the link removed, see, now THAT’S extortion.

    if it makes you sleep at night, you might realize that you paid to get in the directory to screw with your Google rankings, now you get to pay to get out of the directory, again to screw your Google rankings. That’s your decision, quit complaining about it like it’s someone else’s fault. Not everyone has to run around in circles because you’re cleaning up the very mess that you made

    Heh.

    In any case, if these links really did harm a site – which is arguable – then it doesn’t take a rocket scientist to guess the next step. Site owners would be submitting their competitors’ links to directories thick and fast.

    Cue Matt Cutts on negative SEO….

    Recovery Not Guaranteed

    Many sites don’t recover from Google penalties, no matter what they do.

    It’s conceivable that a site could have a permanent flag against it no matter what penance has been paid. Google takes into account your history in AdWords, so it’s not a stretch to imagine similar flags may continue to exist against domains in their organic results.

    The most common reason is not what they’re promoting now, it’s what they’ve promoted in the past.
    Why would Google hold that against them? It’s probably because of the way affiliates used to churn and burn domains they were promoting in years gone by…

    This may be the reason why some recovered sites just don’t rank like they used to after they’ve returned. They may carry permanent negative flags.

    However, the reduced rankings and traffic when/if a site does return may have nothing to do with low-quality links or previous behaviour. There are many other factors involved in ranking and Google’s algorithm updates aren’t sitting still, so it’s always difficult to pin down.

    Which is why the SEO environment can be a paranoid place.

    Do Brands Escape?

    Matt Cutts is on record discussing big brands, saying they get punished, too. You may recall the case of Interflora UK.

    Google may well punish big brands, but the punishment might be quite different to the punishment handed out to a no-brand site. It will be easier for a big brand to return, because if Google don’t show what Google users expect to see in the SERPs then Google looks deficient.

    Take, for example, this report received – amusingly – by the BBC:

    I am a representative of the BBC site and on Saturday we got a ‘notice of detected unnatural links’. Given the BBC site is so huge, with so many independently run sub sections, with literally thousands of agents and authors, can you give us a little clue as to where we might look for these ‘unnatural links’?

    If I was the BBC webmaster, I wouldn’t bother. Google isn’t going to dump the BBC sites as Google would look deficient. If Google has problems with some of the links pointing to the BBC, then perhaps Google should sort it out.

    Take It On The Chin, Move On

    Many of those who engaged in aggressive link tactics knew the deal. They went looking for an artificial boost in relevancy, and as a result of link building, they achieved a boost in the SERPs.

    That is playing the game that Google, a search engine that factors in backlinks, “designed”. By design, Google rewards well-linked sites by ranking them above others.

    The site owners enjoyed the pay-off at the expense of their less aggressive competitors. The downside – there’s always a downside – is that Google may spot the artificial boost in relevancy, now or in the future, and may slam the domain as a result.

    That’s part of the game, too.

    Some cry about it, but Google doesn’t care about crying site owners, so site owners should build that risk into their business case from the get go.

    Strategically, there are two main ways of looking at “the game”:

    Whack A Mole: Use aggressive linking for domains you’re prepared to lose. If you get burned, then that’s a cost of playing the game. Run multiple domains using different link graphs for each and hope that a few survive at any one time, thus keeping you in the game. If some domains get taken out, then take it on the chin. Try to get reinstated, and if you can’t, then torch them and move on.

    Ignore Google: If you operate like Google doesn’t exist, then it’s pretty unlikely Google will slam you, although there are no guarantees. In any case, a penalty and a low ranking are the same thing in terms of outcome.

    Take one step back. If your business relies on Google rankings, then that’s a business risk. If you rely entirely on Google rankings, then that’s a big business risk. I’m not suggesting it’s not a risk worth taking, but only you can decide what risks make sense for your business.

    If the whack a mole strategy is not for you, and you want to lower the business risk of Google’s whims, then it makes sense to diversify the ways in which you get traffic so that if one traffic stream fails, then all is not lost. If you’re playing for the long term, then establishing brand, diversifying traffic, and treating organic SEO traffic as a bonus should be considerations. You then don’t need to worry about what Google may or may not do as Google aren’t fueling your engine.

    Some people run both these strategies simultaneously, which is an understandable way of managing risk. Most people probably sit somewhere in the middle and hope for the best.

    Link Building Going Forward

    The effect of Google’s fear, uncertainty and doubt strategy is that a lot of site owners are going to be running scared or confused, or both.

    Just what is acceptable?

    Trouble is, what is deemed acceptable today might be unacceptable next week. It’s pretty difficult, if not impossible, for a site owner to wind the clock back once they undertake a link strategy, and who knows what will be deemed unacceptable in a year’s time.

    Of course, Google doesn’t want site owners to think in terms of a “link strategy”, if the aim of said link strategy is to “inflate rankings”. That maxim has remained constant.

    If you want to take a low-risk approach, then it pays to think of Google traffic as a bonus. Brett Tabke, founder of WebmasterWorld, used to keep a sticker on his monitor that said “Pretend The Search Engines Don’t Exist”, or words to that effect. I’m reminded of how useful that message still is today, as it’s a prompt to think strategically beyond SEO. If you disappeared from Google today, would your business survive? If the answer is no, then you should revise your strategy.

    Is there a middle ground?

    Here are a few approaches to link building that will likely stand the test of time, and incorporate strategy that provides resilience from Google’s whims. The key is having links for reasons besides SEO, even if part of their value is higher rankings.

    1. Publisher

    Publish relevant, valuable content, as determined by your audience.

    It’s no longer enough to publish pages of information on a topic; the information must have demonstrable utility, i.e. other people need to deem it valuable, reference it, visit it, and talk about it. Instead of putting your money into buying links, you put your money into content development and then marketing it to people. The links will likely follow. This is passive link acquisition.

    It’s unlikely these types of links will ever be a problem, as the link graph is not going to look contrived. If any poor quality links slip into this link graph, then they’re not going to be the dominant feature. The other signals will likely trump them and therefore diminish their impact.

    Build brand based on unique, high-quality information, then market it to people via multiple channels, and the links tend to follow, which then boosts your ranking in Google. Provide a high degree of utility, first and foremost.

    One problem with this model is that it’s easy for other people to steal your utility. This is a big problem and prevents investment in quality content. One way of getting around this is to use some content as a loss-leader and lock the rest away behind paywalls. You give the outside world, and Google, just enough, but if they want the rest, then they’re going to need to sign up.

    Think carefully about the return on giving the whole farm away to a crawler. Think about providing utility, not “content”.

    2. Differentiation

    There is huge first mover advantage when it comes to getting links.

    If a new field opens up, and you get there first, or early, then it’s relatively easy to build a credible link graph. As a field expands, the next layer involves a lot of meta activity i.e. bloggers, media and other information curators writing about that activity. At the start of any niche, there aren’t many players to talk about, so the early movers get all the links.

    As a field matures, you get a phenomenon Mike Grehan aptly characterised as “filthy linking rich”:

    The law of “preferential attachment”, as it is also known, wherein new links on the web are more likely to go to sites that already have many links, proves that the scheme is inherently biased against new and unknown pages. When search engines constantly return popular pages at the top of the pile, more web users discover those pages and more web users are likely to link to them.

    Those who got all those links early on will receive more and more links over time because they are top of the results. They just need to keep doing what they’re doing. It becomes very difficult for late entrants to beat them unless they do something extraordinary. By definition, that probably means shifting the niche to a new niche.
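    The rich-get-richer dynamic is easy to demonstrate with a toy simulation, where each new link picks a target with probability proportional to the target’s existing link count. The site names and parameters here are purely illustrative:

    ```python
    # Toy simulation of preferential attachment ("filthy linking rich"):
    # each new link chooses a target weighted by existing link counts.

    import random

    random.seed(1)  # fixed seed so the run is repeatable

    # Ten early movers, each starting with a single link
    links = {f"site{i}": 1 for i in range(10)}

    # 1,000 new links arrive over time
    for _ in range(1000):
        sites = list(links)
        weights = [links[s] for s in sites]
        winner = random.choices(sites, weights=weights)[0]
        links[winner] += 1

    # A handful of sites end up with most of the links;
    # the rest stay small, despite identical starting positions.
    print(sorted(links.values(), reverse=True))
    ```

    Even with every site starting equal, small early leads compound into a heavily skewed distribution – which is why late entrants need something remarkable, rather than parity, to catch up.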

    If you’re late to a crowded field, then you need to think in terms of differentiation. What can you offer that the rest do not? New content in such fields must be remarkable, i.e. worth remarking upon.

    Is that field moving in a new direction? If so, can you pivot in that direction and be first mover in that direction? Look not where a niche currently is, but where it’s going, then position ahead of it.

    “Same old, same old” content doesn’t get linked to, engaged with, ranked, or remarked upon – and why should it? The web is not short of content. The web has so much content that companies like Google have made billions trying to reduce it to a manageable set of ten links.

    3. Brand

    Brand is the ultimate Google-protection tactic.

    It’s not that brands don’t get hammered by Google occasionally, because they do. But what tends to differ is the sentence handed down. The bigger the brand, the lighter the sentence, or the shorter the sentence, because no matter how much WalMart or The Office Of The President Of The United States Of America spams Google, Google must show such sites. I’m not suggesting these sites engage in aggressive SEO tactics, or need to, but we know they’ll always be in Google.

    You don’t have to be a big brand. You do need search volume on your distinctive brand name. If you’re well known enough in your niche i.e. you attract significant type-in search volume, Google must show you or appear deficient.

    This is not to say having a brand means you can get away with poor behavior. But the more type-in traffic for your brand, the more pressure there is on Google to rank you.

    Links to a brand name will almost never look forced in the same way a link in a footer to “cheap online pharmacy” looks forced. People know your name, they link to you by name, and they talk about you by name – naturally.

    The more generic your site, the more vulnerable you are, as it’s very difficult to own a space when you’re aligning with generic keyword terms. The links are always going to look a little – or a lot – forced.

    This is not to say you shouldn’t get links with keywords in them, but build a distinctive brand, too. The link graph will appear mostly natural – because it is. A few low quality links won’t trump the good signals created by a lot of natural brand links.

    4. Engagement

    The web is a place.

    This place is filled with people. There are relationships between people. Relationships between people on the web are almost always expressed as a link. It might be a Facebook link, a Twitter link, a comment link, a blog link, but they’re all links. It doesn’t matter if they’re crawlable or not, or no-followed or not; it still indicates a relationship.

    If Google is to survive, it must figure out these relationships.

    That’s why all links – apart from negative SEO – are good links. The more signals of a real relationship, the better you *should* be ranked, because you are more relevant, in an objective sense.

    So look for ways to foster relationships and engagement. It might be guest posting. It might be commenting on someone else’s site. It might be contributing to forums. It might be interviewing people. It might be accepting interviews. It might be forging relationships with the press. It might be forging relationships with business organisations. It might be contributing to charity. It might be running competitions. It might be attending conferences. It might be linking out to influential people.

    It’s all networking.

    And wherever you network, you should be getting links as a byproduct.

    One potential problem:

    Provide long – well, longer than 400 words – unique, editorial articles. Articles also need to get linked to, and engaged with. Articles need to be placed on sites where they’ll be seen, as opposed to content farms.

    Ask yourself “am I providing genuine utility?”

    5. Fill A Need

    This is similar to differentiation, but a little more focused.

    Think about the problems people have in a niche. The really hard problems to solve. “How to”, “tips”, “advice”, and so on.

    Solving a genuine problem for people tends to make people feel a sense of obligation, especially if they would otherwise have to pay for that help. If you can twist that obligation towards getting a link, all the better. For example, “if this article/video/whatever helped you, no payment necessary! But it would be great if you link to us/follow us on Twitter/” and so on. It doesn’t need to be that overt, although sometimes overt is what is required. It fosters engagement. It builds your network. And it builds links.

    Think about ways you can integrate a call-to-action that results in a link of some kind.

    Coda

    In other news, Caesars Palace bans Google :)


    Measuring Social Media

    Measuring PPC and SEO is relatively straightforward. But how do we go about credibly measuring social media campaigns, and wider public relations and audience awareness campaigns?

    As the hype level of social media starts to fall, more questions are asked about return on investment. During the early days of anything, the hype of the new is enough to sustain an endeavor. People don’t want to miss out. If their competitors are doing it, that’s often seen as good enough reason to do it, too.

    You may be familiar with the “hype cycle” graph, which is typically used to demonstrate the maturity, adoption and social application of specific technologies.

    Where would social media marketing be on this graph?

    I think a reasonable guess, if we’re seeing more and more discussion about ROI, is somewhere on the “slope of enlightenment”. In this article, we’ll look at ways to measure social media performance by grounding it in the only criteria that truly matter – business fundamentals.

    Public Relations

    We’ve talked about the Cluetrain Manifesto and how the world changed when corporations could no longer control the message. If the message can no longer be controlled, then measuring the effectiveness of public relations becomes even more problematic.

    PR used to be about crafting a message and placing it, and nurturing the relationships that allowed that to happen. With the advent of social media, that’s still true, but the scope has expanded exponentially – everyone can now repeat, run with, distort, reconfigure and reinvent the messages. Controlling the message was always difficult, but now it’s impossible.

    On the plus side, it’s now much easier to measure and quantify the effectiveness of public relations activity due to the wealth of web data and tools to track what people are saying, to whom, and when.

    The Same, Only Different

    The more things change, the more they stay the same. PR and social media are still about relationships. And getting relationships right pays off:

    Today, I want to write about something I’d like to call the “Tim Ferriss Effect.” It’s not exclusive to Tim Ferriss, but he is I believe the marquee example of a major shift that has happened in the last 5 years within the world of book promotion. Here’s the basic idea: When trying to promote a book, the main place you want coverage is on a popular single-author blog or site related to your topic… The post opened with Tim briefly explaining how he knew me, endorsing me as a person, and describing the book (with a link to my book.) It then went directly into my guest post – there was not even an explicit call to action to buy my book or even any positive statements about my book. An hour later, I was #45 on Amazon’s best seller list.

    Public relations is more than about selling, of course. It’s also about managing reputation. It’s about getting audiences to maintain a certain point of view. Social media provides the opportunity to talk to customers and the public directly by using technology to dis-intermediate the traditional gatekeepers.

    Can We Really Measure PR & Social Media Performance?

    How do you measure the value of a relationship?

    Difficult.

    How can you really tell if people feel good enough about your product or service to buy it, and that “feeling good” was the direct result of editorial placement by a well-connected public relations professional?

    Debatable, certainly.

    Can you imagine another marketing discipline that used dozens of methods for measuring results? Take search engine marketing for example. The standards are pretty cut and dry: visitors, page views, time on site, cost per click, etc. For email marketing, we have delivery, open rates, click thru, unsubscribes, opt-ins, etc.

    In previous articles, we’ve looked at how data-driven marketing can save time and be more effective. The same is true of social media, but given it’s not an exact science, it’s a question of finding an appropriate framework.

    There are a lot of people asking questions about social media’s worth.

    No Industry Standard

    Does sending out weekly press releases result in more income? How about tweeting 20 times a day? How much are 5,000 followers on Facebook worth? Without a framework to measure performance, there’s no way of knowing.

    Furthermore, there’s no agreed industry standard.

    In direct marketing channels, such as SEO and PPC, measurement is fairly straightforward. We count cost per click, number of visitors, conversion rate, time on site, and so on. But how do we measure public relations? How do we measure influence and awareness?

    PR firms have often developed their own in-house terms of measurement. The problem is that without industry standards, success criteria can become arbitrary and often used simply to show the agency in a good light and thus validate their fees.

    Some agencies use publicity results, such as the number of mentions in the press, or the type of mention, i.e. prestigious placement. Some use advertising value equivalent, i.e. what the editorial coverage would cost if it were bought as advertising space. Some use public opinion measures, such as polls, focus groups and surveys, whilst others compare mentions and placement vs competitors, i.e. whoever has more, or better, mentions wins. Most use a combination, depending on the nature of the campaign.

    Most business people would agree that measurement is a good thing. If we’re spending money, we need to know what we’re getting for that money. If we provide social media services to clients, we need to demonstrate what we’re doing works, so they’ll devote more budget to it in future. If the competition is using this channel, then we need to know whether we’re using it better, or worse, than they are.

    Perhaps the most significant reason why we measure is to know if we’ve met a desired outcome. To do that we must ignore gut feelings and focus on whether an outcome was achieved.

    Why wouldn’t we measure?

    Some people don’t like the accountability. Some feel more comfortable with an intuitive approach. It can be difficult for some to accept that their pet theories have little substance when put to the test. It seems like more work. It seems like more expense. It’s just too hard. When it comes to social media, some question whether it can be done much at all.

    For proof, look no further than The Atlantic, which shook the social media realm recently with its exposé of “dark social” – the idea that the channels we fret over measuring, like Facebook and Twitter, represent only a small fraction of the social activity that’s really going on. The article shares evidence that the vast majority of sharing is still done through channels like email and IM that are nearly impossible to measure (and thus, dark).

    And it’s not like a lot of organizations are falling over themselves to get measurement done:

    According to a Hypatia Research report, “Benchmarking Social Community Platform Investments & ROI,” only 40% of companies measure social media performance on a quarterly or annual basis, while almost 13% of the organizations surveyed do not measure ROI from social media at all, and another 18% said they do so only on an ad hoc basis. (Hypatia didn’t specify what response the other 29% gave.)

    If we agree that measurement is a good thing and can lead to greater efficiency and better decision making, then the fact your competition may not be measuring well, or at all, presents a great opportunity. We should strive to measure social media ROI, as providers or consumers, or it becomes difficult to justify spend. The argument that we can’t measure because we don’t know all the effects of our actions isn’t a reason not to measure what we can.

    Marketing has never been an exact science.

    What Should We Measure?

    Measurement should be linked back to business objectives.

    In “Measure What Matters”, Katie Delahaye Paine outlines seven steps to social media measurement. I liked these seven steps, because they aren’t exclusive to social media. They’re the basis for measuring any business strategy and similar measures have been used in marketing for a long time.

    It’s all about proving something works, and then using the results to enhance future performance. The book is a great source for those interested in reading further on this topic; I’ll outline the seven steps here.

    1. What Are Your Objectives?

    Any marketing objective should serve a business objective. For example, “increase sales by X by October 31st”.

    Having specific, business driven objectives gets rid of conjecture and focuses campaigns. Someone could claim that spending 30 days tweeting a new message a day is a great thing to do, but if, at the end of it, a business objective wasn’t met, then what was the point?

    Let’s say an objective is “increase sales of shoes compared to last December’s figures”. What might the social strategy look like? It might consist of time-limited offers, as opposed to more general awareness messages. What if the objective was to “get 5,000 New Yorkers to mention the brand before Christmas”? This would lend itself to viral campaigns, targeted locally. Linking the campaign to specific business objectives will likely change the approach.

    If you have multiple objectives, you can always split them up into different campaigns so you can measure the effectiveness of each separately. Objectives typically fall into sales, positioning, or education categories.

    2. Who Is The Audience?

    Who are you talking to? And how will you know if you’ve reached them? Once you have reached them, what is it you want them to do? How will this help your business?

    Your target audience is likely varied. Different audiences could be industry people, customers, supplier organizations, media outlets, and so on. Whilst the message may be seen by all audiences, you should be clear about which messages are intended for whom, and what you want them to do next. The messages will be different for each group, as each group likely picks up on different things.

    Attach a value to each group. Is a media organization picking up on a message more valuable than a non-customer doing so? Again, this should be anchored to a business requirement. “We need media outlets following us so they may run more of our stories in future. Our research shows more stories have led to increased sales volume in the past”. Then a measure might be to count the number of media industry followers, and to monitor the number of stories they produce.

    3. Know Your Costs

    What does it cost you to run social media campaigns? How much time will it take? How does this compare to other types of campaigns? What is your opportunity cost? How much does it cost to measure the campaign?

    As Delahaye Paine puts it, it’s the “I” in ROI.
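    To make the “I” concrete, here’s a minimal sketch of the basic ROI arithmetic. The dollar figures are hypothetical, purely for illustration:

```python
# Hypothetical figures for illustration only -- substitute your own campaign data.
def roi(gain, investment):
    """Return on investment as a fraction: (gain - investment) / investment."""
    return (gain - investment) / investment

# Suppose a campaign cost $2,000 in staff time and tools,
# and attributable sales came to $2,600.
campaign_roi = roi(gain=2600, investment=2000)
print(f"ROI: {campaign_roi:.0%}")  # 30%
```

    The point isn’t the arithmetic – it’s that the “investment” side must include time and measurement costs, not just media spend, or the ratio flatters the campaign.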

    4. Benchmark

    Testing is comparative, so have something to compare against.

    You can compare yourself against competitors, and/or your own past performance. You can compare social media campaigns against other marketing campaigns. What do those campaigns usually achieve? Do social media campaigns work better, or worse, in terms of achieving business goals?

    In terms of ROI, what’s a social media “page view” worth? You could compare this against the cost of a click in PPC.

    5. Define KPIs

    Once you’ve determined objectives, defined the audience, and established benchmarks, you should establish criteria for success.

    For example, the objective might be to increase media industry followers. The audience is the media industry and the benchmark is the current number of media industry followers. The KPI would be the number of new media industry followers signed up, as measured by classifying followers into subgroups and conducting a headcount.

    Measuring the KPI will differ depending on objective, of course. If you’re measuring the number of mentions in the press vs your competitor, that’s pretty easy to quantify.

    “Raising awareness” is somewhat more difficult, however once you have a measurement system in place, you can start to break down the concept of “awareness” into measurable components. Awareness of what? By whom? What constitutes awareness? How do people signal they’re aware of you? And so on.

    6. Data Collection Tools

    How will you collect measurement data?

    • Content analysis of social or traditional media
    • Primary research via online, mail or phone survey
    • Web analytics

    There are an overwhelming number of tools available, and a survey of them is outside the scope of this article. No tool can measure “reputation” or “awareness” or “credibility” by itself, but tools can produce usable data if we break those areas down into suitable metrics. For example, “awareness” could be quantified by “page views + a survey of a statistically valid sample”.

    Half the battle is asking the right questions.

    7. Take Action

    A measurement process is about iteration. You do something, get the results back, act on them and make changes, and arrive at a new status quo. You then do something starting from that new point, and so on. It’s an ongoing process of optimization.

    Were objectives met? What conclusions can you draw?

    Those seven steps will be familiar to anyone who has measured marketing campaigns and business performance. They’re grounded in the fundamentals. Without relating social media metrics back to the underlying fundamentals, we can never be sure whether what we’re doing is making a difference, or is worthwhile. Is 5,000 Twitter followers a good thing?

    It depends.

    What business problem does it address?

    Did You Make A Return?

    You invested time and money. Did you get a return?

    If you’ve linked your social media campaigns back to business objectives you should have a much clearer idea. Your return will depend on the nature of your business, of course, but it could be quantified in terms of sales, cost savings, avoiding costs or building an audience.

    In terms of SEO, we’ve long advocated building brand. Having people conduct brand searches is a form of insurance against Google demoting your site. If you have brand search volume, and Google don’t return you for brand searches, then Google looks deficient.

    So, one goal of social media that gels with SEO is to increase brand awareness. You establish a benchmark of branded searches based on current activity. You run your social media campaigns, and then see if branded searches increase.

    Granted, this is a fuzzy measure, especially if you have other awareness campaigns running, as you can’t be certain of cause and effect. However, it’s a good start. You could give it a bit more depth by integrating a short poll for visitors, i.e. “did you hear about us on Twitter/Facebook/Other?”.

    Mechanics Of Measurement

    Measuring social media isn’t that difficult. In fact, we could just as easily use search metrics in many cases. What is the cost per view? What is the cost per click? Did the click from a social media campaign convert to desired action? What was your business objective for the social media campaign? To get more leads? If so, then count the leads. How much did each lead cost to acquire? How does that cost compare to other channels, like PPC? What is the cost of customer acquisition via social media?
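    For example, comparing cost per lead across channels is simple arithmetic once spend and lead counts are pulled from your analytics. The channel figures below are invented for illustration:

```python
# Hypothetical channel figures for illustration; plug in your own analytics data.
def cost_per_lead(spend, leads):
    """Cost of acquiring one lead through a channel."""
    return spend / leads

channels = {
    "social": {"spend": 1500.0, "leads": 60},  # e.g. sponsored posts + staff time
    "ppc":    {"spend": 2400.0, "leads": 80},  # e.g. paid search spend
}

for name, data in channels.items():
    print(f"{name}: ${cost_per_lead(data['spend'], data['leads']):.2f} per lead")
```

    The hard part isn’t the division – it’s making sure the “spend” figure honestly includes the time cost of running the social channel.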

    In this way, we could split social media out into the customer service side and the marketing side. Engaging with your customers on Facebook may not be all that measurable in terms of direct marketing effects; it’s more of a customer service function. As such, budget for the soft side of social media need not come out of marketing budgets, but customer service budgets. This could still be measured, of course, by running customer satisfaction surveys.

    Is Social Media Marketing Public Relations?

    Look around the web for the differences between PR and social media, and you’ll find a lot of vague definitions.

    Social media is a tool often used for the purpose of public relations. The purpose is to create awareness, and to nurture and guide relationships.

    Public relations is sometimes viewed as a bit of a scam. It’s an area that sucks money, yet can often struggle to prove its worth, often relying on fuzzy, feel-good proclamations of success and vague metrics. It doesn’t help that clients can have unrealistic expectations of PR, and that some PR firms are only too happy to promise the moon:

    PR is nothing like the dark, scary world that people make it out to be—but it is a new one for most. And knowing the ropes ahead of time can save you from setting impossibly high expectations or getting overpromised and oversold by the firm you hire. I’ve seen more than my fair share of clients bringing in a PR firm with the hopes that it’ll save their company or propel a small, just-launched start-up into an insta-Facebook. And unfortunately, I’ve also seen PR firms make these types of promises. Guess what? They’re never kept.

    Internet marketing, in general, has a credibility problem when it doesn’t link activity back to business objectives.

    Part of that perception, in relation to social media, comes from the fact public relations is difficult to control:

    The main conduit to mass publics, particularly with a consumer issue such as rail travel or policing, are the mainstream media. Unlike advertising, which has total control of its message, PR cannot convey information without the influence of opinion, much of it editorial. How does the consumer know what is fact, and what has influenced the presentation of that fact?

    But lack of control of the message, as the Cluetrain Manifesto points out, is the nature of the environment in which we exist. Our only choice, if we are to prosper in the digital environment, is to embrace the chaos.

    Shouldn’t PR just happen? If you’re good, people just know? Well, even Google, that well known, engineering-driven advertising company, has had PR deeply embedded almost from day one:

    David Krane was there from day one as Google’s first public relations official. He’s had a hand in almost every single public launch of a Google product since the debut of Google.com in 1999.

    Good PR is nurtured. It’s a process. The way to find out whether it’s good PR or ineffective PR is to measure it. PR isn’t a scam, any more than any other marketing activity is a scam. We can find out if it’s worthwhile only by tracking and measuring, and linking that measurement back to a business case. Scams lack transparency.

    The way to get transparency is to measure and quantify.


    Getting Granular With User Generated Content

    The stock market had a flash crash today after someone hacked the AP account & made a fake announcement about bombs going off at the White House. Recently Twitter’s search functionality has grown so inundated with spam that I don’t even look at the brand related searches much anymore. While you can block individual users, it doesn’t block them from showing up in search results, so there are various affiliate bots that spam just about any semi-branded search.

    Of course, for as spammy as the service is now, it was worse during the explosive growth period, when Twitter had fewer than 10 employees fighting spam:

    Twitter says its “spammy” tweet rate of 1.5% in 2010 was down from 11% in 2009.

    If you want to show growth by any means necessary, engagement by a spam bot is still engagement & still lifts the valuation of the company.

    Many of the social sites make no effort to police spam & only combat it after users flag it. Consider Eric Schmidt’s interview with Julian Assange, where Eric Schmidt stated:

    • “We [YouTube] can’t review every submission, so basically the crowd marks it if it is a problem post publication.”
    • “You have a different model, right. You require human editors.” (on Wikileaks vs YouTube)

    We would post editorial content more often, but we are sort of debating opening up a social platform so that we can focus on the user without having to bear any editorial costs until after the fact. Profit margins are apparently better that way.

    As Google drives smaller sites out of the index & ranks junk content based on no factor other than it being on a trusted site, they create the incentive for spammers to ride on the social platforms.

    All aboard. And try not to step on any toes!

    When I do some product related searches (eg: brand name & shoe model) almost the whole result set for the first 5 or 10 pages is garbage.

    • Blogspot.com subdomains
    • Appspot.com subdomains
    • YouTube accounts
    • Google+ accounts
    • sites.google.com
    • Wordpress.com subdomains
    • Facebook Notes & pages
    • Tweets
    • Slideshare
    • LinkedIn
    • blog.yahoo.com
    • subdomains off of various other free hosts

    It comes as no surprise that Eric Schmidt fundamentally believes that “disinformation becomes so easy to generate because of, because complexity overwhelms knowledge, that it is in the people’s interest, if you will over the next decade, to build disinformation generating systems, this is true for corporations, for marketing, for governments and so on.”

    Of course, he made no mention of Google’s role in the above problem. When they are not issuing threats & penalties to smaller independent webmasters, they are just a passive omniscient observer.

    With all these business models, there is a core model of building up a solid stream of usage data & then tricking users or looking the other way when things get out of hand. Consider these tips on YouTube from Google’s Lane Shackleton:

    • “Search is a way for a user to explicitly call out the content that they want. If a friend told me about an Audi ad, then I might go seek that out through search. It’s a strong signal of intent, and it’s a strong signal that someone found out about that content in some way.”
    • “you blur the lines between advertising and content. That’s really what we’ve been advocating our advertisers to do.”
    • “you’re making thoughtful content for a purpose. So if you want something to get shared a lot, you may skew towards doing something like a prank”

    Harlem Shake & Idiocracy: the innovative way forward to improve humanity.

    Life is a prank.

    This “spam is fine, so long as it is user generated” stuff has gotten so out of hand that Google is now implementing granular page-level penalties. When those granular penalties hit major sites Google suggests that those sites may receive clear advice on what to fix, just by contacting Google:

    Hubert said that if people file a reconsideration request, they should “get a clear answer” about what’s wrong. There’s a bit of a Catch-22 there. How can you file a reconsideration request showing you’ve removed the bad stuff, if the only way you can get a clear answer about the bad stuff to remove is to file a reconsideration request?

    The answer is that technically, you can request reconsideration without removing anything. The form doesn’t actually require you to remove bad stuff. That’s just the general advice you’ll often hear Google say, when it comes to making such a request. That’s also good advice if you do know what’s wrong.

    But if you’re confused and need more advice, you can file the form asking for specifics about what needs to be removed. Then have patience

    In the past I referenced that there is no difference between a formal white list & overly-aggressive penalties coupled with loose exemptions for select parties.

    The moral of the story is that if you are going to spam, you should make it look like a user of your site did it. That way you:

    • are above judgement
    • receive only a limited granular penalty
    • get explicit & direct feedback on what to fix

    Experiment Driven Web Publishing

    Do users find big headlines more relevant? Does using long text lead to more, or less, visitor engagement? Is that latest change to the shopping cart going to make things worse? Are your links just the right shade of blue?

    If you want to put an end to tiresome subjective arguments about page length, or the merits of your client’s latest idea, which is to turn their website pink, then adopting an experimental process for web publishing can be a good option.

    If you don’t currently use an experiment-driven publishing approach, then this article is for you. We’ll look at ways to bake experiments into your web site, the myriad of opportunities testing creates, how it can help your SEO, and ways to mitigate cultural problems.

    Controlled Experiments

    The merits of any change should be derived from the results of the change under a controlled test. This process is common in PPC, however many SEOs will no doubt wonder how such an approach will affect their SEO.

    Well, Google encourages it.

    We’ve gotten several questions recently about whether website testing—such as A/B or multivariate testing—affects a site’s performance in search results. We’re glad you’re asking, because we’re glad you’re testing! A/B and multivariate testing are great ways of making sure that what you’re offering really appeals to your users

    Post-Panda, being more relevant to visitors, not just machines, is important. User engagement is more important. If you don’t closely align your site with user expectations and optimize for engagement, then it will likely suffer.

    The new SEO, at least as far as Panda is concerned, is about pushing your best quality stuff and the complete removal of low-quality or overhead pages from the indexes. Which means it’s not as easy anymore to compete by simply producing pages at scale, unless they’re created with quality in mind. Which means for some sites, SEO just got a whole lot harder.

    Experiments can help us achieve greater relevance.

    If It Ain’t Broke, Fix It

    One reason for resisting experiment-driven decisions is to not mess with success. However, I’m sure we all suspect most pages and processes can be made better.

    If we implement data-driven experiments, we’re more likely to spot the winners and losers in the first place. What pages lead to the most sales? Why? What keywords are leading to the best outcomes? We identify these pages, and we nurture them. Perhaps you already experiment in some areas on your site, but what would happen if you treated most aspects of your site as controlled experiments?

    We also need to cut losers.

    If pages aren’t getting much engagement, we need to identify them, improve them, or cut them. The Panda update was about levels of engagement, and too many poorly performing pages will drag your site down. Run with the winners, cut the losers, and have a methodology in place that enables you to spot them, optimize them, and cut them if they aren’t performing.

    Testing Methodology For Marketers

    Tests are based on the same principles used to conduct scientific experiments. The process involves data gathering, designing experiments, running experiments, analyzing the results, and making changes.

    1. Set A Goal

    A goal should be simple i.e. “increase the signup rate of the newsletter”.

    We could fail in this goal (decreased signups), succeed (increased signups), or stay the same. The goal should also deliver genuine business value.

    There can often be multiple goals. For example, “increase email signups AND Facebook likes OR ensure signups don’t decrease by more than 5%”. However, if you can get it down to one goal, you’ll make life easier, especially when starting out. You can always break down multiple goals into separate experiments.

    2. Create A Hypothesis

    What do you suspect will happen as a result of your test? i.e. “if we strip all other distractions from the email sign up page, then sign-ups will increase”.

    The hypothesis can be stated as an improvement, or preventing a negative, or finding something that is wrong. Mostly, we’re concerned with improving things – extracting more positive performance out of the same pages, or set of pages.

    “Will the new video on the email sign-up page result in more email signups?” Only one way to find out. And once you have found out, you can run with it or replace it safe in the knowledge it’s not just someone’s opinion. The question will move from “just how cool is this video!” (subjective) to “does this video result in more email sign-ups?”. A strategy based on experiments eliminates most subjective questions, or shifts them to areas that don’t really affect the business case.

    The video sales page significantly increased the number of visitors who clicked to the price/guarantee page by 46.15%….Video converts! It did so when mentioned in a “call to action” (a 14.18% increase) and also when used to sell (35% and 46.15% increases in two different tests)

    When crafting a hypothesis, you should keep business value clearly in mind. If the hypothesis suggests a change that doesn’t add real value, then testing it is likely a waste of time and money. It creates an opportunity cost for other tests that do matter.

    When selecting areas to test, you should start by looking at the areas which matter most to the business, and the majority of users. For example, an e-commerce site would likely focus on product search, product descriptions, and the shopping cart. The About Page – not so much.

    Order areas to test in terms of importance and go for the low hanging fruit first. If you can demonstrate significant gains early on, then it will boost your confidence and validate your approach. As experimental testing becomes part of your process, you can move on to more granular testing. Ideally, you want to end up with a culture whereby most site changes have some sort of test associated with them, even if it’s just to compare performance against the previous version.

    Look through your stats to find pages or paths with high abandonment rates or high bounce rates. If these pages are important in terms of business value, then prioritize these for testing. It’s important to order these pages in terms of business value, because high abandonment rates or bounce rates on pages that don’t deliver value isn’t a significant issue. It’s probably more a case of “should these pages exist at all”?

    3. Run An A/B or Multivariate Test

    Two of the most common testing methodologies in direct response marketing are A/B testing and multivariate testing.

    A/B Testing, otherwise known as split testing, is when you compare one version of a page against another. You collect data on how each page performs, relative to the other.

    Version A is typically the current, or favored, version of a page, whilst page B differs slightly and is used as a test against page A. Any aspect of the page can be tested, from headline, to copy, to images, to color, all with the aim of improving a desired outcome. The performance data for each page is compared; the winner is adopted and the loser rejected.
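    The mechanics can be sketched in a few lines of Python. This is a minimal illustration, not a real testing tool: the visitor IDs are hypothetical, and the deterministic modulo bucketing simply stands in for the randomized, sticky assignment a proper tool would use.

```python
# Running tallies for each variant; a real tool would persist these.
counts = {
    "A": {"visitors": 0, "conversions": 0},
    "B": {"visitors": 0, "conversions": 0},
}

def assign_variant(visitor_id: int) -> str:
    """Bucket a visitor deterministically so they always see the same page."""
    return "A" if visitor_id % 2 == 0 else "B"

def record_visit(visitor_id: int, converted: bool) -> None:
    """Log one visit and whether it resulted in the desired outcome."""
    variant = assign_variant(visitor_id)
    counts[variant]["visitors"] += 1
    if converted:
        counts[variant]["conversions"] += 1

def conversion_rate(variant: str) -> float:
    """Conversions divided by visitors for one variant (0.0 if no traffic)."""
    c = counts[variant]
    return c["conversions"] / c["visitors"] if c["visitors"] else 0.0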

    Multivariate testing is more complicated. Multivariate testing is when more than one element is tested at any one time. It’s like performing multiple A/B tests on the same page, at the same time. Multivariate testing can test the effectiveness of many different combinations of elements.
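    For instance, testing two headlines, two buttons, and two images multivariate-style means testing every combination of those elements, so the variant count multiplies quickly. The element names here are invented for the sketch:

```python
from itertools import product

# Hypothetical page elements under test; each list is one element's variants.
headlines = ["Save time", "Save money"]
buttons = ["Buy now", "Start free trial"]
images = ["photo", "illustration"]

# Every combination of elements becomes one page variant: 2 * 2 * 2 = 8.
variants = list(product(headlines, buttons, images))
print(len(variants))
```

    Eight variants need far more traffic than two to reach a meaningful result, which is one reason A/B testing is often the pragmatic choice.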

    Which method should you use?

    In most cases, in my experience, A/B testing is sufficient, but it depends. In the interest of time, value and sanity, it’s more important and productive to select the right things to test, i.e. the changes that lead to the most business value.

    As your test culture develops, you can go more and more granular. The slightly different shade of blue might be important to Google, but it’s probably not that important to sites with less traffic. But, keep in mind, assumptions should be tested ;) Your mileage may vary.

    There are various tools available to help you run these tests. I have no association with any of these, but here are a few to check out:

    4. Ensure Statistical Significance

    Tests need to show statistical significance. What does statistically significant mean?

    For those who are comfortable with statistics:

    Statistical significance is used to refer to two separate notions: the p-value, the probability that observations as extreme as the data would occur by chance in a given single null hypothesis; or the Type I error rate α (false positive rate) of a statistical hypothesis test, the probability of incorrectly rejecting a given null hypothesis in favor of a second alternative hypothesis

    For those of you who, like me, prefer a more straightforward explanation: here’s a good explanation in relation to PPC, and a video explaining statistical significance in reference to A/B tests.

    In short, you need enough visitors taking an action to decide it is not likely to have occurred randomly, but is most likely attributable to a specific cause i.e. the change you made.
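    A common way to check this for conversion rates is a two-proportion z-test. The sketch below uses made-up visitor and conversion numbers, and treats a two-sided p-value under 0.05 as significant; a real analysis should also fix the sample size in advance rather than peeking as data arrives.

```python
from math import erf, sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test on conversion counts; returns (z, two-sided p)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_b - p_a) / se
    # Normal CDF via the error function: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: 10% vs 13% conversion over 1,000 visitors each.
z, p = two_proportion_z(100, 1000, 130, 1000)
print(round(z, 2), round(p, 4))  # |z| > 1.96 means p < 0.05, i.e. significant
```

    With smaller gaps or less traffic, the p-value climbs above 0.05 and the sensible conclusion is “not enough evidence yet”, not “no difference”.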

    5. Run With The Winners

    Run with the winners, cut the losers, rinse and repeat. Keep in mind that you may need to retest at different times, as the audience can change, or their motivations change, depending on underlying changes in your industry. Testing, like great SEO, is best seen as an ongoing process.

    Make the most of every visitor who arrives on your site, because they’re only ever going to get more expensive.

    Here’s an interesting seminar where the results of hundreds of experiments were reduced down to three fundamental lessons:

    • a) How can I increase specificity? Use quantifiable, specific information as it relates to the value proposition
    • b) How can I increase continuity? Always carry across the key message using repetition
    • c) How can I increase relevance? Use metrics to ask “why”

    Tests Fail

    Often, tests will fail.

    Changing content can sometimes make little, if any, difference. Other times, the difference will be significant. But even when tests fail to show a difference, it still gives you information you can use. These might be areas in which designers, and other vested interests, can stretch their wings, and you know that it won’t necessarily affect business value in terms of conversion.

    Sometimes, the test itself wasn’t designed well. It might not have been given enough time to run. It might not have been linked to a business case. Tests tend to get better as we gain more experience, but having a process in place is the important thing.

    You might also find that your existing page works just great and doesn’t need changing. Again, it’s good to know. You can then try replicating these successes in areas where the site isn’t performing so well.

    Enjoy Failing

    “Fail fast, fail early and fail often.”

    Failure and mistakes are inevitable. Knowing this, we put mechanisms in place to spot failures and mistakes early, rather than later. Structured failure is a badge of honor!

    Thomas Edison performed 9,000 experiments before coming up with a successful version of the light bulb. Students of entrepreneurship talk about the J-curve of returns: the failures come early and often and the successes take time. America has proved to be more entrepreneurial than Europe in large part because it has embraced a culture of “failing forward” as a common tech-industry phrase puts it: in Germany bankruptcy can end your business career whereas in Silicon Valley it is almost a badge of honour

    Silicon Valley even comes up with euphemisms, like “pivot”, which weaves failure into the fabric of success.

    Or perhaps it’s because some of the best ideas in tech today have come from those that weren’t so good. (Remember, Apple’s first tablet device was called the Newton.)
    There’s a word used to describe this get-over-it mentality that I heard over and over on my trip through Silicon Valley and San Francisco this week: “Pivot“

    Experimentation, and measuring results, will highlight failure. This can be a hard thing to take, and especially hard to take when our beloved, pet theories turn out to be more myth than reality. In this respect, testing can seem harsh and unkind. But failure should be seen for what it is – one step in a process leading towards success. It’s about trying stuff out in the knowledge some of it isn’t going to work, and some of it will, but we can’t be expected to know which until we try it.

    In The Lean Startup, Eric Ries talks about the benefits of using lean methodologies to take a product from not-so-good to great, using systematic testing:

    If your first product sucks, at least not too many people will know about it. But that is the best time to make mistakes, as long as you learn from them to make the product better. “It is inevitable that the first product is going to be bad in some ways,” he says. The Lean Startup methodology is a way to systematically test a company’s product ideas.
    Fail early and fail often. “Our goal is to learn as quickly as possible,” he says

    Given testing can be incremental, we don’t have to fail big. Swapping one graphic position for another could barely be considered a failure, and that’s what a testing process is about. It’s incremental, and iterative, and one failure or success doesn’t matter much, so long as it’s all heading in the direction of achieving a business goal.

    It’s about turning the dogs into winners, and making the winners even bigger winners.

    Feel Vs Experimentation

    Web publishing decisions are often based on intuition, historical precedence – “we’ve always done it this way” – or by copying the competition. Graphic designers know about colour psychology, typography and layout. There is plenty of room for conflict.

    Douglas Bowman, a graphic designer at Google, left the company because he felt it relied too much on data-driven decisions, and not enough on the opinions of designers:

    Yes, it’s true that a team at Google couldn’t decide between two blues, so they’re testing 41 shades between each blue to see which one performs better. I had a recent debate over whether a border should be 3, 4 or 5 pixels wide, and was asked to prove my case. I can’t operate in an environment like that. I’ve grown tired of debating such minuscule design decisions. There are more exciting design problems in this world to tackle.

    That probably doesn’t come as a surprise to any Google watchers. Google is driven by engineers. In Google’s defense, they have such a massive user base that minor changes can have significant impact, so their approach is understandable.

    Integrate Design

    Putting emotion, and habit, aside is not easy.

    However, experimentation doesn’t need to exclude visual designers. Visual design is valuable. It helps visitors identify and remember brands. It can convey professionalism and status. It helps people make positive associations.

    But being relevant is also design.

    Adopting an experimentation methodology means designers can work on a number of different designs and get to see how the public really does react to their work. Design X converted better than design Y, layout Q works best for form design, buttons A, B and C work better than buttons J, K and L, and so on. It’s a further opportunity to validate creative ideas.

    Cultural Shift

    Part of getting experimentation right has to do with an organization’s culture. Obviously, it’s much easier if everyone is working towards a common goal, i.e. “all work, and all decisions made, should serve a business goal, as opposed to serving personal ego”.

    All aspects of web publishing can be tested, although asking the right questions about what to test is important. Some aspects may not make a measurable difference in terms of conversion. A logo, for example. A visual designer could focus on that page element, whilst the conversion process might rely heavily on the layout of the form. Both the conversion expert and the design expert get to win, yet not stamp on each other’s toes.

    One of the great aspects of data-driven decision making is that common, long-held assumptions get challenged, often with surprising results. How long does it take to film a fight scene? The movie industry says 30 days.

    Mark Wahlberg challenged that assumption and did it in three:

    Experts go with what they know. And they’ll often insist something needs to take a long time. But when you don’t have tons of resources, you need to ask if there’s a simpler, judo way to get the impact you desire. Sometimes there’s a better way than the “best” way. I thought of this while watching “The Fighter” over the weekend. There’s a making of extra on the DVD where Mark Wahlberg, who starred in and produced the film, talks about how all the fight scenes were filmed with an actual HBO fight crew. He mentions that going this route allowed them to shoot these scenes in a fraction of the time it usually takes

    How many aspects of your site are based on assumption? Could those assumptions be masking opportunities or failure?

    Winning Experiments

    Some experiments, if poorly designed, don’t lead to more business success. If an experiment isn’t focused on improving a business case, then it’s probably just wasted time. That time could have been better spent devising and running better experiments.

    In Agile software design methodologies, the question is always asked: “how does this change/feature provide value to the customer?” The underlying motive is “how does this change/feature provide value to the business?” This is a good way to prioritize test cases. Those that potentially provide the most value, such as landing page optimization on PPC campaigns, are likely to have a higher priority than, say, features available to forum users.

    Further Reading

    I hope this article has given you some food for thought and that you’ll consider adopting some experiment-based processes to your mix. Here’s some of the sources used in this article, and further reading:


    Creating Effective Advertising

    The Atlantic published an interesting chart comparing print advertising spend with internet advertising spend:

    So, print advertising is tanking. Internet advertising, whilst growing, is not growing particularly fast, and certainly isn’t catching up to fill the titanic-sized gap left by print.

    As a result, a number of publishers who rely on advertising for the lion’s share of their revenue are either struggling, going belly up, or changing their models.

    The Need For More Effective Advertising

    We recently looked at paywalls. More and more publishers are going the paywall route, the latest major publisher being The Washington Post.

    Given the ongoing devaluation of content by aggregators and their advertising networks, few can blame them. However, paywalls aren’t the only solution. Part of the problem with internet advertising is that as soon as people get used to seeing it they tend to block it out, so it becomes less effective.

    We looked at the problems with display advertising. Federated Media abandoned the format and will adopt a more “social” media strategy.

    We also looked at the rise of Native Advertising, which is advertising that tightly integrates with content to the point where it’s difficult to tell the two apart. This opens up a new angle for SEOs looking to place links.

    The reason the advertising gap isn’t closing is due to a number of factors. It’s partly historical, but it’s also to do with effectiveness, especially when it comes to display advertising. If advertisers aren’t seeing a return, then they won’t advertise.

    Inventory is expanding a lot faster than the ability or desire of advertisements to fill it, which is not a good situation for publishers. So, internet publishers are experimenting with ideas on how to be more effective. If native advertising and social are deemed more effective, then that is the way publishers will go.

    People just don’t like being advertised at.

    The ClueTrain Manifesto

    The Cluetrain Manifesto predicted much of what we see happening today. Written in 2000 by Rick Levine, Christopher Locke, Doc Searls, and David Weinberger, the Cluetrain Manifesto riffed on the idea that markets are conversations, and consumers aren’t just passive observers:

    A powerful global conversation has begun. Through the Internet, people are discovering and inventing new ways to share relevant knowledge with blinding speed. As a direct result, markets are getting smarter—and getting smarter faster than most companies

    That seems obvious now, but it was a pretty radical idea back then. The book was written before blogs became popular. It was way before anyone had heard of a social network, or before anyone had done any tweeting.

    Consumers were no longer passive, they were just as likely to engage and create, and they would certainly talk back, and ultimately shape the message if they didn’t like it. The traditional top-down advertising industry, and publishing industry, has been turned on its head. The consumers are publishers, and they’re not sitting around being broadcast at.

    The advertising industry has been struggling to find answers, not entirely successfully, ever since.

    Move Away From Display And Towards Engagement

    In order for marketing to be effective on the web, it needs to be engaging to an audience that ignores the broadcast message. This is the reason advertising is starting to look more like content. It’s trying to engage people using the forms they already use in their personal communication.

    For example, this example mimics a blog post encouraging people to share. It pretty much is a blog post, but it’s also an advertisement. It meets the customer on their terms, in their space and on their level. For better or worse, the lines are growing increasingly blurred.

    Facebook’s Managing Editor, Dan Fletcher, has just stood down, reasoning:

    The company “doesn’t need reporters,” Fletcher said, because it has a billion members who can provide content. “You guys are the reporters,” Fletcher told the audience. “There is no more engaging content Facebook could produce than you talking to your family and friends.”

    People aren’t reporters in the journalistic sense, but his statement suggests where the revenue for advertising lies, which is in between people’s conversations. As a side note, you may notice that article is “brought to you by our sponsor”. Most of the links go through bit.ly, however they could just as easily be straight links.

    The implication is that a lot of people aren’t even listening to reporters anymore; they want to know about the world as filtered through the eyes of their friends and families. The latter has happened since time began, but only recently has advertising leaped directly into that conversation. Whether that is a good thing or not, or welcomed, is another matter, but it is happening.

    Two Types Of Advertisements

    Advertising takes two main forms. Institutional, or “brand” advertising, and direct response advertising. SEOs are mainly concerned with direct response advertising.

    Direct-Response Marketing is a type of marketing designed to generate an immediate response from consumers, where each consumer response (and purchase) can be measured, and attributed to individual advertisements. This form of marketing is differentiated from other marketing approaches, primarily because there are no intermediaries such as retailers between the buyer and seller, and therefore the buyer must contact the seller directly to purchase products or services.

    However, brand advertising is the form around which much of the advertising industry is based:

    Brand ads, also known as “space ads,” strive to build (or refresh) the prospect’s awareness and favorable view of the company or its product or service. For example, most billboards are brand ads.

    Online, direct response works well, but only if the product or service suits direct advertising. Generally speaking, a lot of new-to-market products and services, and luxury goods, don’t suit direct advertising particularly well, unless they’re being marketed on complementary attributes, such as price or convenience.

    The companies that produce goods and services that don’t suit direct marketing aren’t spending as much online.

    But curious changes are afoot.

    What’s Happening At Facebook?

    Those who advertise on Facebook will have noticed the click-thru rate. Generally, it’s pretty low, suggesting direct response isn’t working well in that environment.

    Click-through rates on Facebook ads only averaged 0.05% in 2010, down from 0.06% in 2009 and well short of what’s considered to be the industry average of 0.10%. That’s according to a Webtrends report that examined 11,000 Facebook ads, first reported upon by ClickZ.

    It’s not really surprising, given Facebook’s user base are Cluetrain passengers, even if most have never heard of it:

    Facebook, a hugely popular free service that’s supported solely through advertising, yet is packed with users who are actively hostile to the idea of being marketed to on their cherished social network…this is what I hear from readers every time I write about the online ad economy, especially ads on Facebook: “I don’t know how Facebook will ever make any money—I never click on Web ads!”

    But a new study indicates click-thru rates on Facebook might not matter much. The display value of the advertising has been linked back to product purchases, and the results are an eye-opener:

    Whether you know it or not—even if you consider yourself skeptical of marketing—the ads you see on Facebook are working. Sponsored messages in your feed are changing your behavior—they’re getting you and your friends to buy certain products instead of others, and that’s happening despite the fact that you’re not clicking, and even if you think you’re ignoring the ads…This isn’t conjecture. It’s science. It’s based on a remarkable set of in-depth studies that Facebook has conducted to show whether and how its users respond to ads on the site. The studies demonstrate that Facebook ads influence purchases and that clicks don’t matter

    Granted, such a study is self-serving, but if it’s true, and translates to many advertisers, then that’s interesting. Display, engagement, institutional and direct marketing all seem to be melding together into “content”. SEOs who want to get their links in the middle of content will be in there, too.

    You may notice the Cluetrain-style language in the following Forbes post:

    Some innovative companies, like Vine and smartsy, are catching on to this wave by creating apps and software that allows a dialogue between a brand and its audience when and where the consumer wants. Such technology opens a realm of nearly endless possibilities of content creation while increasing conversion rates dramatically. Audience participation isn’t just allowed; it’s encouraged. Hell, it’s necessary. By not only providing consumers with information in the moment of their interest, but also engaging them in conversation and empowering them to create their own content, we can drastically increase the relevancy of messaging and its authenticity.

    Technology Has Finally Caught Up With The Cluetrain

    Before the internet, it wasn’t really possible to engage consumers in conversations, except in very limited ways. Technology wasn’t up to the task.

    But now it is.

    The conversation was heralded in the Cluetrain Manifesto over a decade ago. People don’t want to just be passive consumers of marketing messages – they want engagement. The new advertising trends are all about increasing that level of engagement, and advertisers are doing it, in part, by blurring the lines between advertising and content.


    Native Advertising

    Native advertising presents opportunities for SEOs to boost their link building strategies, particularly those who favor paid link strategies.

    What Is Native Advertising?

    Native advertising is the marketing industry’s new buzzword for…well, it depends who you ask.

    Native advertising can’t just be about the creative that fills an advertising space. Native advertising must be intrinsically connected to the format that fits the user’s unique experience. There’s something philosophically beautiful about that in terms of what great advertising should (and could) be. But first, we need to all speak the same language around “native advertising.

    Native advertising is often defined as content that seamlessly integrates with a site, as opposed to interruption media, such as pre-rolls on YouTube videos, or advertising that sits in a box off to the side of the main content.

    It’s advertising that looks just like content, which is a big part of Google’s success.

    Here’s an example.

    Some high-profile examples of native advertising include Facebook Sponsored Stories; Twitter’s Promoted Tweets; promoted videos on YouTube, Tumblr and Forbes; promoted articles like Gawker’s Sponsored Posts and BuzzFeed’s Featured Partner content; Sponsored Listings on Yelp; promoted images on Cheezburger; and promoted playlists on Spotify and Radio.

    One interesting observation is that Adwords and Adsense are frequently cited as being examples of native advertising. Hold that thought.

    Why Native Advertising?

    The publishing industry is desperate to latch onto any potential lifeline as ad rates plummet.

    Analysts say the slowdown is being caused by the huge expansion in the amount of online advertising space as companies who manage this emerge to dominate the space. In short there’s just too many ad slots chasing ads that are growing, but at a rate slower than the creation of potential ad slots.

    This means the chances are dimming that online ad spending would gradually grow to make up for some of the falls in analogue spending in print. …staff numbers and the attendant costs of doing business have to be slashed heavily to account for the lower yield and revenue from online ads

    And why might there be more slots than there are advertisers?

    As people get used to seeing web advertising, and mentally blocking it out, or technically filtering it out, advertising becomes less effective. Federated Media, who were predominantly a display advertising business, got out of display ads late last year:

    “The model of ‘boxes and rectangles’ – the display banner – is failing to fully support traditional ‘content’ sites beyond a handful of exceptions,” wrote Federated Media founder John Battelle in a recent blog post. He explained that the next generation of native ads on social networks and strength of Google Adwords make direct sales more competitive, and that ad agencies must evolve with the growing trend of advertisers who want more social/conversational ad campaigns.

    Advertisers aren’t seeing enough return from the advertising for them to want to grab the many slots that are available. And they are lowering their bids to make up for issues with publishing fraud. The promise of native advertising is that this type of advertising reaches real users, and will grab and hold viewers’ attention for longer.

    This remains to be seen, of course.

    Teething Pains

    Not all native advertising works. It depends on the context and the audience. Facebook hasn’t really got it right yet:

    Facebook is still largely centered around interactions with people one knows offline, making the appearance of marketing messages especially jarring. This is particularly true in mobile, where Sponsored Stories take up a much larger portion of the screen relative to desktop. Facebook did not handle the mobile rollout very gracefully, either. Rather than easing users into the change, they appeared seemingly overnight, and took up the first few posts in the newsfeed. The content itself is also hit or miss – actions taken by distant friends with dissimilar interests are often used as the basis for targeting Sponsored Stories.

    If you’re planning on offering native advertising yourself, you may need to walk a fine line. Bloggers and other publishers who are getting paid but don’t declare so risk alienating their audience and destroying their reputation.

    Some good ways of addressing this issue are policy pages that state the author has affiliate relationships with various providers, and this is a means of paying for the site, and does not affect editorial. Whether it’s true or not is up to the audience to decide, but such transparency up-front certainly helps. If a lot of free content is mixed in with native content, and audiences dislike it enough, then it might pave the way for more paid content and paywalls.

    Just like any advertising content, native advertising may become less effective over time if the audience learns to screen it out. One advantage for the SEO is that this doesn’t matter so much, so long as they get the link.

    Still, some big players are using it:

    Forbes Insights and Sharethrough today announced the results of a brand study to assess adoption trends related to native video advertising that included senior executives from leading brands such as Intel, JetBlue, Heineken and Honda. The study shows that more than half of large brands are now using custom brand videos in their marketing, and when it comes to distribution, most favor “native advertising” approaches where content is visually integrated into the organic site experience, as opposed to running in standard display ad formats. The study also shows that the majority of marketers now prefer choice-based formats over interruptive formats.

    Google’s Clamp-Down On Link Advertising

    So, what’s the difference between advertorial and native content? Not much, on the face of it, except in one rather interesting respect. When it comes to native advertising, it’s often not obvious the post is sponsored.

    Google has, of course, been punishing links from advertorial content. One wonders if they’ve punished themselves.

    The Atlantic, BuzzFeed and Gawker are experimenting with new ad formats such as sponsored content or “native advertising,” as well as affiliate links. On Friday, Google engineer Matt Cutts reiterated a warning from the search giant that this kind of content has to be treated properly or Google will penalize the site that hosts it, in some cases severely.

    If native advertising proves popular with publishers and advertisers, then it’s going to compete with Google’s business model. Businesses may spend less on Adwords and may replace Adsense with native advertising. It’s no surprise, then, that Google may take a hostile line on it. However, publishers are poor, ad networks are rich, so perhaps it’s time that publishers became ad networks.

    When it comes to SEO, given Google’s warning shots, SEOs will either capitulate – and pretty much give up on paid links – or make more effort to blend seamlessly into the background.

    Blurring The Lines

    As Andrew Sullivan notes, the editorial thin blue line is looking rather “fuzzy”. It may even raise legal questions about misrepresentation. There has traditionally been a church and state divide between advertising and editorial, but as publishers get more desperate to survive, they’re going to go with whatever works. If native advertising works better than the alternatives, then publishers will use it. What choice have they got? Their industry is dying.

    It raises some pretty fundamental questions.

    I have nothing but admiration for innovation in advertizing and creative revenue-generation online. Without it, journalism will die. But if advertorials become effectively indistinguishable from editorial, aren’t we in danger of destroying the village in order to save it?

    Likewise, in order to compete in search results, a site must have links. It would be great if people linked freely and often based on objective merit, but we all know that is a hit-and-miss affair. If native advertising provides a means to acquire paid links that don’t look like paid links, then that is what people will do.

    And if their competitors are doing it, they’ll have little choice.

    Seamless Integration

    If you’re looking for a way to build paid links, then here is where the opportunity lies for SEOs.

    Recent examples Google caught out looked heavily advertorial and were placed in bulk. They would likely have been barely credible to a human reviewer, as they didn’t read particularly well. Those I saw had an “auto-generated quality” to them.

    The integration with editorial needs to be seamless and, if possible, the in-house editors should write the copy, or it should look like they did. Avoid generic and boilerplate approaches. The content should not be both generic and widely distributed. Such a strategy is unlikely to pass Google’s inspections.

    Markets will spring up, if they haven’t already, whereby publications will offer editorial native advertising, link included. It would be difficult to tell if such a link was “paid for”, and certainly not algorithmically, unless the publisher specifically labelled it “advertising feature” or something similar.

    Sure, this has been going on for years, but if a lot of high level publishers embrace something called “Native Advertising” then that sounds a lot more legitimate than “someone wants to pay for a link on our site”. In marketing, it’s all about the spin ;)

    It could be a paid restaurant review on a restaurant review site, link included. For SEO purposes, the review doesn’t even need to be overtly positive and glowing, therefore a high degree of editorial integrity could be maintained. This approach would suit a lot of review sites. For example, “we’ll pay you to review our product, so long as you link to it, but you can still say whatever you like about it”. The publisher’s production cost is met, in total, and they can maintain a high degree of editorial integrity. If Jennifer Lopez is in a new movie with some “hot” scene, then that movie can pay AskMen to create a top 10 sexiest moments gallery that includes their movie at #9, and then advertise that feature across the web.

    A DIY site could show their readers how to build a garden wall. The products could be from a sponsor, link included. Editorial integrity could be maintained, as the DIY site need not push or recommend those products like an advertorial would, but the sponsor still gets the link. The equivalent of product placement in movies.

    News items can feature product placement without necessarily endorsing them, link included – they already do this with syndicated press releases. Journalists often interview the local expert on a given topic, and this can include a link. If that news article is paid for by the link buyer, yet the link buyer doesn’t have a say in editorial, then that deal will look attractive to publishers. Just a slightly different spin on “brought to you by our sponsor”. Currently services like HARO & PR Leads help connect experts with journalists looking for story background. In the years to come perhaps there will be similar services where people pay the publications directly to be quoted.

    I’m sure you can think of many other ideas. A lot of this isn’t new, it’s just a new, shiny badge on something that has been going on well before the web began. When it comes to SEO, the bar has been lifted on link building. Links from substandard content are less likely to pass Google’s filters, so SEOs need to think more about ways to get quality content integrated in a more seamless way. It takes more time, and it’s likely to be more costly, but this can be a good thing. It raises the bar on everyone else.

    Those who don’t know the bar has been raised, or don’t put more effort in, will lose.

    Low Level Of Compromise

    Native Advertising is a new spin on an old practice, however it should be especially interesting to the SEO, as the SEO doesn’t demand the publisher compromise editorial to a significant degree, as the publisher would have to do for pure advertorial. The SEO only requires they incorporate a link within a seamless, editorial-style piece.

    If the SEO is paying for the piece to be written, that’s going to look like a good deal to many publishers.
