SEO Book


  • Guide To Optimizing Client Sites 2014

    For those new to optimizing client sites, or those seeking a refresher, we thought we'd put together a guide to step you through it, along with some selected deeper reading on each topic area.

    Every SEO has different ways of doing things, but we’ll cover the aspects that you’ll find common to most client projects.

    Few Rules

    The best rule I know about SEO is that there are few absolutes. Google is a black box, so complete data sets will never be available to you. It can therefore be difficult to pin down cause and effect, and there will always be a lot of experimentation and guesswork involved. If something works, keep doing it. If it doesn't, try something else until it does.

    Many opportunities tend to present themselves in ways not covered by “the rules”. Many opportunities will be unique and specific to the client and market sector you happen to be working with, so it's a good idea to remain flexible and alert to new relationship and networking opportunities. SEO exists on the back of relationships between sites (links) and the ability to get your content remarked upon (networking).

    When you work on a client site, you will most likely be dealing with a site that is already established, so it’s likely to have legacy issues. The other main challenge you’ll face is that you’re unlikely to have full control over the site, like you would if it were your own. You’ll need to convince other people of the merit of your ideas before you can implement them. Some of these people will be open to them, some will not, and some can be rather obstructive. So, the more solid data and sound business reasoning you provide, the better chance you have of convincing people.

    The most important aspect of doing SEO for clients is not blinding them with technical alchemy, but helping them see how SEO provides genuine business value.

    1. Strategy

    The first step in optimizing a client site is to create a high-level strategy.

    "Study the past if you would define the future." - Confucius

    You’re in discovery mode. Seek to understand everything you can about the client’s business and their current position in the market. What is their history? Where are they now, and where do they want to be? Interview your client. They know their business better than you do, and they will likely be delighted when you take a deep interest in them.

    • What are they good at?
    • What are their top products or services?
    • What is the full range of their products or services?
    • Are they weak in any areas, especially against competitors?
    • Who are their competitors?
    • Who are their partners?
    • Is their market sector changing? If so, how? Can they think of ways in which this presents opportunities for them?
    • What keyword areas have worked well for them in the past? Performed poorly?
    • What are their aims? More traffic? More conversions? More reach? What would success look like to them?
    • Do they have other online advertising campaigns running? If so, what areas are these targeting? Can they be aligned with SEO?
    • Do they have offline presence and advertising campaigns? Again, what areas are these targeting and can they be aligned with SEO?

    Some SEO consultants see their task as gaining more rankings under an ever-growing list of keywords. But ranking for more keywords, or getting more traffic, may not result in measurable business returns; it depends on the business and the marketing goals. Some businesses will benefit from homing in on specific opportunities that are already being targeted; others will seek wider reach. This is why it’s important to understand the business goals and market sector, then design the SEO campaign to support those goals in that environment.

    This type of analysis also provides you with leverage when it comes to discussing specific rankings and competitor rankings. The SEO can’t be expected to wave a magic wand and place a client at the top of a category in which they enjoy no competitive advantage. Even if the SEO did manage to achieve this feat, the client may not see much in the way of return, as it’s easy for visitors to click other listings and compare offers.

    Understand all you can about their market niche. Look for areas of opportunity, such as changing demand not being met by your client or competitors. Put yourself in their customers’ shoes. Try to find customers and interview them. Listen to the language of customers. Go to the places where their customers hang out online. From the customers’ language and needs, combined with the knowledge gleaned from interviewing the client, you can determine effective keywords and themes.

    Document. Get it down in writing. The strategy will change over time, but you’ll have a baseline point of agreement outlining where the site is at now, and where you intend to take it. Getting buy-in early smooths the way for later on. Ensure that whatever strategy you adopt, it adds real, measurable value by being aligned with, and serving, the business goals. It’s on this basis the client will judge you, and maintain or expand your services in future.

    Further reading:

    - 4 Principles Of Marketing Strategy In The Digital Age
    - Product Positioning In Five Easy Steps [pdf]
    - Technology Marketers Need To Document Their Marketing Strategy

    2. Site Audit

    Sites can be poorly organized, have various technical issues, and miss keyword opportunities.
    We need to quantify what is already there, and what’s not.

    • Use a site crawler, such as Xenu Link Sleuth, Screaming Frog or other tools that will give you a list of URLs, title information, link information and other data.
    • Make a list of all broken links.
    • Make a list of all orphaned pages
    • Make a list of all pages without titles
    • Make a list of all pages with duplicate titles
    • Make a list of pages with weak keyword alignment
    • Fetch robots.txt and hand-check it. It’s amazing how easy it is to disrupt crawling with a robots.txt file

    Broken links are a low-quality signal. It's debatable whether they are a low-quality signal to Google, but they certainly are to users. If the client doesn't have one already, implement a system whereby broken links are checked on a regular basis. Orphaned pages are pages that have no links pointing to them. Those pages may be redundant, in which case they should be removed, or you need to point links at them so they can be crawled and have more chance of gaining rank. Page titles should be unique, aligned with keyword terms, and made attractive in order to gain a click. A link is more attractive if it speaks to a customer need. Carefully check robots.txt to ensure it’s not blocking areas of the site that need to be crawled.
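    The audit checklist above can be sketched as a small script. A minimal sketch, assuming a hypothetical crawl export with `url`, `status`, and `title` columns; real exports from tools like Screaming Frog use different column names, so adjust accordingly:

```python
# Sketch: flag audit issues from a crawler CSV export.
# Column names ("url", "status", "title") are assumptions for
# illustration; map them to your crawler's actual export headers.
import csv
from collections import defaultdict

def audit(csv_path):
    broken, missing_titles = [], []
    titles = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row["url"]
            # 4xx/5xx responses are broken pages or links
            if row["status"].startswith(("4", "5")):
                broken.append(url)
            title = row.get("title", "").strip()
            if not title:
                missing_titles.append(url)
            else:
                titles[title].append(url)
    # Titles shared by more than one URL are duplicates
    duplicates = {t: urls for t, urls in titles.items() if len(urls) > 1}
    return broken, missing_titles, duplicates
```

    The same loop can be extended to flag weak keyword alignment by checking titles against your keyword list.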

    As part of the initial site audit, it might make sense to include the site in Google Webmaster Tools to see if it has any existing issues there and to look up its historical performance on competitive research tools to see if the site has seen sharp traffic declines. If they've had sharp ranking and traffic declines, pull up that time period in their web analytics to isolate the date at which it happened, then look up what penalties might be associated with that date.

    Further Reading:

    - Broken Links, Pages, Images Hurt SEO
    - Three Easy Ways To Fix Broken Links And Stop Unnecessary Visitor Loss
    - 55 Ways To Use Screaming Frog
    - Robots.txt Tutorial

    3. Competitive Analysis

    Some people roll this into a site audit, but I’ll split it out as we’re not looking at technical issues on competitor sites, we’re looking at how they are positioned, and how they’re doing it. In common with a site audit, there’s some technical reverse engineering involved.

    There are various tools that can help you do this. I use SpyFu. One reporting aspect that is especially useful is estimating the value of the SEO positions vs the AdWords positions. A client can then translate the ranks into dollar terms, and justify them against your fee.

    When you run these competitive reports, you can see what content of theirs is working well, and what content is gaining ground. Make a list of all competitor content that is doing well. Examine where their links are coming from, and make a list. Examine where they’re mentioned in the media, and make a list. You can then use a fast-follow strategy to emulate their success, then expand upon it.

    Sometimes, “competitors”, meaning ranking competitors, can actually be potential partners. They may not be in the same industry as your client, just happen to rank in a cross-over area. They may be good for a link, become a supplier, welcome advertising on their site, or be willing to place your content on their site. Make a note of the sites that are ranking well within your niche, but aren’t direct competitors.

    Using tools that estimate the value of ranks by comparing AdWords keyword prices, you can estimate the value of your competitors’ positions. If your client ranks lower than the competition, you can demonstrate the estimated dollar value of putting time and effort into increasing rank. You can also evaluate competitors’ rate of improvement over time vs your client’s, and use this as a competitive benchmark. If your client is not putting in the same effort as their competitors, they’ll be left behind. If competitors are spending on ongoing SEO and seeing tangible results, there is some validation for your client to do likewise.

    Further reading:

    - Competitor Analysis [pdf]
    - Illustrated SEO Competitive Workflow
    - Competitive Analysis: How To Become A SEO Hero In 4 Steps

    4. Site Architecture

    A well-organised site is useful from both a usability standpoint and an SEO standpoint. If it’s clear to a user where they need to go next, this will flow through into better engagement scores. If your client has a usability consultant on staff, this person is a likely ally.

    It’s a good idea to organise a site around themes. Anecdotal evidence suggests that Google likes pages grouped around similar topics, rather than disparate topics (see from 1.25 onwards).

    • Create a spreadsheet based on a crawl after any errors have been tidied up
    • Identify best-selling products and services. These deserve the most exposure and should be placed high up the site hierarchy. Items and categories that do not sell well, and are less strategically important, should be lower in the hierarchy
    • Pages that are already getting a lot of traffic, as indicated by your analytics, might deserve more exposure by moving them up the hierarchy.
    • Seasonal products might deserve more exposure just before that shopping season, and less exposure when the offer is less relevant.
    • Group pages into similar topics, where possible. For example, acme.com/blue-widgets/ , acme.com/green-widgets/.
    • Determine if internal anchor text is aligned with keyword titles and page content by looking at a backlink analysis

    A spreadsheet of all pages helps you group pages thematically, preferably into directories with similar content. Your strategy document will guide you as to which pages you need to work on, and which pages you need to relegate. Some people spend a lot of time sculpting internal PageRank, i.e. flowing PageRank to some pages while using nofollow on other links so as not to pass link equity to them. Google may have deprecated that approach, but you can still link to important products or categories sitewide to flow them more link equity, while putting less important pages lower in the site's architecture. Favour your money pages, and relegate your less important pages.
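    As a rough illustration of that thematic grouping, you can bucket crawled URLs by their first path segment to see how the site currently clusters; the acme.com URLs are illustrative:

```python
# Sketch: group URLs by their first path segment to reveal the
# site's thematic clusters. Input is any list of crawled URLs.
from collections import defaultdict
from urllib.parse import urlparse

def group_by_section(urls):
    groups = defaultdict(list)
    for url in urls:
        path = urlparse(url).path.strip("/")
        # The first directory is a crude proxy for the page's theme
        section = path.split("/")[0] if path else "(root)"
        groups[section].append(url)
    return dict(groups)
```

    Sections with only a page or two may be candidates for merging into a stronger theme.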

    Think mobile. If your content doesn't work on mobile, then getting to the top of search results won't do you much good.

    Further Reading:

    - Site Architecture & Search Engine Success Factors
    - Optimizing Your Website's Architecture For SEO (Slide Presentation)
    - The SEO Guide To Information Architecture

    5. Enable Crawling & Redirects

    Ensure your site is deep crawled. To check if all your URLs are included in Google’s index, sign up with Webmaster Tools and/or other index reporting tools.

    • Include a site map
    • Check the existing robots.txt. Keep robots out of non-essential areas, such as script repositories and other admin-related directories.
    • If you need to move pages, or you have links to pages that no longer exist, use page redirects to tidy them up
    • Make a list of 404 errors. Make sure the 404 page has useful navigation into the site so visitors don’t click back.
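    On the site map bullet above: a minimal XML sitemap following the sitemaps.org protocol looks like this (the acme.com URLs are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.acme.com/</loc>
  </url>
  <url>
    <loc>http://www.acme.com/blue-widgets/</loc>
  </url>
</urlset>
```

    Submit the sitemap via Webmaster Tools, or reference it from robots.txt with a `Sitemap:` line.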

    The accepted method to redirect a page is to use a 301. The 301 indicates a page has permanently moved location. A redirect is also useful if you change domains, or if you have links pointing to different versions of the site. For example, Google sees http://www.acme.com and http://acme.com as different sites. Pick one, and redirect the other to it.
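    On an Apache server, a sketch of that canonicalization in .htaccess might look like the following, assuming mod_rewrite is enabled; swap the hostnames to prefer the non-www version instead:

```apache
# Sketch: 301 non-www requests to the www hostname (acme.com is
# an illustrative domain; substitute the client's own).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^acme\.com$ [NC]
RewriteRule ^(.*)$ http://www.acme.com/$1 [R=301,L]
```

    The `R=301` flag makes the redirect permanent, so link equity is consolidated on the chosen hostname.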

    If you don’t redirect pages, then you won’t be making full use of any link juice allocated to those pages.

    Further Reading:

    - What Are Google Site Maps?
    - The Ultimate Guide To 301 Redirects
    - Crawling And Indexing Metrics

    6. Backlink Analysis

    Backlinks remain a major ranking factor. Generally, the more high quality links you have pointing to your site, the better you’ll do in the results. Of late, links can also harm you. However, if your overall link profile is strong, then a subset of bad links is unlikely to cause you problems. A good rule of thumb is the Matt Cutts test. Would you be happy to show the majority of your links to Matt Cutts? :) If not, you're likely taking a high risk strategy when it comes to penalties. These can be manageable when you own the site, but they can be difficult to deal with on client sites, especially if the client was not aware of the risks involved in aggressive SEO.

    • Establish a list of existing backlinks. Consider trying to remove any that look low quality.
    • Ensure all links resolve to appropriate pages
    • Draw up a list of sites from which your main competitors have gained links
    • Draw up a list of sites where you’d like to get links from
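    The competitor link lists above lend themselves to a simple set comparison. A minimal sketch, assuming you have already exported each site's linking domains from whatever backlink tool you use:

```python
# Sketch: find linking domains a competitor has that the client
# lacks. Inputs are collections of domain strings from a backlink
# tool export.
def link_gap(client_domains, competitor_domains):
    return sorted(set(competitor_domains) - set(client_domains))
```

    The resulting gap list is a starting point for outreach targets.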

    Getting links involves either direct placement or being linkworthy. On some sites, like industry directories, you can pay to appear. In other cases, it’s making your site into an attractive linking target.

    Getting links to purely commercial sites can be a challenge. Consider sponsoring charities aligned with your line of business. Get links from local chambers of commerce. Connect with educational establishments that are doing relevant research, and consider sponsoring them or becoming involved in some way.

    Look at the sites that point to your competitors. How were these links obtained? Follow the same path. If they successfully used white papers, then copy that approach. If they successfully used news, do that, too. Do whatever seems to work for others. Evaluate the result. Do more/less of it, depending on the results.

    You also need links from sites that your competitors don’t have. Make a list of desired links. Figure out a strategy to get them. It may involve supplying them with content. It might involve participating in their discussions. It may involve giving them industry news. It might involve interviewing them or profiling them in some way, so they link to you. Ask “what do they need?” Then give it to them.

    Of course, linking is an ongoing strategy. As a site grows, many links will come naturally, and that in itself is a link acquisition strategy: grow in importance and consumer interest relative to the competition. This involves your content strategy. Do you have content that your industry likes to link to? If not, create it. If your site is not something your industry links to, like a brochure site, you may look at spinning off a second site that is information-focused, and less commercially focused. You sometimes see blogs on separate domains where employees talk about general industry topics, like Signal vs. Noise, Basecamp’s blog. These are much more likely to receive links than sites that are purely commercial in nature.

    Before chasing links, you should be aware of what type of site typically receives links, and make sure you’re it.

    Further Reading:

    - Interview Of Debra Mastaler, the Link Guru
    - Scalable Link Building Techniques
    - Creative Link Building Ideas

    7. Content Assessment

    Once you have a list of keywords, an idea of where competitors rank, and what the most valuable terms are from a business point of view, you can set about examining and building out content.

    Do you have content to cover your keyword terms? If not, add it to the list of content that needs to be created. If you have content that matches terms, see if it compares well with competing content on the same topic. Can the pages be expanded or made more detailed? Can more/better internal links be added? Will the content benefit from combining different content types, e.g. video, audio and images?

    You’ll need to create content for any keyword areas you’re missing. Rather than copy what is already available in the niche, look at the best-ranking/most valuable content for that term and ask how it could be made better. Are there new industry analyses or reports that you can incorporate and/or expand on? People love the new. They like learning things they don’t already know. Me-too content can work, but it’s not making the most of the opportunity. Aim to produce considerably more valuable content than already exists, as you’ll have more chance of getting links, and more chance of higher levels of engagement when people flip between sites. If visitors can get the same information elsewhere, they probably will.

    Consider keyword co-occurrence. What terms are readily associated with the keywords you’re chasing? Various tools provide this analysis, but you can do it yourself using the AdWords keyword research tool. See what keywords it associates with your keywords. The Google co-occurrence algorithm is likely the same for both AdWords and organic search.
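    As a rough home-grown illustration of co-occurrence (not Google's algorithm), you can count which words most often appear alongside a seed term across pages that rank for it; the stopword list here is illustrative:

```python
# Sketch: crude keyword co-occurrence count over a set of page
# texts. Counts words appearing in the same text as the seed term.
import re
from collections import Counter

STOPWORDS = {"the", "and", "a", "to", "of", "in", "for", "is"}

def co_occurring(texts, seed, top_n=10):
    counts = Counter()
    for text in texts:
        words = re.findall(r"[a-z]+", text.lower())
        if seed in words:
            # Count everything else on the page as co-occurring
            counts.update(w for w in words if w != seed and w not in STOPWORDS)
    return counts.most_common(top_n)
```

    The frequent co-occurring terms suggest supporting topics a page on the seed keyword should cover.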

    Also, think about how people will engage with your page. Is it obvious what the page is about? Is it obvious what the user must do next? Dense text and distracting advertising can reduce engagement, so make sure the usability is up to scratch. Text should be a reasonable size so the average person isn’t squinting. It should be broken up with headings and paragraphs. People tend to scan when reading online, searching for immediate confirmation they’ve found the right information. This was written a long time ago, but it’s interesting how relevant it remains.

    Further Reading:

    - Content Marketing Vs SEO
    - Content Analysis Using Google Analytics
    - Content Based SEO Strategy Will Eventually Fail

    8. Link Out

    Sites that don’t link out appear unnatural. Matt Cutts noted:

    Of course, folks never know when we're going to adjust our scoring. It's pretty easy to spot domains that are hoarding PageRank; that can be just another factor in scoring. If you work really hard to boost your authority-like score while trying to minimize your hub-like score, that sets your site apart from most domains. Just something to bear in mind.

    • Make a list of all outbound links
    • Determine if these links are complementary i.e. similar topic/theme, or related to the business in some way
    • Make a list of pages with no links out

    Links out are both a quality signal and good PR practice. Webmasters look at their inbound links, and will likely follow them back to see what is being said about them. That’s a great way to foster relationships, especially if your client’s site is relatively new. If you put other companies and people in a good light, you can expect many to reciprocate in kind.

    Links, the good kind, are about human relationships.

    It’s also good for your users. Your users are going to leave your site, one way or another, so you can pick up some kudos if you help them on their way by pointing them to some good authorities. If you’re wary about linking to direct competitors, then look for information resources, such as industry blogs or news sites, or anyone else you want to build a relationship with. Link to suppliers and related companies in close, but non-competing niches. Link to authoritative sites. Be very wary about pointing to low value sites, or sites that are part of link schemes. Low value sites are obvious. Sites that are part of link schemes are harder to spot, but typically feature link swapping schemes or obvious paid links unlikely to be read by visitors. Avoid link trading schemes. It’s too easy to be seen as a part of a link network, and it’s no longer 2002.

    Further Resources:

    - Five Reasons You Should Link Out
    - The Domino Effects Of Links And Relationships
    - Link Building 101: Utilizing Past Relationships

    9. Ongoing

    It’s not set and forget.

    Clients can’t expect to do a one off optimisation campaign and expect it to keep working forever. It may be self-serving for SEOs to say it, but it’s also the truth. SEO is ongoing because search keeps changing and competitors and markets move. Few companies would dream of only having one marketing campaign. The challenge for the SEO, like any marketer, is to prove the on-going spend produces a return in value.

    • Competition monitoring, i.e. scan for changes in competitors’ rankings, new competitors, and changes of tactics. Determine what is working, and emulate it.
    • Sector monitoring - monitor Google trends, keywords trends, discussion groups, and news releases. This will give you ideas for new campaign angles.
    • Reporting - the client needs to be able to see the work you’ve done is paying off.
    • Availability - clients will change things on their site, or bring in other marketers, so they will want your advice going forward

    Further Reading

    Whole books can be written about SEO for clients. And they have. We've skimmed across the surface but, thankfully, there is a wealth of great information out there on the specifics of how to tackle each of these topic areas.

    Perhaps you can weigh in? :) What would your advice be to those new to optimizing client sites? What do you wish someone had told you when you started?

  • Google Search Censorship for Fun and Profit

    Growing Up vs Breaking Things

    Facebook's early motto was "move fast and break things," but as they wanted to become more of a platform play they changed it to "move fast with stability." Anything which is central to the web needs significant stability, or it destroys many other businesses as a side effect of its instability.

    As Google has become more dominant, they've moved in the opposite direction. Disruption is promoted as a virtue unto itself, so long as it doesn't adversely impact the home team's business model.

    There are a couple different ways to view big search algorithm updates. Large, drastic updates implicitly state one of the following:

    • we were REALLY wrong yesterday
    • we are REALLY wrong today

    Any change or disruption is easy to justify so long as you are not the one facing the consequences:

    "Smart people have a problem, especially (although not only) when you put them in large groups. That problem is an ability to convincingly rationalize nearly anything." ... "Impostor Syndrome is that voice inside you saying that not everything is as it seems, and it could all be lost in a moment. The people with the problem are the people who can't hear that voice." - Googler Avery Pennarun

    Monopoly Marketshare in a Flash

    Make no mistake, large changes come with false positives and false negatives. If a monopoly keeps buying marketshare, then any mistakes they make have more extreme outcomes.

    Here's the Flash update screen (which hits almost every web browser EXCEPT Google Chrome).

    Notice the negative option installs for the Google Chrome web browser and the Google Toolbar in Internet Explorer.

    Why doesn't that same process hit Chrome? They not only pay Adobe to use security updates to steal marketshare from other browsers, but they also pay Adobe to embed Flash inside Chrome, so Chrome users never go through the bundleware update process.

    Anytime anyone using a browser other than Chrome has a Flash security update they need to opt out of the bundleware, or they end up installing Google Chrome as their default web browser, which is the primary reason Firefox marketshare is in decline.

    Google engineers "research" new forms of Flash security issues to drive critical security updates.

    Obviously, users love it:

    Has anyone noticed that the latest Flash update automatically installs Google Toolbar and Google Chrome? What a horrible business decision Adobe. Force installing software like you are Napster. I would fire the product manager that made that decision. As a CTO I will be informing my IT staff to set Flash to ignore updates from this point forward. QA staff cannot have additional items installed that are not part of the base browser installation. Ridiculous that Adobe snuck this crap in. All I can hope now is to find something that challenges Photoshop so I can move my design team away from Adobe software as well. Smart move trying to make pennies off of your high dollar customers.

    In Chrome Google is the default search engine. As it is in Firefox and Opera and Safari and Android and iOS's web search.

    In other words, in most cases across most web interfaces you have to explicitly change the default to not get Google. And then even when you do that, you have to be vigilant in protecting against the various Google bundleware bolted onto core plugins for other web browsers, or else you still end up in an ecosystem owned, controlled & tracked by Google.

    Those "default" settings are not primarily driven by user preferences, but by a flow of funds. A few hundred million dollars here, a billion there, and the market is sewn up.

    Google's user tracking is so widespread & so sophisticated that their ad cookies were a primary tool for government surveillance efforts.

    Locking Down The Ecosystem

    And Chrome is easily the most locked down browser out there.

    Whenever Google wants to promote something they have the ability to bundle it into their web browser, operating system & search results to try to force participation. In a fluid system with finite attention, over-promoting one thing means under-promoting or censoring other options. Google likes to have their cake & eat it too, but the numbers don't lie.

    I am frustrated @JohnMu saying that it will not cost CTR. Either Google lied about the increase in CTR with photos, or they're lying now. — Rand Fishkin (@randfish) June 25, 2014

    The Right to Be Forgotten

    This brings us back to the current snafu with the "right to be forgotten" in Europe.

    Google notified publishers like the BBC & The Guardian of their links being removed due to the EU "right to be forgotten" law. Their goal was to cause a public relations uproar over "censorship" which seems to have been a bit too transparent, causing them to reverse some of the removals after they got caught with their hand in the cookie jar.

    The breadth of removals is an ongoing topic of coverage. But if you are Goldman Sachs instead of a government, Google finds filtering information for you far more reasonable.

    Some have looked at the EU policy and compared it to state-run censorship in China.

    Google already hires over 10,000 remote quality raters to rate search results. How exactly is receiving 70,000 requests a monumental task? As their public relations propagandists paint this as an unbelievable burden, they are also highlighting how their own internal policies destroy smaller businesses: "If a multi-billion dollar corporation is struggling to cope with 70,000 censor requests, imagine how the small business owner feels when he/she has to disavow thousands or tens of thousands of links."

    The World's Richest Librarian

    Google aims to promote themselves as a digital librarian: "It’s a bit like saying the book can stay in the library, it just cannot be included in the library’s card catalogue."

    That analogy is absurd on a number of levels. Which librarian...

    Sorry About That Incidental Deletion From the Web...

    David Drummond's breathtaking propaganda makes it sound like Google has virtually no history in censoring access to information:

    In the past we’ve restricted the removals we make from search to a very short list. It includes information deemed illegal by a court, such as defamation, pirated content (once we’re notified by the rights holder), malware, personal information such as bank details, child sexual abuse imagery and other things prohibited by local law (like material that glorifies Nazism in Germany).

    Yet Google sends out hundreds of thousands of warning messages in webmaster tools every single month.

    Google is free to force whatever (often both arbitrary and life altering) changes they desire onto the search ecosystem. But the moment anyone else wants any level of discourse or debate into the process, they feign outrage over the impacts on the purity of their results.

    Despite Google's great power they do make mistakes. And when they do, people lose their jobs.

    Consider MetaFilter.

    They were penalized November 17, 2012.

    At a recent SMX conference Matt Cutts stated MetaFilter was a false positive.

    People noticed the Google update when it happened. It is hard to miss an overnight 40% decline in your revenues. Yet when they asked about it, Google did not confirm its existence. That economic damage hit MetaFilter for nearly two years, and they only got a potential reprieve after they fired multiple employees and were able to generate publicity about what had happened.

    As SugarRae mentioned, those false positives happen regularly, but most of the people who are hit by them lack political and media influence, and are thus slaughtered with no chance of recovery.

    MetaFilter is no different than tens of thousands of other good, worthy small businesses who are also laying off employees – some even closing their doors – as a result of Google’s Panda filter serving as judge, jury and executioner. They’ve been as blindly and unfairly cast away to an island and no one can hear their pleas for help.

    The only difference between MetaFilter and tons of other small businesses on the web is that MetaFilter has friends in higher places.

    If you read past the headlines & the token slaps of big brands, these false positive death sentences for small businesses are a daily occurrence.

    And such stories are understated for fear of coverage creating a witch-hunt:

    Conversations I’ve had with web publishers, none of whom would speak on the record for fear of retribution from Cutts’ webspam team, speak to a litany of frustration at a lack of transparency and potential bullying from Google. “The very fact I’m not able to be candid, that’s a testament to the grotesque power imbalance that’s developed,” the owner of one widely read, critically acclaimed popular website told me after their site ran afoul of Cutts’ last Panda update.

    Not only does Google engage in anti-competitive censorship, but they also frequently publish misinformation. Here's a story from a week ago of a restaurant which went under after someone changed their Google listing store hours to be closed on busy days. That misinformation was embedded directly in the search results. That business is no more.

    Then there are areas like locksmiths:

    I am one of the few Real Locksmiths here in Denver and I have been struggling with this for years now. I only get one or two calls a day now thanks to spammers, and that's not calls I do, it's calls for prices. For instance I just got a call from a lady locked out of her apt. It is 1130 pm so I told her 75 dollars, Nope she said someone told her 35 dollars....a fake locksmith no doubt. She didn't understand that they meant 35 dollars to come out and look at it. These spammers charge hundreds to break your lock, they don't know how to pick a lock, then they charge you 10 times the price of some cheap lock from a hardware store. I'm so lost, I need help from google to remove those listings. Locksmithing is all I have ever done and now I'm failing at it.

    There are entire sectors of the offline economy being reshaped by Google policies.

When those sectors get coverage, the blame always goes to the individual business owner, as though they were personally responsible for Google's behavior, or perhaps to the nefarious "spammers."

Never does anybody ask if it is reasonable for Google to place their own inaccurate $0 editorial front and center. To even bring up that issue makes one an anti-capitalist nut or someone who wishes to impede free speech rights. This even after the process behind the sausage-making comes to light.

    And while Google arbitrarily polices others, their leaked internal documents contain juicy quotes about their ad policies like:

    • “We are the only player in our industry still accepting these ads”
    • “We do not make these decisions based on revenue, but as background, [redacted].”
    • "As with all of our policies, we do not verify what these sites actually do, only what they claim to do."
    • "I understand that we should not let other companies, press, etc. influence our decision-making around policy"

    Is This "Censorship" Problem New?

This problem of control over access to information is nothing new - it is only more extreme today. Read the (rarely read) preface to Animal Farm, or consider this:

    John Milton in his fiery 1644 defense of free speech, Areopagitica, was writing not against the oppressive power of the state but of the printers guilds. Darnton said the same was true of John Locke's writings about free speech. Locke's boogeyman wasn't an oppressive government, but a monopolistic commercial distribution system that was unfriendly to ways of organizing information that didn't fit into its business model. Sound familiar?

    When Google complains about censorship, they are not really complaining about what may be, but what already is. Their only problem is the idea that someone other than themselves should have any input in the process.

    "Policy is largely set by economic elites and organized groups representing business interests with little concern for public attitudes or public safety, as long as the public remains passive and obedient." ― Noam Chomsky

    Many people have come to the same conclusion

    Turn on, tune in, drop out

    "I think as technologists we should have some safe places where we can try out some new things and figure out what is the effect on society, what's the effect on people, without having to deploy kind of into the normal world. And people like those kind of things can go there and experience that and we don't have mechanisms for that." - Larry Page

I have no problem with an "opt-in" techno-utopia test in some remote corner of the world, but if that's the sort of operation he wants to run, it would be appreciated if he stopped bundling his software into billions of electronic devices & assuming everyone else is fine with "opting out."



  • This Just In

A couple years ago we published an article named Branding & the Cycle, which highlighted how brands would realign with the algorithmic boost they gained from Panda & use their increased level of trust to grow their profit margins through algorithmic journalism.

    Narrative Science has been a big player in the algorithmic journalism game for years. But they are not the only player in the market. Recently the Associated Press (AP) announced they will use algorithms to write articles based on quarterly earnings reports, working with a company named Automated Insights:

    We discovered that automation technology, from a company called Automated Insights, paired with data from Zacks Investment Research, would allow us to automate short stories – 150 to 300 words — about the earnings of companies in roughly the same time that it took our reporters.

    And instead of providing 300 stories manually, we can provide up to 4,400 automatically for companies throughout the United States each quarter.
    ...
    Zacks maintains the data when the earnings reports are issued. Automated Insights has algorithms that ping that data and then in seconds output a story.
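Neither Automated Insights nor the AP has published their actual pipeline, but the "ping the data, output a story in seconds" approach described above is essentially fill-in-the-blanks template generation. A minimal hypothetical sketch (all company figures, field names, and wording are invented for illustration):

```python
# Hypothetical sketch of template-based earnings-story generation.
# All data, field names, and template wording are invented; this only
# illustrates the general fill-in-the-blanks approach, not any
# vendor's actual system.

def beat_or_missed(actual, estimate):
    """Pick a verb phrase based on how actuals compare to estimates."""
    if actual > estimate:
        return "beat"
    if actual < estimate:
        return "fell short of"
    return "matched"

def generate_story(company):
    eps, est = company["eps"], company["eps_estimate"]
    verb = beat_or_missed(eps, est)
    return (
        f"{company['name']} ({company['ticker']}) reported quarterly "
        f"earnings of ${eps:.2f} per share, which {verb} analyst "
        f"estimates of ${est:.2f}. Revenue came in at "
        f"${company['revenue'] / 1e6:.1f} million."
    )

report = {"name": "Acme Corp", "ticker": "ACME",
          "eps": 1.32, "eps_estimate": 1.25, "revenue": 48_700_000}
print(generate_story(report))
```

Once the template and data feed exist, producing 4,400 stories per quarter instead of 300 is just a loop over more rows, which is exactly why the economics favor automation at this scale.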

    In the past Matt Cutts has mentioned how thin rewrites are doorway page spam:

    you can also have more subtle doorway pages. so we ran into a directv installer in denver, for example. and that installer would say I install for every city in Colorado. so I am going to make a page for every single city in Colorado. and Boulder or Aspen or whatever I do directv install in all of those. if you were just to land on that page it might look relatively reasonable. but if you were to look at 4 or 5 of those you would quickly see that the only difference between them is the city, and that is something that we would consider a doorway.
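The pattern Cutts describes, many pages that are identical except for one swapped-in city name, is trivial to produce programmatically, and just as trivial to spot. A hypothetical sketch of both sides (the template text and city list are invented):

```python
# Sketch of the doorway-page pattern Cutts describes: one template,
# many near-identical pages differing only in the city name.
# The template text and city list here are invented for illustration.

TEMPLATE = ("Looking for DirecTV installation in {city}? Our certified "
            "technicians serve {city} and the surrounding area. "
            "Call today for a free quote.")

cities = ["Denver", "Boulder", "Aspen"]
pages = {city: TEMPLATE.format(city=city) for city in cities}

# A crude check shows why these look like doorways to an algorithm:
# substitute the city name back out and every page collapses into
# the same underlying template.
normalized = {page.replace(city, "{city}") for city, page in pages.items()}
print(len(pages), "pages, but only", len(normalized), "distinct template(s)")
```

Any single page "might look relatively reasonable," as Cutts says; it is only when the pages are normalized against each other that the doorway pattern becomes obvious.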

    One suspects these views do not apply to large politically connected media bodies like the AP, which are important enough to have a direct long-term deal with Google.

In the same announcement, the AP noted that it also includes automated NFL player rankings. One interesting thing to note about the AP is that they have syndication deals with 1,400 daily newspapers nationwide, as well as thousands of TV and radio stations.

    A single automated AP article might appear on thousands of websites. When thousands of articles are automated, that means millions of copies. When millions of articles are automated, that means billions of copies. When billions ... you get the idea.

    To date Automated Insights has raised a total of $10.8 million. With that limited funding they are growing quickly. Last year their Wordsmith software produced 300 million stories & this year it will likely exceed a billion articles:

    "We are the largest producer of content in the world. That's more than all media companies combined," [Automated Insights CEO Robbie Allen] said in a phone interview with USA TODAY.

    The Automated Insights homepage lists both Yahoo! & Microsoft as clients.

    The above might sound a bit dystopian (for those with careers in journalism and/or lacking equity in Automated Insights and/or publishers who must compete against algorithmically generated content), but the story also comes with a side of irony.

Last year Google dictated that press releases shall use nofollow links. All the major press release sites quickly fell in line & adopted nofollow, thinking they would remain in Google's good graces. Unfortunately for those sites, they were crushed by Panda. PR Newswire's solution to their penalty was a greater emphasis on manual editorial review:

    Under the new copy quality guidelines, PR Newswire editorial staff will review press releases for a number of message elements, including:

    • Inclusion of insightful analysis and original content (e.g. research, reporting or other interesting and useful information);
    • Use of varied release formats, guarding against repeated use of templated copy (except boilerplate);
    • Assessing release length, guarding against the issuing of very short, unsubstantial messages that are mere vehicles for links;
    • Overuse of keywords and/or links within the message.

    So now we are in a situation where press release sites require manual human editorial oversight to try to get out of being penalized, and the news companies (which currently enjoy algorithmic ranking boosts) are leveraging those same "spammy" press releases using software to auto-generate articles based on them.

    That makes sense & sounds totally reasonable, so long as you don't actually think about it (or work at Google)...


