As Google continues to transpose the idea and essence of the real-world, physical marketplace through its various platforms (Google+, Shopping, Google+ Local, etc.), big brands will continue to have a distinct advantage over everyone else. They can build faster, spend more, and amplify their messages and reach through multiple channels (traditional and digital), all of which translates to “consumer trust”.
As Eric Schmidt said five years ago, in a quote I keep coming back to, “Brands are the solution, not the problem… Brands are how you sort out the cesspool.”
It comes down to entity building, solidification, and continually establishing (re)connections to the brand from its disparate digital properties.
When you think about “brand-building” in that manner, every company is a brand and has the opportunity to establish itself as such with Google by simply connecting the dots in the knowledge/entity graph.
SEO is really becoming more of a technical art: one that clarifies and solidifies signals around “entity” and “brand” and connects all the dots for the engines. In that vein, enterprise-level Technical SEO is a cornerstone of winning the web bazaar, and traditional SEO methods (think on-site optimization) are discounted more and more every day.
That’s not to say these measures aren’t essential or important to square away; it’s simply that their impact is diminished if you haven’t built a strong technical foundation.
When you deal with big brands [enterprise-level], you tend to deal with some enormous sites and ecosystems.
And they come with a good deal of problems: frankenstein’d information architectures, lots of crawl inefficiencies (from technical errors to non-canonicalization/duplication), and strange little one-off microsites that polka-dot the web. All of these scatter and weaken that brand/entity connection and diffuse search results and positions.
We’ve all seen some frankenstein’d architectures that simply boggle the mind; things that make you step back and let out a “WTF” breath of exasperation. Whether it’s a smattering of random files hanging off the root of the site, a patch of important product/service-centric landing pages nestled 5 levels deep, or URL structures that magically change 2 levels in, it’s enough to drive you mad. The first step is to discover the true architecture of the site.
I use Screaming Frog. It is my all-purpose, go-to tool. Even with large sites (50K+ pages), Screaming Frog [the licensed version] is able to give you an idea of what’s been happening architecturally over the last several years. You simply drop in the site without the “www” (as you want to capture any/all subdomains) and let it rip.
When the spider is finished, export to Excel, sort by “Address”, and presto: you’ve got the information architecture of the site. From here you can find architecture inefficiencies and make recommendations to keep content themes concise and in close proximity. You can find ways to compact unusually deep architectures that crawlers will be hard-pressed to crawl regularly, or build out very shallow information silos to bolster authority and density around certain keyword-phrase themes.
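If you want a quick read on depth without hand-sorting in Excel, the same export can be profiled with a few lines of script. This is a minimal sketch, not part of the original workflow; the URLs below are hypothetical stand-ins for the “Address” column of a real Screaming Frog export.

```python
# Sketch: profile URL depth from a crawl export's "Address" column.
# The sample_export data is a hypothetical stand-in for a real CSV file.
import csv
import io
from collections import Counter
from urllib.parse import urlparse

sample_export = """Address
http://www.example.com/
http://www.example.com/products/
http://www.example.com/products/widgets/
http://www.example.com/products/widgets/blue/large/extra/page.html
http://blog.example.com/2012/01/some-post/
"""

depth_counts = Counter()
for row in csv.DictReader(io.StringIO(sample_export)):
    path = urlparse(row["Address"]).path
    # Depth = number of non-empty path segments (the homepage is depth 0).
    depth = len([seg for seg in path.split("/") if seg])
    depth_counts[depth] += 1

# Pages buried many levels deep are candidates for flattening.
for depth in sorted(depth_counts):
    print(depth, depth_counts[depth])
```

Swap the sample string for `open("internal_all.csv")` against a real export and the depth histogram falls out the same way.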
Just Canonicalize It, Yo
Here too, Screaming Frog is your best friend. While that first run may have been painfully long, it’ll be worth every second. You can conduct multiple custom sorts in your Excel file to find all the technical crawl errors (404s and 500s) as well as discover all the non-canonical/duplicate pages hanging out around the site.
You’d be surprised how many non-canonical domains I still discover: entire replications of sites. You’ll also be able to discover all the URL errors, like three versions of the same page, each of them “unique” only because of URL casing.
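Case-duplicates are easy to surface programmatically once you have the crawl export. A minimal sketch, with hypothetical URLs standing in for real crawl data:

```python
# Sketch: find pages that are "unique" only because of URL casing.
# The addresses list is hypothetical stand-in data from a crawl export.
from collections import defaultdict

addresses = [
    "http://www.example.com/Products/Widget.html",
    "http://www.example.com/products/widget.html",
    "http://www.example.com/PRODUCTS/WIDGET.html",
    "http://www.example.com/about/",
]

groups = defaultdict(list)
for url in addresses:
    groups[url.lower()].append(url)

# Any group with more than one member is a set of case-duplicates
# that should be consolidated onto a single canonical version.
duplicates = {key: urls for key, urls in groups.items() if len(urls) > 1}
for key, variants in duplicates.items():
    print(key, "->", len(variants), "variants")
```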
You’ll also want to set up Google Webmaster Tools and Bing Webmaster Tools for the site. While Screaming Frog is pretty good about discovering what’s there today, the webmaster tool suites have memory and will help you discover nearly all the 404s, soft 404s, 500s, and crawl errors they’ve ever picked up. I also like to search each engine’s index for disparate pages cataloged long ago that Screaming Frog didn’t pick up.
Think about how much brand authority and signal is being split and scattered between all these duplicate versions of the same page. How much is being deflected and lost due to 400-range errors? Help Google solidify the entity and attribute all the authority to single, canonical versions of product and service content. Let the 301-fest begin!
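Before the 301-fest, it helps to hand IT a clean map of old URL to canonical URL. Here’s a minimal sketch of one way to generate it; the URLs are hypothetical, and the canonicalization policy (“www host, lowercase path”) is just one reasonable choice, not the only one.

```python
# Sketch: build a 301 redirect map from crawled variants to one
# canonical URL. Policy here (assumed): www host + lowercase path wins.
from urllib.parse import urlparse, urlunparse

crawled = [
    "http://example.com/Products/Widget.html",
    "http://www.example.com/products/widget.html",
    "http://www.example.com/PRODUCTS/Widget.html",
]

def canonicalize(url):
    parts = urlparse(url)
    host = parts.netloc.lower()
    if not host.startswith("www."):
        host = "www." + host
    # Drop params/query/fragment; lowercase the path.
    return urlunparse((parts.scheme, host, parts.path.lower(), "", "", ""))

redirect_map = {url: canonicalize(url) for url in crawled
                if url != canonicalize(url)}
for old, new in redirect_map.items():
    print("301:", old, "->", new)
```

The output is effectively a to-do list of 301s: every variant points at the same single canonical page.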
Bring It Back to the Mothership
Big brands have this strange habit of building microsites. For everything. It seems that any time they have a wandering, errant thought about a new product, service, or product/service enhancement, it’s deemed deserving of a microsite.
Mostly they are big ole’ globs of over-the-top, “hyper-sexy” Flash functionality that serve as a quasi-portal to funnel you back to the parent site at the end of the day. Ugh. And there could be several of these one-page/two-page micro-blobs, upwards of 10 or 20 depending on how frisky your brand was, fragmenting brand and entity authority across the web. Microsites are not the easiest things to discover, so having a good relationship with your client is important.
Through the natural course of research you’ll likely discover one or two, but where there’s smoke there’s definitely fire. Ask for a list of the domains they own; it may take some time to pry loose, but you’ll find a few microsites that you would never have thought to associate with them (i.e., the domain has nothing to do with the brand or product).
From here, it’s a matter of redirecting the entire microsite at the domain level back to the best location on the parent site (this is usually the easiest solution for an IT team to implement, and microsites rarely cover multiple products). You restore brand and entity authority, get the parent site ranking, and stop intra-brand competition in the SERPs.
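One common implementation, sketched here for a hypothetical Apache setup (the domains and target path are placeholders, not anything from a real client), is a catch-all permanent redirect in the microsite’s vhost or .htaccess:

```apache
# Hypothetical example: send every URL on the microsite domain to the
# most relevant page on the parent site with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-microsite\.com$ [NC]
RewriteRule ^ https://www.parentbrand.com/products/widget/ [R=301,L]
```

Because it’s a blanket rule rather than page-by-page mapping, it’s the kind of change an IT team can ship in minutes.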
With these measures in place, you create a solid foundation where even minimal on-site effort is magnified for a brand, and you help establish and restore its place in Google’s entity graph.