Newsfeeds from around the industry
Google Webmaster Central Blog
Official news on crawling and indexing sites for the Google index.

  • #NoHacked: a global campaign to spread hacking awareness
    Webmaster level: All

    This June, we introduced a weeklong social campaign called #NoHacked. The goals for #NoHacked are to bring awareness to hacking attacks and offer tips on how to keep your sites safe from hackers.

    We held the campaign in 11 languages on multiple channels including Google+, Twitter and Weibo. About 1 million people viewed our tips and hundreds of users used the hashtag #NoHacked to spread awareness and to share their own tips. Check them out below!

    Posts we shared during the campaign:


    Some of the many tips shared by users across the globe:
    • Pablo Silvio Esquivel from Brazil recommends that users avoid pirated software (source)
    • Rens Blom from the Netherlands suggests using different passwords for your accounts, changing them regularly, and adding an extra layer of security such as two-step authentication (source)
    • Дмитрий Комягин from Russia says to regularly monitor traffic sources, search queries and landing pages, and to look out for spikes in traffic (source)
    • 工務店コンサルタント from Japan advises everyone to choose a good hosting company that's knowledgeable about hacking issues, and to set up email forwarding in Webmaster Tools (source)
    • Kamil Guzdek from Poland advocates changing the default table prefix in wp-config.php to a custom one when installing a new WordPress site, to lower the risk of the database being compromised (source); a minimal sketch of this tip follows below
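
    As an illustration of Kamil's tip, here is a minimal sketch; the prefix value 'wpx7_' is an arbitrary example, and the setting has to be in place before the WordPress installer runs:

        <?php
        // wp-config.php (excerpt)
        // The default prefix, which automated attacks commonly assume:
        // $table_prefix = 'wp_';

        // A custom prefix makes injection attacks against the
        // well-known default table names harder:
        $table_prefix = 'wpx7_';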

    Hacking is still a surprisingly common issue around the world so we highly encourage all webmasters to follow these useful tips. Feel free to continue using the hashtag #NoHacked to share your own tips or experiences around hacking prevention and awareness. Thanks for supporting the #NoHacked campaign!

    And in the unfortunate event that your site gets hacked, we’ll help you toward a speedy and thorough recovery.

    Posted by your friendly #NoHacked helpers


  • HTTPS as a ranking signal

    Webmaster level: all

    Security is a top priority for Google. We invest a lot in making sure that our services use industry-leading security, like strong HTTPS encryption by default. That means that people using Search, Gmail and Google Drive, for example, automatically have a secure connection to Google.

    Beyond our own stuff, we’re also working to make the Internet safer more broadly. A big part of that is making sure that websites people access from Google are secure. For instance, we have created resources to help webmasters prevent and fix security breaches on their sites.

    We want to go even further. At Google I/O a few months ago, we called for “HTTPS everywhere” on the web.

    We’ve also seen more and more webmasters adopting HTTPS (also known as HTTP over TLS, or Transport Layer Security) on their websites, which is encouraging.

    For these reasons, over the past few months we’ve been running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms. We've seen positive results, so we're starting to use HTTPS as a ranking signal. For now it's only a very lightweight signal — affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content — while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.




    In the coming weeks, we’ll publish detailed best practices (now available in our help center) to make TLS adoption easier and to help you avoid common mistakes. Here are some basic tips to get started:

    • Decide the kind of certificate you need: single-domain, multi-domain, or wildcard
    • Use 2048-bit key certificates
    • Use relative URLs for resources that reside on the same secure domain
    • Use protocol-relative URLs for all other domains (see the markup sketch after this list)
    • Check out our Site move article for more guidelines on how to change your website’s address
    • Don’t block your HTTPS site from being crawled using robots.txt
    • Allow indexing of your pages by search engines where possible; avoid the noindex robots meta tag
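
    As a rough illustration of the two URL tips above (the domain names are placeholders), markup on an HTTPS page could look like this:

        <!-- Same secure domain: a relative URL stays on HTTPS automatically -->
        <img src="/images/logo.png" alt="Logo">

        <!-- Other domains: a protocol-relative URL inherits the page's scheme -->
        <script src="//cdn.example.com/library.js"></script>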

    If your website is already serving on HTTPS, you can test its security level and configuration with the Qualys Lab tool. If you are concerned about TLS and your site’s performance, have a look at Is TLS fast yet?. And of course, if you have any questions or concerns, please feel free to post in our Webmaster Help Forums.

    We hope to see more websites using HTTPS in the future. Let’s all make the web more secure!

    Posted by Zineb Ait Bahajji and Gary Illyes, Webmaster Trends Analysts


  • Introducing the Google News Publisher Center
    (Cross-posted on the Google News Blog)

    Webmaster level: All

    If you're a news publisher, your website has probably evolved and changed over time, just like your stories. But in the past, when you made changes to the structure of your site, we might not have discovered your new content. That meant a lost opportunity for your readers, and for you. Unless you regularly checked Webmaster Tools, you might not even have realized that your new content wasn’t showing up in Google News. To prevent this from happening, we are letting you make changes to our record of your news site using the just-launched Google News Publisher Center.

    With the Publisher Center, your potential readers can be more informed about the articles they’re clicking on and you benefit from better discovery and classification of your news content. After verifying ownership of your site using Google Webmaster Tools, you can use the Publisher Center to directly make the following changes:

    • Update your news site details, including changing your site name and labeling your publication with any relevant source labels (e.g., “Blog”, “Satire” or “Opinion”)
    • Update your section URLs when you change your site structure (e.g., when you add a new section such as http://example.com/2014commonwealthgames or http://example.com/elections2014)
    • Label your sections with a specific topic (e.g., “Technology” or “Politics”)

    Whenever you make changes to your site, we’d recommend also checking our record of it in the Publisher Center and updating it if necessary.

    Try it out, or learn more about how to get started.

    At the moment the tool is only available to publishers in the U.S., but we plan to introduce it in other countries soon and to add more features. In the meantime, we’d love to hear from you about what works well and what doesn’t. Ultimately, our goal is to make this a platform where news publishers and Google News can work together to provide readers with the best, most diverse news on the web.

    Posted by Eric Weigle, Software Engineer


  • Testing robots.txt files made easier

    Webmaster level: intermediate-advanced

    To crawl, or not to crawl, that is the robots.txt question.

    Making and maintaining correct robots.txt files can sometimes be difficult. While most sites have it easy (tip: they often don't even need a robots.txt file!), finding the directives within a large robots.txt file that are or were blocking individual URLs can be quite tricky. To make that easier, we're now announcing an updated robots.txt testing tool in Webmaster Tools.

    You can find the updated testing tool in Webmaster Tools, within the Crawl section.

    Here you'll see the current robots.txt file, and you can test new URLs to see whether they're disallowed for crawling. To guide your way through complicated directives, the tool highlights the specific one that led to the final decision. You can make changes in the file and test those too; you'll just need to upload the new version of the file to your server afterwards for the changes to take effect. Our developers site has more about robots.txt directives and how the files are processed.

    Additionally, you'll be able to review older versions of your robots.txt file, and see when access issues block us from crawling. For example, if Googlebot sees a 500 server error for the robots.txt file, we'll generally pause further crawling of the website.

    Since there may be some errors or warnings shown for your existing sites, we recommend double-checking their robots.txt files. You can also combine this with other parts of Webmaster Tools: for example, you might use the updated Fetch as Google tool to render important pages on your website. If any blocked URLs are reported, you can use this robots.txt tester to find the directive that's blocking them, and then, of course, fix it. A common problem we've seen comes from old robots.txt files that block CSS, JavaScript, or mobile content; fixing that is often trivial once you've seen it.
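
    For instance, a hypothetical legacy robots.txt along these lines would cause exactly that problem, and removing (or overriding) the offending directives is the whole fix:

        User-agent: *
        # Legacy rules that hide page assets from crawlers:
        Disallow: /css/
        Disallow: /js/

        # Fix: delete the lines above, or explicitly allow the assets:
        Allow: /css/
        Allow: /js/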

    We hope this updated tool makes it easier for you to test and maintain your robots.txt file. Should you have any questions, or need help crafting a good set of directives, feel free to drop by our webmaster help forum!

    Posted by Asaph Arnon, Webmaster Tools team


  • Promoting modern websites for modern devices in Google search results
    Webmaster level: all

    A common annoyance for web users is when websites require browser technologies that are not supported by their device. When users access such pages, they may see nothing but a blank space or miss out on a large portion of the page's contents.

    Starting today in our English search results in the US, we will indicate to searchers when our algorithms detect pages that may not work on their devices. For example, Adobe Flash is not supported on iOS devices or on Android versions 4.1 and higher, so a page whose contents are mostly Flash may be flagged accordingly.

    Developing modern multi-device websites

    Fortunately, making websites that work on all modern devices is not that hard: websites can use HTML5 since it is universally supported, sometimes exclusively, by all devices. To help webmasters build websites that work on all types of devices regardless of the type of content they wish to serve, we recently announced two resources:
    • Web Fundamentals: a curated source for modern best practices.
    • Web Starter Kit: a starter framework supporting the Web Fundamentals best practices out of the box.
    By following the best practices described in Web Fundamentals you can build a responsive web design, which has long been Google's recommendation for search-friendly sites. Be sure not to block Googlebot from crawling any of the page assets (CSS, JavaScript, and images), whether via robots.txt or otherwise. Full access to these external files helps our algorithms detect your site's responsive web design configuration and treat it appropriately. You can use the Fetch and render as Google feature in Webmaster Tools to test how our indexing algorithms see your site.
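
    As a minimal sketch of such a responsive setup (the file name and breakpoint are illustrative), a viewport declaration plus one media query is enough to adapt a layout to screen width:

        <!-- In the page <head>: map CSS pixels to the device width -->
        <meta name="viewport" content="width=device-width, initial-scale=1">

        /* styles.css: one column on narrow screens, two side by side on wider ones */
        .column { width: 100%; }
        @media (min-width: 640px) {
          .column { width: 50%; float: left; }
        }
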
    As always, if you need more help you can ask a question in our webmaster forum.
    Posted by Keita Oda, Software Engineer, and , Webmaster Trends Analyst

