Google Webmaster Central Blog

Webmaster Conference: an event made for you

Over the years, we've attended hundreds of conferences, spoken to thousands of webmasters, and recorded hundreds of hours of videos to help web creators find information about how to perform better in Google Search results. Now we'd like to go further and help those who aren't able to travel internationally access the same information. Today we're officially announcing the Webmaster Conference, a series of local events around the world.

These events are primarily located where it's difficult to access search conferences or information about Google Search, or where there's a specific need for a Search event. For example, if we identify that a region has problems with hacked sites, we may organize an event focusing on that specific topic. We want web creators to have equal opportunity in Google Search regardless of their language, financial status, gender, location, or any other attribute.

The conferences are always free and easily accessible in the region where they're organized, and, based on feedback from the local communities and our analyses, they're tailored to the audience that signed up for the events. That means it doesn't matter how much you already know about Google Search; the event you attend will have takeaways tailored to you. The talks will be in the local language (through interpreters in the case of international speakers), and we'll do our best to also offer sign language interpretation if requested.

Image: Webmaster Conference Okinawa

The structure of the event varies from region to region. For example, in Okinawa, Japan, we had a wonderful half-day event with novice and advanced web creators where we focused on how to perform better in Google Images. At Webmaster Conference India and Indonesia, that might change, and we may focus more on how to create faster websites. We will also host web communities in Europe and North America later this year, so keep an eye out for the announcements! We will continue attending external events as usual; we are doing these events to complement the existing ones.

If you want to learn more about our upcoming events, visit the Webmaster Conference site, which we'll update monthly, and follow our blogs and @googlewmc on Twitter!

Posted by Takeaki Kanaya and Gary

A video series on SEO myths for web developers

We invited members of the SEO and web developer community to join us for a new video series called "SEO Mythbusting". In this series, we discuss various topics around SEO from a developer's perspective, how we can work to make the "SEO black box" more transparent, and what technical SEO might look like as the web keeps evolving.

We've already published a few episodes:

Web developer's 101
A look at Googlebot
Microformats and structured data
JavaScript and SEO

We have a few more episodes for you, and we will launch the next episodes weekly on the Google Webmasters YouTube channel, so don't forget to subscribe to stay in the loop. You can also find all published episodes in this YouTube playlist. We look forward to hearing your feedback, topic suggestions, and guest recommendations in the YouTube comments as well as on our Twitter account!

Posted by Martin Splitt, friendly web fairy & series host, WTA team

Mobile-First Indexing by default for new domains

Over the years since announcing mobile-first indexing - Google's crawling of the web using a smartphone Googlebot - our analysis has shown that new websites are generally ready for this method of crawling. Accordingly, we're happy to announce that mobile-first indexing will be enabled by default for all new websites (ones previously unknown to Google Search) starting July 1, 2019. It's fantastic to see that new websites are now generally showing users - and search engines - the same content on both mobile and desktop devices!

You can continue to check for mobile-first indexing of your website by using the URL Inspection Tool in Search Console. By looking at a URL on your website there, you'll quickly see how it was last crawled and indexed. For older websites, we'll continue monitoring and evaluating pages for their readiness for mobile-first indexing, and will notify them through Search Console once they're seen as being ready. Since the default state for new websites will be mobile-first indexing, there's no need to send a notification.

Image: Using the URL Inspection Tool to check the mobile-first indexing status

Our guidance on making all websites work well for mobile-first indexing continues to be relevant, for new and existing sites. For existing websites, we determine their readiness for mobile-first indexing based on parity of content (including text, images, videos, links), structured data, and other meta data (for example, titles and descriptions, robots meta tags). We recommend double-checking these factors when a website is launched or significantly redesigned (a quick spot-check sketch appears at the end of this post).

While we continue to support responsive web design, dynamic serving, and separate mobile URLs for mobile websites, we recommend responsive web design for new websites. Because of the issues and confusion we've seen from separate mobile URLs over the years, both from search engines and users, we recommend using a single URL for both desktop and mobile websites.

Mobile-first indexing has come a long way. We're happy to see how the web has evolved from being focused on desktop, to becoming mobile-friendly, and now to being mostly crawlable and indexable with mobile user-agents! We realize it has taken a lot of work from your side to get there, and on behalf of our mostly-mobile users, we appreciate that. We'll continue to monitor and evaluate this change carefully. If you have any questions, please drop by our Webmaster forums or our public events.

Posted by John Mueller, Developer Advocate, Google Zurich
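As a quick illustration of the parity spot-check mentioned above, here is a minimal sketch (not an official Google tool) that fetches a page with a desktop and a smartphone user agent and compares the title tags. Node.js 18 or later (with a global fetch) is assumed, and the user-agent strings and URL are placeholders.

```javascript
// Minimal parity spot-check: compare the <title> served to a desktop UA
// and a mobile UA. The user-agent strings below are illustrative only.
const DESKTOP_UA = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)';
const MOBILE_UA = 'Mozilla/5.0 (Linux; Android 10; Pixel 3)';

async function fetchTitle(url, userAgent) {
  const res = await fetch(url, { headers: { 'User-Agent': userAgent } });
  const html = await res.text();
  const match = html.match(/<title>([^<]*)<\/title>/i);
  return match ? match[1].trim() : '(no title found)';
}

async function compare(url) {
  const [desktop, mobile] = await Promise.all([
    fetchTitle(url, DESKTOP_UA),
    fetchTitle(url, MOBILE_UA),
  ]);
  console.log('Desktop title:', desktop);
  console.log('Mobile title: ', mobile);
  console.log(desktop === mobile ? 'Titles match.' : 'Titles differ - check parity.');
}

compare('https://example.com/'); // hypothetical URL
```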

Search at Google I/O 2019

Google I/O is our yearly developer conference where we have the pleasure of announcing some exciting new Search-related features and capabilities. A good place to start is Google Search: State of the Union, which explains how to take advantage of the latest capabilities in Google Search. We also gave more details on how JavaScript and Google Search work together and what you can do to make sure your JavaScript site performs well in Search.

Try out new features today

Here are some of the new features, codelabs, and documentation that you can try out today:

Googlebot now runs the latest Chromium rendering engine: This means Googlebot now supports new features, like ES6, IntersectionObserver for lazy-loading, and Web Components v1 APIs (a small lazy-loading sketch appears at the end of this post). Googlebot will regularly update its rendering engine. Learn more about the update in our Google Search and JavaScript talk, our blog post, and our updated guidance on how to fix JavaScript issues for Google Search.

How-to & FAQ launched on Google Search and the Assistant: You can get started today by following the developer documentation: How-to and FAQ. We also launched supporting Search Console reports. Learn more about How-to and FAQ in our structured data talk.

Find and listen to podcasts in Search: Last week, we launched the ability to listen to podcasts directly on Google Search when you search for a certain show. In the coming months, we'll start surfacing podcasts in search results based on the content of the podcast, and let users save episodes for listening later. To enable your podcast in Search, follow the Podcast developer documentation.

Try our new codelabs: Check out our new codelabs about how to add structured data, fix a Single Page App for Search, and implement Dynamic Rendering with Rendertron.

Be among the first to test new features

Your help is invaluable to making sure our products work for everyone. We shared some new features that we're still testing and would love your feedback and participation.

Speed report: We're currently piloting the new Speed report in Search Console. Sign up to be a beta tester.

Mini-apps: We announced Mini-apps, which engage users with interactive workflows and live content directly on Search and the Assistant. Submit your idea for the Mini-app Early Adopters Program.

Learn more about what's coming soon

I/O is a place where we get to showcase new Search features, so we're excited to give you a heads up on what's next on the horizon:

High-resolution images: In the future, you'll be able to opt in to highlight your high-resolution images for your users. Stay tuned for details.

3D and AR in Search: We are working with partners to bring 3D models and AR content to Google Search. Check out what it might look like and stay tuned for more details.

We hope these cool announcements help and inspire you to create even better websites that work well in Search. Should you have any questions, feel free to post in our webmaster help forums, contact us on Twitter, or reach out to us at any of the next events we're at.

Posted by Lizzi Harvey, Technical Writer
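To illustrate one of the features mentioned above, here is a minimal lazy-loading sketch using IntersectionObserver, which the updated Googlebot rendering engine can now execute. The data-src attribute convention is an assumption for this example, not something prescribed by Google.

```javascript
// Lazy-load images marked up as <img data-src="..."> by swapping in the real
// source only when the image approaches the viewport.
const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach(entry => {
    if (!entry.isIntersecting) return;
    const img = entry.target;
    img.src = img.dataset.src; // swap in the real image URL
    obs.unobserve(img);        // load each image only once
  });
});

document.querySelectorAll('img[data-src]').forEach(img => observer.observe(img));
```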

New in structured data: FAQ and How-to

Over the last few weeks, we've been discussing structured data: first providing best practices and then showing how to monitor it with Search Console. Today we are announcing support for FAQ and How-to structured data on Google Search and the Google Assistant, including new reports in Search Console to monitor how your site is performing.

In this post, we provide details to help you implement structured data on your FAQ and how-to pages in order to make your pages eligible to appear on Google Search as rich results and as How-to Actions for the Assistant. We also show examples of how to monitor your search appearance with the new Search Console enhancement reports.

Disclaimer: Google does not guarantee that your structured data will show up in search results, even if your page is marked up correctly. To determine whether a result gets a rich treatment, Google algorithms use a variety of additional signals to make sure that users see rich results when their content best serves the user's needs. Learn more about structured data guidelines.

How-to on Search and the Google Assistant

How-to rich results provide users with richer previews of web results that guide users through step-by-step tasks. For example, if you provide information on how to tile a kitchen backsplash, tie a tie, or build a treehouse, you can add How-to structured data to your pages to enable the page to appear as a rich result on Search and as a How-to Action for the Assistant.

Add structured data to the steps, tools, duration, and other properties to enable a How-to rich result for your content on the search page. If your page uses images or video for each step, make sure to mark up your visual content to enhance the preview and expose a more visual representation of your content to users. Learn more about the required and recommended properties you can use in your markup in the How-to developer documentation.

Your content can also start surfacing on the Assistant through new voice-guided experiences. This feature lets you expand your content to new surfaces, to help users complete tasks wherever they are, and interactively progress through the steps using voice commands. As shown in the Google Home Hub example below, the Assistant provides a conversational, hands-free experience that can help users complete a task. This is an incredibly lightweight way for web developers to expand their content to the Assistant. For more information about How-to for the Assistant, visit Build a How-to Guide Action with Markup.

To help you monitor How-to markup issues, we launched a report in Search Console that shows all errors, warnings, and valid items for pages with HowTo structured data. Learn more about how to use the report to monitor your results.

FAQ on Search and the Google Assistant

An FAQ page provides a list of frequently asked questions and answers on a particular topic. For example, an FAQ page on an e-commerce website might provide answers on shipping destinations, purchase options, return policies, and refund processes.
By using FAQPage structured data, you can make your content eligible to have these questions and answers displayed directly on Google Search and the Assistant, helping users quickly find answers to frequently asked questions. FAQ structured data is only for official questions and answers; don't add FAQ structured data to forums or other pages where users can submit answers to questions - in that case, use the Q&A Page markup. You can learn more about implementation details in the FAQ developer documentation.

To provide more ways for users to access your content, FAQ answers can also be surfaced on the Google Assistant. Your users can invoke your FAQ content by asking direct questions and get the answers that you marked up in your FAQ pages. For more information, visit Build an FAQ Action with Markup.

To help you monitor FAQ issues and search appearance, we also launched an FAQ report in Search Console that shows all errors, warnings, and valid items related to your marked-up FAQ pages.

We would love to hear your thoughts on how FAQ or How-to structured data works for you. Send us any feedback either through Twitter or our forum.

Posted by Daniel Waisberg, Damian Biollo, Patrick Nevels, and Yaniv Loewenstein
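To make the FAQPage markup described above concrete, here is a minimal sketch that builds the JSON-LD as a JavaScript object and injects it into the page; the question and answer text is hypothetical, and serving the same markup in static HTML works just as well.

```javascript
// Minimal FAQPage JSON-LD, injected with a script element.
// The question and answer below are hypothetical examples.
const faqData = {
  '@context': 'https://schema.org',
  '@type': 'FAQPage',
  'mainEntity': [{
    '@type': 'Question',
    'name': 'Which countries do you ship to?',
    'acceptedAnswer': {
      '@type': 'Answer',
      'text': 'We currently ship to the US and Canada.'
    }
  }]
};

const faqScript = document.createElement('script');
faqScript.type = 'application/ld+json';
faqScript.textContent = JSON.stringify(faqData);
document.head.appendChild(faqScript);
```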

The new evergreen Googlebot

Googlebot is the crawler that visits web pages to include them within the Google Search index. The number one question we got from the community at events and on social media was whether we could make Googlebot evergreen with the latest Chromium. Today, we are happy to announce that Googlebot now runs the latest Chromium rendering engine (74 at the time of this post) when rendering pages for Search. Moving forward, Googlebot will regularly update its rendering engine to ensure support for the latest web platform features.

What that means for you

Compared to the previous version, Googlebot now supports 1000+ new features, like:

ES6 and newer JavaScript features
IntersectionObserver for lazy-loading
Web Components v1 APIs

You should check if you're transpiling or using polyfills specifically for Googlebot and, if so, evaluate whether this is still necessary (see the sketch at the end of this post). There are still some limitations, so check our troubleshooter for JavaScript-related issues and the video series on JavaScript SEO.

Any thoughts on this? Talk to us on Twitter, the webmaster forums, or join us for the online office hours.

Posted by Martin Splitt, friendly internet fairy at the Webmaster Trends Analyst team
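As one way to act on the transpiling and polyfill advice above, here is a minimal sketch that loads a polyfill bundle based on feature detection rather than special-casing the Googlebot user agent; the bundle path is hypothetical.

```javascript
// Load a (hypothetical) polyfill bundle only when the browser actually lacks
// a feature, instead of serving different code to the Googlebot user agent.
const needsPolyfills =
  !('IntersectionObserver' in window) ||
  !('customElements' in window) ||
  !('fetch' in window);

if (needsPolyfills) {
  const script = document.createElement('script');
  script.src = '/polyfills.bundle.js'; // hypothetical bundle name
  document.head.appendChild(script);
}
```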

Google I/O 2019 - What sessions should SEOs and webmasters watch?

Google I/O 2019 is starting tomorrow and will run for 3 days, until Thursday. Google I/O is our yearly developer festival, where product announcements are made, new APIs and frameworks are introduced, and Product Managers present the latest from Google to an audience of 7,000+ developers who fly to California. However, you don't have to physically attend the event to take advantage of this once-a-year opportunity: many talks are live streamed on YouTube for anyone to watch. Browse the full schedule of events, including a list of talks that we think will be interesting for webmasters to watch (all talks are in English). All the links shared below will bring you to pages with more details about each talk, and links to watch the sessions will display on the day of each event. All times are Pacific Time (California time).

Tuesday, May 7th
4pm - Building Successful Websites: Case Studies for Mature and Emerging Markets, with Aancha Bahadur, Charlie Croom, Matt Doyle, Rudra Kasturi, and Jesar Shah

Wednesday, May 8th
10.30am - Enhance Your Search and Assistant Presence with Structured Data, with Aylin Alroik and Will Leszczuk
11.30am - Create App-like Experiences on Google Search and the Google Assistant, with Allen Harvey
11.30am - Rapidly Building Better Web Experiences with AMP, with Adam Greenberg and Naina Raisinghani
6.30pm - Unlocking New Capabilities for the Web, with Pete LePage and Thomas Steiner

Thursday, May 9th
10.30am - Google Search: State of the Union, with John Mueller and Martin Splitt
1.30pm - Google Search and JavaScript Sites, with Zoe Clifford and Martin Splitt

This list is only a small part of the agenda that we think is useful to webmasters and SEOs. There are many more sessions that you could find interesting! To learn about those other talks, check out the full list of "web" sessions, design sessions, Cloud sessions, machine learning sessions, and more. Use the filtering function to toggle the sessions on and off.

We hope you can make the time to watch the talks online and participate in the excitement of I/O! The videos will also be available on YouTube after the event, in case you can't tune in live.

Posted by Vincent Courson, Search Outreach Specialist

Monitoring structured data with Search Console

In our previous post in the structured data series, we discussed what structured data is and why you should add it to your site. We are committed to structured data and continue to enhance related Search features and improve our tools - that's why we have been creating solutions to help webmasters and developers implement and diagnose structured data. This post focuses on what you can do with Search Console to monitor and make the most out of structured data for your site. In addition, we have some new features that will help you even more. Below are the new additions; read on to learn more about them.

Unparsable structured data is a new report that aggregates structured data syntax errors.
New enhancement reports for Sitelinks searchbox and Logo.

Monitoring overall structured data performance

Every time Search Console detects a new issue related to structured data on a website, we send an email to account owners - but if an existing issue gets worse, it won't trigger an email, so it is still important for you to check your account periodically. This is not something you need to do every day, but we recommend you check it once in a while to make sure everything is working as intended. If your website development has defined cycles, it might be a good practice to log in to Search Console after changes are made to the website to monitor your performance.

If you'd like to have an overall idea of all the errors for a specific structured data feature on your site, you can navigate to the Enhancements menu in the left sidebar and click a feature. You'll find a summary of all errors and warnings, as well as the valid items. As mentioned above, we added a new set of reports to help you understand more types of structured data on your site: Sitelinks searchbox and Logo. They join the existing set of reports on Recipe, Event, Job Posting, and others. You can read more about the reports in the Search Console Help Center.

Here's an example of an Enhancement report; note that you can only see enhancements that have been detected in your pages. The report helps you with the following actions:

Review the trends of errors, warnings and valid items: To view each status issue separately, click the colored boxes above the bar chart.
Review warnings and errors per page: To see examples of pages which are currently affected by the issues, click a specific row below the bar chart.

Image: Enhancements report

We are also happy to launch the Unparsable Structured Data report, which aggregates parsing issues such as structured data syntax errors that prevented Google from identifying the feature type. That is the reason these issues are aggregated here instead of in the intended specific feature report. Check this report to see if Google was unable to parse any of the structured data you tried to add to your site. Parsing issues could point you to lost opportunities for rich results for your site. Below is a screenshot showing what the report looks like. You can access the report directly and read more about it in our help center.

Image: Unparsable Structured Data report

Testing structured data on a URL level

To make sure your pages were processed correctly and are eligible for rich results, or as a way to diagnose why some rich results are not surfacing for a specific URL, you can use the URL Inspection tool. This tool helps you understand areas of improvement at a URL level and gives you an idea of where to focus.
When you paste a URL into the search box at the top of Search Console, you can find what's working properly and warnings or errors related to your structured data in the enhancements section, as seen below for Recipes.

Image: URL Inspection tool

In the screenshot above, there is an error related to Recipes. If you click Recipes, information about the error displays, and you can click the little chart icon to the right of the error to learn more about it. Once you understand and fix the error, you can click Validate Fix (see screenshot below) so Google can start validating whether the issue is indeed fixed. When you click the Validate Fix button, Google runs several instantaneous tests. If your pages don't pass this test, Search Console provides you with an immediate notification. Otherwise, Search Console reprocesses the rest of the affected pages.

Image: Structured data error detail

We would love to hear your feedback on how Search Console has helped you and how it can help you even more with structured data. Send us feedback through Twitter or the Webmaster forum.

Posted by Daniel Waisberg, Search Advocate & Na'ama Zohary, Search Console team

Enriching Search Results Through Structured Data

For many years we have been recommending the use of structured data on websites to enable a richer search experience. When you add markup to your content, you help search engines understand the different components of a page. When Google's systems understand your page more clearly, Google Search can surface content through the cool features discussed in this post, which can enhance the user experience and get you more traffic.

We've worked hard to provide you with tools to understand how your websites are shown in Google Search results and whether there are issues you can fix. To help give a complete overview of structured data, we decided to do a series to explore it. This post provides a quick intro and discusses some best practices; future posts will focus on how to use Search Console to succeed with structured data.

What is structured data?

Structured data is a common way of providing information about a page and its content - we recommend using the schema.org vocabulary for doing so. Google supports three different formats of in-page markup: JSON-LD (recommended), Microdata, and RDFa. Different search features require different kinds of structured data - you can learn more about these in our search gallery. Our developer documentation has more details on the basics of structured data.

Structured data helps Google's systems understand your content more accurately, which means it's better for users, as they will get more relevant results. If you implement structured data, your pages may become eligible to be shown with an enhanced appearance in Google search results.

Disclaimer: Google does not guarantee that your structured data will show up in search results, even if your page is marked up correctly. Using structured data enables a feature to be present; it does not guarantee that it will be present. Learn more about structured data guidelines.

Sites that use structured data see results

Over the years, we've seen a growing adoption of structured data in the ecosystem. In general, rich results help users to better understand how your pages are relevant to their searches, so they translate into success for websites. Here are some results that are showcased in our case studies gallery:

Eventbrite leveraged event structured data and saw a 100% increase in the typical YOY growth of traffic from search.
Jobrapido integrated with the job experience on Google Search and saw a 115% increase in organic traffic, a 270% increase in new user registrations from organic traffic, and a 15% lower bounce rate for Google visitors to job pages.
Rakuten used the recipe search experience and saw a 2.7x increase in traffic from search engines and a 1.5x increase in session duration.

How to use structured data?

There are a few ways your site could benefit from structured data. Below we discuss some examples grouped by different types of goals: increase brand awareness, highlight content, and highlight product information.

1. Increase brand awareness

One thing you can do to promote your brand with structured data is to take advantage of features such as Logo, Local Business, and Sitelinks searchbox. In addition to adding structured data, you should verify your site for the Knowledge Panel and claim your business on Google My Business. Here is an example of the knowledge panel with a Logo.

2. Highlight content

If you publish content on the web, there are a number of features that can help promote your content and attract more users, depending on your industry.
For example: Article, Breadcrumb, Event, Job, Q&A, Recipe, Review, and others. Here is an example of a recipe rich result.

3. Highlight product information

If you sell merchandise, you could add Product structured data to your page, including price, availability, and review ratings. Here is how your product might show for a relevant search.

Try it and let us know

Now that you understand the importance of structured data, try our codelab to learn how to add it to your pages. Stay tuned to learn more about structured data; in the coming posts we'll be discussing how to use Search Console to better analyze your efforts. We would love to hear your thoughts and stories on how structured data works for you; send us any feedback either through Twitter or our forum.

Posted by Daniel Waisberg, Search Advocate
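To illustrate the JSON-LD format recommended above, here is a minimal sketch of the Organization markup behind the Logo feature; the organization details are hypothetical, and the printed JSON is what you would place inside a script type="application/ld+json" element on your homepage.

```javascript
// Minimal Logo / Organization markup (hypothetical organization details).
const organization = {
  '@context': 'https://schema.org',
  '@type': 'Organization',
  'url': 'https://www.example.com',
  'logo': 'https://www.example.com/images/logo.png'
};

// Print the JSON-LD you would embed on the page.
console.log(JSON.stringify(organization, null, 2));
```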

Instant-loading AMP pages from your own domain

Today we are rolling out support in Google Search's AMP web results (also known as "blue links") to link to signed exchanges, an emerging new feature of the web enabled by the IETF web packaging specification. Signed exchanges enable displaying the publisher's domain when content is instantly loaded via Google Search. This is available in browsers that support the necessary web platform feature - as of the time of writing, Google Chrome - and availability will expand to include other browsers as they gain support (e.g. the upcoming version of Microsoft Edge).

Background on AMP's instant loading

One of AMP's biggest user benefits has been the unique ability to instantly load AMP web pages that users click on in Google Search. Near-instant loading works by requesting content ahead of time, balancing the likelihood of a user clicking on a result with device and network constraints - and doing it in a privacy-sensitive way. We believe that privacy-preserving, instant-loading web content is a transformative user experience, but in order to accomplish this, we had to make trade-offs; namely, the URLs displayed in browser address bars begin with google.com/amp, as a consequence of being shown in the Google AMP Viewer, rather than displaying the domain of the publisher. We heard both user and publisher feedback about this, and last year we identified a web platform innovation that provides a solution that shows the content's original URL while still retaining AMP's instant loading.

Introducing signed exchanges

A signed exchange is a file format, defined in the web packaging specification, that allows the browser to trust a document as if it belongs to your origin. This allows you to use first-party cookies and storage to customize content and simplify analytics integration. Your page appears under your URL instead of the google.com/amp URL. Google Search links to signed exchanges when the publisher, browser, and the Search experience context all support it. As a publisher, you will need to publish the signed exchange version of the content in addition to the non-signed exchange version. Learn more about how Google Search supports signed exchanges.

Getting started with signed exchanges

Many publishers have already begun to publish signed exchanges since the developer preview opened up last fall. To implement signed exchanges in your own serving infrastructure, follow the guide "Serve AMP using Signed Exchanges" available at amp.dev. If you use a CDN provider, ask them if they can provide AMP signed exchanges. Cloudflare has recently announced that it is offering signed exchanges to all of its customers free of charge.

Check out our resources like the webmaster community or get in touch with members of the AMP Project with any questions. You can also provide feedback on the signed exchange specification.

Posted by Devin Mullins and Greg Rogers

Search Console reporting for your site's Discover performance data

Discover is a popular way for users to stay up-to-date on all their favorite topics, even when they're not searching. To provide publishers and sites visibility into their Discover traffic, we're adding a new report in Google Search Console to share relevant statistics and help answer questions such as:

How often is my site shown in Discover? How large is my traffic?
Which pieces of content perform well in Discover?
How does my content perform differently in Discover compared to traditional search results?

A quick reminder: What is Discover?

Discover is a feature within Google Search that helps users stay up-to-date on all their favorite topics, without needing a query. Users get to their Discover experience in the Google app, on the Google.com mobile homepage, and by swiping right from the homescreen on Pixel phones. It has grown significantly since launching in 2017 and now helps more than 800M monthly active users get inspired and explore new information by surfacing articles, videos, and other content on topics they care most about. Users have the ability to follow topics directly or let Google know if they'd like to see more or less of a specific topic. In addition, Discover isn't limited to what's new. It surfaces the best of the web regardless of publication date, from recipes and human interest stories, to fashion videos and more. Here is our guide on how you can optimize your site for Discover.

Discover in Search Console

The new Discover report is shown to websites that have accumulated meaningful visibility in Discover, with data shown back to March 2019. We hope this report is helpful in thinking about how you might optimize your content strategy to help users discover engaging information - both new and evergreen. For questions or comments on the report, feel free to drop by our webmaster help forums, or contact us through our other channels.

Posted by Michael Huzman, Ariel Kroszynski

User experience improvements with page speed in mobile search

To help users find the answers to their questions faster, we included page speed as a ranking factor for mobile searches in 2018. Since then, we've observed improvements on many pages across the web. We want to recognize the performance improvements webmasters have made over the past year. A few highlights:

For the slowest one-third of traffic, we saw user-centric performance metrics improve by 15% to 20% in 2018. As a comparison, no improvement was seen in 2017.
We observed improvements across the whole web ecosystem. On a per-country basis, more than 95% of countries had improved speeds.
When a page is slow to load, users are more likely to abandon the navigation. Thanks to these speed improvements, we've observed a 20% reduction in abandonment rate for navigations initiated from Search, a metric that site owners can now also measure via the Network Error Logging API available in Chrome (a sketch of enabling it appears at the end of this post).
In 2018, developers ran over a billion PageSpeed Insights audits to identify performance optimization opportunities for over 200 million unique URLs.

Great work and thank you! We encourage all webmasters to optimize their sites' user experience. If you're unsure how your pages are performing, the following tools and documents can be useful:

PageSpeed Insights provides page analysis and optimization recommendations.
Google Chrome User Experience Report provides the user experience metrics for how real-world Chrome users experience popular destinations on the web.
Documentation on performance on Web Fundamentals.

For any questions, feel free to drop by our help forums (like the webmaster community) to chat with other experts.

Posted by Genqing Wu and Doantam Phan
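Here is a hedged sketch of enabling the Network Error Logging API mentioned above by setting the NEL and Report-To response headers from an express server; the reporting endpoint, group name, and max_age value are assumptions for illustration, and the exact report contents are defined by the NEL and Reporting API specifications.

```javascript
// Enable Network Error Logging for all responses (illustrative values only).
const express = require('express');
const app = express();

app.use((req, res, next) => {
  res.set('Report-To', JSON.stringify({
    group: 'network-errors',
    max_age: 2592000, // 30 days
    endpoints: [{ url: 'https://reports.example.com/nel' }] // hypothetical endpoint
  }));
  res.set('NEL', JSON.stringify({
    report_to: 'network-errors',
    max_age: 2592000
  }));
  next();
});

app.get('/', (req, res) => res.send('Hello'));
app.listen(8080);
```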

This year in Search Spam - Webspam report 2018

Google aims to provide the highest quality results for any search. As part of this, we take action to prevent what we call "webspam" from degrading the search experience - content and behaviors that violate our webmaster guidelines. Our efforts help ensure that well under 1 percent of results visited by users are for spammy pages. Here's more about how we fought webspam in 2018.

Google webspam trends and how we fought webspam in 2018

Of the types of spam we fought in 2018, three continue to stand out:

Spam on hacked websites: We reported in 2017 that we had seen a substantial reduction of spam from hacked websites in search results. This trend continued in 2018, with faster discovery of hacked web pages before they affect search results or put someone in harm's way. While we reduced how spam on hacked sites affects search, hacked websites remain a major security problem affecting the safety of the web. Even though we can't prevent a website hack from happening, we're committed to helping webmasters whose websites have been compromised by offering resources to help them recover from a hacked website.

User-generated spam: A particular type of spam known as user-generated spam has been a continued focus for us. User-generated spam includes spammy posts on forums, as well as spammy accounts on free blogs and platforms, none of which are meant to be consumed by human beings, and all of which disrupt conversations while adding no value to users. In 2018, we were able to reduce the impact on search users from this type of spam by more than 80%. While we can't prevent websites from being exploited, we do want to make it easier for website owners to learn how to protect themselves, which is why we provide resources on how to prevent abuse of your site's public areas.

Link spam: We continued to protect the value of authoritative and relevant links as an important ranking signal for Search. We continued to deal swiftly with egregious link spam, and made a number of bad linking practices less effective for manipulating ranking. Above all, we continued to engage with webmasters and SEOs to chip away at the many myths that have emerged over the years relating to linking practices. We continued to remind website owners that if you simply stay away from building links mainly as an attempt to rank better and focus on creating great content, you should not have to worry about any of the myths or realities.

We think that one of the best ways of fighting spam of all types is by encouraging website owners to just create great quality content. Resources such as the SEO starter guide highlight best practices and bust some common myths and misconceptions related to what it takes to appear well in Google Search results. Reporting link spam is also a great way to assist us in fighting against this type of abuse and to help preserve fairness in Search ranking.

Working with users, webmasters and developers for a better web

Every day, users continue to help us find spam, malware and other issues in Search that escape our filters and processes by reporting spam on search, reporting phishing, or reporting malware. We received over 180,000 search spam user reports and we were able to take action on 64% of the reports we processed. These reports truly make a difference and we'd like to thank all of you who submitted them. We think it's important to let website owners know when we detect something wrong with their website.
In 2018, we generated over 186 million messages to website owners calling out potential improvements, issues and problems that could affect their site's appearance in Search results. We can only deliver these notifications to site owners who have verified their sites in Search Console, and we successfully delivered 96 million of those messages. The rest of the messages will be kept linked with the website for as long as they are relevant, so they can be seen when a webmaster successfully registers their site in Search Console. The majority of these messages were welcoming new users to Search Console, and the second largest group was informing registered Search Console users when mobile-first indexing became available. Of all messages, slightly over 2% - about 4 million - were related to manual actions resulting from violations of our Webmaster Guidelines.

High quality content keeps spam off of search results, and we continued to improve the tools and reports we offer for webmasters that create that content. Google Search Console was completely rebuilt from the ground up to provide both new and improved reports (Performance, Index Coverage, Links, Mobile Usability), as well as brand new features (the URL Inspection tool and Site and User management). This improved Search Console graduated out of beta in 2018 and is now generally available to all registered website owners.

We didn't forget the front-end developers who make the modern web work, and focused on helping them make their sites great for users and also search-friendly, regardless of whether they are on a CMS, roll their own CSS and JS, or build on top of a web framework. With the new SEO audit capability in Lighthouse, the open-source and automated auditing tool for improving the quality of web pages, developers and webmasters can now run actionable SEO health-checks on their pages and quickly identify areas for improvement.

We also engage directly with website owners to provide help with thorny issues. Our dedicated team members meet with webmasters around the world regularly, both online and in person. We delivered more than 190 online office hours, online events and offline events in more than 76 cities, to audiences totaling over 170,000, including SEOs, developers and online marketers. We hosted four search events in Tokyo, Singapore, Zurich and Osaka, as well as an 11-city Search Conference in India. In 2018, we started live office hours in Spanish on top of English, French, German, Hindi and Japanese, where webmasters can find help, tips and useful discussion on our Google Webmaster YouTube channel. Product experts continued to help webmasters find solutions through our official support forums in over a dozen languages.

We look forward to continuing our work to deliver a spam-free Search experience to all in 2019!

Posted by Juan Felipe Rincón, Webmaster Outreach, Dublin

How to discover & suggest Google-selected canonical URLs for your pages

Sometimes a web page can be reached by using more than one URL. In such cases, Google tries to determine the best URL to display in search and to use in other ways. We call this the "canonical URL." There are ways site owners can help us better determine what should be the canonical URLs for their content.

If you suspect we've not selected the best canonical URL for your content, you can check by entering your page's address into the URL Inspection tool within Search Console. It will show you the Google-selected canonical. If you believe there's a better canonical that should be used, follow the steps on our duplicate URLs help page on how to suggest a preferred choice for consideration.

Please be aware that if you search using the site: or inurl: commands, you will be shown the domain you specified in those, even if it isn't the Google-selected canonical. This happens because we're fulfilling the exact request entered. Behind the scenes, we still use the Google-selected canonical, including when people see pages without using the site: or inurl: commands.

We've also changed the URL Inspection tool so that it will display any Google-selected canonical for a URL, not just those for properties you manage in Search Console. With this change, we're also retiring the info: command. This was an alternative way of discovering canonicals. It was relatively underused, and the URL Inspection tool provides a more comprehensive solution to help publishers with URLs.

Posted by John Mueller, Google Switzerland

Help Google Search know the best date for your web page

Sometimes, Google shows dates next to listings in its search results. In this post, we'll answer some commonly asked questions webmasters have about how these dates are determined and provide some best practices to help improve their accuracy.

How dates are determined

Google shows the date of a page when its automated systems determine that it would be relevant to do so, such as for pages that can be time-sensitive, including news content. Google determines a date using a variety of factors, including but not limited to any prominent date listed on the page itself or dates provided by the publisher through structured markup. Google doesn't depend on one single factor because all of them can be prone to issues. Publishers may not always provide a clear visible date. Sometimes, structured data may be lacking or may not be adjusted to the correct time zone. That's why our systems look at several factors to come up with what we consider to be our best estimate of when a page was published or significantly updated.

How to specify a date on a page

To help Google pick the right date, site owners and publishers should:

Show a clear date: Show a visible date prominently on the page.
Use structured data: Use the datePublished and dateModified schema with the correct time zone designator for AMP or non-AMP pages. When using structured data, make sure to use the ISO 8601 format for dates (see the sketch at the end of this post).

Guidelines specific to Google News

Google News requires clearly showing both the date and the time that content was published or updated. Structured data alone is not enough, though it is recommended in addition to a visible date and time. Date and time should be positioned between the headline and the article text. For more guidance, also see our help page about article dates. If an article has been substantially changed, it can make sense to give it a fresh date and time. However, don't artificially freshen a story without adding significant information or some other compelling reason for the freshening. Also, do not create a very slightly updated story from one previously published, then delete the old story and redirect to the new one. That's against our article URLs guidelines.

More best practices for dates on web pages

In addition to the most important requirements listed above, here are additional best practices to help Google determine the best date to show for a web page:

Show when a page has been updated: If you update a page significantly, also update the visible date (and time, if you display that). If desired, you can show two dates: when a page was originally published and when it was updated. Just do so in a way that's visually clear to your readers. If showing both dates, it's also highly recommended to use datePublished and dateModified for AMP or non-AMP pages to make it easier for algorithms to recognize.
Use the right time zone: If specifying a time, make sure to provide the correct timezone, taking into account daylight saving time as appropriate.
Be consistent in usage: Within a page, make sure to use exactly the same date (and, potentially, time) in structured data as well as in the visible part of the page. Make sure to use the same timezone if you specify one on the page.
Don't use future dates or dates related to what a page is about: Always use a date for when a page itself was published or updated, not a date linked to something like an event that the page is writing about, especially for events or other subjects that happen in the future (you may use Event markup separately, if appropriate).
Follow Google's structured data guidelines: While Google doesn't guarantee that a date (or structured data in general) specified on a page will be used, following our structured data guidelines does help our algorithms to have it available in a machine-readable way.
Troubleshoot by minimizing other dates on the page: If you've followed the best practices above and find incorrect dates are being selected, consider if you can remove or minimize other dates that may appear on the page, such as those that might be next to related stories.

We hope these guidelines help to make it easier to specify the right date on your website's pages! For questions or comments on this, or other structured data topics, feel free to drop by our webmaster help forums.

Posted by John Mueller, Developer Advocate, Zurich
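To illustrate the datePublished and dateModified recommendations above, here is a minimal sketch using ISO 8601 dates with an explicit time zone designator; the article details, dates, and time zone offset are hypothetical, and the visible date on the page should match these values exactly.

```javascript
// Minimal Article markup with ISO 8601 dates and a time zone designator
// (hypothetical article details).
const articleDates = {
  '@context': 'https://schema.org',
  '@type': 'Article',
  'headline': 'Example headline',
  'datePublished': '2019-04-05T08:00:00+09:00',
  'dateModified': '2019-04-07T14:30:00+09:00'
};

// Print the JSON-LD you would place in a script type="application/ld+json" element.
console.log(JSON.stringify(articleDates, null, 2));
```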

Dynamic Rendering with Rendertron

Many frontend frameworks rely on JavaScript to show content. This can mean Google might take some time to index your content or update the indexed content. A workaround we discussed at Google I/O this year is dynamic rendering. There are many ways to implement this. This blog post shows an example implementation of dynamic rendering using Rendertron, which is an open source solution based on headless Chromium.

Which sites should consider dynamic rendering?

Not all search engines or social media bots visiting your website can run JavaScript. Googlebot might take time to run your JavaScript and has some limitations, for example. Dynamic rendering is useful for content that changes often and needs JavaScript to display. Your site's user experience (especially the time to first meaningful paint) may benefit from considering hybrid rendering (for example, Angular Universal).

How does dynamic rendering work?

Dynamic rendering means switching between client-side rendered and pre-rendered content for specific user agents. You will need a renderer to execute the JavaScript and produce static HTML. Rendertron is an open source project that uses headless Chromium to render. Single Page Apps often load data in the background or defer work to render their content. Rendertron has mechanisms to determine when a website has completed rendering. It waits until all network requests have finished and there is no outstanding work.

This post covers:

Take a look at a sample web app
Set up a small express.js server to serve the web app
Install and configure Rendertron as a middleware for dynamic rendering

The sample web app

The "kitten corner" web app uses JavaScript to load a variety of cat images from an API and displays them in a grid. Cute cat images in a grid and a button to show more - this web app truly has it all! Here is the JavaScript:

```javascript
const apiUrl = 'https://api.thecatapi.com/v1/images/search?limit=50';

const tpl = document.querySelector('template').content;
const container = document.querySelector('ul');

function init () {
  fetch(apiUrl)
  .then(response => response.json())
  .then(cats => {
    container.innerHTML = '';
    cats
      .map(cat => {
        const li = document.importNode(tpl, true);
        li.querySelector('img').src = cat.url;
        return li;
      }).forEach(li => container.appendChild(li));
  })
}

init();

document.querySelector('button').addEventListener('click', init);
```

The web app uses modern JavaScript (ES6), which isn't supported in Googlebot yet. We can use the mobile-friendly test to check if Googlebot can see the content. The mobile-friendly test shows that the page is mobile-friendly, but the screenshot is missing all the cats! The headline and button appear, but none of the cat pictures are there. While this problem is simple to fix, it's a good exercise to learn how to set up dynamic rendering.
Dynamic rendering will allow Googlebot to see the cat pictures without changes to the web app code.

Set up the server

To serve the web application, let's use express, a node.js library, to build web servers. The server code looks like this (find the full project source code here):

```javascript
const express = require('express');
const app = express();

const DIST_FOLDER = process.cwd() + '/docs';
const PORT = process.env.PORT || 8080;

// Serve static assets (images, css, etc.)
app.get('*.*', express.static(DIST_FOLDER));

// Point all other URLs to index.html for our single page app
app.get('*', (req, res) => {
  res.sendFile(DIST_FOLDER + '/index.html');
});

// Start Express Server
app.listen(PORT, () => {
  console.log(`Node Express server listening on http://localhost:${PORT} from ${DIST_FOLDER}`);
});
```

You can try the live example here - you should see a bunch of cat pictures if you are using a modern browser. To run the project from your computer, you need node.js to run the following commands:

npm install --save express rendertron-middleware
node server.js

Then point your browser to http://localhost:8080. Now it's time to set up dynamic rendering.

Deploy a Rendertron instance

Rendertron runs a server that takes a URL and returns static HTML for the URL by using headless Chromium. We'll follow the recommendation from the Rendertron project and use Google Cloud Platform.

Image: The form to create a new Google Cloud Platform project

Please note that you can get started with the free usage tier; using this setup in production may incur costs according to the Google Cloud Platform pricing.

1. Create a new project in the Google Cloud console. Take note of the "Project ID" below the input field.
2. Install the Google Cloud SDK as described in the documentation and log in.
3. Clone the Rendertron repository from GitHub with:

git clone https://github.com/GoogleChrome/rendertron.git
cd rendertron

4. Run the following commands to install dependencies and build Rendertron on your computer:

npm install && npm run build

5. Enable Rendertron's cache by creating a new file called config.json in the rendertron directory with the following content:

{ "datastoreCache": true }

6. Run the following command from the rendertron directory. Substitute YOUR_PROJECT_ID with your project ID from step 1:

gcloud app deploy app.yaml --project YOUR_PROJECT_ID

7. Select a region of your choice and confirm the deployment. Wait for it to finish.
8. Enter the URL YOUR_PROJECT_ID.appspot.com (substitute YOUR_PROJECT_ID with your actual project ID from step 1) in your browser. You should see Rendertron's interface with an input field and a few buttons.

Image: Rendertron's UI after deploying to Google Cloud Platform

When you see the Rendertron web interface, you have successfully deployed your own Rendertron instance. Take note of your project's URL (YOUR_PROJECT_ID.appspot.com) as you will need it in the next part of the process.

Add Rendertron to the server

The web server is using express.js and Rendertron has an express.js middleware. Run the following command in the directory of the server.js file:

npm install --save rendertron-middleware

This command installs the rendertron-middleware from npm so we can add it to the server:

```javascript
const express = require('express');
const app = express();
const rendertron = require('rendertron-middleware');
```

Configure the bot list

Rendertron uses the user-agent HTTP header to determine if a request comes from a bot or a user's browser. It has a well-maintained list of bot user agents to compare with.
By default, this list does not include Googlebot, because Googlebot can execute JavaScript. To make Rendertron render Googlebot requests as well, add Googlebot to the list of user agents:

```javascript
const BOTS = rendertron.botUserAgents.concat('googlebot');
const BOT_UA_PATTERN = new RegExp(BOTS.join('|'), 'i');
```

Rendertron compares the user-agent header against this regular expression later.

Add the middleware

To send bot requests to the Rendertron instance, we need to add the middleware to our express.js server. The middleware checks the requesting user agent and forwards requests from known bots to the Rendertron instance. Add the following code to server.js, and don't forget to substitute "YOUR_PROJECT_ID" with your Google Cloud Platform project ID:

```javascript
app.use(rendertron.makeMiddleware({
  proxyUrl: 'https://YOUR_PROJECT_ID.appspot.com/render',
  userAgentPattern: BOT_UA_PATTERN
}));
```

Bots requesting the sample website receive the static HTML from Rendertron, so the bots don't need to run JavaScript to display the content.

Testing our setup

To test if the Rendertron setup was successful, run the mobile-friendly test again. Unlike the first test, the cat pictures are visible. In the HTML tab we can see all the HTML that the JavaScript code generated, and that Rendertron has removed the need for JavaScript to display the content.

Conclusion

You created a dynamic rendering setup without making any changes to the web app. With these changes, you can serve a static HTML version of the web app to crawlers.

Posted by Martin Splitt, Open Web Unicorn

Introducing a new JavaScript SEO video series

We made a new video series on JavaScript SEO that benefits both web developers and SEOs. In the series, we want to help you make web apps discoverable with JavaScript. JavaScript is popular because it allows developers to build more engaging web applications. JavaScript frameworks are widely used as they:

Improve developer productivity by providing useful utilities and tooling
Allow faster prototyping cycles thanks to their ecosystems of components and libraries
Help structure the code even in larger application codebases

JavaScript also brings a few new considerations and challenges to SEO. Some of the considerations are strategic and some are more technical. In the video series, we'll cover:

The difference between classic and JavaScript sites
How Google Search crawls, renders, and indexes JavaScript content
SEO fundamentals for React, Angular, and Vue
Tools to test and debug a JavaScript site
What dynamic rendering is and how to set it up with Rendertron

Check out the JavaScript SEO YouTube playlist and subscribe to the Google Webmasters channel to get the weekly episodes when they go online. We are looking forward to your feedback and are all ears for your input on further episodes. You can reach us through the Webmaster Forum, the Google Webmasters Twitter account, or in the YouTube comments under the videos.

Posted by Martin Splitt, friendly web fairy & series host, WTA team

Announcing domain-wide data in Search Console

Google recommends verifying all versions of a website -- http, https, www, and non-www -- in order to get the most comprehensive view of your site in Google Search Console. Unfortunately, many separate listings can make it hard for webmasters to understand the full picture of how Google "sees" their domain as a whole. To make this easier, today we're announcing "domain properties" in Search Console, a way of verifying and seeing the data from Google Search for a whole domain.

Domain properties show data for all URLs under the domain name, including all protocols, subdomains, and paths. They give you a complete view of your website across Search Console, reducing the need to manually combine data. So regardless of whether you use m-dot URLs for mobile pages, or are (finally) getting the migration to HTTPS set up, Search Console will be able to help with a complete view of your site's data with regards to how Google Search sees it.

If you already have DNS verification set up, Search Console will automatically create new domain properties for you over the next few weeks, with data in all reports. Otherwise, to add a new domain property, go to the property selector, add a new domain property, and use DNS verification.

We recommend using domain properties where possible going forward. Domain properties were built based on your feedback; thank you again for everything you've sent our way over the years! We hope this makes it easier to manage your site, and to get a complete overview without having to manually combine data. Should you have any questions, feel free to drop by our help forums, or leave us a comment on Twitter. And as always, you can also use the feedback feature built into Search Console as well.

Posted by Erez Bixon, Search Console Team

Help customers discover your products on Google

People come to Google to discover new brands and products throughout their shopping journey. On Search and Google Images, shoppers are provided with rich snippets like product descriptions, ratings, and prices to help guide purchase decisions. Connecting potential customers with up-to-date and accurate product information is key to successful shopping journeys on Google, so today we're introducing new ways for merchants to provide this information to improve results for shoppers.

Search Console

Many retailers and brands add structured data markup to their websites to ensure Google understands the products they sell. A new report for 'Products' is now available in Search Console for sites that use schema.org structured data markup to annotate product information. The report allows you to see any pending issues for markup on your site. Once an issue is fixed, you can use the report to validate whether your issues were resolved by re-crawling your affected pages. Learn more about the rich result status reports.

Merchant Center

While structured data markup helps Google properly display your product information when we crawl your site, we are expanding capabilities for all retailers to directly provide up-to-date product information to Google in real time. Product data feeds uploaded to Google Merchant Center will now be eligible for display in results on surfaces like Search and Google Images. This product information will be ranked based only on relevance to users' queries, and no payment is required or accepted for eligibility. We're starting with the expansion in the US, and support for other countries will be announced later in the year.

Get started

You don't need a Google Ads campaign to participate. If you don't have an existing account and sell your products in the US, create a Merchant Center account and upload a product data feed.

Manufacturer Center

We're also rolling out new features to improve your brand's visibility and help customers find your products on Google by providing authoritative and up-to-date product information through Google Manufacturer Center. This information includes product descriptions, variants, and rich content, such as high-quality images and videos that can show on the product's knowledge panel.

These solutions give you multiple options to better reach and inform potential customers about your products as they shop across Google. If you have any questions, be sure to post in our forum.

Posted by Bernhard Schindlholzer, Product Manager for Google Merchant Tools
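As a concrete example of the product structured data mentioned above, here is a hedged sketch of Product markup with a price, availability, and a rating; all product details are hypothetical, and the printed JSON is what you would place inside a script type="application/ld+json" element on the product page.

```javascript
// Minimal Product markup with an offer and an aggregate rating
// (hypothetical product details).
const productData = {
  '@context': 'https://schema.org',
  '@type': 'Product',
  'name': 'Example running shoe',
  'image': 'https://www.example.com/shoe.jpg',
  'description': 'A lightweight running shoe.',
  'offers': {
    '@type': 'Offer',
    'price': '79.99',
    'priceCurrency': 'USD',
    'availability': 'https://schema.org/InStock'
  },
  'aggregateRating': {
    '@type': 'AggregateRating',
    'ratingValue': '4.4',
    'reviewCount': '89'
  }
};

// Print the JSON-LD you would embed on the product page.
console.log(JSON.stringify(productData, null, 2));
```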

Consolidating your website traffic on canonical URLs

In Search Console, the Performance report currently credits all page metrics to the exact URL that the user is referred to by Google Search. Although this provides very specific data, it makes property management more difficult; for example, if your site has mobile and desktop versions on different properties, you must open multiple properties to see all your Search data for the same piece of content.

To help unify your data, Search Console will soon begin assigning search metrics to the (Google-selected) canonical URL, rather than the URL referred to by Google Search. This change has several benefits:

It unifies all search metrics for a single piece of content into a single URL: the canonical URL. This shows you the full picture about a specific piece of content in one property.
For users with separate mobile or AMP pages, it unifies all (or most, since some mobile URLs may end up as canonical) of your data to a single property (the "canonical" property).
It improves the usability of the AMP and Mobile-Friendly reports. These reports currently show issues in the canonical page property, but show the impression in the property that owns the actual URL referred to by Google Search. After this change, the impressions and issues will be shown in the same property.

When will this happen?

We plan to transition all performance data on April 10, 2019. In order to provide continuity to your data, we will pre-populate your unified data beginning from January 2018. We will also enable you to view both old and new versions for a few weeks during the transition to see the impact and understand the differences. API and Data Studio users: the Search Console API will change to canonical data on April 10, 2019.

How will this affect my data?

At an individual URL level, you will see traffic shift from any non-canonical (duplicate) URLs to the canonical URL. At the property level, you will see data from your alternate property (for example, your mobile site) shifted to your "canonical property". Your alternate property traffic probably won't drop to zero in Search Console because canonicalization is at the page, not the property, level, and your mobile property might have some canonical pages. However, for most users, most property-level data will shift to one property. AMP property traffic will drop to zero in most cases (except for self-canonical pages). You will still be able to filter data by device, search appearance (such as AMP), country, and other dimensions without losing important information about your traffic. You can see some examples of these traffic changes below.

Preparing for the change

Consider whether you need to change user access to your various properties; for example, do you need to add new users to your canonical property, or do existing users continue to need access to the non-canonical properties?
Modify any custom traffic reports you might have created in order to adapt for this traffic shift.
If you need to learn the canonical URL for a given URL, you can use the URL Inspection tool.
If you want to save your traffic data calculated using the current system, you should download your data using either the Performance report's Export Data button, or the Search Console API (a sketch appears at the end of this post).

Examples

Here are a few examples showing how data might change on your site. In these examples, you can see how your traffic numbers would change between a canonical site (called example.com) and an alternate site (called m.example.com).
Important: In these examples, the desktop site contains all the canonical pages and the mobile site contains all the alternate pages. In the real world, your desktop site might contain some alternate pages and your mobile site might contain some canonical pages. You can determine the canonical for a given URL using the URL Inspection tool.

Total traffic: In the current version, some of your traffic is attributed to the canonical property and some to the alternate property. The new version should attribute all of your traffic to the canonical property.

Image: Charts comparing the canonical property (http://example.com) and the alternate property (http://m.example.com) before and after the change; traffic shifted from the alternate property to the canonical property (for example, +0.7K and +3K on the canonical side, with the corresponding -0.7K and -3K on the alternate side).

Individual page traffic: You can see traffic changes between the duplicate and canonical URLs for individual pages in the Pages view. The next example shows how traffic that used to be split between the canonical and alternate pages is now all attributed to the canonical URL.

Image: Charts showing individual page traffic shifting to the canonical URL (for example, +150 and +800 on the canonical page, with -150 and -800 on the alternate page).

Mobile traffic: In the current version, all of your mobile traffic was attributed to your m-dot property. The new version attributes all traffic to your canonical property when you apply the "Device: Mobile" filter.

Image: Charts showing mobile traffic moving from the alternate property to the canonical property (for example, +0.7K and +3K versus -0.7K and -3K).

In conclusion

We know that this change might seem a little confusing at first, but we're confident that it will simplify your job of tracking traffic data for your site. If you have any questions or concerns, please reach out on the Webmaster Help Forum.

Posted by John Mueller, Developer Advocate, Zurich
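For the step above about downloading your current data through the Search Console API, here is a hedged sketch using the googleapis Node.js client; authentication setup is omitted, and the site URL, date range, and row limit are placeholders rather than recommended values.

```javascript
// Export per-page performance data through the Search Console API
// (searchanalytics.query). Call exportPageData() with an authorized client,
// for example one obtained via google.auth; credentials setup is not shown.
const { google } = require('googleapis');

async function exportPageData(authClient) {
  const webmasters = google.webmasters({ version: 'v3', auth: authClient });
  const res = await webmasters.searchanalytics.query({
    siteUrl: 'https://example.com/', // hypothetical property
    requestBody: {
      startDate: '2019-01-01',
      endDate: '2019-03-31',
      dimensions: ['page'],
      rowLimit: 5000
    }
  });
  // Each row contains clicks, impressions, CTR, and position per page URL.
  (res.data.rows || []).forEach(row => {
    console.log(row.keys[0], row.clicks, row.impressions);
  });
}
```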
