Here are the notes for the latest Google Help Hangout. In this article, we will share with you what we think are the most important things we learned from this video. Then, we'll follow with a full transcript.
New: We have written these in a format that should allow for easy quoting for those of you who are looking for quotes from Googlers in order to support your recommendations.
Google's algorithms like to see dates on articles
John was asked whether we should be displaying a "last updated" date alone, or whether we should also show the first published date on our articles. It is important to note that the Quality Raters' Guidelines mention in several places how important it is for YMYL (Your Money or Your Life) sites to stay updated:
Here is what John said:
"That's something that you can do either way. Personally, I like having both of these dates there. There's also structured data that you can use for both of these dates. So if you provide it on your articles, I would also look into the structured data and try to use that as well. I think both of these dates can be useful for our algorithms."
We thought it was interesting that John said Google's algorithms find both the published date and the "last updated" date useful. Now, he didn't say that this info is specifically used in ranking algorithms. But, it's possible!
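The structured data John mentions maps to schema.org's Article markup, which has `datePublished` and `dateModified` properties. Here is a minimal sketch of how that JSON-LD could be built; the headline and dates are invented examples, not values from the hangout.

```python
import json

# Sketch of Article structured data carrying both dates John mentions.
# "Article", "datePublished", and "dateModified" are real schema.org
# names; the headline and the dates themselves are made up.
article_ld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example article headline",
    "datePublished": "2018-03-01T09:00:00+00:00",
    "dateModified": "2018-08-20T14:30:00+00:00",
}

# In practice this JSON would sit inside a
# <script type="application/ld+json"> tag in the page's HTML.
print(json.dumps(article_ld, indent=2))
```

Showing both dates visibly on the page and repeating them in the markup keeps things consistent, which John stresses later in the hangout.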
How does Google decide which updates to share with us?
Barry Schwartz asked how Google decides upon which algorithm updates to confirm. John said the following:
"For the most part, there is no kind of explicit guideline where we say if it crosses this specific metric and it goes this far that we would announce that explicitly. One aspect that from my side plays a strong role here is whether or not there's something really actionable from the site owner point of view with regards to this change. So, things like page speed, or like the mobile friendliness update, the mobile indexing changes, those are all things where we do make changes in ranking and it's something that a webmaster can explicitly influence. So, if we tell them about this, like: hey, we're going to do this because we think it's the right thing, then they can take that and say: ok, I can work on this and I can make changes to my website to make sure it matches these expectations. That's something from our point of view that definitely makes sense to announce.
The general kind of relevance updates that we have in search, the core ranking updates, those are things that happen all of the time, more or less on a daily basis, and for a large part these are just small shifts that happen in search. And it's not something where we'd be able to tell people: hey, we think your site is less relevant for this query, therefore you should make it more relevant, because that's not really that useful feedback. So, it's something where we tend not to announce those too broadly, and sometimes when a lot of people start talking about these kinds of updates and we see that people are really confused, sometimes I'll say: ok, fine, we said we wouldn't talk about this, but maybe we just want to confirm that we actually did make some changes here, it's not that you're seeing ghosts, and there's nothing explicit that you can kind of work on directly with regards to these changes. So that sometimes comes into play there."
John went on to clarify that it's not the fact that the community is making noise that makes Google come out and give clarification, but rather, when there seems to be a lot of confusion in the SEO community.
More information on author bios
John was asked to talk more about the importance of an author's expertise. (This is related to E-A-T: Expertise, Authoritativeness, and Trustworthiness.)
"We used to have the authorship markup on pages; we don't have that anymore, or at least we don't use that anymore. But otherwise it's also something that I think users care about quite a bit, so instead of thinking about this as an SEO problem, I would think about this more as a user problem and figure out: How do you gain a user's trust? How do you make it clear to them that the content that you're providing is actually relevant content? That it's actually something that they can trust?

So, I would see that more as a user experience type issue here rather than a pure SEO issue. Obviously search algorithms try to understand what users are thinking and to rank results in a way that they would find relevant. So, some of that could play into that as well, more indirectly probably, but primarily you can work on this on the user level. Do user studies and see how users react to that, or if they care about it at all; maybe for your specific site it doesn't really matter."
What we believe John is saying here is that writing good author bios makes sense from the perspective of making things better for users. You want to do everything you can to show your potential customers why they can trust the information in your articles.
Be careful having auto-translated content on your site!
When asked about using machine translation, John said this:
“For auto-translated content, we prefer not to be able to index that. So if you're using an API to translate parts of the content on your website, then ideally you'd be doing that in a way that Google doesn't actually index that content, because for the most part this automatically translated content isn't really that high quality, and that can really throw us off a bit. So the other option that you mentioned there, letting the user translate by clicking a button, I tend towards that a little bit more.”
The take home message here is that poorly translated content in Google's index is potentially a sign of low quality.
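One hedged way to act on John's advice is to mark machine-translated pages with a robots noindex meta tag so Google leaves them out of the index while human readers can still use them. The `<meta name="robots" content="noindex">` directive is standard; the helper function below is a hypothetical sketch, not anything Google prescribes.

```python
# Sketch: keep machine-translated pages out of Google's index with a
# robots noindex meta tag. The helper function is hypothetical; the
# noindex directive itself is a standard robots meta tag.
def robots_meta(machine_translated: bool) -> str:
    if machine_translated:
        return '<meta name="robots" content="noindex">'
    return ""  # human-written or human-translated pages stay indexable


print(robots_meta(True))   # emitted on a machine-translated page
print(robots_meta(False))  # regular page: no tag at all
```

The alternative John leans toward, a "translate" button that swaps the text client-side on demand, keeps the auto-translated text out of the indexed HTML entirely.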
If you like stuff like this, you'll love my newsletter!
My team and I report every week on the latest Google algorithm updates, news, and SEO tips.
Full transcript
Question 0:42 – If I decided to block all IPs that use proxies, is that something that Google could see as having a negative impact?
Answer 0:54 – “Probably not. So, as long as Googlebot can crawl your pages, we don't really care directly about what you do with the other IP addresses. Obviously if you're blocking legitimate users and they can't access your content, they can't buy anything, and they would be less likely to recommend it to other people. But that's ultimately up to you. So, some sites block users from certain countries or are strict with regards to the kind of abuse that happens from IP addresses and IP blocks; that's ultimately up to you.”
Question 1:33 – Some CDNs offer to hide your content based on the country of the visitor. So, for example, if somebody is coming from Korea, they get a CAPTCHA and they need to solve the CAPTCHA in order to see the content. And there is also another case, I believe this is what Cloudflare is doing, where they say: we are checking your browser. So in case our website has issues with security and we believe somebody is using some kind of automated tools to do something with our content. Is using CDNs hurting search performance in some way, because you're basically ruining the experience for people by slowing down the initial loading of the website and also cutting off the content from the user so it's not visible at the moment of the load?
Answer 2:30 – “That's ultimately between you and your users. So if you think that for your legitimate users, the ones that you care about, your website is fast and accessible, then that's fine. If you think that maybe you're accidentally blocking legitimate users using these systems, then that's probably something you want to fix. The important part from Google's point of view for SEO is just that we're able to access the content with the normal Googlebot IP addresses, and I believe a lot of these systems also have a provision for automatically allowing search engine bots, so yes, probably that will be okay.”
Question 3:35 – We are a tech team that implemented client-side redirection for most of the URLs on our website. We load the pages from NGINX and then add parameters to the URL itself. The content is the same but is shown both through different parameters and through our clean URL. Is there an issue with that? Can Google see that the URLs are different but the content is the same?
Answer 4:16 - “I think you'd want to be careful in a case like that, because it kind of depends on how Google is able to crawl through these different URLs and what happens with those different URLs. So, for example, if you redirect to a URL with a session ID, let's say, in a worst case, then that means we will find this session ID URL, we'll try to crawl and index it, and then in a second step we'll see: oh, it's the same content as we've seen before with a different session ID. So it's a lot of extra work for us to crawl and index all of this content, and that's something I'd generally try to avoid. Just serving different content, maybe like personalization, in that sense that's less of an issue, but if you're changing the URLs on the fly, then that can make things a little bit more complicated with canonicalization, with us knowing which URLs belong together, all of that.”
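A common way to help Google group these parameterized duplicates, consistent with the canonicalization concern John raises, is a `rel="canonical"` tag pointing at the clean URL. The sketch below (URL and parameter name are made up) strips the query string to produce that tag:

```python
from urllib.parse import urlsplit, urlunsplit

# Sketch: if client-side code appends session or tracking parameters,
# a rel="canonical" tag pointing at the clean URL helps Google group
# the duplicate URLs together. The example URL is hypothetical.
def canonical_tag(url: str) -> str:
    parts = urlsplit(url)
    # Drop query string and fragment to get the clean, canonical form.
    clean = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return f'<link rel="canonical" href="{clean}">'


print(canonical_tag("https://example.com/page?sessionid=abc123"))
# → <link rel="canonical" href="https://example.com/page">
```

Real sites often need to keep meaningful parameters (pagination, filters) and strip only tracking ones, so the clean-URL logic would be more selective than this.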
Question 7:45 – I'm writing on behalf of one of the largest online retailers of prescription eyewear in the US, and we're recently seeing some changes in ranking with regards to our website.
Answer 8:08 – “That's something where, from my point of view, I don't really have much insight into a generic situation where you're seeing a change in rankings, because changes in rankings can always happen, even if a website has been around for a long time. That doesn't mean that the ranking of that website will always be the same as it was before, because if you start a new website you also want to have a chance to make it to the top of the search results, and similarly, if you're just coasting, in an extreme case, with a website that was really good a couple of years ago, then maybe that's not actually what users still expect to see in the search results. So, these are all situations where changes in ranking are essentially normal, and changes in the way that we put together the search results happen all of the time; we try to work on improving the relevance of our search results, matching the changes in user expectations over time.”
Question 9:56 - What goes through the decision process with the search team, whoever's confirming these updates, in terms of “We should confirm this core algorithm update and not confirm that core algorithm update”?
Answer 10:59 – “For the most part, there is no kind of explicit guideline where we say if it crosses this specific metric and it goes this far that we would announce that explicitly. One aspect that from my side plays a strong role here is whether or not there's something really actionable from the site owner point of view with regards to this change. So, things like page speed, or like the mobile friendliness update, the mobile indexing changes, those are all things where we do make changes in ranking and it's something that a webmaster can explicitly influence. So, if we tell them about this, like: hey, we're going to do this because we think it's the right thing, then they can take that and say: ok, I can work on this and I can make changes to my website to make sure it matches these expectations. That's something from our point of view that definitely makes sense to announce.
The general kind of relevance updates that we have in search, the core ranking updates, those are things that happen all of the time, more or less on a daily basis, and for a large part these are just small shifts that happen in search. And it's not something where we'd be able to tell people: hey, we think your site is less relevant for this query, therefore you should make it more relevant, because that's not really that useful feedback. So, it's something where we tend not to announce those too broadly, and sometimes when a lot of people start talking about these kinds of updates and we see that people are really confused, sometimes I'll say: ok, fine, we said we wouldn't talk about this, but maybe we just want to confirm that we actually did make some changes here, it's not that you're seeing ghosts, and there's nothing explicit that you can kind of work on directly with regards to these changes. So that sometimes comes into play there.”
Question 13:09 - So generally the more noise the community is making the more likely you'll go ahead and confirm that yes we may have done an update that day?
Answer 13:17 – “I wouldn't frame it as the more noise. I mean, obviously the community can be really loud at times, but when we see that people are genuinely confused, and when we see that the people that are confused are really people who aren't kind of this SEO bunch that understands that changes happen all the time and Google made this other tweak here and changed that, then that's something where I think it makes sense to talk to those people and let them know about this. For the most part, the normal kind of ranking changes that happen all the time, like people talk about them in forums and their sites go up and some of them go down, that's not really something where we have something useful to say, and even confirming that would essentially be like: yes, we made changes again. We make changes every day, because we're not always playing pool all day; we're actually working.”
Question 16:49 - I just started a website and indexed the homepage 30 days ago. Now I want to change the whole title of my homepage. Does changing the whole title affect my site's ranking?
Answer 17:00 – “Yes, probably. I believe we do take the title into account when it comes to ranking a page; it's part of the content on the page. It's also something that in many cases is visible in the search results, so if you change things on your pages, that can affect how those pages are shown in search.”
Question 19:21 - Can you talk more about authorship and its importance in determining the expertise of an article? Is it important to have author bios for each writer on the website? Does it matter how many articles a person has written?
Answer 19:37 - “So this is something I guess that comes up again and again. We used to have the authorship markup on pages; we don't have that anymore, or at least we don't use that anymore. But otherwise it's also something that I think users care about quite a bit, so instead of thinking about this as an SEO problem, I would think about this more as a user problem and figure out: how do you gain a user's trust? How do you make it clear to them that the content that you're providing is actually relevant content? That it's actually something that they can trust?

So, I would see that more as a user experience type issue here rather than a pure SEO issue. Obviously search algorithms try to understand what users are thinking and to rank results in a way that they would find relevant. So, some of that could play into that as well, more indirectly probably, but primarily you can work on this on the user level. Do user studies and see how users react to that, or if they care about it at all; maybe for your specific site it doesn't really matter.”
Question 24:15 - Is it sufficient to display the date the article was last updated, or is it also required that we show the first published date?
Answer 24:24 - “That's something that you can do either way. Personally, I like having both of these dates there. There's also structured data that you can use for both of these dates. So if you provide it on your articles, I would also look into the structured data and try to use that as well. I think both of these dates can be useful for our algorithms.”
Question 24:25 – Would Googlebot prioritize that over other content in terms of dates? Would Googlebot prefer the structured data form?
Answer 25:01 – “We prefer when things are consistent. Let's put it that way, because that's something where we have run into issues in that regard, specifically for news sites, where we see one date on the page and a different date in the structured data, or with breaking news type things. If it has one time zone in one format of structured data and a different time zone elsewhere on the page, then that can be really confusing for us, and then you'll see weird things like this breaking event that just happened 10 minutes ago showing in the search results as if it happened five hours ago, which is just because of the time zone issues there. And similarly, if you don't have a time zone in the structured data, then we kind of have to guess and figure out what your local time might be and how that maps. For most things in the long term it doesn't matter, but in the short term, especially for breaking issues, that can play a role.”
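The time-zone ambiguity John describes goes away if the dates in structured data always carry an explicit UTC offset (ISO 8601 format). A minimal Python sketch; the specific offset and timestamp are invented examples:

```python
from datetime import datetime, timezone, timedelta

# Sketch: emit a publish timestamp in ISO 8601 *with* an explicit UTC
# offset, so Google never has to guess the site's local time zone.
# The +02:00 offset and the date itself are just example values.
local_tz = timezone(timedelta(hours=2))
published = datetime(2018, 8, 20, 14, 30, tzinfo=local_tz)

iso_value = published.isoformat()
print(iso_value)  # → 2018-08-20T14:30:00+02:00
```

The same offset-qualified format should then be used consistently everywhere the date appears in markup, so the page and the structured data agree.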
Question 34:06 - Is there any problem with Google Search Console's Fetch as Google? Everything is loading fine except the images; there are no errors about this, and everything is loaded in the source code.
Answer 34:19 – “So I think that there are two aspects that you could be running into there. One is that perhaps these images are blocked in some particular way; that could be through robots.txt, for example, or they're somehow not accessible from Googlebot's IP addresses wherever they're hosted. That could be one option. Another option that could be happening here is that we're just running out of time when rendering these pages. So in particular for these testing tools, we try to fetch all of the content as fresh as possible so that we can show you the current preview, and by doing that we sometimes need to request hundreds of different URLs depending on how the page is structured, and that can result in us saying: well, we don't really have time to fetch all of these. We'll kind of tend towards giving the user a faster preview rather than a more complete preview. So that could be happening here too: maybe we've seen these images, but we just can't fetch them live that quickly to show you the preview.
For indexing it's a little bit different, because we have more time; we don't have to give you an answer right away. We can take the HTML page and fetch all of the embedded content over the course of a day or even longer, and we can cache all of that content for a fairly long time, so that the next time we look at this page we'll say: oh, that image, I've seen that before; I don't have to fetch that one fresh again.
So it's probably one of those two options that's happening in your specific case, and if everything else is loading fine, then it sounds like things are lined up properly and working as expected. You can perhaps look at the number of requests that this page is making using a tool like WebPageTest, or by checking in Chrome with the developer tools, and if you see that hundreds of different requests are required to load this page, then from a speed point of view it would make sense to clean it up a little bit, and that should be reflected in Fetch as Google as well. But in general, if things are just timing out with Fetch as Google, that's not something that you usually need to worry about with regards to indexing.”
Question 41:04 – If we use the Google Translate API to translate all of the comments on our website from German to French and Italian, what's the correct way around this? Should I use kind of a span with a language tag associated with it? How can I do that?
Answer 41:23 – “So we don't use the language attribute on individual HTML elements; we try to recognize the language of the content directly. The language attribute makes sense for screen readers, which kind of shift to the different language's pronunciation. So I wouldn't drop it completely, but consider that most search engines probably wouldn't be using it. For auto-translated content, we prefer not to be able to index that. So if you're using an API to translate parts of the content on your website, then ideally you'd be doing that in a way that Google doesn't actually index that content, because for the most part this automatically translated content isn't really that high quality, and that can really throw us off a bit. So the other option that you mentioned there, letting the user translate by clicking a button, I tend towards that a little bit more.”
Question 43:27 - In July, impressions for specific search terms for a product in our portfolio tripled, with clicks staying roughly constant. At the same time, Google Ads data shows no increase in impressions for the corresponding ad campaigns. What could be happening there?
Answer 43:47 – “I think that can happen. I think that's kind of a normal situation, especially if a page was fluctuating in and out of the search results a little bit, maybe moving from page two to just the bottom of page one. Those are subtle changes that you might not see reflected immediately in clicks or in ranking, but they can have a big impact on the impressions that your pages see from search. So, if a page from your website moves from page two of the search results to the bottom of page one, then that would be reflected in impressions as well. Impressions are counted when one of your pages is shown in search; it doesn't matter whether that's at the top of the search results page or at the bottom, all of that counts as an impression. Whereas if it's shown on page two and few people click through to page two, then you'll have significantly lower impressions there. And that could also be why you're not seeing those changes reflected elsewhere; for example, if you have an ad campaign running, or if you look at the overall impressions, I believe in AdWords, then that's something where you probably wouldn't see that change.”
Question – John did not read this question out loud, but it was in regards to page speed, and I thought his comment was interesting.
John at 48:54 – “With regards to ranking, when it comes to speed we use a variety of different metrics to kind of figure out how this page does with regards to speed, including some lab data and some field data. So, it's not explicitly mapped one to one to any of these metrics, primarily so that we can also update that over time, but also because we think these metrics are useful for you to kind of improve your website, but that might not necessarily map one to one to what we think is important for search. I don't know if that's worded the right way or not, but essentially the metrics that you see in these tools generally don't map one to one with the way that we rank things in search.”
Question 53:18 – Do we need a canonical tag on AMP pages linking to the regular non-AMP pages now, or is it still ok not to have that?
Answer 53:17 - So if you have the amp page connected to the kind of traditional HTML page we do need that canonical link from amp to the HTML page and together with the link amp HTML from the web page to the amp page. Without that two-way connection we would probably treat those pages as just being separate pages on your website.