We join John Mueller in his home in this Webmaster Help Hangout. We had some great questions about page speed, hreflang, and backlinks. The video and a full transcription can be found after our summary. If you like this, consider signing up for our weekly newsletter! We summarize the best SEO articles of the week in easily digestible bites.
Can you explain the issue with using JSON-LD via a tag manager?
So I think we've talked about this quite a number of times in a bunch of these hangouts, and there are new blog posts written up about this as well from various people, including Barry. Essentially it goes back to this: for us to be able to pull in content from Tag Manager, we need to be able to render the JavaScript, process the script files from Tag Manager, and include the rendered output on the indexing side. That's something that always takes a lot of work for our systems and is not always something that we do for all pages. In particular, when we see that the page otherwise would be the same, it's kind of hard for our systems to justify that we need to also process all of this JavaScript. On top of that, a lot of the testing tools don't process the Tag Manager output either, so it's really hard for tools to confirm that this markup is working properly. It takes longer for it to be processed in Search, it might be a little bit flaky, and you don't really know exactly what is being indexed at any given time. So those are all reasons why, from my point of view, it's fine to use Tag Manager for anything else, and it's fine to use it for this as well, for JSON-LD, for structured data for Search, but it's worth keeping in mind that it's not a great approach, especially for structured data in Search. There are much more straightforward ways to provide the structured data directly on the web page, which makes it easier for maintenance, for tracking, for testing, all of that. So that's what I would recommend doing. I'm not saying you can't use Tag Manager here, you can certainly use it and we'll try our best to pick it up and use it, but it's not quite the same level of speed, flexibility, and security as actually providing the structured data directly on the pages.
Summary: Though it is possible to provide JSON-LD and other structured data via Google Tag Manager, it is not the best approach. For Google to pull in content from Tag Manager, it has to render the JavaScript, process the scripts from Tag Manager, and include the output in indexing. There are more straightforward ways to provide structured data directly on the page, which makes things easier for Google.
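For reference, "directly on the page" means something like the following: a minimal, hypothetical JSON-LD block placed in the page's own HTML rather than injected through Tag Manager (the article details are placeholder values, not from the hangout).

<!-- Hypothetical example: JSON-LD served directly in the page source, so it is available without rendering Tag Manager's JavaScript -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article title",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2019-05-01"
}
</script>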
How does Google pick a URL to show in Search?
So this kind of goes into the general question of how Google picks a URL to show in Search. On the one hand, there's the aspect that, in this case, the landing page for image 1 having a rel canonical set to the view-all page would mean pointing the canonical at a page that is not equivalent. So it's kind of hit or miss whether we would pick up and use that canonical at all; that's one aspect to keep in mind. The other thing to keep in mind is that even when we do understand that these pages could be seen as equivalent, we use multiple factors to determine which of these pages is the actual canonical. For that we use the rel canonical, we use redirects if there are any, we use internal and external links, and we use things like sitemaps and hreflang links. All of that helps us to understand which of these URLs is the one that we should be showing, and if the canonical URL that you specify is one that you never use within the rest of your website, then chances are we'll say that the link rel canonical was a mistake, the webmaster didn't actually mean it like that, and maybe we'll have to pick a different URL as the canonical. So my guess is that what will happen here is either we will ignore the rel canonical because these pages are not the same thing, or we will pick one of the other existing pages instead, because that's one of the pages that is strongly linked within the website. So I don't think this specific setup would be that useful in your case.
Summary: Google uses the rel canonical, redirects, internal linking, sitemaps, and hreflang to understand which URL is the one that Google should be showing. But if the canonical URL that is specified is one that is never used within the rest of your site, Google may ignore the rel canonical and choose another page that is heavily linked internally.
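As a point of reference, a rel canonical is declared in the head of each duplicate page. Below is a minimal sketch with hypothetical gallery URLs; per John's caveat, the target URL should also be linked and used elsewhere on the site for Google to trust it.

<!-- Hypothetical example: an individual gallery page suggesting the view-all page as its canonical; Google treats this as a hint, not a directive -->
<head>
  <link rel="canonical" href="https://www.example.com/gallery/view-all">
</head>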
Is it a good idea to include authors on blog posts instead of a generic editor user?
I think that's a good move, especially if you know who wrote the article originally and you can treat it like an author landing page. Even just from a pure user point of view, if someone goes to your website and they suddenly see that these articles were written by Barry instead of just "editor", and you have a landing page for that author, then that could be a sign that it's better for the user. That could be something that users pick up on, or they go to the author's landing page and see that this author is actually an expert in this field and has been active there for some years. That's, I think, always useful to have on a website in general. With regards to Google rankings, it's tricky to say if that would have any direct effect at all, but at least the indirect effect is that users might trust your content more, and I think that's something that would be kind of expected.
Summary: Yes! Having an author landing page is a way to show your authors' E-A-T. It will also improve user experience in general as opposed to having a generic “Editor” or “Admin”.
What’s the recommendation if your UI can be translated into other languages but supplementary content, such as user-generated content, stays in another language?
So this is something that happens quite a bit, especially with user-generated content. If you have a forum or a blog or something and people are commenting in one language, but you have it set up so that the UI can change to other languages, then you quickly have a situation where the UI could be in French or in German but the content might still be in Spanish, for example, because everyone is commenting in Spanish. That's essentially something that you can handle in multiple ways. You can say my canonical version is the Spanish version and everything is just the Spanish version; that's one option. Or you can use hreflang annotations between those versions to say this is the most French version of my content that I can provide: the main content isn't in French, but the UI is in French, so a user going to the page will be able to navigate around my website. That's something you could do. Essentially those are the different variations that you can provide to let us know a little bit more about what your preferences would be. From a practical point of view, it's ultimately more up to you with regards to how you want to be shown in Search. If you think it's useful for a French user to come to your site and land on a page with the content in Spanish and the UI in French, then by all means use the hreflang annotations between those versions. If you think that a user in France would have trouble navigating your site at all if the main content is in Spanish, even if the UI is in French, then maybe it makes sense to just keep the Spanish version with the Spanish UI indexed. So it's ultimately up to you; I think neither of these is a perfect solution. Sometimes it depends a bit on how uniform your content is, and how clearly you understand which language versions you want to have indexed and which versions users expect to see in the search results. For example, if you're a very international forum and people post in all kinds of different languages, then it's probably tricky to say you only want this one UI version indexed; maybe it makes sense to have all UI versions indexed. The downside, of course, to having all UI versions indexed is that it multiplies the number of URLs that your site suddenly has. That means we have to crawl a lot more, and if it's a large user-generated content site, we have to crawl, on the one hand, all of the user-generated content and then all the multiples of that for each different language version. I don't know if that's useful, if that's too much crawling for your website, or if that prevents newer content from appearing quickly in the search results. That's something else to weigh there. If you're just talking about a few thousand articles, then maybe that's less of an issue.
Summary: You can choose one language version as the canonical version or you can add hreflang annotations between those language versions.
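For illustration, hreflang annotations between the UI language versions of the same page might look like the following. This is a hypothetical sketch with made-up URLs; each version would carry the full set of annotations, including a reference to itself.

<!-- Hypothetical example: the same Spanish-language thread offered with Spanish, French, and German UI versions -->
<link rel="alternate" hreflang="es" href="https://www.example.com/es/thread-123">
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/thread-123">
<link rel="alternate" hreflang="de" href="https://www.example.com/de/thread-123">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/es/thread-123">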
Is there anything I need to do about large numbers of random URLs that link to my site, other than disavowing them?
No, that's essentially the right move, especially if it's something that you're really worried about. I think for most websites it doesn't make sense to go out there and disavow things that are just iffy and weird, because for the most part we just ignore those. In particular with regards to links, if it's something where, when you look at it, you say, well, this could be seen as these links being bought by us or placed there by us, if someone from the website team were to take a look at this manually, they would assume that this is us doing something stupid, that's the kind of thing where I'd say disavowing them or getting them removed would probably be the right move. But otherwise, if it's just an iffy link and it looks like there are millions of other links on there, someone ran a tool and dropped tons of links into this forum, then that's something our algorithms have figured out already. So that's not something I would worry about.
Summary: Google is good at figuring out which links to ignore. But it’s better to be safe rather than sorry when it comes to disavowing.
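For context, the disavow tool accepts a plain text file that lists individual URLs or whole domains, with comments on lines starting with #. A hypothetical example (the domains are placeholders):

# Links dropped by an automated tool into a forum we don't control
https://spammy-forum.example/thread?id=12345
domain:link-network.example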
How important is page speed to Google?
What is the current policy? So speed is something that does matter quite a bit to us, and it has a big effect on users, so it's something that I would personally take quite seriously. I think the nice part about speed is that there are various tools that give you pretty objective measures that you can actually work on. Compared with a lot of other issues around SEO, like, I don't know, the quality of your content, things like that, speed is something that is quite measurable and something that you can work on, and it should also be something where you see a direct effect from your users' behavior within your website. So it's not just something where, from Google's point of view, we say speed is important and it is a ranking factor; it's something that you will see directly: when users come to your website and your website is suddenly taking a couple of seconds longer to load, those users will react quite differently on your website, and you'll have more trouble converting them into customers, however you define customers on your website.
Summary: Speed is extremely important when it comes to Google. It is one metric that is easily trackable and easily actionable. There are great tools that give great insights into how your site performs and what you can do to help increase page speed.
If I pay for Google Ads will my ranking be better or worse?
So we get this question every now and then, and the question here is, will my ranking be affected? The better-or-worse part is something that we also hear: some people say that your ranking will get better if you use Google Ads, some people say your ranking will get lower if you use Google Ads because we want you to buy more ads, and neither of those is true. Our search results are completely independent of whether or not you use Google Ads, and they're completely independent of the technologies that you use on your website. So if you use something like Analytics or another tracking tool, that's totally up to you. If you monetize with AdSense or any of these other ad networks, totally up to you. Whether or not you use Google products within your website, or use other Google services for your website, is totally up to you. We prefer to have these services stand on their own, and if you say this one particular Google service is not great, I don't want to use it, then feel free to use something else. We don't want to put you in that box where you're stuck between focusing on your website and doing what you think is right for your users, and having to use this one particular product. So it's really the case that we don't tie these together; we do that explicitly and we work hard to make sure that these things work well.
Summary: Google Ads has no effect on organic rankings.
If you like stuff like this, you'll love my newsletter!
My team and I report every week on the latest Google algorithm updates, news, and SEO tips.
Full Video and Transcript
Question 1:14 - Two weeks ago our website lost 94% of Google's traffic overnight. With consistent search traffic for the last 20 years and no major changes, we assume something technical. Could sharing IPs or SSL via a CDN like Cloudflare cause a big drop in traffic algorithmically? We dug in deeper and found some sites with what seems like extremely risky content on the same IP. We can change our theme and have a dedicated certificate, however we're still down in traffic. What could be happening here?
Answer 2:01 - In general, just because other sites are hosted on the same IP address isn't a reason for concern. In particular with larger hosters, shared IP addresses are fairly common, and with CDNs shared IP addresses are extremely common. It's also something that changes, because a lot of CDNs have endpoints in different countries and they share those endpoints with the different websites that are active there. So essentially a user in Germany might see a different IP address than a user in the US, for example, but in general this is an extremely common practice, sharing IP addresses, and it is not something that will be problematic. In the very early days this was something that sometimes was very useful for recognizing these kinds of hosts, where if we saw one IP address with 9,000 sites that were all spammy, and there were two other websites on the same host, then it might be tricky for us to realize whether these two sites really are completely separate sites compared to those 9,000 other sites. That's kind of a tricky situation for algorithms. But in most cases like this we'll see a mix of all kinds of different sites, different sites in different languages, for different countries, with different target users, some spammy sites, some non-spammy sites on the same IP address, and all of that is perfectly fine. So that wouldn't be a reason for us to say, oh, because there's one spammy site on this IP address, that would be a problem. I don't know specifically what has happened here with this website; I'd have to take a look into that. In general, also, on the aspect of "our website was doing well in Search for so many years before": I tend to say that's good for your site, but that's not necessarily something that we would always keep like that.
So just because the site was doing well in the past in Search doesn't mean it'll continue to do well in Search. On the one hand user expectations change, on the other hand Google's algorithms change. So these things can always change, and it can happen that things sometimes change significantly. Our goal there is less to say that this one particular website is bad, but rather to say we've recognized that maybe we've mis-served user expectations, or we've been doing things in ways that don't match what users expect anymore, so our algorithms change to try to bring results that users find relevant nowadays back to the search results. So that's something that can always play in, depending on the type of website.
Question 5:49 - Quite a while ago we asked about a site that seemed to outrank us by basically stealing our content, modifying it so they don't violate copyright, and then outranking us. We noticed a pattern when we looked back and did some research: it seems that we'll redesign our site, improve our quality, improving the rankings, then they'll copy it, and sometime over the next month or two they'll start outranking us. It seems to us like somehow the algorithm is confused and is giving that site credit for being the content originator instead of us, and then suppressing us in the rankings as a result.
Answer 6:37 - I don't know, I'd have to take a look at the sites to see what exactly is happening there. It's kind of tricky, from an algorithmic point of view, to say that our algorithms would always be picking that site over your site for those queries. But what I would generally aim to do there is, if these sites are copying your content, try to find ways to tackle that at the root, to encourage them not to copy your content. So maybe look into things like a DMCA complaint, I don't know if that's relevant in your case or not, but anything to try to handle that in a way that Search doesn't have to make that guess at which of these versions of the content should be ranking for those queries.
Question 11:18 - Can you explain the issue with using JSON-LD via a tag manager? Tag Manager is used to verify Search Console and probably Analytics, so surely it's stable enough.
Answer 11:33 - So I think we've talked about this quite a number of times in a bunch of these hangouts, and there are new blog posts written up about this as well from various people, including Barry. Essentially it goes back to this: for us to be able to pull in content from Tag Manager, we need to be able to render the JavaScript, process the script files from Tag Manager, and include the rendered output on the indexing side. That's something that always takes a lot of work for our systems and is not always something that we do for all pages. In particular, when we see that the page otherwise would be the same, it's kind of hard for our systems to justify that we need to also process all of this JavaScript. On top of that, a lot of the testing tools don't process the Tag Manager output either, so it's really hard for tools to confirm that this markup is working properly. It takes longer for it to be processed in Search, it might be a little bit flaky, and you don't really know exactly what is being indexed at any given time. So those are all reasons why, from my point of view, it's fine to use Tag Manager for anything else, and it's fine to use it for this as well, for JSON-LD, for structured data for Search, but it's worth keeping in mind that it's not a great approach, especially for structured data in Search. There are much more straightforward ways to provide the structured data directly on the web page, which makes it easier for maintenance, for tracking, for testing, all of that. So that's what I would recommend doing. I'm not saying you can't use Tag Manager here, you can certainly use it and we'll try our best to pick it up and use it, but it's not quite the same level of speed, flexibility, and security as actually providing the structured data directly on the pages.
Question 14:00 - A client did an HTTPS migration using 302 redirects instead of 301s. Do they need to change that to a 301? How long will it take for Google to understand that this is a permanent redirect?
Answer 14:14 - So probably we will be picking that up as well; lots of people use the wrong type of redirect for site moves and we still work on trying to figure that out properly. A quick way to see what is happening is to check in Search Console to see if things are being indexed properly. If the new domain is working well, then that's probably okay already. That said, if you've recognized issues like this, or any other technical issues on a website that are easy to fix and feel like they might have a big impact, then I'd just go ahead and fix them. Especially the wrong redirect type, that's like a one-line change in the htaccess file on most websites. So it's something that's really easy to fix, so that you don't have to worry about search engines interpreting it in the wrong way.
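As an illustration of that "one-line change", on an Apache server the HTTP-to-HTTPS redirect can be made permanent in the .htaccess file. This is a sketch assuming mod_rewrite is enabled; the client's actual rule may differ.

# Force HTTPS with a permanent (301) redirect instead of a temporary (302) one
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]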
Question 15:20 - Our image galleries have a unique URL per image, like /gallery/image1, /image2, or /image3, and we want to add /gallery/view-all and use this as the canonical URL, but we don't link to it anywhere on the site. Can we do that? Does the view-all page need to be visible to the reader?
Answer 15:46 - So this kind of goes into the general question of how Google picks a URL to show in Search. On the one hand, there's the aspect that, in this case, the landing page for image 1 having a rel canonical set to the view-all page would mean pointing the canonical at a page that is not equivalent. So it's kind of hit or miss whether we would pick up and use that canonical at all; that's one aspect to keep in mind. The other thing to keep in mind is that even when we do understand that these pages could be seen as equivalent, we use multiple factors to determine which of these pages is the actual canonical. For that we use the rel canonical, we use redirects if there are any, we use internal and external links, and we use things like sitemaps and hreflang links. All of that helps us to understand which of these URLs is the one that we should be showing, and if the canonical URL that you specify is one that you never use within the rest of your website, then chances are we'll say that the link rel canonical was a mistake, the webmaster didn't actually mean it like that, and maybe we'll have to pick a different URL as the canonical. So my guess is that what will happen here is either we will ignore the rel canonical because these pages are not the same thing, or we will pick one of the other existing pages instead, because that's one of the pages that is strongly linked within the website. So I don't think this specific setup would be that useful in your case.
Question 17:38 - What's your recommendation for sites that have a lot of content that they want to provide for other languages and countries, but they have only translated the interface so far, not the main content?
Answer 17:55 - So this is something that happens quite a bit, especially with user-generated content. If you have a forum or a blog or something and people are commenting in one language, but you have it set up so that the UI can change to other languages, then you quickly have a situation where the UI could be in French or in German but the content might still be in Spanish, for example, because everyone is commenting in Spanish. That's essentially something that you can handle in multiple ways. You can say my canonical version is the Spanish version and everything is just the Spanish version; that's one option. Or you can use hreflang annotations between those versions to say this is the most French version of my content that I can provide: the main content isn't in French, but the UI is in French, so a user going to the page will be able to navigate around my website. That's something you could do. Essentially those are the different variations that you can provide to let us know a little bit more about what your preferences would be. From a practical point of view, it's ultimately more up to you with regards to how you want to be shown in Search. If you think it's useful for a French user to come to your site and land on a page with the content in Spanish and the UI in French, then by all means use the hreflang annotations between those versions. If you think that a user in France would have trouble navigating your site at all if the main content is in Spanish, even if the UI is in French, then maybe it makes sense to just keep the Spanish version with the Spanish UI indexed. So it's ultimately up to you; I think neither of these is a perfect solution. Sometimes it depends a bit on how uniform your content is, and how clearly you understand which language versions you want to have indexed and which versions users expect to see in the search results. For example, if you're a very international forum and people post in all kinds of different languages, then it's probably tricky to say you only want this one UI version indexed; maybe it makes sense to have all UI versions indexed. The downside, of course, to having all UI versions indexed is that it multiplies the number of URLs that your site suddenly has. That means we have to crawl a lot more, and if it's a large user-generated content site, we have to crawl, on the one hand, all of the user-generated content and then all the multiples of that for each different language version. I don't know if that's useful, if that's too much crawling for your website, or if that prevents newer content from appearing quickly in the search results. That's something else to weigh there. If you're just talking about a few thousand articles, then maybe that's less of an issue.
Question 21:25 - We have a blog running alongside our e-commerce website. Since the beginning, the blog posts were marked as written by a generic editor user. Looking at the quality guidelines and E-A-T, we'd like to replace editor with the real name of each post's author. Is this kind of operation positive, or could it potentially be seen as spam?
Answer 21:48 - I think that's a good move, especially if you know who wrote the article originally and you can treat it like an author landing page. Even just from a pure user point of view, if someone goes to your website and they suddenly see that these articles were written by Barry instead of just "editor", and you have a landing page for that author, then that could be a sign that it's better for the user. That could be something that users pick up on, or they go to the author's landing page and see that this author is actually an expert in this field and has been active there for some years. That's, I think, always useful to have on a website in general. With regards to Google rankings, it's tricky to say if that would have any direct effect at all, but at least the indirect effect is that users might trust your content more, and I think that's something that would be kind of expected.
Question 22:58 - Schema markup with a desktop and AMP version: is it okay if the desktop version is implemented using microdata but the AMP version is using JSON-LD?
Answer 23:09 - Sure, that's perfectly fine. With regards to the format that is used there, I don't see a problem with that. The only thing to keep in mind is that, as far as I know, some kinds of structured data are only available in JSON-LD, so that might be something where you need to double-check the types of structured data that you're using. But having one version of your site using one type of structured data and the other version using a different type, even within the same desktop/mobile/AMP variation, is certainly possible. So for example, if you have a blog and maybe, I don't know, a product directory on your e-commerce site, and the e-commerce site has reviews that use JSON-LD while your blog uses Article markup in microdata, I don't know if that's the right way to do it, but that would be perfectly fine.
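To make the comparison concrete, here is the same hypothetical article headline marked up once with microdata and once with JSON-LD; both express equivalent schema.org data, which is why mixing formats across page versions is fine.

<!-- Microdata version (e.g. on the desktop page) -->
<article itemscope itemtype="https://schema.org/Article">
  <h1 itemprop="headline">Example article title</h1>
</article>

<!-- JSON-LD version (e.g. on the AMP page) -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article title"
}
</script>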
Question 24:19 - Regarding hidden content, is display:none okay? Google support has mentioned that white font on a white background or font size zero is against the guidelines, but what about display:none?
Answer 24:32 - Hidden content is generally not something that we appreciate. In particular, it comes across as the site trying to push keywords into Google's index that are not actually visible on the page itself. So that's something I'd really avoid. You mentioned responsive design in the rest of your question, and I think that's the one aspect that does come into play here. If you're using responsive design to make this content visible for mobile users or for desktop users, then that's perfectly fine. But if this content is essentially always invisible, like font size zero, or white font on a white background, or black font on a black background, then those are the kinds of things that our systems will pick up on and say, well, maybe this text here isn't as relevant as it otherwise could be. From a practical point of view you probably won't get a manual action for something like this, but our algorithms will try to figure it out and will try to devalue that content when it comes to Search, so that it's less likely to be shown in the snippet and less likely to be treated as really important on those pages.
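The responsive design case John describes usually means content that is hidden only at certain screen sizes via CSS, rather than hidden for everyone. A hypothetical sketch:

<!-- Hidden only on small screens via a media query: the content is still genuinely visible to desktop users -->
<style>
  @media (max-width: 600px) {
    .desktop-only-details { display: none; }
  }
</style>
<div class="desktop-only-details">Extra product details shown to desktop users.</div>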
Question 25:55 - After a links manual action, how does Google treat the domain once the reconsideration request is accepted but the site has not regained its potential rankings and traffic?
Answer 26:08 - So I think there are two aspects here. On the one hand, if the manual action is resolved, then pretty much directly that site will be visible in Search without that manual action. There's one exception: if a site is removed for pure spam reasons, then it'll essentially be removed from our index completely, so it's not that we can just turn it back on and show it again; it will require that we actually recrawl and reprocess that site, and sometimes that takes a few weeks. But for all other manual actions, once that manual action is resolved, things are back in the previous state. It's not that Google holds a grudge and says, well, there was a manual action here, so I need to be extra careful; if it's resolved, it's resolved. With regards to links, of course, if your site was artificially ranking in the search results due to unnatural links, and you got a manual action and fixed that by removing those unnatural links, then of course your site won't be artificially ranking higher anymore, because those unnatural links are gone now. So it would be completely normal to see a change in visibility after resolving something like that. Similarly, if a site were unnaturally visible due to other things on the website and you resolved that by removing those other things, then obviously your site will be visible naturally again, but it won't be unnaturally visible due to those things that you removed. So that's something to keep in mind.
Question 27:59 - In looking at some of our backlink profile, we found our links on pages where I don't even know how they ended up, like pages that have just like 7,000 links and things like that. We disavowed them when we found them. Is there anything we need to be concerned about, other than doing that, when we find stuff like that?
Answer 28:20 - No, that's essentially the right move, especially if it's something that you're really worried about. I think for most websites it doesn't make sense to go out there and disavow things that are just iffy and weird, because for the most part we just ignore those. In particular with regards to links, if it's something where, when you look at it, you say, well, this could be seen as these links being bought by us or placed there by us, if someone from the website team were to take a look at this manually, they would assume that this is us doing something stupid, that's the kind of thing where I'd say disavowing them or getting them removed would probably be the right move. But otherwise, if it's just an iffy link and it looks like there are millions of other links on there, someone ran a tool and dropped tons of links into this forum, then that's something our algorithms have figured out already. So that's not something I would worry about.
Question 30:06 - What do you suggest to tackle low traffic, low quality pages on a site? There are lots of suggestions regarding content pruning; what recommendations do you have regarding that?
Answer 30:20 - So I think, first off, the assumption that a page that has low traffic is also low quality is something that I would question. Sometimes pages just have low traffic because not a lot of people search for them, but they're still really good pages. So I would question the assumption that you can just go into Analytics, sort your pages by the number of page views, and delete all of the lowest pages there, because I don't think that necessarily tells you whether these pages are really low quality or not. That's the first assumption there. If you know your website, then obviously you can combine different metrics to try to figure out where the low quality pages are, but I would still recommend making sure that these are really low quality pages before you take any kind of harsh action on them. Then, as a next step, if you do know that these are low quality pages: whenever I talk to our engineers from the quality team, they tell us not to tell webmasters to just go off and delete those pages, but instead to go and improve them. If you know that they're low quality pages, that probably means you know what is missing, and that probably means you know there are ways to make them higher quality pages. So that's the direction I would take there: not just deleting things that are low quality, but figuring out a way to make them higher quality instead. That could be by combining pages; maybe you see that this one page is kind of thin but it matches this other page you have on your website, so maybe combining them makes sense, and 301 redirecting them to one shared URL instead might be an option. Rewriting them to be higher quality is obviously a good idea; it obviously takes work, so it's not this one simple magic trick to make it number one. Then finally, if it's really something that you can't resolve at all, or there is such a big mass of low quality pages that you can't really fix them, then maybe deleting them is right. So those are the different variations that are available, but again I would strongly question the assumption that low traffic equals low quality. Even if you're looking at a larger site, don't just assume that because something has low traffic, that's a sign that it's not important for your website or for the rest of the web.
Answer Cont’d 34:03 - Yeah, so I think one way you could look at this is to say: given this content that you have, what would your preferred new website look like? So kind of saying, assuming I had all of this content and I had to create a new website out of it, what would it look like, and then try to find a way to migrate your existing content into this new structure that you have in mind. Like I said, that could include combining pages, combining maybe tens of different pages together into one stronger page. It could be deleting pages where you say, well, these don't make any sense for my website anymore; maybe it was something that users cared about a couple of years ago, but now, I don't know, nobody is playing Ingress anymore, so for all of those Ingress pages on my website I have to make a hard decision and delete them. I can see the shocked faces everywhere in here now. But these kinds of things happen over time, and it makes sense to clean things up over time. Sometimes that means deleting, sometimes that means combining, sometimes that just means rewriting and cleaning up. So it's hard to have one answer that works for every site in every situation.
Question 36:36 - How do you fix the crawl frequency of low-priority pages within a website? Will Google crawl more of those pages because there are more of them compared to the important pages?
Answer 36:49 - So I think this was your question as well. In general you don't need to fix the crawl rate of pages unless these are pages that are being changed more frequently than they are being crawled. So if you have an article that you wrote and it's being crawled once every three months and you're never changing this article, that's perfectly fine; we don't need to crawl it more often. There is no ranking bonus for being crawled more often. Crawling, from our side, is more of a technical thing where we say this page has changed, we should find a way to pick up this change as quickly as possible. It's not that we would say, well, this page has been crawled twice in the last week, therefore we will rank it higher; those are completely separate parts of our algorithms.
Question 37:48 - I was checking the log files and 90% of our crawl budget is going to those specific URLs only, and only 10% is going to my product pages. So I was wondering if I could make Google crawl those specific sections less frequently, and maybe Google can start crawling or giving more importance to the other sections of the site?
Answer 38:18 - Okay, so you actually want to do the opposite, which I think is a good move too: to have those pages crawled less frequently. From our point of view there's really no way to do that. It's something that you would need to almost attack from the other way around, to say these other pages are important on my website, and therefore I'll link them prominently within my website, I'll make sure that all of my other pages refer to those pages, and that they're specified in the sitemap file with a last modification date that we can confirm. All of those signals help us understand that we need to crawl these pages more frequently because there are changes on these pages. On the other hand, if there are no changes on these pages, we don't really need to recrawl them more frequently, so that's kind of the other aspect there. If these are pages that are important for you but they are not changing frequently, then there's no need to artificially force them to be crawled more often.
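A sitemap entry with a last modification date, as John mentions, might look like this minimal hypothetical sketch; the lastmod value should only change when the page content actually changes, so Google can confirm it against the crawled page.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/products/widget</loc>
    <lastmod>2019-05-01</lastmod>
  </url>
</urlset>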
Question 40:11 - Can you tell whether, with a redirect, only a link penalty passes, or do link and content penalties both pass? For example, a website with a pure spam manual action is redirected to another site, so technically the URLs will be soft 404s. Will it affect the site being redirected to?
Answer 40:37 - So I'm not quite sure which part of this question you're focusing on. On the one hand, if a random spammy website redirects to your website, that's usually something we can recognize and just ignore. On the other hand, if yours is that spammy website and you're redirecting to another website to try to escape that penalty, then probably we will be able to follow that site migration and apply that manual action or algorithmic action to the new website as well. So my recommendation there would be, instead of trying to get away by doing fancy redirects or other types of site moves, to just clean up the issues so that you don't have to worry about them anymore. If there are link actions with regards to that website, then clean up those links so that you're in a clean state again. The reconsideration process is great for that, because someone from the webspam team will take a manual look at your website and they'll say, this looks good, this is fine; you did good work and cleaned things up, it's clear that you understand what you should be doing now, so we can remove the manual action. So I think that's really useful to have there from a practical point of view. That would be my recommendation if you're the website that has this problem. On the other hand, like I mentioned, if some random website redirects to your website, that's usually something that we can recognize: this is not a normal site move, this is just a random website redirecting to another website, and we can handle that.
Question 42:30 - John, two quick general questions, one related to site load speed. We've read and heard various things, including recently people saying that every microsecond counts and things like that. What is the current policy? I know in the past you said as long as it doesn't take ridiculously long to load, you're fine.
Answer 42:53 - What is the current policy? So speed is something that does matter quite a bit to us, and it has a big effect on users, so it's something that I would personally take quite seriously. I think the nice part about speed is that there are various tools that give you pretty objective measures that you can actually work on. Compared with a lot of other issues around SEO, like, I don't know, the quality of your content, things like that, speed is something that is quite measurable and something that you can work on, and it should also be something where you see a direct effect from your users' behavior within your website. So it's not just something where, from Google's point of view, we say speed is important and it is a ranking factor; it's something that you will see directly: when users come to your website and your website is suddenly taking a couple of seconds longer to load, those users will react quite differently on your website, and you'll have more trouble converting them into customers, however you define customers on your website.
Question 44:08 - From the standpoint of, say, 1.1 seconds versus 1.2 seconds, that kind of thing, would you say that it's very important to try to really optimize that?
Answer 44:21 - I think the tricky part with speed is there are so many different metrics in the meantime that it's hard for me to say load time is the only thing you should be thinking about. But there are ways to determine how quickly the page is generally accessible, how quickly the content is visible on the page, even ignoring the aspect that maybe the rest of the page below the fold is still rendering and still takes a bit of time to actually be ready; maybe the part that users care about is actually visible fairly quickly. So from that point of view, usually small differences are less of a thing, but like I mentioned, speed is something where you can use these different tools, which come up with different metrics, and you can focus on those metrics and try to improve them. You can measure that yourself and work on it without having to go through various Google tools and waiting for things to update in the index in those tools.
Question 45:47 - Can I use official Google videos in my blog, or can I only link to them, for example Matt Cutts' videos about SEO? I will use AdSense on the blog when I have enough; my blog will be complete in 6 months.
Answer 46:03 - I don't think there are any restrictions with regards to embedding videos from a channel, but if there were restrictions, then I think the embed option on YouTube wouldn't be available there. So if the embed option is there, go for it. In general, though, I'd be cautious about using just a video as the primary piece of content on a web page; you should really work to use the video in a way that supports your primary content, not in a way that replaces it. So for example, I wouldn't take any of these videos, just put them on a blog post, add a title, and expect them to show up highly in Search. But if you have specific content around that video, if you have a transcription of that video, if you have some comments on that transcription or on the content shown in the video, or if you're using that video as a point of reference with regards to your content, then I think that's a perfectly fine approach. Purely using a video on a page is something that, at least from a web search point of view, makes it really hard for us to determine what is actually useful on this page and why we should show it in the search results.
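As a sketch of what supporting content around an embedded video could look like, here is a hypothetical page section with a standard YouTube embed (placeholder video ID) where the written summary and transcript remain the primary content.

<!-- Hypothetical example: the video supports the written content rather than replacing it -->
<h2>How Google handles duplicate content</h2>
<p>Summary and commentary on the video, written in your own words...</p>
<iframe src="https://www.youtube.com/embed/VIDEO_ID_PLACEHOLDER" width="560" height="315" allowfullscreen></iframe>
<p>Transcript of the video: ...</p>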
Question 47:27 - If I pay for Google Ads will my ranking be better or worse?
Answer 47:34 - So we get this question every now and then, and the question here is, will my ranking be affected? The better-or-worse part is something that we also hear: some people say that your ranking will get better if you use Google Ads, some people say your ranking will get lower if you use Google Ads because we want you to buy more ads, and neither of those is true. Our search results are completely independent of whether or not you use Google Ads, and they're completely independent of the technologies that you use on your website. So if you use something like Analytics or another tracking tool, that's totally up to you. If you monetize with AdSense or any of these other ad networks, totally up to you. Whether or not you use Google products within your website, or use other Google services for your website, is totally up to you. We prefer to have these services stand on their own, and if you say this one particular Google service is not great, I don't want to use it, then feel free to use something else. We don't want to put you in that box where you're stuck between focusing on your website and doing what you think is right for your users, and having to use this one particular product. So it's really the case that we don't tie these together; we do that explicitly and we work hard to make sure that these things work well.