Google+ Hangouts - Office Hours - 21 April 2015



Transcript Of The Office Hours Hangout


JOHN MUELLER: Just a second. Welcome, everyone, to today's Google Webmaster Central Office Hours Hangout. My name is John Mueller. I am a webmaster trends analyst here at Google in Switzerland, and I'm happy to help answer your questions-- the questions that got submitted or from the people here live in the room. Danny, do you want to go ahead?

DANNY: Yes. Thanks for having me on-- long time viewer, first time participant. My question is about URL parameters in the Webmaster Tools and the difference between two particular options. And I was wondering if I could do a screen share to show them? Is that possible in this?

JOHN MUELLER: Sure.

DANNY: OK. Are you seeing what I'm seeing?

JOHN MUELLER: Yeah.

DANNY: URL parameters? So my question is the difference between "no, it doesn't affect page content" and "yes, but don't crawl any URLs"-- this last radio option. Does that have the same effect ultimately?

JOHN MUELLER: So I think-- I'd have to double check. But from what I recall from when we set this up initially, the option where you say it doesn't affect page content essentially says we can pick one of these URLs and just crawl it like that. Whereas the radio button on the bottom that says no URLs basically means we don't crawl any URLs that have these parameters. So it's kind of the difference between crawling one or crawling none of them.
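To illustrate the distinction with a hypothetical example (the URLs and option labels below are illustrative, not from the Hangout):

    Hypothetical URLs with a "mobile" parameter:
      http://www.example.com/page.php
      http://www.example.com/page.php?mobile=no

    "No: Doesn't affect page content"        -> Googlebot may pick one representative URL and crawl just that one
    "Yes ..." with crawl setting "No URLs"   -> Googlebot crawls none of the URLs carrying that parameter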

DANNY: OK, my goal here is actually related to mobile. I have a separate mobile site and a separate desktop site. And on my mobile site, I have a link at the bottom of each page to the full site. And the way that it works is it adds a URL parameter of mobile equals no. And through the PHP GET method, it creates a session and forces the desktop version to show each time the user loads a new page on their phone. And what that created, for the first time for me as a webmaster, is two versions of each page when I do a site: search in Google. Does that make sense so far?

JOHN MUELLER: What I would do there is use the rel=canonical more than the URL parameter handling here. You probably could do something similar with the URL parameter tool, but I'd have to--
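For reference, a minimal sketch of the rel=canonical approach John suggests, using a hypothetical URL structure (the page name and parameter are placeholders):

    <!-- Served on http://www.example.com/page.php?mobile=no -->
    <!-- Points Google at the clean URL, so the parameterized copy gets folded into it -->
    <link rel="canonical" href="http://www.example.com/page.php">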

DANNY: Because so far, I've experimented with the no option for about three months-- I left it on no. And it didn't change the results when I did a site: search. So just recently I changed to yes. Are you still hearing me?

JOHN MUELLER: Yeah.

DANNY: Recently, I changed to yes-- no URLs for the past week or two.

And it still hasn't changed the site: search results.

JOHN MUELLER: Yeah, so the URL parameter handling tool primarily handles how we crawl the page. So it's not something where we would directly change what we have indexed based on that. But over time, we would change what we have indexed.

That's something where I imagine if you do a site: query, you'll see the changes there over a couple of months, not specifically within a couple of weeks.

DANNY: But I've heard you before in these Hangouts say that sometimes the site: search might even reflect two versions of a page, even though Google's index is only going to pick one for the SERPs. Do you think that's going to be the case here, even if I choose the right option in the URL parameter tool?

JOHN MUELLER: Possibly. It depends on what kind of a query you do.

If you do a generic site: query for your website, then we'll probably primarily show the ones that we think are relevant for your site.

Whereas if you do a site: query and include mobile equals zero in the URL, or something like that, that's something where we'll probably try to pull out those URLs even if we're not showing them as indexed. And what you can do in a case like that to double check is to do an info: query.

So you do info: and then the full URL. And you see what URL we actually index there. And that is generally a rough way to double check which one we're actually picking for indexing.
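As a rough illustration of the two checks John describes, with a hypothetical domain and parameter:

    site:example.com inurl:mobile=no
        (digs up parameterized URLs Google knows about, even if they aren't the chosen version)

    info:http://www.example.com/page.php?mobile=no
        (shows which URL Google has actually picked as the indexed, canonical one)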

DANNY: OK, that's the first time I've heard of info colon. And I've heard you before say that even with the rel=canonical, it may still show two versions of the page in a site: search-- is that correct?

JOHN MUELLER: Yes, that can happen too, yeah. And that's not something you really need to worry about. For a normal website, if we crawl like a double version of your website, that's not something that's going to significantly change things for your website. Whereas if you have like 10 different copies or 100 different copies of the same content, that's something where it'll be a little bit harder and harder to crawl that content. So if you're just talking about mobile equals zero and without mobile equals zero, then that's not something I'd really lose any sleep over or try to find some complicated solution for.

DANNY: Webmasters must deal with this every day, especially for e-commerce companies where they have filtering. And they use URL parameters. Which one should they pick? Should they pick no? Or should they pick yes, no URLs? Or do you want to research it offline and get back?

JOHN MUELLER: That probably depends a bit on what they're trying to do there. Some kind of filtering where you have strong category pages, for example. That's something where maybe you want to have those pages indexed separately. Other kinds of filtering, you're basically creating an endless list of URLs that can be followed. And that's the kind of thing you don't want to have crawled separately. So it depends on which direction you want to go there.

DANNY: So if that e-commerce webmaster did not want them indexed, would he or she choose yes, no URLs, or would he or she choose, no, it doesn't affect page content?

JOHN MUELLER: I'd use yes, because it is different content. When we crawl the pages to double check, that's something where we see that there's different content already. And then use the "No URLs" option on the bottom to say, well, this isn't something that you should be crawling.

DANNY: OK.

JOHN MUELLER: And this is something we generally try to figure out ourselves. So if you have the option like let Google bot decide, and this is something you're really not sure about, then I'd just leave it at that. And let it get indexed like that. If we see that it leads to exactly the same content, then that's something where we'll fold those URLs together and treat them as one URL anyway.

DANNY: Yeah, because in my mind when I think of a mobile equals no or some sort of-- it's more of like an affiliate tracking code that shouldn't affect page content?

JOHN MUELLER: Exactly.

DANNY: Do you agree?

JOHN MUELLER: Yes.

DANNY: So it would seem like the natural answer would be "no, it doesn't affect page content," because it says select this option if this parameter can be set to any value without changing the page content.

JOHN MUELLER: Exactly, yeah.

DANNY: OK, but it hasn't worked for me in the three months that I've left it up. Should I wait longer, or should I switch to "yes, no URLs"?

JOHN MUELLER: I'd just leave it like that. I wouldn't worry about it if it's really just one parameter with values like zero or one. It's not going to cause any problems for your site.

DANNY: OK, thanks very much. I'll bounce off so somebody else can enter. All right.

BARUCH LABUNSKI: How are you?

JOHN MUELLER: Hey, Baruch.

BARUCH LABUNSKI: I've got a quick mobile question. And, yeah, I had to do that. I had to do that. So I have corrected about 300 pages. And I'm worried about one area-- you know how you can mark it as fixed, right? So I have to go back to double check and triple check and make sure that the page is actually working in the mobile-friendly test. So I wanted to know-- is this OK if it was launched? It was launched yesterday. Everything was corrected yesterday. And today the update rolls out-- is that a problem? Will it affect the ranking?

JOHN MUELLER: That sounds bad, yeah. No, I think you should be OK. We just published a blog post, or two blog posts actually. And you probably aren't seeing this because you're in the Hangout here. So if you want to read all the news, go read that. But we basically have that question covered in the blog posts-- like, I just made a whole bunch of changes across my website, what should I do now? And it mentions different things there, like the Fetch as Google submit-to-index feature can kind of help, as well as submitting a sitemap file and saying all these pages just recently changed, and Googlebot will go off and crawl all those pages again. So those are essentially the main options there where you can tell us this content changed, and we should take a look at it as quickly as possible so that we can take its current state into account.
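As a rough sketch of the sitemap option John mentions, a minimal file that flags recently changed pages via lastmod (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/fixed-page.html</loc>
        <lastmod>2015-04-20</lastmod>
      </url>
    </urlset>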

BARUCH LABUNSKI: But the thing is, John, the errors are from March. Some of them are from March.

JOHN MUELLER: Yeah, I mean that's not something you really need to worry about. That's essentially just the way Webmaster Tools aggregates that data there. And that doesn't mean that those pages are seen as bad at the moment. And that's something where I'd do more like a site: query, or search for some text on that page, and see if it has a mobile-friendly snippet or not. And if it doesn't have it-- and I'm guessing if you uploaded the website yesterday, then a lot of the pages won't have it yet-- that's completely natural. It just takes time to re-crawl and reprocess everything so that we have the new state.

BARUCH LABUNSKI: All right, thank you so much.

JOHN MUELLER: All right, let me grab some questions from the Q&A. And then we can get back to more from you all. "Following a domain migration to a new domain, some pages that never existed on the old domain are appearing in the search results as pages from the old domain. All links lead to the new domain. Do you know why this might be happening?" This is similar to what we mentioned before.

If you do an explicit site: query for a site that actually moved to a different domain, then we'll understand that those URLs were also on that old domain. And we'll show them to you if you explicitly look for them. So that's something where, if you look for the old URLs, we'll try to be helpful and say, well, here are a bunch of these old URLs, even though we already know that they actually moved to a new domain in the meantime. So that's not something I'd really worry about there. This is essentially just the way our algorithms are working. You can follow the progress of the site migration in the index status information in Webmaster Tools, where you'll see that for the old domain the counts are going down, and for the new domain the counts should be going up. And if you see that happening, that's essentially a sign that things are working properly. Another thing you can track is the indexed URL count in Sitemaps, where you see explicitly, for the URLs that you submitted to us, which of these are actually indexed. And if you do that for the new domain, you'll see across your sitemap files, if you have multiple of them, which ones are getting indexed and how much.

"For product listing pages on a site that are mobile-friendly, Google is showing a 'products one through x of y' tag instead of the mobile-friendly tag in the search results." We've looked at that a bit, and we're trying to figure out what we can do there to also show the mobile-friendly tag. So that's something where we're trying to make sure that we have the best of both worlds, because both of these kinds of snippets add value for the user. And we just want to make sure that we're providing the right things to users at the right time. So I imagine that'll change at some point. But I don't have any confirmation that it will happen, or any time frame for when that might happen.

"Does Google apply a sandbox effect also on old domains? This would be a big learning for e-commerce SEO if you can answer this question." I took a quick look at the question in the forum, and it looks like there were a whole bunch of changes made on this website. And, of course, if you make a whole bunch of changes on a website, then that's something that's going to take a while to be processed and taken into account again. And depending on what kind of changes you make, those changes might result in a higher ranking or a higher visibility of your website, or they might result in a lower visibility of your website over time. So that's not something where I'd say that just because you have an old domain, you're automatically always ranking number one for those queries. We really try to take the relevance of these pages into account when it comes to our ranking.

Let's see. "Would you give me some hints about the next Panda update or refresh? Some industry folks are guessing it would come after the mobile-friendly ranking algorithm in May. Should I believe the predictions?" I guess if these industry experts know this kind of information, maybe they can also give you some lottery numbers, because they probably know a little bit more than we do here at Google. So that's something where I wouldn't necessarily take their advice into account for these kinds of things, because we work on these updates. And it's not that it's something where we'd say, well, we have this plan and we're going to leak it to some industry experts, and they'll know about this before everyone else. We really try to figure out when the best time is to make these updates.

And sometimes we have to work incrementally, step by step, and figure things out along the way. Other times, we have different teams that are working in parallel, trying to come to a solution essentially at the same time. So it's not that one thing always has to wait for the next thing to happen. That said, I don't have any dates or any updates on the next Panda update. So I don't have anything specific to pass along at the moment. It might be that it happens in May. It might be that it happens later. I don't have any confirmation there.

MIHAI APERGHIS: Go ahead, Greg.

GREG: Hey, John, I'm sorry.

JOHN MUELLER: Hi, Greg.

GREG: Hey there. I sent you a message on Google+. I know it's not the easiest message in the world. But I'm just wondering if you could possibly get a chance to reply, it would really help us out a lot.

JOHN MUELLER: Sure. I took a quick look at it today. I just got back today. But I haven't drafted anything out. I guess in general, if you run across something that's really weird that you can't figure out, you're welcome to send it to me. But I can't guarantee that I can get you an answer quickly. But I noticed your post. So I'll try to see if I can bring you something there.

GREG: Greatly appreciated. Thank you.

MIHAI APERGHIS: Hey, John, since we're still talking about penalties. You mentioned in the past that a website can be both penalized by an algorithm such as Penguin and at the same time have a manual action in Webmaster Tools for more or less the same reason-- spammy links. And we've got a client that came to us about a year ago with a manual action. And we went ahead and went through the whole process of disavowing the links, removing what we could, and sending a reconsideration request. And within a few weeks, we got the penalty removed. Then last week, for example, the same person came again and told me that-- well, there's no manual action appearing in Webmaster Tools, but the results haven't really improved, especially in organic traffic. This is actually the website. So I was wondering whether it's the case that there might be several penalties stacked. And they asked me, actually, whether they should move to an entirely new domain and start again, if they have that option.

JOHN MUELLER: Yeah, so if there's a manual action, they would still see that in the manual actions part in Webmaster Tools. So if they have multiple manual actions, they'll see the multiple listings there. That's something where they should see that directly in the manual action section if there's something like a penalty still applying. When it comes to our algorithms, of course, they can pick up similar things as well, and depending on when they were doing these link issues that you mentioned, that's something that might be taken into account there, and that might be something that has a really strong effect on this website because, I don't know, maybe they were doing it for years and years and years. And the algorithms have been piling that on all the time and saying, well, this is something where you really have to be careful. So whether or not they need to move to a different domain is really hard for me to say. But it's something where you, with your experience, might be able to look at that and say, well, there's just so much cruft that was happening here with this old domain that it really makes sense to start over with a new website. And that's something where I can't really say you should do it like this or you should do it like that. That's something you have to work out on your own and figure out: how much time does it really take to build a new website? Or how easily could I switch to a new website versus how strongly are we actually attached to this website? So if you're a brand and you've been working on your website for a decade or longer, then you're not just going to switch to a new domain name and say, well, everything is on a different domain-- it's a lot harder. So that's something where you have to take your experience and give them soft advice that is really hard to turn into something technically correct or technically incorrect.

MIHAI APERGHIS: Right. I was wondering since-- last year, when we did the whole reconsideration request and removed the manual action, they haven't recovered. So they had just one manual action regarding the links, and we had it revoked. But when we did that and removed the links, and uploaded a disavow file for what we couldn't remove, they only had left, like, I don't know, 5% original, natural, quality links. So I was wondering whether they didn't see any further results because they didn't have anything good left really, or whether there's just another penalty, like a Penguin penalty, that was also added on and that breaks their efforts to move up.

JOHN MUELLER: I would guess, based on your comment that they have 5% of the links left, that it's a combination of both, where there's very little support left because almost everything that existed was problematic, and at the same time, our algorithms have picked up a lot of problematic things. So it's probably a combination of both of those.

MIHAI APERGHIS: So moving to a new domain might be a good idea, if they can do this from a brand perspective.

JOHN MUELLER: I would perhaps look into that. But at the same time, I think you need to make sure that they don't go down this road again, so that they don't keep running through one domain after another, because that doesn't really make sense for anyone.

MIHAI APERGHIS: Of course, yeah, they came back right now and are looking forward to working with us. But we just wanted to know that any effort that we do when working with them isn't going to be stopped by the fact that they still have something going on behind the scenes in automated fashion. OK, cool, thank you.

JOHN MUELLER: "I got referral spam traffic in my Google Analytics. Users usually come from Russia. Is Google doing something about this?" I know this is something that the Analytics team has heard about before. I don't know what they're currently doing about this. So you're not alone. There are people that have been sharing blacklists that you can set up for Analytics. And I don't think it's specific to Russian users, so it probably doesn't make sense to block individual countries. There's just referral spam happening. And this is essentially an old-school type of spamming that they're just doing with Analytics now.

"If I don't have a mobile-optimized website, is it possible that Google de-indexes me in mobile, but in desktop it doesn't? Also, how can Google suggest me in desktop search, but if someone searches on mobile, it doesn't?" So first of all, we don't remove websites for not being mobile-friendly. Our algorithms essentially demote them slightly, and promote them if they are mobile-friendly, of course. So this is something where we don't remove those websites completely. And if someone is searching explicitly for you, then we'll know that your website is really the most relevant for that search term, and we'll show it anyway. So it's not that your website is going to disappear completely from mobile. But if we have different options available to us, and we see some of these are mobile-friendly and some of those are not mobile-friendly-- they are all similar in relevance-- then we'll probably be showing the mobile-friendly results to users who are on smartphones. So not on desktop, but specifically to users who would profit from seeing something that they can actually look at on their phone.

BARUCH LABUNSKI: What do you mean by slightly?

JOHN MUELLER: Slightly.

BARUCH LABUNSKI: If everybody's got the mobile-friendly tag on the first, second, third page-- so does that mean the site would end up on the fourth page?

JOHN MUELLER: I don't think it would work out quite that well-- or quite that strongly. It's something where, when we look at the demotions, we essentially move things down a little bit, or they have less weight in the search results. So it's not that we'll count the search result places and say, oh, everyone will go back 30 places. So that's not something where you can say, well, it'll always be on page four if all the other results are mobile-friendly.

BARUCH LABUNSKI: Wow. Wait and see.

JOHN MUELLER: This is rolling out over the course of a week or two. So it's not that you would see it all switch over today. But you'll probably see it, I guess, if you're watching the search results for a site. Over the course of the next week or so, you should see that kind of a change there.

MIHAI APERGHIS: John, is it more of a boost that websites with mobile-friendly pages get or more of a demotion for the sites that don't have this?

JOHN MUELLER: They are kind of both equivalent. If you boost one site, then the other site goes down. We can't show 12 results on page one. So if we put two new ones in there, we have to take two out and put them on page two.

MIHAI APERGHIS: I was wondering whether it's like a web spam algorithm, for example, which actually demotes the website rather than just pulling up other websites.

JOHN MUELLER: No, it's not a web spam algorithm. We are not looking for something bad and penalizing people for this. We're trying to make sure that we have the relevant, useful results for users available. And that means bubbling the good ones up.

MIHAI APERGHIS: Right, so it's more like the SSL type of update that went out last year.

JOHN MUELLER: Similar to that, yeah, I mean, it's definitely a stronger effect. So it's not exactly comparable. But it's similar in that regard.

BARUCH LABUNSKI: Oh, wow. OK.

JOHN MUELLER: But, again, it's always essentially equivalent when you look at it. If you boost some up, you have to take some down. So it's not the case that we would say, well, there's a big difference if you've demoted some or if you promoted some others.

MIHAI APERGHIS: But the impact is higher. That's all, John, right?

JOHN MUELLER: Yeah. We have some of that in the blog posts. I'd take a look at that after the Hangout.

"How many words of content minimum are required on a single page?" There's no minimum, no maximum. You can put whatever you want essentially on the page. From our point of view, it makes sense that there is unique, compelling, and high-quality content on there. And sometimes that means a lot of text; sometimes that means you don't need that much text. So it's not that there's any magic number out there that you need to aim for. I think this goes into the next question here.

"How many outbound links can we add in a 500-word article?" You can put as many as you want there-- as many as you think are relevant. I think-- what was it?-- Matt Cutts did a blog post where he linked every word in a blog post to something else. So you can do that if you want. It probably doesn't make that much sense for a user. But try to find out what makes sense and what works for your content.

MIHAI APERGHIS: By the way, is it true that when you write an in-depth article and you link to your sources and other relevant pages from the internet, from other websites, that's actually helpful and Google is expecting this to happen?

JOHN MUELLER: I don't think so.

MIHAI APERGHIS: Because, for example, when you link to a spammy website, you get some of that effect on you as well. So I was wondering if it's the other way around too-- like linking to a high-authority website, or to your source or something like that, is kind of beneficial to you if it's done organically.

JOHN MUELLER: There's no magic SEO benefit of linking to high-quality sites. That's something that I think spammers tried out 10 years ago where they would make these really spammy pages. And on the bottom, they'd link to Wikipedia and say, well, this is a high-quality link. Therefore, my content must be high-quality. And that's not the way that we look at it. So that's something where we try to take the whole page into account as it comes. But it's not that there's any magic advantage of having a high-quality link on the bottom of the page.

MIHAI APERGHIS: OK, not magic, but--

JOHN MUELLER: For the users, obviously it can provide a lot of value. And that's something that indirectly might make sense for a website. If you're really providing something valuable for a user that they can use as a research source, then maybe they'll link to your pages as well and say, this page links to all the right resources-- just go here and get that information. And that's how you indirectly get that advantage. But it's not a direct SEO advantage where I'd say, well, you'd get a little boost in your rankings because you're linking to Wikipedia. I don't think that would really make sense.

MIHAI APERGHIS: But would Google use those links, or the content on the pages that the links lead to, to better understand a page? For example, if Google is not sure what the content is about on our page, might it visit some of the external links to try to figure that out?

JOHN MUELLER: That's theoretically possible. I think that could happen in some situations. But at the same time, if we really don't know what your page is about, then that's a bigger problem I think.

MIHAI APERGHIS: No, I mean at the beginning, when you just first start indexing it-- and maybe it takes some time, as far as I know, to understand exactly where a page should be situated.

JOHN MUELLER: I think even then we'd have to take a look at the content first and understand the content first. So if you're writing a page called Golf, and we look at that page and we don't know whether this is about the car or about the sport, then, based on the content alone, if we can't figure that out, that's kind of a bad sign already. And that's not something you can override by just having this one link to the Golf car homepage. So that's kind of something where you really need to make sure that the relevance is provided on your site first. And then that additional information-- I don't know, it's possible that there's a small aspect there that we take into account. But I wouldn't rely on that. It's not something where I'd say, well, this fixes the problems that I have otherwise on my website.

MIHAI APERGHIS: Right, right. And I ask partly because some people think that they should actually avoid linking to other websites, so they can hoard their PageRank or whatever-- the way people used to think in the past.

JOHN MUELLER: I think that's something where you really need to work with your users. That's not something where I'd say there's a primary SEO part there. I'd really think about the users and make sure that they're able to use your site properly.

GREG: John, I actually have a follow up to that question.

JOHN MUELLER: Sure.

GREG: Previously, we had no-followed all of our links going to external sites. And last week we decided to remove the no-follow tag. And we immediately started receiving these link removal requests. They seemed pretty automated. I guess someone who's been going through some sort of spam attack has been contacting every site. I'm not sure whether the other participants in this Hangout have dealt with that. But what do you do when someone says, can you remove our link, and yet it's a high-quality link that we're using as a reference?

JOHN MUELLER: I think that's up to you. I don't know. I wouldn't necessarily worry too much about that. If that's really a normal link that you have on your page and you see, well, this is something I really want to keep on your page, then maybe it makes sense to keep it. I think in the worst case, the other site can always disavow that link if they want to. On the other hand, if you're looking at that, and you say, well, I don't really need this link here. And these guys seem pretty upset. Maybe I'll just take it down and do them a favor.

GREG: We've gone back to no-following the links. But if we didn't do that and we kept them as do-follow and they-- let's say we had x-amount of people adding us to a disavow file, does that negatively impact our trust with Google?

JOHN MUELLER: No, that's fine. That's something that sometimes happens. For example, maybe you have a blog, and in the comments someone managed to drop a bunch of comment spam. And those sites are now using the disavow tool to clean that up. It doesn't mean that the rest of your content is really bad. It's just, well, someone managed to drop a bunch of comment spam on your blog, and you didn't notice that, so it got picked up like that. So that's something where just because people are putting you in their disavow file, it isn't necessarily bad. On the other hand, it probably makes sense to keep a healthy relationship with the people that you're working with, especially if you're active in an area where you want to work together with these other websites. If you're constantly battling them and just saying, well, it's my right to link to you however I feel like it, and at the same time you're asking for a favor somewhere else, then I imagine that's not going to be the healthiest relationship.

GREG: Thank you, sir.

JOHN MUELLER: Some more questions from the Q&A. "Is it OK to have an image and a video sitemap in one XML file?" Sure, you can do that. That's absolutely no problem.

"Is there any way of using the Webmaster Tools API to download top search queries, keywords, and top linked pages for sites? And if the API can't be used this way, can you confirm? Is there an alternative way to download this data?" At the moment, you can use the Python script to download that data. That's something that will end up in the API, I'm guessing, in the upcoming quarter. So not immediately, but probably later this year you'll also have access to the search query information within the API. And for that, of course, we also need to make sure that we have the whole search analytics feature launched first. So it's kind of waiting on that as well.

"A couple of pages popped up in Webmaster Tools showing as non-mobile-friendly for one of my sites in March; previously it showed no problem with the pages. But when I check them with the mobile-friendly test, it shows OK. Why might this be?" This is something where you have to take a look at the current tests and figure out what's actually happening there. It might be that maybe some CSS or JavaScript files were disallowed back then. It might just be that at the time when we checked your site, we couldn't crawl it properly. Maybe there were some things that timed out. So if, for example, your CSS or your JavaScript files timed out when we crawled them back then, and they are working normally now, and your server is fine now, then that's something that will just get picked up as we re-crawl those pages normally. So that's not necessarily something I'd worry about.

"Our website can't be crawled by Googlebot due to 100% DNS errors. Robots.txt can't be found. No changes were made to the robots.txt. There's no .htaccess. Bing Webmaster Tools can crawl the site. Other DNS tools say everything's fine. What could be wrong?" Usually, this means that your DNS server is somehow blocking Googlebot. And often that's based on the IP address, for example. So that's something where you might want to double check with your DNS provider to see if they are blocking Googlebot in any way. Or if you have a chance to try an alternate DNS provider for your website to see if that works better, that might be an option there as well. The robots.txt-not-found message there might be a little bit misleading, because it's not that the robots.txt file itself is the problem. We do try to fetch the robots.txt file, and that's just the first URL that we try to crawl when we go to a website. So if we can't even get to the robots.txt file, then probably something pretty basic on your server is blocking us from being able to access your site. So it's not necessarily that the robots.txt file is a problem, but just that we can't get to your site at all, and the robots.txt file happens to be the first one that we try to pick up.
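Going back to the first question in this block-- combining image and video information in one sitemap file-- a rough sketch of what that can look like, using the standard sitemap extension namespaces (the URLs and titles are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1"
            xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
      <url>
        <loc>http://www.example.com/some-page.html</loc>
        <image:image>
          <image:loc>http://www.example.com/photo.jpg</image:loc>
        </image:image>
        <video:video>
          <video:thumbnail_loc>http://www.example.com/thumb.jpg</video:thumbnail_loc>
          <video:title>Example video</video:title>
          <video:description>Placeholder description.</video:description>
          <video:content_loc>http://www.example.com/video.mp4</video:content_loc>
        </video:video>
      </url>
    </urlset>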

MIHAI APERGHIS: By the way, John, the question actually said that the robots.txt file can be found.

JOHN MUELLER: Yeah, I mean it doesn't matter if the file exists or not. If we don't get any reply at all-- if we can't do the DNS lookup at all for that host name-- then that's essentially something where we can't get anything back for that request. We can't even start to crawl normally either.

"How bad are soft 404 errors due to switching to SSL?" I don't know. Your site shouldn't have soft 404 errors from switching to SSL. That sounds like something might be configured incorrectly. Essentially, if you're switching to SSL, the first step is to make sure that it works on both variations, with HTTPS and without HTTPS. Usually there's a bit of work involved with the HTTPS version to make sure all embedded content works well under HTTPS, so that you don't have the mixed content warnings. And once you have that set up properly, you can do a redirect to the HTTPS version, or set the rel=canonical first. But either way, you shouldn't really have soft 404 errors for that. That sounds like something else might be misconfigured.

"Looks like Webmaster Tools is getting broken links from scraper sites when using bad regex to determine if it's in a sitemap. For example, something with [INAUDIBLE] in it." We do try to pick up all kinds of links that we can find, and sometimes the links that we pick up are broken. So a common example that we see is someone will copy and paste a link into a forum. The forum software will truncate that link and show the first part of the URL, then dot, dot, dot, and then some of the rest of the URL as well. And if we just see the text version of that link, then we'll think, well, these dot, dot, dots in the middle might be a part of the normal URL, and we'll try to crawl it like that. And we'll see that there's a 404 there. And we'll say, well, fine, OK, so this URL didn't work; we'll try the next ones. So this is something where you see the results of those 404s in Webmaster Tools. But that doesn't necessarily mean that there's a problem there that you need to fix. That might just be: we found these links, we tried them out, they didn't work, and fine, we move on to the next links. It's not something you have to fix. It's not something where you have to go to other people's websites and say, hey, you have a bad link somewhere on your page, you need to fix that. That's essentially just us saying, well, we'd love to pick up more content from your website; we found this link that we haven't heard of before; we tried it out; it didn't work. Fine. We just want to let you know about it.

"How does Google pick one website's app to suggest as an app install in between the [INAUDIBLE] links? What if I don't want to install that app, and every time I search in the same market I find the same website's app suggested to me by Google-- will it be user-friendly or won't it?" I don't know the exact criteria that we use to show the install app button in the search results. We do try to pick out the ones that we think are relevant to show to users. And oftentimes, if you get the same data through an app, you have a nicer experience; there are some nicer things that you can do with that app. So I don't really think that's something that we turn off completely, but it's still fairly new. So I'm sure any kind of feedback that you can send back to us in the search results with the feedback link will be very much appreciated. So if you find this annoying and you really don't want to see that happen anymore, then let us know about that so that we can find a solution there for you as well.

And this is something where I imagine over time you'll see a lot of iterations and changes still happening. Sometimes it'll be nicer. Sometimes there might be things that you don't really appreciate. So feedback is always welcome.
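On the HTTPS question above, a minimal sketch of the server-side redirect John describes as the final step, assuming an Apache server with mod_rewrite (the rules are illustrative, not from the Hangout):

    # .htaccess: once the HTTPS version is fully working (no mixed content),
    # 301-redirect every HTTP request to its HTTPS equivalent
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

The rel=canonical alternative he mentions would instead leave both versions reachable and add a link tag such as <link rel="canonical" href="https://www.example.com/page.html"> on the HTTP pages.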

JOSHUA BERG: Hi, John.

JOHN MUELLER: Sure.

JOSHUA BERG: Yeah, a question-- so I understood that this mobile ranking change is not going to-- it's not yet going to include website speed as a direct ranking factor. But currently, at least some of the website speed is a factor, at least for especially slow sites. And we see this data in Webmaster Tools. So is that an average of crawl data-- the time spent downloading a page by Googlebot? Is that an average between the mobile bot and the regular Googlebot, or are they not going to be coming up with the same speeds?

JOHN MUELLER: That's actually something different. So that's a common point of confusion, I guess. On the one hand, what we look at-- what we talked about in that blog post, I don't know, it must be four or five years back in the meantime-- is how long it takes to actually render the page in a browser. So if you enter the URL, how long it takes for you to actually see the content directly in the browser. And that's something that I think is always worth working on, even if it's currently not a ranking factor. I know there are lots of really neat tricks out there to make mobile-friendly sites really fast and snappy. And from what I've heard from other people, if you have a really fast mobile-friendly website, or a really fast website in general, people will spend more time on your website clicking around, getting to know more about your website, about your business, about your products. So that's something where you kind of have this indirect effect in any case, even if there's no SEO effect right away.

The other part you're talking about is the crawl time in Webmaster Tools. And that's the time it takes for us to download the URLs directly. So that's not taking into account any rendering. That's essentially just raw downloading the HTML page, raw downloading the images-- any kind of content that you have in there. And that's the average time that you see there. Usually, where the difference comes in is that you might have a really fast server, but you have so many different things embedded within the page that you're actually still rendering the page very slowly. So that's something where you have kind of a disconnect between a fast server on the one hand, but the way your HTML pages are structured makes them extremely slow on a variety of devices. On the other hand, you might have a relatively slow server, but maybe you have your HTML set up in a way that's really efficient, which means that these pages still render really quickly. So those are kind of the two variations there.

For us, the time to crawl a page kind of drives how many URLs we can crawl from your server at any given time, where if your server is really fast and snappy and always returns the URLs really quickly, then we can crawl a lot more than if your server is really slow. So, especially if you have a website that lives on current events-- things that are happening quickly that need to be found in search fairly quickly-- then if you have a slow server, chances are we're going to have a hard time keeping up with that news. On the other hand, if you have a really fast server, then we can crawl those a lot faster. So the crawling speed is more of a technical issue for Googlebot. And the rendering time is more of a user experience issue for users, not for Googlebot. So those are the two sides of that coin.
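As a rough way to see the raw download side of this distinction yourself (the URL is a placeholder), curl can report time-to-first-byte and total transfer time; the in-browser rendering time John talks about would still need to be measured separately, for example with browser developer tools:

    curl -o /dev/null -s -w "TTFB: %{time_starttransfer}s  total download: %{time_total}s\n" http://www.example.com/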

JOSHUA BERG: OK, so the indexing is mostly what that's about. So is there a ranking factor that includes rendering time, or not? Is that too much data?

JOHN MUELLER: I don't know how much of that is actually still used at the moment. So we do say we have a small factor in there for pages that are really slow to load, where we take that into account. But I don't know how much that's actually still a problem in practice, because we've seen pages speed up a lot. I think over time we'll find that maybe there are mobile-friendly pages that technically can load on a mobile device, but they are actually really, really slow, and maybe we have to encourage people to speed things up there as well. But I think at the moment, this is something where I'd focus more on the user aspect, where if you can make users happy by having a really snappy and fast website that renders quickly, then they're going to do a lot more on your website. They are going to hang around a lot longer just because it's a lot less involved for them to try things out. If they can click on different options-- different categories within your website-- and they get the results essentially immediately, then they're going to try a lot more things there.

BARUCH LABUNSKI: Yeah, in milliseconds.

JOHN MUELLER: Yeah, there are lots of studies done out there, especially by Amazon, who has done a ton of experiments around this. And I think they've published a lot of information about this, where they artificially slowed things down by just a couple of hundred milliseconds and could see that there's a significant impact on what people actually do on a website. And if you're talking about a website where you're currently really slow in rendering pages, and you can go from, I don't know, maybe 10 seconds a page to one second a page rendering time, then that's something where I'm almost certain you'll see a drastic change in the way users react within your website. And with that, of course, they'll pass that on to other people. They'll recommend your website more and say, hey, if you want more information about this topic, this website works great because you can get everything really quickly. So there'll be a really strong indirect effect there that you can work with as well.

MIHAI APERGHIS: By the way, John, regarding crawling, I noticed for one of the websites that we work with-- we used a crawler to try to go over the whole website, and after a minute of crawling, maybe because of the speed, the CMS or maybe the hosting server, I'm not sure, started throwing out 508 errors at us. I am not sure why that is. I am just wondering if Google might interpret that problematically, like those pages not existing, or whether it just notes it and slows down the crawl rate and everything.

JOHN MUELLER: Yeah, so what we would do in a case like that, if we're crawling a website and we run across a bunch of server errors like 500s, 508s, whatever, is usually we'll just slow down. So we'll see, well, there's a rise in server errors when we crawl like this; therefore, tomorrow we'll crawl a little bit less. And that might be from one day to the next, and it might take a couple of days for us to pick that up automatically. But that's something that our systems essentially do automatically. They'll try to fine-tune the crawling rate that works best for your server without causing problems on your server, because it doesn't make sense for us to crawl so much that your users can't use your website. It doesn't make sense to crawl so much that we slow down your server or that we cause server errors. We want to find that magic crawl rate that works well for us to pick up your new content, but at the same time also doesn't cause any problems.

"Would you suggest blocking tag pages with robots.txt? I use a lot of tags to group my tutorials for my e-reader. Is that duplicate content and so on, or bad for my website?" This really kind of depends on the website-- the kind of site that you have. I think there are some kinds of tag pages that are essentially like search results pages, which probably don't make sense to get indexed. There are other kinds of tag pages that are almost like category pages, where you have a useful collection of individual pieces of content that match this category. And that might be something you do want to have indexed. So that's not something where I'd say there is a default answer that works well for everyone. You have to work that out for your website yourself. Look at some of those sample pages and be as objective as you can in saying, well, is this really something I want to have indexed, or is this something I don't really need to have indexed?
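If, after that evaluation, you decide your tag pages are thin, search-results-style pages that shouldn't be crawled, a minimal robots.txt sketch might look like this (the path is a placeholder and assumes all tag pages live under one directory):

    User-agent: *
    Disallow: /tag/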

BARUCH LABUNSKI: John, just a quick question regarding the mobile update. And I know, I know, I know that you guys have said many times that it doesn't affect the search results. But if the site is basically a very busy site where 50% is mobile traffic, and that website has not adjusted to the mobile update, can it affect the search results somehow? I understand it won't affect desktop. But if there's less traffic coming to the site, and they don't get much on desktop, then that's it, right?

JOHN MUELLER: It's really hard to say. I don't think there is a full answer that I can give you there. It kind of depends on the web site, I guess, how people are going to this website, if they're using search to go to the website-- if they're searching for something generic and landing on the website-- what they're doing afterwards. If they are on a mobile phone, and they want to recommend it to others, is this something that they wouldn't do because they don't see it in search? It really depends on the website. So that's something where if you do see that you have a lot of traffic for mobile and your website doesn't work well on mobile, that even outside of search you probably want to fix that problem. You don't want to say, well, 80% of my people are coming through mobile. But I don't want to serve them something that works for their devices. That's not really a long-term strategy regardless of why you think that happens.

BARUCH LABUNSKI: It's just that this person is old school, and he doesn't want to make that switch. And I'm sure there are thousands and millions of people out there that are like that. I did say that, and I told them to make that switch.

JOHN MUELLER: Yeah, this is something where not everyone wants to do that switch now. Maybe they are also saying, well, I have a lot of investments planned for the spring, and I want to save my money for other things to invest in my business to make sure that really works well. And maybe in the fall, I'll have more time to actually work on my marketing, my website, those kinds of things. That's perfectly fine. Your website isn't going to be removed from search just because it's not mobile-friendly. If people are searching for your business, they'll still find your business. So it's not something where there's, I don't know, a hammer coming down on your website, and you're going to disappear from search forever. When you're ready, we'll be able to take that into account, and until then, maybe you'll see a drop in mobile traffic to your website. Maybe that's fine with you. Maybe that's not something you're really worried about at the moment. Maybe you have bigger problems.

BARUCH LABUNSKI: OK.

JOHN MUELLER: All right, we just have a few minutes left. So I'll just open it up to you all.

MIHAI APERGHIS: Just a quick one, John. I sent you an email last week regarding some spammy image websites that were hot-linking images from normal websites. And they were appearing in image search. I don't know if you got it and whether you managed to take a look.

JOHN MUELLER: For those kinds of things, I'd generally try to submit a spam report so that the web spam team is aware of that. I can pass your email on as well so that they have those details too.

MIHAI APERGHIS: Yeah, I sent you some examples. I didn't know if it went straight to your spam folder because I mentioned some spamming websites.

JOHN MUELLER: I get a lot of weird emails. Don't worry.

MIHAI APERGHIS: Cool.

JOSHUA BERG: John?

JOHN MUELLER: Yes?

JOSHUA BERG: So when the mobile-friendly tags first started getting used, was that a significant change as far as user reception-- increased click-throughs from mobile users-- from what you see? Or would you say it wasn't that big a difference?

JOHN MUELLER: It's really hard to say, because these things always take time for people to understand what they actually mean. And that's not something where we can put out something and say, well, this says mobile-friendly, and people will understand it right away. They might assume this means something slightly different. Or they might be confused by the label, and they don't really know what to do with that. I know before that, we also showed various icons in the search results where we'd see, I don't know, a green mobile phone, or a red mobile phone, or a crossed-out mobile phone, to try to see which of these variations brought the message across best. And that's something where we noticed there was very, very widely varying reception from users, where in the beginning they'd be totally confused and not really know what this means. It's like, will this call someone on my phone? Or what does the green or the red mean? What does the crossed-out phone mean? But we noticed that sometimes it just takes time for these things to get picked up by users and to be figured out. So that's something where we didn't really expect anything to happen from one day to the next and say, well, the click-through rate will double for these sites because, of course, everyone will figure it out. A lot of you SEO experts who've been looking at this area for a long time-- you probably figured it out a little bit faster than the average user. But the normal user doesn't read Google blogs. They just use their phone, and they see the search results, and they see this label. And they don't really know where this is actually coming from. So it always takes a bit of time for things to get understood by the normal user. But I think that's something that's part of the normal process, and part of the reason we do so many experiments as well-- to see which of these are more easily picked up by users, and which of these are more complicated and harder to understand for the average user.

JOSHUA BERG: Would you say that the mobile-friendly implementations over the last couple of months have been as high as expected-- in very good numbers-- or comparable with the SSL change, where not so many people may have really seen the value?

JOHN MUELLER: We've seen some nice rises in the number of mobile sites that are out there. So that's something where the effect is definitely positive in that we're encouraging more people to make really nice mobile-friendly sites. So I think from that point of view, it's working out fairly well. I don't think we have any numbers that we can share on that, though. So it's not that I can say, well, I don't know, 10% more of the web is now mobile-friendly. I don't think we have any numbers like that.

JOSHUA BERG: Well, the word is out today because it's on all the international news channels.

BARUCH LABUNSKI: Like on Bloomberg, CNN.

JOHN MUELLER: Yeah, it's everywhere. I think it's interesting that we have so much coverage of this. I'm kind of sad that there is a strong negative focus on this as well, where, I don't know, businesses are disappearing because of this change-- and not so much people realizing, well, this makes sense for everyone who's using a mobile phone, who is trying to use the internet on the phone. And that's a really large number of people. And that's a number that's growing and growing. So that's something where I think people are going to get used to the new reality of a lot of people using their mobile phone as a primary internet device.

BARUCH LABUNSKI: Right, but I mean, people are still using desktop.

JOHN MUELLER: Yeah, I guess. I don't know. I was on vacation the last five days or so. And I had my laptop with me. I didn't open it once. I was able to do everything with my phone. And that's I think where things are headed. You might have a desktop. You might still use a desktop for some things. But a lot of the browsing that you do nowadays-- you can do it on the phone. It works just as well.

MIHAI APERGHIS: By the way, John, since you mentioned the rollout will be over the next week or so-- is it going to be at the same time internationally, or will the US or English-based websites see some of the changes first, and maybe international queries and results will be affected later?

JOHN MUELLER: This is globally. So this is happening at the same time globally. All right, thank you all for joining. Thanks for all the questions and discussions. I'll have the next Hangout in English I think on Friday. So depending on your time zone, maybe we'll see you there. Otherwise, in one of the next ones as well. Have a great week, everyone, and good luck with your mobile-friendly websites.

MIHAI APERGHIS: Thank you, John. Have a great one.

GREG: Thank you.

JOSHUA BERG: Thanks, John.

JOHN MUELLER: Bye.

GREG: Take care.