Reconsideration Requests

Google+ Hangouts - Office Hours - 29 December 2015


Transcript Of The Office Hours Hangout

JOHN MUELLER: All right. Welcome everyone to today's Google Webmaster Central Office Hours Hangouts. This will be, I guess, the last one for this year. But I'll set up new ones for next year as well so we can keep rolling on these. Looks like we have a bunch of people here signed in already, hanging out with us. A bunch of questions were submitted. We can probably make it through them. We have a bit more time today. So maybe we'll have time for discussions and questions from you all as well. As always, does anyone want to get started with a question or comment or something?

BARUCH LABUNSKI: Yeah, I wanted to start with a question about HTTPS. So I changed a website with 45,000 pages. Of course, it was me and someone else. And basically, everything was changed to HTTPS-- the exact certificate that you wanted-- and everything moved correctly in Webmaster Tools from HTTP to HTTPS. And then, of course, you watch the traffic go down to zero in the HTTP account. So that was moved to the HTTPS. But then, like always, traffic started going a bit down. How normal is that? How many positions, usually? Because it's not the first website I've moved. And normally, there weren't any fluctuations. But how normal are the fluctuations? Because it looks like it's significantly going down. But then it went up again normally. And I moved the Analytics property as well. So I'm watching things closely. It's a lot of [INAUDIBLE] understand that.

JOHN MUELLER: Yeah, it's really hard to say. In general, you wouldn't see much of a big change overall. You might see some small fluctuations as things kind of shuffle over. But it should settle down to the previous state. And I think sometimes what happens is when people do the move during an off-season time, then they might see those effects as well in the traffic. So that's something worth looking at, maybe looking at the traffic patterns you had last year throughout this time. Did it change during that time? Did it go up or down? And try to see if that matches what you're seeing with HTTPS there.

BARUCH LABUNSKI: No, last year was the same, of course, like always December. Yeah, it's a bit slow.

JOHN MUELLER: Yeah, but it probably depends on the site. So that's something maybe we can take a look afterwards if we run through the questions and have a bit of time. But in general, these things just take a bit of time to settle down and be reprocessed and re-indexed in the new way.

BARUCH LABUNSKI: OK, sounds good. And then also regarding that email, did you see that one I sent you?


BARUCH LABUNSKI: The latest one? OK.

JOHN MUELLER: I'd have to double check.


ROBB YOUNG: John, for the secure move-- it's something that we did-- one of the things we did. But then we unwound it again. Is the current advice still to move the whole thing, or do you just give Google the option of both and let Google choose to prioritize secure over not? Because when we did the whole site move and canonical, we had to unwind it all because it just died. But this was in the early days-- the first two or three months-- when other people were reporting the same. But what's the current advice? Move, or give Google the option?

JOHN MUELLER: With changes like this, what I would do is just make it available in both versions, first of all. So in your position, it would be give Google the option in the beginning so that you can be sure that the HTTPS site is working as it should be-- that you don't have these mixed content issues-- those kind of problems there. And once you're sure that the site is working the way it should be, that the performance is the way that you expect it to be, then I'd set up a 301 redirect from the HTTP version to the HTTPS version. So do it in a two-step process. Obviously, if you're sure that the HTTPS version is working really well because you've tested it properly internally, then just setting it up and doing a site move directly would be just as fine.
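A two-step move like the one John describes typically ends with a server-level 301. As a rough sketch only-- the host names here are placeholders, and the details depend on your server software-- an nginx version of that final redirect step might look like:

```nginx
# Hypothetical nginx config: once the HTTPS site is verified to work,
# permanently (301) redirect every HTTP request to the same host and
# path on HTTPS, preserving the full URL.
server {
    listen 80;
    server_name example.com www.example.com;  # placeholder host names
    return 301 https://$host$request_uri;
}
```

The HTTPS server block itself (certificates, content) is configured separately; this block only handles the redirect leg of the move.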

BARUCH LABUNSKI: But what about the defaults? Does that keep in the defaults? Because you said you were ranking things by default.

JOHN MUELLER: How do you mean by default?

ROBB YOUNG: My default domain?

BARUCH LABUNSKI: Yeah, you're starting to rank HTTPS.

JOHN MUELLER: Oh, OK, with the recent blog posts. So, essentially, if we discover a URL on HTTP and we discover the same URL on HTTPS, then what will happen is we will try to prioritize the HTTPS version over the HTTP version. So that's kind of what we mean by default, in the sense that if we have both of them, and they're essentially equivalent for us because we don't have much more information about them-- you don't have a redirect set up, you don't have a rel canonical set up-- then we'll try to pick the HTTPS version, provided that all the criteria are met.

ROBB YOUNG: So you don't really need a 301. The risk is that, in the past, you've confirmed that a 301 doesn't give you 100 percent of the authority of the previous domain.

JOHN MUELLER: I wouldn't worry about that with HTTPS.

ROBB YOUNG: Well, but we have to.

JOHN MUELLER: But you're not really doing a site move. You're staying on the same URLs. You're just switching the protocol. So we know it's the same domain. It's essentially just switching the URLs over. So that's the kind of situation where I wouldn't see those URLs as losing any page rank from a redirect.

ROBB YOUNG: But there's no real risk in just letting both URLs out there-- seeing what happens over time. And if it shows that you're always shown the HTTPS, then just leave it.

JOHN MUELLER: Sure. Risk-- it's something where essentially we hope that you guys move to HTTPS because of your users, because of the connection to the user-- that the connection is secure, that they're aware of going to the right site. So that's something where if you leave both of the versions up there, then users might not move. Whereas if you set up a clean redirect, then of course, you're sending all your users to the HTTPS version. And they stay on that because maybe your site is linking to those pages anyway. So that's something where I wouldn't say a good long-term solution is to just leave both versions open. I'd try to pick the version that's best and make sure that everyone is routed to that version.

ROBB YOUNG: And is that dependent on audience? Our audience is 90%-plus new anyway because people-- normal e-commerce scenarios-- there's lots of new customers out there. So we don't have a forum like some of these other guys. So they're coming back every week anyway. So eventually over time, most of them will end up on the HTTPS anyway, I would imagine.

JOHN MUELLER: Probably, yeah. At that point, you can set up the redirect anyway, right?

ROBB YOUNG: Assuming the traffic is right, yeah. You know our sites are risky-- any kind of change.

JOHN MUELLER: I think the average site-- if you're just moving to HTTPS, then that's really not something I'd worry about. That's the kind of situation where I'd just say, well, you want to move to HTTPS because you have a bunch of good reasons to do that. And as a part of the move to HTTPS, you set up those redirects. That's something that we're doing as well on Google. We're essentially going through the whole site, everything we have, and making sure it works on HTTPS. And when we're sure it works on HTTPS, we set up the redirects. Obviously, it's not something that's done overnight, that we would just switch the redirect on. And if anything breaks, we'll try to fix it as we go along.

ROBB YOUNG: Does Google rely on Google for all of its organic traffic though?

JOHN MUELLER: Well, for some of it, sure, yeah. We have to show up in search as well. And we regularly have the same kinds of SEO problems that all of you have, where someone will go out and robots.txt the homepage or robots.txt the login page, and everything goes through the login page, and you can't see the content anymore, so we can't crawl it. And all of these things are normal issues that every site has to work with.

BARUCH LABUNSKI: Well, John, regarding the disavow in HTTP and HTTPS, does it matter if you first added the disavow to the HTTPS and then add it to the HTTP version?

JOHN MUELLER: I wouldn't worry too much about the order there. That's something where I'd essentially just upload the same file to both of them.

BARUCH LABUNSKI: Yeah, so now you get two messages.

JOHN MUELLER: Yeah, both of them in the same account, yeah. Over time, you'll have one version of the site that you pick as your canonical anyway. So that's the one you'd be working on over time. But in the time frame where you have both of the versions up where some of your links are still pointing at the old version, then you will have both of them in your account. And you work on both of them. But in the long run, you'd have one version of your site essentially in Search Console, and that's the one that you'd be showing.

BARUCH LABUNSKI: So you're saying for maybe two or three months, just do it like that or a full year?

JOHN MUELLER: Sure. I don't know. Maybe half a year or something like that if you're doing a site move type thing or if you're switching from HTTP to HTTPS. Let me run through some of the questions that were submitted. And I'll try to keep it short so that we have more time for all of the other live questions as well. Or maybe you could just go ahead and ask your question first. And then I'll run through the questions.

MALE SPEAKER 1: Regarding this HTTPS, actually, this was the question. In case somebody has a disavow in the HTTP version for his website, but suppose he does not disavow for HTTPS. Will Google take any action? Or is this really risky for the future if somebody does not disavow on the HTTPS version or does not have such proper information?

JOHN MUELLER: I would just make sure that disavow is live on the current canonical version of your site. So if your site is currently on HTTP and you don't have anything on HTTPS, and everything is indexed with HTTP, then that's where you do the disavow file. If your site has moved to HTTPS or moved from www to non-www or to a different domain, then wherever your current canonical is, that's the one where you need to do the disavow file. And you could just take the disavow file from the old version and just re-upload it to the new version. So you don't need to do anything special with that file then.
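The file John suggests re-uploading unchanged is plain text, one entry per line: lines starting with `#` are comments, a `domain:` prefix disavows every link from a whole domain, and a bare URL disavows a single page. A hypothetical example-- the domains and URLs are placeholders-- of the kind of file you would upload to both the old and new canonical properties:

```text
# Hypothetical disavow file -- entries are placeholders.
# Disavow every link from this domain:
domain:spammy-directory.example
# Disavow one specific page:
http://link-farm.example/widgets/page1.html
```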

MALE SPEAKER 1: OK, but John, my question was actually is Google really very concerned that you have to disavow on HTTPS? Does Google have any plan to take action against those websites who do not do this?

JOHN MUELLER: Well, if your site isn't indexed as HTTPS, then you don't need to do that. It's really just based on where your site is currently canonical. So it's not something where we try to throw anything artificial in the way and say, you need to always do this on HTTP as well. It's just if that's the version of your site that's canonical at the moment, then that's the one where we need to have these files. "When do you guys estimate the next Penguin update will be?" I don't have any new news on that. We talked about that briefly before Barry joined. But apart from that, I don't have anything new to share. No, just kidding. I don't have any news on the Penguin update. Usually we take a pause during the holidays to make sure that we don't do anything crazy then. But I imagine we'll hear more about this in the New Year.


JOHN MUELLER: We'll see. "I want to know about spam referral traffic. How can we stop it? Our website's rank is going down due to spam referral traffic." So there are two kinds of spam traffic that I've heard about recently. On the one hand, there is the analytics spam where people aren't actually going to your website. They are just sending pings to analytics. And that's showing up in analytics. And that's something that I know the analytics team is working on and taking pretty seriously. The other type of spam traffic is when people just hit your site with a bot with a fake referrer that shows up in other types of analytics software. So for both of those, that's really not something that would be affecting Google at all. It's kind of a hassle because you have to filter it out in your analytics. And you have to make sure that this traffic isn't counted if you're looking at your page views. But apart from that, it's not something that would cause problems on Google's side. "When testing my site with Google Page Speed Tools, it says I have to improve the loading of external Google Fonts as well as adding caching for Google Analytics. What do you suggest when Google complains about Google?" So in general, this is something that I would treat as any other issue that comes up on your site. We explicitly don't whitelist Google's features with these tools, because these features slow down your website in a browser anyway. And just because a feature is created by Google doesn't mean it doesn't slow things down for your users. So that's the kind of thing where you have to take the output of these testing tools and use your knowledge to figure out which of these aspects are really problematic, which of these are really slowing things down for users, and then take action to improve things in the right way. And that might be removing some features that Google has implemented because they do slow down your site.
So that's something where you have to take your knowledge and your experience, apply it to the output of these tools, and then make changes in the ways that you think make sense for your users. "Will making changes to a site template improve our declining site traffic or site rankings?" Yes, it will probably change your traffic and rankings. Obviously, if you do things really well-- you make things really nice-- then that might have positive effects. If you break things by making site template changes, then that might have negative effects. So making changes to your site template is essentially making a change on your website and something that we would pick up on with crawling and indexing.

BARUCH LABUNSKI: When does the design core algorithm kick in? I think it's once a year, right?

JOHN MUELLER: Which algorithm?

BARUCH LABUNSKI: Well, there's the design one, I guess, from the 200. Once a year--

JOHN MUELLER: I don't know if we do that. But as a part of, for example, Panda, where we do try to recognize high-quality sites, this is something that plays into the same thing, where if you have a website that looks really, really obsolete, and you're trying to say, well, this is like bleeding edge scientific information here, and you're presenting it in a way that looks like a FrontPage website from 1995, then that might not look as trustworthy as something that actually looks like you know what you're talking about. So that's something where I wouldn't focus on specific algorithms. I'd focus more on making sure that your website matches the image that you want to bring across. And sometimes that does mean updating a really obsolete template to something that looks a lot nicer, that works a lot better, where you see people go to your website, and they think, oh, well, I can really trust this website. It looks professional. It looks like they know what they're talking about-- compared to something where, oh, I don't know if this was made by some kid with some obsolete graphics program that just copied and pasted some content together, where it's the kind of thing where you might assume that the content isn't really that high quality.

BARUCH LABUNSKI: How often does the core algorithm roll out?

JOHN MUELLER: The core algorithm-- so we make changes to our algorithms all the time. And these algorithms run pretty much automatically. So that's not something where I'd say recognizing how the content is-- how a website is-- is done once a year or gated in any way.

BARUCH LABUNSKI: So the minute that a website changes, then it can affect the ranking positively or negatively.

JOHN MUELLER: Sure, like any other change you'd make on a website. And some of these are more of a technical nature. So for example, if you changed your internal linking significantly, then that might be something due to the template change that you've made, which could essentially affect the way that your site is crawled and indexed. So maybe we can find the content that was missing before. Maybe we can't reach that content anymore because it has such a fancy navigation that you have to go through like a search field or something like that. So these are things which essentially happen automatically all the time. "Will Google consider using HTTP2 as a ranking factor to speed things up?" I don't see that happening any time soon. HTTP2 is something that I suspect a lot of websites will just have enabled by their server automatically where you won't have to do anything special. And it'll just work in the background. So this isn't something where you have to artificially create something on your server to make that work. It'll essentially work transparently. One of the aspects here with HTTP2 is that a lot of browsers only support some of the features in HTTP2 if your site is already on HTTPS. So that's one thing that might count towards, well, I'll move to HTTPS as a preparation for making sure that my site works even better when HTTP2 rolls out to my server. "As a way of dealing with scrapers, I'd like to add the x-robots tag to my XML sitemaps and then permit crawling only to Googlebot and some other credible crawlers, but not to rogue crawlers or end users. Could that be perceived as cloaking?" No, I don't think that would be, in any way, problematic. With sitemap files, you can even cloak them directly to search engines. That's something that we would explicitly allow where if you test the IP addresses, and you see it's not a Google or Bing or Yandex or Yahoo or whatever IP address, you can serve a not allowed page if you want to.
So that's the kind of situation where this content is explicitly only for search engines. So you can choose to really explicitly only show it to search engines. "I've been trying to clear some manual actions due to unnatural links. I've just had two reconsideration requests refused. I had no sample links quoted in the reply. Is this now the standard?" No, I wouldn't say this is the standard. But sometimes that happens. So in that situation, what I would do is go to the help forums and get advice from some other peers who have gone through this process as well, which might help to point you in the direction that you'd need to work on. "Our source code was copied and an iFrame placed over it to create websites hosted on ad servers. We found over 50 sites that still show our links and link back to us. Is Google able to spot that this is malicious? And other than disavow, what can we do?" I think disavow is a great thing to do in a case like that. You might consider using the spam report tool as well, depending on what exactly is happening there. But in general, we recognize this kind of situation. And we should be able to handle it fairly well. If you see that we're not handling it well, where some hacked site is showing your content and ranking above you, that's something we'd love to hear about. So I'd post in the help forums for that. "Our site rankings have been under attack since the beginning of December. Someone copied our site and inserted some content on totally unrelated sites. Some of those sites are ranking higher than us. What can we do?" So, again, in this case, the web spam reports-- you mentioned that they're not helping, but that's one thing you can do. Another thing would be to post in the help forum so that we can get these escalations and look at them directly.
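John mentions testing IP addresses before serving a sitemap only to search engines. The commonly documented way to check that a request claiming to be Googlebot really is Googlebot-- rather than trusting the user-agent string-- is a reverse DNS lookup followed by a forward confirmation. A minimal Python sketch; the function names here are mine, not an official API:

```python
import socket

def is_google_hostname(hostname: str) -> bool:
    # Genuine Googlebot reverse-DNS names end in googlebot.com
    # or google.com, per Google's published guidance.
    return hostname.rstrip(".").endswith((".googlebot.com", ".google.com"))

def verify_googlebot(ip: str) -> bool:
    """Reverse-DNS the IP, check the hostname, then forward-confirm
    that the hostname resolves back to the same IP, so a forged PTR
    record alone is not enough. Requires network access."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]  # reverse lookup
    except socket.herror:
        return False
    if not is_google_hostname(hostname):
        return False
    try:
        # Forward lookup must include the original IP.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False
```

A request that fails this check could be served the "not allowed" page John describes instead of the sitemap.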

BARUCH LABUNSKI: Why doesn't he just DMCA it?

JOHN MUELLER: That's obviously sometimes another thing. But that's something where you'll probably have to look at the legal situation behind that first. That's not something where I could make a recommendation and say, well, you should take this specific legal action, because, obviously, I'm not your lawyer. I can't give you legal advice like that. "For a website, the robots tester in Search Console shows me that a page is blocked by this rule: disallow: *lightbox. Is there a syntax mistake?" Yes, all disallow and allow lines should begin with a slash. So in this specific example, it has disallow: and then a space, and then an asterisk and lightbox. So it would need a slash before the asterisk there. And then essentially that should work as expected. "In the last session, you mentioned hidden text is discounted. If relevant hidden text is placed under appropriate headers to make the page less cluttered and easier to use, why should that be discounted?" So from our point of view, the content that people see when they visit your page should be what your page is about. And if there is important content that people don't see when they open the page in the browser, then it feels like there's a mismatch there-- that you think it's important content, but at the same time, you don't actually want to show it to people by default. So that feels like kind of a mismatch there. And if you feel it's important content, then I'd recommend putting it on a separate URL so that we can index it there, so that it's directly visible there. And you can still use fancy JavaScript techniques to handle navigation within your site so that pages don't have to reload and take time to be refetched in the browser when people navigate between specific parts of your site. But by default, if you think something is really important for your page, make sure it's visible by default. "Am I correct in understanding that Google Search is now rewarding field topic experts while penalizing those with a university degree in the journalistic style of writing, which enables translation of fields into easy-to-understand copy for the masses?" I don't really understand this question. But I think it kind of goes in the direction of, why don't you give real journalists more visibility in search? They know what they're writing about compared to these people who are just writing for the web. And from our point of view, both sides could be generating great content and could be generating something that we'd want to show in search.
So it's not that we're explicitly saying that one type of content is better than the other. It's just that they're different versions of content out there. We try to rank them appropriately. "What, according to Google, is the best way to optimize an entity page, like some celebrity or any topic like the Nepal earthquake? What kind of content does Google expect is relevant for its users, and also helps us understand that this is what Google end users wanted?" There is really no simple rule with regards to how to create a great website or a great page on a specific topic. So that's something where you as a topical expert would need to bring in all of the information and knowledge that you have. Also, maybe information about your readers, about the people who are viewing your web pages, so that you make sure it kind of matches what they'd expect to find. So there is no simple rule where we would say, well, you need to have this much HTML and this clear structure here. Some sites are really creative in the way that they present the information. Other sites present it in a more straightforward way. And both of those sites have reason to show up in search. There is no one simple answer that solves everything. So take your information. Take the knowledge and experience that you have on this topic and present it in the way that works best for your users. "All of us here must compete with Google's promotion on its domain, its pages, and Facebook ads that the website owners choose among Weebly, Wix, et cetera for their web building needs. Can Google also add a line to promotion to use local search for developers/designers?" I don't quite understand this question. Maybe if you're listening in, maybe you can reformulate it and submit it again, or post it in the chat if you are here in the Hangout. But at the moment, I'm not quite sure what specifically you're looking for. "Google is not using the keyword metatag as a ranking signal anymore.
But is it used for anything else, as there still might be useful information in there? Or do you just ignore it completely?" As far as I know, we just ignore it completely for search. I believe maybe AdSense uses it, but I really don't know if that's still the case.
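Going back to the robots.txt question from earlier: the broken rule simply lacked the leading slash that disallow and allow lines require. A hypothetical corrected excerpt:

```text
# Broken rule (no leading slash), as quoted in the question:
#   Disallow: *lightbox
# Corrected -- path patterns must start with "/":
User-agent: *
Disallow: /*lightbox
```

With the slash in place, the pattern matches any URL whose path contains "lightbox", which appears to be the intent of the original rule.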




JOHN MUELLER: Google News. OK, Google News in that case. So if your site is listed in Google News-- and if that's correct, I don't know-- then maybe it makes sense to keep those there. In the past, I've also used the keywords meta tag just as a helper for myself to know that this page is on this topic and keep information focused like that. But essentially for web search, you really don't need it. You can put whatever you want in there. "How does Google treat SWF content? I run Flash gaming websites. And these games are distributed by game developers. How does Google then measure duplication of Flash games? The only way I understand it is to write a unique description of the game." Yes, I guess in some cases, we can actually pull out the content of Flash files and include that as a part of the web page. So that's something that sometimes we're able to do. Having a unique description for the Flash file definitely helps. With regards to sites that are essentially just a combination of existing Flash files-- that's, I guess, always a tricky situation, because you really need to provide something unique and compelling that's more than just kind of a spun version of the text. So from my point of view, what I would look at there is to make sure your site shines on the usability and on other aspects that are involved there, with regards to really bringing some unique value to people who are looking for these types of games. So that's something where I wouldn't necessarily try to rank for these specific Flash files, especially if you know that they're already duplicated multiple times across the whole web. "I revised one of my sites. The site has a few links to another site of mine. The URLs are not correct anymore. But there's a 301. Should I just correct the URLs and let the 301 do its job? Or does Google give more value to an aged link than to a new one?" From my point of view, whenever you can link directly, I'd prefer to link directly. It makes it a little bit faster for the user.
But from a practical point of view, if you're talking about a handful of links from one site to another, that's not really something where you'd end up seeing any visible change in the search results because of that. So that's something where, on the one hand, sure, it's a great idea to link directly wherever you can. With a 301, at least people are making it to the right content. And it's kind of like you could do it both ways. I wouldn't necessarily assume that you'd see any significant ranking change just by going one way or the other. "If a great number of URLs-- more than 1,000 URLs and 1,000 domains-- was removed from a disavow file, can it be seen as a link boom? Could it be understood by Google as a spam signal? For example, if the URLs were added by mistake more than half a year ago." I don't see any problem with that. That's not something where I would say this would be seen as a bad thing or as a good thing. It's essentially just a technical change that you're making. You're telling us that when we recrawl those URLs, we should take those links into account. And if those are normal, good links, then by all means, let us take those into account. "We've seen a quite substantial clickthrough rate drop in Search Console for apps for all of our accounts for the 16th and 17th of December, without the search appearance filter applied. It's a few days after the search appearance filter was added to Search Console. What can be the reason for that drop?" I don't know offhand. I'd have to take a look into the account to see what all was happening back there. The tricky part specifically around app indexing is that this is still fairly new. And we do make changes with the way that these apps are shown in search results. And some of those changes can have a significant effect on the clickthrough rate, on the impressions, that you're seeing for these apps specifically.
So that's something where I would kind of expect to see things fluctuate for a little bit until we actually figure out how to handle these apps in the search results for the long run. And I would assume that-- I don't know, I'm just making a guess-- in the next half year or year or so, things will have significantly settled down so that you wouldn't see such strong fluctuations just randomly happening. "Does excessive nofollow cause de-ranking of a website?" No. If you use nofollow on your website when linking to other sites, that's perfectly fine. If you use nofollow within your own website, then, of course, that could cause problems, where if we can't crawl your website and pass page rank internally, then that's something that makes it really hard for us to crawl and index. But if you're just meaning that your site has a lot of nofollow links to other sites, that's your business. That's the way you run your site. That's up to you. "When having multiple URLs indexed with almost duplicate content caused by a former relaunch, is it better to use a 301 redirect or to choose a canonical tag to fix that? Old pages will be noindexed as well but still shown." If you're moving URLs from one URL to another, I'd use a 301 redirect. If you have to keep them up in parallel for whatever reason, then use a rel canonical. But it's not that one version is better than the other. It's essentially, what are you trying to do, and what's the best tool for that? It depends more on that than on how Google would interpret that. "Internal links hidden with JavaScript, in case they're not blocked in robots.txt and are usable for Googlebot-- can this action cause any penalties from Google? Will page rank still be transferred through these kinds of links? Or will they work like rel nofollow links?" This wouldn't cause any kind of manual action from our point of view, as long as this is normal content on your pages. And these links will still pass page rank normally.
So a really common use case for this is if you have a kind of navigation on your site that has a menu where you click on the top-level menu item, and JavaScript renders the rest of the menu items. And from there, it links to various parts of your site. That's something we've had to deal with for a fairly long time. We're pretty good at figuring out how that's supposed to work.

MALE SPEAKER 2: John, if I may, I will add-- so those are just functional internal links. Those are site-wide links, like log into my account, track my order, stuff like this. So I was referring to those links.

JOHN MUELLER: Yeah, that's perfectly fine. I don't see any problem with that.

ROBB YOUNG: I think somebody else asked something similar in the chat, didn't they? Whether those links would be considered cloaking? Basically.

JOHN MUELLER: If you're generating those links with JavaScript, and we see them when we render the page, then they're essentially normal links. So it wouldn't be like cloaking.

ROBB YOUNG: And it says, in an effort to combat scrapers, we tested reverse links with jQuery. Would this be considered cloaking? Since when the link is hovered over by a real visitor, the real link is revealed.

JOHN MUELLER: OK, so if it's on hover, then it turns into a link. What would happen in a case like that is we probably wouldn't pick those links up, because Googlebot isn't going to hover over every part of the page. It'll pull out the page. It'll render it once like a browser. It's not going to interact with the page to see what's actually going to happen when they do specific things. So if you need those links to be found by Googlebot, then make sure we can find them when we load the page. If you just want to make them available for users, then sure, I think that might be an option. I think, in most cases, you wouldn't really want to do that. And if you are having problems with scrapers, then I'd try to find something different to attack that more directly than to obfuscate the links like this, which could essentially end up causing more problems for your website in search than the scrapers ever were.

ROBB YOUNG: OK, I was asking that on behalf of someone. I think they were already in the chat, actually.

JOHN MUELLER: All right, I'll have to check out the chat afterwards. "What are the top three stories you wish I didn't write on my site?" Oh, jeez, I don't know, Barry. Your site is so big. Where do I start? Oh, wait, that you wish I didn't write. OK.

MALE SPEAKER 3: You can skip it if you don't want to answer it. There must be one that comes to mind. I'm just curious.

JOHN MUELLER: I don't know. I can't think of anything at the moment.

MALE SPEAKER 3: It's all that good.

JOHN MUELLER: You've picked up so many things that it's going to be natural that every now and then, you'll run into something where I'm just, oh, god, Barry. But yeah, I don't know. I think that's normal if you end up picking up so much content about what's happening around search that there's bound to be one or another that one or another person might not appreciate. Overall, I think it's really helpful to have almost an encyclopedia of SEO of what's happening around the web.

ROBB YOUNG: He'd just deny everything anyway. What's the difference?

JOHN MUELLER: All right, sitelinks. "Is it possible to get more than six sitelinks when searching for a brand or company name?" I don't know. Maybe. I know this is the type of UI where people are always tweaking and seeing what makes more sense. But I don't know whether there are specific limits there or not. So if you've seen this in search, then obviously it's possible. If you haven't seen it in search, then maybe it's not possible at the moment. But maybe it's something that will happen in the future, or that has happened in the past, where we've tested it and said, well, this is maybe too confusing to users, too many. Or maybe we just haven't tested it yet.

"I own a large image website. Is it OK to have the meta title, image title, and product description the same until we get product descriptions more defined?" Sure, that's perfectly fine. What will probably happen is we'll just pick up one of these versions and keep that to show in the search results. But if they are all the same, then they are all the same. That's not something where we would trigger some kind of a keyword stuffing filter and remove those pages from the search results. It's essentially a technical issue and not an SEO problem.

"In Webmaster Tools, or Search Console, I guess, we saw most of the site URLs get removed from index status after changing from HTTP to HTTPS. Yet our analytics weren't affected this way. What's the best course of action to take at this point?" This is something that's really common. And we've seen a bunch of people get confused by this. But essentially, if you have your site as HTTPS or HTTP listed in Search Console, then that's the URLs we'll focus on. So if you move your site from one version to the other, you'll see the number of indexed URLs go down in one version and then go back up in the other version. So what you're probably looking at here is the HTTP version of your site, where it goes down when you move to HTTPS.
And if you add the HTTPS version of your site, then you'll see that it's been going up at the same time. So especially if you're saying the analytics weren't affected, then what's essentially happening is we're shifting the URLs from one version to the other version. And all your traffic is going with that as well. So that's not something you need to fix per se. But you can find the right information by looking at the right version of your site in Search Console.

"Is there some way to access and download all data from Search Analytics in Search Console, not only the last three months?" I think that would be a great feature. I totally agree with you-- that would be a great feature. At the moment, that's not possible. I know there are some tools that use the API to download this data step by step, where if you start downloading the data, then, of course, over the course of a couple of months, you'll be able to collect this data during that time. But those are essentially external tools that use the API to download this information.

"Some research papers aren't being properly indexed by Google Scholar and others." This sounds like something I probably need to have more details on. So if you could post that in the help forums, that would be a great thing. And I'll try to pick it up there.

"With the new AMP blog article markup, should that only go in the AMP-only pages? Or should it also be added to the original version as well? I notice the image objects aren't following the same rules as normal-- url appears to be replacing contentUrl." If you're doing something AMP-specific on your pages, then I would put that on the AMP pages specifically, not on the normal desktop pages.

"What's the optimal size or character length for a title?" I don't know. I would look at the search results and think about what would make sense for your pages. And this is something that probably depends a bit on your site, probably depends a bit on your content.
I try to keep the title to something that's not keyword stuffing, that describes what your page is about, so that when people see it in the search results, they can react to it as appropriate. But that's less of something where I'd say it's an SEO issue, and more of something where you present your site this way in the search results. And that should be fitting for your pages.

"In an AngularJS site, is the generated head part considered by Google? So for example, is it OK to generate a rel canonical through JavaScript?" Yes, that's fine. If you use JavaScript to update the head of your page, then as we render those pages, we'll be able to pick that up and use that as appropriate.

"Can we use the Google Knowledge Graph API for improving our entity pages? Like we have a page about Taylor Swift and want information about Taylor Swift. Can we fetch this data from the API and show it on our website?" Sure, if that's what you're using to provide additional value for your users, feel free to go ahead. If your website is only content that's scraped from other sites, then obviously that's something that I'd try to discourage. But if you're providing additional value to users through APIs, through feeds, or something like that, then by all means, go for that.

"SVGs can be included on a web page either through an image tag, an object tag, or embedding the SVG directly. Is there any difference as to how these will be indexed? Will all three approaches lead to the images appearing in image search?" I believe all three would work nowadays. It used to be that you'd have to host your SVGs as separate SVG files and embed them kind of like images. But I believe all three of these variations will work now.

"Any idea when the real-time Penguin is coming?" No, not yet.

"I am the webmaster of-- We have an AngularJS system online for about a year. The sitemap exists in Search Console-- still seem to have indexing problems.
Our external links are rare, and the internal link structure needs optimization. Do you see any problems with Angular?" By default, I don't see any problems with Angular. I know there are some really good Angular-based websites out there that do fairly well in search. Of course, there are probably ways that you can set up an Angular website that don't work that well in search, like you could do with any other kind of framework. So what I'd recommend doing there is maybe taking your site, posting it to the help forums, or posting it somewhere else where webmasters and SEOs hang out, and getting other people's feedback on your site specifically. Maybe there are simple things you can do to improve it. Maybe you are doing everything right, and it's just a matter of having an even better website to present to users through search engines.

"I am new to SEO and how Google works. I manage my family's taxi company. We are a local company. We have a business page. And we have our contact details on a contact page. Also, we now have an SSL certificate on our site. Will this help us rank better?" So an SSL certificate, essentially moving the content to HTTPS, does help a tiny bit. But it's not something where you'd see a significant ranking change in the search results. So in a situation like this, I would consider really working on your pages directly. I'd probably also get help from people in the help forums, because they've seen a lot of these types of small business websites and can give you a little bit of information on what you could be focusing on and what you shouldn't be focusing on. In some cases, maybe they'll even tell you you shouldn't be focusing on the website at all, because this niche is way too competitive for someone new to just jump in and try things out. Maybe it makes sense to be a bit more creative and drive traffic to your site in other ways.
So this is something where there's no simple rule that works for all websites, where it's really best to get other people's advice on your website, your specific business, and what you're trying to achieve.

Let's see-- a few questions left. "What words of encouragement would you give webmasters who do the right thing, such as build a great user experience website-- HTTPS, mobile-friendly-- and work countless hours on their websites, only to be outranked by those who use black hat techniques?" That's always a frustrating situation to be in. That's true. What I would recommend doing there is maybe also going to the help forums and checking with the other people there, to see that you're not missing something totally obvious and to see what you can do to improve your site in general.

The thing also to keep in mind here is that sometimes these sites that use black hat techniques also do a lot of things really, really well. So the website might be doing some kind of sneaky stuff on the side, but otherwise has a really fantastic website. And in general, our algorithms would try to recognize that sneaky stuff, ignore it, and rank the website based on the other aspects that are involved there. So just because they're doing something sneaky doesn't necessarily mean that they're profiting from that or that they are ranking because of that. Maybe they're ranking essentially despite the sneaky stuff that they're doing on the side. So if you recognize the sneaky stuff that people are doing, then that's a good sign, because that means you'll probably avoid those kinds of sneaky things as well. But that doesn't necessarily mean that your website will automatically rank above these other ones. So continuing to work on your website, and really taking the time to take an honest look at your website, at your business, at how things are working, always makes sense in my view.

"I'd like to ask a question about link badges.
So images with keyword alt tags for many sites en masse-- for example, members of a forum. This looks a lot like unnatural link building to me. Is this method not recommended?" I think you'd probably have to look at the individual cases for something like that. I haven't really seen any cases where this would significantly cause any problems or any significant advantages for a website. I think if you're the type of website that has a really strong community behind it, that's always a good thing to have. And that really helps your website over the long run, because these are people who will come back to your website regardless of whether or not your site is shown in search at all. So that's something where I'd say it's a good thing to have this strong community behind you. And just because they're using badges to promote their part of this community, I don't know if that would necessarily always be a bad thing or always be a good thing.

"Do you use the blockquote tag as a signal? If I quote another source with a blockquote tag and link back to it, do you recognize that?" No, we don't-- at least as far as I know, we don't take that into account at all. So if you quote another website, then we'll probably pick up that this is a quote from that website. But it's not going to be because of the blockquote tag. You can use other types of markup there if you want.

"Two months after a domain change, our rankings are normal. But we don't see the rich snippet star ratings for reviews. Does this mean Google needs more time to determine the credibility of a new website, or how long can that take?" It's hard to say, because sometimes we do re-evaluate the quality of sites in general. And maybe you've just coincidentally hit one of those times where we updated our view of what we would want to show in rich snippets and what we wouldn't.
So it's not necessarily related to a site move that you see these rich snippet star ratings or that you don't see them any longer.

All right, wow! I ran through those. We still have-- what is it? Almost 30 minutes left. So what's on your mind?
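On the earlier question about downloading more than three months of Search Analytics data: the external tools mentioned work by querying the API on a schedule and archiving each day locally before it falls out of the roughly 90-day window. A rough sketch of the date bookkeeping--`fetchDay` is a hypothetical wrapper, and the real Search Analytics API call it would make is only indicated in a comment:

```javascript
// Produce each date between start and end (inclusive) as "YYYY-MM-DD",
// the format the Search Analytics API expects for startDate/endDate.
// All arithmetic is in UTC to avoid daylight-saving surprises.
function dateRange(startIso, endIso) {
  var days = [];
  var d = new Date(startIso + "T00:00:00Z");
  var end = new Date(endIso + "T00:00:00Z");
  while (d <= end) {
    days.push(d.toISOString().slice(0, 10));
    d = new Date(d.getTime() + 24 * 60 * 60 * 1000);
  }
  return days;
}

// Hypothetical archiving loop: query one day at a time and keep the rows,
// so history accumulates locally beyond the 90-day retention window.
function archive(fetchDay, startIso, endIso) {
  return dateRange(startIso, endIso).map(function (day) {
    // fetchDay(day) would wrap the real API call, roughly:
    // searchanalytics.query({ startDate: day, endDate: day, dimensions: ["query"] })
    return { date: day, rows: fetchDay(day) };
  });
}
```

Run daily, this collects the per-day data "step by step," which is all the external tools are really doing.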

BARUCH LABUNSKI: Regarding private blog networks, how does Googlebot determine that?

JOHN MUELLER: I don't know.

BARUCH LABUNSKI: If somebody, I mean, like if there's spam going on with something like that, how do you guys deal with that?

JOHN MUELLER: I would imagine we do that the way we handle most of this kind of link spam as a mixture of automatic and manual action that we'd have to take there. So I don't think there's one simple recipe for how Google recognizes link spam.


JOHN MUELLER: Let me just double-check what was happening in the comments here. It looks like Chrome keeps crashing for some people, sorry. "Will it take much longer to rank for domains that have been previously parked for many years?" There is probably a period of time where we consider something still to be parked, even though it has content now. But usually, that should clean up fairly quickly. So it's not that there's a set time frame there that things have to wait out.

"There are parts of our website I'd like to index. But I have no interest in boosting it for the results page." I don't quite understand your question there, Kenny. So if you can post that in a different way, I'll try to pick it up for the next round.

JOHN MUELLER: All right, what have I missed in the chat? Lots of stuff happening.

MALE SPEAKER 4: OK, John, can I ask a question?


MALE SPEAKER 4: You just said that-- I mean, if somebody is actually using black hat techniques and saying also he's using the good techniques, don't you think that Google should not even look at the good techniques he has put because he's already using the black hat techniques, because we are struggling all-- we are putting all the rules which you tell us. And we don't use any black hat and stuff like that.

JOHN MUELLER: I don't know. From my personal point of view, I would prefer it if we could recognize the problematic parts and just ignore those and focus on the rest, because I know there are lots of people out there that have followed bad advice in the past, that maybe have had their website set up in a bad way by maybe an older SEO from older times when maybe this worked a little bit better. And it's not that these people are explicitly trying to do something bad and harm users and spam the search results. It's just that they've gotten bad advice. They've done something stupid. And they don't even realize that this is a problem. So in all of those cases, I think the smarter move would be to just say, well, we see that this sneaky stuff is happening here, but we can ignore that. And if we can be certain that the quality of our search results isn't impacted by people doing this kind of sneaky stuff, then we'll try to ignore it. On the other hand, if we said, well, any type of problem on your website will cause your website to be removed from Google permanently, then I think we would have pretty empty search results pages, because lots of people do things accidentally. And it's not something where I'd say just because you accidentally did something wrong, we should remove your website completely. But obviously, finding the right balance there is very hard. And especially in situations where we're not sure that we're able to really isolate this one problematic thing completely, it's a lot harder for us to figure out what the appropriate reaction would be.

MALE SPEAKER 4: OK, thank you.

BARUCH LABUNSKI: Is it a problem if you block your whois information?

JOHN MUELLER: I don't think so.

BARUCH LABUNSKI: There was a lot of criticism out there for a long time that if you do that, it's not a good thing-- that you guys don't look at it as a good thing.

JOHN MUELLER: I don't think we take that into account at all.


JOHN MUELLER: If you want to block that, from my point of view, that's fine. Sometimes it's really useful to have that information. So I use it sometimes when we get an email from the indexing or crawling people internally who are saying, wow, this website is doing something really, really wrong-- we need to let them know about that. And sometimes the whois information leads us to the right people who can actually fix that problem. So from my point of view, I like having that information. But it's not something that I'd say we need to have for search.

BARUCH LABUNSKI: No, it's just that if it was registered by a lawyer, you don't want your lawyer's info and all that stuff. So I just wanted to know if it's a problem.

JOHN MUELLER: Yeah. I don't know. I wouldn't see that as a problem.

BARUCH LABUNSKI: Making it transparent is even better, yeah?

JOHN MUELLER: I don't think we would take that into account at all for search. So if you think it helps for your users, fine. If you prefer to stay anonymous, or stay hidden behind that whois domain privacy, that's fine too.

ROBB YOUNG: Is that not part of-- unless this is also not true-- I read that having a PO box or not real addresses on your site can also hurt. Is there any Google Maps ranking effect?

JOHN MUELLER: That's probably just for local search results. But I don't know how local search handles that.

BARUCH LABUNSKI: For Google Places, you need a real address. You can't have a PO box.

JOHN MUELLER: I don't think we take that into account at all for search.


MALE SPEAKER 2: How about the duplicated content? Back to my question about the car-related website, if you remember, John.


MALE SPEAKER 2: So I was just wondering if you had time to take a closer look at it, or possibly did you come up with some kind of advice for us.

JOHN MUELLER: I'd have to take a look again, sorry.

MALE SPEAKER 2: Should I resend it to you?

JOHN MUELLER: Sure. Sure. You could always ping that again.

MALE SPEAKER 2: Yeah, because we just can't sleep because we think there are some manual actions. That's why we're wondering.

JOHN MUELLER: If there were manual action, then that would only--

MALE SPEAKER 2: Sorry, I meant automatic.


MIHAI APERGHIS: Hey, John, Google AdWords allows you to track calls to a phone number on a website by using a script that changes the phone number on your website for visitors that come through ads, for example. Do you think that would be possible, or something that we can maybe expect to be available for SEO as well-- for people coming from organic search, to maybe track calls via a Google forwarding number as well? I know there would be some issues regarding changing the phone number, cloaking, things like that. It would be interesting to see exactly how many phone calls or leads a person gets from SEO efforts-- organic results, organic traffic.

JOHN MUELLER: Probably, I would expect that you can already do that. I assumed people were already doing that. So from my point of view, I think that would be unproblematic. The tricky part, of course, is if you have things like-- what is it? The local business markup on your pages. And you're marking up your phone number like this. And the phone number is changing from time to time. Then that might make it harder for us to really figure out which local business phone number we should really be associating with the site. But if you are doing this essentially in a script that we never see, then I don't see a big problem with that. It's not that you're significantly changing your site and cloaking something to your users. You're essentially just swapping out the phone number and using a session ID there. So I would assume that would be unproblematic.
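A call-tracking swap of the kind being discussed is usually a small client-side script along these lines. This is a sketch with invented names and numbers--AdWords' own forwarding numbers work differently; this only shows the swap mechanic--and, per the answer above, it leaves any structured-data phone number alone and only changes the displayed text:

```javascript
// Decide which number to display: the tracking number for visitors who
// arrived via an ad (AdWords appends ?gclid=... to landing page URLs),
// otherwise the normal business number.
function pickPhoneNumber(url, defaultNumber, trackingNumber) {
  return /[?&]gclid=/.test(url) ? trackingNumber : defaultNumber;
}

if (typeof document !== "undefined") {
  var el = document.getElementById("phone"); // hypothetical element
  if (el) {
    el.textContent = pickPhoneNumber(
      window.location.href,
      "+1-555-010-0000", // number in the markup and any structured data
      "+1-555-010-0099"  // tracking number shown only to ad visitors
    );
  }
}
```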

MIHAI APERGHIS: OK. It would be nice to see an official script, like Google AdWords has, that specialists or webmasters can implement on a client website.

JOHN MUELLER: I doubt you'd see something like that coming from our side, because from search, we'd obviously like to see things as straightforward as possible. But I suspect this is something that could be trivially done with jQuery, where you just swap out an element on a page and show a different number instead of the existing number.

MIHAI APERGHIS: Well, yeah. It's just that Google AdWords offers a Google forwarding number and allocates it automatically. And it has all those advantages-- you don't have to figure out what phone number you need to switch, and things like that.

JOHN MUELLER: Yeah, I don't know how they do that. Maybe it's worth trying it out and doing a blog post about it, letting us know how it works.

MIHAI APERGHIS: One more thing regarding Search Console-- Search Analytics specifically. This is more feedback, I guess. You have that average position metric, which is aggregated over all the keywords and their positions. I don't know if it's very helpful in certain situations. For example, I create a new landing page. And right after I create it, I start ranking for certain keywords regarding that landing page. But it doesn't rank high right away. So my average position might actually be lower versus last month. But I didn't lose any positions for the other keywords. So it might be a bit confusing for certain webmasters, thinking, oh, OK, so Google started to downrank me, or I lost rankings, or something like that. Maybe something like an average position change, or changes in position-- something like that. Maybe that would help a bit more.

JOHN MUELLER: I don't know. It's hard. I think especially with the ranking information, it's something where more and more you really have to be able to interpret that information properly. And the situation you mentioned is one aspect, where you add a new part to your website, or new content to your website. And obviously, for those words, it won't be ranking that well in the beginning. That's one aspect. The other aspect is sometimes the different search result elements on a page, where maybe there'll be, I don't know, 15 entries on a page that could be on page one-- where ranking 15 is still page one. And other times, maybe your ranking is number 9 and you're actually on page 2. So these kinds of additional aspects come into play with all the modern search results pages. That's something where you really need to take the time to interpret the information that you're getting out of these tools and make sure that you're pulling the right results out of that.
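The effect Mihai describes drops straight out of the arithmetic: the aggregate average position is impression-weighted, so a new page picking up low-position impressions pulls the site-wide average down even though nothing else moved. A toy illustration (all numbers invented):

```javascript
// Impression-weighted average position, roughly how the aggregate
// "average position" metric behaves.
function avgPosition(rows) {
  var impressions = 0;
  var weighted = 0;
  rows.forEach(function (r) {
    impressions += r.impressions;
    weighted += r.position * r.impressions;
  });
  return weighted / impressions;
}

// Existing queries hold steady at position 3.
var before = [{ position: 3, impressions: 1000 }];

// A new landing page starts ranking at position 40 -- nothing else changed,
// yet the site-wide average jumps from 3 to about 9.2.
var after = before.concat([{ position: 40, impressions: 200 }]);
```

So a "worse" average here reflects new impressions at low positions, not lost rankings on the old keywords.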

MIHAI APERGHIS: Right. And regarding that same metric, I know that for certain queries where there is very low search volume, you don't show them in the queries report. Does the average position take that into account? For example, if there's a keyword that hasn't been searched that month, its position isn't taken into account versus last month, for example-- there are no impressions for that keyword.

JOHN MUELLER: Well, if there are no impressions, then there would be no ranking information for it.

MIHAI APERGHIS: OK. So it's not like you're tracking the actual keywords.

JOHN MUELLER: No. No, that's, I guess, sometimes part of the confusing thing there. It's not that we're tracking the rankings. It's that we're tracking what we actually show to people. So it could be that sometimes personalized information is in there, where a personalized ranking at number one could be very visible for some of these, which might skew your average ranking. But that's essentially just because that's where we showed it this time. And it's not that we have any kind of a fixed ranking for a page in a query. It really depends a lot on the user, on personalization, their location, all of that.


BARUCH LABUNSKI: Do you guys ever plan to take Google PageSpeed Insights and make it a ranking signal?

JOHN MUELLER: Google PageSpeed Insights-- which one do you mean?

BARUCH LABUNSKI: Where it shows if it [INAUDIBLE] code-wise. [INAUDIBLE]

JOHN MUELLER: Like the speed information?

BARUCH LABUNSKI: Yeah, exactly. So remember we talked about passing the 83, as they were mentioning at Google I/O. So just wondering, are you guys ever going to-- if it's necessary.

JOHN MUELLER: I don't know. "Are you ever going to" is a really tricky question, because all I can say is maybe. But I think speed is probably something we'll be taking a look at more with regards to mobile specifically, because we see more and more sites move to something that's kind of mobile-friendly by design, but still almost mobile-hostile with regards to usability on mobile devices. I think speed is one of the next aspects that's going to be more relevant there, where we can see--

BARUCH LABUNSKI: I agree with you, because I had that issue where I really wanted to know-- I was in Toronto. I really wanted to know what the weather was on this one specific website locally. And it wasn't working, because the signal at the hotel was really bad. So reaching that site took me over a minute.



JOHN MUELLER: Especially on mobile. That's where I think the next big hurdle is. And that's something where we see a lot of sites shift their UI to something that works on a mobile device. But actually, every time you open a page on that site, it's over 1,000 requests. And it takes a couple of minutes to actually load all of the content and render it in a way that's actually visible. So I think speed is potentially something that we could look at there, where we could say, well, this is a really fast mobile site-- therefore, we should make sure it's really treated as mobile-friendly. Whereas this site renders on mobile, but it's a really terrible user experience-- really, really slow. It has a lot of different embeds. It does everything wrong with regards to making something that works quickly and snappily on a mobile device. So maybe we shouldn't treat it as mobile-friendly.

BARUCH LABUNSKI: So if it's over five seconds, then it's a problem.

JOHN MUELLER: I don't know where we would draw that line. That's really hard to say there. But this is something that you see when you interact with websites yourself as well, where if you're on your phone, you're trying to do something, and you can load the page-- or you can see that it's loading, but it just takes forever to actually load-- then that's not a website you're going to stick around on. And if Google points you at that website and you know there's an alternative that's really snappy, then you'd almost say, well, Google, why do you hate me as a user? Why do you send me to the site that's so terrible on my mobile phone?


MIHAI APERGHIS: John, any chance I can ask you a site-specific question?

JOHN MUELLER: All right.

MIHAI APERGHIS: It's a publisher website. It's a pretty big website. And we've been working with them for three years now. But just for the past six or seven months, traffic has started trickling down, maybe 5%, 10% per month. And it's really puzzling. We cannot find a specific issue. I know there's stuff to be worked on, of course. It's just, I was wondering, maybe you can see something-- you should first focus on this, maybe this is a problem. It's fine if there's nothing you can say, either. But I had to try.

JOHN MUELLER: I'd probably have to take a more in-depth look there. I don't see anything offhand where I would say, well, you obviously forgot this one stupid thing-- therefore, your website is dropping in search. Sometimes, it's just the way that the whole environment is shifting over time, where maybe there are some really strong competitors that are coming up, or maybe people just aren't looking for that type of content anymore. It's really hard to nail these slow changes down.

MIHAI APERGHIS: Right. Well, one of the things we are trying to implement is having a static URL for reviews, because we do car reviews and things like that. And every time there's a new model, the URL changes. And we're trying to get a static URL from year to year so it keeps all of those signals. We're also trying to implement AMP right now-- see how that goes.

JOHN MUELLER: Yeah. I suspect both of those will help a little bit. But I don't think you'd see a really strong jump from either of those, because those are like tweaks that you're making on top of the existing website. And they don't significantly change the whole model and the whole environment of how people interact with your site, how they want to find it in the search results. That's something that sometimes just changes over time. And finding out why it's changing isn't always due to a specific, SEO-technical type of thing on your website. It's maybe just because people are changing and doing something else instead. And maybe that's something else that you can also do on your site. Maybe that's something where they say, well, we're going away from cars, and everyone is just taking hoverboards now-- therefore, all car websites are obsolete. These types of things can always happen.

MIHAI APERGHIS: Sure. Well, we did check our competitors and try to see where they stand. And we know there's a lot of traffic potential there-- a great deal of potential. So I know this is a subject that people are still looking for. I'm just trying to figure out how to, as you said, target them to see what exactly they are more interested in, I guess.

JOHN MUELLER: Yeah, I guess what I would do in a case like this, where you're seeing this kind of steady decline, is really do some user studies to figure out what is actually happening there-- to see if these other sites are just ranking subtly better than your site, if it's just a matter of ranking, or if it's a matter of your site not really matching exactly what they're looking for-- and really trying to figure out where does that mismatch come from, where do these changes happen. Because especially if it's something that's subtly happening over several months, it's not going to be an SEO problem. It's not going to be something where you can just say, oh, I forgot the rel canonical here-- I'll fix that, and it'll jump back up again. It's probably more of a general problem than that.

BARUCH LABUNSKI: What about recording the users? He can record the users and watch their behavior and maybe make changes.


BARUCH LABUNSKI: You know, using pop-ups, like OptiMonk, which will reduce the bounce rate.

JOHN MUELLER: I don't know if using pop-ups would be the right thing. But doing user studies by recording what they're actually doing on your site, that's definitely a good thing. Maybe doing a lot of A/B testing to see what kind of content performs better, what kind of UI performs better for users-- where can you get more than just a few percent out of it. All these things, I think, would be really useful to help understand what is actually happening there.

MIHAI APERGHIS: Thanks. By the way, regarding AMP, do you think that once it's launched, it will affect a lot of queries? Maybe you know a percentage?

JOHN MUELLER: I don't know.

MIHAI APERGHIS: But you mentioned it will first focus on newsy queries.

JOHN MUELLER: Yeah. The demo is specific for, I think, the news box that we have, where some sites do see a lot of traffic from that. And that's something where generally people kind of click through it. They want to read an article and see different views. So that's something where if you have, I don't know, in your case, car-specific news, and you provide it in that way, and we show it in the news box-- that's probably something where you would see a bit more traffic, just because people are able to actually see this content a little bit faster and then interact with it directly a little bit faster.

MIHAI APERGHIS: OK, and what happens with those sites that don't implement AMP for those queries? I'm guessing they will see a pretty big decrease.

JOHN MUELLER: I don't know. I mean, on the one hand, we don't want to use specific technical elements as a way of saying, well, you have to implement this. Otherwise, we won't show you in search. On the other hand, if we show things like a carousel with AMP content and your site doesn't have AMP content, then we can't include you in that AMP carousel just for technical reasons, because we don't have that content that we can actually show there. So that's something where maybe some of these sites will see a change in their search traffic. Maybe other sites that wouldn't traditionally be shown in there anyway won't see any change.

MIHAI APERGHIS: OK, one more thing regarding AMP-- you mentioned, so again, newsy-type stuff. What if I implement it on an e-commerce website and do a static page with product information? And there's maybe a button that leads to the actual website, maybe Add-to-Cart or something like that. I'm guessing this won't show up right away, because you're not showing it for product-related or e-commerce-related queries--


MIHAI APERGHIS: But further down the line, maybe?

JOHN MUELLER: Yeah, maybe further down the line that's always possible. The other thing to keep in mind with AMP is that it's not just Google-specific. It's an open-source platform. It's also being used, I think, by Twitter, Pinterest, LinkedIn, for example. So if your content is linked to from any of those other places, and you think that your AMP version would be a good match for that, then maybe it does make sense to show that even for things that aren't shown in the Google newsy-type environment. So if you have something, for example, that's shared a lot on social media, that's used a lot on Twitter, then maybe this is something where you would see a significant boost out of that as well. And the neat thing I find about the AMP content is that you can include analytics in there as well. So you can see how many people are previewing your content through the AMP version. And you can make a judgment call based on that over time, where you say, well, maybe I'll just do 10% of my content like this to start off with. I'll watch the analytics and see if it's really something worthwhile for me or not. And then make a decision based on that afterwards.
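[Editor's note: as a rough sketch of the analytics John describes, an AMP page can include the amp-analytics component. The account ID below is a placeholder, and the exact configuration options should be checked against the current AMP documentation:]

```html
<!-- In the page head: load the amp-analytics component -->
<script async custom-element="amp-analytics"
        src="https://cdn.ampproject.org/v0/amp-analytics-0.1.js"></script>

<!-- In the body: track pageviews on the AMP version.
     "UA-XXXXX-Y" is a placeholder analytics account ID. -->
<amp-analytics type="googleanalytics">
  <script type="application/json">
  {
    "vars": { "account": "UA-XXXXX-Y" },
    "triggers": {
      "trackPageview": {
        "on": "visible",
        "request": "pageview"
      }
    }
  }
  </script>
</amp-analytics>
```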

MIHAI APERGHIS: Yeah, sounds good.

JOHN MUELLER: But as always, with a lot of these new technologies, I think it's something where it's really useful for SEOs in particular to at least have some experience with them. And another aspect there, of course, is that if you're the first people within your niche to implement this kind of thing, then you'll be the ones who will be able to profit from that first. So obviously, there's always a little bit of a risk involved there as well, in that maybe you spend a lot of effort implementing something that nobody actually notices. But I think as an SEO in the beginning, that's something where you build up a lot of experience as well. And when clients come to you and say, should I be doing AMP? You can say, well, yes or no. You can make a clear judgment call based on your experience rather than just on, well, I read this one blog post that said maybe you should, and this one site said maybe you shouldn't, but I don't really know. If you don't have any experience, then it's really hard to give these people an educated piece of advice.

MIHAI APERGHIS: Yeah, thanks.

BARUCH LABUNSKI: John, every year you recommend what webmasters should do in the coming year. But where is Google going in 2016, this being the last Hangout for 2015?

JOHN MUELLER: Where is Google going? I don't know. I hope they keep the offices here because these are really kind of comfortable. But past that, I don't know. I see a lot of talk about things like having an assistant, which I think is really interesting. I don't know how that will work with regards to websites in general, like if we have to pick up special markup to make that work really well. But I think that's kind of a rough, very futuristic direction where things could be headed, where we see people using search not just to get an answer to keyword-type queries, but really to answer a specific question that they have like, should I bring an umbrella to the office tomorrow? Is it going to rain? What do I have to do next week? I want to go on vacation in Canada. What should I watch out for? Where should I go? These type of things where Google has a lot of information where we can theoretically combine that information and really help guide a user to do something that matches what they're trying to do, but where at the moment, search falls short because we're looking more at the keywords in the queries rather than trying to understand the actual intent of what these people are trying to do. So I kind of suspect what will happen to make that more possible is that things like structured data markup will become more relevant again, where if we can pick up the entities better on a page, understand what actions could be done on a website, it'll be a lot easier for us to actually guide people there and say, well, you are looking for information about Canada. Here is this really great website with opportunities in Canada, since it looks like you're not just trying to go on vacation, but trying to move there, for example. So having this structured data markup on these pages I suspect will become more and more important over time.
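[Editor's note: as a rough illustration of the structured data markup John expects to become more important, a page can embed schema.org entities as JSON-LD. All the values below are invented for the example:]

```html
<!-- Hypothetical schema.org markup describing the page's main entity -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Moving to Canada: A Practical Guide",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2015-12-29",
  "about": { "@type": "Country", "name": "Canada" }
}
</script>
```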

BARUCH LABUNSKI: Do you have any idea why when you move a site to HTTPS, you can't use the Data Highlighter right away?

JOHN MUELLER: Because these pages aren't cached yet.

BARUCH LABUNSKI: OK, so just wait another couple weeks?

JOHN MUELLER: Yeah, so the Data Highlighter relies on the cached pages. And if you move from one version of a URL to another, then we have to re-index and recapture those pages.



MALE SPEAKER 5: I have a question in the Q&A for you if you [INAUDIBLE].


MALE SPEAKER 5: About the multi-language site.

JOHN MUELLER: Let me switch over again. "Why does a multi-language site take more time than a single language site to index in Google and gain some search results?" That's a hard question. I think from a technical point of view, one of the difficulties there is we have to crawl and index these multiple URLs first. And we have to understand them and the connection between these pages first. So if you have one page, then that's a lot easier for us to understand than maybe 10 different pages that are in different languages or have different country focuses. So that's something where it will always, coming from a technical point of view, take a little bit longer. From a practical point of view as well, if you dilute your information too much by putting it onto too many different versions of pages, then it's a lot harder for us to get signals for each of those individual pages. Whereas if you have one really strong page on this one topic, then that's something we can collect signals for fairly quickly. But if you have 10 really good pages on one topic in different languages, then it's a lot harder for us to understand, well, this page is really good on this topic, and it's available in these different languages. So it's sometimes mostly a matter of time, but also a matter of knowing when to concentrate things on one URL compared to splitting things up onto multiple URLs.
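[Editor's note: the connection between language versions that John describes is typically declared with hreflang annotations in each page's head. The URLs below are hypothetical:]

```html
<!-- On each language version, list every alternate, including itself -->
<link rel="alternate" hreflang="en" href="https://example.com/en/page" />
<link rel="alternate" hreflang="de" href="https://example.com/de/page" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/page" />
<!-- Fallback for users whose language isn't listed -->
<link rel="alternate" hreflang="x-default" href="https://example.com/page" />
```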

MALE SPEAKER 5: OK, so assuming that all the signals, like the meta language annotations, are correct, I only need some patience.

JOHN MUELLER: I would say patience. But maybe also take a step back and think about does it really make sense to start with all of these different language versions. Or can I start with maybe two or three different language versions in the beginning and expand from there, because it's a lot easier to have one or a small set of pages that are really strong compared to having a lot of different versions of the same pages.

MALE SPEAKER 5: OK, so we should just first maybe begin with two or three languages and then in the future, add some more.


MALE SPEAKER 5: Because I was thinking-- I was thinking to choose some languages where the competition maybe is a little lower.


MALE SPEAKER 5: And then of course plan the English language also and then in the future, add some more. OK, thanks a lot.

JOHN MUELLER: That sounds good.

MALE SPEAKER 5: And have a great New Year.

JOHN MUELLER: Thanks, you too.



ROBB YOUNG: Can I possibly ask about our site again since it's our two and a half year anniversary? Is there any, without giving me any specifics because I know you can't-- is there any change in that underlying issue? Is it something that's being looked at, discussed, changed? Or is that now an absolutely permanent change in an algorithm that is never going to change again? Because when I spoke to you a year ago or two years ago, you said it might. If you think of all the things that have changed over the last two years, there are obviously a lot of ranking factors that have changed. So is it something that might?

JOHN MUELLER: It might. These things might always change. I mean, you moved your site to a different domain now, right?

ROBB YOUNG: Yeah, we have. But that also hasn't really changed in ranking in the last 12 months at all. So we've oscillated between having some connection to it versus no connection to it at all. And sometimes we use hreflang to relate one to the other, because at Christmas, we need the seasonal-- it bizarrely gives a seasonal boost. But then it looks like Google catches us again and says, no, we know what you're doing. Come on. So then we have to remove it again. But at Christmas, that's all-important, because the conversion rate is 10 times what it is normally. So we almost have to use it every 9, 10 months to get merely a short boost. And we definitely want to distance ourselves the rest of the time. But it seems like we'll never get away from it. We literally can't survive without that once-a-year association, because the natural position for the other site doesn't change no matter how many quality links we get or how much content we put on it. It makes no difference. It just stays static, totally static.

JOHN MUELLER: The new site is the one with an E, right?

ROBB YOUNG: Yeah, it is, yeah.

JOHN MUELLER: I really don't see anything holding that site back.

ROBB YOUNG: Even though that's currently got the hreflang from the old one.

JOHN MUELLER: I don't see anything specific holding that back, no.

ROBB YOUNG: But the old one is still in your special file.

JOHN MUELLER: That's what I said, yes. But I see it going up and down. I guess that's the hreflang things that we're doing now.

ROBB YOUNG: Yeah, and also in December it goes up a lot and then down after the 25th, because we're a gift company. So we don't have a quiet period then.

JOHN MUELLER: This is something where, on the one hand, we try to do what makes sense for search results there. And we do occasionally take things to different parts of Google to kind of look at and say, hey, this is really obsolete. We need to either get rid of it or update it or change it or whatever. And I think your site, at least your old site with the way that it's handled there, falls into that situation. And this is something that, from our point of view, we see extremely rarely, which makes it really hard for us to push that appropriately with the teams internally and say, hey, this is affecting a lot of sites. We need to change this really quickly. It's kind of a tough situation to be in, I guess.

ROBB YOUNG: I know. We've lost [INAUDIBLE] of our revenue. And it's hard not to take it seriously and not ask every time I'm on these calls, where I try to contribute to other stuff. But every so often, I like to ask, because we're not doing anything-- you know we're not doing anything inappropriate.

JOHN MUELLER: This is something we do take up with the other teams again. So it's good that you keep asking because we do keep pushing these things at our teams too.

ROBB YOUNG: But is it really only us? Or is it something that can affect other people? Because I do get other people asking me, what's wrong? Because they want to avoid the same issue. But I can't say, well, don't do this or that, even within the Webmaster Forum, because I've posted in there before. But the problem is that everyone jumps straight on it, saying you've got these four backlinks, and no, it's not that.


ROBB YOUNG: It's vicious in there.

JOHN MUELLER: I think in the last year, when I look back, I've seen maybe one other site that ran into something like that. So it's something where for the most part, when we look at the evaluations for that specific change there, they're saying, well, it's working as expected. It's doing the right thing. And it's really these kind of cases where we say, well, it's not doing the right thing where we tend to push back on the teams as well and say, hey, we need to find a solution here and figure out how to do this in a different way.

ROBB YOUNG: But is it really only one other site reporting that issue? Because, obviously, I could have just listened to the advice in the forum and said, OK, you're right. It's that blog we wrote that time with all those footer links. And then no one would've raised it further up. So could there be hundreds of sites? Or are there only one or two, but you only hear about the one?

JOHN MUELLER: I don't know. It's hard to say, because these types of algorithms are meant to affect the web in a broad way. We wouldn't make an algorithm that would only affect five handpicked sites, because then it wouldn't be an algorithm. Then it would be something manual, right?

ROBB YOUNG: No, and I understand from a user perspective, people are still finding our competitors. So it's not affecting users. I'm realistic enough to know that, and I have to try and maintain my calm, because users find someone else. But there are only two or three other sites. The frustration is when you look at the results and a lot of those sites are utter rubbish. But users don't care, because they still find something. It's not like they Google something and there are 10 empty results. That doesn't happen.

JOHN MUELLER: I don't know. I'll pick it up again.

ROBB YOUNG: If you do, I'll promise not to mention it again until about March.

JOHN MUELLER: Why do you keep mentioning this? No, I mean, these are the type of things where sometimes we do bring them up with the team. But if we never hear anything back from either side, then we assume, well, maybe it's not as big a problem as we originally thought. Whereas if we keep hearing back from the webmaster saying, well, this still isn't fixed, this still isn't working the way I want it to, then that's something where we'll continue to try to push internally.

ROBB YOUNG: All right, if you could ask, I'd appreciate it. And I'll wait definitely until the new year anyway.

BARUCH LABUNSKI: Any idea regarding the search tools? You said you were going to take it up with the team-- the search tools with the location thing.

JOHN MUELLER: Search tools.

BARUCH LABUNSKI: The search tools, the location, to get a precise look at how my site is doing in an area which is like--

JOHN MUELLER: Oh, OK, so the advanced search settings in the normal search results. I haven't heard anything back on that. I assume this is just a normal UI change where we decided that it makes sense to keep it like that.

BARUCH LABUNSKI: OK, so if I want to find out, like, the hottest special chocolates in Switzerland, and I want to go into a precise area, I won't be able to find it, because I'd need to tap into somebody else's computer in order for me to get that precise location, even--

JOHN MUELLER: I don't know. If you want to find out the best chocolates in Switzerland, you probably shouldn't just rely on the search results. That seems a bit--

ROBB YOUNG: You should travel to Switzerland is what you're saying.

JOHN MUELLER: Well, or ask Swiss people.

BARUCH LABUNSKI: I'm just saying the Search Tools--

JOHN MUELLER: People who know about chocolate.

BARUCH LABUNSKI: It was a really cool feature. And it's sad that it went away. That's all. A lot of SEOs are feeling the same.

JOHN MUELLER: But on the other hand, we have to make our search results usable by the general public. So if this is something that really only SEOs were using, then I don't know if it makes sense to keep supporting that and keep updating the data and making sure the UI doesn't break.

ROBB YOUNG: If you want to know the best of something in any particular area, surely you Google that instead. You don't Google what's the best something in England. You Google what's the best something in London and just keep going down until you-- isn't that normal behavior versus using the Search Tools?


BARUCH LABUNSKI: In Canada, for instance, in an area like Toronto, there's Mississauga, which is 20 kilometers away. You're not going to get the exact precise-- every result is different now because of this change, because of removing the location option, because you said it's based on where the user is located. But I'm saying if I'm located 15 kilometers away from the location, I'm going to get something totally different because of my phone. For instance, every day, the IP on my phone is different-- it's a different IP all the time. And the location changes. So I'm not physically in that area is what I'm saying.

JOHN MUELLER: You could use a GPS spoofer. You just root your phone and install one of those spoofing apps and make your phone think you're somewhere else. I don't know. That doesn't seem like something that we would officially recommend. But theoretically, you could do that if you really, really wanted to see what the search results are like there.


MIHAI APERGHIS: You can still use the parameter that changes your location.

JOHN MUELLER: I think the parameter still works, yeah.

BARUCH LABUNSKI: But it's only Canada/US. It's like, come on.

JOHN MUELLER: I don't know. You can specify. You can do it on a city or a regional level as well.

MIHAI APERGHIS: John, can I ask regarding Robb's site, how come he cannot do anything on his site to change that-- so the site wouldn't be affected by that certain algorithm, I guess? You mentioned that there's nothing he can do about it. How come? Why can't he change his website so that it wouldn't be a problem anymore?

JOHN MUELLER: It's tricky.

MIHAI APERGHIS: I've always been curious about it.

JOHN MUELLER: Yeah. At the moment, I really can't go into more details there.


ROBB YOUNG: So even if it somehow became as popular as Twitter, despite the fact that we have a ton of direct traffic, we still wouldn't rank?

JOHN MUELLER: I think it would rank. A lot of these things, when it comes to search results, they're not such that we'd remove it completely from the search results.

ROBB YOUNG: It's funny though, if you Google the exact name, we still come up in the Wikipedia entry and various press. So it's not our brand name at all, because that is found. But that's such a specific query within a niche area. That's not how we make-- or not how we used to make money. It was on all of the generic and different terms. Anyway, I don't want to hog it. I am not Mihai. So I know you can't say anything. So I'll move on.

JOHN MUELLER: I'll keep pushing. All right, somewhere there's a question from someone who can't speak due to fever and coughing. Let me see if I can find that. Or maybe you can just paste it into the chat again, and I'll pick it up from there.

ROBB YOUNG: While that's going on, John, do you want to comment on the internet of things?

JOHN MUELLER: The internet of things.

ROBB YOUNG: Yeah, and how Google will adapt to advertising when you're trying to talk to your fridge. If everyone is not actually sitting there Googling anything, what are you guys going to do to adapt to that?

JOHN MUELLER: When people are talking to their fridge? I don't know what that's--

ROBB YOUNG: Not even Google. They're talking to it instead, asking, I'm out of milk. Can you get me some milk? Would it narrow down your opportunity to advertise or would it broaden it?

JOHN MUELLER: I don't know. I could imagine if you're a store selling milk, then maybe it will be like, well, would you like to order here, here, or here.

ROBB YOUNG: Right, that would be a direct manufacturer relationship if they do it right rather than Google having any place there at all unless you provide the technology.

JOHN MUELLER: I don't know. I think with the whole internet of things, on the one hand, there's a lot of talk about it, where it's not really talk about practical things just yet, because there are a lot of crazy ideas out there that might make sense in the long run, but might not even end up working at all. Where, I don't know, if your light bulbs all have individual IP addresses, what does that really mean? Does that mean that your light switch is going to have Wi-Fi built in and turn things on and off? Or does that mean that you're going to have a website where the light bulbs in your house are going to be displayed, and you can display patterns and make them blink in fancy ways? Probably not. You're probably just going to want to turn the lights on or turn them off again. So there are lots of things that might theoretically be possible. But whether they'll actually end up being used that way, it's really hard to say. But I could definitely imagine things like refrigerators having a bit more smarts built in as being something that might make sense. But whether that's something that people will actually use in the long run, I don't really know.

BARUCH LABUNSKI: It's probably just going to be all recipe-related or "best of" queries.

JOHN MUELLER: Yeah, I mean maybe that that makes sense. The tricky part there is also that people buy refrigerators for the long run. They buy them to keep them for 20, 30 years. Whereas if you want to buy a laptop, you're probably just going to keep it for a year or two. So it'll become obsolete after one or two years. And if you have your refrigerator, that's obsolete after one or two years, I don't know if that would really make a great refrigerator for people, if that's really something people would want to buy as a refrigerator.

BARUCH LABUNSKI: Are you able to answer anything regarding the right to be forgotten? Anything related to that? How fast-- there was no response at all from the team after submitting an application. When does that team get back? Or do you not know?

JOHN MUELLER: Usually that happens fairly quickly.

BARUCH LABUNSKI: OK, so just submit it again and wait for an answer.

JOHN MUELLER: Yeah, it's for people in Europe, right?

BARUCH LABUNSKI: Yeah, exactly.

JOHN MUELLER: I don't know what the normal time frames are there. Let me see. With regards to this site that you just posted, for the move from HTTP to HTTPS, I don't see any problems there. That looks like it's about working as expected. All right, and since we're kind of drifting off into all kinds of crazy topics, let me just stop the normal recording here now. And we can still chat a bit for a while. And for those of you watching on YouTube, I guess I'll see you again next year or in one of the future Hangouts. I'll set them up again probably when we get started again and try to keep them at the right rhythm, the right frequency of maybe once every two weeks as we've had it so far. Until then, I wish you all a great New Year, and I hope everything starts off really well in the next year. Bye.

MALE SPEAKER 1: Yeah, Happy New Year to you too, all.

BARUCH LABUNSKI: Happy New Year.