Transcript Of The Office Hours Hangout
JOHN MUELLER: Welcome, everyone, to today's Google Webmaster Central Office Hours Hangout. My name is John Mueller. I am a webmaster trends analyst here at Google in Switzerland, and part of what I do is talk with webmasters and publishers like the ones here in the Hangout, the ones that submitted a bunch of questions already. As always, if there are any people here in the Hangout, especially the newer folks, who'd like to go ahead and ask the first question, feel free to jump on in.
MALE SPEAKER 1: Hi, John. Back to my question about duplicated content on the automotive website that we spoke about last time-- did you manage to take a look at this?
JOHN MUELLER: Not in detail-- not yet.
MALE SPEAKER 1: All right. Would you like to take a look at it in the future? Or would you like us to send you some more details on it?
JOHN MUELLER: I'll dig into it a little bit, yeah.
MALE SPEAKER 1: OK. I have another question for you. It's in regards to the content, again. So currently-- I'll tell you the story. Our website has been selling car parts and accessories for about six years. It's a pretty big automotive website. But half a year ago, we decided to add new categories and new products to the website that are not related to the automotive industry at all-- for example, some home audio microphones, home theaters, and stuff like that. And we placed them in the root of the main website. And we noticed that currently, our traffic heavily dropped on the automotive keywords. So can we say that all those new products lowered the site ranking and caused that traffic drop?
JOHN MUELLER: Usually, it's not that simple. Let me just mute you-- it looks like we have a bit of an echo there. Usually, it's not that simple that we would say, well, you focus on automotive parts, and now you added, I don't know, maybe furniture or something completely different, so we would say, oh, well, this website is not so strong for automotive anymore. Usually, we'd still focus on those automotive parts normally. So it's not that you would be diluting your website or making it less useful for automotive parts by adding other kinds of information there. So usually, that's not something where I would say it would be related.
MALE SPEAKER 1: John, but one time you said it can confuse Googlebot if you add more or different pages. You said in one of the Hangouts that it can confuse Googlebot if you add--
JOHN MUELLER: Well, it shouldn't. So adding more and different kinds of content can definitely make sense. It wouldn't result in us thinning out how we would rank your website for the existing content that's there. Of course, if you add a mix of high-quality content and low-quality content, then that's a different question, but if it's normal content that you're adding to your site, and it's a slightly different theme, then that should generally be fine. That wouldn't be something where I would say it should cause problems. So if you're seeing a drop in ranking after something like that, then that shouldn't be due to adding different kinds of content to your site. That would probably be due to something else that maybe we were picking up across the site, or that we're just not finding as useful as it used to be.
MALE SPEAKER 1: Hey, John, another question-- what would be a better way to organize it? Should we get a new domain for the entire non-automotive theme, or maybe a subdomain? Would it make any difference for Google at all?
JOHN MUELLER: I am sure it would make a difference, but I can't say if it would be a positive or negative difference. It would essentially be making two separate sites out of one site, which is always a fairly big change when you're doing that with a website. So moving from one domain to another is something we can easily pick up on; taking one domain and splitting it up into two, or taking two domains and combining them into one-- in both of those scenarios, it's very hard for us to guess ahead of time what would come out of that. And from my point of view, personally, I prefer having one really strong website instead of splitting things up into separate websites, just because if you can make one strong website, you can focus your energies a lot more and really make sure that everything across the whole website is as high quality as it can be. I personally don't like splitting things up that much. Sometimes I think it makes sense for marketing reasons, if you're targeting completely different audiences. Maybe you have resale products that you sell directly to consumers and other products you sell to distributors. And it might be the same product, but a very different audience, and maybe it makes sense to set up separate websites for that kind of situation. But in general, if you're targeting the same audience, or generally the same audience, I really recommend making one really strong, really good website, rather than a handful or a large number of less strong websites.
BARUCH LABUNSKI: A good example to take a look at would be maybe Newegg. Newegg has something similar to that.
JOHN MUELLER: I don't know the Newegg site. I usually use Amazon as an example here, but Amazon also has a bunch of different websites, probably based on the different acquisitions that they've had over the years. So it's something where you can find pretty much everything on a big website like that. And I think if you can really build out a strong website that people recognize as a brand-- they want to go there, they explicitly want to go to that website because they think they can find the information they're looking for-- then that's probably something that you could aim for.
MALE SPEAKER 1: Thanks a lot, John. Actually, we're visiting your meetings just in order to make sure we're doing everything right. And from our perspective, we checked our website, and we still see a major traffic drop on automotive categories. Would it be possible that in my message I just include a couple of examples, so you could take a look? Because it's so irregular--
JOHN MUELLER: Yeah, I'm happy to take a look. I can't promise that I'll be able to say much about that because, a lot of times, this is feedback that we bring back to the search quality team. And the search team will try to figure out: do we need to improve our algorithms? Are we doing something wrong here? Is that algorithm picking up something that maybe they don't want to tell you about explicitly? Those kinds of situations. So I can't promise I'll be able to tell you, you need to change this line of code and everything will be fine. But it's definitely useful information for us to have to discuss with the team internally.
MALE SPEAKER 1: Thanks a lot. That's going to be a great help for us. Thanks.
MIHAI APERGHIS: John, a quick follow-up on that. When you have a very niche-specific, focused website, and you are adding a bunch of categories of products not related to that niche, technically aren't you diluting some of the signals that the website has? These new [INAUDIBLE] signals will be spread out across both the niche-specific pages that you have as well as the new products that you've added. Perhaps this is why you also usually recommend for small businesses to focus on one specific niche and not go too broad.
JOHN MUELLER: I wouldn't necessarily say that you're diluting things. It's something where, especially if the quality isn't the same, or you can't say this is something really well-established here-- you're adding something completely new, that nobody really knows about or cares about, to your website as well-- then of course you're spreading things out a little bit. But if you're taking something really fantastic and adding more fantastic stuff to it, then that generally builds up on each other. It's not something where I would say it would be a problem. And sometimes what happens is you have to introduce new products somehow, and nobody will know about these things for a while, but you hope that through your existing brand recognition, people will come to your site and say, oh, well, this is interesting too; I'll recommend that to other people as well-- I'll buy that, maybe, whatever. So that's something where you have to build up step by step, and obviously, if you're taking one thing, and you're piling on lots and lots of new content, then it takes a while for us to understand what to do with all of that. But if this is good stuff, then we should be able to pick that up.
MIHAI APERGHIS: So there's no problem, for example, to have a store selling laptops and IT products, and then suddenly I'm adding baby products or pet food-- I don't know.
JOHN MUELLER: I'd say you'd have more problems with your audience than you would have with Google, in that if you have a user base that is looking for IT products, and they keep coming back to your site for that, and the main products that you promote now are suddenly baby powder, then they'll go, oh, this is not what I wanted to find here, and maybe run away. But if it's something where you know the audience is also interested in that, then, sure, why not? And that might be the case-- maybe you target something specific to an IT audience that grows older over time, and suddenly they need baby products because they're all having kids. I don't know. These things can happen.
MIHAI APERGHIS: Google doesn't have any bias towards one or another, as long as the audience is--
JOHN MUELLER: Not that I'm aware of.
MALE SPEAKER 2: John, I have questions about app indexing in the Search Console. Is that for a different Hangout? Or could I ask?
JOHN MUELLER: Go for it.
MALE SPEAKER 2: It's not news related. So I could share my screen to make it easier to understand. OK, so I launched the Search Engine Roundtable app around here, and it started to get indexed-- impressions, not many clicks, as you can see. And then there was that Search Console bug with the Search Analytics around here, and I was like, oh, it'll be fixed. But when it was fixed, it just kept dying, and I don't know if it has anything to do with the app and something that I'd done. But when I look at the crawl errors, there is nothing, obviously. That's an issue you said you're working on, I think?
JOHN MUELLER: Mhm.
MALE SPEAKER 2: And if I go to the fetch, and I fetch a new URL, it tells me these things, but it always said that, even when I first launched.
JOHN MUELLER: What does it say? Let me click on you.
MALE SPEAKER 2: Sometimes it works. If I fetch the homepage or certain older pages, they work-- it says complete. But some URLs give me this "unsupported URI," which I can't figure out if it's just--
JOHN MUELLER: Oh, you have the colon in there. It should just be https, slash, and then the host name, and the URL.
MALE SPEAKER 2: So if I go remove--
JOHN MUELLER: It depends on how you have your intent set up in the app, but--
MALE SPEAKER 2: Good question. So if I remove the colon, you said?
JOHN MUELLER: Yeah.
MALE SPEAKER 2: Need the slash?
JOHN MUELLER: If you have it set up to respond to that, yes.
MALE SPEAKER 2: I've got to sort this out. Yeah, it looks like the ones-- OK, I'm going to be-- all right, I apologize.
JOHN MUELLER: OK. I passed the issue with the impressions on to the team, and they basically said this was mostly because very few people have your app installed at the moment.
MALE SPEAKER 2: You've got to install it?
JOHN MUELLER: Yeah. So those people who do have it installed, they will see it, of course, in search. But those who don't have it installed, they won't see it.
MALE SPEAKER 2: I thought they'd see it anyway. I thought the new thing is that they showed the app even to people who don't have it installed.
JOHN MUELLER: We show the install button if someone is explicitly looking for your app. So if they are-- I don't know what queries you would use; it might be "search engine roundtable app," or maybe "search engine roundtable" itself. Those kinds of queries should show the install button. But otherwise, it would only trigger on the content itself if you have the app installed.
MALE SPEAKER 2: Are you sure?
JOHN MUELLER: I'm pretty sure.
MALE SPEAKER 2: I thought there was a news release saying that even if you don't have the app installed now, it will surface the content using the streaming feature.
DANIEL PICKEN: Can I jump in? I've got the app, Barry, because it's great. And when I search for it, it comes up with the standard desktop version, and then at the top, it tells me to open it in there. It doesn't necessarily give me an option straight away to go straight to the app. It actually shows the desktop version first.
MALE SPEAKER 2: Yeah, it's weird. It used to.
JOHN MUELLER: I think we did some experiments with bubbling up the app for content-type queries. So if someone is searching for, I don't know, penguins, and you happen to write about penguins, then that's something where-- I believe we did an experiment about showing a deep link to that directly, even for people who didn't have the app installed.
MALE SPEAKER 2: Right-- like the example-- again, I guess this is a limited number of people. The example was with hotels, that it would actually show apps without having them installed. You clicked on it, and it would launch the Google Play app and streaming or something like that.
JOHN MUELLER: The streaming, I think, is just a very, very limited test run for them. So that wouldn't be happening.
MALE SPEAKER 2: And the good news is the fetch worked out without the colon. So thank you.
JOHN MUELLER: OK.
MALE SPEAKER 2: And a quick question on the-- somebody said that the article markup in the rich snippets option now specifically requires an author name. But has it always required an author name? Or are you not sure?
JOHN MUELLER: I don't think so.
MALE SPEAKER 2: You think that's new?
JOHN MUELLER: I believe that's something new that was added. I saw the feedback as well, and I asked the team, but I haven't gotten an answer on that yet.
MALE SPEAKER 2: OK-- authorship is back.
JOHN MUELLER: No, no, no. Not, not-- no, no. Let's not open those wounds, man.
MALE SPEAKER 2: OK, thank you.
DANIEL PICKEN: John, can I jump in with-- well, it's specifically an indexing and noindexing question, if that's OK?
JOHN MUELLER: Mhm.
DANIEL PICKEN: So we spoke last week about why we would use a robots.txt file, and it would generally be to block the unpaginated pages. Now, imagine the situation where Google's crawled the pages, so the search pages are already indexed. So what we'll do then is we'll block them via robots.txt-- but they are still there. We can't put a noindex on the page, because if you crawl the site, you'll hit robots.txt, and then you can't crawl to see the noindex. In that situation, where we want to pull those pages out of the search, how would we go about that? Because, again, we blocked them through robots.txt. How would we then go about pulling those pages out, as quickly as possible?
JOHN MUELLER: I'd use the URL removal tools for something like that. So if this is a specific folder or subdomain, or something like that, where you can say everything maybe that starts with search.php, or whatever pattern you have, should be removed, then I'd use the removal tools for that. Those removals are for a limited time-- I think 90 days. So I'd set a calendar reminder to double-check after 90 days to see what's still left, because probably a lot of the content that was blocked by robots.txt ends up dropping out of the index anyway if we can't crawl it anymore. But after maybe 90 days, look at it and say, well, this is fine, this isn't that bad after all now, I'll just leave it like this; or maybe you'll say, well, I'll do another 90 days, and then it gets better after those 90 days.
BARUCH LABUNSKI: Are you referring to expireds, or where it says expired? Because if it says expired, then I'll have over 3,000 things that I get from--
JOHN MUELLER: Yeah, that's when those 90 days expire. So there's a difference there between removing content and saying that you want the cache refreshed. If you use the cache refresh, then that will expire faster if we recognize that we've recrawled and reindexed those pages in the meantime. But if you use the removal tool, it'll just remove it for 90 days. And if you have the site verified, you have the choice between a per-URL removal or a per-folder removal, essentially. And for the folder removal, you can also specify a script name, and essentially every URL that starts with that will be removed. So if it's search.php?-whatever-- if you submit search.php, then we'll remove everything that starts with that URL there.
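The folder-level removal John describes is a plain prefix match, not a pattern match. As a rough sketch of the matching idea (the paths are hypothetical, and this is an illustration, not Google's implementation):

```python
def matches_removal_request(url_path: str, removal_prefix: str) -> bool:
    """A folder-level removal request covers every URL whose path
    starts with the submitted prefix -- exact prefix, no regex."""
    return url_path.startswith(removal_prefix)

# Submitting "/search.php" covers every URL beginning with it:
print(matches_removal_request("/search.php?q=tires", "/search.php"))   # True
# ...but not the same script name sitting under another folder:
print(matches_removal_request("/category/search.php", "/search.php"))  # False
```

This mirrors the limitation discussed next: a plain prefix cannot express "any category folder, then search."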
DANIEL PICKEN: Will it work on anything dynamically driven? Because this is a question that was put to me. With anything dynamically driven before the actual folder, it wouldn't accept it-- so it won't accept a regular expression.
JOHN MUELLER: No, it has to be exact. So it has to be an exact match of that folder. So if, for example, you have a category and then search, and you want everything removed under search, but there can be hundreds of different categories-- that's not something you'd be able to easily do there. You can submit them manually like that, if you have the time, if it's really important for you. But it's not something where you can just say: anything that starts with any random characters and then goes to search, that's what I want removed.
DANIEL PICKEN: Is there any alternative to that, or is that generally the default? If we're not going to take the noindex route, are there any other alternatives at all to get those pages out?
JOHN MUELLER: Not really. Depending on what kind of content it is, what will probably happen if you block it by robots.txt is that we'll lose track of the content, because we can't crawl it, and it generally won't rank that well. So if you search for normal content-type queries, you won't find it. But if you explicitly do a site: query or an inurl: query, then you'll still find it. So depending on what you're looking at-- if you're saying, well, I just don't want normal users to find it, then probably it will be fine to just block it with robots.txt. If you say this is confidential information-- maybe in the query string there is an email address or something like that-- then you might want to take this extra step and say, well, I don't want people to even be able to run across that at all in search.
DANIEL PICKEN: What if you were supporting noindex as a directive in robots.txt? I had heard somewhere that Google [INAUDIBLE], but I'm not sure, so I'm seeking clarification. If you had the directive of noindex there, would you honor that? Or would that not be accepted, because it has to be in the page code?
JOHN MUELLER: Maybe. Yeah, so-- I mean, the tricky part is-- I believe we honor it partially, or to some extent, at the moment, but it's not an official directive from our point of view. And it's something that could theoretically change, because it's not something that we officially launched. So you could put that in there, and it might do the right thing. Or it might happen that next week the team says, oh, we've been meaning to remove this code for so long-- we'll just delete that. And it won't work anymore. So it's something you could do if you wanted to see what happens, but if it's really a critical issue for you, and something that you want to make sure is handled properly, then I wouldn't rely on that.
DANIEL PICKEN: That would be a good solution, if you could do that. It would stop that problem of putting things in the robots.txt where you can't crawl them and so you won't see the noindex-- it overrides that. I think it would be a good solution as a way forward if you were to do that.
JOHN MUELLER: It's a regular point of discussion, yeah, because there are definitely some good points for using the noindex directive in robots.txt. But there's also the problem that a lot of people don't know what the robots.txt does-- they just copy and paste it from somewhere else. And if they accidentally disallow all crawling of their site, which happens fairly frequently when people do relaunches-- they accidentally keep the relaunch robots.txt, those kinds of things-- if it's just a disallow, then we can still show the site in search, and users can still get there somehow. It probably won't be optimal, but we'll still be able to do something with it. Whereas if they copy and paste a noindex into the robots.txt, and they remove their whole site, then we can't really help users find that site anymore.
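The distinction being discussed can be sketched as a robots.txt fragment. The `Noindex:` line is the unofficial directive in question-- reportedly only partially honored, never officially launched, and liable to stop working at any time (the path is hypothetical):

```
User-agent: *
# Official directive: blocks crawling, but already-known URLs
# can still surface for site: or inurl: queries.
Disallow: /search.php

# Unofficial: honored to some extent at the time of this Hangout,
# but not a supported directive -- don't rely on it for critical removals.
Noindex: /search.php
```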
DANIEL PICKEN: And just a quick indexing question. I'm aware of multiple solutions to get Google to index our content. In your opinion, what would be the best and quickest way to get that content indexed?
JOHN MUELLER: If you have a handful of pages?
DANIEL PICKEN: The pages on your site.
JOHN MUELLER: There's a bit of an echo. So if you have a handful of pages, I'd use Fetch as Google, so that they can get indexed. If you have a lot of pages, or if it's dynamic, I'd definitely use sitemaps.
DANIEL PICKEN: OK, cool.

BARUCH LABUNSKI: John, I feel really lost now without the Search Tools location. It's not me-- probably another million SEOs feel the same. There's an option there where-- myself being from Canada, there is only one option, to choose either Canada or any country, and "any country" versus "Canada" is confusing. And I'm thinking towards the appeal in Search Console. Should I remove it? Should I [INAUDIBLE]? Should I not? Because it's very confusing the way it is.
JOHN MUELLER: So the setting in Search Console is confusing?
BARUCH LABUNSKI: No, in search-- like, on top, in Search Tools, there's no more location. You can't set it specifically to Canada-- Toronto, where I am. Like I said, I mentioned that in another Hangout, but it's hard to work with-- I don't know where my-- like, you know, different areas-- it's jumping, and like what response. It's weird.
JOHN MUELLER: A lot of the search features are focused on user queries-- those kinds of things. So that's probably not something where we'd say this is critical for a webmaster, and therefore we should add that functionality back, because in general, what happens with a lot of these things, where you have special links and fields, is that users get confused because they don't really know what they're supposed to do. So one thing you could try is just using the URL parameters. I've heard that still works. I don't know which parameter name that was, but that might be something to try out.
BARUCH LABUNSKI: This is on my phone. If I'm located wherever-- the company I use, I don't want to mention their name, but they're just giving me an IP located in Mississauga, which is about 45 minutes away from me. So I'm not going to get precise results. So if I'm looking for flowers, I'm not going to get the flower shop 15 minutes away from me; I'm going to get one about 45 minutes away from me.
JOHN MUELLER: For something like that, particularly on a phone, we should be able to pick up more than just the IP address. But I will take that feedback back to the team and see what they say.
BARUCH LABUNSKI: OK, thanks.
JOHN MUELLER: All right, let me run through some of the submitted questions, and then we can get back to more discussions.

"Can you tell me if title and meta tags are ranking factors? Is it generally OK to generate them automatically, or is that bad?"

As far as I know, at the moment the title is something we do use partially for ranking. The description we don't use at all-- the description is something we show in the search results as part of the snippet. For both of them, I'd personally try to create something that's useful for users. If you're generating them automatically, that might make sense as well, as long as your system generates something that's actually useful for users and isn't just a one-to-one copy of your whole content. So from my point of view, if you have a smart setup to generate title and description tags, go for it.

"Do you have any new SEO tips for 2016?"

Oh, man, I don't have any magical SEO tips for next year. So I can't tell you about that high-ranking meta tag that we're currently working on-- it probably won't work yet. But let's see-- in general, I think next year you'll probably hear a lot from us about AMP, and about mobile, continued as we've been doing over the years, because that's still a very big topic, and we still see a lot of sites not doing that properly. Those are probably the bigger changes. Some of the other things that will definitely happen as well: more information about really figuring out how to handle these things a little bit better in search, and probably making some better recommendations on what you could or should be doing there. But past that, of course, high-quality content is something I'd focus on. I've seen lots and lots of SEO blog posts about user experience, which I think is a great thing to focus on as well, because that essentially focuses on what we're trying to look at too: we want to bring people to content that's really useful for them, and if your content is really useful for users, then we want to bring that to people. So if we can recognize that your site is doing everything right, then that's something we'll try to reflect.
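Returning to the earlier question about automatically generated title and description tags, a "smart setup" might look something like this sketch. The product fields and the length limits are hypothetical, not Google requirements; the point is composing the tags from structured data and keeping them short, rather than copying page content one-to-one:

```python
def generate_meta(product):
    """Compose a title and meta description from structured product
    fields (hypothetical schema), trimmed to typical display widths."""
    title = f"{product['name']} | {product['brand']}"
    description = (f"Buy {product['name']} for {product['fits']}. "
                   f"{product['summary']}")
    # Rough display widths in search results; not official limits.
    return title[:60], description[:160]

title, description = generate_meta({
    "name": "Ceramic Brake Pads",
    "brand": "Example Auto Parts",
    "fits": "2010-2015 sedans",
    "summary": "Low-dust pads, mounting hardware included.",
})
```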
BARUCH LABUNSKI: Just like Gary said-- don't remove the content, make it better.
JOHN MUELLER: Yeah, that's definitely one way to look at it, especially when you're looking at quality algorithms. If you know that your content is low quality, then why not take the time to actually improve it instead of deleting everything.

"Does Google treat every industry the same, or does it have some advantages or disadvantages?"

In general, we try to treat everything the same. We try to make our algorithms so that they apply to the web overall because, on the one hand, the web is constantly changing, and on the other hand, we don't have time to make specific algorithms for specific parts of the web-- and that goes for individual languages too. We try to keep everything general across countries, and across industries as well, where we say, well, this algorithm generally works across all of the billions of sites that we've run across so far. So it doesn't make sense to split it up by individual industry, or even to have settings per industry, per language, or per country. Sometimes with search features we do have limitations per country, based on maybe policy and legal issues in individual countries. With regards to some of the, I guess, more advanced features, we also have edge cases per language, where it works better or doesn't work that well-- specifically around voice search, for example. If you're searching in Swiss German, or some obscure language, for example, then it probably won't be that good or that useful in voice search. But if you use something that's more commonly used, then maybe we can pick that up a little bit better. So these are sometimes the differences between what some people see and what you would see. Usually, that's based more on language and country rather than industry.

"Does a two-way internal link carry more weight and reinforce the relationship between two pages more than a link just going one way? Could this help with rankings if the relationship was reinforced better?"

I think, in general, internal links help us to better crawl the website and to understand the context a little bit better, but I doubt you would ever see any visible change in rankings from changing a website to use two-way links or one-way links, because, for the most part, we can crawl these sites normally just fine. There's no need to artificially create reciprocal linking within your website. If we can go through your menus, through the normal structure on your site, then usually we can pick that up fairly well. So I definitely wouldn't play around with artificial internal links-- one-way, two-way-- within a site, because I think you're probably just wasting your time by focusing on it.
DANIEL PICKEN: What if you do it incidentally? So, for example, I've got "The Sun," the BBC, all these great websites linking to me, and I want to share that with my customers. So I'll have a page-- coverage around the web-- and I will link to that. But then, am I in danger there? Because, unsuspectingly, I'm doing this to share the media coverage, I suppose.
JOHN MUELLER: No,
that's perfect fine.That's not something I'd
say would ever be a problem.I think this question goes
more into internal linkingand, in a case
like that, I thinkit's even more
artificial if you'retrying to avoid a two-way
link, because if youhave your internal pages
set up, and you say,well, these pages are related.So of course they don't
link to each other.It makes sense for the user.It makes sense for us to
understand the content better.It's not something
you need to avoid,or where you'd ever see any
kind of a ranking changejust by saying, well, this
page links to that one,but it doesn't link
back, so it can't bethat strong of a relationship.I definitely wouldn't
focus on that."We added some service
pages to our branch pagesshowing different services
we offer in a certain city.This has useful
and unique content,and our rankings went up a lot.Does this positive
effect mean you don'tclass this as doorway pages?"Not necessarily.And so it's really hard to tell
based on just a description.If these are a
handful of pages thatare focusing on specific
aspects of your servicethat are essentially different
product pages, that maybeyou're offering to your clients,
then that's something thatsounds very useful
and not somethingwe necessarily
react negatively to.On the other hand, if
you're creating thousandsof pages with hundreds
of variants each,then that's probably
something we'd pick upas lower-quality content on the
one hand, maybe doorway pages,maybe even if someone
from the web spam teamwere to manually look
at it, they'd say,well, this is way too
far-- way past the line.We have to take action
here to make surethat the quality of our
search results remains high.So just because you're
seeing a certain behavior nowdoesn't necessarily
mean that you're doingis the right thing.I'd take a step back and
look at the bigger picture,maybe get some input
from other webmasters who are in similar situations, who figure out are you getting close to the line, are you way past the line, or are you doing something that essentially just makes sense for your users and is perfectly fine?

"If you recover from a penalty, does Google still discount the keyword phrases it was penalizing you for to begin with, once the penalty has been removed? Or is there nothing holding you back?"

It's hard to say. So in general, the manual action, if it's resolved with a reconsideration request, it's out of our system. It's not that our algorithms will hold a grudge against your website and say, well, there was a manual action here. Therefore, we can't trust this site as well. With regards to some kinds of manual actions, our algorithms might pick up on these issues as well, and it might take some time to reflect the new state. So for example, if you have a lot of unnatural backlinks, then that's something where the manual action might happen and be resolved by cleaning up some of that, but the algorithms might say, well, we've recently recrawled these and still have seen the bad links. So it might take somewhat longer for that to settle back down again. But especially if you're talking about content on your pages, it's not that our algorithms would hold any grudge against your site from a manual action.

"If I have page content under an H2 heading, and this content also has subheadings within it, should I use H3s with this content or just leave it all under H2? What's better?"

So these headings help us to understand the context of your content a little bit better, but it's not a magic bullet. So if you use these headings incorrectly, then it's not something where we'll say, well, this will be a problem, and our algorithms will rank your site worse. These headings help us to better understand the structure, but it's not something that I would say is absolutely critical. So from that point of view, if you think it makes sense from a semantic point of view to use H3s below the H2s, and you can structure your content like that, sure, go for it. If this is something that would result in you having to redesign your website, then probably there are more important things to focus on.
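The heading advice above can be sanity-checked mechanically. As a rough sketch (not any tool Google provides), here is how you might extract a page's heading outline with Python's standard `html.parser`; the sample markup is invented:

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collects (level, text) pairs for h1-h6 tags in document order."""
    def __init__(self):
        super().__init__()
        self.headings = []   # list of (level, text)
        self._level = None   # heading level currently open, or None
        self._buf = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self._level = int(tag[1])
            self._buf = []

    def handle_data(self, data):
        if self._level is not None:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if self._level is not None and tag == f"h{self._level}":
            self.headings.append((self._level, "".join(self._buf).strip()))
            self._level = None

def outline(html):
    """Return the page's heading outline as a list of (level, text)."""
    parser = HeadingOutline()
    parser.feed(html)
    return parser.headings

# Invented example page: H3s nested semantically below their H2s.
doc = """
<h1>Car Parts</h1>
<h2>Brakes</h2>
<h3>Brake Pads</h3>
<h3>Rotors</h3>
<h2>Filters</h2>
"""
for level, text in outline(doc):
    print("  " * (level - 1) + text)
```

Printing the indented outline makes it easy to spot jumps such as an H4 directly under an H1, which is the kind of structural inconsistency the advice is about.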
DANIEL PICKEN: What about multiple H1s? Will that have any impact at all if it's relevant to the page?
JOHN MUELLER: Not really. What generally happens there is we recognize there are multiple headings. So we don't know which one of these is really the main heading. So there is some kind of dilution, I guess you would say, happening there. But it's not the case that we would say you're doing something wrong. We're going to ignore this completely. It's like taking a heading that's, I don't know, four or five words, and you expand it to a heading that's two paragraphs long-- then, of course, we can't treat all of the content within that heading the same as a short one.

"Regarding rich snippets, would it be OK to have the same reviews on many similar category pages, or is there a risk of losing ranking stars completely if one does so and therefore doesn't provide unique ratings on every page?"

In general, the reviews should be focusing on the product that you're offering on those pages. So if this is a review about one specific product, and you have the same product on different parts of your site, maybe in different categories, for example, then using the same reviews there definitely makes sense. On the other hand, if you have the same reviews across your whole website for all of the products that you offer, that starts looking a little bit strange, in that we don't really know what we should be doing with these reviews, because they're clearly not specific to the main product that you have on the page anymore.

"Is there any way to find out what pages have not been indexed, apart from using small sitemaps? This is in Search Console, to help work it out. Also, any tips as to why pages may not get indexed?"

That's a hard one. I don't know. So sitemaps is definitely one way to do it. Another thing that you could do, if you have the server logs, is to see which pages get traffic from search. That indirectly tells you these pages are actually indexed, because they get traffic from search, and maybe the other ones aren't indexed. That's probably what I'd focus on more, rather than focusing on the technical part of this page is indexed or this page is not indexed.

As to why pages may not get indexed, there are a bunch of reasons. Let's see, we have spam. Sometimes we recognize that there's spam. So we drop those from the index. Sometimes we recognize that a page is duplicate before even crawling it. So we might not pick that up. Sometimes we'll recognize it's duplicate after crawling it. So if the content is essentially the same, or the primary content is the same, then maybe we'll drop that after indexing. Sometimes what happens is we just don't have enough signals for a page to actually go and figure out, well, this is something we really need to crawl and index. So it might happen that we don't even go out and crawl it, because we don't know that much about it. What you can do there is help us with a sitemap file-- help us with clear, internal navigation. The easier we can recognize that these pages are really relevant to your site, the more we can actually focus on them.

But otherwise, I wouldn't necessarily focus on having all pages indexed all the time. That's, from my point of view, almost a rare situation, that we'd actually index 100% of a site all the time. Even for really big and popular websites, we don't index all of the pages that we find all of the time. So that's not something that I would say is necessarily a goal. But rather, I'd focus on pages that you think are important for your site, and really make sure that they are visibly important when we do go off and crawl and index your site.

"We've changed our structured data markup, yet the old schemas we used to have are still shown in Search Console with a last detection date of nearly three months ago. I'm sure our pages have been crawled more often. So what's up with that?"

I'd love to have an example of this. I've heard this from, I think, two people so far, and it might be that we're just showing the old data there because we don't have a new variation of that data yet for those pages. So for example, if you remove a specific type of markup from your page, then we might know, well, the old markup was last found on these pages back then. But it doesn't mean that the new markup that we find there, which is different from the old markup, hasn't been picked up. But if we're providing something that's kind of confusing in Search Console around structured data there, then I'd love to have some examples, so that I can talk to the team about that and see what we can do to make it either clearer to understand what's actually happening there, or filter all of that so that we can really say, well, this is what you need to focus on, and we'll keep all of this cruft, old data away from you so you don't have to worry about it. But if you can send me some examples, I'd love to see them.

"Is there any connection between crawling and ranking? We're talking about indexed pages."

Of course, I think the general question here is, given two pages already indexed, what's the difference between a page that's crawled more frequently and one that's crawled less frequently? And in general, we do try to pick up pages that are more important and crawl them more frequently, but it's more of a symptom, essentially, that we think, well, this is a more important page, and we'll crawl it more frequently. It's not that we'd say, well, we crawl this page more frequently for other reasons-- maybe because we keep seeing a timestamp change on this page-- therefore, it will become more important. But rather, it's kind of the other way around, that we think this is a critical page for us. Therefore, we'll crawl it more frequently.
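The server-log idea John mentions above (pages that get organic search traffic are certainly indexed) can be sketched in a few lines. This is a simplification, assuming combined-log-format access logs and treating any `google.` referrer as organic search; the sample lines are invented:

```python
import re
from collections import Counter

# Combined Log Format: ... "GET /path HTTP/1.1" status size "referer" "user-agent"
LOG_RE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) [^"]*" \d{3} \S+ "(?P<ref>[^"]*)"')

def pages_with_search_traffic(log_lines):
    """Count requests per path whose referrer looks like a Google search page.
    Pages appearing here are certainly indexed; absence proves nothing."""
    hits = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and "google." in m.group("ref"):
            hits[m.group("path")] += 1
    return hits

# Invented sample: one visit from Google search, one direct visit.
sample = [
    '1.2.3.4 - - [x] "GET /brakes HTTP/1.1" 200 512 "https://www.google.com/" "Mozilla"',
    '1.2.3.4 - - [x] "GET /rotors HTTP/1.1" 200 512 "-" "Mozilla"',
]
print(pages_with_search_traffic(sample))  # only /brakes shows search traffic
```

As John notes, this only gives a lower bound: a page with no search clicks in the log window may still be indexed.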
BARUCH LABUNSKI: Are you guys ever going to change the crawl stats graph? Because it's been the same graph for so many years. Are you planning to maybe give it a makeover?
JOHN MUELLER: I don't know. Sometimes, I thought they were kind of useful. So at least, I think it's kind of live, and it reflects what's actually happening there. But if you have specific ideas where you say this is totally confusing, or you can't work with this at all the way that it is-- you've got to delete this graph and start over again-- then that kind of feedback would be useful. But I think if it's working for people, if it's useful information for people, then I'd be up for keeping it the way it is. I wouldn't focus on modernizing something just for the sake of having modernized it, because that's always time that's lost which could be spent focusing on other things.

"Is it possible that product pages with images benefit or suffer in their own ranking because of the ranking of the image in image search? So a good image ranking might result in a good ranking for the hosting product page?"

Not necessarily. We try to separate those essentially as much as possible, in the sense that some things make sense in image search, some things are really important in web search, and it's not that we'd say this is really popular in image search or really useful for image search, therefore we'll boost it in web ranking as well. We try to keep those separate. And what might happen is, of course, for some topics, we'll show the image universal block on top, and that could include an image from a page that's actually ranked fairly low, or somewhere below it in the main search results as well. So it's not that we'd, say, couple the two that strongly.

"We noticed a huge increase in traffic over images, with Google DE as the referrer. Have there been any major changes at Google with an impact on image search?"

I don't think there have been big changes there. One thing that is kind of special with the German and the French image search is that they use the old-style UI, with the iframed landing page behind. So that's something where you might see some effects from experiments there, with regards to moving to HTTPS-- those kinds of things. With regards to impressions and clicks in Search Console, you might also see some changes there as we try to unify the way that we track clicks and impressions across the older-style UI that we have in Germany and France and the more modern UI that we have elsewhere. So those are, I think, the two main sources where you might see some mismatches there. I think especially in Germany and France with regards to image search, that's something where you'd probably expect some changes, because the old UI is in the meantime getting almost rusty, where maybe it makes sense to find a way to modernize that a little bit in the future.

"If a site has given us organic and affiliate links, can that be detrimental to our website? Imagine where a person placed an order on the site, blogged about the products, and later became an affiliate. Can this look dodgy to Google?"

From my point of view, that's perfectly fine. That's not something I'd really worry about. If you do have a lot of affiliate links-- if you provide a snippet for your affiliates to link to your site-- I'd just make sure that it has a nofollow tag, to clearly say this is essentially a link that's placed with regards to a monetary connection to the site. But otherwise, if there are natural links and affiliate links on the same site pointing to your site, that can happen. That's not something I'd say our algorithms would pick up and say this is problematic.

"Do blogs hosted on platforms such as WordPress or Blogspot have the same value as blogs on their own top-level domain?"

Yes, in general, they do. And sometimes you can host your blog on WordPress or Blogspot and have a custom domain. So it's not something that we try to separate.

"What's the status of HTTP/2?"

I don't know. Someone asked about this in one of the last Hangouts, and I haven't heard back from the team on that. So I'll assume we're not doing that just yet. If your site supports HTTP/2, then that's something you can probably check your server logs for and see what's happening there, before we actually tell you. So, enterprising SEO bloggers out there, if you find Google crawling with HTTP/2 on your site, then that might be something to write about, and I'm sure Barry will pick that up quickly as well.
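Besides grepping your server logs, you can also check from the client side whether a given server will negotiate HTTP/2 at all, via TLS ALPN. A minimal sketch using Python's standard `ssl` module (the hostname in the comment is just a placeholder):

```python
import socket
import ssl

def negotiated_protocol(host, port=443, timeout=5):
    """Offer h2 and http/1.1 via ALPN; return the protocol the server picks
    ("h2", "http/1.1", or None if the server doesn't negotiate ALPN)."""
    ctx = ssl.create_default_context()
    ctx.set_alpn_protocols(["h2", "http/1.1"])
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.selected_alpn_protocol()

# Example usage (requires network access):
# negotiated_protocol("www.example.com")  # "h2" if the server supports HTTP/2
```

This matches the transparent-upgrade behavior John describes next: the client advertises what it supports, and the server picks the best shared protocol.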
MALE SPEAKER 2: So you recommend we switch to HTTP/2, just to test to see if you guys index it?
JOHN MUELLER: Well, you're not switching to HTTP/2. It's something that's automatically upgraded, in the sense that when a browser that supports HTTP/2 goes to your site, it will send, I believe, a special header that says, I'm ready for HTTP/2, and if the server supports that, you will switch to that transparently.
MALE SPEAKER 2: So it is definitely going to support both, you're saying.
JOHN MUELLER: Yeah, yeah, it's not that you're going from one to the other, and then suddenly it doesn't work anymore. Essentially, it's backwards compatible, and it transparently upgrades.
MALE SPEAKER 2: We'll give it a try, thanks.
BARUCH LABUNSKI: Are you going to do a separate Hangout on AMP?
JOHN MUELLER: Probably about next year. So I think the next one is about mobile, which includes AMP as well. So that will be like an intro, I guess. But we'll probably do more on AMP as we get closer to actually showing it in search, or as we do start showing it in search.
BARUCH LABUNSKI: By the way, I loved that video you showed on Google+. It's probably the best video I've seen in the year-- the one that you shared about AMP.
JOHN MUELLER: Oh,
BARUCH LABUNSKI: Yeah, it was a 10 out of 10, John.
JOHN MUELLER: Yeah, it wasn't mine. So I'll have to pass that on.

"Why so much 'not provided' in Google Analytics? It's difficult to analyze the traffic."

This is essentially based on the referrer information that is passed to the websites, which is something, by policy, we decided to do, I believe, three or four years ago in the meantime. So that's something where you have to be creative-- use the data from Search Console as well, mix that together with the analytics data, and find a way that you can connect those two.

"I'm looking after SEO/PPC for a company. Our ranking for some of the main search terms, 'self-guided walking holidays,' has dropped significantly in the last seven days." And this probably goes on. Ooh, lots of questions, suddenly.

What I'd recommend doing there is maybe posting in the help forum and getting help from other peers who have similar websites, who might have some tips for you about this kind of thing. And in the help forum, make sure that you have your URL in there somewhere. You can use a URL shortener if you like, but just so that people can actually look at your specific content, rather than trying to discuss the theoretical situation of this site ranking for this term. So that's probably what I'd recommend doing there.

All right, let me just run through some of the remaining questions and see if there's something specific we need to focus on. Otherwise, yeah, maybe I'll just open it up for questions from you all in the meantime.
BARUCH LABUNSKI: Just on the HTTPS move-- I wanted to know, is it really necessary to add the disavow file from the old one? So from www.shoes.com to https://www.shoes.com, should I add the disavow?
JOHN MUELLER: You should put the disavow on your canonical version of your site. So if you move your site from HTTP to HTTPS, then that's where I'd submit the disavow file. If you move from one domain to another, of course, then I'd put it there too.
BARUCH LABUNSKI: So you're saying it's necessary to do that.
JOHN MUELLER: Yeah, it's fairly easy to do. You can download the file in Search Console and just upload the same file again with the other setup, and then you have it solved.
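For reference, the disavow file is just plain text, so re-uploading it under the new HTTPS property means uploading exactly the same lines again. A hypothetical example (the domains and URL are invented):

```
# Disavow file re-uploaded for the https:// property
# Disavow everything from a domain:
domain:spammy-links.example
# Disavow individual URLs:
http://blog.example/bad-page.html
```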
BARUCH LABUNSKI: OK, and 2048-bit is the certificate you suggest, yeah?
JOHN MUELLER: I think that's the common one at the moment.
BARUCH LABUNSKI: OK. Thanks.
MALE SPEAKER 1: Hi, John. Is there any difference between a link with a text anchor and a link from a picture, in case that picture has alt and title attributes, in terms of PR transfer?
JOHN MUELLER: PageRank should be the same, regardless of what you have there. With regards to associating the anchor with the URL, I believe both of those would be essentially equivalent. You don't need both the title and the alt tag. I think the alt tag is enough there, because otherwise it's essentially duplicated again.
MALE SPEAKER 1: Thanks a lot.
MALE SPEAKER 3: All right. OK, John, I'm just really curious to hear if we are doing anything wrong regarding using the Search Console API, because even though the quota says something around, I think, 2,000 requests per second, we can only manage around three per second. Who can we get in touch with to solve these kinds of issues?
JOHN MUELLER: I think that's the normal quota. So for the API, we have a general quota. I think it's 10 QPS. But that's across all the features. If you're only focusing on Search Analytics data, then I think the limit there is 2 QPS.
MALE SPEAKER 3: OK, so we would have to request an increase.
JOHN MUELLER: Yeah. I believe at the moment, we're discussing how we should set up the quota in the future so that people don't need to have special exceptions. So you probably won't hear from us that quickly, but we're looking at what we could do there to make that a little bit easier, so that it just works for people who need this kind of data. So in practice, what I'd do there is just make sure that you're throttling and spreading your traffic out, so you don't hit those quota limits. I think another difficulty there is that the error that's returned if you go above the quota is, I think, some generic error, instead of a clear message like, you're doing too much, you should slow down.
MALE SPEAKER 3: Would it help to do requests from, let's say, more than one IP address? Or is that irrelevant?
JOHN MUELLER: That's irrelevant.
MALE SPEAKER 3: Yeah, because we're hitting the same endpoint.
JOHN MUELLER: Yeah.
MALE SPEAKER 3: But is there any way-- we wouldn't need an increase-- maybe we could manage with the quota right now. But we want to tightly integrate with Search Console, and I should give the background: we are Site Improve, and we have created a website governance tool. We want to integrate keywords on a per-page basis, so they can see what pages are ranking for. And down the line, the quota may just be insufficient.
JOHN MUELLER: I don't have a great answer for you at the moment. I suspect it'll look different in a couple of months, after we figure out what we can do with the quota-- what kind of limits we can provide. Also, with regards to better documenting how we're actually doing the quota, specifically around Search Analytics, because that's where a lot of people are hitting the limits; with the other functionality of the API, people are not reaching those limits that much. So that's something where maybe we can clarify how many requests per site or per account you can do, so that you can better work out, well, if I grow to double my user base, I need to maybe do more caching on my side, or maybe bundle requests a little bit differently, so that I can remain below the quota for the long run.
MALE SPEAKER 2: John, can I go back to the app stuff quickly?
JOHN MUELLER: Sure.
MALE SPEAKER 2: So it seems like if the Fetch as Google does work, then there wouldn't be any crawl errors. So everything seems to be working properly in terms of deep linking and stuff like that. When I first launched, I had on average maybe like 600 impressions to 1,000 impressions. Then that Search Console bug happened, and now I'm literally at zero impressions to two impressions per day. Do I have a Penguin app penalty?
JOHN MUELLER: I'm not aware of any app penalties around that. So I wouldn't say that it's anything like that. I think what probably happened is that the timing was awkward, especially with regards to that Search Console bug that we had there, where maybe we were running an experiment before. We were running that intellibot then, and we stopped that experiment, and we had that bug in Search Console. Now, it looks like, well, because of that bug, the numbers went down. But actually, it's just because we stopped that experiment. But I really passed your site on to the team to look at, to see what specifically is happening there, because I also thought, well, this isn't really motivating for people to actually do app indexing if you get two impressions a day. That's not really that exciting. So I don't know what else I can really recommend there, other than encouraging your readers to use the app as well. What might make sense is to use the Smart App banner that you can use now for Chrome-- the banner that goes to the Play Store link-- so that when people are using their phone to access your site, you can point them there.
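Strictly speaking, "Smart App Banner" is the Safari feature; on Chrome, the equivalent app install banner pointing to the Play Store was driven by the page's web app manifest, linked via `<link rel="manifest" href="/manifest.json">`. A rough sketch, with an invented app name and package ID:

```json
{
  "name": "Example News App",
  "prefer_related_applications": true,
  "related_applications": [
    {
      "platform": "play",
      "id": "com.example.newsapp"
    }
  ]
}
```

The exact eligibility rules Chrome applies before showing the banner have changed over time, so treat this as the general shape rather than a guaranteed recipe.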
MALE SPEAKER 2: OK, I think I have that. I'll double check. And people are asking me, show me this app. Show me if it's worth doing. And I'm like, I really see nothing now. So I can't really tell people anymore.
JOHN MUELLER: All right. Let's take a break here. I think we have the next couple of Hangouts set up. The one, I believe, in the week between Christmas and New Year is a little bit longer. So we have some more time to chat, and maybe we won't go through so many questions or something like that. Maybe we'll do it a little bit more chatty. If people are around then, feel free to jump on in. Maybe we'll also have time to look at specific sites, depending on how much time we'll find to make it.
ROBB YOUNG: Can you reveal secrets on that one?
JOHN MUELLER: I don't know. Maybe.
ROBB YOUNG: One small secret?
JOHN MUELLER: I don't think that's going to go on. But maybe that secret meta tag that I was mentioning before.
ROBB YOUNG: All right.
JOHN MUELLER: All right.
ROBB YOUNG: It's worth asking.
BARUCH LABUNSKI: But you're working the holidays like always, right? Because you're on the forums every single holiday-- you and Matt Cutts last year.
JOHN MUELLER: Yeah. I don't know. I don't know what Matt's plans are for this year. I don't know. It's always nice to hang out in the forums and do some stuff there, especially over the holidays-- when you see people having problems with their websites, they usually appreciate that I try to help. That's something you guys can jump in on as well. And maybe you'll find the question here or there. I know Mihai has been doing a bunch of answers in the forums, which is really appreciated. So you're welcome to jump in as well, and maybe you'll see some of us there for the holidays.

All right, with that, let's take a break here. Thank you for joining. Thanks for all the questions. Thanks for all the discussions on the various topics that we've been looking at. And I hope to see some of you again in some of the future Hangouts. Bye, everyone.