Transcript Of The Office Hours Hangout
JOHN MUELLER: OK. Welcome, everyone, to today's Google Webmaster Central Office Hours Hangout. My name is John Mueller. I'm a Webmaster Trends Analyst here at Google Switzerland, and I'd like to help answer your webmaster and web search related questions. We have a bunch of questions that were submitted already. There are a bunch of people here in the Hangout live as well, ready to ask more questions, I bet. So let's get started. Let me mute you, Greg. You have a lot of background noise, but feel free to unmute if you have any questions.
BARUCH LABUNSKI: I got a question about a disavow.
JOHN MUELLER: All right. Go for it.
BARUCH LABUNSKI: So this is a question I've been waiting for, like, patiently, I guess, around two weeks. So suppose exampleone.com has decided to rebrand itself and add another keyword to their URL, or whatever, for all sorts of purposes. And so exampletwo.com comes around, and exampleone is 301ed to exampletwo, and exampleone also had a negative SEO impact on their website, so they decided to go ahead and disavow, and they left the file on the exampleone URL. And now exampletwo comes around, and let's say two or three weeks from now, those links transfer to exampletwo. Does that mean we need to have two disavows?
JOHN MUELLER: Yeah.
BARUCH LABUNSKI: Is there a reason for that, or should the brand-new exampletwo domain not have a disavow?
JOHN MUELLER: Well, if you're redirecting from exampleone to exampletwo, then you're essentially forwarding the PageRank from those links, so that's essentially what you'd want to kind of disavow. So what I would do there is just take your disavow file from exampleone and upload it on exampletwo as well, and continue working on that as you find more things that you need to disavow.
BARUCH LABUNSKI: Oh, so you can do that.
JOHN MUELLER: Yeah.
BARUCH LABUNSKI: Oh, OK, because I asked many guys out there and-- wow, so OK.
JOHN MUELLER: OK, great.
MALE SPEAKER: John, on that, would you do the exact same file, and not bother making new notations or anything, or a note in there saying "previously uploaded to"? Each separate URL has a line for notes. Would you do something on there?
JOHN MUELLER: We don't read the notes, so those are essentially for you. If you think the notes are useful, keep them there. If you don't want to add any notes, that's absolutely fine. We essentially process that automatically, so anything you leave in the notes there is going to get dropped.
MALE SPEAKER: And unless there's a manual review, that is, under a penalty?
JOHN MUELLER: Even then, we would only look at the results of the file, so we wouldn't look at the disavow file that was submitted.
BARUCH LABUNSKI: So it won't harm the site or anything like that, John?
JOHN MUELLER: Well, I mean, depending on what you do with the disavow file. Of course, if you include all of the links in your disavow file, then we'll treat your site as if it didn't have any links after we process all of that, so that could be problematic. But if you take the disavow file that you previously had, and you know that kind of covers the bad links that you found for your site, and reuse that on your new site, then that's fine.
BARUCH LABUNSKI: OK. Thank you.
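John's advice-- take the exampleone disavow file, upload it on exampletwo, and keep extending it-- can be scripted. A minimal sketch, assuming Google's documented disavow format (one full URL or `domain:example.com` entry per line, with `#` comment lines that, as John notes, get dropped during processing); the file names are illustrative:

```python
# Hypothetical sketch: merge the old site's disavow file into the new
# site's file, keeping entries in first-seen order and dropping
# duplicates. Lines starting with "#" are comments, which Google's
# automatic processing ignores anyway.

def merge_disavow(*paths):
    entries = []
    seen = set()
    for path in paths:
        with open(path, encoding="utf-8") as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue  # skip blank lines and comments
                if line not in seen:
                    seen.add(line)
                    entries.append(line)
    return entries

# Usage: write the merged entries out for upload on the new domain.
# merged = merge_disavow("exampleone-disavow.txt", "exampletwo-disavow.txt")
# with open("merged-disavow.txt", "w", encoding="utf-8") as f:
#     f.write("\n".join(merged) + "\n")
```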
MALE SPEAKER: Hey, John. Mind if I ask a question as well?
JOHN MUELLER: Sure.
MALE SPEAKER: All right, so I got a new client that I'm doing a link profile review for, and he has a lot of bad links, so I've got to deal with that. But he also had some interesting links that I think are specific to his niche, and they're not really intended to manipulate search engine rankings or something like that, but they are sort of self-submitted. I'm talking about a real estate agency that has its own listings, but also posts listings to aggregate sites, like classifieds or aggregators of these types of listings. And every time an agent of theirs posts a listing on these aggregator sites, there's also a profile of the agency that posted the listing. And so we have contact details like name, address, a phone number, and the website URL. So should I disavow these links as well, or are those links already ignored by Google, because it kind of sees that, yes, this is a feature of the platform itself, and shouldn't take that into account? I'll give you an example.
JOHN MUELLER: OK.
MALE SPEAKER: And the link to the agency's profile is in the top right corner, where you can see that.
JOHN MUELLER: OK. Yeah, I'd have to take a look at that to see the kind of value we have that's set up there. But in general, if you're dropping these links yourself, then that's something that shouldn't be passing PageRank. So on the one hand, we'd probably recommend for a site like that-- like it's almost like a directory site, I guess you could say-- to use a nofollow there as well. And that's something we probably try to recognize in our algorithms as well. So if you want to nofollow that, or if you want to use a disavow for that, that's probably fine, but it's not something where I'd say it's always extremely clear cut. So I'd have to take a look at that, how you have that set up, how that's actually working there, to be able to give you a little bit better advice.
MALE SPEAKER: The idea is that the agent posting the listing is not the one actually posting the link. The platform itself posts the agency's profile on every listing the agent posts, so that's just the same profile on every page.
JOHN MUELLER: I'd have to take a look at that. I don't have any complete answer for that [INAUDIBLE].
MALE SPEAKER: I'll send you a message on Google+ with that.
JOHN MUELLER: OK, sure. All right. Let's go through some of the questions that were submitted here. This one was really top ranked, and I think it shows that we're confusing people a little bit, so we have to kind of watch out for that. The question is: "Allow crawling, make them scroll a lot, don't scroll too much on mobile, but make them hide content. I'm confused. Who else is confused?" So I guess in general, one of the main themes here is crawling of CSS, because if you pull in content with your CSS-- for example, for your mobile-friendly pages-- and we can't recognize that content, we can't use it. So we try to pull in the content there, but if that's blocked, then we can't use it. So that's something that I think is generally a little bit more clear cut. We probably need to explain that a little bit better though, I guess, since so many people voted this question up. Also with regards to being user friendly, and hiding content on mobile, or not hiding content: that kind of plays into one of the topics we talked about in one of the previous Hangouts, where I said that if content is hidden on your page by default-- if a user goes to your page and it's not shown-- then that's something that we kind of discount in our indexing and for the relevancy in the search results. And that's been like that for years. Someone else pulled out a link to a Hangout, I think from 2012, where we talked about that. So that's definitely been like that since at least then. I think we've talked about that from before as well. So if content is hidden on your page when that page loads, that's not something that we would treat as critical or as important to a page as the content that's actually visible to the user. But I think these are topics that can get quite complicated when you take a look at all the specifics, so maybe we need to explain these a little bit better.
MALE SPEAKER: John, with that, the content not being shown by default-- would that also be taken into account for the mobile algorithm? Because a lot more is hidden to be friendly on a mobile than on a normal screen, or not?
JOHN MUELLER: With regards to the content on mobile, what we recommend doing is having the rel canonical pointing at the desktop page, if you have separate URLs, for example. And what would happen there is we'd index the page based on what is seen on the desktop page primarily. So if we can tell that the content is equivalent on mobile-- even if it's formatted differently, if things like images might not be shown-- then that's absolutely fine. We just expect that the primary content actually be equivalent. OK.
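For the separate-URL setup John describes, the two annotations are conventionally a rel="alternate" on the desktop page and a rel="canonical" on the mobile page. A minimal sketch generating the two tags; the URLs are placeholders, and the 640px media query is the commonly documented example value, not a detail from this Hangout:

```python
# Hypothetical sketch of the separate-URL annotations: the mobile page
# points a rel="canonical" at the desktop page, and the desktop page
# points a rel="alternate" at the mobile page.

def mobile_annotations(desktop_url, mobile_url):
    """Return the <head> link tags for a desktop/mobile URL pair."""
    return {
        # On the desktop page: advertise the mobile version.
        "desktop": ('<link rel="alternate" '
                    'media="only screen and (max-width: 640px)" '
                    f'href="{mobile_url}">'),
        # On the mobile page: point back at the desktop canonical.
        "mobile": f'<link rel="canonical" href="{desktop_url}">',
    }
```

With this in place, indexing is based primarily on the desktop page, as John notes, so the mobile page only needs equivalent primary content.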
BARUCH LABUNSKI: But if a page has 1,500 words, John, how should we display that on mobile? I mean, even if it converts it, it doesn't look-- because, I mean, the mobile user is completely different than the desktop user, right?
JOHN MUELLER: Yeah. I mean, that's something where you have to think about what you can do to create an equivalent experience for the user, where the primary content is equivalent. It doesn't have to be exactly the same. So kind of like, I guess, Wikipedia does that fairly nicely. They have the primary content the same, and you can expand out the individual sections if you want to do that.
BARUCH LABUNSKI: OK.
JOHN MUELLER: Or other sites I've seen have the primary content the same, and they treat some other content on the desktop page as kind of secondary content that they don't show at all, or that they show on different pages. So that's something that would work as well.
MALE SPEAKER: And John, I know we've probably discussed this a few times before, but I was just wondering about the content in terms of the position on the page that it sits. I think with the new redesign we're rolling out at the moment on our site, we've got buttons at the top that use a jQuery.scrollTo method, which means really the content is a lot higher and a lot more important than it might appear at first appearance. So is that being taken into account at the moment in terms of Googlebot? Or if my content was higher up, rather than being lower down somewhere, would it actually be better? Because really, it doesn't make sense when you look at our design, if that makes sense.
JOHN MUELLER: I don't have your design in my head at the moment, but in general, if you can see the content when you go to those pages, that's fine. And that doesn't mean you have to see all of the content, but one thing we try to avoid, for example, is that the primary content you see on a page when you go visit it is just an ad. Maybe it's even an ad for a different site. So if you go to a page like that, you feel like, well, there's nothing here that I searched for. But if you go to a page that makes sense, where the primary content is partially visible at least, then as a user, you go there and you say, well, this is what I was looking for. I know there's more further down on the page. It's clearly visible that the main content is at least there somewhere.
MALE SPEAKER: Yeah. I mean, we're making sure everything's actually visible and everything's on the page. So there's actually no hidden content, no divs that hide content. It was more a case of how high up that content should be, and whether that does actually make a difference. I know you say do everything the way that you want to do it. But obviously there are impacts. And if this is a 0.1 of a percent that makes a difference, having that content higher wouldn't impact our design massively. Should we put that higher up?
JOHN MUELLER: I think if it's visible on an average-- what do you call it-- the first page, when you view the page, then that's--
MALE SPEAKER: Above the fold, you mean?
JOHN MUELLER: Above the fold, yeah. It's not something--
MALE SPEAKER: But that actually is better.
JOHN MUELLER: It's not something where I'd say 50 pixels is better than 100 pixels from the top of the page. I don't think that's a little detail you need to worry about.
MALE SPEAKER: Yeah, OK. Thanks, John.
JOHN MUELLER: Sure.
MALE SPEAKER: John, I have a quick question for you if you have a second. In regards to new websites in highly competitive verticals with solid on- and off-page SEO signals, can it take longer than normal to rank in extremely competitive verticals?
JOHN MUELLER: Sure. I mean, this is something where essentially the answer isn't really related to Google's algorithms, but if you're in a very competitive market, then those competitors will have spent a lot of time and a lot of money on their websites, getting things set up right. Maybe they're going to have a lot of really happy users, and kind of breaking into that market is always going to be hard. And our algorithms look at a bunch of stuff when we look at different sites, and if we see one site that's really new, that has some signals but not really all the signals that we're looking at, then sometimes we can guess and say, well, this might be OK. We'll just try it out. But sometimes, we just have so many strong signals from the existing sites that we say, well, this site really just has to prove itself first before we start putting it high up in the search results.
MALE SPEAKER: Thanks. And I also apologize for the background noise, being in a coffee shop.
JOHN MUELLER: No problem. All right. Next question here. "Google will discount the content added in tabs or behind click-to-read-more links. How will it affect rankings for search queries related to that content? Will this happen in 100% of the cases? When will you start to do this? Google also uses tabbed content." So again, we've been doing this for years now. It's not really something new. If your content isn't really visible on a page when we load it, then that's something we'll try to take into account in our algorithms. It's not that we'll ignore it completely, but we'll say this isn't the primary content, and we might kind of discount its weight when we do the relevancy calculations in the search results.
BARUCH LABUNSKI: If the developer wanted to create a user experience-- like if it's selling something like tires, for instance, like 9.99, 10.99-- you separate it in tabs, and I understand the user would not find out that, hey, you could click here in the tab. But I mean, that's a problem, yeah?
JOHN MUELLER: It's primarily a problem because people might be coming from the search results looking for something specific, and the snippet might say, oh, you can read all about our great shoe store here. And you go to that page, and it doesn't have anything about a shoe store, because it's hidden away in one of the tabs. And that's kind of the problem that we're facing there. And it's not that we'd say we take this out completely, but we kind of discount it. So if this is secondary content that's not primarily relevant to those pages, then if you search for an exact match from that text, then probably you'll still find it in search. But it's not something where we'd say this is really the primary content.
BARUCH LABUNSKI: I actually found it, like you said. I did find it, and it ended up going to the exact page.
JOHN MUELLER: Yeah. I mean, if that's something that's really unique to that page, we'll still kind of take that into account. But if you're competing with other sites, if you're in, like, normal search results for generic queries, then that's something where we'd say, well, this page doesn't primarily seem to be focused on this topic. It also includes something about it, but it's not the primary focus. So what I would do, especially with regards to tabs, is think about whether or not you can split those tabs off into separate pages. If you say this is really important content in these tabs, I really want my page to rank for this, then in that case, load it off into a separate page, or put it on your primary page. If it's something where you say, well, this is kind of secondary information-- you're looking at, I don't know, tires, for example, like you mentioned, and you have all the different sizes that you have available-- then maybe that's not critical to your page, and you could put that into a tab.
BARUCH LABUNSKI: John, does Googlebot like parallax websites?
JOHN MUELLER: I don't know. Probably. What's parallax?
BARUCH LABUNSKI: Where you can just continue to scroll and--
JOHN MUELLER: Oh, like the infinite scroll type things.
MALE SPEAKER: John, I was asking you the same thing, right exactly on this topic. How about infinite scroll websites, where you have all the content on one page?
JOHN MUELLER: With infinite scroll websites, on the one hand, we have our recommendations that we published, I think, a couple months ago, maybe longer, on how you can kind of paginate that infinite scroll content. The important part there is, if this is really a gigantic HTML page, at some point we'll just cut off and say we're just indexing up to here, and we're kind of focusing on that. So that's something to keep in mind. If it's something that continuously loads more and more as you scroll down, then we'll also just scroll down to a certain point and say, well, we've been loading more content here for a while now. This is probably not really that relevant any more, and we'll stop at some point. So it's not that we'll index everything on an infinite scroll page, because it could be going on forever, but we'll try to get a good view of that page. So--
MALE SPEAKER: Go ahead, please.
JOHN MUELLER: So I guess one recommendation I would make there is, if you're doing infinite scroll, make sure you also have some kind of paginated navigation, or some kind of category navigation as well, so that people can kind of click links and go to that content, instead of having to go to the homepage and scrolling down 500 times until they reach that piece of content.
MALE SPEAKER: Do you have a certain amount of data you're able to scroll-- I mean, to be on the safe side, if you make this--
JOHN MUELLER: I don't know. I remember someone did a test a while back, and it was something like 50 megabytes of HTML. I don't know if that's still relevant, but especially with 50 megabytes, you can put a lot of content into 50 megabytes of HTML. So if you're at that size, I think you're probably better off splitting things up.
MALE SPEAKER: OK. Thank you very much.
MALE SPEAKER: Surely you're presenting yourself with the same issue of hidden text at that point that we just had the--
JOHN MUELLER: Yeah, I mean, it starts getting into similar areas, where we say, well, this content is so far down. Is this really still relevant? The tricky part with really big pages is recognizing which parts of the page are actually important, because sometimes you'll have something like a PDF that goes on for 100 pages, and there's something really important on page 99, but if it's not mentioned somewhere at the top, then we might actually miss out on that. So that's something where, if there's something really critical to your pages, make sure it's visible to your users, and then we can take that into account a lot easier as well.
MALE SPEAKER: John, I have one more follow-up question. I want to clearly state this is not a Penguin question. Sorry, Barry. In regards to third party metrics, such as Majestic, and Ahrefs, and Moz, does any part of Google's algorithms acknowledge and look at those metrics when assessing a site?
JOHN MUELLER: No.
MALE SPEAKER: Not domain authority or anything like that?
JOHN MUELLER: I mean, what might happen is that some of these metrics overlap with other things that we look at, but it's definitely not the case that we have a Majestic API license and we go and look up sites there. I think to some extent, these tools are pulling in similar metrics and trying to recreate metrics that Google might be using, and I think some of them do a really interesting job of that. But it's not the case that we would use their metrics, because we have enough metrics to take into account on our side directly.
MALE SPEAKER: Thank you, sir.
JOHN MUELLER: All right. "Is a new startup's ranking based on how many pages you have overall on the site, which is difficult when there are competitors with lots of content, or is it OK to focus on a smaller number of high quality pages in the initial stages?" I think this is similar to the previous question we had about existing markets. Essentially, we're looking at your pages based on what we find there, and it's not a matter of the quantity of content you have on your pages, and more of the quality and how that actually fits in overall. And what I'd recommend doing there is the same thing you would do with any new business that kind of goes into an existing market: find something that you do really well, and focus on that, and kind of build up on that. Instead of trying to be the same as everyone else in that market, really try to find something that sets you apart, that makes you unique, and focus on that. So don't just create the same number of pages that other people have. Don't just focus on the bulk of content. Really find something that makes your business unique, that kind of gives us a reason to show your site as well in those kinds of search results.
"Our homepage and department pages have a cache date of 22nd of October. We had a big drop in crawl rate for a few days, but then went back to normal rankings. Organic traffic has also reduced. Is this a sign of a penalty?" If it were a penalty, you'd definitely get a notice in Webmaster Tools, so I don't know if you looked there, but I'd double check there at least. We did have a technical problem on our side with regard to the cached pages, where a lot of sites were seeing cache dates around the 20th, 21st, 22nd of October, and that should be resolved in the meantime, so that's not something that should be causing any problems there. And the cache date isn't something that affects how we crawl or index your site. So if you're seeing changes with regards to your ranking or your crawling, then that would be completely unrelated to the cache date.
Oh, a Penguin question, maybe just a real short one. "Is the Penguin update done rolling out?" Essentially, I think it's still rolling out to some extent, so this is something where they're doing a really slow and cautious rollout of this data there, but I don't have anything new to add to those discussions.
MALE SPEAKER: Is it rolling out across the board?
JOHN MUELLER: Sorry. Sorry?
MALE SPEAKER: Is it rolling out across the board, like across-- or, let's say, North America's already been completed and now it's overseas? Or is it being rolled out everywhere still?
JOHN MUELLER: We try not to split things up by country or by language unless there are specific reasons for that, like policy reasons or legal reasons. For example, when it comes, I don't know, maybe to the snippets, or titles, or things like that, where it's also hard for us to understand the different language content. So those are the kinds of cases where we kind of split this up by country. But general web spam, general quality algorithms, are things we try to do globally.
"What's Google's view of duplicate content, and how does that relate to feeds, such as MLS feeds for real estate listings? Real estate sites end up competing against each other over exactly the same content." That's always a problem, I guess, especially if you kind of have your own content duplicated like that, if you syndicate your content like that. I think that's the same across the board. It's not specific to real estate sites, but as soon as you syndicate your content, we crawl and index all of those different versions. And in the search results, we have to make a decision which one we want to show there, so if all of this content is essentially the same, we'll see that as duplicate content. And in the search results, we'll try to pick one or the other and show that as appropriate. So that's not something I see specific to MLS sites or real estate sites. It's really across the board. If you're syndicating your content, then people might be-- or we might be picking one or the other one to actually show in the search results.
MALE SPEAKER: And John, in terms of duplicated content, another thing that we touched on in the past is your internal duplicated content. And the one thing I've noticed that hasn't recovered on our site is the office space search related information. So we have a lot of tailored virtual office products, but we also broker a lot of office space, and so that means that we have multiple pages with duplicated content in areas that have a close proximity. And due to the fact that I feel like the content is fairly high quality, especially compared to others I'm seeing out there, I'm still concerned that there is a problem with having this many locations-- let's say 3,000 or 4,000 different locations with various duplicated internal content-- that maybe is being looked at by Panda and not giving us the best results. And I'm wondering, what is my best course of action? Am I better off reducing the amount of pages I actually have, and focusing only on a core set of locations? Is that actually going to make a difference, or am I dealing with a factor that I'm not taking into account? Sorry for the long question.
JOHN MUELLER: I guess it's always hard to make a decision between having more pages that are kind of similar and having fewer pages that are very focused. But it's not something where I'd say it's primarily a quality reason, where we'd say these are really low quality pages, and we'd kind of demote the site because of that. Most of the time, it's just a matter of having a lot of pages that have very, let's say, diluted signals, as compared to a few pages that have very strong signals. So depending on your website, it might make sense to kind of shift the balance a little bit over towards having fewer pages that are really strong on their own, or it might make sense to say, well, these pages are so strong, I can afford to split them off into two--
MALE SPEAKER: What do you mean, strong?
JOHN MUELLER: How do you--
MALE SPEAKER: What do you mean by strong? I mean, what would happen is, essentially, I would be taking 5,000 pages and I would remove 4,000 of them, but nothing would actually physically change in those pages.
BARUCH LABUNSKI: What if he rewrites those 5,000 pages? Then it's fine, no?
JOHN MUELLER: That's a lot of content, yeah.
MALE SPEAKER: We already wrote half a million words when we realized that we had a problem, as the previous person in the question asked there about MLS and whether or not you should be sharing content. We had that issue, so we rewrote everything, half a million words. It was huge, so we're just trying to find what the right avenue is. I mean, this is what you've been telling people to do. I'm trying to also give people some help on multiple forums who seem to be having similar issues. So for the car industry, for the real estate industry, for our industry in office space-- there are a lot of others who are probably listening and watching this that don't know what to do in this situation.
JOHN MUELLER: Yeah. It's really hard. So one thing I'd recommend doing is making sure that you don't have search pages indexed. That's one aspect that sometimes kind of explodes the number of pages that we find from a site. If you have a search form, you can enter any word and it'll find any number of pages on your site; that essentially creates a ton of different pages, which generally end up being not so important for a site. So that's, I think, one aspect I'd kind of exclude from there. If you're essentially looking at the, I guess you could say, category level of pages on your site, where you have one category and all of the different products, roughly speaking, listed there, then that's something where sometimes it makes sense to have those indexed like that. If people are looking for, for example, I don't know, office space in specific cities, then maybe it makes sense to create this kind of a category page. If that's something that kind of explodes on its own as well-- where you can enter a city name and it'll say office space within 500 miles of this city-- then that's going to bring a lot of information that's probably not so relevant for the user. So somewhere between something that's very targeted and something that's too broad to actually be useful for the user, I think that's where you kind of have to find the match. And I don't think there's one size that fits all for this kind of question.
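John's first tip above, keeping internal search pages out of the index, is usually implemented with a robots.txt rule or a noindex robots meta tag. A minimal robots.txt sketch, assuming the site's search results live under a /search path (that path is an illustrative assumption, not something from this Hangout):

```
# Keep crawlers out of internal search result pages.
# The /search path is an assumption -- adjust to your own URL structure.
User-agent: *
Disallow: /search
```

Alternatively, serving `<meta name="robots" content="noindex">` on the search results template lets the pages be crawled but keeps them out of the index.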
MALE SPEAKER: And what's the result of that, then, if it is? I mean, is it a quality algorithm? Is it Panda? What's actually affecting whether or not that page ranks well, compared to maybe having a reduced amount? Is it content quality?
JOHN MUELLER: On the one hand, it's something like quality. But I think the primary thing that happens there is, if you have fewer pages on your website, then all of the good signals that we have for your website will be kind of split among those fewer pages, as compared to being split among thousands of different pages. So that's kind of where you're concentrating the value, the signals, all kinds of signals that we've collected for your website, and can kind of focus those on fewer pages, to make those pages a little bit stronger. And this is something that comes up, for example, with international websites as well, where maybe you have content for Germany, Switzerland, and Austria, but it's all in German. And maybe it makes sense to combine that into one page that's in German that's valid for all of these countries, instead of creating separate pages for each of these countries, because it's essentially the same thing. So kind of finding the balance between spreading things out, so that you have information that's relevant for specific users, and being able to focus on something-- like really concentrating the value of your website onto those pages-- is, I guess, a tricky balance to find. And it really depends on your website, so it's not that there's one solution for all.
MALE SPEAKER: John, I have a follow-up question, sort of a hybrid question regarding the third party link metric tools. For new top level domains-- I know this question's been asked before, but how does Google view new top level domains? And part of the reason I'm bringing up the third party metrics is because new top level domains always show zero according to Moz's domain authority, and MozRank, and all that stuff. So I just was wondering how Google handles new top level domains versus the older ones.
JOHN MUELLER: We essentially treat them as generic top level domains. So that's something where, if you can't get a dot com, but you can get a dot company, or whatever top level domain is out there now, then that's essentially fine. That's something you could use. You can set geotargeting in Webmaster Tools for those domains. You can essentially use them as normal. So it's not that we're treating them with caution, or we're holding them back, or anything. They can rank just as well. Sometimes they rank really well. So I don't know. I think I saw some report externally about some of those new top level domains, saying they actually ranked better than the previous ones. And it's not really the case that they ranked better because of the top level domains. It's just that if you have a really good website and you redirect it to a new top level domain, we'll try to follow that information and rank them appropriately.
MALE SPEAKER: Is it safe to say to steer clear from dot infos?
JOHN MUELLER: No. That's absolutely fine to use. I think one of our-- I don't know who is, like, the leader of the research department-- Amit Singhal, for example, has a dot info website. So if he believes in dot info websites, then I think it's safe to say that they can rank just as well as anything else.
MALE SPEAKER: I've just noticed that there's so much spam out there. I mean, these spam networks specifically use so many dot infos, which is why I was wondering if Google had any sort of downward connotation towards ones that are traditionally used for spam.
JOHN MUELLER: We try to avoid doing that too broadly. I mean, when it comes to free hosters, for example, that have, like, a subdomain, that's something where sometimes we'll take fairly broad action, because we'll say, well, this free hoster has, like, so much spam on here, we can't even find the good content on there. We'll just treat everything as being kind of problematic. But with top level domains, I think that's not something that we're doing at the moment. That's not something that really makes sense, because people are using them for different purposes, and if spammers picked up on one of these ones, that doesn't mean that all the sites there are necessarily bad.
BARUCH LABUNSKI: What's the limit on sister sites, like, the same?
JOHN MUELLER: There's no hard limit in that regard, but I think it's like with everything else: keep it to a reasonable amount, and try not to kind of go overboard with those kinds of things.
BARUCH LABUNSKI: [INAUDIBLE].

JOHN MUELLER: Sorry?

BARUCH LABUNSKI: I sent you an email regarding that, and it's still been around for over a year.

MALE SPEAKER: Yes. Now it is.

MALE SPEAKER: I'm going to mute, mute, mute, mute. Sorry, what?
JOHN MUELLER: Yeah, I mean, that's something where we don't have any strict guidelines where we'll say you're only allowed to have like two websites in search or something like that. So to some extent, if you have sites that are significantly separate, that's absolutely fine. If you have sites that are exactly the same, then we should be treating that the same from a technical point of view. It's not something that we'd kind of manually curate and say, well, this is from the same company, therefore they should never be showing up in the same search results.

BARUCH LABUNSKI: It's the same TLD. It's the same TLD, but one is without a dash and one is with a dash.

JOHN MUELLER: That can be fine as well. I mean, it's not something where we kind of, let's say, look at the search results and say we need to clean these up and manually curate the search results, because we can't really do that. I think, what is it, 10% of all queries we see every day are new, so we couldn't realistically ever clean up all the search results manually and say, well, this site is kind of the same as that one, we can fold those together, and this is actually the same company, or this site is actually selling the same product as the other site there. That's not the level of detail where we'd get involved, and that's something that our algorithms should be able to figure out. And sometimes they do a good job. Sometimes they don't get it completely right, and that feedback is good to have, but it's not the case that we would manually kind of clean those things out when we're talking about a small number of pages. But I remember your email, and I know we've talked about it with the engineers a few times, so it's not something that I'd say is a completely lost cause.
BARUCH LABUNSKI: Sounds good. It's just, crowding the elevator is unfair, and others are waiting. It's just--

JOHN MUELLER: Yeah. It's really hard to say. I mean, it's similar to having the same kind of search results from the same site, where we also don't have a hard limit and say, well, you can only show up twice in the same search results. So that's kind of something that the quality engineers like to have the freedom to kind of expand or kind of contract as they think is necessary by the algorithms.

But let's go through some of the other questions here before they get lost. Does every single product page need to have a product description for optimal ranking, or if the product is obvious and all the user wants is a picture and price, will that suffice for optimal ranking as long as the user is happy? If all you have on a page is a picture and a price, then we probably won't be able to figure out what this page is about. So to some extent, we need some information on these pages to be able to figure out what we should be ranking this page for. So I would definitely at least put something on there that you'd like to rank for, that you think is relevant for the user.

What if one of the tabs is for reviews and there are hundreds, in some cases thousands, of reviews? I guess one of the things you could do here is create separate pages for this. You could paginate that separately if you wanted to. You could filter out the best reviews and only show those. This is essentially a question of how you want to design it on your website, what you want to do there. And there are different options available depending on what you want to do, what you want to have indexed, what you don't want to have indexed.

When using AJAX to serve content, is it OK to show a dynamic URL in the address bar but have a search engine friendly canonical URL if the content is the same? For example, domain.com/category/product and domain.com/?categoryid=123&productid=15. Generally, I'd recommend using the same URLs that you want to have indexed as the ones that are actually shown to the user. To some extent, we'll figure it out if you have something like a rel canonical on these pages. But especially with the newer techniques that you can use with HTML5, you can use normal-looking URLs and essentially have those indexed as well. So that's something where I'd try to show exactly the same URL to both search engines and users.

My website got negative SEO with thousands of spammy links. I'm the only one in the company. With Penguin 3.0, it lost 70% of its traffic. I've already disavowed, but I'm losing in the meantime. Can you advise what to do? I took a quick look at the site beforehand, and I don't see anything particularly critical with regards to those links that's really causing a problem there, so I'd continue focusing on your website and make sure it's really the best of its kind. It's not something where I'd say you need to do something really specific with the links, or where this negative SEO that you might be seeing there is causing you problems.

Webmaster Tools is giving an error, a missing title tag in robots.txt. Is this a glitch? I don't think the robots.txt should have a title tag. That's probably true, yeah. I think there are a few cases where we want to index the robots.txt file, so if you're seeing something like this, I think that's definitely safe to ignore.
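For reference, robots.txt is a plain text file of crawler directives, not an HTML document, which is why a title tag has no place in it. A minimal sketch, with made-up paths:

```
# robots.txt is plain text: directives, comments, and blank lines only.
User-agent: *
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```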
LYLE ROMER: Hey, John. Also just a real quick follow up in Webmaster Tools. A couple Hangouts ago, I brought up the crawl errors issue where, when we switched to the secure HTTPS, you lose the traceability of where the crawl errors came from. Just wanted to follow up and see if anything had been looked at as far as that goes.

JOHN MUELLER: That's definitely something I passed on to the team, to kind of look at what they could do there to improve that. I don't know if they've been able to change anything there, but in theory, you should see the original source of those links in the meantime. I think that should have been the case beforehand as well, so I don't know if that was just like a temporary stage from the redirects that were showing up there, but that should actually show the external or internal source of that link.

LYLE ROMER: Well, I checked again. They still weren't showing. I mean, basically, the non-HTTPS Webmaster Tools account shows everything's fine because it gets redirected, and then the HTTPS version just shows everything as being referred from the HTTP version.

JOHN MUELLER: Do you have a redirect from [INAUDIBLE]?

LYLE ROMER: Everything blanket redirects from the non-secure to the new HTTPS, and I think that's where the problem is, because the non-secure Webmaster Tools doesn't see a problem, because it's getting redirected. But then when the page doesn't exist and there's a crawl error, the secure side just sees the referral URL as being from the non-secure.

JOHN MUELLER: That should actually catch up as we kind of re-crawl and re-index everything. So that should be cleaning up, at least from what I've seen and from looking at other sites. But I can definitely take another look at your site to see if there's maybe something different happening there that's kind of holding that back.

LYLE ROMER: OK. I'll pop you the URL. Thanks.
JOHN MUELLER: Sure.
MALE SPEAKER: John, can I step in with a question, please?

JOHN MUELLER: Sure.

MALE SPEAKER: I'm seeing in Webmaster Tools that Google recognized a lot of structured data markup, including the hListing microformat. Can you please tell me in which way it is used by Google, as I can't see it applied in search results, at least not in a visible manner?

JOHN MUELLER: The structured data dashboard that we have in Webmaster Tools essentially shows all structured data that we find on your website. So if you are using this markup on your pages, then we'll show that in the dashboard. That doesn't necessarily mean that we use it in search; essentially, what it's saying is, we found this markup on your pages. That could be a schema.org markup that we don't use, for example, or it could be a specific type of microformat that maybe you have on your page but we don't use. We show that regardless in Webmaster Tools.
MALE SPEAKER: I see, because I'm starting to clean it up, I mean, to make it more efficient. I'm seeing the errors dropping down, and I was wondering if there is any benefit from it.

JOHN MUELLER: Any benefit from using structured data that essentially isn't shown in the search results? So that's always a tricky question, because on the one hand, it makes it a little bit easier for us to recognize what your page is about if we have more markup on those pages, but it's not the case that we'd say this significantly affects your site's ranking. So primarily, what you would see there is the difference between content that, at the moment, we don't use in rich snippets, for example, but that we might use in the future. And for example, when the engineers look at this, they'll often say, well, this is the type of markup, if we could find this on the web, we'd be able to show that in rich snippets. And then they look at the web as it is now and they say, well, nobody is ever using this markup; therefore, we probably won't even set up any rich snippets for it. Or if they look at the web and say, well, look, there are lots of sites that are using this markup already, then it makes it a lot easier for us to actually start using that for rich snippets. So in a sense, you're kind of ahead of the curve if you do that, but you're also kind of betting that this specific type of markup might at some point be used for rich snippets. So it's something-- I think if you like spending the time on structured data and you think it makes sense to kind of mark up your page, then that's fine. But if you need to focus on something that's visible in the short term, then that's probably not something that you'd need to do.
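As an illustration of the kind of markup being discussed, here is a minimal sketch of schema.org product markup in JSON-LD form. The product name and price are invented, and as John notes, having this markup on a page does not guarantee that Google will use it for rich snippets:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "description": "A short, indexable description of the product.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
```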
MALE SPEAKER: OK, thank you.
MALE SPEAKER: John, I have a colleague whose website received a manual penalty back in, I believe, March. And it was revoked a month later, and they've still seen no traffic gain, even after manually removing links and doing a thorough disavow.

JOHN MUELLER: Sometimes that happens when the manual action kind of matches something that our algorithms will be doing at about the same time. So sometimes, from the timing, it's very clear that this manual action took place and it got cleaned up and then the algorithm took effect, but sometimes that border is kind of washed out from the timings. So what might be happening there, or probably is happening there, is that a manual action was taken based on some issue that was found there. At about the same time, or sometime in between there, the algorithms also found something problematic and took action on that. And it might be the same thing. It might be something--

MALE SPEAKER: Is it OK if I maybe shoot you a message after this, just with that domain?

JOHN MUELLER: Sure, sure. I'm happy to take a look. I can't promise that I'll have anything specific back for you.
MALE SPEAKER: Thanks.
JOHN MUELLER: Sometimes we can find something. Sometimes it's useful to pass on to the engineers to say, hey, people are confused with this kind of thing. But it's not always the case that we'd be able to let you know what exactly was happening.
MALE SPEAKER: I appreciate it.
JOHN MUELLER: All right. We redesigned our website so that some URLs will be different. We have a big AdWords account that runs daily. Is there any way to change URLs to correct them, because users will see 404 pages and we'll be charged for useless clicks? This isn't something that Webmaster Tools will be able to help you with, because, on the one hand, the data in Webmaster Tools is based on the crawling and not necessarily based on the ads. On the other hand, the data in Webmaster Tools is generally a little bit delayed, by a couple of days, depending on how we crawl and index your site. So that's something where I'd think about setting up a tool on your site that actually crawls these URLs. There are lots of existing tools that crawl a list of URLs, but making your own is generally pretty easy as well.

Let me run through a whole bunch of the questions that we have left in a few minutes. John, will there be a formal notification when the Penguin update has actually completed? I don't know. To some extent, we're hoping that these things will just keep on updating, so maybe it'll just kind of roll over into that.

Would you be able to say why my site suddenly lost its rankings? This is the site I looked at before, and from our point of view, it's not that there's anything specifically bad about the site. It's just that algorithms have changed over the years, so if your site was ranking really well a couple of years ago, that doesn't mean it'll always be ranking really well.

Any update on the expandable hidden content issues? We talked about that.

I understand Google Places and Maps are separate from organic search. Do the local place results show search exposures in the average keyword ranks in Webmaster Tools? And no. If you're shown within the map, then that's not something we show in Webmaster Tools.

Seems everywhere [INAUDIBLE] actions connected to the target link network. Those affected get very quick attention to reconsideration requests, but those who are just working and submitting seem to wait longer. Does the web spam team prioritize those over others already waiting? No. We essentially have our normal list of sites that we go through, and we go through them step by step. So it's not something where we prioritize them differently depending on what actually happened.

All right, oh. Lots of questions left. But maybe I'll just take one or two from you guys before someone tries to grab this room.
BARUCH LABUNSKI: When is mobile usability going to kick in in 2015? I'm not asking for a specific date, time, where, what. I just wanted to know because is there still time to get prepared?

JOHN MUELLER: Sure. I mean, there's a lot of time. Like, if you have any issues, it's always a good time to fix those issues. So with regards to mobile, we are looking at what we can do there past just showing that label in the search results. But if you notice that there are issues with your site with regards to mobile, I'd definitely clean those up whenever you can. So it's not that I'd say you have to give up and say, oh, I missed this train, I'll wait for the next one. Users are going to continue using smartphones and want to use your site.

BARUCH LABUNSKI: No, no, no, because you said it's going to kick in for ranking, that's all. You said there's going to be ranking involved for mobile usability. I think [INAUDIBLE].

JOHN MUELLER: I think that's something that's still being talked about, so I don't have any specific news on that. But like with everything else, if that takes place tomorrow, it still makes sense to work on your site today. If it takes place in half a year, it still makes sense to work on your site to clean those issues up. So I wouldn't focus on when it kicks in, but rather, if you see problems, try to get those fixed.

BARUCH LABUNSKI: We have around three weeks.

JOHN MUELLER: No, no, no. I didn't say any dates.

BARUCH LABUNSKI: No, no, no. I'm saying I have about three weeks, I guess. It's going to be cleared up in three weeks, but there's like thousands of pages.
MALE SPEAKER: John, can I ask you another question?

JOHN MUELLER: Sure.

MALE SPEAKER: The technical Webmaster guidelines have been updated to include progressive enhancement. How should this influence the way we create our websites?

BARUCH LABUNSKI: Hm.

MALE SPEAKER: I mean, if today's browsers support CSS for tooltips and expandable content, why shouldn't we use that to our advantage-- I mean, from an SEO point of view?

JOHN MUELLER: Good question. I don't have any quick answer for that, but--

MALE SPEAKER: I'll come back with this question next time.

JOHN MUELLER: I think the important part is just that Googlebot is more and more like a normal browser, and we try to pick up on more information as we can kind of find it from the HTML pages. So we try to get as much from there as possible. And using things like progressive enhancement makes it easy for us to pick up the primary content as quickly as possible. And if we can get a little bit more information by processing more of the page, then that's something that kind of adds a little bit more value. But if you kind of focus on something that only works in modern browsers and doesn't work anywhere else, then that's something where we're going to have trouble kind of including directly in search.
MALE SPEAKER: I understand. I was saying that because you said you won't be using CSS tooltips, small windows, and expandable content, and this is one of the features for which I think you just put progressive enhancement in your guidelines. So there has to be an advantage for it somewhere.

JOHN MUELLER: I don't have the exact text from the guidelines off hand, so I'd have to think a little bit about what exactly we put in there.

MALE SPEAKER: No problem. Thank you.
MALE SPEAKER: John, quick question about hreflang.

JOHN MUELLER: OK.

MALE SPEAKER: So I got a website that has content translated into multiple languages. And for the English language, they use an hreflang with en-us. Would that be a problem for somebody searching from, like, the UK, or Australia, or Canada? Should they use an en annotation, or x-default?

JOHN MUELLER: It depends. So en is probably a good match there. If this is the only English version that you have, then we know that this applies to all English countries. So that's probably a good match there. If that's also the page that's relevant for all random users that you don't have a specific language for, then you can also use x-default. So you can use the same page and say, this is en-us, but it's also en. It's like a generic English page. Maybe you have an en-UK page as well. And you could also say this is the x-default page. So the same page can have multiple annotations like that.
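So the same URL can carry several hreflang annotations at once. A sketch of what the head of such a page could look like, using example.com as a placeholder (note that the formal hreflang code for the UK is en-GB rather than "en-UK"):

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/" />
<link rel="alternate" hreflang="en" href="https://example.com/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/" />
```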
MALE SPEAKER: OK. So I can use en-us and en at the same time?

JOHN MUELLER: Yes. Yes.

MALE SPEAKER: Oh, OK. OK, cool. Thanks.

JOHN MUELLER: All right. Let's take a break here. It's been great having you guys. Thank you all for joining in, and maybe I'll see you guys again next time, one of the future times. Have a great week.