In response to a Tweet, Google’s John Mueller made a very relevant point when it comes to receiving links from non-English language websites.
The question to John asked: “I received some dofollow links from some non english websites. My sites priamary (primary) language is english. Have no idea how they got to find my stuff and link to it.. Should i be bothered about those links.. They have a neat link profile”
John then pointed out the following:
Just because a website is not in English doesn’t mean it’s spam, or that links from it are bad.
This highlights the point that backlinks should be judged on their individual merit, rather than on the country of origin of the site they come from.
As part of a Reddit AMA that Google’s Gary Illyes completed in February 2019, a user asked a question with regards to the impact of conversion rates on search results.
Specifically Gary was asked:
“I guess Google is smart enough to understand conversions on some website (bought product, booked accommodation, etc).
Is the percentage (or a number) of conversions something that Google would count as a ranking factor (as a good UX signal)”
Gary replied with a brief but definitive response:
So there we have it: the conversion rate of customers on your site has neither a positive nor a negative impact on how your site ranks in Google search results.
In a Reddit AMA from August 2018, Google’s John Mueller was asked for advice on how to support the recovery of a site with a bad history:
“One of my clients got a domain which had a bad history, Many SEO experts claim that it’s an Algorithmic hit or the site was rated low in the Google Quality Rating. There’s nothing one can do other than improving the quality. Even if the content quality and all is improved, How does Google know that?
In our case, we’ve removed all the previous content, we’ve tested with new content which has been up to mark for 3 months now, but we believe that the website isn’t performing up to it’s fullest potential, even a new site outranks the 4 year old site.”
In response to this, John replied with the following advice:
There’s no “reset button” for a domain, we don’t even have that internally, so a manual review wouldn’t change anything there. If there’s a lot of bad history associated with that, you either have to live with it, clean it up as much as possible, or move to a different domain. I realize that can be a hassle, but it’s the same with any kind of business, cleaning up a bad name/reputation can be a lot of work, and it’s hard to say ahead of time if it’ll be worth it in the end.
John discussed that options include cleaning up the site, or moving to a new domain.
In another Reddit AMA from February 2019, Google’s Gary Illyes confirmed that there is no point at which a site becomes unsalvageable.
In response to this question that was put to him: “Can a domain be damaged in Google’s algorithm to the point that it’s not repairable? I don’t mean manual penalties, but for some reason the domain won’t be able to rank given its history of sketchy content / behavior?”
Gary replied with a candid answer:
These are the thoughts of two separate Googlers on domains that may previously have been penalised.
Google have repeatedly said that they use hundreds of factors when determining where websites should appear in the search results. This point was once again confirmed by Google’s Gary Illyes.
In a Reddit AMA from February 2019, Gary was asked a question regarding some of the factors affecting search results: “If the algorithm isn’t tracking CTR, Dwell time, etc. how does Google know if a piece of content is successful? Or is this determined by on-page elements that Google deems to be good?
For example, instead of monitoring user metrics on the page, Google makes the assumption that if the page is fast, and has relevant content above the fold it will please users.”
Rather than being able to give a specific and definitive answer to this question, Gary did confirm that there are in fact over 200 factors affecting search results:
PR answer: we use over 200 signals to rank pages.
Gary answer: we use over 200 signals to rank pages, and some of those are even announced and you mentioned them.
It’s unsurprising that Gary wasn’t able to provide a more in-depth answer, especially in an AMA; however, it’s still useful to receive confirmation once again of there being hundreds of factors.
In a Google Webmaster Hangout in March 2017, Google’s John Mueller was asked the question:
“Can low quality pages on a site affect the overall authority?”
In response, John answered with the following:
Yes… in general when our algorithms look at a website overall, they do look at individual pages as well. If there are a bunch of really low quality, bad pages on a site, then that does affect how we view a site overall.
On the other hand if it’s a really large website and there’s a handful of pages which are kind of bad, then we also understand that in the bigger scheme of things that those pages aren’t the main issue for this website.
If you’re aware of low quality pages on your website then that is something I’d try to fix and find a solution. So either removing those pages if you really cannot change them, or in the best case finding a way to make them less low quality, and make them useful pages on your site.
This advice, whilst not surprising given Google’s focus on user experience, is fairly conclusive.
In August 2018, Google’s John Mueller carried out an AMA on Reddit, and provided a number of insightful answers to website related questions.
One such question concerned the volume of content that a homepage or internal page should have in order to be viewed positively by Google.
Part of the question included the following: “Our company was till the end of 2017 heavily focusing on “texts must be key to solve all SEO problems. We will optimize the technical part of your website, but content is king. Put atleast 500+ words on the home-page and about 700+ words on the subpages.” ”
In response to this, John replied with the following, to share that he does not believe algorithms monitor the volume of content on a page:
FWIW I’m almost certain that none of our algorithms count the words on a page — there’s certainly no “min 300 words” algorithm or anything like that. Don’t focus on word count, instead, focus on getting your message across to the user in their language. What would they be searching for, and how does your page/site provide a solution to that? Speak their language — not just “German” or “English”, but rather in the way they understand, and in the way they’d want to be spoken with. Sometimes that takes more text to bring across, sometimes less.
John’s advice here is very much to focus on the quality of the content, rather than the quantity of the words on the page.
As part of a Reddit AMA by Google’s John Mueller in August 2018, John was asked the following question:
“Do you think pages with 1 year old content are loosing (losing) ranking beacuse (because) Google wants to show fresh content every time?”
John replied with the following thoughts:
No. Fresher doesn’t mean better. Don’t fall into the trap of tweaking things constantly for the sake of having tweaked things, when you could be moving the whole thing to a much higher level instead.
One way of interpreting this response is that a site should not update content merely for the sake of making it ‘fresher’. If you are going to update your content, do so in order to improve it significantly.
As part of a Reddit AMA in August 2018, Google’s John Mueller was asked a question on which tool to use and trust, with regards to monitoring the page speed of a site.
Specifically: “Google has said that only the slowest sites will be impacted by the coming Speed Update and referred us to 2 tools to check our speed, PageSpeed Insights and Lighthouse. These two tools return different scores. If a website is “average” in PageSpeed Insights, but “slow” in Lighthouse, which score will Google be using (especially for mobile)?”
John responded to this question with the following advice:
I would not focus on a single score for determining the speed of your site, but rather try to take the different measurements into account and determine where you need to start working (we also don’t just blindly take one number). There’s no trivial, single number that covers all aspects of speed. I find these scores & numbers useful to figure out where the bottleneck is, but you need to interpret them yourself. There are lots of ways to make awesome & fast sites, being aware of an issue is the most important first step though.
As a starting point for checking your own site speed, the two tools mentioned within the initial question are PageSpeed Insights and Lighthouse.
In August 2018, Google’s John Mueller conducted an Ask Me Anything (AMA) on Reddit.
During the course of the two hour session John received a number of intriguing questions from users.
One in particular was whether having multiple pages on the same topic, known as keyword cannibalisation, could in fact hurt search rankings.
John responded with the following advice, explaining that he personally prefers fewer, stronger pages over a larger number of weaker ones:
We just rank the content as we get it. If you have a bunch of pages with roughly the same content, it’s going to compete with each other, kinda like a bunch of kids wanting to be first in line, and ultimately someone else slips in ahead of them :). Personally, I prefer fewer, stronger pages over lots of weaker ones – don’t water your site’s value down.
Whilst John’s response may not be the most technical, it does make clear that competing pages aren’t viewed positively by Google.
In a Reddit discussion from January 2019, it was asked if there is any benefit to a site’s rankings in using a particular type of SSL certificate.
After another user commented that the main concern was that they have a secure connection in place on the site, Google’s John Mueller confirmed that there is no difference from a ranking point of view:
Yeah, there’s no ranking boost for using any particular kind of certificate. Use whatever works for you. Use a free one if you want.
This backs up the point John made in a Reddit AMA in August 2018, in which he said:
we don’t differentiate between valid certificates for websearch.
This emphasises that the main concern from a ranking perspective for site owners should be that they have an SSL certificate in place, rather than worrying about which one to choose.