Should We Use TF*IDF To Make Content Better?

Here’s what Google’s John Mueller said in response to this question on a Google Webmaster Hangout,

My general recommendation here is not to focus on these kinds of artificial metrics… because it’s something where on the one hand you can’t reproduce this metric directly because it’s based on the overall index of all of the content on the web.

So it’s not that you can kind of like say well, this is what I need to do, because you don’t really have that metric overall.

Perhaps most interesting is that he also added,

this is a fairly old metric and things have evolved quite a bit over the years

Hinting that Google has little to no use for this specific metric anymore.

That’s not to say it can’t be useful. It’s good to get a rundown of your content and see terms and phrases others are focused on that you aren’t, but it probably isn’t the be-all and end-all of what you need to do to rank.
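If you’re curious what a TF*IDF-style comparison actually surfaces, you can compute it yourself over a handful of pages. The sketch below is a simplified illustration of the formula, not anything Google uses; the page text, tokenisation and scoring are all placeholder assumptions.

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split on non-letter characters (deliberately simple)."""
    return [t for t in re.split(r"[^a-z]+", text.lower()) if t]

def tf_idf(documents):
    """Return a TF*IDF score per term for each document.

    TF  = term count / total terms in the document
    IDF = log(total documents / documents containing the term)
    """
    tokenized = [tokenize(doc) for doc in documents]
    doc_freq = Counter()
    for tokens in tokenized:
        doc_freq.update(set(tokens))  # count each document once per term

    n_docs = len(documents)
    scores = []
    for tokens in tokenized:
        counts = Counter(tokens)
        total = len(tokens)
        scores.append({
            term: (count / total) * math.log(n_docs / doc_freq[term])
            for term, count in counts.items()
        })
    return scores

# Placeholder text standing in for your page and two competing pages.
pages = [
    "keyword research guide for beginners",
    "keyword research tools and keyword difficulty explained",
    "how to do keyword research and competitor analysis",
]
for i, page_scores in enumerate(tf_idf(pages)):
    top = sorted(page_scores.items(), key=lambda kv: kv[1], reverse=True)[:3]
    print(f"Page {i}: {top}")
```

Terms that appear on every page score zero, which is exactly the point of the metric: it highlights what’s distinctive about each page, not what everyone mentions.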

Will I Get Penalised for 301 Redirecting Expired Domains?

It seems to be increasingly common in 2019 that people are using the tactic of redirecting expired domains in order to boost the rankings of their “money” (main) website.

Google are traditionally anti-link-building, so it’s important to take everything they say with a grain of salt, but here’s John Mueller’s response to this specific question.

Note that this was asked in the context of building “spammy” links to the domain that will then 301 redirect to a main site.

The 301 basically makes the main site canonical, meaning the links go directly there. You might as well skip the detour [as] it’s just as obvious to the algorithms and spam team.

I would have to imagine that Google are not too strict with this.

After all, anyone can redirect any domain they own to your website and then point bad links at it. Unless you leave some obvious footprint, it’s very hard to know exactly who set up the redirect in the first place.

That said, any form of link building can result in a penalty so it’s something to be careful about and wary of ahead of time.
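If you ever want to confirm what a domain you control is actually doing, checking the redirect yourself takes a few lines. This is a minimal sketch using Python’s requests library; the domain names are placeholders for illustration only.

```python
import requests

def check_redirect(url):
    """Fetch a URL without following redirects and report where it points."""
    response = requests.get(url, allow_redirects=False, timeout=10)
    if response.status_code in (301, 302, 307, 308):
        # The Location header shows the redirect target.
        print(f"{url} -> {response.status_code} -> {response.headers.get('Location')}")
    else:
        print(f"{url} returned {response.status_code} (no redirect)")

# Placeholder domain for illustration only.
check_redirect("http://expired-domain-example.com/")
```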

Can Links Still Negatively Affect Your Site If They’re Not Shown in Google Search Console?

Yes, they can.

Google doesn’t show all links they know about in Search Console but rather a “relevant sample”.

Search Console shows a relevant sample of the known links.

That said, it’s likely that the links aren’t the most authoritative ones if Google has decided not to show them at all.

If our systems don’t even show them there, then they’re pretty irrelevant overall.

John added that even if Google knows about a lot of links from an individual site, they’re not going to show all of them.

If you have site-wide paid links from a site, we’re not going to list every URL on that site.

My Sitemap and jQuery Implementations of Hreflang Are Different. Does It Matter?

If you are trying to define the language of a page via hreflang, it’s generally better to use one way of doing so rather than multiple.

A Redditor asked if there were any issues when an XML sitemap includes hreflang markup and the site is also including hreflang markup via jQuery.

Google’s John Mueller replied with,

If you have it in the sitemap, just use that. Adding a second set via jQuery just makes it much harder to diagnose, find, & fix errors.

As the specific question was about multiple implementations that were showing different data, it makes even more sense to pick one over the other. If you can’t be sure which implementation is going to be picked up, I would use the non-JavaScript version.
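For reference, the sitemap version of hreflang lists every language alternate of a URL (including itself) as an xhtml:link entry inside its url block. Below is a rough sketch of how you might generate those entries in Python; the URLs and language codes are made up for illustration.

```python
# Alternate language versions of the same page (placeholder URLs).
alternates = {
    "en": "https://example.com/en/page/",
    "de": "https://example.com/de/page/",
    "fr": "https://example.com/fr/page/",
}

def url_entries(alternates):
    """Build one <url> block per language version; each block lists every
    alternate, including itself, as the sitemap hreflang format expects."""
    blocks = []
    for loc in alternates.values():
        links = "\n".join(
            f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{href}"/>'
            for lang, href in alternates.items()
        )
        blocks.append(f"  <url>\n    <loc>{loc}</loc>\n{links}\n  </url>")
    return "\n".join(blocks)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
    '        xmlns:xhtml="http://www.w3.org/1999/xhtml">\n'
    + url_entries(alternates) +
    "\n</urlset>"
)
print(sitemap)
```

Keeping hreflang in one place like this makes it far easier to spot missing return links than digging through markup injected at runtime.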

Why Does The Number of Results for a Site:Domain Search Fluctuate Wildly?

Google’s John Mueller today clarified that the result counts shown for a site: search (e.g. site:detailed.com) are “extremely rough approximations”.

A question appeared on Reddit asking why the number of results shown fluctuates so dramatically on a regular basis.

Here’s the exact quote,

Site:-query result counts are optimised for speed, not accuracy.

I wouldn’t use them for any diagnostics purposes. Especially when you get into the higher counts, those numbers are extremely rough approximations. Don’t get tricked into trying to use them for optimizations or diagnostics purposes.