Our monthly recap features all of the Google updates you should care about, plus SEO news, for July 2022.
In June 2022, we saw Google SERP feature updates, algorithm changes, and John Mueller news. This month, we scoured the web to bring you the latest juicy SEO news and tidbits in this edition of the SEO news recap.
As usual, July saw tons of updates, including:
- Google changed how rich results handle prohibited, regulated, or harmful products
- Twitter removed, then added back the nofollow attribute to its links
- Google reminded SEO pros they shouldn’t mark up third-party reviews with structured data
- Google said it’s fine to delete your disavow file if you have no history of bad links or manual actions
- Google dealt with a massive influx of Nazi stuff in their search results
- Google explained that search query pages are considered low-quality, low-effort pages
- New support for Google Analytics 4 in Google Search Console Insights
- Google created a new landing page with Google algorithm update history
- and much, much more.
Enjoy.
Google and Harmful Products in Rich Results
In case you didn’t know, rich results are those results that appear on Google for certain types of queries.
On July 1st, Google revised its rich results help documentation with brand-new guidelines.
These guidelines bar webmasters from applying structured data markup to products that are widely prohibited or regulated, or that can cause serious harm to self or others.
Such products simply won’t be shown in rich results, according to Google.
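For context, product rich results come from structured data markup like the following (an illustrative sketch with hypothetical values, not Google's exact example). Under the new guidelines, markup like this must not be applied to prohibited, regulated, or harmful products:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Ceramic Mug",
  "description": "A hypothetical product used to illustrate rich result markup.",
  "offers": {
    "@type": "Offer",
    "price": "14.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```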
Twitter Added Back Nofollow Attributes
Earlier in July, Twitter removed nofollow attributes from their links. This has long been a significant attribute of Twitter (and other social networks): they all nofollow their links.
When it was removed, it was groundbreaking.
Could SEO pros finally game Google with thousands of Twitter posts?
Not so fast: shortly thereafter, Twitter added the nofollow attributes back.
Sorry, SEO pros, but you can’t use Twitter for link value after all.
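For reference, a nofollowed link looks like this in HTML (simplified; Twitter actually routes links through its t.co redirector):

```html
<!-- rel="nofollow" tells search engines not to pass ranking credit through this link -->
<a href="https://example.com/" rel="nofollow">example.com</a>
```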
Structured Data and Third-Party Reviews
Way, way, way back in the pre-pandemic era of 2016, Google updated its rich results guidelines to disallow structured data markup on third-party reviews.
Early in July, John Mueller posted a reminder that you still can’t mark up third-party reviews with structured data:
You can’t mark up 3rd party reviews with structured data, but feel free to show them on your site without the markup.
For not indexing a part of a page, there’s no simple solution. Sometimes data-nosnippet is enough though. — 🫧 johnmu of switzerland (personal) 🫧 (@JohnMu) June 29, 2022
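The data-nosnippet attribute John mentions keeps a section of a page out of Google’s search snippets (it does not prevent the page from being indexed). A minimal sketch:

```html
<p>This text may appear in a Google search snippet.</p>
<div data-nosnippet>
  <!-- Content you don't want surfaced in snippets, e.g. a third-party review -->
  <p>"Great service!" — a reviewer on another site</p>
</div>
```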
If you need a refresher on what Schema you should be using for local businesses, here is that structured data in Google’s help documentation.
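As a quick illustrative sketch (all names and values hypothetical), LocalBusiness markup with a first-party review collected on your own site, which is allowed, looks roughly like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Shop",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Anytown",
    "addressRegion": "CA",
    "postalCode": "90210"
  },
  "review": {
    "@type": "Review",
    "reviewRating": { "@type": "Rating", "ratingValue": "5" },
    "author": { "@type": "Person", "name": "Jane Doe" },
    "reviewBody": "A review collected directly on your own site, not copied from a third party."
  }
}
</script>
```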
Disavow Files and Bad Link History
John Mueller explained in a hangout that if you have nothing bad in your link history and no manual actions, you don’t have to worry about your disavow file: you can delete it (though we disagree).
The question came from an SEO pro who had disavowed over 11,000 links over the past 15 years. They said they never bought a link or did anything unallowed, such as sharing, and that the links they disavowed came from hacked sites or from nonsense, auto-generated content. Since Google claims its improved algorithms now ignore these links, they asked, should they delete their disavow file?
John answered that from Google’s point of view, they work really hard to ignore these kinds of links. They do this because Google knows that the disavow tool is a niche tool, and although SEO pros know about it, the average person who is a webmaster with a website has no idea that it exists.
All the links that the SEO professional mentioned here are the links that any website will get over the years. And Google’s systems understand that these are not the typical things you are doing to try and game the algorithms. From this perspective, if the webmaster is really sure that there’s nothing around a manual action that had to be resolved from these links, John recommends just deleting the disavow file and moving on with life.
The one thing John also personally recommends is making sure you download and make a copy of the disavow file so you have a record of what you deleted.
Author comment: This is not entirely accurate. Google requires a disavow file because there are algorithmic devaluations and other algorithmic events that occur because you have a bad link profile. Also, Google can devalue folders, pages, and even the entire site. So if you have a disavow file, the best idea would be to consult with your SEO professional to make sure this is a wise course of action. Not everything Google says is copacetic.
Question
Over the last 15 years, I’ve disavowed over 11,000 links in total. I never bought a link or did anything unallowed, like sharing. The links that I disavowed may have been from hacked sites or from nonsense, auto-generated content. Since Google now claims that they have better tools to not factor these types of hacked or spammy links into their algorithms, should I just delete my disavow file? Is there any risk or upside or downside to just deleting it?
John (Answer)
So this is a good question. It comes up every now and then. And disavowing links is always kind of one of those tricky topics, because it feels like Google is probably not telling you the full information. But, from our point of view, it’s actually–we do work really hard to avoid taking these kind of links into account. And we do that because we know that the disavow links tool is somewhat a niche tool, and SEOs know about it, but the average person who runs a website has no idea about it.
And all of those links that you mentioned there are the kind of links that any website gets over the years. And our systems understand that these are not things that you’re trying to do to game our algorithms. So, from that point of view, if you’re really sure that there’s nothing around a manual action that you had to resolve with regards to these links, I would just delete the disavow file and move on with life and leave all of that aside.
One thing I would personally do is just download it and make a copy so that you have a record of what you deleted. But, otherwise, if you’re sure these are just the normal, crusty things from the internet, I would just delete it and move on. There’s much more to spend your time on when it comes to websites than just disavowing these random things that happen to any website on the web.
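If you do keep a copy (or keep maintaining one), a disavow file is just a plain-text list with one entry per line: a `domain:` prefix disavows an entire domain, a bare URL disavows a single page, and lines starting with `#` are comments. A hypothetical example:

```text
# Disavowed July 2022: nonsense auto-generated content
domain:spammy-example-blog.com
# Single hacked page linking to us
https://hacked-example-site.com/old-page.html
```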
Google Fixed a Search Issue
Barry Schwartz reported that in late June, social media became awash with talk about how a search for certain types of desk ornaments would bring up Nazi items.
Things like German Nazi memorabilia, swastikas, and other terrible items that should never have been given an audience in search.
Danny Sullivan at Google escalated the issue to the Google Search team, calling it a high-priority fix.
When Barry reported on it almost a week later, the results were still showing up.
Many news publications reported on this news that week, including The Mirror, Futurism, and Newsweek.
According to our latest search, Google appears to have – finally – corrected this issue.
Well done, Google.
Search Query Pages Are Low on the Effort Spectrum
Over on Twitter, SEO professional Bill Hartzer posed the following question to the SEO community at large:
Should you purposely allow indexing of internal search results pages?
John explained that he would block them overall: at best they are the equivalent of low-effort category pages, and indexed search pages are a frequent spam vector.
Here is the overall Twitter thread:
Should you purposely allow indexing of internal search results pages?
— Bill Hartzer (@bhartzer) July 4, 2022
Technically – no. However, if you have/allow some to be indexed and they work well with respect to ranking and clicks/conversions – figure out a way to make them “legitimate” pages in their own right
— Peter Mindenhall (@PeterMindenhall) July 5, 2022
I can see a selected list of search-queries being essentially equivalent to low-effort category pages, but overall I’d block them. It’s not just indexed pages: every week I get mails from sites big & small who got spammed with queries & struggle to fix it.
— 🫧 johnmu of switzerland (personal) 🫧 (@JohnMu) July 5, 2022
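In practice, sites typically keep internal search results out of Google with robots.txt. A minimal sketch, assuming your search pages live under a `/search` path or use an `s=` query parameter (both assumptions for this example):

```text
# robots.txt — block crawling of internal search result pages
User-agent: *
Disallow: /search
Disallow: /*?s=
```

Alternatively, a `<meta name="robots" content="noindex">` tag on the search results template also works, and unlike robots.txt it can remove pages that are already indexed, since Google must crawl a page to see the tag.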
Google Analytics 4 and Google Search Console Insights
Early July also saw the integration of Google Analytics 4 with Google Search Console Insights. If there were ever any doubts about Google’s allegiance to the new Google Analytics, this should remove them.
The major change, announced by Google on Twitter, is that Google Analytics 4 is now integrated with Google Search Console insights.
📢 Have a GA4 property but couldn’t use it with Search Console Insights? Now you can! We are rolling out GA4 support, check it out! 🧑💻 https://t.co/XTwC0VhfIW
— Google Search Central (@googlesearchc) July 6, 2022
There was a limited rollout of this, as reported by Barry Schwartz over here.
The Follow/Nofollow Link Ratio Does Not Cause SEO Problems
Over on the social platform Reddit, John Mueller explained that there is no ideal ratio of follow to nofollow links.
The SEO professional explained that Google was rapidly deindexing their site’s pages.
They had checked everything, and the only thing that stood out to them was the 75:25 ratio of dofollow to nofollow links.
The dofollow links appeared in March (the majority from 2-5 domains), and several days later the rankings and traffic tanked.
ttubehtabaaj
Thank you for the reply. Maybe I should revisit the content and technical stuff again? I’m in a loop and it’s the first thing on my mind before I hit the bed and after I wake up. It’s like an itch you can’t scratch. The daily traffic went down by 90% and the pages are getting deindexed. So, that’s what got me worried. It’s like where did it went wrong. What more information should I share in order to better assess this situation? Would really appreciate some guidance here, Mr. Lebowski. *inserts* So, you’re telling me there’s a chance
johnmu
Any problem your site has would not be due to the ratio of follow to nofollow links. That’s just not a thing. If I had to guess just based on this information (I might be totally off — there’s not a lot to go on here), my guess would be that it’s a low’ish quality site where you’ve been working hard to place those links because they’re not easy to get naturally. In that case, the problem is not the links.
ttubehtabaaj
Thank you. Appreciate the response, Mr. Lebowski.
The site’s pages have been getting deindexed by the hour.
I checked everything. The only thing that stands out is the 75:25 ratio of dofollow and nofollow links. The dofollow links appeared in March (most of them are from 2-5 domains) and a few days later the rankings & traffic tanked.
Maybe I should just flip a coin and see what happens.
johnmu
There is no ideal ratio.
New Algorithm Update Reference Page Added by Google
In a stunning surprise, Google released a brand-new landing page on Google Search Central.
It’s in its early stages, of course, but one can’t help but wonder where they plan to take this algorithm update page.
It’s also relatively light, containing only the past couple of years’ worth of Google algorithm updates.
Other resources, such as Moz’s Ultimate Google Algorithm Update guide, are far more extensive, going all the way back to the year 2000.
Search Engine Land also has a pretty extensive resource on Google Algorithm updates as well.
But, we are certainly looking forward to seeing where Google runs with this.
People Also Ask Queries Dropped Significantly on Google
Barry was the first to report on a substantial drop in People Also Ask appearances, which fell quite significantly toward the middle of July.
He asked Semrush whether this was a mistake in their reporting.
Semrush confirmed that they had not made any mistakes in their reporting.
Barry also said that RankRanger informed him their data was also good, with no issues on their end.
Sadly, it seems that People Also Ask opportunities in Google desktop search are shrinking considerably.
@semrush @MordyOberstein @RangerShay is this right? https://t.co/hQhvqhv3WJ pic.twitter.com/w3qRkmrSbD
— Barry Schwartz (@rustybrick) July 11, 2022
Hi Barry! We’ve checked with the team – this doesn’t seem to be a mistake on our end. – Sasha
— Semrush (@semrush) July 11, 2022
New Metrics Now Supported in Google Analytics 4
Mid-July saw Google Analytics 4 announcing some brand-new metrics and dimensions. These apply to all GA4 accounts.
They explained that UTM term, UTM content, conversion, and bounce rate are all new dimensions that SEO professionals can use to report their data.
Starting this week, we’re introducing additional metrics and dimensions to all Google Analytics 4 accounts — including UTM term, UTM content, conversion and bounce rate. Learn more → https://t.co/fsMHX1br7p pic.twitter.com/uhIXPUvgiT
— Google Analytics (@googleanalytics) July 12, 2022
Google explained the following about Bounce Rate in GA4:
“In Google Analytics 4, Bounce rate is the percentage of sessions that were not engaged sessions. In other words, Bounce rate is the inverse of Engagement rate, which is the number of engaged sessions divided by the total number of sessions in a specified time period.
Bounce rate is available in Explorations and Reporting Customization.
Bounce rate is calculated in Google Analytics 4 in a different way from how it’s calculated in Universal Analytics. To learn more about how the calculations are different between Universal Analytics and Google Analytics 4, see [UA→GA4] Comparing metrics: Google Analytics 4 vs Universal Analytics.”
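To make the relationship concrete, here is a minimal Python sketch of the GA4 definition quoted above. The session counts are hypothetical; GA4 computes these metrics for you:

```python
def engagement_rate(engaged_sessions: int, total_sessions: int) -> float:
    """Engaged sessions divided by total sessions in a given period."""
    return engaged_sessions / total_sessions

def bounce_rate(engaged_sessions: int, total_sessions: int) -> float:
    """GA4 bounce rate: the inverse of engagement rate."""
    return 1 - engagement_rate(engaged_sessions, total_sessions)

# Hypothetical month: 600 engaged sessions out of 1,000 total
print(engagement_rate(600, 1000))  # 0.6
print(bounce_rate(600, 1000))      # 0.4
```

Note that the two metrics always sum to 1: a session either counts as engaged or as a bounce, never both.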
They also explained the following about UTM Term and UTM ad content:
UTM term and UTM ad content
We’ve added new dimensions that surface the utm_content and utm_term parameter values in Explorations, Reporting, and the Audience Builder. Both parameters have a user-scoped and session-scoped dimension.
The following new dimensions enable you to see the value assigned to the utm_content parameter across user and session scopes:
- First user manual ad content
- Session manual ad content
Additionally, the following dimensions enable you to see the value assigned to the utm_term parameter across user and session scopes:
- First user manual term
- Session manual term
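For example, a tagged URL carrying both parameters might look like this (hypothetical campaign values, wrapped here for readability; in practice it is one line):

```text
https://www.example.com/landing-page
  ?utm_source=newsletter
  &utm_medium=email
  &utm_campaign=summer-sale
  &utm_term=running-shoes
  &utm_content=header-cta
```

In this sketch, `running-shoes` would surface in the new “First user manual term” / “Session manual term” dimensions, and `header-cta` in the “First user manual ad content” / “Session manual ad content” dimensions.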
In addition, Google explained that conversion rate is a new metric that “reports on the conversion rate for any conversion event”:
- User conversion rate is the percentage of users who triggered any conversion event.
- Session conversion rate is the percentage of sessions in which any conversion event was triggered.
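A rough Python sketch of the distinction between the two rates, using a hypothetical event log (GA4 computes these metrics itself):

```python
# Hypothetical event log: (user_id, session_id, event_name)
events = [
    ("u1", "s1", "page_view"),
    ("u1", "s1", "purchase"),   # a conversion event
    ("u2", "s2", "page_view"),
    ("u2", "s3", "purchase"),   # same user converting in a later session
    ("u3", "s4", "page_view"),  # a user who never converts
]
CONVERSION_EVENTS = {"purchase"}

users = {u for u, _, _ in events}
sessions = {s for _, s, _ in events}
converting_users = {u for u, _, e in events if e in CONVERSION_EVENTS}
converting_sessions = {s for _, s, e in events if e in CONVERSION_EVENTS}

user_conversion_rate = len(converting_users) / len(users)
session_conversion_rate = len(converting_sessions) / len(sessions)

print(user_conversion_rate)     # 2 of 3 users converted
print(session_conversion_rate)  # 2 of 4 sessions had a conversion
```

The same behavior can produce different numbers at the two scopes: here user conversion rate (about 67%) is higher than session conversion rate (50%), because one user converted in only one of their two sessions.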
New Scams Now In Google’s Crosshairs
A rise in extortion-style one-star reviews on Google has the media reporting on these blackmail threats, with The New York Times highlighting them.
Kara, a Google Community Manager, responded in a thread on the Google Business Profile help forum, saying Google is taking action on these review scams.
She said:
We’ve recently become aware of a scam targeting businesses on Google with the threat of 1-star reviews unless they send money via gift cards. Our policies clearly state reviews must be based on real experiences, and our teams are working around the clock to thwart these attacks, remove fraudulent reviews, and put protections on business profiles that may have been affected.
If your business is being targeted by these scammers, please do not pay them. Instead, please flag the reviews here or reach out to Google support via our Help Center, so that our team can review and remove policy-violating content. If you haven’t yet claimed your business profile, you can do so here.
You can also learn more about how our review moderation systems work to ensure that Google reviews remain helpful and authentic.
The New York Times reported that:
“In a new scam targeting restaurants, criminals are leaving negative ratings on restaurants’ Google pages as a bargaining chip to extort digital gift cards.
Restaurateurs from San Francisco to New York, many from establishments with Michelin stars, said in recent days that they’ve received a blitz of one-star ratings on Google, with no description or photos, from people they said have never eaten at their restaurants. Soon after the reviews, many owners said they received emails from a person claiming responsibility and requesting a $75 Google Play gift card to remove the ratings. If payment is not received, the message says, more bad ratings will follow.
The text threat was the same in each email: “We sincerely apologize for our actions, and would not want to harm your business but we have no other choice.” The email went on to say that the sender lives in India and that the resale value of the gift card could provide several weeks of income for the sender’s family. The emails, from several Gmail accounts, requested payment to a Proton mail account.”
Brand-New Algorithm Update Hit Mid-July
A major indexing issue that negatively impacted new content in Google Search arrived on Friday, July 15th.
The next day, on Saturday July 16th, we saw a brand-new algorithm update hit as well.
Barry Schwartz speculated that this may be related to the indexing issues that were previously occurring, but it’s difficult to know for sure.
He said that:
“If you have a ton of new content not being indexed all day, can that cause large volatility once the indexing bug is resolved the following day? Imagine there is this backlog of new content waiting to be indexed and then all of a sudden, Google opens the blockage and lets all that new content be indexed the following day. Does that cause volatility the following day? Sounds feasible? So keep that in mind when I share the rest below.”
Google Quality Raters Guidelines Updates
The major news story breaking toward the end of July: Google updated its Quality Raters Guidelines documentation.
This wasn’t a small update. It was a significant one, with many changes to the guidelines.
For those who are not aware: the Google Quality Raters Guidelines outline what Google considers the minimum bar for distinguishing quality content from spammy content.
To do this, they define different categories of content: YMYL, and non-YMYL content.
YMYL stands for “Your Money or Your Life.” This kind of content is held to a high bar: fact-checked, accurate, and high quality overall.
Their new documentation now also defines YMYL as topics that “present a high risk of harm.”
Search Engine Land also reported that:
Topics that present a “high risk of harm,” can significantly impact the “health, financial stability, or safety of people, or the welfare or well-being of society.”
There has been a lot of speculation in recent years (but very little evidence) that Google uses the Quality Raters Guidelines directly in its algorithms.
Actually, this couldn’t be further from the truth. Google uses human quality raters (to the tune of 14,000 hires) who evaluate search results on a human, user-experience basis.
They then report back to Google on whether these particular websites meet the bar for high quality.
In addition to YMYL, Google uses another acronym extensively within the documentation: E-A-T, or expertise, authoritativeness, and trustworthiness, to evaluate a given piece of content.
However, E-A-T is just that: a framework to help human quality raters understand these factors.
It is not something you should assume Google’s ranking algorithms evaluate directly.
Google has, in fact, stated this on numerous occasions:
Note (March 2020): Since we originally wrote this post, we have been occasionally asked if E-A-T is a ranking factor. Our automated systems use a mix of many different signals to rank great content. We’ve tried to make this mix align what human beings would agree is great content as they would assess it according to E-A-T criteria. Given this, assessing your own content in terms of E-A-T criteria may help align it conceptually with the different signals that our automated systems use to rank content.
The other major update to the guidelines is a refinement of the YMYL categories. Raters are asked to consider YMYL in terms of the following categories of topics that can cause serious harm to someone’s well-being:
- Health or safety
- Financial security
- Society
- “Other”
Here, you can read the full Google Quality Raters Guidelines.
That covers our roundup of SEO news for July 2022. Check back with us next month for our August 2022 SEO news recap!