Google Considers Reducing Webpage Crawl Rate

Google may reduce how often it crawls webpages as it grows increasingly concerned about the long-term sustainability of crawling and indexing. John Mueller, Martin Splitt, and Gary Illyes of Google’s Search Relations team discuss the issue in the latest installment of the Search Off the Record podcast, where they cover what to expect from Google Search in 2022 and beyond.

Among the topics is crawling and indexing, which SEO specialists and website owners say they have seen less of over the past year. Making crawling more sustainable by reducing the computing resources it consumes will be a big focus for Google this year. What does that mean for your website’s search engine optimization?

The Persistence of Crawling and Indexing

Because Googlebot’s crawling and indexing happens online, you might not think it has an environmental impact. Illyes disagrees, arguing that computing in general is not sustainable:

“…what I mean is that computing isn’t sustainable in general. Bitcoin mining, for example, has a significant environmental impact that can be measured, particularly if the electricity is generated by coal plants or other less environmentally friendly plants.

“We’ve been carbon-free since, like, 2007 or 2009, but that doesn’t mean we can’t do even more. And crawling is one of those situations where we could take advantage of the low-hanging fruit early on.”

The low-hanging fruit here refers to crawling that isn’t necessary, such as recrawling pages that haven’t been updated in a long time.

What can Google do to make crawling more sustainable?

Illyes suggests that web crawling could be made more sustainable by cutting down on refresh crawls. Googlebot’s crawling falls into two types: discovering new content and refreshing content it has already indexed. It is the second type, the refresh crawl, that Google is considering scaling back.

“…one thing we do, and we may not need to do that much,” Illyes continues. “If we find a document, a URL, we go ahead and crawl it, and then we will return to that URL at some point in the future. That return visit is a refresh crawl, and every time we go back to that one URL it is another refresh crawl. How frequently do we need to revisit that URL?”
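
To make the idea concrete, here is a minimal, hypothetical sketch of how a crawler could schedule refresh crawls per URL: it refetches a page, backs off when nothing has changed since the last visit, and tightens the interval when the content keeps changing. This is purely illustrative and not how Googlebot actually decides; the URL, intervals, and hashing approach are all assumptions.

```python
# Illustrative sketch only -- not Google's actual algorithm. It shows the idea
# behind refresh scheduling: a URL that keeps changing gets revisited sooner,
# a URL that never changes gets revisited less and less often.
import hashlib
import time
from dataclasses import dataclass, field

import requests  # third-party: pip install requests


@dataclass
class UrlState:
    url: str
    interval: float = 3600.0               # start by revisiting after an hour
    last_hash: str = ""
    next_crawl: float = field(default_factory=time.time)


def refresh_crawl(state: UrlState) -> None:
    """Fetch the URL again and adapt the revisit interval to what we find."""
    response = requests.get(state.url, timeout=10)
    content_hash = hashlib.sha256(response.content).hexdigest()

    if content_hash == state.last_hash:
        # Unchanged since the last visit: back off, up to a weekly cap --
        # the rarely edited About-page case, which needs few refresh crawls.
        state.interval = min(state.interval * 2, 7 * 24 * 3600)
    else:
        # Changed: tighten the interval, down to ten minutes -- the
        # constantly updated homepage case.
        state.interval = max(state.interval / 2, 600)
        state.last_hash = content_hash

    state.next_crawl = time.time() + state.interval


if __name__ == "__main__":
    page = UrlState(url="https://example.com/")   # placeholder URL
    refresh_crawl(page)
    print(f"Next refresh crawl in {page.interval / 3600:.1f} hours")
```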

He notes that a website may warrant frequent refresh crawls for some sections but not for others.

The Wall Street Journal’s homepage, for example, warrants frequent refresh crawls because new content is constantly published there.

The WSJ’s About page, on the other hand, is unlikely to change often, so Google does not need to refresh-crawl it nearly as frequently.

“As a result, you won’t have to return there as often,” Illyes explains. “We frequently fail to estimate this correctly on refresh crawls, and there is room for improvement, because accessing the same URL over and over again can feel wasteful at times.

“For example, we may occasionally get 404 pages for no apparent or good reason. And all of these are things that we could work on to lower our carbon footprint further.”
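
Whatever Google decides, site owners can already make those repeat visits cheaper by supporting standard HTTP conditional requests, so an unchanged page can be answered with a 304 Not Modified instead of a full response, and by making sure removed pages return a clean 404 or 410. The sketch below is not from the podcast; it simply audits how a handful of placeholder URLs respond to a second, conditional request.

```python
# Audit sketch: how do these URLs respond when a crawler comes back?
# A server that honours ETag / Last-Modified can answer 304 Not Modified,
# which makes a refresh crawl almost free; removed pages should return
# 404 or 410 so they stop attracting revisits. The URLs are placeholders.
import requests

URLS = [
    "https://example.com/",
    "https://example.com/about/",
]

for url in URLS:
    first = requests.get(url, timeout=10)

    # Reuse the validators from the first response, if the server sent any.
    conditional_headers = {}
    if "ETag" in first.headers:
        conditional_headers["If-None-Match"] = first.headers["ETag"]
    if "Last-Modified" in first.headers:
        conditional_headers["If-Modified-Since"] = first.headers["Last-Modified"]

    second = requests.get(url, headers=conditional_headers, timeout=10)

    if second.status_code == 304:
        verdict = "supports conditional requests -- revisits are cheap"
    elif second.status_code in (404, 410):
        verdict = "gone -- remove it from sitemaps and internal links"
    else:
        verdict = f"sends a full {second.status_code} response on every revisit"

    print(f"{url}: {verdict}")
```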

Nothing is confirmed yet, but here is what a reduced number of refresh crawls would mean for your website.

What Does a Lower Crawl Rate Mean for Your Business?

It’s a common misconception that a high crawl rate means your site is performing well in search, even if you aren’t updating your content regularly. According to Illyes, this is a myth: content that is crawled more frequently does not necessarily rank higher.

“I assume there’s also a misperception that people have in that they think that if a page gets crawled more, it’ll get ranked more,” Mueller says to Illyes. “Is it right to say that this is a common misunderstanding, or is it true?”

“It’s a misunderstanding,” Illyes says. “OK, so there’s no use in forcing anything to be re-crawled if it doesn’t change,” Mueller responds. “It isn’t going to improve.”

Google hasn’t stated that refresh crawls would be lowered, but it’s a possibility they’re looking into.

If Google follows through on this idea, it should not hurt your website. Being crawled more often does not mean you will rank higher. The goal is simply to work out which pages need refresh crawls and which do not, so the pages you update most often should keep getting refreshed in search results.
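
If you want to see how often Googlebot currently revisits your pages, your server access logs already hold that information. The sketch below counts Googlebot hits per path in a standard combined-format Apache/Nginx log; the log path is an assumption, and a rigorous check would also verify the crawler via reverse DNS rather than trusting the user-agent string.

```python
# Count how many times Googlebot requested each path in a combined-format
# access log. The log location is an assumption; for a rigorous check,
# verify the crawler with a reverse-DNS lookup instead of trusting the
# user-agent string, which anyone can spoof.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"   # adjust to your server

# Combined log format:
# ip - - [time] "METHOD /path HTTP/x.y" status bytes "referrer" "user-agent"
LINE_RE = re.compile(
    r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" \d{3} \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

hits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if match and "Googlebot" in match.group("agent"):
            hits[match.group("path")] += 1

# Show the 20 most-crawled paths -- these are your refresh-crawl hot spots.
for path, count in hits.most_common(20):
    print(f"{count:6d}  {path}")
```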


