Welcome to this week’s Pulse: updates that affect how Google ranks content, how its crawlers handle page size, and where AI referral traffic is heading. Here’s what matters for you and your work.
Google Rolls Out The March 2026 Core Update
Google began rolling out the March core update this week. This is the first broad core update of the year.
Key facts: The rollout may take up to two weeks. Google described it as a regular update designed to surface more relevant, satisfying content from all types of sites. It arrives two days after the March spam update, which completed in under 20 hours.
Why This Matters
The December core update was the most recent broad core update, finishing on December 29. That’s a three-month gap. The February 2026 update only affected Discover, so Search rankings haven’t been recalibrated since late December.
Ranking changes could appear throughout early April. Google recommends waiting at least a full week after the rollout finishes before analyzing Search Console performance. Compare against a baseline period before March 27.
What SEO Professionals Are Saying
John Mueller, a member of Google’s Search Relations team, wrote on Bluesky when asked whether the two updates overlap:
One is about spam, one is not about spam. If with some experience, you’re not sure whether your site is spam or not, it’s unfortunately probably spam.
Mueller later explained that core updates don’t follow a single deployment mechanism. Different teams and systems contribute changes, and those components can require step-by-step rollouts rather than a single release. That’s why rollouts take weeks and why ranking volatility often appears in waves rather than all at once.
Roger Montti, writing for Search Engine Journal, noted the proximity to the spam update may not be a coincidence. Spam fighting is logically part of the broader quality reassessment in a core update.
Read our full coverage: Google Begins Rolling Out March 2026 Core Update
Read Roger Montti’s coverage: Google Answers Why Core Updates Can Roll Out In Stages
Illyes Explains Googlebot’s Crawling Architecture And Byte Limits
Google’s Gary Illyes, an analyst on Google’s Search team, published a blog post explaining how Googlebot works within Google’s broader crawling systems. The post adds new technical detail about the 2 MB crawl limit Google published earlier this year.
Key facts: Illyes described Googlebot as one client of a centralized crawling platform. Google Shopping, AdSense, and other products all route requests through the same system under different crawler names. HTTP request headers count toward the 2 MB limit. External resources like CSS and JavaScript get their own separate byte counters.
Why This Matters
When Googlebot hits 2 MB, it doesn’t reject the page. It stops fetching and passes the truncated content to indexing as if it were the complete file. Anything past 2 MB is never indexed. That matters for pages with large inline base64 images, heavy inline CSS or JavaScript, or oversized navigation menus.
The centralized platform detail also explains why different Google crawlers behave differently in server logs. Each client sets its own configuration, including byte limits. Googlebot’s 2 MB is a Search-specific override of the platform’s 15 MB default.
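To make the budget concrete, here is a minimal sketch (not a Google tool; function names are illustrative) of how you might estimate whether a page's HTML response fits within the 2 MB cap, assuming, as the post describes, that response headers count toward the limit while external resources like CSS and JavaScript are tracked on separate counters:

```python
# Rough estimate of an HTML response's size against Googlebot's 2 MB
# Search-specific limit. Per Illyes's post, HTTP headers count toward
# the limit; subresources get their own counters, so only headers plus
# the HTML body are totaled here. The status line is ignored for
# simplicity, so this slightly undercounts real on-the-wire size.

GOOGLEBOT_LIMIT = 2 * 1024 * 1024  # 2 MB override of the 15 MB platform default

def response_size(headers: dict, body: bytes) -> int:
    """Approximate bytes for the header block plus the HTML body."""
    header_bytes = sum(len(f"{k}: {v}\r\n".encode()) for k, v in headers.items())
    return header_bytes + len(body)

def fits_googlebot_limit(headers: dict, body: bytes, limit: int = GOOGLEBOT_LIMIT) -> bool:
    """True if the whole response would be fetched before the cutoff."""
    return response_size(headers, body) <= limit
```

In practice you would fetch your page (for example with `urllib.request`), pass the response headers and raw body to `fits_googlebot_limit`, and flag any URL that comes back `False`.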
Google has now covered these limits in documentation updates, a podcast episode, and this blog post within two months. Illyes noted the 2 MB limit is not permanent and may change as the web evolves.
What SEO Professionals Are Saying
Cyrus Shepard, founder of Zyppy SEO, wrote on LinkedIn:
That said, as SEOs we often deal with extreme situations. If you notice certain content not getting indexed on VERY LARGE PAGES, you probably want to check your size.
Read our full coverage: Google Explains Googlebot Byte Limits And Crawling Architecture
Google’s Illyes And Splitt: Pages Are Getting Larger, And It Still Matters
Gary Illyes and Martin Splitt, Developer Advocate at Google, discussed page weight growth and crawling on a recent Search Off the Record podcast episode.
Key facts: Web pages have grown nearly 3x over the past decade. The 15 MB default applies across Google’s broader crawling systems, with individual clients like Googlebot for Search overriding it downward to 2 MB. Illyes raised whether structured data that Google asks websites to add is contributing to page bloat.
Why This Matters
The 2025 Web Almanac reports a median mobile homepage size of 2,362 KB, which already exceeds 2 MB. That figure counts every resource on the page, though, while Googlebot's limit applies per fetched file, with CSS, JavaScript, and other external resources tracked on separate counters, so a median-weight page isn't automatically at risk of truncation. Still, Illyes's question about structured data contributing to bloat is worth monitoring. Google encourages sites to add schema markup for rich results, and that markup increases the weight of each page.
Splitt said he plans to address specific techniques for reducing page size in a future episode. Pages with heavy inline content should verify their critical elements load within the first 2 MB of the response.
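One way to run that verification is to simulate the cutoff: truncate the HTML at 2 MB, as Googlebot would, and check whether your critical markup survives the cut. The sketch below is illustrative (the marker strings are placeholders for whatever matters in your own templates, such as structured data or key body content):

```python
# Simulate Googlebot-style truncation: fetching stops at the limit and
# whatever was received is passed to indexing as if complete, so any
# markup past the cutoff is never indexed.

LIMIT = 2 * 1024 * 1024  # Googlebot's 2 MB Search limit

def truncate_like_googlebot(html: bytes, limit: int = LIMIT) -> bytes:
    """Keep only the bytes Googlebot would have fetched."""
    return html[:limit]

def markers_within_limit(html: bytes, markers: list, limit: int = LIMIT) -> dict:
    """Report which critical byte strings appear before the cutoff."""
    fetched = truncate_like_googlebot(html, limit)
    return {m.decode(): (m in fetched) for m in markers}
```

For example, feeding in a page whose JSON-LD block sits after 2 MB of inline content would report that marker as missing, which is exactly the failure mode to catch before it shows up as unindexed content.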
Read our full coverage: Google: Pages Are Getting Larger & It Still Matters
Gemini Referral Traffic More Than Doubles, Overtakes Perplexity
Google Gemini more than doubled its referral traffic to websites between November 2025 and January 2026. The data comes from SE Ranking’s analysis of more than 101,000 sites with Google Analytics installed.
Key facts: SE Ranking measured a 115% increase over the two-month period, with the jump starting around the time Google rolled out Gemini 3. In January, Gemini sent 29% more referral traffic than Perplexity globally and 41% more in the U.S. ChatGPT still generates about 80% of all AI referral traffic. For transparency, SE Ranking sells AI visibility tracking tools.
Why This Matters
In August 2025, Perplexity was sending about 2.9x more referral traffic than Gemini. Gemini’s December-January surge reversed that by January 2026. ChatGPT’s lead over Gemini also narrowed, from roughly 22x in October to about 8x in January.
All AI platforms combined still account for about 0.24% of global internet traffic, up from 0.15% in 2025. That’s measurable growth, but it’s still a small share compared to organic search. Two months of Gemini growth correlates with a known product launch, but it’s too early to call it a sustained pattern.
Gemini is now worth watching alongside ChatGPT and Perplexity in your referral reports.
Read our full coverage: Google Gemini Sends More Traffic To Sites Than Perplexity: Report
Theme Of The Week: Google Is Explaining Its Own Systems
Three of this week’s four stories are Google telling you how its systems work. Illyes published a blog post detailing Googlebot’s architecture. The same week, the Search Off the Record podcast covered page weight and crawl thresholds. Mueller explained why core updates roll out in waves rather than all at once. Each one fills a gap that documentation alone left open.
The Gemini traffic data is the counterpoint. Google is open about how its crawlers and ranking systems operate, but the traffic flowing through its AI services is growing fast enough to show up in third-party data, and Google isn’t explaining that part.