Making Sense of Your September Data
September proved to be an unusually chaotic month for anyone tracking SEO performance. At the centre of it all was Google’s decision to discontinue a little-known feature that let users view 100 search results per page instead of the standard 10. Most people had no idea this feature even existed, but its removal sent shockwaves through the SEO industry.
The fallout hit quickly. SEMrush and Ahrefs, two of the most widely used ranking trackers, struggled to maintain accurate reporting. For about a week, many SEO professionals found themselves staring at incomplete data, gaps in their reports, and clients asking uncomfortable questions about sudden ranking changes that hadn’t actually happened.
Interestingly, not all tools were affected. SEOPress Insights continued working throughout the disruption because it uses a different data collection method that doesn’t rely on the num=100 parameter. This highlighted something important: when it comes to SEO tracking, having all your eggs in one basket can leave you exposed.
What Google Changed on 10th September
Google removed the num=100 parameter from search results on 10th September. This parameter had existed for years, allowing anyone who knew about it to display 100 results on a single page rather than paginating through multiple sets of 10.
Hardly anyone browsing Google actually used this. Why would you? Who wants to scroll through 100 results when the answer you need is usually in the top five?
But here’s the thing: SEO tools had built entire data collection systems around it. SEMrush, Ahrefs, and countless other SERP tracking platforms were using these 100-result pages to efficiently scrape ranking data. Pull one page instead of ten, get all the positions you need in a single request. Elegant, efficient, and completely reliant on Google maintaining a feature that served virtually no real users.
When Google switched it off, those systems broke.
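To make the efficiency argument concrete, here is a minimal sketch of the request arithmetic involved. The `q`, `start`, and `num` parameters reflect Google's public search URL format; the tracker logic itself is purely illustrative and is not any vendor's actual code.

```python
# Illustrative sketch: how many page fetches a rank tracker needs to see
# the top 100 positions, with and without the num=100 parameter.
from urllib.parse import urlencode

def serp_urls(query: str, depth: int, per_page: int) -> list[str]:
    """Build the result-page URLs needed to cover `depth` positions."""
    urls = []
    for start in range(0, depth, per_page):
        params = {"q": query, "start": start, "num": per_page}
        urls.append("https://www.google.com/search?" + urlencode(params))
    return urls

# Before 10th September: one request covered the top 100 positions.
before = serp_urls("wordpress seo plugin", depth=100, per_page=100)

# After: the same coverage takes ten paginated requests.
after = serp_urls("wordpress seo plugin", depth=100, per_page=10)

print(len(before), len(after))  # 1 10
```

A tenfold jump in requests per keyword, multiplied across millions of tracked keywords, is why the tools that depended on this shortcut broke rather than simply slowing down.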
The Impact on SEMrush and Ahrefs
The problems showed up almost immediately. Between 10th and 17th September, SEMrush reports showed noticeable gaps and irregularities. The SEMrush Sensor, which tracks Google algorithm volatility, produced anomalous readings during this period because its own data collection had been disrupted.
Ahrefs users encountered similar issues. Ranking data became patchy, particularly for keywords where sites ranked beyond position 10. For anyone running client reports that week, it meant awkward conversations and a lot of explaining.
Both platforms have since adjusted their data collection methods and reporting has stabilised. The gaps were temporary, but they exposed just how dependent the industry had become on reverse-engineering Google’s systems rather than using official APIs.
What Happened to Google Search Console Data
A few days after the num=100 parameter disappeared, something else became apparent. Website owners checking their Google Search Console data started seeing significant drops:
- Impressions fell across the board
- Keyword visibility appeared to decrease
- Overall metrics suggested sites were performing worse
If you monitor your WordPress site through SEOPress PRO, which syncs Search Console data automatically, these changes would have been particularly obvious. The numbers looked alarming at first glance.
But here’s what actually happened. Those impression drops weren’t reflecting declining performance. They were showing more accurate data for the first time in years.
The Data Was Always Wrong
Before 10th September, Google Search Console was counting impressions from real users and from all those SEO tools scraping data via the num=100 pages. Every time SEMrush checked your rankings, every time Ahrefs pulled SERP data, every time any tracking tool accessed those 100-result pages, Google’s systems logged impressions.
These weren’t real searches. They were bots doing their job, but they were inflating your numbers significantly. Your impression counts included thousands of automated queries that had nothing to do with actual people looking for your content.
After 10th September, those bot-generated impressions disappeared. What you’re seeing now in Search Console is genuine user activity. The data looks lower because it’s finally accurate.
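The scale of the effect is easiest to see with a worked example. The numbers below are entirely hypothetical, chosen only to illustrate the mechanism; actual inflation varied by keyword and by how heavily it was tracked.

```python
# Hypothetical figures showing how scraper visits to 100-result pages
# could inflate a keyword's monthly Search Console impressions.
real_user_impressions = 1_200   # genuine monthly searches (hypothetical)
daily_tool_scrapes = 40         # rank trackers checking this keyword daily (hypothetical)
days_in_month = 30

# Each scrape of a 100-result page logged an impression for every site on it.
bot_impressions = daily_tool_scrapes * days_in_month

reported_before = real_user_impressions + bot_impressions  # pre-change figure
reported_after = real_user_impressions                     # post-change figure

print(reported_before)  # 2400
print(reported_after)   # 1200
```

Under these assumptions the reported figure halves overnight while genuine visibility is unchanged, which is exactly the pattern many site owners saw in mid-September.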
What This Means Practically
If you’re running an agency or managing SEO for clients, this creates an immediate challenge with historical reporting. Comparing August to October figures directly will show a drop that needs explaining. September data will look odd no matter how you slice it.
The key is documentation. Make sure your team understands what happened and why. Add notes to client reports explaining the data shift. Otherwise you’ll spend months fielding questions about a ranking decline that never actually occurred.
There’s an upside though. More accurate Search Console data means better strategic decisions. You can now see which keywords are genuinely driving impressions from real users rather than having your numbers skewed by tracking bots. For conversion analysis, this makes the data considerably more useful.
Moving Forward
A few practical steps worth taking now:
Check your Google Search Console setup if you haven't done so recently. It's still the most reliable free tool for monitoring genuine search performance and catching indexing issues before they become problems.
Use multiple tracking sources where budget allows. The fact that SEOPress Insights kept working whilst SEMrush and Ahrefs struggled demonstrates why relying on a single tool can be risky. Different data collection methods provide redundancy when one approach fails.
Treat post-September data as your new baseline. The figures you’re seeing now are more representative of actual user behaviour. Use them as the foundation for tracking real improvements going forward.
Be wary of year-on-year comparisons for the next twelve months. Until the change has been in place for a full year, every month's figures will be set against a bot-inflated baseline: September 2025 won't be directly comparable to September 2024, and the same applies to each month that follows until the methodology change has cycled through.
The disruption was frustrating whilst it lasted, but we’ve ended up with cleaner data as a result. Search Console metrics now reflect genuine user activity rather than a mixture of real searches and bot traffic. That’s worth the temporary inconvenience of explaining the numbers to clients.
Sometimes the industry needs reminding that we’re building measurement systems on top of a platform we don’t control. Google can change the rules whenever it likes, and tools that depend on unofficial access methods will always carry that risk. September was an expensive lesson for some, but at least the data coming out the other side is more trustworthy.