Just about every SEO tool is powered by scraped search results. In order to get more insight from each scrape, and thus keep costs down for end users, tools often use this parameter to force extended search results on the first page, instead of the usual 10 organic results a user would see by default. Moz Pro, for example, has long standardized on &num=50 – so five "pages" of results per scrape.
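As a rough illustration (not Moz's actual scraping code, and the query value is invented), a scraper might request extended results by appending the num parameter to the search URL:

```python
from urllib.parse import urlencode

def build_serp_url(query: str, num: int = 50) -> str:
    """Build a search URL requesting `num` organic results on one page,
    via the (now-unreliable) num parameter."""
    params = {"q": query, "num": num}
    return "https://www.google.com/search?" + urlencode(params)

# One request returns what would otherwise be five "pages" of 10 results.
print(build_serp_url("example query"))
```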
This parameter had actually been deprecated for years, but continued to be unofficially supported. In mid-September, it slowly stopped working, forcing SEO tools to seek alternative methods. Some tools, including Moz and STAT, prepared an alternative we call "stitching," where we piece together a series of paginated results, 10 at a time, into one longer set of results. There are many difficulties with this approach, most of which can be mitigated or avoided, but the main implication is cost, which ends up being significantly higher, to the point of being unsustainable in many cases.
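The stitching approach described above can be sketched in a few lines. This is a simplified model, not the actual implementation: `fetch_page` is a hypothetical callable standing in for a real SERP request at a given result offset.

```python
def stitch_results(fetch_page, query: str, depth: int = 50, page_size: int = 10):
    """Stitch paginated results, `page_size` at a time, into one longer list.

    `fetch_page(query, start)` is assumed to return the list of results
    beginning at offset `start`, or an empty list when none remain.
    """
    results = []
    for start in range(0, depth, page_size):
        page = fetch_page(query, start)
        if not page:
            break  # ran out of results before reaching the requested depth
        results.extend(page)
    return results
```

Note the cost implication: reaching a depth of 50 now takes five separate requests instead of one &num=50 request, so per-keyword scraping cost scales with depth.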
This should also be seen in the context of SERP data costs rising in recent years generally, as tools are forced to mimic real browsers ever more closely in order to get accurate, representative rankings.