I'd argue that technology usually has the opposite effect.
Through innovation, technology gradually improves, which leads to higher quality at lower cost. So perhaps people will build models that produce higher-quality writing, which will out-compete the current "generated sludge".
I'm also not convinced that a lack of data will cause stagnation. Sure, more data can improve LLMs, but it seems likely there are other possible improvements to training algorithms that don't require more data.