I may be misunderstanding this measure, but I don't think that's going to be mitigated.
If I understand correctly, this requires browsers requesting a page to do a small amount of "work" for no reason other than demonstrating they're willing to do it. As a one-off cost for devices used by humans, it's barely noticeable. For bots reading millions of pages, the cumulative cost is untenable - they'll just move on to easier targets.
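For concreteness, I'm assuming this works roughly like a hashcash-style proof-of-work challenge: the server hands the client a seed, and the client must grind through nonces until a hash meets a difficulty target. A minimal sketch (the function names and the seed format are my own, not from any particular implementation):

```python
import hashlib
import itertools

def solve_challenge(seed: str, difficulty_bits: int = 20) -> int:
    """Brute-force a nonce so SHA-256(seed:nonce) has `difficulty_bits` leading zero bits."""
    target = 2 ** (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{seed}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(seed: str, nonce: int, difficulty_bits: int = 20) -> bool:
    """Server-side check: one hash, regardless of how hard the solve was."""
    digest = hashlib.sha256(f"{seed}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < 2 ** (256 - difficulty_bits)
```

The asymmetry is the whole point: solving takes on average 2^difficulty_bits hashes, verifying takes one. A human's browser pays the solve cost once per visit; a scraper pays it once per page, millions of times over.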
However, that only works against bots whose purpose is to ingest large quantities of text.
A bot whose purpose is to make posts, upvote things, or reply to other comments is much less sensitive to this measure, because it doesn't need to harvest millions of pages.
Wow ok. Horseshoe political theory in action.