I mostly agree with you, but I do think it's a fair point to suggest making it a straight-up paywall then. If they want some clients to pay for the content based on heuristics and black-box algorithms, that's going to be discriminatory; we just don't know against which groups (could be users on cheap connections or in lower-income countries, could be unusual user agents like Ladybird on macOS, could be anything).
The scope of the average paywall is quite different: it lets a few specific crawlers through for indexing, but doesn't mean to let anyone who isn't subscribed read. I can see the similarity you mean, and it's an interesting case to compare with, but "everyone should pay, but we want to be findable" seems different to me from "only things that look like bots to us should pay". Perhaps also because the implementation of the former is easy (look up the guidance for the search engines you want to appear in; it's a plain allowlist) and the latter is nigh impossible (it needs heuristics, and bot operators can try not to match them, while an average person can't do anything about it).
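To illustrate why the allowlist route is the easy one, here's a minimal sketch in Python of the forward-confirmed reverse DNS check that Google and Bing document for verifying their crawlers. The suffix list is an illustrative subset I picked, not an authoritative list; each engine publishes its own verified hostnames:

    import socket

    # Illustrative subset -- each engine documents its own verified hostname
    # suffixes (e.g. Googlebot reverse-resolves to *.googlebot.com or
    # *.google.com); check their docs for the current lists.
    ALLOWED_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

    def is_verified_crawler(ip: str) -> bool:
        """Forward-confirmed reverse DNS check: no heuristics involved."""
        try:
            host, _aliases, _addrs = socket.gethostbyaddr(ip)  # PTR lookup
        except OSError:
            return False
        if not host.endswith(ALLOWED_SUFFIXES):
            return False
        # Forward-confirm: the claimed hostname must resolve back to the
        # same IP, otherwise anyone could publish a fake PTR record.
        try:
            forward_ips = {info[4][0] for info in socket.getaddrinfo(host, None)}
        except OSError:
            return False
        return ip in forward_ips

That's the whole trick: a deterministic check with a clear yes/no answer, where the only maintenance is keeping the suffix list current. There's no comparably clean test for "looks like a bot to us", which is exactly why that side ends up as opaque scoring that ordinary people can't appeal.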