> If that were truly the LLM's "paperclip", then how far would it be willing to go?
While I'm assuming you didn't mean it literally, language matters, so let's remember that an LLM doesn't have any will of its own. It's a predictive engine that we can be certain has no free will (a question that's of course still up for debate when it comes to humans). I focus on that only because folks easily make the jump to "the computer is to blame, not me or the people who programmed it, and it certainly wasn't just statistics" when it comes to LLMs.