> If that were truly the LLM's "paperclip", then how far would it be willing to go?

While I assume you didn't mean it literally, language matters, so it's worth remembering that an LLM has no will of its own. It's a predictive engine that we can be certain lacks free will (whether humans have it is, of course, still debated). I focus on this because people readily jump to "the computer is to blame, not me or the people who programmed it, and it certainly wasn't just statistics" when it comes to LLMs.
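
To make "predictive engine" concrete, here's a minimal sketch (assuming the Hugging Face transformers library and the small gpt2 checkpoint, neither of which is named above): all the model does is map a prompt to a probability distribution over next tokens, and generation is just repeated sampling from that distribution.

    # Minimal sketch: an LLM's "decision" is a probability distribution
    # over next tokens. Assumes `pip install torch transformers` and the
    # public "gpt2" checkpoint (illustrative choices, not from the parent).
    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    prompt = "The paperclip maximizer will"
    inputs = tokenizer(prompt, return_tensors="pt")

    with torch.no_grad():
        # logits has shape (1, seq_len, vocab_size); only the last
        # position matters for predicting the next token.
        logits = model(**inputs).logits

    # The model's entire output is this distribution; "choosing" a word
    # is just sampling from it.
    next_token_probs = torch.softmax(logits[0, -1], dim=-1)
    top = torch.topk(next_token_probs, 5)
    for prob, tok_id in zip(top.values, top.indices):
        print(f"{tokenizer.decode(int(tok_id))!r:>12}  p={prob:.3f}")

There's no goal or intent anywhere in that loop; just statistics over a training corpus.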
