>What I'm interested about is ability "reason" - analyze, synthesize knowledge, formulate plans, etc. And LLMs demonstrated those abilities.

I disagree that they have demonstrated those abilities. In my interactions with them, I have often found that they "correct" themselves when I push back, only to then say something that logically implies exactly the same incorrect claim.

They have no model of the subject they're talking about, and therefore they don't recognise when they are missing information that is required to draw the right conclusions. They are incapable of asking goal-driven questions to fill those gaps.

They can only mimic reasoning in areas where the sequence of reasoning steps has been verbalised many times over, such as simple maths examples or logic puzzles that have been endlessly repeated online.
