Hacker News | meroes's comments

It’s one of the things I noticed in France and Italy. After a few days you notice the men’s silhouettes are alien. Not in a bad way, but noticeable.

But is that due to a lack of protein, or to less fat and sugar? I'm sure that, minus the fat, many American men would also lack muscle.

That said, Italy and France are known for smoking a lot, which suppresses the appetite. Your original observation was about the Swiss, though (land of milk, chocolate, and cheese)?


Could be related to the "sexy Frenchman" stereotype.

This all just boils down to the Chinese Room thought experiment, where I'm pretty sure the consensus is that nothing in the experiment (not the person inside, not the emergent room as a whole, etc.) understands Chinese the way we do.

Another example from Searle: a computer simulating digestion is not digesting the way a stomach does.

The people saying AGI can’t form from LLMs are on the consensus side of the Chinese Room. The digestion simulator could tell us where every single atom of a stomach digesting a meal is, and it’s still not digestion. Only once the computer simulation breaks down food particles chemically and physically is it digestion. Only once an LLM receives photons, or has the physical capacity to receive them, is there anything like “seeing a night sky”.


Does this mean you think the collective datacenter mass of ChatGPT or something more emergent is AGI?


And I never took biology past sophomore year, and yet the first time I listened to Aubrey de Grey I knew he was wrong to propose that people who will live to 1,000+ had already been born (as of 2005).


Anyone have links for these:

> Yann LeCun was first, fully coming around to his own, very similar critique of LLMs by end of 2022.

> The Nobel Laureate and Google DeepMind CEO Sir Demis Hassabis sees it now, too.


For Yann LeCun, he says it here: https://www.weforum.org/meetings/world-economic-forum-annual...

He's personally moved on from LLMs and is exploring new architectures built around world models.

Which he describes here: https://x.com/ylecun/status/1759933365241921817

Also, I think the 2022 quote refers to this paper by Yann: https://openreview.net/pdf?id=BZ5a1r-kVsf


This is nonsense. LeCun is working on LLMs (and all the rest of it): https://arxiv.org/abs/2509.14252

His work isn't all that different from what many other people in the space are doing. He just presents himself as far more iconoclastic and "out there" than he actually is.


I'm just paraphrasing what he says in the interview I linked.


Or when. Driving during peak commute hours really makes you a sardine in a box, and dense traffic by its nature leaves less room for intervention-worthy events.


That's the magic answer. It's a (or the) hard problem, but one permeable to inquiry. The top neuroscience research into consciousness, however, doesn't seem to be the kind of inquiry Dennett is referencing.


I like the more specific versions of those terms: the feeling of a toothache and the taste of mint. There's no need to grasp anything; they're feelings. There's no feeling when a metal bar is bent by a press.

Why they focus on feelings is a different issue.


More than half of STEM grads with careers don’t have a career in STEM.


Don't have one, or didn't start with one?


And here I am - an ancient history grad with a career in STEM. Life is odd.


Including techbros thinking they have the answer to every question humanity has ever asked?

