
I was kind of nodding and agreeing with you, but then I remembered a bunch of optimal code from back in those days that was heavily unrolled and very large, so it depended on the type of work being done. I guess one could say that purely algorithmic code tended to have size and speed correlated, whereas code that was bottlenecked on memory reads/writes could benefit from being large. Cache didn't have the big impact it does today.
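
To illustrate the kind of trade-off I mean, here's a minimal sketch (my own hypothetical example, not code from back then): a plain copy loop versus a 4x-unrolled one. The unrolled body is several times larger but takes one branch per four elements instead of one per element, which paid off when the loop was bound by memory reads/writes rather than instruction fetch.

    #include <stddef.h>

    /* Compact version: small code, one branch per element. */
    void copy_simple(int *dst, const int *src, size_t n) {
        for (size_t i = 0; i < n; i++)
            dst[i] = src[i];
    }

    /* Unrolled version: roughly 4x the code, one branch per four elements. */
    void copy_unrolled(int *dst, const int *src, size_t n) {
        size_t i = 0;
        for (; i + 4 <= n; i += 4) {
            dst[i]     = src[i];
            dst[i + 1] = src[i + 1];
            dst[i + 2] = src[i + 2];
            dst[i + 3] = src[i + 3];
        }
        for (; i < n; i++)   /* handle the leftover elements */
            dst[i] = src[i];
    }

Today a compiler (or a big instruction cache) mostly erases the difference, but on machines where code size and cache pressure barely mattered, the bigger version was simply faster.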

