
The history of art or philosophy spans millennia.

The effective history of computing spans a lifetime or three.

There's no sense comparing the two. In the year 2500 it might make sense to be disappointed that people don't compare current computational practices with things done in 2100 or even 1970, but right now, to call what we have "history" does a disservice to the broad meaning of that term.

Another issue: art and philosophy have very limited or zero dependence on a material substrate. Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: cpu speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution, input device characteristics, sensory modalities accessible via digital-to-analog conversion, ...). To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.





If Alan Kay doesn't respond directly to this comment, what is Hacker News even for? :)

You're not wrong about history, but that only strengthens Kay's case. E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025. (Note the modeless GUI interaction.) Well, guess what? Such a thing definitely doesn't exist. And that can only point to programmers having a general lack of knowledge about their incredibly short history.

(In reality my hope is some slice of devs have achieved this and I've summoned links to their projects by claiming the opposite on the internet.)

Edit: just so I get the right incantation for summoning links-- I'm talking about the whole enchilada of a visual language that runs and rebuilds the user's flowchart program as the user modelessly edits it.

1: https://www.youtube.com/watch?v=QQhVQ1UG6aM


https://news.ycombinator.com/user?id=alankay has not been active on Hacker News for several years now.

At 85 he has earned the peace of staying away from anything and everything on the internet.



Yes, Alan Kay is very ill.

My gut says your main complaint is largely with the modern web ecosystem? Games can run circles around that application, as an obvious inspiration. But high-end architectural tools are probably more of what you have in mind.

The easy example I used to use to really blow people's minds on what was possible was Mathematica.

That is to say, it isn't so much a lack of knowledge of history as a lack of knowledge of the present. And, from a lot of folks, a seeming unwillingness to pay for some things.


> Such a thing definitely doesn't exist

Isn't that pretty much how things like Simulink and GNU Radio flowgraphs work?
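Pretty much, yes, at least for the dataflow half of it. GNU Radio Companion is a visual flowgraph editor, and the graph compiles down to ordinary Python against the gr API. A minimal sketch of what that generated code roughly looks like (the block choices, names, and values here are my own toy example, not anything from the GRAIL video):

    from gnuradio import gr, blocks, analog

    # Minimal flowgraph: cosine source -> throttle -> null sink.
    # GRC generates Python in roughly this shape from the visual graph.
    class ToneFlowgraph(gr.top_block):
        def __init__(self):
            gr.top_block.__init__(self, "tone_flowgraph")
            samp_rate = 32000
            src = analog.sig_source_f(samp_rate, analog.GR_COS_WAVE, 440, 0.5)
            thr = blocks.throttle(gr.sizeof_float, samp_rate)
            sink = blocks.null_sink(gr.sizeof_float)
            self.connect(src, thr, sink)

    if __name__ == "__main__":
        tb = ToneFlowgraph()
        tb.start()
        input("Flowgraph running; press Enter to stop.")
        tb.stop()
        tb.wait()

Whether that counts as GRAIL's "rebuilds the program as the user modelessly edits it" is debatable; as far as I know, GRC regenerates and restarts the flowgraph rather than editing it live.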


> E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025

Why? What problem did it solve that we're suffering from in 2025?


> The effective history of computing spans a lifetime or three.

That argument actually strengthens the original point: Even though it's been that short, youngsters often still don't have a clue.


Exactly. It's like actors two generations after Thespis not knowing the full history of their art. No excuse, really.

This is just "old person yelling at cloud" territory, though? People often don't know the actors, singers, authors, inventors, whatever from the last few generations. They know the current generation and maybe some originals.

But the rhyme and reason for who is known is not at all obvious. Outside of "who is getting marketed."


The man on the street may not know this history, but serious actors, singers, authors, and inventors themselves certainly know what came before them. If not, they are presumably not actually that interested in their own vocation (which is also normal, by the way).

Do you know this for fact? My gut is that most performers will know of the performers they watched for inspiration. Just like athletes. But few will know the history of their field.

I will agree that the "greats" seem to tend to know all of this. Such that I think I'm agreeing with your parenthetical there. But most practitioners?


I don't know it for a fact, no. BUT... I would be very surprised if the average working film director hasn't heard of Ernst Lubitsch or Ringo Lam (here I'm deliberately picking names that aren't commonly known by the public at large, unlike Steven Spielberg). Obviously we could do this for lots of vocations, but really my statement above was about serious practitioners, people who are deliberately trying to improve their art, rather than just collecting a check (which, again, is normal and fine!).

I'll confirm (and then nerdily complicate) your thesis for the art-form I practiced professionally for the first half of my adult life: yes, every serious actor I've been privileged to work with knows of previous performers, and studies texts they leave behind.

I owned at one time a wonderful two-volume anthology called Actors on Acting, which collected analysis and memoir and advice going back... gosh, to Roman theatre, at least. (The Greeks were more quasi-religious, and therefore mysterious - or maybe the texts just haven't survived. I can't remember reading anything first-hand, but there has been a good deal of experimental "original practice" work done exploring "how would this have worked?"). My graduate scholarship delved into Commedia dell'Arte, and classical Indian theatre, as well as 20th century performers and directors like Grotowski, and Michael Chekhov, and Joan Littlewood. Others, of course, have divergent interests, but anyone I've met who cares can geek out for hours about this stuff.

However, acting (or, really, any performance discipline) is ephemeral. It involves a live experience, and even if you have a filmed version of a seminal performance (and mostly you don't, even for the 20th century), it's barely anything like actually being there. Nor, until very recently, did anyone really write anything about rehearsal and training practice, which is where the real work gets done.

Even for film, which coincidentally covers kinda the same time-period as "tech" as you mean it, styles of performance - and the camera technology which enables different filming techniques - have changed so much that what's demanded in one generation isn't much like what's wanted in the next. (I think your invocation of film directors is more apt: there are more "universal" principles in composition and framing than there are in acting styles.)

Acting is a personal, experiential craft, which can't be learned from academic study. You've got to put in hours of failure in the studio, the rehearsal room, and the stage or screen to figure out how to do it well.

Now, here's where I'll pull this back to tech: I think programming is like that, too. Code is ephemeral, and writing it can only be learned by doing. Architecture is ephemeral. Tooling is ephemeral. So, yes: there's a lot to be learned (and should be remembered) from the lessons left by previous generations, but everything about the craft pulls its practitioners in the opposite direction. So, like, I could struggle through a chapter of Knuth, or I could dive into a project of my own, and bump up against those obstacles and solve them for myself. Will it be as efficient? No, but it'll be more immediately satisfying.

Here's another thing I think arts and tech have in common: being a serious practitioner is seldom what gets the prize (if by that you mean $$$). Knuth's not a billionaire, nor are any of my favorite actors Stars. Most people in both disciplines who put in the work for the work's sake get out-shined by folks lucky enough to be in the right place at the right time, or who optimize for hustle or politics or fame. (I've got no problem with the first category, to be clear: god bless their good fortune, and more power to them; the others make me sad about human nature, or capitalism, or something.) In tech, at least, pursuing one's interest is likely to lead to a livable wage - but let's see where our AI masters leave us all in a decade, eh?

Anyway, I've gone on much too much, but you provoked an interesting discussion, and what's the internet for if not for that?


> Another issue: art and philosophy have very limited or zero dependence on a material substrate. Computation has overwhelming dependence on the performance of its physical substrate

That's absolutely false. Do you know why MCM furniture is characterized by bent plywood? It's because we developed the glues that enabled it during World War II. In fashion you had a lot more colors beginning in the mid-1800s because of the development of synthetic dyes. It's no coincidence that oil paint was perfected around Holland (a major source of flax and thus linseed oil), which is exactly what the Dutch masters worked in. Architectural McMansions took off because of the development of pre-fab roof trusses in the 70s and 80s.

How about philosophy? Well, the industrial revolution and its consequences have been a disaster for the human race. I could go on.

The issue is that engineers think they're smart and can design things from first principles. The problem is that they're really not, and they design things from first principles anyway.


> To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.

True, they might not all be "dramatic lessons" for us, but to ignore them and assume that they hold no lessons for us is also a tragic waste of resources and hard-won knowledge.


It's because CS is not cared about as a true science for the most part. Nearly all of the field is focused on consolidating power and money. No one cares to make a comprehensive history, since it might give your competitors an edge.

Art and Philosophy are hardly regarded as science, either. Actually, less so. Yet...

Philosophy is definitely a social or formal science (depending on who you ask).

I'm well read on the topic and I've never heard it referred to as "a social or formal science." Where is this coming from?

I had thought that was the common definition and didn't need much thought...

My dictionary absolutely implies that; it even claims that all the sciences were split off from Philosophy and that a common modern topic of Philosophy is the theory of science. The point of Philosophy is to define truth in all aspects; how is that not science? It's even in the name: "love of wisdom". Philosophy is even more fundamental and formal than mathematics. Mathematics asks what sound systems are, what properties they have, and how they can be generalized. Philosophy asks what something truly is, what it means to know, what it means to have a system, and whether it's real. The common trope of going ever more fundamental/abstract runs: "biology -> chemistry -> physics -> mathematics -> philosophy"


Science used to be referred to as natural philosophy.

You're confusing computer science with economics. The ahistorical nature of classical and neoclassical economics basically declares that history is irrelevant. Economists do not really concern themselves with economic history, like at all.

> art [has] very limited or zero dependence on a material substrate

This seems to fundamentally underestimate the nature of most artforms.


Your first sentences already suggest one comparison between the histories of computing and philosophy: the history of computing ought to be much easier. Most of it is still in living memory. Yet somehow, the philosophy people manage it while we computing people rarely bother.

I always think there is great value in having a whole range of history-of-X courses.

I once thought about a series of PHYS classes that focus on historical ideas and experiments. Students are supposed to replicate the experiments. They have to read book chapters and papers.


History of physics is another history where we have been extremely dependent on the "substrate". Better instruments and capacity to analyze results, obviously, but also advances in mathematics.

That's not that far off the standard physics course, is it? Certainly lots of labs I took were directly based on historical experiments.

> Art and philosophy have very limited or zero dependence on a material substrate

Not true for either. For centuries it was very expensive to paint with blue due to the cost of blue pigments (which were essentially crushed gemstones).

Philosophy has advanced considerably since the time of Plato and much of what it studies today is dependent on science and technology. Good luck studying philosophy of quantum mechanics back in the Greek city state era!


Just because a period of history is short doesn't make it _not history_.

Studying history is not just, or even often, a way to rediscover old ways of doing things.

Learning about the people, places, decisions, discussions, and other related context is of intrinsic value.

Also, what does "material substrate" have to do with history? It sounds here like you're using it literally, in which case you're thinking like an engineer and not like a historian. If you're using it metaphorically, well, art and philosophy are absolutely built on layers of what came before.


The rate of change in computer technology has been orders of magnitudes faster than most other technologies.

Consider transport. Millennia ago, before the domestication of the horse, the fastest a human could travel was by running. That's a peak of about 45 km/h, but around 20 km/h sustained over a long distance for the fastest modern humans; it was probably a bit less then. Now that's about 900 km/h for commercial airplanes (45x faster) or 3500 km/h for the fastest military aircraft ever put in service (178x faster). Space travel is faster still, but so rarely used for practical transport I think we can ignore it here.

My current laptop, made in 2022, is thousands of times faster than my first laptop, made in 1992. It has about 8000 times as much memory. Its network bandwidth is over 4000 times as much. There are few fields where the magnitude of human technology has shifted by such large amounts in any amount of time, much less a fraction of a human lifespan.
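For scale, the back-of-the-envelope arithmetic behind those multipliers (the inputs are just the figures quoted above, nothing measured):

    # Rough speed-up ratios from the figures above.
    sustained_run_kmh = 20        # fast modern human over a long distance
    airliner_kmh = 900
    fastest_military_kmh = 3500

    print(airliner_kmh / sustained_run_kmh)          # 45.0  -> ~45x
    print(fastest_military_kmh / sustained_run_kmh)  # 175.0 -> ~175-178x,
                                                     # depending on the baseline

    # Versus roughly 30 years of laptops (1992 -> 2022), per the same figures:
    # memory ~8000x, network bandwidth ~4000x, overall speed "thousands of times".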


That gives even more reason to study the history of CS. Even artists study contemporary art from the last few decades.

Given the pace of CS (like you mentioned), 50 years might as well be centuries, so early computing devices and solutions are worth studying to understand how the technology has evolved, what lessons we can learn, and what we can discard.


> The effective history of computing spans a lifetime or three.

"Computer" goes back to 1613 per https://en.wikipedia.org/wiki/Computer_(occupation)

https://en.wikipedia.org/wiki/Euclidean_algorithm was 300 BC.

https://en.wikipedia.org/wiki/Quadratic_equation has algorithms back to 2000 BC.


> Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: cpu speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution

This was clearly true in 01970, but it's mostly false today.

It's still true today for LLMs and, say, photorealistic VR. But what I'm doing right now is typing ASCII text into an HTML form that I will then submit, adding my comment to a persistent database where you and others can read it later. The main differences between this and a guestbook CGI 30 years ago, or maybe even a dialup BBS 40 years ago, have very little to do with the performance of the physical substrate. What I'm doing has more in common with the People's Computer Company's Community Memory 55 years ago (?), using teletypes and an SDS 940, than with LLMs and GPU raytracing.

Sometime around 01990 the crucial limiting factor in computer usefulness went from being the performance of the physical substrate to being the programmer's imagination. This happened earlier for some applications than for others; livestreaming videogames probably requires a computer from 02010 or later, or special-purpose hardware to handle the video data.

Screensavers and demoscene prods used to be attempts to push the limits of what that physical substrate could do. When I saw Future Crew's "Unreal", on a 50MHz(?) 80486, around 01993, I had never seen a computer display anything like that before. I couldn't believe it was even possible. XScreensaver contains a museum of screensavers from this period, which displayed things normally beyond the computer's ability. But, in 01998, my office computer was a dual-processor 200MHz Pentium Pro, and it had a screensaver that displayed fullscreen high-resolution clips from a Star Trek movie.

From then on, a computer screen could display literally anything the human eye could see, as long as it was prerendered. The dependence on the physical substrate had been severed. As Zombocom says, the only limit was your imagination. The demoscene retreated into retrocomputing and sizecoding compos, replaced by Shockwave, Flash, and HTML, which freed nontechnical users to materialize their imaginings.

The same thing had happened with still 2-D monochrome graphics in the 01980s; that was the desktop publishing revolution. Before that, you had to learn to program to make graphics on a computer, and the graphics were strongly constrained by the physical substrate. But once the physical substrate was good enough, further improvements didn't open up any new possible expressions. You can print the same things on a LaserWriter from 01985 that you can print on the latest black-and-white laser printer. The dependence on the physical substrate has been severed.

For things you can do with ASCII text without an LLM, the cut happened even earlier. That's why we still format our mail with RFC-822, our equations with TeX, and in some cases our code with Emacs, all of whose original physical substrate was a PDP-10.

Most things people do with computers today, and in particular the most important things, are things that people (fewer of them) have been doing with computers in nearly the same way for 30 years, when the physical substrate was very different: 300 times slower, 300 times smaller, with a much smaller network.

Except, maybe, mass emotional manipulation, doomscrolling, LLMs, mass surveillance, and streaming video.

A different reason to study the history of computing, though, is the sense in which your claim is true.

Perceptrons were investigated in the 01950s and largely abandoned after Minsky & Papert's book, and experienced some revival as "neural networks" in the 80s. In the 90s the US Postal Service deployed them to recognize handwritten addresses on snailmail envelopes. (A friend of mine who worked on the project told me that they discovered by serendipity that decreasing the learning rate over time was critical.) Dr. Dobb's hosted a programming contest for handwriting recognition; one entry used a neural network, but was disqualified for running too slowly, though it did best on the test data they had the patience to run it on. But in the early 21st century connectionist theories of AI were far outside the mainstream; they were only a matter of the history of computation. (Although a friend of mine, in 02005 or so, did explain to me how ConvNets worked and that they were the state-of-the-art OCR algorithm at the time.)
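(For anyone who hasn't seen one: the perceptron update plus that learning-rate trick fits in a dozen lines. This is just a toy sketch of the general idea on made-up data, not the USPS system or the contest entry:)

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy linearly separable data: the label is the sign of a fixed linear function.
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0.0, 1, -1)

    w, b = np.zeros(2), 0.0
    for epoch in range(20):
        lr = 1.0 / (1 + epoch)           # decrease the learning rate over time
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:   # misclassified: nudge the boundary
                w += lr * yi * xi
                b += lr * yi

    print("training accuracy:", np.mean(np.sign(X @ w + b) == y))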

Then ImageNet changed everything, and now we're writing production code with agentic LLMs.

Many things that people have tried before that didn't work at the time, limited by the physical substrate, might work now.


Reconsider this post after playing with an HDR10 480Hz monitor.

One question: why are you prepending a zero to every single year?

Usually it's because of an initiative by the Long Now Foundation that is supposed, among other things, to raise awareness about their 10,000-year clock and what it stands for.

See https://longnow.org/ideas/long-now-years-five-digit-dates-an...


And I, semi-seriously, say that it's only the "Medium Now", because they only prepend one zero.

But I hate it. It makes most readers stumble over the dates, and it does so to grind an axe that is completely unrelated to the topic at hand.


Everyone hates it.

There are always a few trolls who complain about how other people spell their words, wear their hair, format their dates, choose their clothes, etc. Usually on HN these complaints get flagged into oblivion pretty quickly. But most people don't care, preferring to read about things like the interaction between technological development and digital art forms than complaints about whether "aluminum" should actually be spelled "alumium".

I would rather attribute the term "few trolls", if it is to be used at all, to the people pushing an agenda about a thing happening 8000 years from now.

You do know people have imagination, and guys back in 1970 already imagined pretty much everything we use now, and even posed problems that are not going to be solved by our computing power.

Dude, watch the original Star Trek from the 1960s; you will be surprised.

You might also be surprised that all the AI stuff that is so hyped nowadays was already invented in the 1960s; they just didn't have our hardware to run large models. Read up on neural networks.


> The history of art or philosophy spans millennia.

And yet for the most part, philosophy and the humanities' seminal development took place within about a generation, viz., Plato and Aristotle.

> Computation has overwhelming dependence on the performance of its physical substrate [...].

Computation theory does not.


Philosophy died with Newton.


