Every time I read about Plan 9 I get a little sad. Unix is so good at what it does that the better, being the enemy of the good, doesn't stand a chance.
There are lots of better alternatives to Unix out there. But with the incumbent's strengths being what they are, I think we'll be stuck with Unix for a long time to come.
Possibly the only thing that will change this is if it is ever deemed that the Unix model is inherently broken from a security point of view and one of these other alternatives provides a fix. Performance and conceptual models are not enough to ditch decades of investment in technology that is good enough.
There's a slight problem with simply deeming Plan 9 "better" and Unix (for whatever value of Unix-ness) "good". Depending on your definition of worth, it's probably easy enough to find a system that is even better, i.e. more "pure" in that regard. Obvious examples, if your focus is underlying concepts, would be Oberon and Lisp Machines, where it's straight access to data structures and functions, without superfluous parsing of lines of text and command line arguments...
Compared to truly different systems, Plan 9 is still Unix. I don't think the designers would disagree...
Plan 9 is Unix in the way that Unix is still Multics. So is Plan 9 still Multics?
Really, the superficial similarities should not overshadow the fundamental differences. QNX is also Unix by that definition, but is in fact a completely different system under the hood.
Compared to the others I've mentioned? Sure, and that would most likely mean that most systems are closer to Unix than, e.g., Smalltalk. Probably even including Windows, unless one takes a very favorable view of COM.
I'm getting a new MacBook in a few days (after 5 years with my current machine I desperately need an upgrade) and I'll leave this one running a permanent virtual machine with Plan 9 on it. In some sense it will be my "just write, no idle browsing" machine. There's something about Plan 9 I find great.
If the more optimistic predictions regarding the upcoming generation of non-volatile memory are true, then we may see storage and main memory merging. I think that would be a window of opportunity for new OSes, though I don't doubt that Linux could be made to run under that model too, somehow.
Unix will stay as long as the machine model is Von Neumann with shared memory and interrupts. If any architecture is sufficiently remote from that model, especially something without shared memory, Unix won't stand a chance.
Multiprocessors weren't commonplace until users of the Windows 9x kernel were dragged kicking and screaming to the XP kernel. 64-bit processors were not commonplace until AMD came out with a clever way to make them look like 32-bit ones that could run 32-bit Windows.
In the meantime, Unix users had been using 64-bit multi-processor boxes for ages. Yet there was no mainstream consumer 64-bit multi-processor architecture until there was a 64-bit multi-processor mainstream consumer OS.
The same applies here - most current improvement is directed towards making current x86 and ARM computer architectures faster because there is no popular corpus of software that could take advantage of different architectures. Sony, IBM and Toshiba tried with the Cell and failed. The Xeon Phi may prove to be an interesting stepping stone in that direction but it too uses a shared memory model.
I'd love to see some radical ideas tried, but, right now, I think I won't.
It's a bit tragic that the most popular OSes of today are the bastard descendants of the most popular minicomputer OSes of the '70s.
Although Unix was not designed for multi-processors originally, its main run-time abstraction (process networks) is easy enough to port to any platform with a common store. So there's not much to discuss about single-core vs multi-core: it's either one von Neumann machine or multiple von Neumann machines connected to a single abstract memory.
Remove the shared memory, however, and issues start to arise.
You say "radical ideas tried" but I believe that is not relevant. Radical ideas typically don't fly because they are radical. However you can already see things happening that are not radical but are breaking the Unix machine model very hard:
- "accelerators": these are really fully fledged co-processors with their own local memory. There is no easy way to conceptually delegate Unix threads to accelerators precisely because they don't share the memory. (Now with recent GPUs the architecture was modified to actually share the memory between the CPU and GPU so things become possible, but not all "accelerators" have that feature)
- on systems-on-chip you now have scratchpads next to cores. Some cores may not have access to the "main" memory of another core (e.g. the radio controller in telephones) although they are fully-fledged cores as well. Because of this lack of shared memory, it is not conceptually possible to envision a Unix system where processes access these extra processors transparently.
tl;dr there are already hardware architectures with separated memories, and Unix can't cope with that easily because its main abstraction requires shared memory.
Linux is better. In comparison, Plan 9 is like primordial soup, whereas Linux is an advanced alien life form from the future.
There are so many parts of Linux that are far better than Plan 9, and polished for production workloads. Real-life solutions are sometimes messy, and take many evolutions to get right.
Sure, elegant and simple maths is good. But if the maths is just wrong, and doesn't work, then the more complicated maths that actually works is better.
Linux is not better. Linux looks better, feels better and so on, but under the hood it is just a rehash of the predecessor of Plan 9.
I'm using Linux every day on both my desktop and a whole army of servers, and I have some experience with both Plan 9 and other (micro-kernel-based) systems. Linux (and in fact every Unix flavour) has some systemic problems that those other OS variants attempt to address. That the user experience is less polished does not detract from that.
Maths don't enter into it.
To re-phrase your analogy: Linux is a crocodile, it is ancient, dangerous and well adapted to its environment. So well adapted that it is a force to be reckoned with, even if you're an advanced creature from the future.
I would lean towards the opposite.
Plan 9 is a creature that was designed well from day one, but initially limited by restrictive licensing.
Linux is probably one of the largest hack-and-slash jobs in programming history, but it was available under very free and liberal terms from the beginning, and development was open to everyone; hence it garnered more community support and, as a result, was more successful.