It's _still_ pointless. There's no chance of displacing Linux or the BSDs in practical terms, and there are no particularly interesting ideas in it. We already know you can build a Unix on top of a microkernel and it will sorta work.
Enough, already. Kill this late-1980s design and build something interesting already. Surely at least one of the things that happened between 1990 and 2011 in hardware, software and networking warrants a rethink of OS assumptions rather than this stale rehashing?
Don't you think it's a bit tragic that the two most sophisticated OSs widely available are either direct descendants of or largely inspired by AT&T's 1960s technology, while the third is the bastard child of VMS?
Try new ideas, prove them with a large audience and see what you can get from them.
One of these days someone will come up with the idea that replaces Unix. It hasn't happened yet, but the only way to be sure it never happens is to stop trying new ideas.
Don't you think it's a bit insightful that the two most sophisticated OSs widely available are either direct descendants of or largely inspired by AT&T's 1960s technology?
That these technologies were highly modular, focused on a simple core, emerged as an independent (and unauthorized) side project at AT&T, and were immediately adapted and extended by the academic and research communities?
The principal change in the descendant / inspired instances present today is the strong assurance that these projects' licenses are and will remain free, allowing independent entities, individuals, and organizations to try new ideas, prove them with a large audience, and see what they can get from them.
There will be an idea that replaces Unix. I don't know what it will be. However, it will be called Unix.
> Don't you think it's a bit insightful that the two most sophisticated OSs widely available are either direct descendants of or largely inspired by AT&T's 1960s technology?
It's meaningful, I agree, but I get the same feeling when I drive my car, whose internal combustion engine (though mine runs on ethanol) my grandfather would easily recognize and, quite possibly, be able to repair.
Technology is iterative and incremental, not revolutionary.
Most of what's happening on the Web right now (as cool as it is) is enabled by increases in compute power, speed, interconnects, bandwidth (rate/volume of access), and connectivity (persistence/ease of access), not unlocked by some vision that a prior age inherently lacked.
And when I say "a prior age", I'm not talking about the 90s, 80s, or 50s. Try the 19th century (Jeremy Bentham, Jules Verne). Or the 10th (1001 Arabian Nights has some cool stuff in it). Or the Roman, Greek, Biblical, or Egyptian eras.
Drag yourself off to an ancient history museum sometime. You'll find items there that you use daily, or at least are very, very familiar with, from literally thousands of years ago: tweezers, dice, mirrors, bracelets, Egyptian sand toilets (a large stone seat over a box in the ground).
"Old" isn't synonymous with "tired" in technology. Particularly for tools which are in continuous use and evolution, it means "tried, true, tested, and refined". There's a law of diminishing returns, and of asymptotic convergence on an ideal.
I'd love to see the ICE replaced by something cleaner, more energy efficient, and more sustainable. People have tried for nearly 150 years, and I've been a fan of several alternatives (alternate fuel, Wankel, gas turbine, steam, electric) at various times. With the exception of the Wankel, all of those were present at the time the ICE emerged.
We kept the four wheels, steering, and even the dashboard (you know where that term comes from, right?). The only thing replaced was the prime mover.
That's admittedly an oversimplification, but if you look at early cars, they were called ... what was it again? Oh yeah: horseless carriages.
Neat stuff: rack and pinion (you know, the mechanism we use for steering?). Leonardo da Vinci used it, though it dates from far earlier.
The real innovations for the car were incremental advances in metallurgy (allowing the creation of efficient engines) and machining, and the availability of cheap, plentiful, portable, and dense energy sources (petroleum).
Many of the advances of the 19th and 20th centuries are attributable to increased available energy, compounded and confounded with technological advances allowing better and more productive use of that energy.
"One of these days someone will come up with the idea that replaces Unix."
I believe it will rather build atop it, maybe changing it slightly. Sort of like how OOP and functional didn't replace structured/procedural programming: both paradigms still use procedures, and both changed them slightly to suit their ways.
It is happening right now, right in front of our eyes. We've got hardware. We've got the OS. The two building blocks. And now we add a third component, the hypervisor, in between, and thus there is virtualization, which is the foundation for the most fundamental change happening now (with "cloud" being just the most noticeable and most marketable [today] facet of it).
I'm surprised by the amount of negativity in this thread. I was initially going to say that writing a working, solid, free kernel was akin to climbing Everest in terms of human achievement. After the first guy does it, does it really diminish the others' contributions? If they do it using a micro-kernel, isn't that kind of like climbing the mountain without oxygen? But given the amount of community effort that is required it's really a lot bigger than the Everest analogy. Maybe it's more like space travel.
Regardless, I feel like these devs are working towards something worthwhile, even if only they benefit from the experience (as they imply in the article, they've already directly benefited from their free work, good for them!). Is there some sort of opportunity cost that I'm not aware of here? Could we say that they should be working on developing a driver for the latest Nvidia graphics card for the Linux kernel instead? Would more people benefit from that? In the next sixty months, probably; in the next sixty years, it's a lot harder to say...
What do you care what other programmers do with their time? The ideas in the Hurd are evidently interesting enough to those who work on it. That alone justifies its continued development.
This is disingenuous. Under this logic there's no such thing as a project that's more or less objectively interesting.
If 999 programmers decide to work on 999 basically identical Sudoku solvers, and 1 programmer decides to hack up a novel object recognition system with OpenCV, and all 1000 guys are equally keen on their own projects, we'll just have to all agree that all thousand projects are equally interesting and call it a day.
I care because it's wasted effort that would be better spent elsewhere.
The logic here is that your opinion on the value of the Hurd isn't objective and that there's no accounting for taste. The Hurd authors know the fruits of their labor far better than you do. If they don't consider their effort wasted and are happy with their results, what are you upset about? If programmers like working on things you don't like, that's their own business.
This is about people sitting on their asses disparaging work done by others in their free time as "a waste of time". If we started saying that you were wasting your time with your hobby, you would be right in telling us to fuck off.
"I care because it's wasted effort that would be better spent elsewhere."
This idea that programmers must somehow "donate their free time to the betterment of society" or whatever the hell you are suggesting is frankly quite offensive.
Getting pissy about it doesn't change the fact that the Hurd serves no useful purpose and doesn't have an interesting idea animating it, just a series of shop-worn ideas from 2 decades ago.
If I spent my off-hours building, say, a slightly different clone of Java that I'd started in 1996, you'd be entitled to tell me that I was wasting my time. In fact, this would be doing me a service. If I couldn't turn around and say, "well, Onanva is different from Java because of interesting features X, Y and Z" or "Onanva is about to take over from Java because I've got 100K users in beta and they're screaming for a commercial version" or whatever, then I'd have some serious introspection to do. What I wouldn't do is go into a screaming frenzy that someone who is "sitting on their ass" (whatever that means) is disparaging my precious, precious work and start running around with my dress over my head.
I can't tell from your profiles whether you or Mr P9 have ever been in an environment where you get asked tough questions, but any decent CS school will make you run the "why is this interesting/useful" gauntlet repeatedly, especially if you want to get a PhD or something silly like that. The bulk of the cruel, cruel people who ask you these terrible questions will do so from a seated position, and it will be incumbent on you to harden up and supply answers. Not every hobbyist has to deal with this sort of thing, but not everyone outside industry or academic research is incapable of doing so; see also Young Torvalds vs. Tanenbaum for a less pathetic approach to criticism.
Programmers are free to do whatever they want; I doubt that a better OS will have much to do with the "betterment of society" in any case (you're not sure what I'm suggesting, but you're sure it's "quite offensive"). IMO they should work on projects that are objectively interesting. It's ok to disagree on what that might be, but no one has been able to furnish a reason why building a UNIX clone on top of Mach (again, after cycling through several other underlying microkernels) is in fact interesting in 2011.
Most people watch TV or play video games in their free time. These people choose to work on the Hurd.
What you are doing is simply being an asshole.
"IMO they should work on projects that are objectively interesting."
I rest my case. If you have a better idea of what they should be doing, pay them to do it. Don't tell anyone what they should or should not do otherwise.
Furthermore, my background, p9idf's background, and the PhD process all have fuck all to do with what other people do with their own free time.
Leaving aside your frenzied name-calling, you are unclear on the concept of criticism. It is possible to make and even promulgate value judgements on the work of other people; further, to suggest that some ways of spending your free time are objectively more useful or interesting than others. This applies even if you don't pony up the cash to fund alternative works.
As a rule academics, writers, artists and critics (for example) have all felt reasonably empowered to say all sorts of things about each other - often in language that would make my 'why is this interesting' stuff seem positively tame - without giving undue weight to the fact that the target of the criticism could be otherwise spending time watching soccer or masturbating. When I got my ass rightfully handed to me for trying to get a mediocre paper into SIGMETRICS a few years back, not one reviewer made positive mention of the fact that at least I was writing lousy papers as opposed to, say, punching people on the street or making a color-sorted collection of my nose pickings.
Casual internet forum judgements are worth very little, and one can ignore them as one sees fit. I don't remember the bit where I proposed making further research on the Hurd illegal or running them out of town on a rail. One might even muster up a coherent argument against such judgements that doesn't depend on the All Important Right of Enthusiasts to Exist in a Plane Above All Comment on the Relevance of their Work.
> I care because it's wasted effort that would be better spent elsewhere.
Too bad it's not your effort that's being used. If you want things you find interesting (and seemingly no one else does), you've got to be prepared to do them yourself.
I particularly would like to see operating systems implemented using JIT compilation and software isolation. Managed code running safely in ring 0 of the processor.
Microsoft Singularity started this, but I'd like to see open source people trying something like that! I really think we need processor/architecture independence if we really want to achieve freedom.
Sadly, neither JNode (Java) nor House (Haskell) look very active. I keep thinking I should start something though I'm sure there are hundreds of embryonic kernels out there already.
> I particularly would like to see operating systems implemented using JIT compilation and software isolation. Managed code running safely in ring 0 of the processor.
Lisp Machines were arguably like this, except with microcode instead of JIT. AS/400 (IBM midrange systems, now called System i) does something like this as well, except the whole program is compiled from bytecode to machine code on first run, or whenever the bytecode on disk is newer than the machine code on disk.
The biggest change is that you have either a trusted compiler, a garbage collector in the kernel, or both.
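The AS/400 behavior described above amounts to a freshness check between two on-disk artifacts. A minimal sketch in Python, where the file names and the `translate` step are invented stand-ins for the system's actual bytecode-to-native translator:

```python
import os

def machine_code_is_stale(bytecode_path, native_path):
    """Return True if the native translation must be (re)generated:
    either it doesn't exist yet, or the bytecode on disk is newer."""
    if not os.path.exists(native_path):
        return True
    return os.path.getmtime(bytecode_path) > os.path.getmtime(native_path)

def load_program(bytecode_path, native_path, translate):
    # Translate bytecode to machine code on first run, or whenever the
    # bytecode on disk is newer than the cached machine code; otherwise
    # reuse the existing native artifact as-is.
    if machine_code_is_stale(bytecode_path, native_path):
        translate(bytecode_path, native_path)
    return native_path
```

The point of the design is that the bytecode is the durable, portable artifact; the machine code is a disposable cache that the system regenerates whenever it falls out of date (e.g. after restoring a program onto newer hardware).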
We used to joke about governmental 'Stop Grants' at our startup to match the 'Start Grants'. The idea would be that they would pay you to "please, just go away".
Interesting! If they can really get it ticking this time, it would be fascinating to see how a microkernel plays out in 'popular' use.
It's sort of funny how the major kernels are primarily monolithic (please correct me if I'm wrong wrt recent versions of Windows), but academic research says microkernels are better. Worse is better? First-mover advantage?
I'm not an expert in this but I think the issue is one of tradeoffs. I think I remember reading (someone correct me if I'm wrong) that broadly generalized, microkernels have better security at the expense of performance, vice versa for macrokernels. See the "Tanenbaum–Torvalds debate" on Wikipedia.
Either way, I'm interested to see if the Hurd will ever take off in any real sense. For example, could we see an Ubuntu/Hurd mix in 5 or 10 years? Will it even matter on that kind of timeframe? Would there ever be any practical advantage to using the Hurd vs Linux besides more "freedoms"?
> I think I remember reading (someone correct me if I'm wrong) that broadly generalized, microkernels have better security at the expense of performance, vice versa for macrokernels.
That's true, but that "security" is not really the kind we care about in the post-Google, massively distributed era.
Microkernels made sense back when the key to uptime was hot-swappable components. E.g., if your NIC goes awry due to a hardware problem, it can crash its driver. With a monolithic kernel, this would in turn crash the OS, whereas with a microkernel the rest of the system should continue humming along just fine. This makes it possible to replace the failed component without causing downtime by shutting down the single node the service runs on.
But nowadays we know how to set up systems so that taking down an entire node (gasp!) won't harm the operation of the service as a whole, so I don't think microkernels are that relevant anymore.
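The crash-containment story above is essentially supervision: restart the failed component instead of dying with it. Here's a toy in-process Python sketch of that idea; in a real microkernel the driver lives in its own address space and the restart happens via kernel IPC, so everything here (`DriverCrash`, `make_flaky_driver`) is an invented stand-in for illustration only:

```python
class DriverCrash(Exception):
    """Stand-in for a user-space driver dying on a hardware fault."""

def supervise(driver, max_restarts=3):
    """Keep a 'driver' running: if it crashes, respawn only that
    component instead of taking the whole system down."""
    restarts = 0
    while True:
        try:
            return driver(), restarts
        except DriverCrash:
            if restarts >= max_restarts:
                raise  # give up: the hardware is really gone
            restarts += 1  # restart the failed component only

def make_flaky_driver(failures):
    """Build a fake NIC driver that crashes `failures` times first."""
    state = {"left": failures}
    def driver():
        if state["left"] > 0:
            state["left"] -= 1
            raise DriverCrash("nic went awry")
        return "packets flowing"
    return driver

if __name__ == "__main__":
    result, restarts = supervise(make_flaky_driver(2))
    print(result, restarts)  # the 'system' survived two driver crashes
```

A monolithic kernel has no equivalent of the `except DriverCrash` boundary: the driver's fault is the kernel's fault.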
IIRC Tru64 (née OSF/1, later Digital UNIX) was a mainstream Unix based on the Mach microkernel.
The problem with microkernels is that their advantages (a component of the kernel crashing won't take down the box) have rarely been worth the disadvantages (all that message passing hurts performance). So "better" depends on your POV.
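The message-passing cost can be illustrated with a toy Python analogy: the same "filesystem read" performed as a direct function call (monolithic) versus as a request/reply round trip to a server thread (microkernel-style), with queues standing in for kernel IPC ports. The names and the trivial workload are invented for illustration; real kernel IPC costs come from context switches and copies, which this only loosely mimics:

```python
import queue
import threading
import time

def fs_read(block):
    # The 'filesystem service': trivial work standing in for a syscall.
    return block * 2

def monolithic_call(n):
    # Monolithic kernel: the service is a plain function call in the
    # same address space.
    return [fs_read(i) for i in range(n)]

def microkernel_call(n):
    # Microkernel style: every request is a message to a server 'task'
    # and a reply back. Queues stand in for IPC ports.
    requests, replies = queue.Queue(), queue.Queue()

    def server():
        while True:
            block = requests.get()
            if block is None:
                return
            replies.put(fs_read(block))

    t = threading.Thread(target=server)
    t.start()
    out = []
    for i in range(n):
        requests.put(i)            # one hop for the request...
        out.append(replies.get())  # ...and one for the reply
    requests.put(None)
    t.join()
    return out

if __name__ == "__main__":
    n = 10000
    t0 = time.perf_counter(); a = monolithic_call(n); t1 = time.perf_counter()
    b = microkernel_call(n);  t2 = time.perf_counter()
    assert a == b  # same answers, very different cost
    print(f"direct: {t1 - t0:.4f}s, message passing: {t2 - t1:.4f}s")
```

Both paths return identical results; the message-passing path just pays per-request synchronization overhead, which is the tradeoff the comment describes.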
The BSD-style kernel "subsystem" of XNU, Darwin's kernel, is statically linked with Mach and executes in privileged mode. So, while it "has" Mach and some parts of the kernel interact with other parts via Mach port interfaces, XNU isn't really a microkernel as classically defined anymore since it provides all the facilities you'd expect of a non-micro kernel.
It's been a while since I have kept up with the kernel space, but I remember that when I was last interested in it there was a trend toward hybridization: Linux was adopting some micro-like architectures, as was Windows. OS X probably could be considered a hybrid from the start.
Actually, I believe that's incorrect: stock Mach 2.x was considered a microkernel (though, like Mach 3.x, not always used as one in practice), and versions of XNU since the first client release of Mac OS X have been built on a modified Mach 3.0 kernel. But you're right that it's not used in a microkernel-like way.
You're partially right about Windows. Windows NT has, since the beginning IIRC, used a hybrid kernel in which things like drivers and IPC are still in the kernel, but the application subsystems and file servers run in user space. Plan 9 has a very similar design.
Would've been awesome in 1980 on a timesharing system. Now I've got as much control of all my personal devices as I want and I don't want any more control of devices at work than what I need to do my job (with control comes responsibility).
For a brief shining moment, it was open source. Its legacy is very interesting, worth a read to see just how far the little OS that could has come since it first ran on PDAs. In fact, its atypical architecture gives Symbian some advantages over contemporary OSes in terms of absurdly low power and CPU requirements. It is also quite secure, offering granular security settings for almost anything; the only Symbian malware that ever made the news was malware the user installed him/herself. Sadly this also gives it a very complex programming model (for example, it uses an idiosyncratic dialect of C++; also, many basic services like audio need to be accessed through a server running on the device) and a difficult threading model. Qt provides a beautiful API that wraps the native layer these days, but it seems it's simply not fashionable among the majority of developers.
RMS has never really liked the fact that Linux was not under direct GNU control. I am sure the rift is greater after the rejection of GPL 3 by the Linux camp. RMS has a vision for how things should be and tries to align the GNU offerings to that vision. Some agree with it, some kind of agree with it, and some think he is a fruitcake. Understanding his vision puts the push for Hurd into perspective. He cannot achieve it with Linux because Linux is steered by stewards that don't exactly line up with his world view.
I rather like the fact that there is one branch of kernel dev that favors practicality, expediency, and commercial viability in their worldview and another that favors a strident free software philosophy. The world is a better place for having people working on both paths.
I agree, I think there is a place in the world for each philosophical world view. I personally don't like absolutism with the exception of possibly the ones found in pure science.
GNU doesn't really seem to rule their associated projects with an iron fist, so this doesn't stack up. glibc, gcc and gnome all seem to be considerably self-determining.
My summation was not intended to imply that there is a GNU command structure over each project, but rather that there is philosophical agreement among the confederation of projects and an umbrella organization. The stewards of Linux are not in total philosophical agreement with GNU, so the GNU organization probably feels they need their own project that aligns with their philosophical world view 100%.
They have to compare to something someone knows about. If you see a comparison of Minix3 and Genode/L4 with Linux, you can determine what the comparison between Hurd and them would be.
Hurd is still using Mach? I don't see the point if it's just going to cheat around the IPC problem like OS X did.
And in the past decade, there have been more promising microkernels, like L4. Unless I missed something and there's still a good reason for Hurd to still be using Mach.
There were attempts to use L4 and several other microkernels, like Coyotos and Viengoos, but as far as I know all of them stalled, and the Mach variant has been the only one continued to the present day.