
Rob Pike Responds 284

Posted by Roblimo
from the straight-talk-express dept.
He starts by clearing up my error in saying he was a Unix co-creator in the original Call For Questions. From there he goes on to answer your questions both completely and lucidly. A refreshing change from the politicians and executives we've talked to so much recently, no doubt about it.


Pike:
First, let me clear up a misstatement. I am not a co-creator of Unix. I suppose I am described that way because I am co-author (with Brian Kernighan) of a book about Unix, but neither Brian nor I would want to take credit for creating Unix. Ken Thompson and Dennis Ritchie created Unix and deserve all the credit, and more. I joined their group - the Computing Science Research Center of Bell Labs - after 7th Edition Unix had come out.

1) Innovation and patents - by Zocalo
With so many of your ideas being used with such ubiquity in modern operating systems, what is your stance on the issue of patenting of software and other "intellectual property" concepts? Assuming that business isn't going to let IP patents go away as they strive to build patent stockpiles reminiscent of the nuclear arms buildup during the cold war, how would you like to see the issue resolved?


Pike:
Comparing patents to nuclear weapons is a bit extreme.

2) Systems research - by asyncster
In your paper, "Systems Software Research is Irrelevant," you claim that there is little room for innovation in systems programming, and that all energy is devoted to supporting existing standards. Do you still feel this way now that you're working at Google?

Pike:
I was very careful to define my terms in that talk (it was never a paper). I was speaking primarily about operating systems and most of what I said then (early 2000) is still true.

Here at Google the issues are quite different. The scale of the problem we're trying to solve is so vast there are always challenges. I find it interesting that the slide in that talk about 'Things to Build' is a close match to the stuff we're doing at Google, if you squint a bit. To summarize:

GUI: Google put the cleanest, prettiest UI on the internet and work continues to find new ways to present data and make it easy to explore.

Component architectures: We use a number of big (BIG!) piece parts like the Google File System (GFS) and MapReduce (see the paper by Jeff Dean and Sanjay Ghemawat in the upcoming OSDI http://labs.google.com/papers/mapreduce.html) to build massive engines for processing data. Using those pieces we can harness zillions of machines with a few keystrokes to attack a problem like indexing the entire internet. (OK, it's not quite that easy, but it's still amazing.) I have a daily test job I run to monitor the health of one of the systems I'm developing; it uses a week of CPU time but runs for only a few minutes of real time.
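The programming model behind MapReduce can be sketched in a few lines. This is a toy, single-process illustration of the idea (map each record to key/value pairs, group by key, reduce each group), not Google's distributed implementation; the function names here are invented for illustration.

```python
from collections import defaultdict

def mapreduce(records, map_fn, reduce_fn):
    """Toy single-process MapReduce: map each record to (key, value)
    pairs, group the values by key, then reduce each group."""
    groups = defaultdict(list)
    for record in records:
        for key, value in map_fn(record):
            groups[key].append(value)
    return {key: reduce_fn(key, values) for key, values in groups.items()}

# The classic word-count example: count occurrences of each word.
docs = ["the quick brown fox", "the lazy dog"]
counts = mapreduce(
    docs,
    map_fn=lambda doc: [(word, 1) for word in doc.split()],
    reduce_fn=lambda word, ones: sum(ones),
)
print(counts["the"])  # → 2
```

The real system's contribution is everything this sketch omits: partitioning the map output across thousands of machines, re-executing tasks that fail, and streaming intermediate data through GFS.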

Languages for distributed computing: I'm part of a team working on something along those lines that we hope to write up soon.

Bringing data to the user instead of the other way around: Those damn browsers are still in the way, but other ways of connecting to data are starting to appear, things like the Google API. However, the surface is barely scratched on this topic.

System administration: Google's production people are phenomenal at keeping all those machines humming and ready for your queries. They demonstrated that there was real progress to be made in the field of system administration, and they continue to push forward.

3) Back in The Day - by Greyfox
Were programmers treated as hot-pluggable resources as they are today? There seems to be a mystique to the programmer prior to about 1995.

From reading the various netnews posts and recollections of older programmers, it seems like the programmer back then was viewed as something of a wizard without whom all the computers he was responsible for would immediately collapse. Has anything really changed or was it the same back then as it is now? I'm wondering how much of what I've read is simply nostalgia.


Pike:
Isn't it just that today there are a lot more computers, a lot more programmers, and most people are familiar with what computers and programmers do? I'm not sure I understand your reference to 1995, but twenty or thirty years ago, computers were big expensive temples of modernity and anyone who could control their power was almost by definition a wizard. Today, even musicians can use computers (hi gary).

4) What are you doing... - by Mark Wilkinson
Google employees are apparently allowed to work on their own projects 20% of the time. Given that you probably can't comment on what you're doing for Google, what are you doing to fill the other 20%?


Pike:
One of the most interesting projects out there, one I am peripherally (but only peripherally) involved with, is the Large Synoptic Survey Telescope http://www.lsst.org, which will scan the visible sky to very high angular precision, in multiple colors, many times a year. It's got an 8.4 meter aperture and 10 square degree field, taking an image every 20 seconds with its 3 gigapixel (sic) camera. The resulting data set will be many petabytes of image and catalog data, a data miner's dream. The software for the telescope is as big a challenge as the instrument itself; just the real-time pixel pipeline on the mountain will make today's compute clusters look wimpy.

5) Database filesystems - by defile
The buzz around filesystems research nowadays is making the UNIX filesystem more database-ish. The buzz around database research nowadays is making the relational database more OOP-ish.

This research to me sounds like the original designers growing tired of the limitations of their "creations" now that they're commodities and going back to the drawing board to "do things right this time". I predict the reinvented versions will never catch on because they'll be too complex and inaccessible.

Of course, this second system syndrome isn't just limited to systems. It happens to bands, directors, probably in every creative art.

I think what we've got in the modern filesystem and RDBMS is about as good as it gets and we should move on. What do you think?


Pike:
This is not the first time databases and file systems have collided, merged, argued, and split up, and it won't be the last. The specifics of whether you have a file system or a database is a rather dull semantic dispute, a contest to see who's got the best technology, rigged in a way that neither side wins. Well, as with most technologies, the solution depends on the problem; there is no single right answer.

What's really interesting is how you think about accessing your data. File systems and databases provide different ways of organizing data to help find structure and meaning in what you've stored, but they're not the only approaches possible. Moreover, the structure they provide is really for one purpose: to simplify accessing it. Once you realize it's the access, not the structure, that matters, the whole debate changes character.

One of the big insights in the last few years, through work by the internet search engines but also tools like Udi Manber's glimpse, is that data with no meaningful structure can still be very powerful if the tools to help you search the data are good. In fact, structure can be bad if the structure you have doesn't fit the problem you're trying to solve today, regardless of how well it fit the problem you were solving yesterday. So I don't much care any more how my data is stored; what matters is how to retrieve the relevant pieces when I need them.

Grep was the definitive Unix tool early on; now we have tools that could be characterized as `grep my machine' and `grep the Internet'. GMail, Google's mail product, takes that idea and applies it to mail: don't bother organizing your mail messages; just put them away for searching later. It's quite liberating if you can let go your old file-and-folder-oriented mentality. Expect more liberation as searching replaces structure as the way to handle data.
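The point that good search tools can substitute for up-front structure can be illustrated with a minimal inverted index. This is a hypothetical sketch of the general technique, nothing like the actual internals of glimpse or Google's indexer; the data and names are invented.

```python
import re
from collections import defaultdict

def build_index(docs):
    """Map each word to the set of document ids that contain it."""
    index = defaultdict(set)
    for doc_id, text in enumerate(docs):
        for word in re.findall(r"[a-z]+", text.lower()):
            index[word].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every word of the query."""
    words = re.findall(r"[a-z]+", query.lower())
    if not words:
        return set()
    result = set(index.get(words[0], set()))
    for word in words[1:]:
        result &= index.get(word, set())
    return result

# Unstructured mail: no folders, no labels, just text to search.
mail = [
    "lunch on friday?",
    "patch for the kernel scheduler",
    "kernel panic on friday build",
]
index = build_index(mail)
print(sorted(search(index, "kernel friday")))  # → [2]
```

The messages carry no organization at all, yet the query narrows three of them to the one relevant hit; that is the "just put them away for searching later" model in miniature.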

6) Thoughts on Bell Labs - by geeber
Plan 9, Unix and so many other great things came out of Bell Labs. Since the crash of the internet bubble, telecom companies have suffered immensely. One of the results of this is that Lucent has systematically dismantled one of the world's greatest industrial research facilities. You spent a great part of your career at Bell Labs. What are your thoughts about the history and future (if any) of Bell Labs, and how did the culture of the Labs influence the growth of Unix?


Pike:
It's unfair to say `systematically dismantled', as though it was a deliberate process and there's nothing left. A more honest assessment might be that changes in the market and in government regulation made it harder to keep a freewheeling research lab thriving at the scale of the old Bell Labs. Bell Labs Research is much smaller these days, but there are still some very bright people working there and they're doing great stuff. I hope one day to see Bell Labs restored to its former glory, but the world has changed enough that that may never happen.

I could go on for pages about the old Bell Labs culture, but I must be brief. When I arrived, in 1980, the Computing Science Research Center, also known as 127 (later 1127; therein lies a tale) had recently launched 7th Edition Unix and the Center, after a long period of essentially zero growth, was just entering a period of rapid expansion. That expansion brought in a lot of new people with new ideas. I was a graphics guy then, and I hooked up with Bart Locanthi, another graphics guy, and we brought graphics to Research Unix with the Blit. Other folks brought in new languages, novel hardware, networking; all kinds of stuff. That period in the early 80s generated a lot of ideas that influenced Unix both within the Labs and in the outside community. I believe the fact that the Center was growing was a big part of its success. The growth not only provided new ideas, it also generated a kind of enthusiasm that doesn't exist in the steady state or in a shrinking group. Universities harness a variant of that energy with the continuous flow of graduate students; in industrial research you need to create it in other ways.

One odd detail that I think was vital to how the group functioned was a result of the first Unix being run on a clunky minicomputer with terminals in the machine room. People working on the system congregated in the room - to use the computer, you pretty much had to be there. (This idea didn't seem odd back then; it was a natural evolution of the old hour-at-a-time way of booking machines like the IBM 7090.) The folks liked working that way, so when the machine was moved to a different room from the terminals, even when it was possible to connect from your private office, there was still a `Unix room' with a bunch of terminals where people would congregate, code, design, and just hang out. (The coffee machine was there too.) The Unix room still exists, and it may be the greatest cultural reason for the success of Unix as a technology. More groups could profit from its lesson, but it's really hard to add a Unix-room-like space to an existing organization. You need the culture to encourage people not to hide in their offices, you need a way of using systems that makes a public machine a viable place to work - typically by storing the data somewhere other than the 'desktop' - and you need people like Ken and Dennis (and Brian Kernighan and Doug McIlroy and Mike Lesk and Stu Feldman and Greg Chesson and ...) hanging out in the room, but if you can make it work, it's magical.

When I first started at the Labs, I spent most of my time in the Unix room. The buzz was palpable; the education unparalleled.

(And speaking of Doug, he's the unsung hero of Unix. He was manager of the group that produced it and a huge creative force in the group, but he's almost unknown in the Unix community. He invented a couple of things you might have heard of: pipes and - get this - macros. Well, someone had to do it and that someone was Doug. As Ken once said when we were talking one day in the Unix room, "There's no one smarter than Doug.")

7) Languages - by btlzu2

Hello!

Maybe this is an overly-asked question, but I still often ponder it. Does object-oriented design negate or diminish the future prospects of Unix's continuing popularity?

I've developed in C (which I still love), but lately, I've been doing a lot of purely object-oriented development in Java. Using things like delegation and reusable classes has made life so much easier in many respects. Since the *nixes are so dependent upon C, I was wondering what future you see in C combined with Unix. Like I said, I love C and still enjoy developing in Unix, but there has to be a point where you build on your progress and the object-oriented languages, in my opinion, seem to be doing that.

Thank you for all your contributions!!!


Pike:
The future does indeed seem to have an OO hue. It may have bearing on Unix, but I doubt it; Unix in all its variants has become so important as the operating system of the internet that whatever the Java applications and desktop dances may lead to, Unix will still be pushing the packets around for quite a while.

On a related topic, let me say that I'm not much of a fan of object-oriented design. I've seen some beautiful stuff done with OO, and I've even done some OO stuff myself, but it's just one way to approach a problem. For some problems, it's an ideal way; for others, it's not such a good fit.

Here's an analogy. If you want to make some physical artifact, you might decide to build it purely in wood because you like the way the grain of the wood adds to the beauty of the object. In fact many of the most beautiful things in the world are made of wood. But wood is not ideal for everything. No amount of beauty of the grain can make wood conduct electricity, or support a skyscraper, or absorb huge amounts of energy without breaking. Sometimes you need metal or plastic or synthetic materials; more often you need a wide range of materials to build something of lasting value. Don't let the fact that you love wood blind you to the problems wood has as a material, or to the possibilities offered by other materials.

The promoters of object-oriented design sometimes sound like master woodworkers waiting for the beauty of the physical block of wood to reveal itself before they begin to work. "Oh, look; if I turn the wood this way, the grain flows along the angle of the seat at just the right angle, see?" Great, nice chair. But will you notice the grain when you're sitting on it? And what about next time? Sometimes the thing that needs to be made is not hiding in any block of wood.

OO is great for problems where an interface applies naturally to a wide range of types, not so good for managing polymorphism (the machinations to get collections into OO languages are astounding to watch and can be hellish to work with), and remarkably ill-suited for network computing. That's why I reserve the right to match the language to the problem, and even - often - to coordinate software written in several languages towards solving a single problem.
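The first half of that claim, an interface applying naturally to a wide range of types, is easy to show. A small sketch (the types and names here are invented for illustration):

```python
import io

def count_words(source):
    """Works on anything with a .read() method returning text:
    the caller depends on the interface, not the concrete type."""
    return len(source.read().split())

# Two unrelated types that both satisfy the tiny 'read' interface.
buf = io.StringIO("mind the gap")

class Constant:
    """A hypothetical type that always 'reads' the same text."""
    def read(self):
        return "hello hello"

print(count_words(buf))         # → 3
print(count_words(Constant()))  # → 2
```

One function, many types: when a problem really does have this shape, interface-oriented polymorphism is a natural fit, which is exactly the case Pike concedes.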

It's that last point - different languages for different subproblems - that sometimes seems lost to the OO crowd. In a typical working day I probably use a half dozen languages - C, C++, Java, Python, Awk, Shell - and many more little languages you don't usually even think of as languages - regular expressions, Makefiles, shell wildcards, arithmetic, logic, statistics, calculus - the list goes on.

Does object-oriented design have much to say to Unix? Sure, but no more than functions or concurrency or databases or pattern matching or little languages or....

Regardless of what I think, though, OO design is the way people are taught to think about computing these days. I guess that's OK - the work does seem to get done, after all - but I wish the view was a little broader.

8) One tool for one job? - by sczimme
Given the nature of current operating systems and applications, do you think the idea of "one tool doing one job well" has been abandoned? If so, do you think a return to this model would help bring some innovation back to software development?

(It's easier to toss a small, single-purpose app and start over than it is to toss a large, feature-laden app and start over.)


Pike:
Those days are dead and gone and the eulogy was delivered by Perl.

9) Emacs or Vi? - by Neil Blender

Pike:

Neither.

When I was a lad, I hacked up the 6th Edition ed with Tom Duff, Hugh Redelmeier, and David Tilbrook to resuscitate qed, the editor Ken Thompson wrote for CTSS that was the inspiration for the much slimmer ed. (Children must learn these things for themselves.) Dennis Ritchie has a nice history of qed at http://cm.bell-labs.com/cm/cs/who/dmr/qed.html.

I liked qed for one key reason: it was really good at editing a number of files simultaneously. Ed only handled one file at a time.

Ed and qed were command-driven line editors designed for printing terminals, not full-screen displays. After I got to Bell Labs, I tried out vi but it could only handle one file at a time, which I found too limiting. Then I tried emacs, which handled multiple files but much more clumsily than qed. But the thing that bothered me most about vi and emacs was that they gave you a two-dimensional display of your file but you had only a one-dimensional input device to talk to them. It was like giving directions with a map on the table, but being forced to say "up a little, right, no back down, right there, yes turn there that's the spot" instead of just putting your finger on the map.

(Today, emacs and vi support the mouse, but back in 1980 the versions I had access to had no support for mice. For that matter, there weren't really many mice yet.)

So as soon as the Blit started to work, it was time to write an editor that used the mouse as an input device. I used qed (mostly) and emacs (a little) to write the first draft of jim, a full-screen editor that showed you text you could point to with a mouse. Jim handled multiple files very smoothly, and was really easy to use, but it was not terribly powerful. (Similar editors had been built at Xerox PARC and other research labs but, well, children must learn these things for themselves.)

A few years later I took the basic input idea of jim and put a new ed-like command language underneath it and called it sam, a locally popular editor that still has its adherents today. To me, the proof of sam's success was that it was the first full screen editor Ken Thompson liked. (He's still using it.) Here's the SP&E paper about sam from 1987: http://plan9.bell-labs.com/sys/doc/sam/sam.pdf.

A few years later, I decided the pop-up menu model for commanding an editor with a mouse was too restrictive, so I started over and built the much more radical Acme, which I'm using to write these answers. Here's the Acme paper: http://plan9.bell-labs.com/sys/doc/acme/acme.pdf

I don't expect any Slashdot readers to switch editors after reading these papers (although the code is available for most major platforms), but I think it's worth reading about them to see that there are ways of editing - and working - that span a much larger gamut than is captured by the question, 'Emacs or vi?'

10) Biggest problem with Unix - by akaina
Recently on the Google Labs Aptitude Test there was a question: "What's broken with Unix? How would you fix it?"

What would you have put?


Pike:
Ken Thompson and I started Plan 9 as an answer to that question. The major things we saw wrong with Unix when we started talking about what would become Plan 9, back around 1985, all stemmed from the appearance of a network. As a stand-alone system, Unix was pretty good. But when you networked Unix machines together, you got a network of stand-alone systems instead of a seamless, integrated networked system. Instead of one big file system, one user community, one secure setup uniting your network of machines, you had a hodgepodge of workarounds to Unix's fundamental design decision that each machine is self-sufficient.

Nothing's really changed today. The workarounds have become smoother and some of the things we can do with networks of Unix machines are pretty impressive, but when ssh is the foundation of your security architecture, you know things aren't working as they should.

Looking at things from a lower altitude:

I didn't use Unix at all, really, from about 1990 until 2002, when I joined Google. (I worked entirely on Plan 9, which I still believe does a pretty good job of solving those fundamental problems.) I was surprised when I came back to Unix how many of even the little things that were annoying in 1990 continue to annoy today. In 1975, when the argument vector had to live in a 512-byte block, the 6th Edition system would often complain, 'arg list too long'. But today, when machines have gigabytes of memory, I still see that silly message far too often. The argument list is now limited somewhere north of 100K on the Linux machines I use at work, but come on people, dynamic memory allocation is a done deal!
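The limit behind 'arg list too long' is still easy to observe. A small sketch that queries it via POSIX sysconf (the exact number varies by system and kernel version; exceeding it makes the exec call fail with E2BIG):

```python
import os

# ARG_MAX caps the combined size of argv and the environment passed
# to exec; exceeding it is what produces 'arg list too long' (E2BIG).
arg_max = os.sysconf("SC_ARG_MAX")
print(arg_max)  # a fixed, kernel-imposed number, regardless of RAM
```

The classic workaround is xargs, which splits an oversized argument list into several exec calls that each fit under the limit, which is Pike's point: the application routes around a restriction the kernel could simply have lifted.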

I started keeping a list of these annoyances but it got too long and depressing so I just learned to live with them again. We really are using a 1970s era operating system well past its sell-by date. We get a lot done, and we have fun, but let's face it, the fundamental design of Unix is older than many of the readers of Slashdot, while lots of different, great ideas about computing and networks have been developed in the last 30 years. Using Unix is the computing equivalent of listening only to music by David Cassidy.

11) Re: Plan9 - by Spyffe

Rob,

Right now, there are a large number of research kernels. Plan 9, Inferno, AtheOS, Syllable, K42, Mach, L4, etc. all have their own ideas about the future of the kernel. But they all end up implementing a POSIX interface because the UNIX userland is the default.

The kernel space needs to be invigorated using a new userland that demands new and innovative functionality from the underlying system. Suppose you were to design a user environment for the next 30 years. What would the central abstractions be? What sort of applications would it support?


Pike:
At the risk of contradicting my last answer a little, let me ask you back: Does the kernel matter any more? I don't think it does. They're all the same at some level. I don't care nearly as much as I used to about what the kernel does; it's so easy to emulate your way back to a familiar state.

Applications - web browsers, MP3 players, games, all that jazz - and networks are where the action is today, and aside from irritating little incompatibilities, the kernel has become a commodity. Almost all the programs I care about can run above Windows, Unix, Plan 9, and on PCs, Macs, palmtops and more. And that, of course, is why these all have a POSIX interface: so they can support those applications.

And then there are the standard network protocols to glue things together. It's all a uniform sea of interoperability (and bugs).

I think the future lies in new hardware as much as in new software. A generation from now machines will be so much more portable than they are now, so much more powerful, so much more interactive that we haven't begun to think about the changes they will bring. This may be the biggest threat to Microsoft: the PC, the desktop, the laptop, will all go the way of the slide rule. As one example, when flexible organic semiconductor displays roll out in a few years, the transformation in how and where people use computers and other devices will be amazing. It's going to be a wild ride.

===============
  • Damn. (Score:3, Informative)

    by DAldredge (2353) <SlashdotEmail@GMail.Com> on Monday October 18, 2004 @12:05PM (#10556773) Journal
    Well that was a complete and total ignoring of the intent of the patent question on the basis of not agreeing with a minor portion of the question.

    Is he running for office?
    • Re:Damn. (Score:2, Insightful)

      by Anonymous Coward
      Maybe because he's getting tired of this issue? Maybe he wants to focus on actual code instead of politics?
      • by FreeUser (11483) on Monday October 18, 2004 @12:31PM (#10556961)
        Maybe because he's getting tired of this issue? Maybe he wants to focus on actual code instead of politics?

        And how, pray tell, is he going to do that when all but the most trivial code runs afoul of patents and is vulnerable to litigation? (According to many analysts, this is already the case.)

        Refusing to answer the question and using disagreement with the analogy used by the questioner as cover is an exceedingly political answer (and a tried and true method of dodging uncomfortable questions used by virtually every political candidate for office in recent years, as alluded to by the "is he running for office" comment) ... dismissing the issue on such a weak pretense clearly amounts to taking sides on the issue, namely the side of the status quo, i.e. pro software patents.

        Hardly a non-political stance, merely a disingenuous one.
        • There are a couple of important differences.

          For one, it is completely obvious to any reader that Mr Pike didn't answer the question. He dismissed the question while making a valid statement. Hardly as bad as, say, Bill Shatner's /. interview [slashdot.org], even. I can't think of a good example of a politician dodging a question based on disagreement with the posed analogy in recent history off the top of my head. But - consider Bush's recent response to the question posed to him in the third debate on the minimum wa

        • by HyperbolicParabaloid (220184) on Monday October 18, 2004 @02:31PM (#10558038) Journal
          Maybe he considers arms control an issue more significant by many orders of magnitude than patents. A reasonable person might think that even discussing nuclear weapons and IP in the same sentence trivializes the significance that the arms race played in our lives for decades.
          Are software patents important? Yes. Do they threaten the very survival of our species? No.
           Patents could be as significant as arms control, if they could be used to solve the arms control problem. Just grant the MPAA non-expiring patents on nuclear bomb technology, and let them go after the terrorists and rogue nations! I bet they'd be a lot more effective than Bush.

            -Don

    • by Black-Man (198831) on Monday October 18, 2004 @12:19PM (#10556875)
      C'mon... comparing corporate IP/Patents to the nuclear arms race? That kind of flawed reasoning works on slashdot, but not with anyone out in the real world.

      He gave an appropriate response to a STUPID analogy.

      • by Anoraknid the Sartor (9334) on Monday October 18, 2004 @12:34PM (#10556980) Homepage
        I took the point to be that some companies may feel they have to build up a patent portfolio merely so that they have something to wave at another company that attacks them with THEIR patent portfolio.

        A kind of "mutually assured destruction" stance...

        As such, the analogy with the reasoning that lay behind the nuclear arms race seems quite apt.

        The dismissal of the question does rather suggest that the speaker did not want to address the point at issue.
      • I don't believe it's a stupid analogy. It seems that for the most part, companies patent software for self protection or Mutually Assured Destruction. "If you sue me, I'll sue you right back."

        In terms of comparing it to a nuclear arms race, I think the analogy hit the nail on the head.

        Now, just acquiring patents to put the clamps down on innovation is what the original poster was probably referring to and that doesn't really apply to nuclear arms races. Maybe it's more like being a rancher [from the movi
      • by Zocalo (252965) on Monday October 18, 2004 @12:47PM (#10557094) Homepage
        Well, as the guy who asked the question, I'm not so sure that the analogy to MAD (or Mutually Assured Destruction for those too young to remember) is all that extreme. Comparing an IP patent to an ICBM, yes, but the analogy was to the arms race, not the devastation that each causes. How many times have we seen a patent lawsuit dissolve into a cross licensing deal with a mere token financial settlement, if any? Patents are not being used to protect the inventor's rights, they are being used to deter potential IP lawsuits: you sue us, and we'll sue you right back with *our* IP portfolio...

        The fact of the matter is that software patents are not going to go away, something that I touched upon in the original question. Aside from that, their main use, so far at least, seems to be either for dying companies to leech some more existence from a more successful one, or to browbeat a smaller competitor out of competition through the threat of legal costs they cannot sustain. Whether you think that is equivalent to the intent of a patent (essentially granting the inventor a reward for their efforts, no matter how stupid or obvious that invention might seem) is another matter.

        Patents in general, and software patents in particular, are undeniably a big issue in the IT world at the moment. That Rob Pike dismissed the entire question out of hand leaves me with two more possible conclusions to add to yours: He's pro-IP patents, but is afraid to admit to it on Slashdot, although to be fair he *would* get savaged in the comments. Alternatively, he is anti-IP patents, but is afraid to admit it where his employers might see - which would say a lot about his employers if that is the case.

        • >I'm not so sure that the analogy to MAD ... is all that extreme.

          But it's a leading question. You put patents in the light of "pure evil" and so come off as having your own agenda and not an honestly-interested-in-the-answer question.

          It's like bringing up Nazis in a conversation.
        • That Rob Pike dismissed the entire question out of hand leaves me with two more possible conclusions to yours: He's pro-IP patents...

          That is indeed how I took it. Comparing them to nuclear weapons is extreme to him because he doesn't see them as a threat. I think he IS answering the question in a short, yet distinct, manner. I doubt that he has any great fear of being flamed by several thousand geeks on a message board though.
        • maybe he's neither pro- nor anti- IP patents and dismissed it as a loaded question. why should he be required to answer an obviously loaded question? you're obviously extremely anti-IP patents. do you know what'll happen in 10 years? no. it's possible they'll all be pointless. the nuclear arms race of the 70's-80's ended up with what? nothing. the soviet union eventually collapsed. sure it was big and scary, but in the end, nothing came of it.

          so now you have big and scary corporations sucking up p
          • as for saying that their "invention" might be stupid or obvious, it's perfectly within the boundaries set by the law to patent something that doesn't already exist or hasn't been thought up. so while the amazon 1-click patent might seem stupid or obvious, if it's so obvious, why hasn't anyone else used the idea?

             Actually, non-obviousness is a criterion for being granted a patent. If you try to patent something that is obvious to any reasonably competent professional in the field, it's supposed to be rejected.

      • Actually, the analogy isn't, in a business sense.

        The goal of Microsoft, IBM, etc. (and a few smaller players and non-entities) is partly to gain as much IP control as possible, to not only avoid having to be beholden to some other IP-holding company, but also to use it to control competition, if required.

        The end-game is different, but what ends up is a business equivalent of nuclear detente, where the major players have enough weapons to counter any attack in court.

        Nuclear detente would have been the bet
      • Pike:
        "The promoters of object-oriented design sometimes sound like master woodworkers waiting for the beauty of the physical block of wood to reveal itself before they begin to work."


        Comparing programming to woodworking is a bit extreme.
    • >Is he running for office?

      He does have a job, you know. He probably doesn't want to piss off his employer.
    • It was a perfectly valid response indicating that the patent question is not such a big deal. FWIW I agree with him - people get far too worked up about the issue.
    • Re:Damn. (Score:5, Insightful)

      by jd (1658) <imipak@yahoo.cEINSTEINom minus physicist> on Monday October 18, 2004 @12:26PM (#10556930) Homepage Journal
      I have to agree. Sure, I'll go along with one of the ACs who said that some things don't dignify a reply, but software patents are a major element in today's world.


      The one-click patent is a symptom of the problem, but the K5 debate on applying the logic of the DMCA to patents is a symptom of the attitudes. Attitudes won't change, just because we're tired of them. They'll change when those who ARE tired of them propose a workable, viable alternative that meets the needs of industry and inventors.


      It's obvious enough that he knows that walking the walk is important - that's one reason he developed Plan 9! Kevin Mitnick proved, very conclusively, that computer security and data integrity are vulnerable to the foolishness of mere mortals. Redesigning on this scale is more than just rewriting some code. The sort of redesign Plan 9 represents is about seeing what doesn't work, and replacing it with something that does.


      Attitudes are broken. They need patching or replacing. Keep them the same, and all the software fixes in the world won't secure a single computer.

    • Re:Damn. (Score:2, Insightful)

      by tjic (530860)
      It seemed quite clear to me that he was saying "the question was idiotic. I'm not going to call you an idiot, but I suggest that you reevaluate your axioms." Seems like a reasonable response to me.
    • Re:Damn. (Score:3, Insightful)

      by ProfKyne (149971)

      I wondered about this too; surely he doesn't think the OP was suggesting patents are comparable to nukes -- the question referenced the way in which large corps gobble down patents, often with no immediate designs to follow through on a business plan or implementation.

      Pike does, however, mention that he works at Google, so maybe he interpreted the question as a shot against his employer or was simply advised by Google's PR not to answer the question.

    • Re:Damn. (Score:3, Interesting)

      by Liselle (684663) *
      Roblimo should clean up the question by removing the silly analogy, and re-submit it to Rob Pike, since most of the discussion seems to be centered around his flippant answer. Include his reply in a Slashback. How's that sound?
    • GIGO (Score:4, Insightful)

      by Doc Ruby (173196) on Monday October 18, 2004 @02:23PM (#10557952) Homepage Journal
      He answered the question by pointing out that it's nonsensical, as posed. He could have answered any number of other questions that weren't asked. A better question is why the moderator picked that poorly constructed question, rather than any of the answerable ones that weren't asked.
  • Nice (Score:4, Funny)

    by Exmet Paff Daxx (535601) on Monday October 18, 2004 @12:07PM (#10556783) Homepage Journal
    " Comparing patents to nuclear weapons is a bit extreme. "

    Now there's a sidestep George Bush would approve of.
    • Re:Nice (Score:5, Insightful)

      by jd (1658) <imipak@yahoo.cEINSTEINom minus physicist> on Monday October 18, 2004 @12:29PM (#10556952) Homepage Journal
      I agree. A nuclear weapon can only be used to attack someone once, and the fallout is confined to a few thousand square miles.
    • by postbigbang (761081) on Monday October 18, 2004 @12:36PM (#10557001)
      There's no congruity between IP issues and nuclear weapon stockpiling. Nuclear weapons are mass destruction devices. IP protection confers certain rights under various jurisdictions. There might be very important issues for the questioner in IP, but the question was worded poorly and was presumed to foster a baited answer. The context was poorly set, and the answer put the question in the nebulous context by which it was asked. Good answer.
      • by Exmet Paff Daxx (535601) on Monday October 18, 2004 @12:44PM (#10557066) Homepage Journal
        Large companies are stockpiling software patents in exactly the same mutually-assured-destruction mindset anticipated in the cold war: If you sue me to death, I'll sue you to death. They even have the same peace treaties: I promise not to sue you with my patents if you promise the same. You could call them Patent Noproliferation Pacts.

        The fact that that question was sent to the interviewee meant that Slashdot's readers wanted to know his opinion of the patent system. He could have answered it in any manner he chose, but he chose to sidestep it instead because his employer (Google) believes in using patents aggressively in a mutually-assured-destruction way, even if it means the end of Linux. That is why he didn't answer, and your faux-objective pseudointellectual babble isn't fooling anyone.
        • "You could call them Patent Noproliferation Pacts."

          That's a bad analogy. A nonproliferation pact would mean patents in the hands of fewer companies (nukes in the hands of fewer countries). They would be agreements by companies (countries) that did not have patents not to apply for patents (build nukes). These are just agreements saying that we won't sue (nuke) you if you don't sue (nuke) us. The patents have already proliferated by that point. In fact, this encourages proliferation (my patent portfoli
        • Saying "Bzzt!" completely overwhelms anything intelligent you might have said on the matter. You want to give Pike a hard time for answering a question in a flippant manner and then you pull off that "Bzzt" crap?
      • Ok, I'll go along with it, with a minor correction.

        IP protection confers certain rights under various jurisdictions.

        IP protection is about taking rights from the people/society/the public. IP law has at its foundation the concept of communal ownership over ideas. So when you invent something, the Law of Nature is that everybody owns the invention. In order to encourage invention, the Law of Nature is briefly discarded by something called a "patent".

        I don't totally disagree with your post, I just take i

  • Huh? (Score:5, Funny)

    by Anonymous Coward on Monday October 18, 2004 @12:10PM (#10556809)
    "From there he goes on to answer your questions both completely and lucidly."

    Is he recovering from head trauma or something? It makes it sound like his next step is walking to the restroom without assistance...

  • I suppose (Score:4, Funny)

    by Anonymous Coward on Monday October 18, 2004 @12:10PM (#10556813)
    He didn't want to talk about the Year 2038 Bug [wikipedia.org]...

    I'm disappointed.
    • Re:I suppose (Score:3, Informative)

      by AriesGeek (593959)
      Nah. 64-bit platforms are catching on so fast it will be a non-issue long before 2038.

      But it was a fun fact to throw out when the whole why-too-kay bug was big.
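      For anyone who hasn't seen the arithmetic behind the 2038 bug, it's just a signed 32-bit time_t wrapping around; a quick sketch of the dates involved (Python, computed from the Unix epoch):

```python
import datetime

# time_t counts seconds since the Unix epoch; a signed 32-bit counter
# tops out at 2**31 - 1 and wraps to a large negative value one tick later.
INT32_MAX = 2**31 - 1
epoch = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)

last_good = epoch + datetime.timedelta(seconds=INT32_MAX)
print(last_good)  # 2038-01-19 03:14:07+00:00

# One second past the limit, a 32-bit time_t wraps around (two's complement):
wrapped = INT32_MAX + 1 - 2**32
print(epoch + datetime.timedelta(seconds=wrapped))  # 1901-12-13 20:45:52+00:00
```

      A 64-bit time_t pushes the same rollover billions of years out, which is why the parent's "non-issue" prediction is plausible.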
  • "Using Unix is the computing equivalent of listening only to music by David Cassidy"

    To continue the musical comparison: Windows is 15 different variations of the same mass-produced pop song, whose only existence is to make money for a company that already has a lot of money.

    I'll take David Cassidy, even if he has a CLI only.

  • by Thagg (9904) <thadbeier@gmail.com> on Monday October 18, 2004 @12:20PM (#10556879) Journal
    But I suppose it's not too surprising, considering the havoc that he and AT&T wreaked upon X for Pike's save-under patent.

    Save-under was/is a good idea, and so insanely simple it's hard to believe that a patent was granted -- much less wielded with such force. For youngsters (and as an oldster, perhaps my memory isn't quite perfect on this): some early machines had overlay planes for menus. You could draw the menu over the frame, then clear the overlay plane, without disturbing the contents of the window beneath. To do this on a bitmapped display without overlays, the idea was that you would screen-grab the image under where the menu would be, then paste it back when the menu disappeared.

    Pike defended AT&T's refusal to allow the X consortium to use save-under without royalty at the time.

    Thad
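    For the curious, the save-under technique really is only a few lines of code, which is part of why the patent rankled; a hypothetical sketch with a toy framebuffer (Python, all names made up for illustration):

```python
# Minimal sketch of "save-under": before drawing a menu, copy the pixels
# it will cover; on dismissal, paste them back. The framebuffer here is
# just a list of rows of ints (a toy stand-in for real video memory).

def grab(fb, x, y, w, h):
    """Copy the w x h region at (x, y) out of the framebuffer."""
    return [row[x:x + w] for row in fb[y:y + h]]

def paste(fb, x, y, saved):
    """Write a previously grabbed region back into the framebuffer."""
    for dy, row in enumerate(saved):
        fb[y + dy][x:x + len(row)] = row

def show_menu(fb, x, y, w, h, color):
    saved = grab(fb, x, y, w, h)      # save what the menu will cover
    for row in fb[y:y + h]:           # draw the (solid-color) menu
        row[x:x + w] = [color] * w
    return saved                      # caller restores with paste()

fb = [[0] * 8 for _ in range(8)]
fb[2][3] = 7                          # some window content
saved = show_menu(fb, 2, 1, 4, 4, color=9)
paste(fb, 2, 1, saved)                # menu dismissed: content intact
assert fb[2][3] == 7
```

    The overlay-plane hardware mentioned above achieved the same effect with no copying at all, which is why the software version felt like an obvious fallback rather than an invention.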
  • Is that sarcasm? (Score:4, Insightful)

    by Anonymous Coward on Monday October 18, 2004 @12:24PM (#10556918)
    Hmm...The summary claims:

    From there he goes on to answer your questions both completely and lucidly. A refreshing change from the politicians and executives we've talked to so much recently, no doubt about it.

    The actual interview says:

    Pike:
    Comparing patents to nuclear weapons is a bit extreme.


    Clearly the summary is being sarcastic...
  • by elid (672471) <eli.ipod@noSpaM.gmail.com> on Monday October 18, 2004 @12:25PM (#10556923)
    Recently on the Google Labs Aptitude Test there was a question: "What's broken with Unix? How would you fix it?"

    What would you have put?

    Nice answer given by Pike (and no, I'm not going to requote the whole thing), but good luck fitting it into the box here on the 'test.' [google.com] :-)

  • by DrSkwid (118965) on Monday October 18, 2004 @12:26PM (#10556931) Homepage Journal

    object-oriented design is the roman numerals of computing.

    -- Rob Pike

    and seeing as he mentioned perl :

    > To me perl is the triumph of utilitarianism.

    So are cockroaches. So is `sendmail'.

    -- jwz [http://groups.google.com/groups?selm=33F4D777.7BF84EA3%40netscape.com]

  • Emacs or vi (Score:5, Informative)

    by mithras the prophet (579978) on Monday October 18, 2004 @12:27PM (#10556939) Homepage Journal

    Great answer to that question: Neither, he wrote his own (twice!), and wrote papers about the products. That's a Unix power user, defined.


    • Re:Emacs or vi (Score:5, Interesting)

      by julesh (229690) on Monday October 18, 2004 @12:33PM (#10556971)
      Yeah, who'd have thought it -- the joke question got the most informative answer of them all, while the most serious one was just dismissed. :)

      (I'm reading the ACME paper now. Looks interesting.)

      Jules, who writes his own editors too. :)
      • Re:Emacs or vi (Score:3, Informative)

        by HyperChicken (794660)
        If you want to give Acme a try (I love it), you can do one of two things:

        A: Download Inferno [vitanuova.com]. It's a Virtual Machine-based operating system that runs on top of Linux, Mac OS X, Windows, and Plan 9 (to name a few). Acme is included. Free to download.

        Or B: plan9port [swtch.com]. It's a port of the Plan 9 libraries to UNIX, including Linux and BSD. Acme is included (screen shot under KDE [swtch.com]). Again, free to download.

        You should read the Plan 9 wiki entry on acme [bell-labs.com] before trying to use it.

        Enjoy!
    • by TheOtherChimeraTwin (697085) on Monday October 18, 2004 @01:27PM (#10557485)
      Don't you understand that by voting for Acme, you are throwing away your vote?!

      A vote for Acme (which would have otherwise certainly gone to Emacs), is like a vote for vi! It is a well known fact that vi supporters have been secretly throwing Acme parties around the world.
    • by harlows_monkeys (106428) on Monday October 18, 2004 @02:36PM (#10558073) Homepage
      Great answer to that question: Neither, he wrote his own (twice!), and wrote papers about the products

      Once upon a time, at Caltech High Energy Physics, where Rob Pike had worked before going to Bell, two programmers (me and Karl Heuer) were bitching about existing editors, each claiming he could do better. That night, it got to the "oh yeah...prove it!" stage, and both sat down to write editors. By morning, they were each using their respective editors on themselves. Norman Wilson at cithep had kept in contact with Pike, and told him of this spate of editor hacking. Note that this was well before Pike did his jim and sam stuff.

      Pike wrote back something like this: "writing a screen editor is fun and easy and makes them feel important. Tell them to work on something useful".

      We were quite amused when we found out that Rob went on to write editors, instead of sticking to "something useful"!

  • The Unix Room (Score:5, Insightful)

    by joelethan (782993) on Monday October 18, 2004 @12:37PM (#10557010) Journal
    I was particularly struck by the story of the Unix Room where all the Unix people hung out.

    These days, developers seem to have their accommodation organised by blind chance, or worse, corporate whim.

    Many of my colleagues left their 6-12 man offices to join a 70 desk open-plan floor. The six of us architects (yeah, right) were pretty miffed to be shunted into a 1980's room just for six with beige vinyl on the walls and phones straight out of Flash Gordon. Now, two months later, we appreciate the working community that is our office.

    Good call Mr. Pike: humans function well in small self-organising or randomly-organised groups of up to 8. I'll rue the day we have to move out.

    /JE

    • Re:The Unix Room (Score:4, Insightful)

      by nthomas (10354) on Monday October 18, 2004 @01:12PM (#10557351)
      I was particularly struck by the story of the Unix Room where all the Unix people hung out.

      Fascinating.

      I was at Columbia University last week for a meeting sponsored by the local ACM chapter [columbia.edu] and LXNY [lxny.org], the speaker was Stephen Bourne [eldoradoventures.com] (he who is sh [wikipedia.org]).

      At some point during his excellent talk on the history of Unix and his place in it, someone asked what he thought was the reason for the success of the operating system, and without hesitating, he talked about the room where all the terminals were located (he never specifically referred to it as the "Unix room" though) and how when you released software it was used immediately by those in the room and if something broke, you were called "idiot" (and probably worse) by your peers -- it was in your best interest to make sure you didn't put out junk as you really didn't have that dilution of responsibility that engineers have in a large corporation where the design team is in one wing of the building, the coders are in another, and the testers in yet another location, etc.

      It was a great speech, anyone who hasn't seen Dr. Bourne speak should do so, he is an excellent source of insight into the early years of Unix and software engineering in general. He is now working for a venture capital firm and roughly a third of his talk was spent talking about that, it's a testament to his great speaking skills that most of the people in the room didn't lose interest when he switched topics like that (I'm convinced that most hackers suffer from ADD).

      Thomas

    • Good call Mr. Pike: humans function well in small self-organising or randomly-organised groups of up to 8.

      Right there is the rub. My last manager had heard of this idea as well and stuffed 46 of us in a room together. Testers, developers, managers, even the client reps, all in one room. After 3.5 years they cancelled the project with virtually nothing to show for it. Of course, completely changing direction on the project no less than 5 times during that timeframe didn't help either. However, I do believe t
  • by MattW (97290) <matt@ender.com> on Monday October 18, 2004 @12:45PM (#10557067) Homepage

    Pike:
    Comparing patents to nuclear weapons is a bit extreme.


    No it isn't. The comparison is drawn often, because both large patent portfolios as well as large nuclear arms stockpiles create a situation of Mutually Assured Destruction. Once the nukes start flying, nobody wins. Likewise, once the lawyers start slinging patent lawsuits, only the lawyers win.

    So the answer may be, "I have no idea", but the comparison is legitimate.
    • One advantage of this comparison is that it stays analogous if you extend it. The big nuclear powers engaged in the arms race used the system not just to discourage others actually building nukes, but to try and keep smaller countries from growing economically enough to be capable of joining the nuclear club - the big corporations engaged in hoarding patents use the system both to keep smaller companies from growing up enough to join the patent club and from ever becoming the current corporation's new co
    • In both cases only the cockroaches win!
  • by graveyhead (210996) <fletch@fletchtroni c s . n et> on Monday October 18, 2004 @12:45PM (#10557076)
    I modded this question up in the question round because I wanted a real answer, damnit.

    I sincerely believe that "one tool for one job" isn't dead, the landscape has simply changed.

    Yesteryear, the only way software tools worked together was via stdin/out over the command line.

    Nowadays, we have brought the concept into application space through component architectures and IDLs (COM/XPCOM/JavaBeans to name 3). These new tools allow for that clean separation. Plug-ins or components are free to concentrate on doing one thing very well.

    The change, IMO, is a good one. Formalized interfaces are good, and components are better optimized than launching a whole separate process.
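    To make the comparison concrete, here is a rough sketch of "one tool for one job" composition inside a single process, with plain iterators standing in for a component interface (illustrative Python, not modeled on any particular component architecture):

```python
# Two "tools" as in-process components: each does one job, and they
# compose through a formal interface (iterators), much like stdin/stdout
# filters compose on a command line.

def grep(pattern, lines):
    """Pass through only the lines containing pattern (like grep)."""
    return (ln for ln in lines if pattern in ln)

def count(lines):
    """Count the lines (like wc -l)."""
    return sum(1 for _ in lines)

log = ["GET /index", "POST /login", "GET /about"]
# Equivalent of the pipeline: grep GET log | wc -l
print(count(grep("GET", log)))  # 2
```

    The pieces stay single-purpose; only the plumbing moved from pipes to in-process interfaces.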
    • by Anonymous Coward
      You got a real answer. He said that era is dead.

      You may not like his answer and may disagree, but that does not make his answer any less valid. He said that one tool for one job is dead. If you want a different answer then ask someone else. Usually if you ask enough people you can find one that agrees with you. Hey, when that happens you can then say that your theory is correct because X agrees with you and ignore the droves of others who disagree. Come on, all the kids are doing it. manipulate the data to
      • OK I have to agree with you.

        My first response was a bit over-emotional.

        I simply meant this: his answer seemed very glib to me. It would simply be nice if he had elaborated a bit :/
      • He didn't say why... (Score:4, Interesting)

        by argent (18001) <peter AT slashdo ... taronga DOT com> on Monday October 18, 2004 @03:05PM (#10558306) Homepage Journal
        Perl hardly refutes it, when in the previous question he gave a laundry list of tools and Perl wasn't on it... and Awk was.

        I really think he was evading the answer.

        The real answer is that you need a framework that lets you connect the tools together easily before you can use a software tools approach. For the command line era, that framework was the UNIX shell. For the GUI era, there really hasn't been a popular framework that's also portable. ARexx, Plan 9, AppleScript, these seem to be the best frameworks I've seen so far, and they're all isolated to ghettoes... we're still waiting for the GUI equivalent of the UNIX shell.
    • Containment (Score:2, Insightful)

      by tepples (727027)

      Formalized interfaces are good, and components are better optimized than launching a whole separate process.

      Not in all cases. It's often easier for a program to contain a misbehaving component if the component runs in a separate process. For instance, if a web browser plug-in segfaults, do you want it to destroy the data you've entered into a form on another page?

      • Re:Containment (Score:3, Interesting)

        by graveyhead (210996)

        Not in all cases. It's often easier for a program to contain a misbehaving component if the component runs in a separate process. For instance, if a web browser plug-in segfaults, do you want it to destroy the data you've entered into a form on another page?

        In reality, both major browsers (IE, Moz) use component architectures, not separate processes, so I'm not sure your example is truly relevant.

        Also, a formalized interface means two things that help stabilize components:

        1. Scripts. Components with an ID
    • by sczimme (603413) on Monday October 18, 2004 @01:32PM (#10557539)

      (I submitted this particular question, and appreciate the mod point.)

      I was looking at it from a slightly simpler and broader angle: the functionality of discrete widgets. There are so many products (software in particular; computing devices in general) that are designed to be a single answer to all of the customer's needs. This is extremely difficult to do correctly, and many efforts end up as one or more of the following:

      too hefty/bulky/bloated

      too expensive

      too resource-hungry (be it RAM or battery power)

      too fragile (where one misbehaving widget causes a ripple effect throughout the device/app/entity)

      performing several functions but not doing any one task particularly well
      "Those days are dead and gone and the eulogy was delivered by Perl" seems to mean that we only need one tool to do our jobs, and that tool is Perl. I respectfully disagree with this: Perl is very handy but it is not always The Right Tool for the Job(tm).

      Rob - thank you for the answer.

  • I guess listening to David Cassidy while I write this comment seems so appropriate!
  • Well that was a complete and total ignoring of the intent of the patent question on the basis of not agreeing with a minor portion of the question.

    Thus, today's lesson is: don't insert your own stupid analogies into the question just to appear intelligent.

    I would have loved to see his response to the same question without the analogy. He would have been forced to answer, or explicitly acknowledge his dodging, if the submitter had merely posed the question by itself.

    Obviously, as an employee of a corpor

  • by Samrobb (12731) on Monday October 18, 2004 @01:19PM (#10557415) Homepage Journal
    Those days are dead and gone and the eulogy was delivered by Perl.

    Hey! Perl still adheres to the "one tool for one job" metaphor.

    It's just that Perl's "one job" seems to be defined as "replace all the other tools"...

    • Hey! Perl still adheres to the "one tool for one job" metaphor.

      Really? I thought it was more like "six different tools for one job".
    • Think about the question and then the answer. Perl's job is both to replace tools AND patch together other tools. There are plenty of Perl programs I have written that do not use the input or output of another program, but I have done some mighty work with Perl where it saves the old ass at 4AM by taking divergent, incompatible crap from one program, fixes said output, and puts it to another piece of shit program that is sixpence none the wiser.
  • One of the big insights in the last few years...is that data with no meaningful structure can still be very powerful if the tools...are good.

    Yep. Ask anybody who ever used askSam for their desktop database needs back in the day. Lordy, I miss that software. When was that, anyway? Back in the late 1980s? The brain's a little foggy today...

  • DragonFlyBSD (Score:4, Informative)

    by ArbitraryConstant (763964) on Monday October 18, 2004 @01:30PM (#10557515) Homepage
    "I don't care nearly as much as I used to about what the kernel does; it's so easy to emulate your way back to a familiar state."

    DragonFlyBSD has a system call layer [dragonflybsd.org] that would allow potentially very different interfaces to be presented to userspace stuff with essentially no penalty. This may allow newer ideas to be explored in a familiar environment.
  • by Chris_Keene (87914) on Monday October 18, 2004 @01:37PM (#10557591) Homepage Journal
    "Instead of one big file system, one user community, one secure setup uniting your network of machines, you had a hodgepodge of workarounds to Unix's fundamental design decision that each machine is self-sufficient."

    I hate to say this, but doesn't Windows 2000/2003 Server, Active Directory (and Novell NDS, etc.) do a lot of this? One set of users, a network of machines (without being reliant on one master machine*), and one security model. Maybe not quite there on 'one big file system', though that can basically be achieved with a bit of setting up.

    (* I haven't managed a Windows domain for a few years; I seem to remember 2k had a PDC-like machine as such, but also with backup servers, ready to take over.)
    • Windows is still built around the idea that each machine is self-sufficient. And, well, it has to be. You can build a distributed system on top of that, but if it's going to be widely adopted you need to be able to do it all in one machine: that is, Plan 9's problem is that it required a network. I once asked Dennis Ritchie if there was any real point to running one Plan 9 computer, and his response dissuaded me from trying it.

      No, what you need is a standalone system that you can build up into a distribute
    • Novell pioneered the functional and useful PC implementation of this idea with NDS, which was created by hacking the bejabbers out of a genealogical database created by the Mormons (AKA the Church of Jesus Christ of Latter Day Saints).

      NDS inherited limitations from the Mormon theology (for example: the concept of multiple roots is anathema in a genealogical database designed to relate all the descendants of Adam. Thus NDS could not handle multiple roots and separate trees had to be merged in order to
  • Does the kernel matter any more? I don't think it does. They're all the same at some level. I don't care nearly as much as I used to about what the kernel does; it's so easy to emulate your way back to a familiar state.
    I wonder what his thoughts are on something like the TUNES Project [tunes.org] as an OS alternative.
  • It's got an 8.4 meter aperture and 10 square degree field, taking an image every 20 seconds with its 3 gigapixel (sic) camera - in this sentence, what does the '(sic)' after '3 gigapixel camera' mean?

    • According to www.acronymfinder.com ...
      Sic [not an acronym] Latin: thus; so (not a mistake and is to be read as it stands)
    • by Anonymous Coward on Monday October 18, 2004 @02:05PM (#10557801)
      "Sic" literally means, "thus," as in "Sic semper tyrannis!," "Thus always to tyrants!" Generally when used in print, however, it is used as an instruction to the reader to take the preceding as it is printed. It is used often when quoting someone, and denotes that a misspelled word or exceedingly ungrammatical phrase in the quote is in the original quote, rather than an error in transcription. In this case, however, I think the intent is to note that the camera is actually a 3 gigapixel camera as stated, so as to prevent a stream of posts whose text is "Uh, shouldn't that be megapixels?"
  • He got #5 wrong... (Score:5, Interesting)

    by MattRog (527508) on Monday October 18, 2004 @01:45PM (#10557677)
    ... but then again so did the person who posed the question.

    I understand the idea that anything user-facing should probably be as simple as possible. This means that ideas that require user-supplied metadata (as the typical XML-in-filesystem ideas require) are probably not going to be successful. I also agree that Joe User doesn't care whether or not his data is stored in an RDBMS or in a plain text file if his search tool does a good job.

    The phrase "structure is meaningless; search is king" is a non-sequitur to someone aware of data management fundamentals. Structure gives meaning which in turn allows you to relate the data to others. The problem today is that we're creating data and storing it in 'plain text' (or flat file, proprietary, etc.) physical formats instead of storing emails, word processing documents, etc. in an RDBMS.

    The RDBMS is more than simply a search tool: it has a sound model, provides for easier application development, etc. Wouldn't search be significantly easier to do if your data were given a consistent logical view? If you know the semantics of a particular piece of data, you no longer need to waste your time classifying it to search.

    It seems that a proper solution would be that every PC contained an RDBMS, all data is stored in one, and that the internet would simply be a series of interconnected, distributed RDBMS (D-RDBMS). This idea would probably be fairly difficult to implement, but is already being performed at Google anyway (albeit in a slightly different format). Back when Codd developed the model he was primarily concerned with institutional databases -- centralized schema validation/data storage/etc. The problems of implementing D-RDBMS products are not trivial, but then again are not insurmountable. The world has been able to standardize on protocols, etc. so I don't think it is out of the realm of possibility to suggest that different companies/users/applications could agree on a particular schema for, say, emails.
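    As a toy illustration of that last point, here are emails stored in an in-memory SQLite table with a made-up schema; once the structure is agreed on, "search" becomes a plain query:

```python
import sqlite3

# Illustrative only: emails in an RDBMS rather than flat files, so a
# search can say exactly what it means. The schema is invented for the
# example, not a proposed standard.

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE email (
    sender  TEXT,
    subject TEXT,
    sent    TEXT,
    body    TEXT)""")
db.executemany(
    "INSERT INTO email VALUES (?, ?, ?, ?)",
    [("rob", "Re: Plan 9", "2004-10-18", "acme beats vi"),
     ("ken", "chess",      "2004-10-17", "endgame tables")])

# Structure means the query needs no guessing about what a "sender" is:
rows = db.execute(
    "SELECT subject FROM email WHERE sender = ? ORDER BY sent", ("rob",)
).fetchall()
print(rows)  # [('Re: Plan 9',)]
```

    The hard part the parent alludes to isn't the query, it's getting everyone to agree on the schema.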
    • The phrase "structure is meaningless; search is king" is a non-sequitur to someone aware of data management fundamentals. Structure gives meaning which in turn allows you to relate the data to others.

      Or the form of the query, in combination with a semantically agnostic indexing scheme (e.g. PageRank, but there are others), gives structure to the results, which the user uses to give meaning to the data.

      • In order to provide relevant results the search algorithm must derive at least some meaning from the data. The RDBMS does just this in a well-known, accurate manner. Why not give the algorithm more data with which to make its inferences? That would lead directly to algorithms that are:
        1) Less complicated
        2) More accurate
        3) Easier to develop/debug (probably ties to #1)

        And of course the end-user is going to derive more meaning from the data than computers can (currently, without true AI) provide. But the poin
  • by Anonymous Coward

    I asked in my +5 Interesting post why modern OSes are written in the lowest-level practical modern language (C) instead of the highest-level ones. I noted that at the time C was one of the highest-level languages, and that this was behind UNIX's success.

    Instead he wastes his space completely dodging a decent question on patents, and takes some softball OS question so he can spout off about how his baby Plan 9 is marginally better (you can pass any number of parameters on the command line). Wtf?

    Also his analo

  • Musical analogies (Score:5, Interesting)

    by sadtrev (61519) on Monday October 18, 2004 @02:11PM (#10557842) Homepage
    Using Unix is the computing equivalent of listening only to music by David Cassidy.


    No, it's more like listening only to music composed before Schoenberg. Those of us with taste recognise that most of the stuff produced since then is either pretentious cacophony or ignorant, cynical, commercial bilge.

    Thus WinXP is to Unix what Britney Spears is to Beethoven. Plan9 would be some anachronistic romanticism like Pfitzner or Elgar.

    • Re:Musical analogies (Score:3, Interesting)

      by argent (18001)
      Plan9 would be some anachronistic romanticism like Pfitzner or Elgar.

      Steady on, old man, steady on. You're getting awfully carried away...

      ---

      I like to use lines like that. "There hasn't been any good music since Joplin died... no, I mean Scott Joplin..."

      Windows XP is like the Monkees. It's not just commercial pap, it's old commercial pap.

      Plan 9? Plan 9 is Jazz.
    • actually, i was thinking more like Frank Zappa or David Byrne; a little off the beaten path but once grasped, quite interesting and diverse.
  • After all this talk about Doug McIlroy, when will /. interview him?
