Rob Pike Responds
He starts by clearing up my error in saying he was a Unix co-creator in the original Call For Questions. From there he goes on to answer your questions both completely and lucidly. A refreshing change from the politicians and executives we've talked to so much recently, no doubt about it.
Pike:
First, let me clear up a misstatement. I am not a co-creator of Unix. I suppose I am described that way because I am co-author (with Brian Kernighan) of a book about Unix, but neither Brian nor I would want to take credit for creating Unix. Ken Thompson and Dennis Ritchie created Unix and deserve all the credit, and more. I joined their group - the Computing Science Research Center of Bell Labs - after 7th Edition Unix had come out.
1) Innovation and patents - by Zocalo
With so many of your ideas being used with such ubiquity in modern operating systems, what is your stance on the issue of patenting of software and other "intellectual property" concepts? Assuming that business isn't going to let IP patents go away as they strive to build patent stockpiles reminiscent of the nuclear arms buildup during the cold war, how would you like to see the issue resolved?
Pike:
Comparing patents to nuclear weapons is a bit extreme.
2) Systems research - by asyncster
In your paper, "Systems Software Research is Irrelevant," you claim that there is little room for innovation in systems programming, and that all energy is devoted to supporting existing standards. Do you still feel this way now that you're working at Google?
Pike:
I was very careful to define my terms in that talk (it was never a paper). I was speaking primarily about operating systems and most of what I said then (early 2000) is still true.
Here at Google the issues are quite different. The scale of the problem we're trying to solve is so vast there are always challenges. I find it interesting that the slide in that talk about 'Things to Build' is a close match to the stuff we're doing at Google, if you squint a bit. To summarize:
GUI: Google put the cleanest, prettiest UI on the internet and work continues to find new ways to present data and make it easy to explore.
Component architectures: We use a number of big (BIG!) piece parts like the Google File System (GFS) and MapReduce (see the paper by Jeff Dean and Sanjay Ghemawat in the upcoming OSDI http://labs.google.com/papers/mapreduce.html) to build massive engines for processing data. Using those pieces we can harness zillions of machines with a few keystrokes to attack a problem like indexing the entire internet. (OK, it's not quite that easy, but it's still amazing.) I have a daily test job I run to monitor the health of one of the systems I'm developing; it uses a week of CPU time but runs for only a few minutes of real time. (A sketch of the map/sort/reduce shape, in miniature, follows this list.)
Languages for distributed computing: I'm part of a team working on something along those lines that we hope to write up soon.
Bringing data to the user instead of the other way around: Those damn browsers are still in the way, but other ways of connecting to data are starting to appear, things like the Google API. However, the surface is barely scratched on this topic.
System administration: Google's production people are phenomenal at keeping all those machines humming and ready for your queries. They demonstrated that there was real progress to be made in the field of system administration, and they continue to push forward.
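To make the MapReduce item above concrete: the same map/sort/reduce shape shows up in a one-machine word count built from stock Unix tools. This is only an analogy for how such jobs are organized, not Google's implementation, and the input path is invented for illustration:

    # map: split input into words, one per line, all lowercase
    # shuffle: sort brings identical words together
    # reduce: uniq -c counts each run; the final sort ranks by frequency
    cat docs/*.txt |
      tr -cs 'A-Za-z' '\n' |
      tr 'A-Z' 'a-z' |
      sort |
      uniq -c |
      sort -rn |
      sed 10q

The distributed version differs mainly in that the map and reduce stages run on thousands of machines and the sort is a networked shuffle, but the contract between the stages is the same idea.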
3) Back in The Day - by Greyfox
Were programmers treated as hot-pluggable resources as they are today? There seems to be a mystique to the programmer prior to about 1995.
From reading the various netnews posts and recollections of older programmers, it seems like the programmer back then was viewed as something of a wizard without whom all the computers he was responsible for would immediately collapse. Has anything really changed or was it the same back then as it is now? I'm wondering how much of what I've read is simply nostalgia.
Pike:
Isn't it just that today there are a lot more computers, a lot more programmers, and most people are familiar with what computers and programmers do? I'm not sure I understand your reference to 1995, but twenty or thirty years ago, computers were big expensive temples of modernity and anyone who could control their power was almost by definition a wizard. Today, even musicians can use computers (hi gary).
4) What are you doing... - by Mark Wilkinson
Google employees are apparently allowed to work on their own projects 20% of the time. Given that you probably can't comment on what you're doing for Google, what are you doing to fill the other 20%?
Pike:
One of the most interesting projects out there, one I am peripherally (but only peripherally) involved with, is the Large Synoptic Survey Telescope http://www.lsst.org, which will scan the visible sky to very high angular precision, in multiple colors, many times a year. It's got an 8.4 meter aperture and 10 square degree field, taking an image every 20 seconds with its 3 gigapixel (sic) camera. The resulting data set will be many petabytes of image and catalog data, a data miner's dream. The software for the telescope is as big a challenge as the instrument itself; just the real-time pixel pipeline on the mountain will make today's compute clusters look wimpy.
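Some rough arithmetic gives a feel for those numbers (assumptions mine, not LSST's published figures): at about 2 bytes per pixel, a 3-gigapixel image is roughly 6 GB, and one image every 20 seconds is about 0.3 GB/s, or around 1 TB per hour on the sky. Call it ten hours of observing a night and you get on the order of 10 TB a night, which puts you into petabytes within a year, before counting the catalog data at all.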
5) Database filesystems - by defile
The buzz around filesystems research nowadays is making the UNIX filesystem more database-ish. The buzz around database research nowadays is making the relational database more OOP-ish.
This research to me sounds like the original designers growing tired of the limitations of their "creations" now that they're commodities and going back to the drawing board to "do things right this time". I predict the reinvented versions will never catch on because they'll be too complex and inaccessible.
Of course, this second system syndrome isn't just limited to systems. It happens to bands, directors, probably in every creative art.
I think what we've got in the modern filesystem and RDBMS is about as good as it gets and we should move on. What do you think?
Pike:
This is not the first time databases and file systems have collided, merged, argued, and split up, and it won't be the last. The specifics of whether you have a file system or a database are a rather dull semantic dispute, a contest to see who's got the best technology, rigged in a way that neither side wins. Well, as with most technologies, the solution depends on the problem; there is no single right answer.
What's really interesting is how you think about accessing your data. File systems and databases provide different ways of organizing data to help find structure and meaning in what you've stored, but they're not the only approaches possible. Moreover, the structure they provide is really for one purpose: to simplify accessing it. Once you realize it's the access, not the structure, that matters, the whole debate changes character.
One of the big insights in the last few years, through work by the internet search engines but also tools like Udi Manber's glimpse, is that data with no meaningful structure can still be very powerful if the tools to help you search the data are good. In fact, structure can be bad if the structure you have doesn't fit the problem you're trying to solve today, regardless of how well it fit the problem you were solving yesterday. So I don't much care any more how my data is stored; what matters is how to retrieve the relevant pieces when I need them.
Grep was the definitive Unix tool early on; now we have tools that could be characterized as `grep my machine' and `grep the Internet'. GMail, Google's mail product, takes that idea and applies it to mail: don't bother organizing your mail messages; just put them away for searching later. It's quite liberating if you can let go of your old file-and-folder-oriented mentality. Expect more liberation as searching replaces structure as the way to handle data.
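In that spirit, `grep my machine' needs nothing fancy. Assuming mail is archived as plain files under ~/mail (an invented layout, purely for illustration), finding a message is one search command rather than a filing discipline:

    # no folders, no filing; just search the archive when you need it
    # -r recurse, -i ignore case, -l print matching file names only
    grep -ril 'versailles' ~/mail | sed 5q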
6) Thoughts on Bell Labs - by geeber
Plan 9, Unix and so many other great things came out of Bell Labs. Since the crash of the internet bubble, telecom companies have suffered immensely. One of the results of this is that Lucent has systematically dismantled one of the world's greatest industrial research facilities. You spent a great part of your career at Bell Labs. What are your thoughts about the history and future (if any) of Bell Labs, and how did the culture of the Labs influence the growth of Unix?
Pike:
It's unfair to say `systematically dismantled', as though it was a deliberate process and there's nothing left. A more honest assessment might be that changes in the market and in government regulation made it harder to keep a freewheeling research lab thriving at the scale of the old Bell Labs. Bell Labs Research is much smaller these days, but there are still some very bright people working there and they're doing great stuff. I hope one day to see Bell Labs restored to its former glory, but the world has changed enough that that may never happen.
I could go on for pages about the old Bell Labs culture, but I must be brief. When I arrived, in 1980, the Computing Science Research Center, also known as 127 (later 1127; therein lies a tale) had recently launched 7th Edition Unix and the Center, after a long period of essentially zero growth, was just entering a period of rapid expansion. That expansion brought in a lot of new people with new ideas. I was a graphics guy then, and I hooked up with Bart Locanthi, another graphics guy, and we brought graphics to Research Unix with the Blit. Other folks brought in new languages, novel hardware, networking; all kinds of stuff. That period in the early 80s generated a lot of ideas that influenced Unix both within the Labs and in the outside community. I believe the fact that the Center was growing was a big part of its success. The growth not only provided new ideas, it also generated a kind of enthusiasm that doesn't exist in the steady state or in a shrinking group. Universities harness a variant of that energy with the continuous flow of graduate students; in industrial research you need to create it in other ways.
One odd detail that I think was vital to how the group functioned was a result of the first Unix being run on a clunky minicomputer with terminals in the machine room. People working on the system congregated in the room - to use the computer, you pretty much had to be there. (This idea didn't seem odd back then; it was a natural evolution of the old hour-at-a-time way of booking machines like the IBM 7090.) The folks liked working that way, so when the machine was moved to a different room from the terminals, even when it was possible to connect from your private office, there was still a `Unix room' with a bunch of terminals where people would congregate, code, design, and just hang out. (The coffee machine was there too.) The Unix room still exists, and it may be the greatest cultural reason for the success of Unix as a technology. More groups could profit from its lesson, but it's really hard to add a Unix-room-like space to an existing organization. You need the culture to encourage people not to hide in their offices, you need a way of using systems that makes a public machine a viable place to work - typically by storing the data somewhere other than the 'desktop' - and you need people like Ken and Dennis (and Brian Kernighan and Doug McIlroy and Mike Lesk and Stu Feldman and Greg Chesson and ...) hanging out in the room, but if you can make it work, it's magical.
When I first started at the Labs, I spent most of my time in the Unix room. The buzz was palpable; the education unparalleled.
(And speaking of Doug, he's the unsung hero of Unix. He was manager of the group that produced it and a huge creative force in the group, but he's almost unknown in the Unix community. He invented a couple of things you might have heard of: pipes and - get this - macros. Well, someone had to do it and that someone was Doug. As Ken once said when we were talking one day in the Unix room, "There's no one smarter than Doug.")
7) Languages - by btlzu2
Hello!
Maybe this is an overly-asked question, but I still often ponder it. Does object-oriented design negate or diminish the future prospects of Unix's continuing popularity?
I've developed in C (which I still love), but lately, I've been doing a lot of purely object-oriented development in Java. Using things like delegation and reusable classes has made life so much easier in many respects. Since the *nixes are so dependent upon C, I was wondering what future you see in C combined with Unix. Like I said, I love C and still enjoy developing in Unix, but there has to be a point where you build on your progress, and the object-oriented languages, in my opinion, seem to be doing that.
Thank you for all your contributions!!!
Pike:
The future does indeed seem to have an OO hue. It may have bearing on Unix, but I doubt it; Unix in all its variants has become so important as the operating system of the internet that whatever the Java applications and desktop dances may lead to, Unix will still be pushing the packets around for quite a while.
On a related topic, let me say that I'm not much of a fan of object-oriented design. I've seen some beautiful stuff done with OO, and I've even done some OO stuff myself, but it's just one way to approach a problem. For some problems, it's an ideal way; for others, it's not such a good fit.
Here's an analogy. If you want to make some physical artifact, you might decide to build it purely in wood because you like the way the grain of the wood adds to the beauty of the object. In fact many of the most beautiful things in the world are made of wood. But wood is not ideal for everything. No amount of beauty of the grain can make wood conduct electricity, or support a skyscraper, or absorb huge amounts of energy without breaking. Sometimes you need metal or plastic or synthetic materials; more often you need a wide range of materials to build something of lasting value. Don't let the fact that you love wood blind you to the problems wood has as a material, or to the possibilities offered by other materials.
The promoters of object-oriented design sometimes sound like master woodworkers waiting for the beauty of the physical block of wood to reveal itself before they begin to work. "Oh, look; if I turn the wood this way, the grain flows along the angle of the seat at just the right angle, see?" Great, nice chair. But will you notice the grain when you're sitting on it? And what about next time? Sometimes the thing that needs to be made is not hiding in any block of wood.
OO is great for problems where an interface applies naturally to a wide range of types, not so good for managing polymorphism (the machinations to get collections into OO languages are astounding to watch and can be hellish to work with), and remarkably ill-suited for network computing. That's why I reserve the right to match the language to the problem, and even - often - to coordinate software written in several languages towards solving a single problem.
It's that last point - different languages for different subproblems - that sometimes seems lost to the OO crowd. In a typical working day I probably use a half dozen languages - C, C++, Java, Python, Awk, Shell - and many more little languages you don't usually even think of as languages - regular expressions, Makefiles, shell wildcards, arithmetic, logic, statistics, calculus - the list goes on.
Does object-oriented design have much to say to Unix? Sure, but no more than functions or concurrency or databases or pattern matching or little languages or....
Regardless of what I think, though, OO design is the way people are taught to think about computing these days. I guess that's OK - the work does seem to get done, after all - but I wish the view was a little broader.
8) One tool for one job? - by sczimme
Given the nature of current operating systems and applications, do you think the idea of "one tool doing one job well" has been abandoned? If so, do you think a return to this model would help bring some innovation back to software development?
(It's easier to toss a small, single-purpose app and start over than it is to toss a large, feature-laden app and start over.)
Pike:
Those days are dead and gone and the eulogy was delivered by Perl.
9) Emacs or Vi? - by Neil Blender
Pike:
Neither.
When I was a lad, I hacked up the 6th Edition ed with Tom Duff, Hugh Redelmeier, and David Tilbrook to resuscitate qed, the editor Ken Thompson wrote for CTSS that was the inspiration for the much slimmer ed. (Children must learn these things for themselves.) Dennis Ritchie has a nice history of qed at http://cm.bell-labs.com/cm/cs/who/dmr/qed.html.
I liked qed for one key reason: it was really good at editing a number of files simultaneously. Ed only handled one file at a time.
Ed and qed were command-driven line editors designed for printing terminals, not full-screen displays. After I got to Bell Labs, I tried out vi but it could only handle one file at a time, which I found too limiting. Then I tried emacs, which handled multiple files but much more clumsily than qed. But the thing that bothered me most about vi and emacs was that they gave you a two-dimensional display of your file but you had only a one-dimensional input device to talk to them. It was like giving directions with a map on the table, but being forced to say "up a little, right, no back down, right there, yes turn there that's the spot" instead of just putting your finger on the map.
(Today, emacs and vi support the mouse, but back in 1980 the versions I had access to had no support for mice. For that matter, there weren't really many mice yet.)
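For anyone who has never driven a command-driven line editor, here is a minimal ed session (file name, contents, and byte counts invented for illustration; ed prints nothing except what you ask for, plus byte counts on read and write):

    $ ed notes.txt
    87
    /printer/
    the printer is broken again
    s/broken/fixed/
    p
    the printer is fixed again
    w
    86
    q

No screen, no cursor, no mouse: you name a line, operate on it, and ask to see the result.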
So as soon as the Blit started to work, it was time to write an editor that used the mouse as an input device. I used qed (mostly) and emacs (a little) to write the first draft of jim, a full-screen editor that showed you text you could point to with a mouse. Jim handled multiple files very smoothly, and was really easy to use, but it was not terribly powerful. (Similar editors had been built at Xerox PARC and other research labs but, well, children must learn these things for themselves.)
A few years later I took the basic input idea of jim and put a new ed-like command language underneath it and called it sam, a locally popular editor that still has its adherents today. To me, the proof of sam's success was that it was the first full screen editor Ken Thompson liked. (He's still using it.) Here's the SP&E paper about sam from 1987: http://plan9.bell-labs.com/sys/doc/sam/sam.pdf.
A few years later, I decided the pop-up menu model for commanding an editor with a mouse was too restrictive, so I started over and built the much more radical Acme, which I'm using to write these answers. Here's the Acme paper: http://plan9.bell-labs.com/sys/doc/acme/acme.pdf
I don't expect any Slashdot readers to switch editors after reading these papers (although the code is available for most major platforms), but I think it's worth reading about them to see that there are ways of editing - and working - that span a much larger gamut than is captured by the question, 'Emacs or vi?'
10) Biggest problem with Unix - by akaina
Recently on the Google Labs Aptitude Test there was a question: "What's broken with Unix? How would you fix it?"
What would you have put?
Pike:
Ken Thompson and I started Plan 9 as an answer to that question. The major things we saw wrong with Unix when we started talking about what would become Plan 9, back around 1985, all stemmed from the appearance of a network. As a stand-alone system, Unix was pretty good. But when you networked Unix machines together, you got a network of stand-alone systems instead of a seamless, integrated networked system. Instead of one big file system, one user community, one secure setup uniting your network of machines, you had a hodgepodge of workarounds to Unix's fundamental design decision that each machine is self-sufficient.
Nothing's really changed today. The workarounds have become smoother and some of the things we can do with networks of Unix machines are pretty impressive, but when ssh is the foundation of your security architecture, you know things aren't working as they should.
Looking at things from a lower altitude:
I didn't use Unix at all, really, from about 1990 until 2002, when I joined Google. (I worked entirely on Plan 9, which I still believe does a pretty good job of solving those fundamental problems.) I was surprised when I came back to Unix how many of even the little things that were annoying in 1990 continue to annoy today. In 1975, when the argument vector had to live in a 512-byte block, the 6th Edition system would often complain, 'arg list too long'. But today, when machines have gigabytes of memory, I still see that silly message far too often. The argument list is now limited somewhere north of 100K on the Linux machines I use at work, but come on people, dynamic memory allocation is a done deal!
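To see the annoyance, and the standard workaround, on a Linux box (details are illustrative; the exact limit, paths, and messages vary by system):

    $ getconf ARG_MAX                  # kernel limit on the argument list, in bytes
    131072
    $ grep -l magic /usr/share/doc/*/*      # expand enough file names and:
    bash: /bin/grep: Argument list too long
    $ find /usr/share/doc -type f | xargs grep -l magic   # xargs feeds names in batches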
I started keeping a list of these annoyances but it got too long and depressing so I just learned to live with them again. We really are using a 1970s era operating system well past its sell-by date. We get a lot done, and we have fun, but let's face it, the fundamental design of Unix is older than many of the readers of Slashdot, while lots of different, great ideas about computing and networks have been developed in the last 30 years. Using Unix is the computing equivalent of listening only to music by David Cassidy.
11) Re: Plan9 - by Spyffe
Rob,
Right now, there are a large number of research kernels. Plan 9, Inferno, AtheOS, Syllable, K42, Mach, L4, etc. all have their own ideas about the future of the kernel. But they all end up implementing a POSIX interface because the UNIX userland is the default.
The kernel space needs to be invigorated using a new userland that demands new and innovative functionality from the underlying system. Suppose you were to design a user environment for the next 30 years. What would the central abstractions be? What sort of applications would it support?
Pike:
At the risk of contradicting my last answer a little, let me ask you back: Does the kernel matter any more? I don't think it does. They're all the same at some level. I don't care nearly as much as I used to about what the kernel does; it's so easy to emulate your way back to a familiar state.
Applications - web browsers, MP3 players, games, all that jazz - and networks are where the action is today, and aside from irritating little incompatibilities, the kernel has become a commodity. Almost all the programs I care about can run above Windows, Unix, Plan 9, and on PCs, Macs, palmtops and more. And that, of course, is why these all have a POSIX interface: so they can support those applications.
And then there are the standard network protocols to glue things together. It's all a uniform sea of interoperability (and bugs).
I think the future lies in new hardware as much as in new software. A generation from now machines will be so much more portable than they are now, so much more powerful, so much more interactive that we haven't begun to think about the changes they will bring. This may be the biggest threat to Microsoft: the PC, the desktop, the laptop, will all go the way of the slide rule. As one example, when flexible organic semiconductor displays roll out in a few years, the transformation in how and where people use computers and other devices will be amazing. It's going to be a wild ride.
===============
Damn. (Score:3, Informative)
Is he running for office?
That is Disingenuous Spin, His answer IS political (Score:5, Insightful)
And how, pray tell, is he going to do that when all but the most trivial code runs afoul of patents and is vulnerable to litigation? (According to many analysts, this is already the case.)
Refusing to answer the question and using disagreement with the questioner's analogy as cover is an exceedingly political answer (and a tried and true method of dodging uncomfortable questions used by virtually every political candidate for office in recent years, as alluded to in the "is he running for office" comment).
Hardly a non-political stance, merely a disingenuous one.
Re:That is Disingenuous Spin, His answer IS politi (Score:2)
There are a couple of important differences.
For one, it is completely obvious to any reader that Mr Pike didn't answer the question. He dismissed the question while making a valid statement. Hardly as bad as, say, Bill Shatner's /. interview [slashdot.org], even. I can't think of a good example of a politician dodging a question based on disagreement with the posed analogy in recent history off the top of my head. But - consider Bush's recent response to the question posed to him in the third debate on the minimum wa
Re:That is Disingenuous Spin, His answer IS politi (Score:5, Insightful)
Are software patents important? Yes. Do they threaten the very survival of our species? No.
If You Want a Serious Answer... Don't Get Cute (Score:5, Insightful)
He gave an appropriate response to a STUPID analogy.
Re:If You Want a Serious Answer... Don't Get Cute (Score:5, Insightful)
A kind of "mutually assured destruction" stance...
As such, the analogy with the reasoning that lay behind the nuclear arms race seems quite apt.
The dismissal of the question does rather suggest that the speaker did not want to address the point at issue.
Re:If You Want a Serious Answer... Don't Get Cute (Score:2, Interesting)
In terms of comparing it to a nuclear arms race, I think the analogy hit the nail on the head.
Now, just acquiring patents to put the clamps down on innovation is what the original poster was probably referring to and that doesn't really apply to nuclear arms races. Maybe its more like being a rancher [from the movi
Re:If You Want a Serious Answer... Don't Get Cute (Score:5, Interesting)
The fact of the matter is that software patents are not going to go away, something that I touched upon in the original question. Aside from that, their main use, so far at least, seems to be either for dying companies to leech some more existence from a more successful one, or to browbeat a smaller competitor out of competition through the threat of legal costs they cannot sustain. Whether you think that is equivalent to the intent of a patent, essentially granting the inventor a reward for their efforts no matter how stupid or obvious that invention might seem, is another matter.
Patents in general, and software patents in particular, are undeniably a big issue in the IT world at the moment. That Rob Pike dismissed the entire question out of hand leaves me with two more possible conclusions to add to yours: he's pro-IP patents, but is afraid to admit it on Slashdot, although to be fair he *would* get savaged in the comments. Alternatively, he is anti-IP patents, but is afraid to admit it where his employers might see, which would say a lot about his employers if that is the case.
Re:If You Want a Serious Answer... Don't Get Cute (Score:3, Insightful)
But it's a leading question. You put patents in the light of "pure evil" and so come off as having your own agenda, not an honest interest in the answer.
It's like bringing up Nazis in a conversation.
Re:If You Want a Serious Answer... Don't Get Cute (Score:2)
That is indeed how I took it. Comparing them to nuclear weapons is extreme to him because he doesn't see them as a threat. I think he IS answering the question in a short, yet distinct, manner. I doubt that he has any great fear of being flamed by several thousand geeks on a message board, though.
Re:If You Want a Serious Answer... Don't Get Cute (Score:2)
so now you have big and scary corporations sucking up p
Re:If You Want a Serious Answer... Don't Get Cute (Score:2)
As for saying that their "invention" might be stupid or obvious, it's perfectly within the boundaries set by the law to patent something that doesn't already exist or hasn't been thought up. So while the Amazon 1-click patent might seem stupid or obvious, if it's so obvious, why hasn't anyone else used the idea?
Actually, non-obviousness is a criterion for being granted a patent. If you try to patent something that is obvious to any reasonably competent professional in the field, it's supposed to be rejected.
Re: Sorry. Don't Agree. (Score:2)
It's simply a fact that the current patent race is quite similar to an arms race, and that if everyone started enforcing their patents we'd have a "software nuclear winter", in the sense that no one could write a program without infringing on some patent.
It's not hyperbole, because it's not meant to show that software patents are supposedly just as bad as Hiroshima or Nagasaki, but simply an analogy because of the way the
Re:If You Want a Serious Answer... Don't Get Cute (Score:2, Insightful)
The goal of Microsoft, IBM, etc. (and a few smaller players and non-entities) is partly to gain as much IP control as possible, to not only avoid having to be beholden to some other IP-holding company, but also to use it to control competition, if required.
The end-game is different, but what you end up with is a business equivalent of nuclear detente, where the major players have enough weapons to counter any attack in court.
Nuclear detente would have been the bet
Re:If You Want a Serious Answer... Don't Get Cute (Score:3, Insightful)
"The promoters of object-oriented design sometimes sound like master woodworkers waiting for the beauty of the physical block of wood to reveal itself before they begin to work."
Comparing programming to woodworking is a bit extreme.
Re:Damn. (Score:2)
He does have a job, you know. He probably doesn't want to piss off his employer.
Re:Damn. (Score:5, Insightful)
The one-click patent is a symptom of the problem, but the K5 debate on applying the logic of the DMCA to patents is a symptom of the attitudes. Attitudes won't change, just because we're tired of them. They'll change when those who ARE tired of them propose a workable, viable alternative that meets the needs of industry and inventors.
It's obvious enough that he knows that walking the walk is important - that's one reason he developed Plan 9! Kevin Mitnick proved, very conclusively, that computer security and data integrity are vulnerable to the foolishness of mere mortals. Redesigning on this scale is more than just rewriting some code. The sort of redesign Plan 9 represents is about seeing what doesn't work, and replacing it with something that does.
Attitudes are broken. They need patching or replacing. Keep them the same, and all the software fixes in the world won't secure a single computer.
Re:Damn. (Score:5, Insightful)
None of these flaws is necessary, all of them are serious impediments, and each in turn is likely to be the reason users will migrate away when viable alternatives exist.
I wasn't overly impressed by Rob Pike's answer to the idea of specialist Unix tools, as opposed to more generalized software that can handle many different types of task, but it illustrates a blind-spot that could prove troublesome for Rob.
The blind-spot is thus - if extreme specializing and narrow focussing is a dead philosophy in coding, it must also be a dead philosophy in software engineering in general. If the logic doesn't hold true any more, then it should be dispensed with completely. Dropping it only in one or two narrow areas is, in itself, an application of the very philosophy that is being rejected.
(ie: It is an extreme specialization, rather than the general application, of a change in attitude.)
Of course, this argument only holds true if my central beliefs are correct.
What could Rob say about patents? Well, for a start, he could have said that they are the existing method of solving the complex problem of fairly compensating people for their work, but that the solution is probably not the best and may need to be replaced with something better. He doesn't need to produce a working flow-chart on what the politicians should come up with. Nobody asked him to actually invent a better method.
By not really answering the question, he sounds like he cares more about what his pay-masters would think than about giving an honest answer. Now, that's not horribly unreasonable, but how much extra effort does it take to say "that's not a question I can really answer"? At the very least, it would be an answer, and therefore respectful of the questioner.
Personally, I don't subscribe to the notion that the one who pays the piper calls the tune. You could pay me - or anyone else - whatever money you liked, but not a single one of us could change the laws of physics, violate Pythagoras' Theorem, or make 1+1=3. Some things can't be changed for love or money.
Programming is one such area. A problem is computable or it isn't. If it is computable, you can solve it with a computer program in finite time. If it is not computable, no general algorithmic solution exists. An act of Congress won't change this. If you assume God to be constrained by logic, then even an act of God wouldn't change it.
If it is possible to patent an algorithm, then it is possible to create patents that cover the ONLY workable solution to some set of problems. The owner of such a patent is claiming ownership of not just that specific solution, but the entirety of that class of problems. Since these are the only patents worth having (you can circumvent any others, because there are other algorithms which do the same thing), it follows that the system is inherently unstable and self-destructive.
Creating or supporting bug-ridden implementations brings computing into disrepute, whether those implementations are in C on a computer, or in English at a patent office. Bugs are bugs are bugs, whatever the form, whoever the implementor. And bugs are never a Good Thing.
Re:Damn. (Score:3, Insightful)
I wondered about this too, surely he doesn't think the OP was suggesting patents are comparable to nukes -- the question referenced the way in which large corps gobble down patents, often with no immediate designs to follow through on a business plan or implementation.
Pike does, however, mention that he works at Google, so maybe he interpreted the question as a shot against his employer or was simply advised by Google's PR not to answer the question.
Nice (Score:4, Funny)
Now there's a sidestep George Bush would approve of.
It's not a side step; it's a precharged question. (Score:5, Insightful)
Bzzzt! Wrong answer. (Score:5, Insightful)
The fact that that question was sent to the interviewee meant that Slashdot's readers wanted to know his opinion of the patent system. He could have answered it in any manner he chose, but he chose to sidestep it instead because his employer (Google) believes in using patents aggressively in a mutually-assured-destruction way, even if it means the end of Linux. That is why he didn't answer, and your faux-objective pseudointellectual babble isn't fooling anyone.
Re:Bzzzt! Wrong answer. (Score:2)
That's a bad analogy. A nonproliferation pact would mean patents in the hands of fewer companies (nukes in the hands of fewer countries). They would be agreements by companies (countries) that did not have patents not to apply for patents (build nukes). These are just agreements saying that we won't sue (nuke) you if you don't sue (nuke) us. The patents have already proliferated by that point. In fact, this encourages proliferation (my patent portfoli
Re:IP problems |= nuclear stockpiling (Score:2)
Analogies are not equivocations, and the original poster was true to form in that he wasn't equivocating. Software patent stockpiles, like nuclear weapon stockpiles (and unlike individual nuclear weapons), are infinitely more about the threat than about having the bluff called and intervening in a real manner; e.g., if y
Re:It's not a side step; it's a precharged questio (Score:2)
Ok, I'll go along with it, with a minor correction.
IP protection confers certain rights under various jurisdictions.
IP protection is about taking rights from the people/society/the public. IP law has at its foundation the concept of communal ownership over ideas. So when you invent something, the Law of Nature is that everybody owns the invention. In order to encourage invention, the Law of Nature is briefly discarded by something called a "patent".
I don't totally disagree with your post, I just take i
Huh? (Score:5, Funny)
Is he recovering from head trauma or something? It makes it sound like his next step is walking to the restroom without assistance...
I suppose (Score:4, Funny)
I'm disappointed.
Re:I suppose (Score:3, Informative)
But it was a fun fact to throw out when the whole why-too-kay bug was big.
But what if you like listening to David Cassidy (Score:2, Insightful)
To continue the musical comparison: Windows is 15 different variations of the same mass-produced pop song, whose only existence is to make money for a company that already has a lot of money.
I'll take David Cassidy, even if he has a CLI only.
Disappointed in Pike's flip answer to patent Q (Score:5, Interesting)
Save-under was/is a good idea, and so insanely simple it's hard to believe that a patent was granted -- much less wielded with such force. For youngsters (and as an oldster, perhaps my memory isn't quite perfect on this) some early machines had overlay planes for menus. You could draw the menu over the frame, then clear the overlay plane, without disturbing the contents of the window beneath. To do this on a bitmapped display without overlays, the idea was that you would screen-grab the image under where the menu would be, then paste it back when the menu disappeared.
Pike defended AT&T's refusal to allow the X consortium to use save-under without royalty at the time.
Thad
We can surmise that, but we don't really know (Score:4, Insightful)
Well, since he dodged the question with a disingenuous slam of the questioner, using his disagreement with the questioner's analogy as cover to do so, we really don't know the answer to that. Based on his unwillingness to answer the question and defend his point of view (which, one may surmise from previous behavior and his dismissal of software patents as an issue worthy of addressing, is pro-software patent), we can guess that his perspective differs from most in both the industry and academia (including Stallman), but with his refusal to answer the question we really don't know.
We do know (Score:5, Interesting)
More here [mit.edu]
Pike has a few misused patents to his name, and his unwillingness to answer a perfectly valid question is a good indicator of his stance on the issue. As another poster suggested earlier, Pike really was caught between a rock and a hard place by the question: admit that he supports patents and face the wrath of the slashdot crowd or deny his past stands and expose the duplicity of his current employer. Either of the two answers might've opened some fanboy eyes around here. Too bad it didn't come to pass.
Is that sarcasm? (Score:4, Insightful)
From there he goes on to answer your questions both completely and lucidly. A refreshing change from the politicians and executives we've talked to so much recently, no doubt about it.
The actual interview says:
Pike:
Comparing patents to nuclear weapons is a bit extreme.
Clearly the summary is being sarcastic...
What's wrong with Unix - the GLAT (Score:5, Funny)
What would you have put?
Nice answer given by Pike (and no, I'm not going to requote the whole thing), but good luck fitting it into the box here on the 'test.' [google.com] :-)
Re:What's wrong with Unix - the GLAT (Score:2)
But he already works for Google, remember?
I guess he already aced the test.
Re:What's wrong with Unix - the GLAT (Score:2, Funny)
And *that* is the actual test - not the answer itself.
not the Rob I (don't) know (Score:5, Funny)
object-oriented design is the roman numerals of computing.
-- Rob Pike
and seeing as he mentioned perl
> To me perl is the triumph of utilitarianism.
So are cockroaches. So is `sendmail'.
-- jwz [http://groups.google.com/groups?selm=33F4D777.7B
Emacs or vi (Score:5, Informative)
Great answer to that question: Neither, he wrote his own (twice!), and wrote papers about the products. That's a Unix power user, defined.
Re:Emacs or vi (Score:5, Interesting)
(I'm reading the ACME paper now. Looks interesting.)
Jules, who writes his own editors too.
Re:Emacs or vi (Score:3, Informative)
A: Download Inferno [vitanuova.com]. It's a Virtual Machine-based operating system that runs on top of Linux, Mac OS X, Windows, and Plan 9 (to name a few). Acme is included. Free to download.
Or B: plan9port [swtch.com]. It's a port of the Plan 9 libraries to UNIX, including Linux and BSD. Acme is included (screen shot under KDE [swtch.com]). Again, free to download.
You should read the Plan 9 wiki entry on acme [bell-labs.com] before trying to use it.
Enjoy!
Re:Emacs or vi (Score:5, Funny)
A vote for Acme (which would have otherwise certainly gone to Emacs), is like a vote for vi! It is a well known fact that vi supporters have been secretly throwing Acme parties around the world.
Re:Emacs or vi (Score:5, Funny)
Once upon a time, at Caltech High Energy Physics, where Rob Pike had worked before going to Bell, two programmers (me and Karl Heuer) were bitching about existing editors, each claiming he could do better. That night, it got to the "oh yeah...prove it!" stage, and both sat down to write editors. By morning, they were each using their respective editors on themselves. Norman Wilson at cithep had kept in contact with Pike, and told him of this spate of editor hacking. Note that this was well before Pike did his jim and sam stuff.
Pike wrote back something like this: "writing a screen editor is fun and easy and makes them feel important. Tell them to work on something useful".
We were quite amused when we found out that Rob went on to write editors, instead of sticking to "something useful"!
The Unix Room (Score:5, Insightful)
These days, developers seem to have their accommodation organised by blind chance, or worse, corporate whim.
Many of my colleagues left their 6-12 man offices to join a 70 desk open-plan floor. The six of us architects (yeah, right) were pretty miffed to be shunted into a 1980's room just for six with beige vinyl on the walls and phones straight out of Flash Gordon. Now, two months later, we appreciate the working community that is our office.
Good call Mr. Pike: humans function well in small self-organising or randomly-organised groups of up to 8. I'll rue the day we have to move out.
Re:The Unix Room (Score:4, Insightful)
Fascinating.
I was at Columbia University last week for a meeting sponsored by the local ACM chapter [columbia.edu] and LXNY [lxny.org], the speaker was Stephen Bourne [eldoradoventures.com] (he who is sh [wikipedia.org]).
At some point during his excellent talk on the history of Unix and his place in it, someone asked what he thought was the reason for the success of the operating system. Without hesitating, he talked about the room where all the terminals were located (he never specifically referred to it as the "Unix room", though) and how, when you released software, it was used immediately by those in the room; if something broke, you were called "idiot" (and probably worse) by your peers. It was in your best interest to make sure you didn't put out junk, because you didn't have that dilution of responsibility that engineers have in a large corporation where the design team is in one wing of the building, the coders are in another, and the testers are in yet another location, etc.
It was a great speech, anyone who hasn't seen Dr. Bourne speak should do so, he is an excellent source of insight into the early years of Unix and software engineering in general. He is now working for a venture capital firm and roughly a third of his talk was spent talking about that, it's a testament to his great speaking skills that most of the people in the room didn't lose interest when he switched topics like that (I'm convinced that most hackers suffer from ADD).
Thomas
Re:The Unix Room (Score:2, Funny)
It's AD&D, and I wouldn't say it's suffering
Re:The Unix Room (Score:2)
Right there is the rub. My last manager had heard of this idea as well and stuffed 46 of us in a room together. Testers, developers, managers, even the client reps all in one room. After 3.5 years they cancelled the project with virtually nothing to show for it. Of course, completely changing direction on the project no less than 5 times during that timeframe didn't help either. However, I do believe t
patents and nukes: not extreme comparison (Score:4, Insightful)
Pike:
Comparing patents to nuclear weapons is a bit extreme.
No it isn't. The comparison is drawn often, because both large patent portfolios and large nuclear arms stockpiles create a situation of Mutually Assured Destruction. Once the nukes start flying, nobody wins. Likewise, once the lawyers start slinging patent lawsuits, only the lawyers win.
So the answer may be, "I have no idea", but the comparison is legitimate.
#8 One tool for one job? (Score:5, Interesting)
I sincerely believe that "one tool for one job" isn't dead; the landscape has simply changed.
Yesteryear, the only way software tools worked together was via stdin/out over the command line.
Nowadays, we have brought the concept into application space through component architectures and IDLs (COM/XPCOM/JavaBeans to name 3). These new tools allow for that clean separation. Plug-ins or components are free to concentrate on doing one thing very well.
The change, IMO, is a good one. Formalized interfaces are good, and components are better optimized than launching a whole separate process.
Re:#8 One tool for one job? (Score:3, Insightful)
You may not like his answer and may disagree, but that does not make his answer any less valid. He said that one tool for one job is dead. If you want a different answer, then ask someone else. Usually if you ask enough people you can find one that agrees with you. Hey, when that happens you can then say that your theory is correct because X agrees with you and ignore the droves of others who disagree. Come on, all the kids are doing it. Manipulate the data to
Re:#8 One tool for one job? (Score:3, Insightful)
My first response was a bit over-emotional.
I simply meant this: his answer seems very glib to me. It would simply be nice if he had elaborated a bit.
He didn't say why... (Score:4, Interesting)
I really think he was evading the answer.
The real answer is that you need a framework that lets you connect the tools together easily before you can use a software-tools approach. For the command line era, that framework was the UNIX shell. For the GUI era, there really hasn't been a popular framework that's also portable. ARexx, Plan 9, AppleScript - these seem to be the best frameworks I've seen so far, and they're all isolated to ghettoes... we're still waiting for the GUI equivalent of the UNIX shell.
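To spell that framework out: every stage in a pipeline is a single-purpose tool, and the pipe is the only glue. For instance (log format assumed, for illustration):

    # ten most frequent client addresses in a web access log
    awk '{print $1}' access.log |   # extract the address field
      sort |                        # group identical addresses
      uniq -c |                     # count each group
      sort -rn |                    # rank by count
      sed 10q                       # keep the top ten

Nothing in the GUI era has matched that composability yet.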
Containment (Score:2, Insightful)
Formalized interfaces are good, and components are better optimized than launching a whole separate process.
Not in all cases. It's often easier for a program to contain a misbehaving component if the component runs in a separate process. For instance, if a web browser plug-in segfaults, do you want it to destroy the data you've entered into a form on another page?
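It's easy to demonstrate why a separate process is a safer container. A toy stand-in for a crashing plug-in (any POSIX shell; output shown approximately):

    $ sh -c 'kill -SEGV $$'; echo "parent survived; child status $?"
    Segmentation fault
    parent survived; child status 139

The child dies of the segfault; the parent keeps its state. An in-process component that dereferences a bad pointer takes the whole browser, and your half-filled form, down with it.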
Re:Containment (Score:3, Interesting)
In reality, both major browsers (IE, Moz) use component architectures, not separate processes, so I'm not sure your example is truly relevant.
Also, a formalized interface means two things that help stabilize components:
I was hoping for a bit more detail, too (Score:4, Interesting)
(I submitted this particular question, and appreciate the mod point.)
I was looking at it from a slightly simpler and broader angle: the functionality of discrete widgets. There are so many products (software in particular; computing devices in general) that are designed to be a single answer to all of the customer's needs. This is extremely difficult to do correctly, and many efforts end up as one or more of the following:
too hefty/bulky/bloated
too expensive
too resource-hungry (be it RAM or battery power)
too fragile (where one misbehaving widget causes a ripple effect throughout the device/app/entity)
performing several functions but not doing any one task particularly well
"Those days are dead and gone and the eulogy was delivered by Perl" seems to mean that we only need one tool to do our jobs, and that tool is perl. I respectfully disagree with this: perl is very handy, but it is not always The Right Tool for the Job(tm).
Rob - thank you for the answer.
Re:I was hoping for a bit more detail, too (Score:2)
I took his answer as more of a slam on perl than anything else, and that made me smile.
Re:#8 One tool for one job? (Score:3, Interesting)
Microsoft had an "object" model in COM, but that model only allowed extension of an object into a new object by composition; it did not support inheritance. The notion of "object" in VB 6 was the same way.
There is a school of thought that inheritance is to be avoided, that everything should be composition. Inheritance may be bad -- think of OO newbies making inheritance trees 12 levels deep. Also think how
Thus, the lesson is... (Score:2, Insightful)
Thus, today's lesson is: don't insert your own stupid analogies into the question just to appear intelligent.
I would have loved to see his response to the same question without the analogy. He would have been forced to answer, or explicitly acknowledge his dodging, if the submitter had merely posed the question by itself.
Obviously, as an employee of a corpor
One tool for one job? (Score:5, Funny)
Hey! Perl still adheres to the "one tool for one job" metaphor.
It's just that Perl's "one job" seems to be defined as "replace all the other tools"...
Re:One tool for one job? (Score:2)
Really? I thought it was more like "six different tools for one job".
askSam (Score:2)
Yep. Ask anybody who ever used askSam for their desktop database needs back in the day. Lordy, I miss that software. When was that, anyway? Back in the late 1980s? The brain's a little foggy today...
DragonFlyBSD (Score:4, Informative)
DragonFlyBSD has a system call layer [dragonflybsd.org] that would allow potentially very different interfaces to be presented to userspace stuff with essentially no penalty. This may allow newer ideas to be explored in a familiar environment.
Plan 9, Unix may not have it, but another OS does (Score:3, Interesting)
I hate to say this, but don't Windows 2000/2003 Server, Active Directory (and Novell NDS, etc.) do a lot of this? One set of users, a network of machines (without being reliant on one master machine*), and one security model. Maybe not quite there on 'one big file system', though that can basically be achieved with a bit of setting up.
(* I haven't managed a Windows domain for a few years; I seem to remember 2k had a PDC-like machine as such, but also with backup servers, ready to take over.)
Re:Plan 9, Unix may not have it, but another OS do (Score:3, Interesting)
No, what you need is a standalone system that you can build up into a distribute
Yes, you are nearly right. (Score:2, Informative)
NDS inherited limitations from the Mormon theology (for example: the concept of multiple roots is anathema in a genealogical database designed to relate all the descendants of Adam. Thus NDS could not handle multiple roots and separate trees had to be merged in order to
I always wanted to find out (Score:2)
Re:I always wanted to find out (Score:3, Informative)
Sic [not an acronym] Latin: thus; so (not a mistake and is to be read as it stands)
He got #5 wrong... (Score:5, Interesting)
I understand the idea that anything user-facing should probably be as simple as possible. This means that ideas that require user-supplied metadata (as the typical XML-in-filesystem ideas require) are probably not going to be successful. I also agree that Joe User doesn't care whether or not his data is stored in a RDBMS or in a plain text file if his search tool does a good job.
The phrase "structure is meaningless; search is king" is a non sequitur to someone aware of data management fundamentals. Structure gives meaning, which in turn allows you to relate the data to other data. The problem today is that we're creating data and storing it in 'plain text' (or flat file, proprietary, etc.) physical formats instead of storing emails, word processing documents, etc. in a RDBMS.
The RDBMS is more than simply a search tool: it has a sound model, provides for easier application development, etc. Wouldn't search be significantly easier if your data were given a consistent logical view? If you know the semantics of a particular piece of data, you no longer need to waste your time classifying it for search.
It seems that a proper solution would be that every PC contained a RDBMS, all data is stored in one, and that the internet would simply be a series of interconnected, distributed RDBMS (D-RDBMS). This idea would probably be fairly difficult to implement, but is already being performed at Google anyway (albeit in a slightly different format). Back when Codd developed the model he was primarily concerned with institutional databases -- centralized schema validation/data storage/etc. The problems implementing D-RDBMS products are not trivial, but then again are not insurmountable. The world has been able to standardize on protocols, etc. so I don't think it is out of the realm of possibility to suggest that different companies/users/applications could agree on a particular schema for, say, emails.
Re:He got #5 wrong... (Score:2)
The phrase "structure is meaningless; search is king" is a non-sequitur to someone aware of data management fundamentals. Structure gives meaning which in turn allows you to relate the data to others.
Or the form of the query in combination with a semantically agnostic indexing scheme (ie. PageRank but there are others) gives structure to the results which the user uses to give meaning to the data.
Re:He got #5 wrong... (Score:2)
1) Less complicated
2) More accurate
3) Easier to develop/debug (probably ties to #1)
And of course the end-user is going to derive more meaning from the data than computers can (currently, without true AI) provide. But the poin
Re:He got #5 wrong... (Score:3, Interesting)
PageRank is an algorithm of popularity and not an algorithm of relevancy and as such, it really bears little relevance to implementation of relevancy algorithms as we are discussing. Of course, relevancy algori
Re:He got #5 wrong... (Score:4, Insightful)
I mentioned in another post that SQL products are NOT RDBMS
Ah, a disciple of Date. If we're going to switch vocabularies that's fine. In the wider world DB2, Oracle, SQL Server etc. are synonymous with RDBMS and the terms are used interchangeably. You are of course correct that technically that's incorrect, but the technical point has largely been ignored by the computer industry, users and developers. I can work with your definition as well.
PageRank is an algorithm of popularity and not an algorithm of relevancy and as such, it really bears little relevance to implementation of relevancy algorithms as we are discussing. Of course, relevancy algorithms could contain page rank as a heuristic. See http://www.google.com/technology/
My fault for being imprecise, PageRank is only part of Google's search algorithm and I used it to refer to the whole. The point about scalability still stands (I'll get to why Oracle et al. are a propos even though they aren't RDBMSes in a bit).
In order for you to create a document (in the New World Order there really is no such thing as a web page any more) about Atlantic Slave Trade, you would have to have some sort of schema that defines it (by definition, it would require one).
I think this is a case of worse is better. The New World Order may never be imposed because ignorant neophytes have gone and run with HTML in directions the high priests never intended. Lacking any armies with which to enforce compliance, it's an open question whether the high priests will ever be able to control the chaos again. Technically superior solutions (and I actually agree with you here in the beauty of a real RDBMS as opposed to what's marketed as such) may have aesthetic superiority, but just as real RDBMSes will never replace Oracle, DB2 etc. until either the replacement cost is 0 or there is some orders-of-magnitude greater functionality not available in other systems, so those waiting for the NWO may still be waiting on their dying day.
How does Google know that "my trip" refers to the trip that you took from 01-OCT-1995 to 20-OCT-1995 with three friends? What about the content of the pictures?
It doesn't, and it doesn't have to. The query "my name Versailles pictures" is probably good enough to find those pictures (and 10,000 unrelated items, but the one I'm interested in is easily identifiable). And there's the problem with replacing Google: it's good enough for so many tasks that a replacement would have to be orders of magnitude better to displace it. As for when the pictures were taken, or what their contents are, I get that information the same way I get it now: from the file metadata, assuming it's correct. On the other hand, every photo organization program available (iPhoto, etc.) allows the user to add all kinds of metadata to search on. And guess what? Most users don't, because the last thing you want to do after taking 200 pictures on a trip is spend two days typing captions for all of them. People still identify pictures the way they did with old-fashioned photo albums: by context, comparing them with pictures of similar scenes or using time information (clothing style, etc.) to fill in details that may have been forgotten. This is easy enough for human intelligence to do, and so far beyond current AI ability, that I don't see any application improving on this model any time soon, although at some point someone may.
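If you want to see how little machinery "good enough" takes, here's a toy scorer over filenames plus whatever free-form metadata happens to exist (the data is invented; a real engine obviously does far more):

    # Toy "good enough" search: rank items by how many query terms
    # appear in the filename plus any free-form metadata they carry.
    items = [
        {"name": "img_0042.jpg", "meta": "versailles trip october 1995"},
        {"name": "budget.xls",   "meta": "q3 numbers"},
        {"name": "img_0043.jpg", "meta": "versailles fountain"},
    ]

    def score(item, terms):
        text = (item["name"] + " " + item["meta"]).lower()
        return sum(1 for t in terms if t in text)

    query = "versailles pictures".split()
    for item in sorted(items, key=lambda i: -score(i, query)):
        print(score(item, query), item["name"])

The term "pictures" matches nothing at all, and the right photos still float to the top; that's the whole good-enough argument in a handful of lines.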
Finally, there is no requirement for a 'formal language' - when you do a Google search do you have to specify a formal language? That is a matter of implementation
But there are no implementations that don't have some form of formality involved. The problem is that I (and the vast majority of people) have many bytes of free-form text that Google indexes just fine. Simple Vector Quantization (I use a Mac) works fine for searching for documents locally on my machine, and my guess is that Google's algorithms are some form of vector quantization.
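For what it's worth, here is roughly the shape of that kind of vector-space matching, with a brute-force cosine scan standing in for the codebook lookup a real vector quantizer would do (toy corpus, invented names; I have no idea what either tool actually does internally):

    import math
    from collections import Counter

    # Toy vector-space document search. A real vector quantizer would
    # cluster these vectors into a codebook and scan only the nearest
    # cell; the corpus here is so small that a full scan stands in.
    docs = {
        "trip.txt":   "versailles trip pictures friends",
        "work.txt":   "quarterly report numbers",
        "europe.txt": "paris versailles louvre pictures",
    }

    def vec(text):
        return Counter(text.split())          # bag-of-words term counts

    def cosine(a, b):
        dot = sum(a[t] * b[t] for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    q = vec("versailles pictures")
    for name, text in sorted(docs.items(),
                             key=lambda kv: -cosine(q, vec(kv[1]))):
        print(round(cosine(q, vec(text)), 2), name)

No schema anywhere, and yet the ranking is useful; that's the free-form-text case the relational model has to beat.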
Re:He got #5 wrong... (Score:4, Interesting)
Simply because most of the world is ignorant does not mean we should willingly embrace that ignorance. Hence, I try to use correct terms whenever possible (RDBMS vs. SQL DBMS, Cracker vs. Hacker, the Terrorist Attacks of September 11, 2001 vs. 9/11, etc.). But that is neither here nor there.
I'll address your points briefly before I get to the reason I posted in the first place.
"Technically superior solutions may have aesthetic superiority"
That seems like a contradiction. Something that is "technically" superior (I am assuming you mean 'of a technical nature' and not 'abstractly') is certainly more than aesthetically superior!
RDBMS certainly have considerably more functionality than SQL DBMS products. This is clear once you read the original theories and the foundations behind them. Your sentence illustrates the myth that, in the IT industry, technically superior products will rise to the top. Your mention of "worse is better" (I really, really hate that title, it should really be "Worse is Sometimes More Marketable" or the like) reinforces this point exactly.
The query "my name Versailles pictures" is probably good enough
It is good enough only in the micro. There are statistics on the geometric (maybe even exponential) rate at which we are creating and storing new data. Sure, for your current family album this level of granularity may suffice, but I suspect that in the future our family albums will be composed of video, audio, stills, etc. at a magnitude that makes 10,000 results impossible to sort through by hand. You'll require more accurate search results and will want to ask more precise questions. The RDBMS is the way to get this; read my other posts in this thread for some suggestions re: metadata. In short, tagging metadata is obviously not a 'solved problem' yet, mostly because no one has seriously tried to study it, and having a complete RDBMS there would aid immensely with relating and tagging your information. Properly implemented (whatever that may be), I would think there would be little typing required.
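To show what "more precise questions" buys you, here is a sketch of the Versailles query from upthread as a real question rather than a keyword guess. The schema is entirely invented and sqlite3 stands in for a real RDBMS:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        -- Invented schema for the family-album example upthread.
        CREATE TABLE trip  (trip_id INTEGER PRIMARY KEY, place TEXT,
                            start_date TEXT, end_date TEXT);
        CREATE TABLE photo (photo_id INTEGER PRIMARY KEY,
                            trip_id INTEGER REFERENCES trip(trip_id),
                            taken_on TEXT, caption TEXT);
        INSERT INTO trip  VALUES (1, 'Versailles', '1995-10-01', '1995-10-20');
        INSERT INTO photo VALUES (1, 1, '1995-10-03', 'fountain');
        INSERT INTO photo VALUES (2, 1, '1995-10-05', 'hall of mirrors');
    """)

    # "Pictures from my 1995 trip" as a precise question, not a guess.
    rows = conn.execute("""
        SELECT p.taken_on, p.caption
          FROM photo p JOIN trip t ON p.trip_id = t.trip_id
         WHERE t.place = 'Versailles'
           AND p.taken_on BETWEEN t.start_date AND t.end_date
    """).fetchall()
    print(rows)

No 10,000 spurious hits: the answer is exactly the photos from that trip, because the relationships are recorded rather than guessed at.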
The reason I decided to post my initial reply was that this was a Q&A with a guy at Google. If there is one company that could/would implement a D-RDBMS, it would be Google.
It's obvious that Microsoft, Oracle, et al. would not lead the way in this sort of innovation. Their products, marketing strategies, and internal politics would not allow a TRDBMS to be at the core of any Microsoft operating system or Office suite they ship, nor would Oracle want to adapt to something that required a shift away from SQL or allowed easy migration to a competing product.
That brings us back to Google. Google is just the right kind of company to pull it off: it's got the technical expertise, name recognition and reputation, and the willingness to truly revolutionize the way we work with computers.
Ideally, Google would start using a form of RDBMS for all the search indexes it creates for its desktop search tool (I don't know what kind of DB it uses now). It would take a given document, rip it into the RDBMS, and then allow for searching. Since Google has virtually written the textbook on large-scale data distribution, it could load your local DB into its pool, so that whenever you log into Google.com you can search (and, with enough bandwidth, retrieve) your information anywhere, any time (this would be perfect for companies trying to manage data for projects, etc.).
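I don't actually know what the desktop tool uses internally, so treat this as pure speculation, but locally the rip-it-into-the-DB step could be as simple as this (all names invented):

    import sqlite3

    # Speculative sketch of "ripping" a document into a local
    # relational index; table and column names are invented.
    conn = sqlite3.connect("desktop_index.db")
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS document (
            doc_id INTEGER PRIMARY KEY,
            path   TEXT UNIQUE
        );
        CREATE TABLE IF NOT EXISTS posting (
            term   TEXT NOT NULL,
            doc_id INTEGER NOT NULL REFERENCES document(doc_id)
        );
        CREATE INDEX IF NOT EXISTS posting_term ON posting(term);
    """)

    def rip(path, text):
        conn.execute("INSERT OR IGNORE INTO document(path) VALUES (?)",
                     (path,))
        doc_id = conn.execute("SELECT doc_id FROM document WHERE path = ?",
                              (path,)).fetchone()[0]
        conn.executemany("INSERT INTO posting VALUES (?, ?)",
                         [(t, doc_id) for t in set(text.lower().split())])
        conn.commit()

    rip("/home/me/trip.txt", "Versailles trip pictures")

    # Because it's all relations, syncing to a server-side pool could be
    # as dumb as shipping rows, and any other application can query it.
    print(conn.execute(
        "SELECT path FROM document JOIN posting USING (doc_id)"
        " WHERE term = ?", ("versailles",)).fetchall())

And exactly because it's relations rather than an opaque index file, the extension scenario below falls out for free.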
But, since it was in an RDBMS, other applications could be written to extend the idea. I could extend my product with the Google tool by storing my data in some format edible by the search tool. I now have Google Search built into my application. Or, I write a different UI which allows you to abandon the Windows "Explorer.exe" altogether - it gets rid of the archaic 'files' an
Re:He got #5 wrong... (Score:3, Insightful)
Your sentence illustrates the myth that, in the IT industry, technically superior products will rise to the top.
I think I actually implied otherwise: that there are a host of reasons why products and ideas succeed in the marketplace, technical superiority being only one of them - time to market, capitalization, flexibility of the developer (often products succeed in markets other than the original target; Java, for example), etc. As an example, consider Ted Nelson's Xanadu, arguably a superior system to HT
What a prick, seriously (Score:2, Insightful)
I asked in my +5 Interesting post why modern OSes are written in the lowest-level practical modern language (C) instead of the highest-level ones. I noted that at the time C was one of the highest-level languages and that this was behind Unix's success.
Instead he wastes his space completely dodging a decent question on patents, and takes some softball OS question so he can spout off about how his baby Plan 9 is marginally better (you can pass any number of parameters on the command line). Wtf?
Also his analo
Musical analogies (Score:5, Interesting)
No, it's more like listening only to music composed before Schoenberg. Those of us with taste recognise that most of the stuff produced since then is either pretentious cacophony or ignorant, cynical, commercial bilge.
Thus WinXP is to Unix what Britney Spears is to Beethoven. Plan 9 would be some anachronistic romanticism, like Pfitzner or Elgar.
Re:Musical analogies (Score:3, Interesting)
Steady on, old man, steady on. You're getting awfully carried away...
---
I like to use lines like that. "There hasn't been any good music since Joplin died... no, I mean Scott Joplin..."
Windows XP is like the Monkees. It's not just commercial pap, it's old commercial pap.
Plan 9? Plan 9 is Jazz.