This paper is possibly the grimmest thing I've read about computer science in years. It covers a lot of why I think working in the industry is boring, and a lot of why I haven't gone back to grad school. I don't agree with all of it, though, and the suggestions aren't really suggestions, more "just do the right thing already" bitching that doesn't actually get anything done. Of course, I don't have the answers either, or I'd be pursuing them.

A lot of the trends he points out have continued. There are precious few new and interesting OSes out there, and the ones that do exist don't seem to get any traction. It's impossible to market a totally novel chip architecture; even Intel can't manage it, and no one in the world has more leverage or more money to spend on getting people to adopt and write software for something new. Meanwhile, an enormous amount of money goes into building deeper and deeper emulation and virtualization layers just to get the stuff we already have running on new systems and chips. The trouble is that all of this is fundamentally boring.

The OS-and-applications chicken-and-egg problem is huge, and it's likely to grow, with no end in sight. I think that if you asked most people today what the answer is, they'd probably say the web, which is plausible. If most applications are delivered that way, then a new system really only needs a few things: a windowing system, a C compiler, a text editor, and a web browser. That might let some interesting new operating systems sprout up, and it would lower the cost of migrating between them, which is always a good thing. But it also reduces the urge to innovate or to adopt new systems: if most of what you're doing is on the web anyway, and your current system is good enough, there's little real benefit to switching.
Which brings us to the crux of the issue: there are few genuinely new problems that we're trying to solve with computers. I can't, off the top of my head, think of a new CS problem that I've heard people talking about. Ubicomp, maybe, but most of the applications I've heard proposed for it are just retreads of older ideas. There's still plenty of interest in the old problems, and plenty left to solve within them, but there aren't many actual new topics. Nothing new under the sun? Maybe, but I feel like there must be something we're missing: whole categories of new things to do with computers beyond making better web servers at the top end and better web browsers at the bottom end.