Possibly because software is marketed very differently from hardware. A given piece of commodity hardware drops in price as newer hardware arrives; a given piece of software is sold at full price and then withdrawn. (Forgive my ignorance of pricing models, though).
> Since there are quite a few hardware and software experts on this
> list, I'm wondering what you all think. Is software quality improving
I wouldn't say that I'm a software expert but I play one on tv. No, I'm not on tv, but I'm a technical lead at a large software integrator, and thus entitled to gripe about the state of the industry. My observations with respect to your questions follow:
* Software quality is decreasing, not increasing. Although programming techniques, such as object-oriented programming, can decrease the number of errors per line of code, code bloat and creeping featurism (more on those later) are more than making up for this. There are also two rather disturbing trends in commodity software development:
* vapourware - selling a product that doesn't exist yet, with features that haven't been created yet. This is a failure of sales.
* shipping incomplete products - as a product expands in scope, its deadlines become less and less likely to be met. The product is either delayed or shipped before testing is complete; in practice, that means shipped incomplete.
The huge size of new platforms is also rendering old testing methods useless. Win 2000, for example, required that a whole new testing paradigm be explored. It is entirely possible that these new testing protocols will intermittently fail to deliver. (Most of the Windows operating systems are excellent examples of software shipped too early, but then so is almost every shrink-wrapped application.)
Creeping featurism is driven by sales and marketing, but also by architects and designers who try to include all the features for all the users. An average user uses less than 10% of the functionality in MS Word - but it isn't the same 10% for every user. Functionality is driven by user research, and I don't think any of it is added on a whim - it's just that the surplus of functionality makes the user interface clunkier and clunkier and the application harder to use.
* The actual performance of computers - their raw ability to crunch numbers - is increasing. However, we use that power for different things now. Running one's handy-dandy graphical user interface takes a huge amount of computation; games are even more power-hungry. So you're doubly right - computers are indeed getting faster, and the extra power is being consumed by software as fast as it can be produced.
Cheers,
Marco
,--------------------------------------------------------------------------.
> Marco Anglesio | We think in generalities, <
> mpa at the-wire.com | but we live in details. <
> http://www.the-wire.com/~mpa | --A. N. Whitehead <
`--------------------------------------------------------------------------'