The BBC is running an article about the previously mentioned study on the environmental impact of computers.
A big part of this is that buyers fixate on raw processing power. Obviously, a 4GHz processor is more desirable than a 1GHz, or *gasp* a 950MHz like I'm using. Guess what, though: my 950 runs almost any app _just_ fine, and for gaming, my sweet ass video card makes up the difference- and really, my card is overkill for what I use it for. Anyway, back to the point: as you upgrade to faster processors, the heat output climbs rapidly. Even my 950 wastes a tremendous amount of energy as heat.
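To put some rough numbers on why the heat climbs so fast: dynamic power in a CMOS chip scales roughly with switched capacitance times voltage squared times clock frequency, and faster parts usually need a higher core voltage too. Here's a back-of-envelope sketch; the capacitance and voltage figures are made-up ballpark values of mine, not specs for any real chip:

```python
# Rough back-of-envelope math: dynamic CMOS power is roughly P = C * V^2 * f.
# The capacitance and voltage numbers below are made-up ballpark figures,
# not specs for any real processor.

def dynamic_power_watts(capacitance_nf, core_voltage, freq_ghz):
    """Approximate switching power: P = C * V^2 * f."""
    return (capacitance_nf * 1e-9) * core_voltage ** 2 * (freq_ghz * 1e9)

# Faster parts tend to need a higher core voltage, so the heat output
# grows a lot faster than the clock speed does.
for label, volts, ghz in [("~1GHz part", 1.4, 1.0),
                          ("~2GHz part", 1.5, 2.0),
                          ("~4GHz part", 1.7, 4.0)]:
    print(f"{label}: roughly {dynamic_power_watts(15, volts, ghz):.0f} W of heat")
```

Quadruple the clock and bump the voltage, and you're looking at several times the heat, not several times the usefulness.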
And that's not even looking at the production side. Making chips uses tremendous amounts of lead and mercury. Behind the chip factory where my father works is a dead-zone dumping ground for these chemicals. He goes through haz-mat training on a regular basis to deal with accidental spills.
We're nearing the upper end of what's possible with semiconductor technology, so it's time to refine what we have instead of pushing for ever-faster chips. We don't need them- what we use our computers for doesn't need that kind of power. I rarely use 100% of my processor for more than a few seconds. I'd be more interested in cutting the power consumption of the computer I use.
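If you're curious how little of your own processor you actually use, here's a quick sketch that samples utilization once a second for a minute. It assumes Python with the psutil package installed, which is my assumption, not anything from the article:

```python
# Quick sanity check on the "I never peg my CPU" point: sample overall
# utilization once a second for a minute and see how often it gets near 100%.
# Assumes the third-party psutil package is installed (pip install psutil).
import psutil

samples = [psutil.cpu_percent(interval=1) for _ in range(60)]
busy_seconds = sum(1 for s in samples if s >= 95)

print(f"average CPU load: {sum(samples) / len(samples):.1f}%")
print(f"seconds at or near full load: {busy_seconds} of {len(samples)}")
```

Run it while you're doing your normal browsing and email and see how often that number even gets close to maxed out.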
Also, disposal is an issue. Really, that old 333MHz computer is far from useless. I've still got a 333 board in my closet...
I'm kinda rambling at this point, but I think people can see where I'm going.