What does it mean that computing power is so inexpensive? Is the world any better for this?
Word has come of a new Nokia smartphone, the 808 PureView, with a camera that sports a 41-megapixel sensor. That's not a typo; the phone has a super-high-res camera built into it. But why? Why would any camera need such resolution, much less one disguised as a mobile phone?
Not simply to make sharper images, it turns out. Snapping photos with so many pixels, Nokia officials explain, actually makes it possible to zoom in on scenes digitally with no appreciable loss of detail. That's in lieu of a complex and costly optical zoom lens. (Digital zooming is a matter of simply cropping an image or discarding pixels, for a tighter close-up.) Building a good zoom lens into the body of a phone that's only the thickness of a pancake is quite difficult, so Nokia decided to just throw pixels at the problem.
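The crop-based zoom described above can be sketched in a few lines of Python. The sensor dimensions below are illustrative stand-ins (chosen to total roughly 41 megapixels), not a claim about Nokia's exact sensor geometry:

```python
import numpy as np

def digital_zoom(image: np.ndarray, factor: float) -> np.ndarray:
    """Digitally zoom by cropping the center of the image.

    A factor of 2.0 keeps the central half of each dimension and
    discards the rest -- no optics involved, just fewer pixels.
    """
    h, w = image.shape[:2]
    new_h, new_w = int(h / factor), int(w / factor)
    top = (h - new_h) // 2
    left = (w - new_w) // 2
    return image[top:top + new_h, left:left + new_w]

# An illustrative ~41-megapixel sensor frame (7728 x 5368 pixels).
sensor = np.zeros((5368, 7728, 3), dtype=np.uint8)

# Even after a 3x digital zoom, roughly 4.6 million pixels remain --
# still far more than a full-HD (about 2-megapixel) frame needs.
zoomed = digital_zoom(sensor, 3.0)
print(zoomed.shape[0] * zoomed.shape[1])
```

The arithmetic is the whole trick: start with enough pixels, and throwing most of them away still leaves a usable image.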
And why not? Pixels are cheap and getting cheaper all the time. A few years ago, of course, even high-end digital cameras had only 10 megapixels.
Something similar is going on all across the IT landscape. As the costs of computing cycles, memory, mass storage, and bandwidth continue to fall, these resources come to be consumed -- or worse, it might seem, left idle and essentially wasted -- in quantities that would have made earlier generations gasp in disbelief.
One often hears about desktop computers or even smartphones that have more oomph than a mainframe from 1970, or some such. Yet that mainframe cost so much that its owners had to do everything possible to keep it busy, while the smartphone's and PC's circuits remain pretty much idle most of the time. (It's the same with sportier cars, whose extra power gets used mainly for getting through yellow lights or changing lanes on the highway.)
Now, one can take this argument in many directions. From a purely physical point of view, today's computers are essentially disposable items, all destined to end up in a waste dump somewhere. Like it or not, obsolescence is a given with computers, and it's not clear that efforts at recycling e-waste are keeping up with the flow of discarded equipment.
More intriguing is the notion that with gobs of cheap computing available, information technology gets to be too cheap to meter -- a phrase once famously applied to the abundance of electricity that atomic energy would supposedly make available at some time in the future. As far as I know, atomic energy has not added a single net kilowatt-hour to the grid. And as yet, there is no source of free computing power available. But it is true that in the largest datacenters right now, the highest expenses are no longer the computing gear itself but the electricity needed to run and cool that machinery.
In 2009, Chris Anderson, editor of Wired magazine, published a well-received book, Free: The Future of a Radical Price, about how the Internet was bringing into play all sorts of business strategies based on the idea of giving stuff away that once was charged for. Bands, for instance, give out their music on the Web and try to make money from live gigs.
Of course, there's a major paradox lurking behind the inarguable abundance of computing power we all enjoy right now: A great deal of this computing power, and perhaps even most of it, is used to manage scarcity in one way or another. Marketeers use analytics to help drive demand for goods that are scarce, and thus can be charged for. Likewise, the "Internet of things," with buildings and the landscape peppered with physical sensors, is largely about metering usage of scarce commodities. Modeling the earth's climate and the effects of industrial gases is essentially an exercise in managing scarce resources.
Paradoxically, this era's amazing and accelerating wealth of computing power is shadowed by increasing scarcity in many other realms.
The moral of the story? Perhaps only that things are not always what they seem. Or, as Nokia's phone seems to show, the most pleasing images often are those from which distracting elements have been removed.