November 30, 2012

Solved

I recently set aside my aging iPhone 3GS for a new iPhone 5. Naturally, the latter covers all the bullet points expected of an update to a consumer electronic device: It’s faster, thinner, bigger-screened. Yet as much as these iterative advances improve the day-to-day experience of using the device, they add up to a tradeoff.

One gives up several things in exchanging a 2009 smartphone for a 2012 one. It might sound obtuse to say the things given up include low pixel density and time spent waiting for things to load, but these are more than annoyances made perceptible by the march of technology: They are connections to the medium. They are the signatures of the technology we use, bonds to time and place forged in memory; over time they become the familiar sensations of home.

In exchange for these connections to the medium, upgrades give us abstraction from it, the ability to perform tasks less encumbered by the technology’s inherent compromises.

Dissolving with the pixels

The history of raster-based computer displays may be seen as a single thread of increasing medium-abstraction from the technology’s earliest green-phosphor text terminals through today’s Retina displays. The experience of using the oldest screens was deeply connected to the limitations of the technology: Far from reproducing photographs in the millions of colors discernible by humans, images were limited to a single color and two intensities; even such screens’ greatest strength, text, was far removed from capturing the subtleties of centuries’ worth of typographic refinement. In the use of these technologies, the medium itself was ever-present.

As graphics technology improved over the next few decades, the technology itself began to abstract away as images could be reproduced at greater fidelity to the human eye and typography could be rendered with at least a recognizable semblance of its heritage. With high-DPI displays, the presence of the medium is all but gone – while dynamic range and depth cues may yet evade modern LCDs, the once-constant reminder that you are viewing a computer display has become so subtle as to escape notice.

Computation, time, and distance

Every time you wait for a computer to catch up with you, whether it’s a second or two for a disk cache or an hour for a ray-traced image to render, you experience a signature of the medium in which you are working. Waiting for a document to save in HomeWord on an 8088 was a strong reminder that you weren’t dealing with paper. Invisible, automatic saving in Apple Pages lends a physicality to the document on which you’re working, abstracting away the volatile nature of the medium.

A significantly faster network connection, such as the leap from 3G to LTE, further abstracts the already unimaginably abstracted distances of the Internet. As this abstraction increases, our expectations adjust accordingly, pointing to a change in our mental models. I still recall the first time in the 1990s when I loaded a web page from outside the US, imagining the text and images racing over transatlantic cables as they piled up in the browser. A 20-megabit connection leaves no temporal space for such imagination.

The last one you’ll ever need

For the past two years, during the ascendancy of “retina”-DPI displays, it has seemed plausible that the industry is at last approaching a point in display technology where further innovation won’t be necessary—displays could be “solved,” having reached the apotheosis of their abstraction. As Moore’s Law continues to conspire with faster networks and better UI design to melt away all the other aspects of the tool-ness of the digital tools we use, our consciousness of those tools predictably becomes less pronounced. In the long run, more responsive, more reliable, more accurate, more abstracted interfaces trend toward invisibility.

Given enough time and enough iterations, can the technology and design of an interface simply be solved, in totality, like the game of checkers? Can it be abstracted away entirely, leaving perceptible only user intent and system response? Can we ever become truly independent of a medium—visual information matched to the limits of human vision, latency for every network request below the threshold of human perception, and a UI with nearly zero cognitive load?

When we’ve lost the last traces of the “computer-ness” of a computer, will we have lost something meaningful? Or will our only loss be of fodder for nostalgia?