[digital media semiology]
by Geoff Nunberg – NPR.org – October 25, 2011
Listen to the story here: http://www.npr.org/player/v2/mediaPlayer.html?action=1&t=1&islist=false&id=141655550&m=141655576
Steve Jobs gave his last product launch this past March, for the iPad 2. At the close, he stood in front of a huge picture of a sign showing the intersection of streets called Technology and Liberal Arts.
It was a lifelong ideal for Jobs, the same one that had drawn him to make his famous 1979 visit to the Xerox Palo Alto Research Center, or Xerox PARC for short. That was where a group of artistically minded researchers had developed the graphical user interface, or GUI, which Apple’s developers were to incorporate into the Lisa and the Macintosh a few years later.
The interface bundled pretty much everything we take for granted now: the mouse and windows, icons and pop-up menus, bitmapped displays where what you saw was what you got. With point-and-click and drag-and-drop, it moved the wheels and gears of the technology out of the line of view. You could engage directly with texts and images, free of the geeky tyranny of command-line instructions like “copy a:\filename c:\filename.” Suddenly, operating a computer could be an aesthetic experience. Who knew?
I wound up at Xerox PARC myself a few years after Jobs’ visit. The director John Seely Brown was bringing in linguists, anthropologists, psychologists and even philosophers and artists, the idea being that no technology as intensely social as this one should be entrusted to engineers to figure out. It made for a lively lunchroom, and we got to use these amazing workstations that were more sophisticated than anything Apple would be selling for the next 15 years.
But in those days, the Xerox Corporation was a complacent office-equipment company that had no idea how to get most of its researchers’ insights out the door. It was left to Apple to make a first installment of that vision accessible to a wide public. With just 128k of memory, the first Macintosh landed with a modest thump, not the crash of a hammer. But the echo is still audible.
The second installment of that vision would have to wait until Jobs returned to Apple in the late ’90s. In the meantime, the Internet had come of age, chewing up a succession of prefixes along the way. First there was cyber-, which conjured up the opening sequence of Star Trek, a vast universe on the other side of the screen that we could cruise from our own private holodecks. Then came the Internet boom and the prefix e-. The image here looked less like Star Trek than Mall Rats, with virtual arcades lined with virtual businesses purveying virtual wares. After a while, that prefix was getting old, too, as the boundaries between online and offline got blurry: Where do the real banks and newspapers leave off and their e-clones begin?
And then there was i-. The prefix had actually been around for several years before Apple adopted it for those gumdrop-colored iMacs that Jobs introduced in 1998. According to Apple’s ad agency, i- was meant to stand for “Internet,” with overtones of “individual” and probably the first-person pronoun as well. But the meaning of a product name isn’t something you fix in advance. It has to accumulate bit by bit, like dust bunnies. By the time i- was fleshed out, Apple had transformed itself from a culty computer-maker to a major religion. At least, that was the impression you got from the global surge of sentiment that Jobs’ death evoked.
A lot of that feeling was rooted in the obsessive attachment that people form with their i-devices, particularly the mobile iPods, iPhones and iPads. In a period when others saw personal technology becoming a mere commodity, Jobs showed how to make high-tech appliances that were as easy to fetishize as a Rolex watch. That obviously owed a lot to Apple’s genius for design, from the interface and hardware down to those nested white boxes. My sister once said that getting a Mac product was like getting a present from Tiffany’s — the only thing missing was the blue velvet pouch.
But the i-devices are more than just elegant and easy to use. They also invite fondling, even when they’re hidden away in your pocket. In a way, they’re less like a Swiss watch than a blankie or night-night — those transitional objects that children clutch to allay their separation anxiety.
There’s a bit of that in every cellphone, but the Apple i-objects are standing in not just for absent loved ones, but absent music collections, TV shows, restaurant reviews, driving directions and baseball scores, not to mention those aggravated avians. We don’t think of those things as floating out in cyberspace anymore, or as sitting on the shelves at some e-mall. They’re right here, literally on hand; nobody needs to be separated from anything anymore.
The i-devices pushed the Internet out of our consciousness, just as the Mac did with the operating system. And the prefix that began as an abbreviation for “Internet” wound up as a mark of its disappearance. Make that i- for “immanent.”
You could argue that Jobs’ greatest achievements in that final period didn’t owe anything to his engineers or designers. They were the deals he made that brought the world to our fingertips — getting the media companies to make their content available through a single turnkey service and opening the iPhone app store to hundreds of thousands of third-party developers.
But really that was just one iconic instance of a broader transformation in the role of technology. When you think of the digital phenomena that have changed the face of daily life over the past 15 years or so, the breakthroughs are less technological than social. They’re things like blogs and Twitter, Craigslist and Wikipedia, social networks and Internet dating. Or think of the way cheap cellphones are shaping popular political movements and helping African fishermen find out which port is paying the most for their catch. This isn’t just about hardware and software anymore.
It isn’t just about computer science anymore, either. That isn’t where you go to find out how technology changes people’s lives, and where it fails them, or how to make it less intrusive and more humane. Those are the questions people are taking up at the Schools of Information that have sprung up at research universities like UCLA, Toronto and Washington — iSchools, for short. It’s a different i-, but it too stands in for a connection between technology and the social world.
I wound up at the one at Berkeley, surrounded by another bunch of anthropologists, historians and legal scholars, with techies and humanists to fill out the ends of the lunch table. But nowadays it isn’t odd to find technology and liberal arts intersecting on the campuses of Google and Microsoft, either. Jobs knew better than anyone that it’s a bit trickier to make the final leap to artistry. But we’re closer to his interdisciplinary vision. As Victor Hugo might have put it, nothing is as powerful as an i- whose time has come.