Personal History: Steve Jobs and the Singularity

Posted: November 16th, 2011 | Filed under: Books, Design, Long, Tech | 6 Comments

“The ghost was her father’s parting gift, presented by a black-clad secretary in a departure lounge at Narita.” –  William Gibson, Mona Lisa Overdrive

1. Unboxing the Past

It was a few days after Steve Jobs died, and I was talking with an old friend. Actually, we were unpacking an original Macintosh computer, which we’d brought down from my attic to try to restart as a kind of impromptu homage. The golden October sun poured in my office windows. We were here to talk business, but it wasn’t long before the subject changed to Jobs. It was sad of course, we said, but totally expected.

This computer hadn’t really been used since 1990. It must have been ten years since I’d last seen it, hastily wrapped in an old curtain and stuffed into a plastic storage bin before my last move, another memento to hang onto. Now picking up the compact and surprisingly heavy beige box with its integrated handle somehow activated old muscle memory, and we remembered seeing it for the first time in 1984, our younger selves standing in the Harvard Coop before this strange new thing.

Young readers, perhaps you really had to be alive back then to fully understand its resemblance to an object from the future. We were like those babies using iPads that you see on YouTube now. We hadn’t seen anything like it, and yet we’d been waiting for it. Now it smelled old and the molded plastic case was yellowed. But when we flipped the switch once more, it came to life and the smiling icon appeared on the absurdly small nine-inch screen, just as it had in the grimy Boston of 1984.

Guys of a certain age like us don’t get too emotional about the news of the day. It’s always something. And we’re a little cautious about nostalgia, a wound best not re-opened. In sunny 2011, my old friend and I laughed at the ridiculous little Mac smiling back at us.

2. Steve Jobs

Maybe you’re tired of hearing about the kid who was given up for adoption and grew up in a young Silicon Valley; had a summer job at H-P, dropped out of Reed College after one semester and travelled in India; came home and invented the personal computer in a garage with his friend Steve Wozniak; brought us the graphical user interface, the mouse, and the laser printer; built the fastest-growing public company in history as of that time and then was forced out in a power struggle; struck out with his NeXT computer company, but then struck gold with animation studio Pixar, launching a new golden age of animation and becoming the single biggest shareholder of the Walt Disney Co.; returned to Apple and brought it back from the dead to become the most valuable company in the world, with a decade-long string of home-run products, re-inventing the music industry and the telephone and even the PC industry that he had launched 30 years before; the iMac, iPod, iTunes, iPhone, iPad, iCloud; the guy with the “think different” slogan and the designer’s eye, always making technology simpler and better; a cross between Horatio Alger, Henry Ford, Bob Dylan, and Thomas Edison; a hero to investors, geeks, creatives, college dropouts, second-chancers and comeback kids; the tyrant and the Svengali who drove people to fulfill his vision through fear, persuasion, and force of will, but who met his match in the disease that killed him slowly over seven years; cancer and a liver transplant and all the king’s billions and horses and men couldn’t stop it; and still up on the stage wasting away, but always with “one more thing,” like Apple’s new 2.8 million square-foot flying-saucer-like headquarters; and finally resigning but still working and working until he was too tired to climb the stairs in his own home; and then dead a few days later, and then there was no one more thing.

A lot of words have been spilled about Jobs in the last month. Is there anything that hasn’t been said already? Walter Isaacson, the author of previous works on Benjamin Franklin and Albert Einstein, begins his new biography of Jobs by comparing him to Shakespeare’s Henry V — a “callous but sentimental, inspiring but flawed king.” He was a genius, a jerk, an inventor, a dreamer, a business tycoon, a magician, a tortured perfectionist, or a con man. A designer, a hacker, a hustler, a hippie. It seems kind of ridiculous to say about a billionaire that “he didn’t sell out.” But what’s the meaning of all those Post-it notes and flowers on the outside of Apple stores? Whatever you think, he cast a big shadow over the last 30 years.

“Well, he was one of us,” my old friend offered, tapping a 3.5″ diskette against the desk. We were having trouble getting the Mac to read these old disks. “I’m almost exactly the same age as Jobs.”

It’s true, for us Baby Boomers, Jobs was an iconic generational figure, a strange mix of hippie, yuppie, techie, and more. There at every stage from the 60s to the Millennium. Like Zelig, or Forrest Gump. And there’s something strange about the acceleration of his career, and of Apple, since the turn of the century, like a rocket powering up to escape velocity.

So on one level we agreed, it’s not about Jobs, it’s about us. Jorge Luis Borges: “After the age of fifty, all change becomes a hateful symbol of the passing of time.”

3. 1963

In a Computerworld interview in 1995, Jobs described growing up in the early 1960s in Mountain View, California: “Silicon Valley for the most part at that time was still orchards – apricot orchards and prune orchards – and it was really paradise. I remember the air being crystal clear, where you could see from one end of the valley to the other.

“It was a very interesting time in the United States,” he said. “America was sort of at its pinnacle of post World War II prosperity and everything had been fairly straight and narrow from haircuts to culture in every way, and it was just starting to broaden into the 60’s where things were going to start expanding out in new directions. Everything was still very successful. Very young. America seemed young and naive in many ways to me, from my memories at that time.”

Of course, Star Wars director George Lucas himself could hardly have scripted a more fortuitous place for our young hero to land, at what would soon become the epicenter of the technology world.

I grew up in Florida, not Silicon Valley, but it was the same clean new world, before 50 years of progress erased it from view. Riding my bike through flat, treeless subdivisions of new houses, a transistor radio hanging from the handlebars. Packs of kids on every street. Or at home, watching Star Trek on TV.

I go to my hometown now and everything is gone, or changed almost beyond recognition. Have you ever felt this way? A few years ago, I was visiting a city where I attended school in the late 1970s. With time to kill I walked by the campus and found my feet carrying me toward my old dormitory, a modern building that was virtually new when I lived there. But when I got to the street I looked up and down and couldn’t find it to save my life, until after a few minutes I realized I was standing directly in front of it. A thicket of trees and shrubs had sprung up, the bright red brick facade was faded and stained, and roots had cracked and upended the concrete before the entrance. The ravages of time, and so on.

Asked if he remembered where he was when Kennedy was assassinated, Jobs said: “I was walking across the grass at my schoolyard going home at about three in the afternoon when somebody yelled that the President had been shot and killed.”

It’s only a month later now, so obviously I can still remember where I was when I heard the news that Jobs died. It was about seven or eight o’clock in the evening and I was working on the computer in the study. My wife had just put our four-year-old to bed, come downstairs, and sat down to check her iPhone. “Hey, Steve Jobs died,” she called out.

In interviews with Walter Isaacson, Jobs said he was “50/50” on the existence of an afterlife, although after his cancer diagnosis, he started believing a little bit more. “But sometimes I think it’s just like an on-off switch,” he said. “Click and you’re gone. And that’s why I don’t like putting on-off switches on Apple devices.”

But the funny thing about death’s on-off switch: for the living, things don’t stop. Immediately, the world begins to change again. My father, who died suddenly in 1999, used to say the worst thing about dying was that he’d never get to see how the story turned out. The never-ending story goes on.

I’m reminded of how Borges begins his story The Aleph, “On the burning February morning Beatriz Viterbo died, after braving an agony that never for a single moment gave way to self-pity or fear, I noticed that the sidewalk billboards around Constitution Plaza were advertising some new brand or other of American cigarettes. The fact pained me, for I realized that the wide and ceaseless universe was already slipping away from her and that this slight change was the first of an endless series.”

4. Science Fiction

When you’re a parent of a young child, old women will often come up to you in the supermarket and tell you: Enjoy this time. It goes so fast. The days last forever, but the years fly by.

To a child, one day is an eternity. And as we age, time moves faster. It’s a truism. What’s more, every generation seems to feel that the pace of change has accelerated, that more remarkable things happened in their lifetime than in a typical one. Maybe it’s just another temporal illusion.

When I think back to Boston in the 1980s, I see it dark and raining. Fog rising out of steam tunnels. Film noir. I suppose I’m confounding reality with fantasy here, because we were watching Blade Runner, fooling around with our newfangled computers, and reading cyberpunk. The books I really associate with that time are Neuromancer (1984), Count Zero (1986), and Mona Lisa Overdrive (1988), by William Gibson.

The “Sprawl Trilogy,” as the three books are called, is set in the not-too-distant future in the sprawling Boston-Atlanta Metropolitan Axis (BAMA), now enclosed in a climate-controlled, twilit dome and networked into a completely globalized, post-World War III world dominated by large, secretive corporations and peopled with mercenaries, a super-rich elite, and hackers augmented by technology and plastic surgery who plug into a vast immersive cyberspace (Gibson coined the term). Somewhere in the center of this live two advanced artificial intelligences who are trying to unite.

Now you may love science fiction or you may despise it, but you must accept that for a lot of male adolescents it was deeply linked with growing up in the 1960s and 1970s. Starting with H.G. Wells and Jules Verne and accelerating rapidly after the Second World War, science or “speculative” fiction travelled on a parallel path with the modern world, sometimes ridiculous schlock (Edgar Rice Burroughs, Star Wars), other times prophetic (Ray Bradbury, Robert Heinlein, Arthur C. Clarke). It was history written in advance, then documented as it happened like a Wikipedia entry. Our grandparents saw Lindbergh fly across the Atlantic in a newsreel. We watched and listened to astronauts walk on the moon on the TV sets in our living rooms.

What Gibson and some of his peers did was toss aside the ridiculous far-off-future speculation and alien laser wars, and replace it with a dark, near-future world almost like our own. We could already feel the present speeding up toward a closer and stranger future, and then Gibson brought the future back to the present. The two met at an explosion of magnitude between now and whatever lay next, beyond the event horizon. It took over 100 years for some of Verne’s predictions to come true, but now we could see things come true one by one in almost real time.

Life imitated art. We saw computers in movies, then we saw one in our school, and then we had our own. (It was an Apple of course.) It all happened very quickly. One day you stood on the street and people walked by, lost in their thoughts or looking in windows. Then everyone on the street was talking into a microphone to a phantom or staring at a small screen, there but not there. Where once you scoured an old used bookstore for the odd and the unusual that might wash up there, now every text and every fact in the world, from the mundane to the bizarre, was scanned and digitized and searchable in your hand. Arthur C. Clarke: “Any sufficiently advanced technology is indistinguishable from magic.”

At the turn of the 21st century the change became exponential. That’s what some people said. If you’d lived long enough, you could start to feel as if everything had changed. Perhaps it was changing faster. The Internet and digitization. Extreme globalization. The human genome and genetic engineering. Battle robots and drones in a never-ending war against global terrorism. Nanotechnology. Financial engineering and high-frequency computerized stock trading in cyberspace. Surveillance cameras on every corner with facial recognition software. Cyber attacks against governments and corporations. Wikileaks, and social media where everything you do is public. A self-driving car from Google. A supercomputer that defeats a grandmaster in chess. And another that beats the champions of Jeopardy.

Try for a minute to imagine Steve Jobs as a character in a Gibson novel and Apple as one of Gibson’s megacorporations. It’s not that strange. The secretive magician-leader of the most valuable company in the world, with more cash on hand at one point than the U.S. Treasury. Up on a giant stage in 2022, unveiling the latest Apple-Maas-Neotek biochip-powered personal assistant. Could he be Josef Virek, the enormously rich industrialist in Count Zero who is kept alive in a vat in Stockholm and presents himself when necessary through virtual environments to interact with others? Would Apple announce that the company would continue to be run by the saved consciousness of Jobs, backed up at his death in a ROM, like “The Dixie Flatline” in Neuromancer? Or might Apple announce plans to build a satellite world in high orbit to clone Jobs, as the Tessier-Ashpools did? Sounds crazy, but some are already predicting that at the iPhone 5 launch sometime in 2012, a posthumous video of Jobs will take the stage to announce his “one more thing.”

In the last 50 years something really did change. It wasn’t an illusion. It happened faster than anyone except maybe a science fiction writer could have predicted. And Jobs was at the center of a shift that seemed to be happening and even accelerating. He was hardly the only piece, of course. Apple in 2011 is the visible convergence of many important threads of technological change. Jobs wasn’t causing the change, but it was definitely operating through him, and through others in technology. And at his death it seems like we’re not at the end of change, but just at the very beginning.

“Okay, now you’re reaching a little,” my old friend interrupted. It wasn’t 1984 anymore. “That’s not quite how it turned out. The mega-corporations in Gibson are a lot more evil than the companies of today.”

“Well, all right,” I said, “It’s just that he’s dead, and that’s the reason we’re talking about it.”

“You know,” he said. “Don’t worry so much about death. It’s probably not so bad.”

5. The Singularity

In 1958, Stanislaw Ulam, a friend of the mathematician and theorist John von Neumann, wrote of a conversation in which they discussed how “the ever-accelerating progress of technology… gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.”

Ray Kurzweil, the inventor and futurist, seizes on this conversation as possibly the first reference to what people now call the Technological Singularity, or just the Singularity.

A mathematical singularity is a value where a function is not defined, like the explosion in magnitude of y in y = 1/x as x approaches zero. In physics, a space-time or gravitational singularity is a place of near-infinite gravity and density, as in a black hole. In his 2005 book The Singularity is Near, Kurzweil defines the Singularity as “a transforming event looming in the first half of the twenty-first century…a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed.”
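
You can watch that mathematical blow-up numerically; here’s a tiny sketch, nothing deeper than evaluating y = 1/x ever closer to zero:

```python
# y = 1/x explodes in magnitude as x approaches zero from the right.
for x in [1.0, 0.1, 0.01, 0.001, 0.000001]:
    print(f"x = {x:<8} y = 1/x = {1/x:,.0f}")
# x = 1.0      y = 1/x = 1
# x = 0.1      y = 1/x = 10
# ...
# x = 1e-06    y = 1/x = 1,000,000
```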

Kurzweil spent many years studying evolution and technology and came to believe that the history of the universe would consist of six epochs, where each “uses the information processing methods of the previous epoch to create the next.” In the beginning, the universe is born, along with the rules of physics and chemistry, the basic building blocks of everything. In the second epoch, life and DNA evolve the ability to store information in genes. In the third epoch, animal brains evolve to store information in neural patterns and, in humans, to acquire, design, and store mental patterns. And in the fourth epoch, where we are now, comes the invention of telecommunications and the creation of digital computers. Humans begin to design and build “intelligent” machines that replicate or augment the capabilities of the brain.

Still with me? Now change is happening faster, and the epochs grow shorter. DNA is billions of years old. It took advanced mammals approximately 100,000 years to add each additional cubic inch of brain matter. Today the computational capacity of computers is doubling every year. Kurzweil says that people have always dramatically underestimated future possibilities because of an inherently “linear” view of change. But in the last 50 years or so, we seem to be waking up to a rapidly accelerating rate of change. Most observers credit a number of forces for the palpable change of today:

  • Moore’s Law, the observation that the number of transistors that can cost-effectively fit on an integrated circuit doubles every one to two years (as it has since 1971); the compounding arithmetic is sketched just after this list.
  • The growth of online data, or “big data,” since the creation of the Internet, which gives all that expanded processing capacity something to work with.
  • The explosion of usage and activity on the Internet, over 2 billion people essentially training systems like Google’s search engine in the meaning and context of data.
  • Maturing software componentry, from machine language to 4GL and beyond, which vastly increases the productivity and power of engineers and developers.
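
To see what that first item compounds to, here is a minimal sketch, assuming for illustration a clean doubling every two years starting from the Intel 4004’s roughly 2,300 transistors in 1971:

```python
# Moore's Law as compound doubling: start from ~2,300 transistors
# (the Intel 4004, 1971) and double once every two years.
transistors, year = 2300, 1971
while year < 2011:
    transistors *= 2
    year += 2
print(f"{year}: ~{transistors:,} transistors per chip")
# 2011: ~2,411,724,800 -- twenty doublings, a factor of about a
# million, which is in the neighborhood of real 2011-era chips.
```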

But what happens next? Nobody really knows, admits Kurzweil. But he and other “Singularitarians” believe something critical occurs during the fourth epoch when technology becomes recursive. When software can change its own source code, and computers can design computers, and robots can build better robots, then the rate of change grows hyperbolic. Machine and human intelligence merge, yielding machine-augmented biology and biologically-enhanced machines. Some sort of self-aware artificial intelligence, like the ones in Gibson’s Neuromancer, may appear next, for better or worse. Then in the end, either instantaneously or gradually, this newly expanded intelligence spreads into all the matter in the universe, essentially turning everything into one giant intelligence powered by all the energy in the universe.
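
The difference between “very fast” and “hyperbolic” is worth making concrete. In ordinary exponential growth the rate is proportional to what you have (dx/dt = kx); in the recursive case the growth rate itself improves with capability (say dx/dt = kx²), and that curve reaches infinity in finite time. A toy numerical sketch, not a prediction:

```python
# Toy comparison: exponential growth (dx/dt = k*x) versus "recursive"
# growth (dx/dt = k*x**2), where the thing growing also accelerates
# its own growth. Crude forward-Euler integration is plenty here.
def grow(power, k=0.5, x0=1.0, dt=0.001, t_max=20.0):
    x, t = x0, 0.0
    while t < t_max:
        x += k * (x ** power) * dt
        t += dt
        if x > 1e12:                    # call this "blown up"
            return f"passes 10^12 at t = {t:.2f}"
    return f"reaches {x:.3g} by t = {t_max:g}"

print("exponential (dx/dt = kx):  ", grow(power=1))
print("recursive   (dx/dt = kx^2):", grow(power=2))
# The exponential curve is merely large at t = 20 (about e^10, or
# 22,000); the recursive one blows past 10^12 just after t = 2, a
# finite-time singularity in the mathematical sense used above.
```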

That’s the Singularity. And it’s coming sooner than you think, according to a loose-knit group of technologists, futurists, novelists, and even possibly Google and NASA, who helped found Singularity University in 2008 at the NASA Research Park in Silicon Valley. Kurzweil points to the middle of our own century, probably not soon enough for me, but close enough for my son to see it unfold. But who knows?

All of this has attracted plenty of criticism and skepticism. It’s not really science, of course, but rather speculative non-fiction. Depending on your point of view, the Singularitarian vision can be interpreted as either utopian or dystopian. Like tragedy and comedy, it depends on your perspective. Humanity is divided into two camps, those who see man-made technology as the source of the world’s problems, and those who see it as the solution.

Could it be that every generation imagines some kind of enormous transformation is just over the horizon? In the Middle Ages, Europeans awaited the imminent return of Jesus Christ. In the 1970s, we thought the world would end by nuclear holocaust or overpopulation. The Aztecs predicted that in 1519 their god Quetzalcoatl would reappear, and then suddenly Cortés arrived at the gates of Tenochtitlan and everything really did change.

The aspect of the Singularity that gets the most airplay is the prospect of vastly extended human life, perhaps even immortality, at least for an uber-class of billionaires who can afford to have their consciousness poured into a non-biological vessel.

Interestingly, Steve Jobs didn’t seem to have much interest in immortality. In his commencement address at Stanford University in 2005, he said, “Death is the destination we all share. No one has ever escaped it. And that is as it should be, because Death is very likely the single best invention of Life. It is Life’s change agent. It clears out the old to make way for the new.”

6. One more thing

You may remember way back in early October, the day before Steve Jobs died. At an Apple press event in Cupertino, the company introduced the new iPhone 4S to lackluster reviews. Jobs’s charisma was missed, people said. There was no new form factor or radical change. Not different enough.

Of course now, three weeks later, there are over 75,000 videos on YouTube of Siri conversing with its owners, two Siris talking with each other, Siri “easter eggs,” a mock Siri app for the Android platform, Siri hacks, and so on.

Maybe Siri will turn out to be another bust, like pen computing. Or maybe it will turn out to be a little parting gift from Jobs, who knew that the forces had converged and now was the time. Analysts are predicting that in 2012 Apple will finally release an integrated Apple television product with Siri voice control.

Interestingly, a lot of the press coverage of Siri has kind of missed the point. It’s not really about speech recognition. In fact, Apple is reportedly using the same Nuance speech recognition technology that’s been available on various platforms, including the iPhone, for years.

Siri, like Nuance, is a byproduct of the US military-funded project CALO (Cognitive Assistant that Learns and Organizes), the largest AI project in history. Nuance does the speech-to-text processing, while Siri performs the syntactical and semantic breakdown to figure out what we really mean and then decides how to respond. It’s closer in that regard to Watson, the IBM Q&A system that defeated Jeopardy champions Ken Jennings and Brad Rutter in an exhibition match earlier this year. Siri is also just the friendly voice and ear of an enormous processing engine housed in Apple’s new 500,000-square-foot data center in North Carolina. There Siri is listening, thinking, searching, and learning about how we think and what we want, each of us individually and all of us collectively. She’ll get smarter.
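
That division of labor (speech-to-text up front, semantic breakdown and a response decision behind it) is easy to caricature in code. Here is a minimal sketch of an assistant pipeline in that shape; every name in it is a hypothetical stand-in, not Apple’s actual architecture:

```python
# Hypothetical three-stage assistant pipeline, in the shape described
# above. These stand-ins are illustrative only; the real systems are
# vastly more sophisticated (and server-side).

def transcribe(audio: bytes) -> str:
    """Speech-to-text stage (the Nuance role). Canned for the demo."""
    return "set an alarm for six thirty tomorrow"

def parse_intent(text: str) -> dict:
    """Semantic breakdown (the Siri role): what does the user mean?"""
    if "alarm" in text:
        return {"intent": "set_alarm", "time": "6:30", "day": "tomorrow"}
    return {"intent": "web_search", "query": text}

def respond(meaning: dict) -> str:
    """Decide how to act on the parsed meaning."""
    if meaning["intent"] == "set_alarm":
        return f"OK, alarm set for {meaning['time']} {meaning['day']}."
    return f"Here's what I found for '{meaning['query']}'."

print(respond(parse_intent(transcribe(b"..."))))
# OK, alarm set for 6:30 tomorrow.
```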

In William Gibson’s Mona Lisa Overdrive, Kumiko is a Japanese girl sent by her father to London for protection from the Yakuza. He sends along a device, “a small dark oblong, one side impressed with the ubiquitous Maas-Neotek logo, the other gently curved to fit the user’s palm.” In 1988 it certainly sounded like science fiction:

“The ghost woke to Kumiko’s touch as they began their descent into Heathrow. The fifty-first generation of Maas-Neotek biochips conjured up an indistinct figure on the seat beside her, a boy out of some faded hunting print, legs crossed casually in tan breeches and riding boots. ‘Hullo,’ the ghost said.

Kumiko blinked, opened her hand. The boy flickered and was gone. She looked down at the smooth little unit in her palm and slowly closed her fingers.”

Years ago, when asked to name the goals that robot developers should be aiming for, Rodney Brooks, the robotics and artificial intelligence researcher and entrepreneur, came up with the following list: the object-recognition capabilities of a 2-year-old, the language capabilities of a 4-year-old, the manual dexterity of a 6-year-old, and the social understanding of an 8- or 9-year-old.

Now Brooks probably meant this to show how much progress needed to be made and what difficult work it would be. But it convinces me that if and when we do meet the first trans-human life form, the first self-aware AI, it isn’t going to be a metal Frankenstein or a menacing Terminator. Maybe it’s going to appear as a child. A very precocious child. But a child who is also an “old soul.” And he will seem to be the most interesting little person you’ve ever met. And the first time you meet him he will already seem like an old friend, as if he has been here all along.

It’s easy to look at this polluted old world, as people did in the 1960s, and think that it can’t possibly survive another generation. But this crazy talk of Kurzweil’s makes me wonder if we are only at the very beginning, like Adam and Eve. And I want to know how the story is going to turn out.

***

Comments

  • Memory Years

    Tired of hearing about Steve Jobs.  He’s dead, tough.

  • T.Rob

    He was definitely in the right place at the right time. If Steve Jobs had been born in central Africa or even Michigan, would there still have been an Apple?

  • Puranjay (avocadopress.com)

    This will take some time to read. But the first 1000 words are definitely a solid lure for the weekend. Bookmarked and looking forward to finishing it off over breakfast on a Sunday morning :)

  • Michael J. Reilly

    Great piece of writing and social commentary! I enjoyed the connections to science fiction and the observations on how our digital world is closer to this genre than most would have ever predicted. Imaginative, personal and well written.

  • Pingback: fredwhite.net » Blog Archive » The Used Book Store Was the Internet of the 1970s

  • Pingback: fredwhite.net » Blog Archive » TO DO: Go to an old movie and then buy somebody a book for Christmas