Friday, October 2, 2009

What is the Singularity, and why does it matter?

I know I won’t get any sympathy from most of you, but I’m not usually awake at this hour (8:05 am). Today I’m on the Acela Express, already speeding between Providence and Stamford, CT. They say it reaches a top speed of 150 mph, but it only does so briefly, because our country is too shortsighted to invest in a decent set of tracks on the nation’s busiest rail corridor. The regular train takes only 30 minutes longer to travel between Boston and NYC. Regardless, I should be at Penn Station by 10:45 am.

After checking in at the Americana Inn, I plan to take in a museum or two today. The conference starts bright and early on Saturday, so I don’t plan to stay up late.

I was hoping to finish Kurzweil’s book, The Singularity Is Near, but I won’t have time. The book’s subtitle is ‘When Humans Transcend Biology’. The Singularity has also been described as the era when humans merge with their technology. Most important in my mind, however, is the notion that the Singularity is the point at which technology advances so rapidly that it feeds on itself and appears to progress almost infinitely fast. This will happen because of the confluence of several forces: the lower cost and higher power of computing; the advent of AGI (Artificial General Intelligence, as distinct from so-called Narrow AI, such as chess-playing computers or financial trading programs); the fruition of the Human Genome Project in new medicines that extend our lives; and nanotechnology transforming everything from manufacturing to medicine.

I really want to believe in the bright, limitless future being painted by most of the presenters at the conference. However, part of me worries that this is just another flying-car prediction from the middle of the last century, dusted off for this one. This latest round is a little different, though, with much greater emphasis on evidence-based predictions. The thrust of Kurzweil’s book is that technology always has advanced exponentially and always will. Supposedly we’re now entering the ‘knee of the curve’, where technological acceleration becomes so rapid that the Singularity happens, after which all bets are off.
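
To make that exponential claim concrete, here’s a quick back-of-the-envelope sketch in Python (the 18-month doubling period is my own placeholder assumption, not a figure quoted from the book) of why exponential growth feels slow at first and then seems to explode at the ‘knee’:

# Rough illustration of the 'knee of the curve'.
# Assumption (mine, not Kurzweil's exact number): capability doubles every 18 months.
DOUBLING_PERIOD_YEARS = 1.5

def growth_factor(years: float) -> float:
    """How many times more capable technology is after the given number of years."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

for years in (5, 10, 20, 30):
    print(f"After {years:2d} years: ~{growth_factor(years):,.0f}x")

# Prints roughly: 10x, 102x, 10,321x, and 1,048,576x.

The first decade buys you a couple of orders of magnitude; the third buys a million-fold jump, which is the whole point of the ‘knee’.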

I’m also a little concerned with the short shrift given to ethics and safety in the realms of AGI and nanotechnology. There’s some talk of it, but I’d like to see every speaker address these topics with more than patronizing assurances. As in Donald Fagen’s ‘I.G.Y.’, we need to be wary of a future with “A just machine to make big decisions, programmed by fellows with compassion and vision.” Fagen’s lyric subtly implies that an invention is only as smart as its creator. But what about when the computers are programmed by other computers? How will we even know what’s going on, let alone control it?

I will also be shopping for new career paths here. This is based on the realization that whether the Singularity comes in 2030 or never, my job is likely to be outsourced to a computer before I’m ready to retire. Even narrow AI could do what I do for a living (design and construct software). I’m intrigued by making the programs that will make future programs, but such a worker inherently seeks to make himself obsolete. What jobs will never be outsourced? If you assume that moderately skilled and intelligent robots will be around in the next 20 years, few jobs are safe.

Will humans become obsolete? Are The Matrix and The Terminator accurate predictions after all? What can we do to ensure AGI, robotics, and nanotechnology remain beneficial to humans? Or at least beneficial to what humans are evolving into? Because that’s the real story here: evolution by natural selection is coming to a close. From now on, evolution by intelligent artificial selection will leave natural selection in the dust. What will you do when the future arrives? Happily go with the flow, possibly transforming yourself radically? Or be one of the biological Luddites whose bloodlines eventually peter out?

“Homo sapiens, the first truly free species, is about to decommission natural selection, the force that made us… [S]oon we must look deep within ourselves and decide what we wish to become.” -- E.O. Wilson, Consilience: The Unity of Knowledge, 1998.
