Last year I spoke to my mobile phone. I wasn’t ringing anyone; I asked my phone a question. No answer. Last week I spoke to my phone, and it gave me answers right there on my screen. Soon there will be no need to read an answer, and in another few decades there may not even be a question. The singularity is rising, and futurists tell us that we will be our technology, and that information will be who we are, or what it has made us. As we watch the fast-paced changes taking place in technology, the web of data and the social connections between us, the value of information as knowledge remains the core business of librarians, teachers and info-nerds.
It starts with the mobile device in your hand, and Siri is a tool that constantly surprises me. Here’s why.
For Apple lovers like me, the iPhone 4S was at first a mixed blessing. I was desperately in need of an upgrade, but initially underwhelmed by the features of the 4S. (Where was the iPhone 5 I had been dreaming of?) Siri (the voice-recognition software on the new iPhone operating system) has made my life easier and, most importantly, has increased my productivity. How? In the data-driven world of education, which demands consistent documentation (evidence that I’m doing my job), Siri has enabled me to document student conferences and create comments to post on student work. Here’s a link that lays out everything for you.
In short, Diigo is an amazing tool for knowledge workers to annotate, archive and organize the web – either for yourself or in collaboration with others. And as an educator, you even get a free upgrade to a Diigo Education account with unlimited highlighting. Cha-ching!!
As a teacher, my Evernote use falls into three categories:
- Prior to class
- During class
- After class
Evernote for Teachers is a great tool for teachers to capture notes, organise lesson plans, collaborate on projects, snap photos of whiteboards, and more.
But seriously, I wonder where it will actually end. Using tools FOR empowering our thinking and organisation of ideas and workflow is one thing. Using technology to BE me is quite another.
If you have followed the topic of the singularity, and the merger between humans and machines, you’ll have an idea why this news report about cyborg futures is weirdly scary.
3D printing is a mere blip on the creative horizon of Dmitry Itskov and his project. Scientists are taking tiny, incremental steps towards melding humans and machines all the time. Ray Kurzweil, the futurist and now Google’s director of engineering, argued in The Singularity Is Near, a 2005 book, that technology is advancing exponentially and that “human life will be irreversibly transformed” to the point that there will be no difference between “human and machine or between physical and virtual reality”.
To change that picture, he reasons, we must change our minds, or give them a chance to “evolve,” to use one of his favourite words. Before our minds can evolve, though, we need a new paradigm of what it means to be human. That requires a transition to a world where most people aren’t consumed by the basic questions of survival.
Hence, avatars. They may sound like an improbable way to solve the real problems on Itskov’s list, or like the perfect gift for the superrich of the future. But the laws of supply and demand abide in Itskov’s utopia, and he assumes that once production of avatars is ramped up, costs will plunge. He also assumes that charities now devoted to feeding, clothing and healing the poor will focus on the goal of making and distributing affordable bodies, which in this case means machines.
For now, just acquiring a lifelike robotic head is a splurge. Among the highlights of the New York congress will be the unveiling of what Itskov describes as the most sophisticated mechanical head in history.
Weird, right? Check out our progress in this timeline from the same article.
On the road to avatars
Some random stops along the way to joining humans and machines.
1784: First known use of the word “avatar”, according to the Merriam-Webster dictionary. From Sanskrit, it refers to a Hindu deity in human form.
1924: Hans Berger begins the history of brain-computer interfaces by developing EEG, which measures electrical activity in the brain.
1958: In Sweden, Arne Larsson becomes the first person to receive a surgically implanted pacemaker.
1961: The first cochlear implant, called a bionic ear, is created. It marks the first time a machine is able “to restore a human sense”.
1987: Max Headroom, about a fictional avatar, makes its debut on TV. In the story line, Max was created by downloading the memories of a TV reporter into a computer.
1992: Snow Crash, a Neal Stephenson novel, helps popularise avatars. “If you’re ugly,” he writes, “you can make your avatar beautiful.”
1997: Researchers at Emory University teach a stroke victim to use electrodes implanted in his brain, and sensors taped to his body, to move a cursor and spell words with his thoughts.
2003: Linden Lab starts Second Life, an online world that allows users to create avatars that can interact with other avatars.
2008: At Duke University, a monkey implanted with a brain-computer interface controls a robot on a treadmill in Japan.
2011: Dmitry Itskov starts the 2045 Initiative.
2012: At the University of Pittsburgh, a quadriplegic woman, Jan Scheuermann, eats a chocolate bar attached to a robotic arm controlled by implants in her brain.
2013: The MIT Technology Review reports that Samsung is working on a tablet computer that can be controlled by your mind.
Image: Warhol bots.
- 2045 Initiative Shoots for Android Surrogates, Immortality (tested.com)
- Creating Machine Avatars For All (tightwind.net)