Wednesday, February 3, 2010

2 Brainwave Technology articles

http://hplusmagazine.com/articles/neuro/bci-x-prize-time-it%E2%80%99s-inner-space

Virgin Galactic and Scaled Composites recently rolled out SpaceShipTwo, a commercial passenger spaceship designed after the winning ship that captured the $10M Ansari X PRIZE for spaceflight in 2004. (For those few of you who don't yet know, an X PRIZE is a $10 million+ award given to the first team to achieve a specific goal, set by the X PRIZE Foundation, which has the potential to benefit humanity.)

The latest X PRIZE, however, has nothing to do with the commercialization of outer space. The Brain-Computer Interface (BCI) X PRIZE will reward nothing less than a team that provides vision to the blind, new bodies to disabled people, and perhaps even a geographical “sixth sense” akin to a GPS iPhone app in the brain. Communicate by thought alone? Recent h+ articles have explored early research into this intriguing possibility (see Resources).

Peter Diamandis modeled his Ansari X PRIZE after the Orteig Prize that Charles Lindbergh won in 1927 by flying solo across the Atlantic Ocean. Inspired by President Kennedy's 1961 goal of putting a man on the moon by the end of the decade, Diamandis has, in turn, inspired pioneers and risk takers to take up the X PRIZE Challenge of flying humans into space — except this time it’s inner space.


A recent workshop on the BCI X PRIZE – sponsored by Singularity University and held on the campus of MIT – brought together Peter Diamandis (Chairman of the X PRIZE Foundation), Ray Kurzweil, John Donoghue (founder of Cyberkinetics), Dr. Gerwin Schalk (who holds a brain-computer interface patent), and Ed Boyden (MIT Synthetic Neurobiology Group). Diamandis’ X PRIZE Foundation is just starting to conduct interviews with experts, governments, and potential competitors. The foundation must court donors to make the $10 million+ prize a reality. Once funding is secured, companies and teams from around the world will compete – as Burt Rutan once did with financing from Microsoft co-founder Paul Allen to engineer SpaceShipOne. The intent is that one or more teams will engineer a BCI solution with “ideas that could be won within a decade.”

During the MIT workshop, Peter Diamandis discussed the history of the X PRIZE. Ray Kurzweil followed with a 36-minute presentation called “Merging the Human Brain with Its Creations.”

After presentations by Donoghue, Schalk, and Boyden, the 50 or so workshop attendees broke into discussion groups on Input/Output, Control, Sensory, and Learning. Software engineer and Singularity University alumnus Rod Furlan, who attended the workshop, writes about some of the problems discussed at the break-out sessions — for example, communicating with a brain vs. implanting memories or skills, non-invasive vs. invasive input/output solutions, and the difficulties of using EEG to capture brain wave states. Furlan concludes, “While we still have significant technical and scientific hurdles ahead of us, given the current pace of progress it is reasonable to expect that robust, albeit limited, implanted BCI solutions will be widely available commercially within a 10 to 20 year time frame.”
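
To make one of those break-out topics concrete, here is a minimal Python sketch (my own illustration, not something presented at the workshop) of what “capturing brain wave states” from EEG involves: estimating power in the classic delta, theta, alpha, and beta bands from a single noisy channel. The sample rate, band edges, and synthetic signal are assumptions chosen for the example.

```python
# Illustrative sketch: estimate band power from one noisy EEG-like channel.
# Everything here (sample rate, band edges, synthetic signal) is an assumption
# chosen for the example, not data from the workshop.
import numpy as np
from scipy.signal import welch

fs = 256                          # sample rate in Hz, typical for consumer EEG
t = np.arange(0, 10, 1 / fs)      # ten seconds of samples

# A weak 10 Hz "alpha" rhythm buried in broadband noise.
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + 10.0 * np.random.randn(t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)   # power spectral density

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    power = psd[mask].mean()                     # average spectral power in the band
    print(f"{name:>5}: {power:6.2f}")
```

Even in this toy example, the simulated alpha rhythm only modestly lifts its band above the noise floor, which hints at why non-invasive EEG recording kept coming up as a sticking point.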

James Cameron’s trippy vision of a part-alien, part-human body controlled by the thoughts of a marine in the film Avatar and William Gibson's cyber-cowboy Case plugging into the matrix in Neuromancer — if truly feasible — are likely more than 10 years out. But given the incentive of a $10 million+ BCI X PRIZE, who knows what might be possible by 2020?

http://www.hplusmagazine.com/articles/neuro/mind-reading-neural-decoding-goes-mainstream

Mind Reading (Neural Decoding) Goes Mainstream

In the new movie The Men Who Stare at Goats, reporter Bob Wilton confronts Special Forces operator Lyn Cassady: “I’ve heard that you’re a psychic spy.” Lyn later comments, “We’re Jedi. We don’t fight with our guns, we fight with our minds.” Mind reading – formerly the stuff of science fiction and crystal gazers – is rapidly becoming science fact. A recent CBS 60 Minutes story reports that technology may soon “read” your mind.

Toys such as Mattel’s Mindflex™ and the Star Wars Force Trainer™ include brain wave detection technology and are now readily available at your local Target or Walmart. For a younger generation raised on telekinetic X-Men – from Professor Xavier to Magneto – these fascinating mind-over-matter toys offer limitless playtime opportunities.

NeuroSky leads the market in creating inexpensive, consumer brain-computer interfaces. NeuroSky's brain-reading headsets and software are being designed for the automotive, health care, and education industries. Using their MindSet™ package, you can become NeuroBoy™ and use your special telekinetic powers to push, pull, lift, or burn objects in a virtual world – by thought alone.
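
For the technically curious, here is a rough sketch of how a program might consume that kind of headset data. It assumes NeuroSky's ThinkGear Connector streaming parsed JSON over a local TCP socket and its "eSense" attention value; the host, port, threshold, and the "push" reaction are illustrative guesses, not NeuroSky's official sample code.

```python
# Hedged sketch: react to attention levels from a NeuroSky-style headset.
# Assumes the ThinkGear Connector is running locally and serving parsed JSON
# over TCP; the host, port, field names, and threshold below are assumptions
# for illustration, not official NeuroSky sample code.
import json
import socket

HOST, PORT = "127.0.0.1", 13854        # assumed local ThinkGear Connector socket
ATTENTION_THRESHOLD = 70               # eSense attention is reported on a 0-100 scale

with socket.create_connection((HOST, PORT)) as sock:
    # Ask the connector for parsed JSON packets rather than raw samples.
    sock.sendall(json.dumps({"enableRawOutput": False, "format": "Json"}).encode())
    buffer = b""
    while True:
        data = sock.recv(4096)
        if not data:
            break                       # connector closed the stream
        buffer += data
        *packets, buffer = buffer.split(b"\r")   # one JSON object per line
        for raw in packets:
            try:
                packet = json.loads(raw)
            except ValueError:
                continue                # skip partial or garbled packets
            attention = packet.get("eSense", {}).get("attention")
            if attention is not None and attention >= ATTENTION_THRESHOLD:
                print(f"attention={attention}: 'push' the virtual object")
```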

Emotiv Systems, a San Francisco-based neuroengineering company founded in 2003 by four award-winning scientists, builds EEG-based headsets that pass your brain’s electrical signals to software on your PC to extract patterns and translate them. As with the NeuroSky product, you can move objects in virtual worlds on your PC using Emotiv’s EPOC™ Neuroheadset.
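
Here is a hedged sketch of what "extract patterns and translate them" might look like in code: one short multi-channel epoch is reduced to band-power features and matched against stored per-command templates. The channel count, epoch length, features, and template matching are stand-ins for illustration, not Emotiv's actual signal chain.

```python
# Hedged sketch of "extract patterns and translate them": reduce one short
# multi-channel EEG epoch to band-power features and match it against stored
# per-command templates. Channel count, epoch length, and the command set are
# illustrative, not Emotiv's actual pipeline.
import numpy as np

FS, CHANNELS = 128, 14                     # EPOC-style: 14 channels at 128 Hz
COMMANDS = ["neutral", "push", "pull", "lift"]

def band_power_features(epoch: np.ndarray) -> np.ndarray:
    """Log power in 8-30 Hz per channel for one (channels x samples) epoch."""
    spectrum = np.abs(np.fft.rfft(epoch, axis=1)) ** 2
    freqs = np.fft.rfftfreq(epoch.shape[1], d=1 / FS)
    band = (freqs >= 8) & (freqs < 30)
    return np.log(spectrum[:, band].mean(axis=1) + 1e-12)

def translate(epoch: np.ndarray, templates: np.ndarray) -> str:
    """Pick the command whose calibration template is closest to this epoch."""
    distances = np.linalg.norm(templates - band_power_features(epoch), axis=1)
    return COMMANDS[int(np.argmin(distances))]

# Demo with random data standing in for calibration templates and a live epoch.
rng = np.random.default_rng(0)
templates = rng.normal(size=(len(COMMANDS), CHANNELS))   # one feature vector per command
live_epoch = rng.normal(size=(CHANNELS, FS))              # one second of "EEG"
print("decoded command:", translate(live_epoch, templates))
```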

With a recent announcement at the 2009 Society for Neuroscience conference in Chicago, “mind reading” has taken another scientific leap forward. Researchers are now able to determine which vowels and consonants a person is thinking of by recording activity from the surface of the brain. An MIT Technology Review editorial reports that Gerwin Schalk and colleagues at the Wadsworth Center, in Albany, NY, used a technology called electrocorticography (ECoG), in which a sheet of electrodes is laid directly on the surface of a patient's brain. Schalk's team asked patients to say or imagine words flashed on a screen while their brain activity was recorded. The researchers then used specially designed decoder algorithms to predict the vowels and consonants of the word, using only the pattern of brain activity. They found that both speaking and imagining the word gave roughly the same level of accuracy. This is essential if the system is to be used by people who are so severely paralyzed that they have lost the ability to speak. The system has an accuracy rate of about 50 to 70 percent. It may one day become a neural prosthesis for people with severe paralysis, translating their thoughts into actions on a computer or prosthetic limb.
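
To give a feel for the decoding pipeline described above, here is an illustrative Python sketch: per-trial brain-activity features, a label for the vowel being said or imagined, a linear classifier, and cross-validated accuracy. The data here is random placeholder noise, so the score will sit near chance; it shows the shape of such a decoder, not Schalk's actual algorithm.

```python
# Illustrative vowel decoder in the spirit of the experiment above: per-trial
# ECoG feature vectors, a label for the vowel being said or imagined, a linear
# classifier, and cross-validated accuracy. The data is random placeholder
# noise, so expect chance-level scores; this is not Schalk's actual algorithm.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_electrodes = 120, 64
VOWELS = ["a", "e", "i", "o", "u"]

# One activity value per electrode per trial (placeholder for real features).
X = rng.normal(size=(n_trials, n_electrodes))
y = rng.choice(len(VOWELS), size=n_trials)      # which vowel was (imagined) spoken

decoder = LinearDiscriminantAnalysis()
scores = cross_val_score(decoder, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} (chance = {1 / len(VOWELS):.2f})")
```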


Advances in research-enabling technologies, such as functional magnetic resonance imaging (fMRI) and computational neuroscience, are resulting in techniques that can better assess the neural basis of cognition and allow the visualization of brain processes – as well as thought-directed control of prosthetics. Government-financed projects include neural control of mechanical arms, hands, and legs. These intelligent artificial limbs will be controlled by your nervous system and will allow you to pitch a fastball, thread a needle, or play a piano as well as you did before your loss.

These developments are raising concerns about the potential exploitation of "mind reading" technologies by advertisers or oppressive governments. So it's understandable that researchers are wary of having their work referred to as mind reading. Emphasizing its limitations, they call it neural decoding. Jack Gallant, a leading "neural decoder" at the University of California, Berkeley, has produced some of the field's most impressive results yet. He and colleague Shinji Nishimoto showed that they could create a crude reproduction of a movie clip that someone was watching just by viewing their brain activity. Other neuroscientists claim that such neural decoding can be used to read memories and future plans and even to diagnose eating disorders.

Brain waves

Toyota is developing an advanced brain-sensing system that controls the movement of a wheelchair by reading a user's thoughts alone. By detecting and processing brain wave patterns, the system can “propel a wheelchair forward, as well as make turns, with virtually no discernible delay between thought and movement,” according to a recent press release. Rival automaker Honda’s Asimo robot can also be manipulated by detecting brain signals. Honda is exploring the concept that humanoid robots may one day replace home care nurses.
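
As a back-of-the-envelope illustration of the last stage of such a system, here is a small Python sketch that turns a stream of decoded intents into wheel commands. The decoder is stubbed with canned values, and the intent names and speeds are assumptions made for the example rather than anything from Toyota's press release.

```python
# Hedged sketch of the final stage of a thought-controlled wheelchair: a loop
# that turns decoded intents into wheel commands. The decoder is stubbed with
# canned values, and the intent names and speeds are assumptions made for the
# example, not details from Toyota's press release.
import time
from typing import Iterator

# Map decoded mental commands to (left wheel, right wheel) speed fractions.
DRIVE_TABLE = {
    "forward": (0.5, 0.5),
    "left":    (0.1, 0.5),
    "right":   (0.5, 0.1),
    "idle":    (0.0, 0.0),
}

def decoded_intents() -> Iterator[str]:
    """Stand-in for the brain-wave decoder, emitting one intent every 125 ms."""
    for intent in ["idle", "forward", "forward", "left", "forward", "idle"]:
        yield intent
        time.sleep(0.125)   # a short cycle, echoing the "no discernible delay" claim

def drive(intents: Iterator[str]) -> None:
    """Translate each intent into differential wheel speeds (printed here)."""
    for intent in intents:
        left, right = DRIVE_TABLE.get(intent, DRIVE_TABLE["idle"])
        print(f"intent={intent:<8} -> wheel speeds L={left:.1f} R={right:.1f}")

drive(decoded_intents())
```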

What was once speculative fiction – the ability to read minds and to control the movement of objects using thought alone, sometimes called mind over matter – is rapidly becoming neurotechnological fact. The upside of this technology will be more freedom for the physically impaired: imagine wheelchair-bound physicist Stephen Hawking able to control his wheelchair and to capture and communicate his thoughts with a neuroheadset. The obvious downside is the potential dystopian nightmare of “thought police” strapping you to a chair to view the contents of your mind and extract a confession.