Wednesday, April 4, 2012

iBrain uses single EEG

 http://www.neurovigil.com/

Transmits brainwave data from a single EEG channel to a wireless device/phone to monitor brain activity and sleep.

http://www.10news.com/news/30829661/detail.html

La Jolla-Based NeuroVigil Says iBrain Can Help Read Thoughts

A local company has released a stunning finding after performing some unique tests on the human brain, including that of famed physicist Dr. Stephen Hawking.

Almost completely paralyzed from Lou Gehrig's disease, Hawking uses face muscles to operate a voice machine.
Hawking's face muscles are slowly failing, but soon he may not need them, thanks to technology by La Jolla-based NeuroVigil.

"I'm very enthusiastic about it," NeuroVigil Chairman Philip Low said of the iBrain.

Low created the iBrain, the world's first mobile brain scanner. Low, who first met Hawking at a conference, said he believes it can read Hawking's thoughts. The iBrain fits right over the head and features three electrodes that connect very easily.

This past summer, Low flew to Cambridge, England, and Hawking agreed to wear the device. He was asked to think very hard about doing various tasks while his brain waves were tracked.

Using a special algorithm to convert the wave patterns, Low learned each of those thoughts produced its own distinct brain wave pattern, which can be mapped out.

Eventually, Low believes a computer could read the patterns and speak Hawking's thoughts.

"We'd like to find a way to bypass his body, pretty much hack his brain," said Low.

That ability to decipher each person's brain wave patterns could have major implications.

"Pharmaceutical companies can now fine-tune the drugs for individuals. This is the first step to personalized medicine," Low said.

iBrain is also being looked at as a possible way to monitor and diagnose a number of conditions -- from sleep disorders, depression and neurological disorders, to autism and post-traumatic stress disorder.

"This is very exciting for us because it allows us to have a window into the brain. We're building technology that will allow humanity to have access to the human brain for the first time," said Low.

NeuroVigil, a small company of about 11 employees, has received a big part of its funding from winning awards. Low said because doctors can monitor the device remotely, it could also save a lot in medical costs.

--

My Editorial:

In 1997, I created an animated visualization and presented it at the SIGGRAPH conference, illustrating a concept that would require converting brainwaves into an audio and visual sequence.

Ideally, the brainwave algorithms would produce images and sounds in real time, in the same way that fractal algorithms generate images and sounds.
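To make that concrete, here is a minimal sketch in Python of the kind of mapping I have in mind. Everything in it is my own assumption - the sampling rate, the frequency bands, and the particular mapping from band powers to fractal and tone parameters have nothing to do with how iBrain actually works:

    import numpy as np

    FS = 256  # assumed EEG sampling rate in Hz, not iBrain's actual rate

    def band_power(window, fs, lo, hi):
        """Average spectral power of an EEG window between lo and hi Hz."""
        freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
        spectrum = np.abs(np.fft.rfft(window)) ** 2
        mask = (freqs >= lo) & (freqs < hi)
        return spectrum[mask].mean()

    def brainwave_to_params(window):
        """Map one window of raw samples to a Julia-set constant and a tone pitch.

        The specific mapping is arbitrary; the point is that the same band-power
        features drive both the image and the sound, in real time.
        """
        theta = band_power(window, FS, 4, 8)
        alpha = band_power(window, FS, 8, 13)
        beta = band_power(window, FS, 13, 30)
        total = theta + alpha + beta
        # The Julia constant wanders as the band ratios shift, morphing the image.
        c = complex(0.7 * alpha / total - 0.35, 0.7 * theta / total - 0.35)
        tone_hz = 110.0 + 330.0 * beta / total  # pitch rises with beta activity
        return c, tone_hz

    def julia_frame(c, size=128, max_iter=40):
        """Render one small Julia-set frame as an array of escape-iteration counts."""
        y, x = np.mgrid[-1.5:1.5:size * 1j, -1.5:1.5:size * 1j]
        z = x + 1j * y
        frame = np.zeros(z.shape, dtype=int)
        for i in range(max_iter):
            alive = np.abs(z) <= 2.0
            z[alive] = z[alive] ** 2 + c
            frame[alive] = i
        return frame

Each new window of samples yields a slightly different Julia constant, so the imagery would continuously morph with the brain state rather than cycling through canned frames.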

The idea would be for people to see and hear a representation of their current brainwave state and to consciously alter the pattern, using the audio-visual feedback to reach a target state.
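The feedback loop itself could then be as simple as the sketch below, which reuses band_power, brainwave_to_params, and julia_frame from above. Here read_eeg_window, render_frame, and play_tone are hypothetical stand-ins for whatever the hardware, display, and audio output actually provide:

    import time

    TARGET_ALPHA_RATIO = 0.6  # hypothetical target: an alpha-dominant, relaxed state

    def feedback_loop(read_eeg_window, render_frame, play_tone):
        """Closed loop: measure the state, represent it, let the user self-adjust."""
        while True:
            window = read_eeg_window()              # raw samples from the device
            c, tone_hz = brainwave_to_params(window)
            render_frame(julia_frame(c))            # visual channel
            play_tone(tone_hz)                      # auditory channel

            # Optional reward cue when the state is close to the target:
            alpha = band_power(window, FS, 8, 13)
            total = (band_power(window, FS, 4, 8) + alpha
                     + band_power(window, FS, 13, 30))
            if total > 0 and abs(alpha / total - TARGET_ALPHA_RATIO) < 0.05:
                play_tone(880.0)                    # brief "you're there" chime

            time.sleep(0.1)                         # roughly ten updates per second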

In my case, the intention was to create a method for people to reach target states more suitable for immersion in therapeutic virtual environments, and for receiving suggestions from a therapist.

At the time, I learned that a primary obstacle to such a design, among others, was acquiring the many signals required. It sounds like iBrain may have found a way to solve this particular issue.

I am also interested in dream states and would like to see such a representation in audible and visual form - I am curious how these kinds of technologies might be used to achieve new forms of communication. I understand that some kind of audible recording of dream states was achieved by a Canadian team sometime around 2002-03; I'd have to dig up my old notes.

Of course, at some point the real excitement would be to project mental imagery and sounds in real time and to record imagery and sounds from dreams - I truly believe this is possible.

I don't know whether the iBrain technology offers any kind of solution for producing real-time effects, as brain-computer interfaces (BCIs) do. Moving objects or controlling imagery could be a matter of mapping brain states to a library of imagery and sounds as a means of communicating. That library could become a common vocabulary people use to communicate, much like an alphabet, though limited to the library's contents of imagery and sounds, or even sensations, smells, etc.
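To show what I mean by a library, here is a minimal sketch. Every name in it is a placeholder, and the feature vectors and centroids stand in for whatever a real decoder would have to learn for each person during a calibration session (thinking hard about each concept in turn, as in the Hawking experiment described above):

    import numpy as np

    # A shared "alphabet": each entry pairs an image with a sound cue.
    LIBRARY = {
        "yes":   ("images/yes.png",   "sounds/chime_up.wav"),
        "no":    ("images/no.png",    "sounds/chime_down.wav"),
        "water": ("images/water.png", "sounds/stream.wav"),
    }

    # Hypothetical per-person centroids in brain-state feature space,
    # learned during calibration.
    CENTROIDS = {
        "yes":   np.array([0.8, 0.1, 0.1]),
        "no":    np.array([0.1, 0.8, 0.1]),
        "water": np.array([0.1, 0.1, 0.8]),
    }

    def decode(features):
        """Nearest-centroid lookup: map a brain-state feature vector to the
        closest library entry and return its label and image/sound pair."""
        label = min(CENTROIDS, key=lambda k: np.linalg.norm(features - CENTROIDS[k]))
        return label, LIBRARY[label]

    # Example: a feature vector closest to the "water" centroid
    label, (image, sound) = decode(np.array([0.15, 0.05, 0.75]))
    print(label, image, sound)  # -> water images/water.png sounds/stream.wav

The interesting design question is whether the centroids could ever be standardized across people, the way an alphabet is, or whether each person's library would have to be calibrated individually.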

For now, I want to know whether a more literal audio-visual representation of brainwaves might be possible with iBrain, as I've described: real-time algorithmic sound and visuals, generated the way fractal imagery and sounds are generated.

Here is a link to some of the imagery. The final animation is only about 37 seconds long. I thought about creating a loop for inducing target states, or even a track that could be sped up or slowed down through biofeedback - again, a library that could be triggered. Surely Stephen Hawking could use it:

http://fallowfields.org/wave.html