Tuesday, October 23, 2012
Raised by animals
Family of woman who was 'raised by monkeys' speak out
http://www.telegraph.co.uk/telegraphtv/9637693/Family-of-woman-who-was-raised-by-monkeys-speak-out.html
--6 cases of children being raised by animals
http://theweek.com/article/index/235216/6-children-raised-by-animals
Unlike Rudyard Kipling's The Jungle Book, these tales of feral kids surviving in the wild are decidedly darker and reportedly true.
A British housewife claims that when she was about 5, she spent five years living as part of a pack of capuchin monkeys in a Colombian jungle. Photo: Wolfgang Kaehler/CORBIS
1. Raised by monkeys
When she was about five years old, Marina Chapman says she was kidnapped, probably for ransom, but was then abandoned in the Colombian jungle. For some five years, she lived out in the wild, where she was taken in by a group of capuchin monkeys, which experts say are known to accept young children into their fold. The animals taught young Marina how to catch birds and rabbits with her bare hands, so she was able to survive. She rejoined the human world when she was taken by hunters and sold to a brothel, from which she eventually escaped.
2. Raised by goats
In June 2012, social workers in Russia discovered a toddler who had been locked in a room with goats by his mother. The boy reportedly played and slept with the goats, but nourishment was apparently hard to come by as he weighed a third less than a typical child of his age. When the child was rescued, his mother had disappeared. Doctors have since tried to acclimate the toddler to human life, with some difficulty. "He refused to sleep in the cot. He tried to get underneath and sleep there. He was very scared of adults," one doctor said.
3. Raised by feral cats and dogs
In 2009, welfare workers were led to an unheated flat in a Siberian town where they found a 5-year-old girl they called "Natasha." While technically living with her father and other relatives, Natasha was treated like one of the many dogs and feral cats that shared the space. Like her furry companions, Natasha lapped up food from bowls left on the floor. She didn't know any human words and only communicated with hisses and barks. The father was nowhere to be found when authorities rescued the girl, and Natasha has since been placed in an orphanage.
4. Raised by wild cats
Argentinean police discovered an abandoned 1-year-old boy surrounded by eight wild cats in 2008. The cats reportedly kept the boy alive during the freezing winter nights by lying on top of him and even tried to lick the crusted mud from his skin. The boy was also seen eating scraps of food likely foraged by his protective brood.
5. Raised by wild dogs
A 10-year-old Chilean boy was found in 2001 to have been living in a cave with a pack of dogs for at least two years. The boy had already survived a rough and unstable childhood, having been abandoned by his parents and then fleeing alternative care. Alone, the child sought refuge with a pack of dogs who helped him scavenge for food and even protected him. Officials said the boy might have even drunk milk from one of the female dogs. "They were like his family," a spokesman said.
6. Raised by wolves
One of the most well-documented cases of children raised by wild animals is that of Kamala and Amala, better known as the "wolf children." Discovered in 1920 in the jungles of Godamuri, India, the girls, aged 3 and about 8, had been living with a she-wolf and her pack. It's not known if the girls were from the same family, but the man who found the girls, Reverend J.A.L. Singh, took them back to his orphanage, where he tried to get them accustomed to their human surroundings. While the girls made some progress over the years, both eventually came down with fatal illnesses, leaving the reverend to wonder "if the right thing to do would have been to leave these children in the wild where I found them."
Motorola HC1: Google Goggles for the enterprise
http://www.networkworld.com/news/2012/102212-motorola-hc1-google-goggles-for-263559.html
Called the HC1, the device runs on an ARM processor and has an optional camera to send back real-time video over a wireless network.
Unlike Google Goggles, though, the HC1 is aimed at the enterprise market with a price tag of US$4,000-$5,000 per unit.
Areas the company has been experimenting with include "high-end repair markets," such as aircraft engines, said Paul Steinberg, CTO of Motorola Solutions (which is the part of Motorola Google did not acquire). "Emergency medical personnel at trauma centers might be looking at this too."
The HC1 will augment what users see by providing additional data, he said. Multiple units could be networked together and share information.
See the HC1 in a video on YouTube.
One difficulty with products like the HC1 is positioning the screen precisely so the user can see what's being displayed. The voice commands and gesture controls seemed accurate and responsive when a reporter tried them, however. Calling out category headings opens new applications.
The so-called "optical micro-display" from Kopin Corporation is supposed to simulate a view of a 15-inch screen.
The HC1 runs Microsoft Windows CE 6.0 Professional. When it ships in the first half of 2013 it will come with Wi-Fi connectivity, but Steinberg said it could eventually have 3G and 4G radios.
Nick Barber covers general technology news in both text and video for IDG News Service. E-mail him at Nick_Barber@idg.com and follow him on Twitter at @nickjb.
The head-mounted computer is aimed at enterprises
By Nick Barber, IDG News Service
October 22, 2012 12:20 AM ET
Monday, October 22, 2012
Human activity causes earthquakes
Back to my theory about underground nukes triggering earthquakes, and possibly intentional damage, as with Fukushima via North Korea, which had set off explosions the size of those at Hiroshima.
http://www.ctvnews.ca/sci-tech/scientists-link-deep-wells-to-deadly-spanish-quake-1.1004512
Scientists link deep wells to deadly Spanish quake
The Associated Press
Published Sunday, Oct. 21, 2012 3:35PM EDT
MADRID -- Farmers drilling ever deeper wells over decades to water their crops likely contributed to a deadly earthquake in southern Spain last year, a new study suggests. The findings may add to concerns about the effects of new energy extraction and waste disposal technologies.
Nine people died and nearly 300 were injured when an unusually shallow magnitude-5.1 quake hit the town of Lorca on May 11, 2011. It was the country's worst quake in more than 50 years, causing millions of euros in damage to a region with an already fragile economy.
Using satellite images, scientists from Canada, Italy and Spain found the quake ruptured a fault running near a basin that had been weakened by 50 years of groundwater extraction in the area.
During this period, the water table dropped by 250 metres as farmers bored ever deeper wells to help produce the fruit, vegetables and meat that are exported from Lorca to the rest of Europe. In other words, the industry that propped up the local economy in southern Spain may have undermined the very ground on which Lorca is built.
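As a rough order-of-magnitude check (my own illustrative arithmetic, not a figure from the study), the unloading caused by a 250-metre drop in the water table can be sketched as the weight of the removed water column:

```python
# Illustrative estimate of the stress change from lowering a water table.
# Assumes the simple hydrostatic relation delta_sigma = rho * g * delta_h;
# the real crustal stress change depends on porosity and poroelastic effects.

RHO_WATER = 1000.0   # kg/m^3, density of fresh water
G = 9.81             # m/s^2, gravitational acceleration

def water_column_stress(delta_h_m: float) -> float:
    """Pressure change in pascals from removing a water column delta_h_m tall."""
    return RHO_WATER * G * delta_h_m

drop = 250.0  # metres, the drawdown reported for the Lorca basin
stress_mpa = water_column_stress(drop) / 1e6
print(f"~{stress_mpa:.1f} MPa of unloading")  # roughly 2.5 MPa
```

A few megapascals is small next to tectonic stresses, which is consistent with the researchers' point that the pumping may have set the timing and location of a quake that was coming anyway.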
The researchers noted that even without the strain caused by water extraction, a quake would likely have occurred at some point.
But the extra stress of pumping vast amounts of water from a nearby aquifer may have been enough to trigger a quake at that particular time and place, said lead researcher Pablo J. Gonzalez of the University of Western Ontario, Canada.
Miguel de las Doblas Lavigne, a geologist with Spain's National Natural Science Museum who has worked on the same theory but was not involved in the study, said the Lorca quake was in the cards.
"This has been going on for years in the Mediterranean areas, all very famous for their agriculture and plastic greenhouses. They are just sucking all the water out of the aquifers, drying them out," he told The Associated Press in a telephone interview. "From Lorca to (the regional capital of) Murcia you can find a very depleted water level."
De las Doblas said it was "no coincidence that all the aftershocks were located on the exact position of maximum depletion."
"The reason is clearly related to the farming, it's like a sponge you drain the water from; the weight of the rocks makes the terrain subside and any small variation near a very active fault like the Alhama de Murcia may be the straw that breaks the camel's back, which is what happened," he said.
He said excess water extraction was common in Spain.
"Everybody digs their own well, they don't care about anything," he said. "I think in Lorca you may find that some 80 percent of wells are illegal."
Lorca town hall environment chief Melchor Morales said the problem dates back to the 1960s when the region opted to step up its agriculture production and when underground water was considered private property. A 1986 law has reduced the amount of well pumping, he said.
Not everyone agreed with the conclusion of the study, which was published online Sunday in Nature Geoscience.
"There have been earthquakes of similar intensity and similar damage caused in the 17th, 18th and 19th centuries when there was no excess water extraction," said Jose Martinez Diez, a professor in geodynamics at Madrid's Complutense University who has also published a paper on the quake.
Still, it isn't the first time that earthquakes have been blamed on human activity, and scientists say the incident points to the need to investigate more closely how such quakes are triggered and how to prevent them.
The biggest man-made quakes are associated with the construction of large dams, which trap massive amounts of water that put heavy pressure on surrounding rock.
The 1967 Koynanagar earthquake in India, which killed more than 150 people, is one such case, said Marco Bohnhoff, a geologist at the German Research Centre for Geosciences in Potsdam who wasn't involved in the Lorca study.
Bohnhoff said smaller man-made quakes can also occur when liquid is pumped into the ground.
A pioneering geothermal power project in the Swiss city of Basel was abandoned in 2009 after it caused a series of earthquakes. Nobody was injured, but the tremors caused by injecting cold water into hot rocks to produce steam resulted in millions of Swiss francs in damage to buildings.
Earlier this year, a report by the National Research Council in the United States found the controversial practice of hydraulic fracturing to extract natural gas was not a huge source of man-made earthquakes. However, the related practice of shooting large amounts of wastewater from "fracking" or other drilling activities into deep underground storage wells has been linked with some small earthquakes.
In an editorial accompanying the Lorca study, geologist Jean-Philippe Avouac of the California Institute of Technology said it was unclear whether human activity merely induces quakes that would have happened anyway at a later date. He noted that the strength of the quake appeared to have been greater than the stress caused by removing the groundwater.
"The earthquake therefore cannot have been caused entirely by water extraction," wrote Avouac. "Instead, it must have built up over several centuries."
Still, pumping out the water may have affected how the stress was released, and similar processes such as fracking or injecting carbon dioxide into the ground - an idea that has been suggested to reduce the greenhouse effect - could theoretically do the same, he said.
Once the process is fully understood, "we might dream of one day being able to tame natural faults with geo-engineering," Avouac said.
Friday, October 19, 2012
Petrol from air
http://www.independent.co.uk/news/uk/home-news/exclusive-pioneering-scientists-turn-fresh-air-into-petrol-in-massive-boost-in-fight-against-energy-crisis-8217382.html
Exclusive: Pioneering scientists turn fresh air into petrol in massive boost in fight against energy crisis
Is scientific breakthrough a milestone on the road to clean energy?
A small British company has produced the first "petrol from air" using a revolutionary technology that promises to solve the energy crisis as well as helping to curb global warming by removing carbon dioxide from the atmosphere.
Air Fuel Synthesis in Stockton-on-Tees has produced five litres of petrol since August, when it switched on a small refinery that manufactures gasoline from carbon dioxide and water vapour.
The company hopes that within two years it will build a larger, commercial-scale plant capable of producing a ton of petrol a day. It also plans to produce green aviation fuel to make airline travel more carbon-neutral.
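To put "a ton of petrol a day" in perspective, here is a back-of-the-envelope carbon balance (my own sketch, treating petrol as pure octane, C8H18, which is only an approximation of real gasoline blends):

```python
# Rough carbon balance: how much CO2 must be captured per tonne of petrol?
# Assumes petrol is pure octane (C8H18); real gasoline is a hydrocarbon mix.

M_C, M_H, M_O = 12.011, 1.008, 15.999   # g/mol atomic masses
m_octane = 8 * M_C + 18 * M_H            # ~114.2 g/mol
m_co2 = M_C + 2 * M_O                    # ~44.0 g/mol

carbon_fraction = (8 * M_C) / m_octane   # mass fraction of carbon in octane
co2_per_tonne_petrol = 1000.0 * carbon_fraction * (m_co2 / M_C)  # kg

print(f"~{co2_per_tonne_petrol:.0f} kg of CO2 per tonne of petrol")
# roughly 3.1 tonnes of CO2 captured per tonne of fuel produced
```

In other words, a one-ton-a-day plant would need to capture on the order of three tonnes of CO2 daily, which is why the article's point about the cost of carbon capture matters so much.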
Tim Fox, head of energy and the environment at the Institution of Mechanical Engineers in London, said: "It sounds too good to be true, but it is true. They are doing it and I've been up there myself and seen it. The innovation is that they have made it happen as a process. It's a small pilot plant capturing air and extracting CO2 from it based on well known principles. It uses well-known and well-established components but what is exciting is that they have put the whole thing together and shown that it can work."
Although the process is still in the early developmental stages and needs to take electricity from the national grid to work, the company believes it will eventually be possible to use power from renewable sources such as wind farms or tidal barrages.
"We've taken carbon dioxide from air and hydrogen from water and turned these elements into petrol," said Peter Harrison, the company's chief executive, who revealed the breakthrough at a conference at the Institution of Mechanical Engineers in London.
"There's nobody else doing it in this country or indeed overseas as far as we know. It looks and smells like petrol but it's a much cleaner and clearer product than petrol derived from fossil oil," Mr Harrison told The Independent.
"We don't have any of the additives and nasty bits found in conventional petrol, and yet our fuel can be used in existing engines," he said.
"It means that people could go on to a garage forecourt and put our product into their car without having to install batteries or adapt the vehicle for fuel cells or having hydrogen tanks fitted. It means that the existing infrastructure for transport can be used," Mr Harrison said.
Being able to capture carbon dioxide from the air, and effectively remove the principal industrial greenhouse gas resulting from the burning of fossil fuels such as oil and coal, has been the holy grail of the emerging green economy.
Using the extracted carbon dioxide to make petrol that can be stored, transported and used as fuel for existing engines takes the idea one step further. It could transform the environmental and economic landscape of Britain, Mr Harrison explained.
"We are converting renewable electricity into a more versatile, useable and storable form of energy, namely liquid transport fuels. We think that by the end of 2014, provided we can get the funding going, we can be producing petrol using renewable energy and doing it on a commercial basis," he said.
"We ought to be aiming for a refinery-scale operation within the next 15 years. The issue is making sure the UK is in a good place to be able to set up and establish all the manufacturing processes that this technology requires. You have the potential to change the economics of a country if you can make your own fuel," he said.
The initial plan is to produce petrol that can be blended with conventional fuel, which would suit the high-performance fuels needed in motor sports. The technology is also ideal for remote communities that have abundant sources of renewable electricity, such as solar energy, wind turbines or wave energy, but little in the way of storing it, Mr Harrison said.
"We're talking to a number of island communities around the world and other niche markets to help solve their energy problems.
"You're in a market place where the only way is up for the price of fossil oil and at some point there will be a crossover where our fuel becomes cheaper," he said.
Although the prototype system is designed to extract carbon dioxide from the air, this part of the process is still too inefficient to allow a commercial-scale operation.
The company can and has used carbon dioxide extracted from air to make petrol, but it is also using industrial sources of carbon dioxide until it is able to improve the performance of "carbon capture".
Other companies are working on ways of improving the technology of carbon capture, which is considered far too costly to be commercially viable as it costs up to £400 for capturing one ton of carbon dioxide.
However, Professor Klaus Lackner of Columbia University in New York said that the high costs of any new technology always fall dramatically.
"I bought my first CD in the 1980s and it cost $20 but now you can make one for less than 10 cents. The cost of a light bulb has fallen 7,000-fold during the past century," Professor Lackner said.
Friday, October 12, 2012
Steve Morse Solo Acoustic - Live @ Georgia Theatre '90
Between Col. Bruce Hampton's sets, no less...talking starts @ 1:44, music @ 2:27
...Night Meets Light - '90 (embed disabled)
http://youtu.be/P_zQkkbbaf4
Saturday, October 6, 2012
Allan Holdsworth instructional video
I used to own this video and loaned it out. Glad to find it again on good ol' You-ee Too-wub
Friday, October 5, 2012
The CIA and Jeff Bezos Bet on Quantum Computing
http://www.technologyreview.com/news/429429/the-cia-and-jeff-bezos-bet-on-quantum-computing/
Inside a blocky building in a Vancouver suburb, across the street from a dowdy McDonald's, is a place chilled colder than anywhere in the known universe. Inside that is a computer processor that Amazon founder Jeff Bezos and the CIA's investment arm, In-Q-Tel, believe can tap the quirks of quantum mechanics to unleash more computing power than any conventional computer chip. Bezos and In-Q-Tel are in a group of investors who are betting $30 million on this prospect.
If the bet works out, some of the world's thorniest computing problems, such as the hunt for new drugs or efforts to build artificial intelligence, would become dramatically less challenging. This development would also clear the tainted reputation of D-Wave Systems, the startup whose eight-year-long effort to create a quantum computer has earned little more than skepticism bordering on ridicule from prominent physicists.
D-Wave's supercooled processor is designed to handle what software engineers call "optimization" problems, the core of conundrums such as figuring out the most efficient delivery route, or how the atoms in a protein will move around when it meets a drug compound. "Virtually everything has to do with optimization, and it's the bedrock of machine learning, which underlies virtually all the wealth creation on the Internet," says Geordie Rose, D-Wave's founder and chief technology officer. In machine learning, a branch of artificial intelligence, software examines information about the world and formulates an appropriate way to act in the future. It underpins technologies such as speech recognition and product recommendations and is a priority for research by companies, such as Google and Amazon, that rely on big data.
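The "optimization" problems Rose describes are typically cast as QUBO (quadratic unconstrained binary optimization), the form D-Wave's annealing hardware targets. A tiny classical brute-force sketch (my own illustration, not D-Wave's API) shows the shape of the problem:

```python
# Brute-force solver for a toy QUBO problem: minimize sum of Q[i,j]*x[i]*x[j]
# over binary vectors x. Quantum annealers search this same energy landscape;
# brute force is only feasible for a handful of variables, which is the point.

from itertools import product

def solve_qubo(Q):
    """Return (best_x, best_energy) by exhaustive search.
    Q is a dict mapping (i, j) index pairs to coefficients."""
    n = max(max(i, j) for i, j in Q) + 1
    best_x, best_e = None, float("inf")
    for x in product((0, 1), repeat=n):
        e = sum(c * x[i] * x[j] for (i, j), c in Q.items())
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

# Toy instance: rewards picking x0, x1, x2 individually,
# but penalizes picking x0 and x1 together.
Q = {(0, 0): -1, (1, 1): -1, (2, 2): -1, (0, 1): 2, (0, 2): -1}
print(solve_qubo(Q))  # -> ((1, 0, 1), -3)
```

The search space doubles with every added variable, which is why problems with many interacting binary choices, such as delivery routing or protein-drug docking, overwhelm exhaustive classical methods.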
"Our intelligence community customers have many complex problems that tax classical computing architecture," Robert Ames, vice president for information and communication technologies at In-Q-Tel, said in a statement released today. In-Q-Tel's primary "customer" is the CIA, and the National Security Agency is another. Both are known to be investing heavily in automated intelligence gathering and analysis.
With funding from the Amazon founder and the CIA's investment arm, the Canadian company D-Wave is gaining momentum for its revolutionary approach to computing.
If the bet works out, some of the world's thorniest computing problems, such as the hunt for new drugs or efforts to build artificial intelligence, would become dramatically less challenging. This development would also clear the tainted reputation of D-Wave Systems, the startup whose eight-year-long effort to create a quantum computer has earned little more than skepticism bordering on ridicule from prominent physicists.
D-Wave's supercooled processor is designed to handle what software engineers call "optimization" problems, the core of conundrums such as figuring out the most efficient delivery route, or how the atoms in a protein will move around when it meets a drug compound. "Virtually everything has to do with optimization, and it's the bedrock of machine learning, which underlies virtually all the wealth creation on the Internet," says Geordie Rose, D-Wave's founder and chief technology officer. In machine learning, a branch of artificial intelligence, software examines information about the world and formulates an appropriate way to act in the future. It underpins technologies such as speech recognition and product recommendations and is a priority for research by companies, such as Google and Amazon, that rely on big data.
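The "delivery route" conundrum can be made concrete. Below is a minimal sketch, with made-up stops and distances, that solves such a problem the only way a conventional computer can: by checking every candidate. The stop names and distances are illustrative, not from any real dataset.

```python
from itertools import permutations

# Made-up distances between four delivery stops.
DIST = {
    ("A", "B"): 4, ("A", "C"): 2, ("A", "D"): 7,
    ("B", "C"): 6, ("B", "D"): 3, ("C", "D"): 5,
}

def leg(a, b):
    """Distance between two stops, in either direction."""
    return DIST.get((a, b)) or DIST[(b, a)]

def route_length(route):
    """Total length of a round trip starting and ending at stop 'A'."""
    stops = ("A",) + route + ("A",)
    return sum(leg(a, b) for a, b in zip(stops, stops[1:]))

# Brute force: try every ordering of the remaining stops. Trivial for
# three stops (3! = 6 routes), but the count grows factorially with
# every stop added, which is why such problems strain conventional
# computers and invite specialized hardware.
best = min(permutations(["B", "C", "D"]), key=route_length)
```

Here the shortest round trip is A-B-D-C-A, at length 14. Add thirty stops and the same brute-force loop would outlast the universe, which is the opening D-Wave is aiming at.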
Rose, a confident Canadian with a guitar and samurai sword propped in the corner of his windowless office, has been making grand claims to journalists since 2007, when he unveiled D-Wave's first proof-of-concept processor at a high-profile event at the Computer History Museum in Mountain View, California. Attendees saw a D-Wave processor (apparently) solve sudoku puzzles and find a close match to a particular drug molecule in a collection of other compounds. But in the weeks, months, and years that followed, skepticism and accusations of fraud rained down on the company from academic experts on quantum computing. Rose's initial predictions about how quickly the company would increase the size and capabilities of its chips fell by the wayside, and the company, although still well-funded, was publicly quiet.
Signing up Bezos and In-Q-Tel—the company's most prominent backers yet—is the latest in a series of events that suggest D-Wave thinks it is ready to finally answer its critics. In May 2011, the company published a paper in the prestigious journal Nature that critical academics said was the first to prove D-Wave's chips have some of the quantum properties needed to back up Rose's claims. Artificial intelligence researchers at Google regularly log into a D-Wave computer over the Internet to try it out, and 2011 also saw the company sign its first customer. Defense contractor Lockheed Martin paid $10 million for a computer for research into automatically detecting software bugs in complex projects such as the delayed F-35 fighter (see "Tapping Quantum Effects for Software that Learns"). Questions remain about just how its technology works, but D-Wave says more evidence is forthcoming. It is readying an improved processor that Rose calls the company's first true product rather than a piece of research equipment. D-Wave is expected to announce other major customers in coming months.
Cold Spot
Step inside D-Wave's ground-floor office suite and you're greeted by bland meeting rooms, offices, and cubicles. But open the correct door off the main corridor and you emerge into a bright white lab space dominated by four black monoliths—D-Wave's computers. Roughly cube-shaped, and around 10 feet tall, they emit a rhythmic, high-pitched sound as supercooled gases circulate inside. Each of the machines has a door on the side and is mostly empty, with what looks like a ray gun descending from the ceiling: a widely spaced stack of five metal discs of decreasing size held together with cables, struts, and pipes plated with gold and copper. It is actually a cold gun: the structure is a chilly −452 °F (4 kelvin) at the wide end and a few thousandths of a degree above absolute zero at its tip, where D-Wave's inch-square chip can be found. Not even the deepest reaches of space are as cold, or as shielded from magnetic fields, as this chip, which is etched at a plant in Silicon Valley from a niobium alloy that becomes superconducting at ultralow temperatures.
The processor in every computer you've used is made from silicon and patterned with transistors that create logic gates—switches that are either on (represented by a 1 in the computer's programming) or off (a 0). D-Wave's processors are also made up of elements that switch between 1 and 0, but they are loops of niobium alloy—there are 512 of them in the newest processor. These loops are known as qubits and can trap electrical current, which circles inside the loops either clockwise (signified by a 0) or counterclockwise (1). Smaller superconducting loops called couplers link the qubits so they can interact and even influence one another to flip between 1 and 0.
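This arrangement of biased loops linked by couplers has a standard mathematical picture, the Ising model, which is how academic papers on D-Wave's chips describe them. A classical sketch, with made-up biases and coupler strengths, shows how a configuration of qubits gets an energy:

```python
from itertools import product

def energy(spins, h, J):
    """Ising energy: sum_i h_i*s_i + sum_(i,j) J_ij*s_i*s_j.
    Each spin s_i is -1 or +1, standing in for the two current
    directions; h_i biases individual qubits and J_ij are couplers."""
    return (sum(h[i] * s for i, s in enumerate(spins))
            + sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items()))

# Three toy qubits: the negative coupling between qubits 0 and 1 rewards
# them for agreeing; the positive coupling between 1 and 2 rewards them
# for disagreeing; the bias nudges qubit 2 toward -1.
h = [0.0, 0.0, 0.5]
J = {(0, 1): -1.0, (1, 2): 1.0}

# The lowest-energy configuration encodes the answer to the problem.
ground = min(product([-1, 1], repeat=3), key=lambda s: energy(s, h, J))
```

For this toy instance the ground state is (+1, +1, −1): the first two qubits agree, the third disagrees, exactly as the couplers reward.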
This delicate setup is designed so that the layout of qubits conforms to an algorithm that solves a particular kind of optimization problem at the core of many tasks difficult to solve on a conventional processor. It's like a specialized machine in a factory able to do one thing really well, on a particular kind of raw material. Performing a calculation on D-Wave's chip requires providing that raw material, in the form of the numbers to be fed into its hard-coded algorithm. It's done by setting the qubits into a pattern of 1s and 0s, and fine-tuning how the couplers allow the qubits to interact. After a wait of less than a second, the qubits settle into new values that represent a lower state of energy for the processor, and reveal a potential solution to the original problem.
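That settling process can be imitated on an ordinary computer with simulated annealing, the classical technique that D-Wave's quantum approach is named by analogy to. The sketch below is an analogue for intuition, not how the chip itself works, and the three-qubit problem is made up:

```python
import math
import random

def energy(spins, h, J):
    """Ising energy: sum_i h_i*s_i + sum_(i,j) J_ij*s_i*s_j."""
    return (sum(h[i] * s for i, s in enumerate(spins))
            + sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items()))

def anneal(h, J, steps=5000, t_start=2.0, seed=0):
    """Classical stand-in for the chip's settle step: propose random
    single-qubit flips, accepting uphill moves less and less readily
    as the 'temperature' is lowered toward zero."""
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in h]
    for step in range(steps):
        t = t_start * (1 - step / steps) + 1e-9  # cooling schedule
        i = rng.randrange(len(spins))            # propose flipping one qubit
        before = energy(spins, h, J)
        spins[i] = -spins[i]
        delta = energy(spins, h, J) - before
        # Always keep downhill moves; keep uphill ones only with a
        # probability that shrinks as the system cools.
        if delta > 0 and rng.random() >= math.exp(-delta / t):
            spins[i] = -spins[i]                 # revert the flip
    return spins

# A made-up problem instance: biases h and coupler strengths J.
h = [0.0, 0.0, 0.5]
J = {(0, 1): -1.0, (1, 2): 1.0}
solution = anneal(h, J)
```

The random walk ends in a low-energy pattern of 1s and 0s, which is read out as the answer. D-Wave's pitch is that quantum effects perform this same search natively and far faster than any such software loop.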
What happens during that crucial wait is a kind of quantum mechanical argument. The qubits enter a strange quantum state in which they are simultaneously both 1 and 0, like Schrödinger's cat being both dead and alive, and lock into a synchronicity known as entanglement, a phenomenon once described by Einstein as "spooky." That allows the system of qubits to explore every possible final configuration in an instant, before settling on the lowest-energy one, or one very close to it.
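Whether that claim holds up is exactly what skeptics question, but the stakes are easy to quantify: a classical machine checking candidate configurations one at a time faces 2^n of them for n two-state qubits.

```python
# Possible final configurations of D-Wave's 512-qubit processor.
n_configs = 2 ** 512
digits = len(str(n_configs))  # a 155-digit number
```

That count has 155 digits; the estimated number of atoms in the observable universe has about 81. Exploring all of those configurations at once, rather than one by one, is the entire promise.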
At least, that's what D-Wave's scientists say. Many questions remain about what actually happens inside the company's chips, not least in the heads of the company's own physicists, engineers, and computer scientists. "We're building this system empirically, not just following the theory," says Jeremy Hilton, the D-Wave vice president who leads its processor development. He and the company's other engineers don't know for sure what's happening in the chip, but as long as each design generates answers to the problems posed, the finer details of the quantum physics taking place inside can wait for retrospective validation.
It's an attitude that seems to have played well with investors, but it still rankles academics. "At an engineering level they've put together a setup that's impressive in various ways," says Scott Aaronson, an MIT professor who studies the limits of quantum computation. "But in terms of the evidence that they're solving problems using quantum mechanics faster than you could classically, I don't think it's there yet." A fierce critic of D-Wave in the years following its 2007 demo, Aaronson softened his stance last year after the company's Nature paper showing quantum effects. "In the past there was an enormous gap between the marketing claims and where the science was and that's come down, but there's still a gap," says Aaronson, who visited the company's labs in February. "The burden of proof is on them and they haven't met the burden yet."
Aaronson's biggest gripe is that the design of D-Wave's system could plausibly solve problems without quantum effects, in which case it would simply be a very weird conventional computer. He and other critics say the company must still prove two things: that its qubits really can enter superpositions and become entangled, and that the chip delivers a significant "quantum speed-up" compared to a classical computer working on the same problem. So far the company has presented proof of neither in a peer-reviewed forum.
Rose says that D-Wave is working to demonstrate entanglement, and that recent head-to-head tests against classical computers showed its machine pulling ahead on the kind of computing problem it is designed to solve.
Aaronson also says the way D-Wave's processor is hard-coded for one particular type of problem will inhibit the range of problems it might solve. In addition, the relatively small number of qubits on the processor today means it can handle only tiny strings of data. Using mathematical tricks to translate a problem into the right form to deal with those limitations, and reversing the process once D-Wave's chip has given its answer, could cause significant slowdowns, says Aaronson. Rose counters that a quantum processor will be fast enough to overcome any such penalties, and he says he has engineers working on ways to automatically translate normal programming code into what a D-Wave chip needs.
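One flavor of those mathematical tricks can be shown with a textbook mapping (a standard exercise, not anything D-Wave has published about its own tooling): splitting a list of numbers into two equal-sum halves becomes a lowest-energy-spin problem. Give each number a spin of +1 or −1 for which half it joins; the squared imbalance is zero exactly when the halves balance.

```python
from itertools import product

# Made-up numbers to split into two equal-sum halves.
numbers = [4, 7, 3, 6, 2]

def imbalance(spins):
    """(sum_i a_i * s_i)^2: zero exactly when the two halves balance.
    Expanding the square yields the pairwise J_ij terms an Ising chip
    would be programmed with."""
    return sum(a * s for a, s in zip(numbers, spins)) ** 2

# On real hardware the chip would settle into this minimum; here we
# simply enumerate all 2^5 spin patterns.
best = min(product([-1, 1], repeat=len(numbers)), key=imbalance)
half = [a for a, s in zip(numbers, best) if s == 1]
```

The minimum lands on the split {4, 7} versus {3, 6, 2}, each summing to 11. The translation itself is cheap here, but as Aaronson notes, squeezing larger problems into a few hundred qubits, and decoding the answer afterward, can eat into any speed advantage.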
Whether or not D-Wave can satisfy Aaronson and other skeptics doesn't necessarily matter to investors and technology companies. That's because in so many areas of business, computing power is crucial to maintaining a competitive advantage, says Steve Jurvetson, a partner at venture capital firm Draper Fisher Jurvetson, who has invested in D-Wave twice and calls it "the most singular swing-for-the-fences technology" he ever funded. "The application space for this," he says, "is anywhere we've had to fall back on an heuristic—a rule of thumb—to solve a problem: day traders, molecular modeling, anyone in e-commerce and the Googles and Microsofts of the world." Companies such as Lockheed, Amazon, and big pharma companies are most familiar with the limits of conventional computers and will be first in line, says Jurvetson, but designing a new car or a new online store could also benefit.
Companies and government agencies have another, perhaps more urgent motivation to take a chance on a startup that has a beguiling idea but a few troubling loose ends. There is good reason to believe that the exponential growth in computing power seen over the last few decades is ending, says Bob Lucas, who directs research on supercomputing and quantum computing at the University of Southern California, where Lockheed's D-Wave computer is installed. Many of the regular advances in computing power have come from the connections on chips shrinking year after year, but with leading chip maker Intel currently working on making them just 14 nanometers across, there is not much smaller they can get. "We're living in the last 10 years of exponential growth of [classical] computing power, and alternatives to that will become more of interest," Lucas says. He adds that through his experiments on Lockheed's D-Wave system he has been converted from "highly skeptical to cautiously optimistic" about the technology.