Wednesday, December 28, 2016

body scanner

https://www.bloomberg.com/news/articles/2016-12-28/how-body-scanning-became-the-latest-health-club-must-have

How Body Scanning Became the Latest Health Club Must-Have

Finally, a New Year's resolution you can keep.
Screenshots from the Naked app
 
Source: Naked
Walk into David Barton's new gym in Manhattan, TMPL, and you will be greeted by an array of high-tech fitness options—fingerprint scanners, giant screens with lifelike landscapes behind the Spin instructors, and a saltwater pool, all bathed in his trademark recessed LED lighting. But the real game-changing gadget here is not on the weight room floor. It's a Styku 3D body scanner, tucked away in a room near the showers that's next to a minibar serving protein shakes.  
If history is any guide, next week millions of people will make a New Year's resolution to go to one of the 180,000 gyms across the globe in an annual, usually ineffective, effort to lose a few pounds. The primary reason for this failure, according to the experts I spoke to, is that checking your weight is a misguided, demoralizing way to gauge overall health. "Many people are focused on the scale," said Mark de Gorter, chief executive officer of Workout Anytime. "But in doing so, they lose the bigger picture of transforming the body."
This could be you: a screen shot from the Naked app.
Source: Naked
Fitness gurus have long complained that the public's myopic focus on weight is counterproductive. Muscle is denser than fat, after all; because fat takes up 22 percent more space than an equal weight of muscle, the real measure should be volume. As you lose fat, you literally shrink, a fact you can feel in the fit of your clothes. But it's hard to be objective when the scale is still creaking beneath your feet.
Enter the body scanner, which allows you to visualize your muscle gain and see, in three dimensions, how you are losing fat—and where. Companies such as Styku and Fit3D, the latter available in select Equinox gyms, use a powerful camera, housed in an aluminum base about the size of a kid's tee-ball stand, to extract millions of data points in fewer than 30 seconds. The machine takes surface measurements of your waist, chest, and arms, then assembles, from over 600 infrared images, a 3D model that can be rotated, panned, and zoomed.
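Conceptually, pulling a circumference out of such a scan is simple: slice the reconstructed point cloud at a given height, then walk around the slice summing distances. The Python sketch below is a deliberately simplified illustration, not Styku's or Fit3D's actual pipeline (their algorithms and data formats are not public); the toy "scan" is just points on a circle of known radius.

```python
import math

def slice_points(points, height, tol=0.01):
    """Keep points whose z-coordinate lies within tol of the slice height."""
    return [(x, y) for (x, y, z) in points if abs(z - height) <= tol]

def circumference(slice_xy):
    """Order slice points by angle around their centroid, then sum the
    distances around the resulting loop -- a rough perimeter estimate."""
    cx = sum(p[0] for p in slice_xy) / len(slice_xy)
    cy = sum(p[1] for p in slice_xy) / len(slice_xy)
    ordered = sorted(slice_xy, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
    total = 0.0
    for i, (x, y) in enumerate(ordered):
        nx, ny = ordered[(i + 1) % len(ordered)]
        total += math.hypot(nx - x, ny - y)
    return total

# Toy "scan": 100 points on a 0.15 m-radius circle at waist height (1.0 m).
scan = [(0.15 * math.cos(t), 0.15 * math.sin(t), 1.0)
        for t in (2 * math.pi * i / 100 for i in range(100))]
waist = circumference(slice_points(scan, 1.0))
```

On this ideal circle the polygon perimeter comes out within a millimeter of the true circumference (2π × 0.15 m ≈ 0.94 m); a real scanner has to cope with noise, gaps, and non-convex slices.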

A Growing Trend

In the last year, health club execs such as De Gorter have discovered that the technology is one of the most effective ways to attract and retain clients. After testing it out at four of her clubs, Diana Williams, who founded Fernwood Fitness in Melbourne 26 years ago and now has 70 franchises across the continent, was so impressed she is rolling it out to her full network. "We use it as a selling tool but more as a retention tool," she said. "A measurement is just a number. But a visual image of what they look like, rather than their imagination, is much more motivating."
Because it converts these measurements to a metric people can understand, the scan also makes for easy before-and-after comparisons, said Raj Sareen, CEO of Los Angeles-based Styku, one of the largest suppliers of body-scanning technology to fitness clubs. The company introduced the equipment at trade shows in 2015 after a pilot program with smaller gyms; in the 12 months since, business has grown 550 percent year over year, and the scanner is now available in 350 locations in 25 countries. It has been introduced most recently in Korea and the U.K. and will launch at select gyms in Brazil by early 2017.
Raj Sareen, Styku CEO
Source: Styku
The technology is familiar to anyone who's raised their hands overhead inside a scanner at an airport. It's built off the innovations found in the motion-sensing technology of Microsoft's Kinect, part of the Xbox One introduced in 2011. Using it is a straightforward process: Stand on a raised circular platform that makes one 360-degree rotation while an infrared camera in the nearby aluminum stand takes pictures and then relays the information to a connected laptop.
David Barton, the fitness guru who in September opened TMPL (pronounced "temple," as in your body is one), pairs the Styku with an InBody machine, which measures body fat, and an on-site nutritionist to create a diet around the findings. "The most efficient way to change the outside is to know what's inside," he said.

An Accidental Discovery

The technology was not initially designed for health clubs. Sareen got his start by hacking webcams into makeshift body scanners, then really got into it when he saw the possibilities in Microsoft's Kinect, which could create lifelike 3D scans of objects with its high-powered camera. In 2012, his proposal was one of only 11 accepted to Techstars, the respected accelerator program, and he came out of it with a business plan to market the technology to clothing retailers, so that clothes would be the right size every time. (In essence, the perfect virtual fitting room.)
The Styku body scanner in action.
Source: Styku
Sareen did a pilot program with Nordstrom while one of his competitors, Bodymetrics, partnered with Bloomingdale's in New York and Selfridges in London. But the clothing industry is famously slow to adopt new technology, and it wasn't the right environment anyway. It turns out people were not ready for quite that level of reality while they were shopping. "We tried plastic surgeons, spas, dermatologists," said Sareen, but it wasn't until they went to health clubs that they found a receptive environment.
Even then, though, De Gorter was lukewarm the first time he saw it in action. "I thought seeing someone in 3D might be too revealing and too weird, but that notion was blown out of the water by everyone who tried it," he said. "Some people are a little reluctant to get on it, and some people don't like the results. But it becomes a great validator, a benchmark as they improve. Our early adopters saw the value in that right away."
Until now, the only ways to get an accurate measurement of body fat were calipers—those pincer-like tools that measure the thickness of skinfolds at your waist and arms—or an MRI, which is not a commercially viable proposition. By contrast, the Styku runs about $10,000 for a franchise operator, with no recurring fees at the moment. Fernwood Fitness's Williams says the cost is a worthwhile investment, given the competitive advantage. "It's an added service," she said. "We do charge for it, but if someone's not motivated, we'll give them another scan at no charge to keep them." She also offers short-term 12-week challenges at her gyms, with Styku scans before and after, "so they can see the difference," she said.
A Styku body scan readout
Source: Styku

Beyond Fitness

It's not just health clubs. The Fairmont Scottsdale Princess and the Four Seasons Resort & Club Dallas at Las Colinas have introduced the Bod Pod, an egg-shaped device that measures muscle-to-fat ratio, so that their nutritionists can make recommendations while clients are traveling for business or just taking a few days off.
It may be available at the consumer level soon, as well. Farhad Farahbakhshian, CEO of Naked, is developing a version of the technology that works with your phone and can be set up at home. His background is in electrical engineering and computer science, but some work as a part-time Spin instructor gave him insight into what keeps people motivated. 
"I saw people going through several New Year's resolutions and it wasn't about motivation," he said. "Everyone is motivated on the 1st of January." The main issue is that the more motivated you are, the more you want to see the changes. But there was no way to quantify that process, other than by weight. "People were making tremendous progress, but their weight wasn't changing," he continued. "They were using the wrong gauge to measure their progress."
He hopes to roll out a retail-friendly version of the product by November, and he is optimistic that people will adopt it. "The biggest challenge is just convincing people that it's real," he said. "They think it's something you see in Star Trek."  

Tuesday, December 27, 2016

Robot marriage

http://qz.com/871815/sex-robots-experts-predict-human-robot-marriage-will-be-legal-by-2050/

Experts predict human-robot marriage will be legal by 2050

With AI experts repeatedly predicting the rise of sex robots, it’s increasingly difficult to insist that such machines strictly belong to a far-off, dystopian future. But some robotics experts predict we’ll soon be doing far more than having sexual intercourse with machines. Instead, we’ll be making love to them—with all the accompanying romantic feelings.
At this week’s “Love and Sex with Robots” conference at Goldsmiths, University of London, David Levy, author of a book on human-robot love, predicted that human-robot marriages would be legal by 2050. Adrian Cheok, computing professor at City University London and director of the Mixed Reality Lab in Singapore, says the prediction is not so farfetched.
“That might seem outrageous because it’s only 35 years away. But 35 years ago people thought homosexual marriage was outrageous,” says Cheok, who also spoke at the conference. “Until the 1970s, some states didn’t allow white and black people to marry each other. Society does progress and change very rapidly.”
And though human-robot marriage might not be legal until 2050, Cheok believes humans will be living with robot partners long before then.
Though Cheok acknowledges that sex robots could fulfill sexist male sexual fantasies, he believes robot-human marriages will have an overwhelmingly positive effect on society. “People assume that everyone can get married, have sex, fall in love. But actually many don’t,” he says. And even those who do might be in search of a different option. “A lot of human marriages are very unhappy,” Cheok says. “Compared to a bad marriage, a robot will be better than a human.”
Though various sex robots are on the market, there are none that come close to resembling a human sexual partner—and there’s certainly nothing like the type of humanoid robot capable of replicating a loving relationship. However, Cheok believes the greatest technological difficulty in creating love robots is not a mechanical challenge, but a matter of developing the software necessary to build a robot that understands human conversation skillfully enough for the job.
Once that problem has been addressed, Cheok sees no problem with romances between man and machine. “If a robot looks like it loves you, and you feel it loves you, then you’re essentially going to feel like it’s almost human love,” he says. Cheok points out that in Japan and South Korea, there are already cases of humans falling in love with computer characters. Cheok also compares robot love to human emotions for other species, such as pet cats. “We already have very high empathy for non-human creatures. That’s why I think once we have robots that act human, act emotional, or look human, it’s going to be a small jump for us to feel empathy towards robots,” he says.
Others are less convinced. Oliver Bendel, professor at University of Applied Sciences and Arts in Switzerland, with a focus on machine ethics, says he does not believe sex or love robots will have moral standing. “Marriage is a form of contract between human beings to regulate mutual rights and obligations including the care and the welfare of children. Perhaps one day robots can have real duties and rights, though I don’t really believe it,” he says. However, he acknowledges that human-robot marriage could become legal by 2050 simply in response to public pressure.
Then again, Bendel says, legislation could move in the other direction: As sex and love robots become more realistic, governments could choose to ban sexual relationships between humans and machines. Either way, though the technology is not ready yet, experts believe it’s best to start figuring out the moral conundrums now, so that we’ll be prepared once romantic sex robots do arrive.

Apple Publishes Its First AI Research Paper

When Apple said it would publish its artificial intelligence research, it raised at least a couple of big questions. When would we see the first paper? And would the public data be important, or would the company keep potential trade secrets close to the vest? At last, we have answers. Apple researchers have published their first AI paper, and the findings could clearly be useful for computer vision technology.
The paper tackles the problem of teaching AI to recognize objects using simulated images, which are easier to work with than photos (since no human needs to tag items) but poorly suited, on their own, to real-world situations. The trick, Apple says, is to use the increasingly popular technique of pitting neural networks against each other: one network trains itself to improve the realism of simulated images (in this case, using photo examples) until they're good enough to fool a rival "discriminator" network. Ideally, this pre-training would save massive amounts of time and account for hard-to-predict situations that don't always turn up in photos.
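That adversarial refinement setup can be illustrated with a deliberately tiny one-dimensional analogue. Here the "refiner" is a single learnable offset applied to biased simulated samples, and the discriminator's feedback is collapsed to the gap between sample means; the actual paper trains convolutional networks on images, so everything below is a stand-in for the shape of the idea, not the method itself.

```python
# Toy 1-D analogue of adversarial refinement: simulated samples carry a
# systematic bias; a "refiner" (one learnable offset) is trained until the
# refined samples are statistically indistinguishable from real ones.
def mean(xs):
    return sum(xs) / len(xs)

real = [0.9, 1.1, 1.0, 0.95, 1.05]        # stand-in for real photos
simulated = [2.9, 3.1, 3.0, 2.95, 3.05]   # biased synthetic renders

offset = 0.0   # the refiner's single parameter
lr = 0.1       # learning rate
for step in range(200):
    refined = [x + offset for x in simulated]
    # Discriminator feedback, collapsed to a mean gap: positive while the
    # refined samples still look "too synthetic" (mean too high).
    gap = mean(refined) - mean(real)
    offset -= lr * gap  # refiner update: shrink the gap

refined = [x + offset for x in simulated]
```

After training, the refined samples match the real distribution's mean almost exactly; in the paper, the analogous outcome is simulated images realistic enough to train a vision model that transfers to real photos.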
This doesn't mean that Apple is suddenly an open book. It could take years before it's clear how transparent Apple has become with its scientific findings. However, this is a big step -- if also a necessary one. AI is an increasingly competitive field, and Apple's past reluctance to contribute to scientific knowledge may have scared away potential hires who wanted their discoveries recognized. If papers like these become relatively commonplace, Apple might have an easier time attracting the talent it needs for self-driving car platforms, Siri and other AI-based projects.

Lumus AR

http://lumus-optical.com/

http://www.roadtovr.com/augmented-reality-company-lumus-secures-30m-funding/?utm_source=Road+to+VR+Daily+News+Roundup&utm_campaign=ede48fe652-RtoVR_RSS_Daily_Newsletter&utm_medium=email&utm_term=0_e2e394ad33-ede48fe652-168175841

AR Optics Company Lumus Secures Further $30M in Funding

 
Lumus Ltd. recently confirmed a $30 million funding round led by Quanta and HTC. Lumus is a leading provider and developer of augmented reality technology.
Following from the $15 million funding led by Shanda Group and Crystal-Optech in June, Lumus Ltd. recently announced a $30 million Series C round of funding led by Quanta Computer, the world’s largest notebook computer ODM company, along with HTC and other strategic investors. “AR/VR is well aligned with our growth strategy and we’re pleased to invest in the Lumus optics solution for augmented reality”, says C.C. Leung, vice chairman and president of Taiwan-based Quanta. “This is pioneering technology, and we have great confidence in Lumus as an innovator and industry leader for transparent optical displays in the AR market.”
Founded in 2000, Lumus has established itself as a leading provider and developer of the core enabling technology for augmented reality. Their ‘Optical Engine’, which combines a patented Light-Guide Optical Element and a miniature projector, is already found in AR devices used in industries such as aviation, logistics, medical care, and the military. The technology is also being applied to consumer products in the development stage, with the current DK-50 development kit being provided to leading consumer electronics and smart eye-wear manufacturers, sporting a 40 degree field of view – larger than Microsoft’s HoloLens. Speaking to TechCrunch, Lumus CEO Ben Weinberger revealed that a prototype with a 50% larger field of view than the DK-50 will be shown at CES next month.
HTC’s interest is understandable, as their current involvement in VR will inevitably converge with AR in the near future. Alvin Wang Graylin, HTC China Regional President of Vive, recently predicted that the first ‘integrated selectable AR+VR product’ would arrive in 2018. “We are very committed to AR/VR,” says David Chang, COO of HTC. “Our current investment is aligned with HTC’s natural extension into augmented reality following our successful VIVE launch earlier this year.”
While the combined $45 million funding is eclipsed by the staggering $793.5 million Series C investment in Magic Leap earlier this year, the new financing is a tremendous boost to Lumus. According to the press release, they plan to ‘expand development, operations, and marketing of its display technology for the AR and smart eyewear industry’. The rapid rise of AR investment is expected to continue; in a recent IDC study, it was predicted that AR could become an everyday technology for more than a billion consumers within the next five years, with 30 percent of Global 2000 companies incorporating AR and VR into their marketing programs during 2017.
“This new funding will help Lumus continue to scale up our R&D and production in response to the growing demand from companies creating new augmented reality and mixed reality applications, including consumer electronics and smart eyeglasses,” says Ben Weinberger. “We also plan to ramp up our marketing efforts in order to realize and capture the tremendous potential of our unique technology to re-envision reality in the booming AR industry.”

Monday, December 19, 2016

Zuckerberg AI Jarvis



https://www.fastcompany.com/3066478/mind-and-machine/mark-zuckerberg-jarvis

...design the system to "visualize data in VR to help me build better services and lead my organizations [at Facebook] more efficiently."

Zuckerberg's notes on Jarvis - 100-150 hours, in-home personal assistant via phone app.
http://bit.ly/2h4tuN7

- Object recognition, face recognition
- Messenger bot
- Text vs. speech
- Speech: quality benefits from context specialization, audio condition specialization

https://ca.news.yahoo.com/virtual-butler-jarvis-takes-residence-facebook-founders-home-030407616--finance.html

AR VR UI UX

http://www.roadtovr.com/visualising-ui-solutions-mixed-reality-future/?utm_source=Road+to+VR+Daily+News+Roundup&utm_campaign=8643d4cf45-RtoVR_RSS_Daily_Newsletter&utm_medium=email&utm_term=0_e2e394ad33-8643d4cf45-168175841


Thursday, December 15, 2016

reverse aging

http://www.telegraph.co.uk/science/2016/12/15/scientists-reverse-ageing-mammals-predict-human-trials-within/

An end to grey hair and crow's feet could be just 10 years away after scientists showed it is possible to reverse ageing in animals.
Using a new technique that takes adult cells back to their embryonic form, US researchers at the Salk Institute in California showed it was possible to reverse ageing in mice, allowing the animals not only to look younger but to live 30 per cent longer.
The technique involves stimulating four genes which are particularly active during development in the womb. It was also found to work to turn the clock back on human skin cells in the lab, making them look and behave younger.

Scientists hope to eventually create a drug which can mimic the effect of the found genes which could be taken to slow down, and even reverse the ageing process. They say it will take around 10 years to get to human trials.
"Our study shows that ageing may not have to proceed in one single direction," said Dr Juan Carlos Izpisua Belmonte, a professor in Salk's Gene Expression Laboratory. “With careful modulation, aging might be reversed.
"Obviously, mice are not humans and we know it will be much more complex to rejuvenate a person. But this study shows that ageing is a very dynamic and plastic process, and therefore will be more amenable to therapeutic interventions than what we previously thought."
Scientists have known for some time that the four genes, which are known collectively as the Yamanaka factors, could turn adult cells back to their stem cell state, where they can grow into any part of the body.
But it was always feared that allowing that to happen could damage organs made from the cells, and even trigger cancer.
However, it was discovered that stimulating the genes intermittently reversed ageing, without causing any damaging side effects.
In mice with a premature ageing disease, the treatment countered signs of ageing and increased their lifespan by 30 per cent. If it worked similarly in humans, it could allow people to live beyond 100. In healthy mice it also helped damaged organs heal faster.
"In other studies scientists have completely reprogrammed cells all the way back to a stem-cell-like state," says co-first author Pradeep Reddy, also a Salk research associate.
"But we show, for the first time, that by expressing these factors for a short duration you can maintain the cell's identity while reversing age-associated hallmarks."
The breakthrough could also help people stay healthier for longer. As the population ages, the risk of developing age-related diseases such as dementia, cancer and heart disease rises. But if the body could be kept younger for longer, many deadly diseases could be staved off for decades.

Tuesday, December 13, 2016

machine learning sales - translation

http://venturebeat.com/2016/12/05/leadgenius-raises-4-million-for-machine-learning-sales-tool/

AI use cases


http://www.forbes.com/sites/bernardmarr/2016/09/30/what-are-the-top-10-use-cases-for-machine-learning-and-ai/#1b29198b10cf

The Top 10 AI And Machine Learning Use Cases Everyone Should Know About

I write about big data, analytics and enterprise performance  
Machine learning is a buzzword in the technology world right now, and for good reason: It represents a major step forward in how computers can learn.
Very basically, a machine learning algorithm is given a “teaching set” of data, then asked to use that data to answer a question. For example, you might provide a computer a teaching set of photographs, some of which say, “this is a cat” and some of which say, “this is not a cat.” Then you could show the computer a series of new photos and it would begin to identify which photos were of cats.
Machine learning then continues to add to its teaching set. Every photo that it identifies — correctly or incorrectly — gets added to the teaching set, and the program effectively gets “smarter” and better at completing its task over time.
It is, in effect, learning.
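That feedback loop (predict, get a label, fold the example back into the teaching set) can be sketched in a few lines. The classifier below is a made-up nearest-centroid model over a single invented feature; real image classifiers are vastly richer, but the growing-teaching-set mechanic is the same.

```python
# Minimal sketch of the "teaching set" idea: a nearest-centroid classifier
# over one made-up feature (say, "ear pointiness"). Each newly labeled
# example is folded back into the teaching set, so the centroids -- and
# hence the decision boundary -- keep adapting over time.
class TeachingSetClassifier:
    def __init__(self, labeled):
        # labeled: list of (feature, is_cat) pairs -- the teaching set
        self.examples = list(labeled)

    def _centroid(self, is_cat):
        vals = [f for f, c in self.examples if c == is_cat]
        return sum(vals) / len(vals)

    def predict(self, feature):
        cat_c, not_c = self._centroid(True), self._centroid(False)
        return abs(feature - cat_c) < abs(feature - not_c)

    def learn(self, feature, is_cat):
        # Fold the newly labeled photo back into the teaching set.
        self.examples.append((feature, is_cat))

clf = TeachingSetClassifier([(0.9, True), (0.8, True),
                             (0.2, False), (0.1, False)])
guess = clf.predict(0.7)   # close to the "cat" centroid
clf.learn(0.7, guess)      # the teaching set grows with every photo
```

Each call to `learn` shifts the centroids slightly, which is the toy version of the program getting "smarter" at its task as its teaching set accumulates.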
  • Data Security: Malware is a huge — and growing — problem. In 2014, Kaspersky Lab said it had detected 325,000 new malware files every day. But security company Deep Instinct says that each piece of new malware tends to share almost all of its code with previous versions — only between 2 and 10 percent of the files change from iteration to iteration. Its learning model has no problem with the 2–10 percent variations and can predict which files are malware with great accuracy. In other situations, machine learning algorithms can look for patterns in how data in the cloud is accessed and report anomalies that could predict security breaches.
  • Personal Security: If you’ve flown on an airplane or attended a big public event lately, you almost certainly had to wait in long security screening lines. But machine learning is proving it can help eliminate false alarms and spot things human screeners might miss in security screenings at airports, stadiums, concerts, and other venues. That can speed up the process significantly and ensure safer events.
  • Financial Trading: Many people are eager to predict what the stock markets will do on any given day — for obvious reasons. But machine learning algorithms are getting closer all the time. Many prestigious trading firms use proprietary systems to predict and execute trades at high speed and high volume. Many of these rely on probabilities, but even a trade with a relatively low probability, at a high enough volume or speed, can turn huge profits for the firms. And humans can’t possibly compete with machines when it comes to consuming vast quantities of data or the speed with which they can execute a trade.
  • Healthcare: Machine learning algorithms can process more information and spot more patterns than their human counterparts. One study used computer-assisted diagnosis (CAD) to review the early mammography scans of women who later developed breast cancer, and the computer spotted 52 percent of the cancers as much as a year before the women were officially diagnosed. Machine learning can also be used to understand risk factors for disease in large populations. The company Medecision developed an algorithm able to identify eight variables that predict avoidable hospitalizations in diabetes patients.
  • Marketing Personalization: The more you understand about your customers, the better you can serve them, and the more you will sell. That’s the foundation behind marketing personalization. Perhaps you’ve had the experience in which you visit an online store and look at a product but don’t buy it — and then see digital ads across the web for that exact product for days afterward. That kind of personalization is just the tip of the iceberg. Companies can personalize which emails a customer receives, which direct mailings or coupons, which offers they see, which products show up as “recommended,” and so on, all designed to lead the consumer more reliably toward a sale.
  • Fraud Detection: Machine learning is getting better and better at spotting potential cases of fraud across many different fields. PayPal, for example, is using machine learning to fight money laundering. The company has tools that compare millions of transactions and can precisely distinguish between legitimate and fraudulent transactions between buyers and sellers.
  • Recommendations: You’re probably familiar with this use if you use services like Amazon or Netflix. Machine learning algorithms analyze your activity and compare it to that of millions of other users to determine what you might like to buy or binge-watch next. These recommendations are getting smarter all the time, recognizing, for example, that you might purchase certain things as gifts (and not want the item yourself) or that different family members have different TV preferences.
  • Online Search: Perhaps the most famous use of machine learning. Google and its competitors are constantly improving what the search engine understands. Every time you execute a search on Google, the program watches how you respond to the results. If you click the top result and stay on that web page, it can assume you got the information you were looking for and the search was a success. If, on the other hand, you click through to the second page of results, or type a new search string without clicking any of the results, it can surmise that the search engine didn’t serve up the results you wanted — and learn from that mistake to deliver better results in the future.
  • Natural Language Processing (NLP): NLP is being used in all sorts of exciting applications across disciplines. Machine learning algorithms with natural-language capabilities can stand in for customer service agents and more quickly route customers to the information they need. NLP is also being used to translate obscure legalese in contracts into plain language and to help attorneys sort through large volumes of information to prepare for a case.
  • Smart Cars: IBM recently surveyed top auto executives, and 74 percent expected that we would see smart cars on the road by 2025. A smart car would not only integrate into the Internet of Things but also learn about its owner and its environment. It might adjust the internal settings — temperature, audio, seat position, etc. — automatically based on the driver, report and even fix problems itself, drive itself, and offer real-time advice about traffic and road conditions.
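Returning to the Recommendations entry above: the compare-your-activity-to-other-users idea is classic user-based collaborative filtering. The sketch below uses invented ratings and a deliberately crude similarity measure; production systems work over enormous sparse matrices with far more robust math, but the weighted-vote structure is the same.

```python
# User-based collaborative filtering in miniature: score items a user has
# not seen by the ratings of users with similar taste. Ratings invented.
ratings = {
    "ann":  {"sci-fi": 5, "drama": 1, "comedy": 4},
    "ben":  {"sci-fi": 4, "drama": 2, "comedy": 5, "horror": 4},
    "cara": {"sci-fi": 1, "drama": 5, "comedy": 2, "horror": 1},
}

def similarity(a, b):
    """Agreement over co-rated items: inverse of mean absolute difference."""
    shared = set(ratings[a]) & set(ratings[b])
    if not shared:
        return 0.0
    diff = sum(abs(ratings[a][i] - ratings[b][i]) for i in shared) / len(shared)
    return 1.0 / (1.0 + diff)

def recommend(user):
    """Return the unseen item with the highest similarity-weighted score."""
    seen = set(ratings[user])
    scores = {}
    for other in ratings:
        if other == user:
            continue
        w = similarity(user, other)
        for item, r in ratings[other].items():
            if item not in seen:
                scores[item] = scores.get(item, 0.0) + w * r
    return max(scores, key=scores.get)
```

With these toy numbers, ann's tastes align with ben's far more than with cara's, so ben's enthusiasm for horror dominates the weighted vote for her one unseen genre.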