Sunday, October 30, 2016

Personal assistants - Sony Xperia

http://www.sonymobile.com/gb/products/smart-products/xperia-ear/#specifications

AT&T emerging tech


Data visualization
http://www.research.att.com/groups/infovis/videos.html?fbid=j0jSAVVPI59
http://www.corp.att.com/creativeservices/visualizer/

AT&T's Shape Conference
Focus areas: IoT, emerging technologies, AR/VR, smart cities, wearables
https://shape.att.com/

The Futurist Reports
The AT&T Foundry, Ericsson, and RocketSpace introduce the technologies of tomorrow.
http://www.futurecastseries.com/

Contest
https://developer.att.com/blog/category/augmented-reality

Considerations
https://developer.att.com/developer/forward.jsp?passedItemId=200116

Use cases
https://developer.att.com/blog/ar-use-cases

AR/VR video:
http://developer.att.com/video

Natural language processing (NLP) + deep learning vs. natural language understanding (NLU)

AT&T + IBM = IoT dev. (July 2016)
http://www-03.ibm.com/press/us/en/pressrelease/50168.wss
http://www.ibm.com/internet-of-things/iot-solutions/watson-iot-platform/

IoT dev platforms: M2X & Flow Designer:
https://www.business.att.com/enterprise/Family/internet-of-things/iot-platforms-development/
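
For a feel of what development against M2X looks like, here is a rough sketch of pushing one sensor reading into a device's data stream over M2X's v2 REST API. The endpoint path, the X-M2X-KEY header, and the payload shape are recalled from the M2X docs and may be inexact; the key and device ID are placeholders.

```python
# Rough sketch of pushing one sensor reading into an AT&T M2X stream.
# The endpoint path, X-M2X-KEY header, and payload shape are recalled
# from M2X's v2 REST API and may be inexact; key and IDs are placeholders.
import requests

M2X_KEY = "your-m2x-api-key"   # placeholder credential
DEVICE_ID = "your-device-id"   # placeholder device
STREAM = "temperature"         # a numeric data stream on the device

url = f"https://api-m2x.att.com/v2/devices/{DEVICE_ID}/streams/{STREAM}/values"
payload = {"values": [{"value": 22.5}]}  # single reading; server timestamps it (assumed)

resp = requests.post(url, json=payload, headers={"X-M2X-KEY": M2X_KEY})
print(resp.status_code, resp.text)
```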

--

https://en.wikipedia.org/wiki/Natural_language_understanding

--

AT&T Watson 

About AT&T Watson:
http://www.research.att.com/projects/WATSON/index.html

AT&T WATSON℠ is AT&T's speech and language engine that integrates a variety of speech technologies, including network-based, speaker-independent automatic speech recognition (ASR), AT&T Labs Natural Voices® text-to-speech conversion, natural language understanding (which includes machine learning), and dialog management tasks.
WATSON has been used within AT&T for IVR customers, including AT&T's VoiceTone® service, for over 20 years, during which time the algorithms, tools, and plug-in architecture have been refined to increase accuracy, convenience, and integration. Besides customer care IVR, AT&T WATSON℠ has been used for speech analytics, mobile voice search of multimedia data, video search, voice remote, voice mail to text, web search, and SMS.
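
Read as a pipeline, those components chain together in the classic spoken-dialog loop: ASR turns caller audio into text, NLU turns text into an intent, a dialog manager picks the next action, and TTS speaks the reply. A toy sketch of that loop follows; every function here is a hypothetical stub for illustration, not AT&T's API.

```python
# Toy spoken-dialog loop in the shape of an engine like WATSON.
# Every function below is a hypothetical stub for illustration,
# not AT&T's API.

def asr(audio: bytes) -> str:
    """Speech recognition: caller audio in, text out (stubbed)."""
    return "i want to pay my bill"

def nlu(text: str) -> dict:
    """Language understanding: text in, intent out (stubbed rule)."""
    return {"intent": "pay_bill" if "bill" in text else "unknown"}

def dialog_manager(frame: dict, state: dict):
    """Pick the next prompt and update dialog state."""
    if frame["intent"] == "pay_bill":
        state["task"] = "pay_bill"
        return "Sure, which account would you like to pay?", state
    return "Sorry, I didn't catch that.", state

def tts(text: str) -> bytes:
    """Text-to-speech (a la Natural Voices): text in, audio out (stubbed)."""
    return text.encode("utf-8")

def handle_turn(audio: bytes, state: dict):
    text = asr(audio)
    frame = nlu(text)
    prompt, state = dialog_manager(frame, state)
    return tts(prompt), state

audio_reply, state = handle_turn(b"<caller audio>", {})
print(audio_reply, state)
```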

Name origin: Alexander Graham Bell's assistant, Thomas A. Watson (as opposed to IBM's first president, Thomas J. Watson):
http://languagelog.ldc.upenn.edu/nll/?p=3918

--

http://www.interactions.com/press-releases/interactions-closes-strategic-deal-acquire-att-watson-speech-recognition-natural-language-understanding-platform/

2014
http://about.att.com/story/att_and_interactions_agree_to_strategic_transaction_in_speech_and_multi_modal_technology_arena.html

Video from 2012


--

Company claiming to have natural language understanding:

http://venturebeat.com/2016/08/11/deep-learning-alone-will-never-outperform-natural-language-understanding/

http://www.businesswire.com/news/home/20160712005547/en/Pat-Reveals-Radical-Breakthrough-A.I.-Launches-API

http://pat.ai/


Saturday, October 29, 2016

pop song written by AI

http://qz.com/790523/daddys-car-the-first-song-ever-written-by-artificial-intelligence-is-actually-pretty-good/

AI for the future of music: FlowMachines

Scientists at SONY CSL Research Lab have created the first-ever entire songs composed by artificial intelligence: "Daddy's Car" and "Mr Shadow".

The researchers have developed FlowMachines, a system that learns music styles from a huge database of songs. Exploiting unique combinations of style transfer, optimization and interaction techniques, FlowMachines composes novel songs in many styles. 
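
Sony hasn't detailed FlowMachines' internals here, but the core idea of learning a style from a corpus and then generating in that style can be illustrated with a toy first-order Markov chain over notes. This is a deliberately simplified stand-in, not the actual FlowMachines algorithm.

```python
# Toy "learn a style, generate in that style" model: a first-order
# Markov chain over note names. A deliberately simplified stand-in
# for FlowMachines, not its actual algorithm.
import random
from collections import defaultdict

def train(melodies):
    """Count note-to-note transitions across a corpus of melodies."""
    model = defaultdict(list)
    for melody in melodies:
        for a, b in zip(melody, melody[1:]):
            model[a].append(b)
    return model

def generate(model, start, length=16):
    """Random-walk the transition table to produce a new melody."""
    note, out = start, [start]
    for _ in range(length - 1):
        nexts = model.get(note)
        if not nexts:
            break
        note = random.choice(nexts)
        out.append(note)
    return out

# Tiny hypothetical "style corpus": two fragments in C major.
corpus = [["C", "E", "G", "E", "C"], ["E", "G", "A", "G", "E", "C"]]
print(generate(train(corpus), "C"))
```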

"Daddy's Car" is composed in the style of The Beatles. French composer Benoît Carré arranged and produced the songs, and wrote the lyrics. 

"Daddy's Car"


"Mr Shadow", an electro-experimental song in the style of American songwriters such as Irving Berlin, Duke Ellington, George Gershwin and Cole Porter:

"Mr. Shadow"


more examples here:
http://www.flow-machines.com/tag/listen-to-artificial-intelligence-music/




Monday, October 24, 2016

iPhone 8 design


Highlights:
  • clear Gorilla Glass sandwich with other polycarbonates - shatter resistant
  • completely transparent, to power an augmented reality experience similar to Microsoft’s HoloLens
  • next-generation 3D sensor from PrimeSense
  • battery and antennas hidden around the edges of the screen


http://bgr.com/2016/10/24/iphone-8-rumors-specs-design-revealed-or-nope/

Next year, Apple is expected to release a completely redesigned iPhone 8 to celebrate the 10th anniversary of the original iPhone. Following three consecutive years of seeing the same design reused in the iPhone 6, iPhone 6s and iPhone 7, Apple is expected to make dramatic changes to the appearance of its next-generation iPhone, and to the technology included inside the device. Previous reports have stated that the new iPhone 8 will feature an OLED display that covers almost all of the phone’s face, and the home button and fingerprint scanner will be embedded beneath the screen. The phone is also expected to have a glass back instead of aluminum, and metal will surround the outer edges as it did on the iPhone 4 and iPhone 4s.
According to a new report, however, these claims have it all wrong and the iPhone 8’s new design will be unlike anything the world has ever seen.
According to everything we’ve heard so far from tried and true sources, next year’s iPhone 8 sounds pretty terrific. The new OLED screen is long overdue and Apple’s removal of the home button may finally let it shrink down the overall size of its iPhones without decreasing the size of the displays. Fans are excited enough as it is, but now a new report claims that Apple’s upcoming iPhone redesign is far more revolutionary than any previous rumors claimed.
“The next iPhone will be, I am told, a clear piece of glass (er, Gorilla Glass sandwich with other polycarbonates for being pretty shatter resistant if dropped) with a next-generation OLED screen (I have several sources confirming this),” wrote longtime tech blogger Robert Scoble, current Entrepreneur in Residence at UploadVR, in a post on Facebook. “You pop it into a headset which has eye sensors on it, which enables the next iPhone to have a higher apparent frame rate and polygon count than a PC with a Nvidia 1080 card in it.”
He says the next-generation iPhone will be clear — as in, completely transparent so that users can see straight through it — so that Apple can use it to power an augmented reality experience similar to the one offered by Microsoft’s HoloLens.
“The phone itself has a next-generation 3D sensor from Primesense, which Apple bought,” Scoble added. “Apple has 600 engineers working in Israel on just the sensor. It’s the 10th anniversary of the iPhone. It’s the first product introduction in Apple’s new amazing headquarters. It’s a big f**king deal and will change this industry deeply.”
He continued, “Also, updates from new sources: expect battery and antennas to be hidden around the edges of the screen, which explains how Apple will fit in some of the pieces even while most of the chips that make up a phone are in a pack/strip at the bottom of the phone.”
This all sounds absolutely incredible… and completely implausible. First, we’re years away from having technology that would enable Apple to build an iPhone like the one Scoble describes with a battery and antennas “hidden around the edges of the screen” that would last longer than about 4 minutes on a charge. We’ve seen some pretty spectacular tech as far as transparent displays are concerned — check out this invisible TV from Panasonic, which is incredible — but scaling it down to a device the size of a smartphone isn’t happening in 2017.
Beyond that, this contradicts everything we’ve heard so far from the most reliable sources in the world when it comes to leaking Apple’s plans, such as The Wall Street Journal and KGI Securities analyst Ming-Chi Kuo. Scoble has plenty of sources in the startup world and he’s had some scoops in the past, but he doesn’t quite have the kind of track record with Apple scoops that would make us confident in these outlandish claims, however much we’d love for them to be true.

Sunday, October 16, 2016

Cybathlon: Cyborg Athletes

http://spectrum.ieee.org/the-human-os/biomedical/bionics/at-the-worlds-first-cyborg-olympics-manmachine-hybrids-raced-for-the-gold

http://www.cybathlon.ethz.ch/en/the-disciplines.html

  • Brain-Computer Interface Race
  • Functional Electrical Stimulation (FES) Bike Race
  • Powered Arm Prosthesis Race
  • Powered Exoskeleton Race
  • Powered Wheelchair Race
Wednesday, October 12, 2016

VR pay, Alibaba

https://www.yahoo.com/tech/alibabas-payment-system-lets-virtual-reality-shoppers-pay-133818491--sector.html

By Sijia Jiang
HONG KONG (Reuters) - Alibaba Group Holding's finance arm on Wednesday demonstrated a payment service that will allow virtual reality shoppers to pay for things in future just by nodding their heads.
VR Pay, the new payment system, is part of Alibaba's efforts to capitalise on the latest technology in online shopping. In 2015, for example, it introduced a facial recognition technology for its Alipay mobile payments service advertised as "pay with a selfie".
The VR payment technology means people using virtual reality goggles to browse virtual reality shopping malls will be able to pay for purchases without taking off the goggles. They can just nod or look instead.
Lin Feng, who is in charge of Ant Financial's incubator F Lab that has been developing the payment service over the past few months, told Reuters: "It is very boring to have to take off your goggles for payment. With this, you will never need to take out your phone."
User identity can be verified on VR Pay via account logins on connected devices or via voice print technology that recognises each person's unique voice. Lin said this was the most convenient method in a VR setting compared with other biometric recognition technologies.
But passwords will still be needed for authentication, which the user can also enter with head movements, touch, or by staring at a point on the virtual display for longer than 1.5 seconds, he said (see the gaze-dwell sketch after the article).
VR Pay is expected to be ready for commercial launch by the end of this year.
Ant Financial said its new VR-based payment infrastructure can make VR "a tool rather than just a toy" by connecting various VR goggle makers and app developers to the Alipay payments platform.
Ant Financial Services Group, which was demonstrating VR Pay in Shenzhen on Wednesday, operates China's largest online payments service Alipay with more than 450 million daily users.
In September, it bought U.S. company EyeVerify, a maker of optical verification technology used by U.S. banks including Wells Fargo.
(Reporting by Sijia Jiang. Editing by Jane Merriman)
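
The "stare for longer than 1.5 seconds" input described above is a standard gaze-dwell pattern: confirm a target once gaze has rested on it continuously past a threshold. Here is a minimal sketch of the idea; the class and the example target name are illustrative, not Alipay's implementation.

```python
# Minimal gaze-dwell selection sketch: confirm a target once gaze has
# rested on it continuously past a threshold. Illustrative only, not
# Alipay's implementation; the 1.5 s threshold matches the article.
DWELL_THRESHOLD = 1.5  # seconds

class DwellSelector:
    def __init__(self, threshold=DWELL_THRESHOLD):
        self.threshold = threshold
        self.target = None     # what the user is currently looking at
        self.elapsed = 0.0     # how long gaze has stayed on it

    def update(self, gazed_target, dt):
        """Call once per frame with the currently gazed target (or None)
        and the frame time dt. Returns the target when dwell completes."""
        if gazed_target != self.target:
            # Gaze moved: restart the timer on the new target.
            self.target, self.elapsed = gazed_target, 0.0
            return None
        if self.target is None:
            return None
        self.elapsed += dt
        if self.elapsed >= self.threshold:
            self.elapsed = 0.0   # avoid re-firing every frame
            return self.target   # selection confirmed
        return None

# Example: a 60 fps loop where the user stares at a "confirm payment" button.
selector = DwellSelector()
for frame in range(120):  # two seconds of frames
    selected = selector.update("confirm_payment", 1 / 60)
    if selected:
        print(f"selected {selected} at frame {frame}")
        break
```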

Sunday, October 9, 2016

gesture and machine personality




I saw a reference to a 'CNC machine', and I had no idea what it was. So, I googled it, and it came up on YouTube.

After watching, I realized this is totally my sense of humor, and I had to think about why.

Particularly, the surprise of not knowing what was coming and the revelation of the creation in 'mind' is funny. The moment the tool changes indicates when it's time for the next action, but my imagination has to immediately fathom what the tool is going to do before it shows me. Then, it begins the execution with utmost precision and zeal.

Watching a person do anything creative can be just as funny and fascinating if you don't know what they have in mind. If you do know, for example, that a person is going to draw, but you can't tell what they are drawing until it takes more form, the surprise is in observing where they begin, maybe compared to how I myself might think to begin.

The same kind of effect can be achieved by watching three people perform the same simple, familiar task - washing hands, typing, tossing something in the trash, shaving, selecting food from a buffet, running or walking, scraping the last bite of yogurt from the bottom of the cup. I always wanted to make a montage of this kind of activity because the comparison reveals personality through gesture, pace, intensity, hesitation, the thought process and decision making, the level of attention, any addition of subtasks, and criteria for satisfying the given task to completion. Such purely absurd actions sometimes render performers into ridiculous caricatures as they submit to whatever method is necessary to achieve the desired outcome.

One of the more hilarious such tasks is the act of licking an envelope. At the post office, I've witnessed some who lick their finger and press it across the glue strip, which strikes me as unhygienic, though perhaps they are either avoiding the taste, or trusting the sanitation of their own finger over the envelope. I myself would run my tongue up one length of the glue strip from end to center, then again to complete the other side, and I believe this to be the most thorough and efficient. However, I observed one person licking in rapid upward strokes across the glue strip, the way a dog pesters its owner for attention. This very energetic and frivolous performance was exacerbated by his being a fat, buck-toothed fry cook with rolls of neck fat covered in spiny crew-cut hair buckling from under a mesh ball cap.

But, back to the CNC machine. A machine's movement is funny because of the precision, and because of the timing. There are moments of perceived finesse and brutality - the most precise, micro movements reading as meticulous and sensitive, and the macro movements seeming merciless, frighteningly relentless and forceful as teeth bear into metal and the shards are cast away, appealing to my dark humor. Robots and machines might read as ultimate OCD introverts in that they exhibit strong intention and superior decisiveness, yet have no sense of self-awareness, and are concerned only with monomaniacal focus on the task at hand, like some kind of autistic adolescent.

In this instance, the automated task is funnier because there is no processing, no evaluation, just sheer execution. The only pause is when the next tool rolls out to begin the next task, and though it's a mere formality, it invites us to personify the machine as sentient.

Bear in mind, the machine is operated by a human, but it is performing a sequence of tasks previously handled by a human. Comparing human and machine performance of this operation might be funny by contrast, but the darkness in the humor would be stark.


Wednesday, October 5, 2016

google home







Google Gets Serious About Home Automation: Unveils Google Home, Actions on Google and Google Wifi (techcrunch.com)


At its hardware launch event earlier today, Google launched Google Home, a voice-activated speaker that aims to give Amazon's Echo a run for its money. The speaker is always listening and uses Google's Assistant to deliver sports scores, weather information, commute times, and much more. TechCrunch reports:

So like the Echo, Google Home combines a wireless speaker with a set of microphones that listen for your voice commands. There is a mute button on the Home and four LEDs on top of the device so you can know when it's listening to you; otherwise, you won't find any other physical buttons on it. As for music, Google Home will feature built-in support for Google Play Music, Spotify, Pandora and others. You can set up a default music service, too, so you don't always have to tell Google that you want to play a song "on Spotify." Google also noted that Home's music search is powered by Google, so it can understand relatively complex queries. Music on Google Home will also support podcast listening, and because it's a Cast device, you can stream music to it from any other Cast-enabled device. Home integrates with Google's Chromecasts and Cast-enabled TVs. For now, that mostly means watching YouTube videos, but Google says it will support Netflix, too. Google Home will cost $129 (with a free six-month trial of YouTube Red) and go on sale on Google's online store today. It will ship on November 4.

What's more, developers will be able to integrate their third-party apps with Google Assistant via "Actions on Google." With Actions on Google, developers will be able to create two kinds of actions: Direct and Conversation. Direct is made for relatively simple requests like home automation, while Conversation is made for back-and-forth interaction utilizing API.ai. Actions on Google will also allow third-party hardware to take advantage of Google Assistant. Those interested can sign up for the service today.

But Google didn't stop there. The company went on to reveal all-new, multi-point Wifi routers called Google Wifi. The Verge reports:

The Wifi router can be purchased two ways: as a single unit or in a multipack, just like Eero. A single unit is $129, while the three-pack will cost $299. Google says Wifi will be available for preorder in the U.S. in November and will ship to customers in December. There was no mention of international availability. Google says it has developed a number of technologies to make the Wifi system work, including intelligent routing of traffic from your phone or device to the nearest Wifi unit in your home. It supports AC 1200 wireless speeds, as well as simultaneous dual-band 2.4GHz and 5GHz networks. It also has beamforming technology and support for Bluetooth Smart. Google says the system will handle channel management and other traffic routing automatically.
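
For the Conversation action type, fulfillment in this era typically meant a webhook that API.ai called with the matched intent and its parameters. Here is a minimal sketch of what such a webhook might look like; the request/response field names are assumptions modeled on API.ai's v1 webhook format, and the "lights.on" action name is invented for illustration.

```python
# Hypothetical sketch of a Conversation-action fulfillment webhook.
# The request/response JSON shapes are assumptions modeled on API.ai's
# v1 webhook format, and "lights.on" is an invented intent action name.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/webhook", methods=["POST"])
def webhook():
    req = request.get_json(force=True)
    # API.ai's v1 format put the matched intent's action and parameters
    # under "result" (assumed here).
    result = req.get("result", {})
    action = result.get("action", "")
    params = result.get("parameters", {})

    if action == "lights.on":  # hypothetical home-automation intent
        room = params.get("room", "living room")
        speech = f"Okay, turning on the {room} lights."
    else:
        speech = "Sorry, I can't help with that yet."

    # "speech" is spoken aloud; "displayText" shows on screened devices.
    return jsonify({"speech": speech, "displayText": speech})

if __name__ == "__main__":
    app.run(port=5000)
```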

AI landscape: http://www.theverge.com/2016/10/5/13167230/walt-mossberg-google-pixel-phone-industry-shake-up
All of this hardware is controlled by their AI, via voice.

Look at the navigation across the top:

google daydream vr - $79 - November 16

google pixel phone

google home

google wifi