Wednesday, April 24, 2024

Meta smart glasses update

The Ray-Ban Meta Smart Glasses Have Multimodal AI Now (theverge.com)

The Ray-Ban Meta Smart Glasses now feature support for multimodal AI -- without the need for a projector or $24 monthly fee. (We're looking at you, Humane AI.) With the new update, the Meta AI assistant will be able to analyze what you're seeing, and it'll give you smart, helpful answers or suggestions. The Verge reports: First off, there are some expectations that need managing here. The Meta glasses don't promise everything under the sun. The primary command is to say "Hey Meta, look and..." You can fill out the rest with phrases like "Tell me what this plant is." Or read a sign in a different language. Write Instagram captions. Identify and learn more about a monument or landmark. The glasses take a picture, the AI communes with the cloud, and an answer arrives in your ears. The possibilities are not limitless, and half the fun is figuring out where its limits are. [...]

To me, it's the mix of a familiar form factor and decent execution that makes the AI workable on these glasses. Because it's paired to your phone, there's very little wait time for answers. It's headphones, so you feel less silly talking to them because you're already used to talking through earbuds. In general, I've found the AI to be the most helpful at identifying things when we're out and about. It's a natural extension of what I'd do anyway with my phone. I find something I'm curious about, snap a pic, and then look it up. Provided you don't need to zoom really far in, this is a case where it's nice to not pull out your phone. [...]

But AI is a feature of the Meta glasses. It's not the only feature. They're a workable pair of livestreaming glasses and a good POV camera. They're an excellent pair of open-ear headphones. I love wearing mine on outdoor runs and walks. I could never use the AI and still have a product that works well. The fact that it's here, generally works, and is an alright voice assistant -- well, it just gets you more used to the idea of a face computer, which is the whole point anyway. 
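The capture-then-ask loop The Verge describes is simple to picture in code. Here is a minimal Python sketch of that pattern; the function names (capture_photo, query_vision_model, handle_command) and the payload shape are all invented for illustration and have nothing to do with Meta's actual glasses firmware or API.

```python
# Hypothetical sketch of the "Hey Meta, look and..." flow described above.
# None of these functions are Meta's real API; they stand in for the glasses'
# camera and the cloud multimodal model that the paired phone relays to.

import base64

def capture_photo() -> bytes:
    """Placeholder for the glasses' camera capture; returns fake JPEG bytes."""
    return b"\xff\xd8<fake jpeg bytes>\xff\xd9"

def query_vision_model(image: bytes, prompt: str) -> str:
    """Stand-in for the cloud multimodal model; a real client would POST this
    payload to the assistant backend and return its text answer."""
    payload = {
        "prompt": prompt,
        "image_b64": base64.b64encode(image).decode("ascii"),
    }
    return f"(model answer to: {payload['prompt']})"

def handle_command(spoken: str) -> str:
    """'Hey Meta, look and ...' -> snap a photo -> ask the cloud -> speak the answer."""
    trigger = "hey meta, look and"
    if not spoken.lower().startswith(trigger):
        return "(not a look-and-ask command)"
    question = spoken[len(trigger):].strip()
    return query_vision_model(capture_photo(), question)

if __name__ == "__main__":
    print(handle_command("Hey Meta, look and tell me what this plant is"))
```

The interesting design point the review hints at is that the heavy lifting happens in the cloud: the glasses only need a camera, a microphone, and speakers, with the paired phone acting as the relay, which is why the answers come back with so little wait time.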

Gen AI + CRISPR

Forget about EVs. Pretty soon, we'll all fly to work with wings on our backs.

Oops, don't mind the nutsack dangling from my forehead. That was a typo.


Generative AI Arrives In the Gene Editing World of CRISPR (nytimes.com)

An anonymous reader quotes a report from the New York Times: Generative A.I. technologies can write poetry and computer programs or create images of teddy bears and videos of cartoon characters that look like something from a Hollywood movie. Now, new A.I. technology is generating blueprints for microscopic biological mechanisms that can edit your DNA, pointing to a future when scientists can battle illness and diseases with even greater precision and speed than they can today. Described in a research paper published on Monday by a Berkeley, Calif., startup called Profluent, the technology is based on the same methods that drive ChatGPT, the online chatbot that launched the A.I. boom after its release in 2022. The company is expected to present the paper next month at the annual meeting of the American Society of Gene and Cell Therapy. "Its OpenCRISPR-1 protein is built on a similar structure as the fabled CRISPR-Cas9 DNA snipper, but with hundreds of mutations that help reduce its off-target effects by 95%," reports Fierce Biotech, citing the company's preprint manuscript published on BioRxiv. "Profluent said it can be employed as a 'drop-in replacement' in any experiment calling for a Cas9-like molecule."

While Profluent will keep its LLM generators private, the startup says it will open-source the products of this initiative. "Attempting to edit human DNA with an AI-designed biological system was a scientific moonshot," Profluent co-founder and CEO Ali Madani, Ph.D., said in a statement. "Our success points to a future where AI precisely designs what is needed to create a range of bespoke cures for disease. To spur innovation and democratization in gene editing, with the goal of pulling this future forward, we are open-sourcing the products of this initiative."
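For anyone who hasn't followed protein language models: the underlying technique is the same next-token sampling that ChatGPT uses, just over amino acids instead of words. Below is a toy Python sketch of that idea only; the "model," the filter, and the sequence length are made up for illustration and reflect nothing about Profluent's actual system or training data.

```python
# Toy illustration of autoregressive protein generation, the general technique
# behind models like the one described above. The "model" here is a random
# stand-in; the filtering criterion is arbitrary and purely illustrative.

import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues

def toy_next_residue(prefix: str) -> str:
    """Stand-in for a protein language model's next-token distribution.
    A real model would condition on the prefix; this toy samples uniformly."""
    return random.choice(AMINO_ACIDS)

def generate_candidate(length: int = 60) -> str:
    """Sample one candidate sequence residue by residue (autoregressively)."""
    seq = "M"  # proteins typically start with methionine
    while len(seq) < length:
        seq += toy_next_residue(seq)
    return seq

def crude_filter(seq: str) -> bool:
    """Placeholder for the real in-silico and wet-lab screens (e.g. off-target
    activity) that separate usable editors from the bulk of generated sequences."""
    return seq.count("C") <= 6  # arbitrary toy criterion

if __name__ == "__main__":
    candidates = [generate_candidate() for _ in range(100)]
    keepers = [s for s in candidates if crude_filter(s)]
    print(f"kept {len(keepers)} of {len(candidates)} candidates")
```

In practice the hard part is everything this toy skips: training on large CRISPR-family protein datasets and validating candidates like OpenCRISPR-1 in cells, which is where claims like the 95% reduction in off-target effects have to be earned.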

Friday, April 12, 2024

home quantum computing

 

New Advances Promise Secure Quantum Computing At Home (phys.org)

Scientists from Oxford University Physics have developed a breakthrough in cloud-based quantum computing that could allow it to be harnessed by millions of individuals and companies. The findings have been published in the journal Physical Review Letters. Phys.Org reports: In the new study, the researchers use an approach dubbed "blind quantum computing," which connects two totally separate quantum computing entities -- potentially an individual at home or in an office accessing a cloud server -- in a completely secure way. Importantly, their new methods could be scaled up to large quantum computations. "Using blind quantum computing, clients can access remote quantum computers to process confidential data with secret algorithms and even verify the results are correct, without revealing any useful information. Realizing this concept is a big step forward in both quantum computing and keeping our information safe online," said study lead Dr. Peter Drmota, of Oxford University Physics.

The researchers created a system comprising a fiber network link between a quantum computing server and a simple device detecting photons, or particles of light, at an independent computer remotely accessing its cloud services. This allows so-called blind quantum computing over a network. Every computation incurs a correction that must be applied to all that follow and needs real-time information to comply with the algorithm. The researchers used a unique combination of quantum memory and photons to achieve this. The results could ultimately lead to commercial development of devices to plug into laptops, to safeguard data when people are using quantum cloud computing services.
"We have shown for the first time that quantum computing in the cloud can be accessed in a scalable, practical way which will also give people complete security and privacy of data, plus the ability to verify its authenticity," said Professor David Lucas, who co-heads the Oxford University Physics research team and is lead scientist at the UK Quantum Computing and Simulation Hub, led from Oxford University Physics.


I've been wondering when predictive analytics will be available to consumers. Here's a starter conversation.

 https://medium.com/illumination-curated/prediction-ai-is-coming-are-you-and-the-world-ready-93da983994b8

Thursday, April 11, 2024

Joker films released before US Presidential elections


The previous Joker film builds up the Joker as a bullied victim who ultimately leads a violent mob.

The new Joker sequel trailer alludes to 'change' and mob violence in such a way that I couldn't help but check the release dates relative to the US elections... I can only speculate what cake will build up ahead of the election for this film to ice.

Bring in Lady Gaga and really sell a romantic revolution.

No coincidence, the first film was released October 4 of 2019, and the sequel will be released October 4 of 2024.

No doubt, the film and actors will win awards. Genius! Stupendous! 

https://www.youtube.com/watch?v=2UInBwhQQ0A



Thursday, April 4, 2024

Apple home robots

 

Apple Reportedly Exploring Personal Home Robots (cnbc.com)

As reported by Bloomberg (paywalled), Apple is exploring the development of personal home robots following the shutdown of its electric vehicle project. CNBC reports: Engineers at Apple have been looking into a robot that can follow users around their homes and a tabletop device that uses robotics to adjust a display screen, Bloomberg reported, citing people familiar with the research team. [...] Apple's hardware engineering division and its artificial intelligence and machine learning group are overseeing the work on personal robotics, Bloomberg reported. The home robot project is still in the early research and development phase, according to the report.