Friday, April 28, 2023

Tokyo Has 20x As Much Wi-Fi As It Needs

 Not that this is interesting, but it's hilarious.


Tokyo Has 20x As Much Wi-Fi As It Needs (theregister.com)

An anonymous reader quotes a report from The Register: Tokyo has five million Wi-Fi access points -- and that's 20 times what the city needs, because they're reserved for private use, according to NTT. The Japanese tech giant proposes sharing the fleet to cope with increased demand for wireless comms without adding more hardware. NTT says it's successfully tested network sharing with a scheme that starts by asking operators of Wi-Fi access points or other connections if they're open to sharing their bandwidth and allowing random netizens to connect. In return they get a share of revenue from those connections.

Under the scheme, netizens search for available networks and, as they connect, a contract would be executed allowing a link to be made. That contract would use Ethereum Proof of Authority to verify identities and initiate the back-end billing arrangements before allowing signed-up users and devices to join private networks. The operator of the Wi-Fi access point gets paid, the punter gets a connection, and everything's on a blockchain so the results can be read for eternity. [...] If this all scales, NTT estimates Tokyo won't need to add any more Wi-Fi access points or private 5G cells, even as demand for connectivity increases. The company also suggests it can enable networks to scale without requiring commensurate increases in energy consumption, and that spectrum will also be freed for other uses.
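The flow described above (user connects → contract executes → identity is verified → operator gets paid) can be sketched as a toy ledger. This is a minimal illustration, not NTT's actual design: the class name, the 30% operator revenue share, and the local list standing in for the Proof-of-Authority chain are all assumptions for demonstration.

```python
from dataclasses import dataclass, field

@dataclass
class SharedWifiLedger:
    """Toy append-only ledger for the connect -> contract -> billing flow.

    In NTT's described scheme the contract runs on an Ethereum
    Proof-of-Authority chain; this local list just stands in for that
    record-keeping. The 30% operator share is an illustrative assumption.
    """
    operator_share_pct: int = 30
    entries: list = field(default_factory=list)

    def connect(self, user_id: str, ap_id: str, fee_cents: int) -> dict:
        # A real implementation would verify the user's identity via the
        # PoA contract before admitting them to the private network.
        entry = {
            "user": user_id,
            "access_point": ap_id,
            "fee_cents": fee_cents,
            "operator_payout_cents": fee_cents * self.operator_share_pct // 100,
        }
        self.entries.append(entry)
        return entry

    def operator_earnings_cents(self, ap_id: str) -> int:
        # Total revenue share owed to this access point's operator.
        return sum(e["operator_payout_cents"] for e in self.entries
                   if e["access_point"] == ap_id)


ledger = SharedWifiLedger()
ledger.connect("alice", "ap-shibuya-01", fee_cents=100)
ledger.connect("bob", "ap-shibuya-01", fee_cents=200)
print(ledger.operator_earnings_cents("ap-shibuya-01"))  # prints 90
```

Integer cents sidestep floating-point rounding in the payout math; the "read for eternity" property would come from the blockchain, which this sketch only gestures at.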

Thursday, April 27, 2023

IVF babies conceived by robot

 

The First IVF Babies Conceived By a Robot Have Been Born (technologyreview.com)

An anonymous reader quotes a report from MIT Technology Review: Last spring, engineers in Barcelona packed up the sperm-injecting robot they'd designed and sent it by DHL to New York City. They followed it to a clinic there, called New Hope Fertility Center, where they put the instrument back together, assembling a microscope, a mechanized needle, a tiny petri dish, and a laptop. Then one of the engineers, with no real experience in fertility medicine, used a Sony PlayStation 5 controller to position a robotic needle. Eyeing a human egg through a camera, it then moved forward on its own, penetrating the egg and dropping off a single sperm cell. Altogether, the robot was used to fertilize more than a dozen eggs. The results of the procedures, say the researchers, were healthy embryos—and now two baby girls, who they claim are the first people born after fertilization by a "robot."

The startup company that developed the robot, Overture Life, says its device is an initial step toward automating in vitro fertilization, or IVF, and potentially making the procedure less expensive and far more common than it is today. Right now, IVF labs are multimillion-dollar affairs staffed by trained embryologists who earn upwards of $125,000 a year to delicately handle sperm and eggs using ultra-thin hollow needles under a microscope. But some startups say the entire process could be carried out automatically, or nearly so. Overture, for instance, has filed a patent application describing a "biochip" for an IVF lab in miniature, complete with hidden reservoirs containing growth fluids, and tiny channels for sperm to wiggle through.

"Think of a box where sperm and eggs go in, and an embryo comes out five days later," says Santiago Munne, the prize-winning geneticist who is chief innovation officer at the Spanish company. He believes that if IVF could be carried out inside a desktop instrument, patients might never need to visit a specialized clinic, where a single attempt at getting pregnant can cost $20,000 in the US. Instead, he says, a patient's eggs might be fed directly into an automated fertility system at a gynecologist's office. "It has to be cheaper. And if any doctor could do it, it would be," says Munne.

Monday, April 3, 2023

new weight loss drug - Mounjaro - lose 50 lbs. in 17 mo.

 


Mounjaro helped a typical person with obesity who weighed 230 pounds lose up to 50 pounds during a test period of nearly 17 months.

https://www.wsj.com/articles/ozempic-mounjaro-weight-loss-drug-wegovy-eli-lilly-66f2906



Sunday, April 2, 2023

The real power of Open AI


The way I am thinking about these open-source AI tools, it’s the new human-computer interface - you plug it into anything and tell it what to do. Like a personal assistant. 

Suddenly, we are empowered to use more things that we don’t really understand but at a higher-stakes level.

Who really understands their microwave, their zipper, or even the technology of tying a necktie, let alone the kinds of things we might like to control, like a car, a drone, or a robot?

Or monkeys sticking a blade of grass into an ant hill for food for that matter, that's beyond most of us, but it's brilliant. That's technology.

It's the power of ABRACADABRA - "As I speak, I create."

God said, "Let there be light, and God saw that the light was good..."

Carlo and Simon said, "Let there be the Large Hadron Collider with sprinkles on top," and God said, "What the fuck, Carlo and Simon?"

Combining any technology with AI tools like ChatGPT gives us the power to speak to create, the ultimate intuitive interface.

Or, even faster, as I think, I create. As I imagine, I create. But we need the feedback to visualize and consciously digest before we just "Let there be...x" ourselves into oblivion. This virtual sandbox is referred to as a digital twin.

NOTE: According to Wiki, Abracadabra meaning "As I speak, I create" is not a supported derivation, but for the sake of making my point, I'll use it.

Certainly, there are specific use cases where simple gestures are more direct and/or intuitive. One could argue that gestures reflect our unconscious states and are therefore more powerful and immediate, even predictive. So, can we improve upon voice commands? No doubt.

One of the most significant leaps, I think, will be connecting AI with computer vision, having cameras all around, and of course, there will be other sensors. We're still pushing haptics; maybe we'll be able to see and feel beyond our own bodies, or machines will be able to feel through us - tactile and emotional sensation.

Once AI has visual language and vocabulary, now we are moving towards the way humans really think, and I’m wondering when we will begin creating new forms of communication through images. Combined with predictive analytics, we approach enhanced imagination. What exactly is imagination? It's the processing of unconscious understanding to go beyond conscious awareness and then bringing it back to conscious understanding for some realization. This is perhaps our own natural gateway to go beyond what we know individually, to sense, connect and receive our universe beyond ourselves. Sounds like horseshit, but seriously, folks.

Then, through these tools, we break our limits of perception including time and space, forming senses beyond human perception. It's not a far leap - x-ray, infrared and beyond, and that's only the electromagnetic spectrum. The other energy is gravitational waves, and both light and gravity transcend physical time and space. 

We know there are senses beyond human perception, and I believe these are the places we can evolve towards. It may sound esoteric, but the most important questions are how we are evolving and where we are going with all this. Zoom out far enough to recognize we're about to go through a major shift in consciousness, and we will anticipate and reflect our experience, in the same way that the industrial revolution and the invention of the camera led Monet to explore impressionism and the nature of light, and Picasso to model multiple perceptions of time and space.

It's hard to ignore that at this time in human evolution, as AI is really making its debut into mainstream consciousness, we are also seeing an acceleration in people using psychedelics to go into other realms.

While the so-called psychonauts can compare notes that there's something beyond, it's still only existential. Yeah, dude, mechanical elves! Alrighty, then. Anyway... 

But, AI may give us a way to quantify and map those experiences and build new tools that fit our natural inputs to expand our perception and extend ourselves in the same way without tripping balls. New lenses for seeing beyond. And we'll need new vocabulary to navigate it. 

Maybe Elon's neural lace will give us a way to transmit thoughts directly - telepathy to each other, to our tools, maybe these AI tools will begin sourcing our minds instead of the internet from two years ago. Along with computer vision and whatever other sensors we're churning out. Innerspace, outerspace, hyperspace, cyberspace. What will the web become? 

Be sure to check out Openwater and Mary Lou Jepsen's "Internet of Visible Thought".

Suddenly dog whistles cause our heads to swivel and we're even more interested in sniffing each other's assholes than we already do. Now that's human evolution. 

Right, God?

Right, Carlo and Simon. Just remember, those sprinkles go right through you.

When 'As I speak I create' shifts over to 'As I think I create', we're gonna need a sandbox for simulations so we don't Rube Goldberg our way to oblivion.

With real-time feedback systems, chemical monitoring and CRISPR who knows, maybe we'll accelerate to modify and mutate ourselves. All of these power tools will rapidly connect and converge faster than we can imagine and soon enough we'll be the putty to AI's play-doh fun factory.

As for machines learning to feel emotions - despite what we might like to think about ourselves, our emotions and personalities are largely the product of chemistry. Of course, there's reason and the whole nature vs. nurture debate, but emotional states can be monitored and quantified through chemistry and MRIs. Sharing the visceral reaction may be another matter, and the mere thought of machines being aware of our emotions doesn't necessarily qualify as empathy, either. To observe it isn't the same as feeling it. Machines might need mirror neurons for that. Cameras? Now I'm back to my point about visual language and imagination.

More of my bullshit along these same lines here and here.