Monday, December 12, 2022

Future BCI communication forms, visual search engines


Update: Read my use case for visual search

Here's a list of existing visual search engines:

https://learn.g2.com/visual-search

https://lens.google/#cta-section

https://www.bing.com/visualsearch

And here is the image that first inspired me to think of the idea of a visual search engine: I saw it on a bumper sticker around 2000, and now, some 20-23 years later, Google Lens can identify it.


Search my blog for 'visual search engine' to find older entries, along with my wish for real-time translation and AR, currently in the works via Apple Glass (release expected 2025) and Google Glass.

Thinking about how visual search and AI may progress leads me to another idea: a new visual language form, whereby we communicate more through images, abstract sounds, and the like, rather than verbal language. We may begin to think more this way and ultimately develop new formal language forms.

Bear with me; this gets a little more out there.

By the time AI can generate moving images and sounds by interpreting and relaying a person's imagination, society will be communicating through audio-visual language forms. I'm thinking in terms of a direct relay of memory or visualization.

Combined with AR, this may make the actual physical world difficult to discern from illusion (an obvious and certainly intentional effect). Move forward to BCIs, though, and we have something more mind-to-mind.

Those who can express their thoughts through imagination in real time will be the 'telepaths', and, through predictive analytics, the 'clairvoyants'.