In The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity, Amy Webb (2019) writes, “Building AI means predicting the values of the future” (p. 127). Our values shift and change over time. Webb asks, “How do we teach machines to reflect our values without influencing them?” (Webb, A., 2019)[20] The question feels urgent because we already seem to be influenced by machines. In an Instagram video, a teacher said she knew she was teaching a new generation of kids when one of them accidentally called her Alexa, Amazon’s AI assistant.
AI helps us see beyond natural human vision. Doctors can use AR glasses to see straight through to your organs. Police can see behind walls to catch the bad guys. And kids can see through the lawn to the earthy ecosystem beneath, watching worms inch away from moles and tree roots plunge deeper into the dirt.
Of course, all this power comes at a price. How privacy is managed in a see-through world is something to consider, as is how artificial intelligence should handle the data it is not supposed to see. What will AI learn from seeing into our bodies or through walls? Does the AI run locally on our devices, or does it connect to a larger network?
Seeing through walls is already in our sights. The MIT Media Lab created X-AR, an add-on to Microsoft’s HoloLens AR headset. The technology lets you see objects behind walls and inside boxes. It works “by combining new antenna designs, wireless signal processing algorithms, and AI-based fusion of different sensors.” [21] The antenna on the headset lets the HoloLens locate objects out of sight. Today, the object needs to have an RFID tag and be within 15 feet of the headset for the wearer to see it. As the world becomes more connected, with RFID tags and NFC chips embedded in our clothes, credit cards, and other products, superhuman seeing will become possible.
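To make the idea concrete, here is a minimal sketch in Python of the kind of fusion X-AR describes: a hypothetical RF-localization step produces a position estimate for a tagged object, and that estimate is combined with the headset’s pose so a marker can be drawn over something the wearer cannot see. The names, numbers, and math are illustrative assumptions, not MIT’s actual implementation.

```python
# A toy sketch of "seeing" a hidden, RFID-tagged object: an RF stage
# estimates where the tag is, and that estimate is fused with the
# headset's pose so the display knows where to render a marker.
import math
from dataclasses import dataclass

MAX_RANGE_FT = 15.0  # X-AR reportedly works within about 15 feet of a tag


@dataclass
class Pose:
    """Headset position (feet) and heading (radians) in a flat world frame."""
    x: float
    y: float
    heading: float


def locate_tag_rf(readings: list[tuple[float, float, float]]) -> tuple[float, float]:
    """Toy stand-in for RF localization: average several (x, y, weight)
    position hypotheses produced by the antenna/signal-processing stage."""
    wsum = sum(w for _, _, w in readings)
    x = sum(x * w for x, _, w in readings) / wsum
    y = sum(y * w for _, y, w in readings) / wsum
    return x, y


def overlay_marker(headset: Pose, tag_xy: tuple[float, float]):
    """Convert the tag's world position into a distance and bearing
    relative to the wearer, so a marker can be drawn over the hidden object."""
    dx, dy = tag_xy[0] - headset.x, tag_xy[1] - headset.y
    distance = math.hypot(dx, dy)
    if distance > MAX_RANGE_FT:
        return None  # too far away for the antenna to resolve
    bearing = math.atan2(dy, dx) - headset.heading
    return {"distance_ft": round(distance, 1), "bearing_deg": round(math.degrees(bearing), 1)}


if __name__ == "__main__":
    headset = Pose(x=0.0, y=0.0, heading=0.0)
    # Three noisy position hypotheses for a tagged box behind a wall.
    readings = [(9.8, 2.1, 0.5), (10.3, 1.7, 0.3), (10.0, 2.0, 0.2)]
    print(overlay_marker(headset, locate_tag_rf(readings)))
```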
“Imagine Alexa for your eyes,” said Mitchell Feldman, co-founder of XRAI. (Gilmore, G., 2023)[22] AI-powered AR glasses let deaf people read conversations. In the future, neural interfaces and brain-computer interfaces might let our brains “hear” what our ears cannot. But today, AR glasses can transcribe conversations in real time into subtitles that can be stored and rewound. XRAI is one company working on this. Feldman sees XRAI being used by anyone: ask it to remind you what your partner wanted from the grocery store, to recap what your professor covered in class that day, or to replay what the doctor said about taking your medication.
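Here is a minimal sketch, in Python, of the “stored and rewound” part of that idea: transcribed segments are kept with timestamps so the wearer can scroll back through a conversation or search it later. The TranscriptBuffer class and its methods are hypothetical illustrations, not XRAI’s actual software.

```python
# A toy rewindable transcript: each speech-to-text segment is stored with
# a timestamp so earlier moments can be replayed or searched.
from dataclasses import dataclass


@dataclass
class Caption:
    start_s: float   # seconds since the conversation began
    speaker: str
    text: str


class TranscriptBuffer:
    def __init__(self):
        self._captions: list[Caption] = []

    def append(self, caption: Caption) -> None:
        """Called each time the speech-to-text engine emits a segment."""
        self._captions.append(caption)

    def rewind(self, seconds: float) -> list[Caption]:
        """Return everything said in the last `seconds` seconds."""
        if not self._captions:
            return []
        cutoff = self._captions[-1].start_s - seconds
        return [c for c in self._captions if c.start_s >= cutoff]

    def search(self, phrase: str) -> list[Caption]:
        """Find earlier moments, e.g. the grocery request or the doctor's advice."""
        phrase = phrase.lower()
        return [c for c in self._captions if phrase in c.text.lower()]


if __name__ == "__main__":
    buf = TranscriptBuffer()
    buf.append(Caption(12.0, "partner", "Can you pick up oat milk on the way home?"))
    buf.append(Caption(95.0, "professor", "The quiz covers chapters three and four."))
    print([c.text for c in buf.search("pick up")])
```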
AI-powered AR glasses will open up the world, as language will no longer be a barrier to travel. Glasses can translate in real time what people are saying. Microphones embedded in the frames will pick up others’ words and translate them into your native language. You will be able to read the packaging yourself or have your glasses read it for you. That would have been helpful during my summer internship in Spain, where I bought a bottle of lotion thinking it was shampoo. Or in Korea, where instead of miming my way from the commuter train to the bicycle rentals, my AR glasses could have directed me where to go. Thankfully, the Korean woman I asked was savvy enough to accept a “gamsahamnida” thank-you.
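As a rough illustration of the pipeline such glasses would run (hear speech, detect its language, translate it into the wearer’s language, and show a subtitle), here is a minimal Python sketch. The tiny phrasebook stands in for real speech-recognition and translation models; every name here is an assumption made for illustration, not any product’s actual API.

```python
# A toy translation pipeline: translate detected foreign-language speech
# into the wearer's native language and format it as a subtitle.
NATIVE_LANGUAGE = "en"

# Hypothetical stand-in for a real translation model.
PHRASEBOOK = {
    ("es", "en"): {"champú": "shampoo", "loción": "lotion"},
    ("ko", "en"): {"감사합니다": "thank you"},
}


def translate(text: str, source_lang: str, target_lang: str = NATIVE_LANGUAGE) -> str:
    """Translate word by word using the phrasebook; pass unknown words through."""
    table = PHRASEBOOK.get((source_lang, target_lang), {})
    return " ".join(table.get(word, word) for word in text.split())


def subtitle(heard_text: str, detected_lang: str) -> str:
    """What the display layer would render beneath the speaker."""
    if detected_lang == NATIVE_LANGUAGE:
        return heard_text
    return f"[{detected_lang} to {NATIVE_LANGUAGE}] {translate(heard_text, detected_lang)}"


if __name__ == "__main__":
    print(subtitle("loción", "es"))     # the bottle I mistook for shampoo
    print(subtitle("감사합니다", "ko"))   # the thank-you that saved me in Korea
```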
AI-powered smart glasses embedded in ski goggles can train skiers and snowboarders. These glasses can project a path down a ski hill, one tuned for a confident skier or one for a newbie. Instead of following a five-year-old and her mom down a slope at Crystal Mountain, AI glasses could have shown me the way.
We’ll have to wait and see whether AI becomes our friend, helps us explore the world, or helps us tackle challenges outside our comfort zone. When we asked ChatGPT how it thinks artificial intelligence will impact AR glasses, it responded:
“Artificial intelligence has the potential to greatly impact AR glasses by enhancing their capabilities and user experience. AI can be leveraged to improve object recognition, real-time tracking, and contextual understanding, enabling AR glasses to provide more accurate and relevant information and immersive augmented experiences. Additionally, AI can aid in personalization, adaptive content generation, and natural language processing, making interactions with AR glasses more intuitive and tailored to individual users.”
Snapchat’s My AI answered the same question: “Artificial intelligence could enhance AR glasses by making them more intuitive and personalized.”
It seems like AI is not even sure of its own impact on AR glasses and the future.
As a side note, we recognize that Generation Alpha, whose oldest members are about 12 years old and whose youngest are still being born, will not know a world without AI. They have probably grown up speaking to Alexa, Siri, or Google. We also recognize that more advanced AI voice assistants can have both positive and negative effects on children, and that there is controversy around how AI is trained and what advice it gives young children, as in the case Tristan Harris and Aza Raskin shared in their prescient A.I. Dilemma presentation in DC earlier in 2023. [23]