"So tell me about yourself." To some, it's a dreaded phrase that ruins blind dates or job interviews right from the start. Don't you ever wish you had a little AI assistant in your ear feeding you the right lines to say in any social setting? One student from Stanford has designed just that: a way to use ChatGPT and an AR monocle to act as your own Cortana, helping out in your day-to-day like you're Master Chief.
Bryan Hau-Ping Chiang (spotted by Tom's Hardware) created a prototype digital assistant he calls an "operating system for your entire life." Speech recognition software listens in on your conversation, feeds it to ChatGPT, and spits out a response that appears on the lens of an open-source AR monocle that clips onto your glasses. It can even recognize the faces of the people you're talking to.
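That transcript-in, suggestion-out loop is simple enough to sketch. The snippet below is a minimal illustration of the idea, not Chiang's actual code: the function names, prompt wording, and 25-character display width are all assumptions, and the chat model is stubbed out so the sketch stays self-contained (a real version would send the prompt to ChatGPT's API).

```python
# Minimal sketch of a lifeOS/rizzGPT-style pipeline: live transcript in,
# suggested reply out, formatted for a tiny monocle display.
# All names and the display width are illustrative assumptions.
import textwrap

DISPLAY_WIDTH = 25  # assumed character width of the monocle's lens


def build_prompt(transcript: str, context: str = "") -> str:
    """Wrap the live transcript (plus any known facts about the speaker,
    e.g. pulled from your texts with them) into a chat-model prompt."""
    return (
        "You are a discreet conversational assistant.\n"
        f"Known context about the speaker: {context or 'none'}\n"
        f'They just said: "{transcript}"\n'
        "Suggest a short, natural reply."
    )


def render_for_monocle(reply: str) -> list[str]:
    """Break the model's reply into short lines for the AR display."""
    return textwrap.wrap(reply, width=DISPLAY_WIDTH)


def fake_chat_model(prompt: str) -> str:
    # Stand-in for the real ChatGPT API call, so this runs offline.
    return "Great to meet you! How was your trip to Lisbon?"


prompt = build_prompt(
    "So tell me about yourself.",
    context="recently vacationed in Lisbon",
)
lines = render_for_monocle(fake_chat_model(prompt))
for line in lines:
    print(line)
```

The real prototype's awkward pauses come from the round trip hidden behind `fake_chat_model` here: speech-to-text plus a network call to the language model, both of which add latency before anything reaches the lens.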
Bryan has published a series of tweets illustrating the prototype's capabilities. The tech has been going by a couple of playful names, such as lifeOS or rizzGPT, but it all does roughly the same thing. One instance has it scan your friends' faces and then "bring up relevant details to talk about based on your texts with them." It's all presented somewhat unseriously, but it's not hard to imagine this sort of tech actually being used someday in an AI assistant that can scan a stranger's face, identify them, and pull up facts and talking points based on their social media posts.
That'd give you instant icebreakers, assuming you can get past the whole 'scanning people's faces without their consent' thing, don't mind interacting with other people through a language model proxy, and can live with the latency so you're not standing there staring at them, waiting for your monocle to spit out relevant information.
"meet lifeOS: an operating system for your entire life 🌐 a personal AI agent delivered directly through AR smart glasses 👓 it uses computer vision to 👁️recognize👁️ your friends' faces, then brings up relevant details to talk about based on your texts with them (memory🤯)" pic.twitter.com/BgCuODV4k2 — April 10, 2023
Another use sees the AI assistant helping choose a dish at a restaurant by analyzing the menu and quickly making a recommendation based on the user's historical taste preferences. However, the best example I've seen of this tech in action is during a mock job interview, where it responds to the interviewer's questions with appropriate (although not necessarily true) answers.
As it is, there's a pretty long delay in getting responses, causing awkward silences, and the text on the screen seems a little tricky to read. This prototype monocle isn't what you'd call discreet, either. Apple's planned AR glasses might provide something more fashionable (although Google Glass notably did not take off, and also creeped people out), but if you can pull off a monocle on your sunglasses, more power to you.
It reminds me of that scene in Spider-Man: Far From Home where Peter Parker gets a pair of AI-powered AR glasses. But instead of spying on your friends and controlling lethal drones, you can be reminded that a colleague recently went on vacation, which is much less exciting. More practical, I guess.
Chiang is asking the community for other situations where they can test out the tech. One suggestion I have: give me the best responses for getting out of a speeding ticket.