Mark Zuckerberg says Meta wearables that read brain signals are coming soon

The new neural technology that Meta is developing will be “pretty wild,” said Zuckerberg, adding that its first application will be for AR glasses.

By Martin Young

Meta CEO Mark Zuckerberg has hinted that his firm is making progress on its first “consumer neural interfaces,” non-invasive wearable devices that can interpret brain signals to control computers.

“One of the things that I’m pretty excited about — I think we’ll start getting some consumer neural interfaces soon. I think that’s going to be pretty wild.”
However, Zuckerberg explained that, unlike Elon Musk’s Neuralink brain chip, these devices wouldn’t be something that “jacks into your brain” but rather something worn on the wrist that can “read neural signals that your brain sends through your nerves to your hand to basically move it in different subtle ways.”
Meta first began discussing the development of “wrist-based interaction” in March 2021 as part of Facebook Reality Labs Research.
Meta’s wristband uses electromyography (EMG) to pick up the signals the brain sends through the motor nerves to the hand, decode the wearer’s intended gestures, and translate them into commands for controlling devices.
“We’re basically able to read those signals and use them to control glasses or other computing devices,” he added.
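Meta has not published the wristband’s software stack, but the pipeline described above — electrodes at the wrist pick up motor-nerve activity, short windows of signal are decoded into gestures, and gestures become device commands — can be illustrated with a minimal sketch. Everything here, from the electrode count and sample rate to the threshold-based classifier, is an assumption for illustration only, not Meta’s implementation.

```python
# Illustrative sketch of a generic EMG gesture-decoding pipeline.
# Device names, rates, and the classifier are assumptions; Meta's
# actual wristband software is proprietary.
import numpy as np

SAMPLE_RATE_HZ = 2000          # surface-EMG rigs commonly sample in the kHz range
WINDOW_MS = 100                # decode over short windows to keep latency low
WINDOW_SAMPLES = SAMPLE_RATE_HZ * WINDOW_MS // 1000
NUM_CHANNELS = 16              # hypothetical 16-electrode band

def extract_features(window: np.ndarray) -> np.ndarray:
    """Classic time-domain EMG features, computed per electrode channel."""
    rms = np.sqrt(np.mean(window ** 2, axis=0))             # signal energy
    mav = np.mean(np.abs(window), axis=0)                   # mean absolute value
    zero_crossings = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([rms, mav, zero_crossings])

def classify_gesture(features: np.ndarray) -> str:
    """Stand-in for a trained decoder; a real system would run a learned model.
    We threshold the RMS energy just to make the control flow concrete."""
    return "pinch" if features[:NUM_CHANNELS].mean() > 0.5 else "rest"

# Simulated 100 ms window of low-amplitude noise standing in for real EMG data
window = np.random.randn(WINDOW_SAMPLES, NUM_CHANNELS) * 0.1
gesture = classify_gesture(extract_features(window))
if gesture == "pinch":
    print("dispatch: select item on AR glasses")  # translate gesture -> command
```

Time-domain features such as RMS and mean absolute value are standard in EMG research because they are cheap enough to compute within the tight latency budget a live input device needs.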
Zuckerberg’s most recent comments came during an April 18 interview between the Facebook co-founder and tech entrepreneur and YouTuber Roberto Nickson.

“We’re still at the beginning of the journey because we haven’t rolled out the first version of the product, but playing with it internally it’s … it’s really cool … really interesting to see.”
Earlier this year, the Meta CEO said that this neural wristband could become a consumer product in just a few years, using artificial intelligence to overcome the limitations of camera-based gesture tracking.

Mark Zuckerberg on consumer neural interfaces. Source: YouTube
He has also envisioned the neural interfaces working in tandem with Meta’s Ray-Ban augmented reality smart glasses.
Commenting on the firm’s smart glasses, he said the “hero feature” was integrating AI into them. “We’re really close to having multi-modal AI [...] so you don’t just ask it a question with text or voice; you can ask it about things going on around you, and it can see what’s going on and answer questions [...] that’s pretty wild,” he added.
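Meta has not published the assistant API that will run on the glasses, so as a rough stand-in, the “see what’s going on and answer questions” loop can be sketched with an open vision-language model (here LLaVA, via Hugging Face transformers). The model choice, prompt format, and camera source below are all assumptions, not Meta’s stack.

```python
# Stand-in sketch of a multimodal "look and answer" query using the open
# LLaVA model. In the glasses scenario the frame would come from the
# onboard camera; here we fetch a sample image instead.
import requests
from PIL import Image
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "llava-hf/llava-1.5-7b-hf"
processor = AutoProcessor.from_pretrained(model_id)
model = LlavaForConditionalGeneration.from_pretrained(model_id, device_map="auto")

image = Image.open(requests.get(
    "https://raw.githubusercontent.com/pytorch/hub/master/images/dog.jpg",
    stream=True).raw)
prompt = "USER: <image>\nWhat is going on in front of me? ASSISTANT:"

inputs = processor(text=prompt, images=image, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=50)
print(processor.decode(output[0], skip_special_tokens=True))
```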
Related: Meta’s AI boss says LLMs not enough: ‘Human level AI is not just around the corner’
Meanwhile, lawmakers in the United States are already working on legislation aimed at protecting privacy in the nascent field of neurotech.
The Protect Privacy of Biological Data Act, which expands the definition of “sensitive data” to encompass biological and neural data, was passed in Colorado this week, according to reports.
In other news, Meta has just released a new version of Meta AI, the assistant that operates across the firm’s applications and glasses. “Our goal is to build the world’s leading AI,” Zuckerberg said.
Meta AI is being upgraded with the new “state-of-the-art Llama 3 AI model, which we’re open-sourcing,” he added.
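Because the Llama 3 weights are openly released, anyone can load them locally. Below is a minimal sketch using Hugging Face transformers; it assumes you have accepted Meta’s license on the model’s Hub page and have hardware that fits the 8-billion-parameter instruct variant.

```python
# Minimal sketch of running the openly released Llama 3 weights locally
# with Hugging Face transformers. Requires accepting Meta's license on
# the Hub and a GPU with enough memory for the 8B model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Summarize EMG in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=64)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```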