
Revolutionizing AR/VR: Northeastern’s Breakthrough Could Shrink Headsets to Sleek Smart Glasses


Northeastern researchers have developed an AI tool that helps improve the performance of AR/VR applications. Illustration by Renee Zhang/Northeastern University

Imagine slipping on a stylish pair of glasses that transport you into immersive augmented or virtual reality (AR/VR) without the clunky headsets we’re stuck with today. Thanks to groundbreaking research at Northeastern University, that sci-fi dream is inching closer to reality. Led by Francesco Restuccia, a professor of electrical and computer engineering, a team of innovators is tackling the biggest hurdle in AR/VR tech: making it sleek, wireless, and as comfortable as your favorite shades.


Northeastern electrical and computer engineering professor Francesco Restuccia will present this research at the IEEE International Conference on Computer Communications. Photo by Alyssa Stone/Northeastern University

Current AR/VR headsets, like Apple’s $3,500 Vision Pro or Meta’s $300 Quest 3S, are packed with incredible features but come with a catch—they’re bulky, heavy, and let’s be honest, not exactly runway-ready. Companies like Meta have teamed up with Ray-Ban to experiment with smarter glasses, but even those fall short of the power packed into today’s headsets. The problem? AR/VR demands massive data processing, which means big batteries and messy cables that scream “prototype” rather than “future.”

Enter Restuccia’s team, including doctoral students Foysal Haque and Mohammad Abdi, who’ve developed a game-changing solution. Their new AI technology, based on deep neural networks, operates at the wireless physical layer. That means the heavy lifting of AR/VR processing can happen on a nearby computer instead of on your face, slashing the need for oversized batteries and tangled wires. Their approach not only speeds up data transfer but also uses far less bandwidth, making AR/VR smoother and more efficient.
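
For readers curious what this kind of offloading looks like in practice, here is a minimal, purely illustrative sketch of the general “split computing” idea: a small front portion of a deep neural network runs on the headset, its compact intermediate output is compressed and sent over the wireless link, and a nearby computer runs the rest. The PyTorch layers, sizes, and 8-bit quantization step below are assumptions chosen for illustration only; they are not the team’s PhyDNNs design, which goes further by bringing the neural network processing into the wireless physical layer itself.

    # Illustrative "split computing" sketch (assumed example, not PhyDNNs).
    import torch
    import torch.nn as nn

    # Hypothetical "head" network that would run on the glasses/headset.
    head = nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
        nn.ReLU(),
        nn.Conv2d(16, 8, kernel_size=3, stride=2, padding=1),  # shrinks what must be sent
    )

    # Hypothetical "tail" network that would run on the nearby edge computer.
    tail = nn.Sequential(
        nn.Flatten(),
        nn.Linear(8 * 56 * 56, 10),  # e.g. 10 output classes
    )

    frame = torch.randn(1, 3, 224, 224)      # one camera frame from the headset
    features = head(frame)                    # on-device: 8x56x56 features instead of 3x224x224 pixels
    payload = torch.quantize_per_tensor(      # crude 8-bit quantization before transmitting
        features, scale=0.1, zero_point=0, dtype=torch.quint8)

    # The payload is what crosses the wireless link; the edge server
    # dequantizes it and finishes the computation.
    prediction = tail(payload.dequantize())
    print(frame.numel(), "input values vs", features.numel(), "transmitted values")

Even in this toy example, the intermediate features sent over the air contain six times fewer values than the raw camera frame, before quantization shrinks them further. That kind of bandwidth saving is what makes lighter, cable-free headsets plausible.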

“This isn’t just a small step—it’s laying the groundwork for a future where AR/VR is as seamless and stylish as slipping on a pair of Ray-Bans,” Restuccia says. “We’re not there yet, but this is the kind of foundational work that will get us there.”

Their research, detailed in the paper PhyDNNs: Bringing Deep Neural Networks to the Physical Layer, is set to make waves at the IEEE International Conference on Computer Communications (INFOCOM) in London this May. Backed by the National Science Foundation, the Office of Naval Research, and the Air Force Office of Scientific Research, this innovation promises to redefine edge computing for AR/VR and beyond.


