For the last two years, Meta has been building a supercomputer dubbed the AI Research SuperCluster (RSC), which it says will play an important role in bringing the metaverse to life. RSC is already operational in a limited capacity, and Meta claims that once completed, it’ll be the fastest supercomputer of its kind. And experts can’t wait to harness it to build the metaverse.

“Companies like ours, who envision making experiences in the metaverse more enriching, emotional, and meaningful, may be able to leverage a platform like this to amplify our impact,” Aaron Wisniewski, CEO and co-founder of OVR Technology, told Lifewire in an email interview.

One For All

Wisniewski, whose company works to enhance virtual reality (VR) by adding the sense of smell, opined that smell deeply affects how we think, feel, and behave. “Imagine 7 billion people all defining and sharing their ideal sensory worlds in real-time?” He said the “shared reality” of the metaverse would create incredible possibilities for connection, empathy, learning, and improved wellbeing, something he stressed “has never been possible in history.”

Meta hopes to use RSC to help create this new experience. In its blog post, the company said it is working with NVIDIA to build the supercomputer. Currently, RSC uses 760 NVIDIA DGX A100 systems as its compute nodes, for a total of 6,080 NVIDIA A100 GPUs. Priced at over $10,000, the A100 uses the same Ampere architecture that powers the consumer RTX 3000-series graphics cards, but it is optimized for machine learning (ML) rather than gaming. Meta added that it plans to increase the number of GPUs to 16,000, which it suggests will boost the supercomputer’s AI training performance by more than 2.5 times.

“Computing power of this magnitude could accelerate the arrival of the immersive, expansive, and inclusive metaverse we are all eagerly anticipating,” shared Wisniewski.
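
For a rough sense of scale, Meta’s published figures can be checked with simple arithmetic. The per-system GPU count below is NVIDIA’s standard DGX A100 configuration; the script is only an illustrative sanity check, not Meta’s own methodology:

```python
# Sanity check of the GPU figures Meta has published for RSC.
GPUS_PER_DGX_A100 = 8  # each NVIDIA DGX A100 system packs eight A100 GPUs

current_systems = 760
current_gpus = current_systems * GPUS_PER_DGX_A100
print(current_gpus)  # 6080, matching Meta's stated total

planned_gpus = 16_000
print(f"{planned_gpus / current_gpus:.2f}x")  # ~2.63x more GPUs, in line with
                                              # the ">2.5x" training-speed projection
```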

Eye for an AI

While introducing the RSC, Mark Zuckerberg wrote, “the experiences we’re building for the metaverse require enormous compute power (quintillions of operations/second!), and RSC will enable new AI models that can learn from trillions of examples, understand hundreds of languages, and more.”

Meta researchers already use RSC to train large-scale natural language processing (NLP) and computer vision models. Early benchmarks show that RSC runs NLP models three times faster and computer vision workflows 20 times faster than Meta’s previous research infrastructure, paving the way for training more advanced models. Models trained on RSC could improve the tracking quality of VR devices and perhaps enable new computer vision features in the highly anticipated Project Cambria headset. Ultimately, the machine will be used to train enormous new speech recognition, language processing, and computer vision models that will serve as the foundation for the company’s next generation of AI-driven services.
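
Zuckerberg’s “quintillions of operations/second” figure is consistent with a rough estimate built from public specifications. The per-GPU number below is NVIDIA’s published peak dense FP16/BF16 Tensor Core throughput for the A100; the aggregation is an illustrative back-of-the-envelope sketch, not a figure Meta has benchmarked:

```python
# Rough estimate of aggregate throughput for the planned 16,000-GPU build-out.
# NVIDIA's spec sheet lists the A100's peak dense FP16/BF16 Tensor Core
# throughput at about 312 TFLOPS (3.12e14 operations per second).
A100_PEAK_FP16_OPS_PER_SEC = 312e12

planned_gpus = 16_000
aggregate_ops_per_sec = planned_gpus * A100_PEAK_FP16_OPS_PER_SEC
print(f"{aggregate_ops_per_sec:.2e} ops/sec")  # ~5e18, i.e. roughly five
                                               # quintillion operations per second
```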

Artificial Reality

Abhishek Choudhary, the founder of AI-enabled edutech platform AyeAI, told Lifewire via LinkedIn that RSC will enable more equitable access to cognitive computing facilities for the general population. “Once [the RSC] is completed, we can expect to have better and faster diagnosis from medical images, better communication across diverse linguistic settings, and the much-awaited enhancements to the metaverse.”

Meta says RSC was built with privacy and security as prime focus areas. Before data is imported to RSC, it goes through a privacy review process to confirm it has been correctly anonymized. It is then encrypted before it is used to train AI models (a rough illustration of that flow appears at the end of this article).

Sharing its vision for RSC, Meta said it hopes the supercomputer will help it build entirely new AI systems. For instance, it wants RSC to facilitate real-time voice translation for large groups of people, each speaking a different language, so they can seamlessly collaborate on a research project or just play a game together.

Choudhary is excited at the prospect. “As a technologist, I can barely wait for the metaverse experience where I can enjoy live music and theater shows in languages I do not understand and be able to talk in real-time with people with whom I share no common language.”
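
Meta hasn’t published implementation details for the privacy pipeline mentioned above, but the flow it describes, an anonymization review followed by encryption before training, could look roughly like this minimal sketch. The field names, the review check, and the use of Fernet encryption are illustrative assumptions, not Meta’s actual code:

```python
# Hypothetical sketch of the ingest flow Meta describes: records are reviewed
# for correct anonymization, then encrypted, before reaching model training.
import json
from cryptography.fernet import Fernet  # symmetric encryption (assumed choice)

PII_FIELDS = {"name", "email", "phone"}  # assumed identifiers that must be absent

def passes_privacy_review(record: dict) -> bool:
    """Reject any record that still carries obviously identifying fields."""
    return not (PII_FIELDS & record.keys())

def ingest_for_training(record: dict, key: bytes) -> bytes:
    """Encrypt an anonymized record so only the training pipeline can read it."""
    if not passes_privacy_review(record):
        raise ValueError("record failed anonymization review")
    return Fernet(key).encrypt(json.dumps(record).encode())

key = Fernet.generate_key()
token = ingest_for_training({"user_id": "a1b2c3", "utterance": "hello"}, key)
```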