01:00 PM - 01:55 PM
At last year’s AWE, leaders of key standards development organizations (SDOs) and industry consortia, including the Khronos Group, W3C, OpenAR Cloud, and ASWF, discussed the need for more cross-organization collaboration to pave the way for an open, inclusive metaverse. It wasn’t just talk: these organizations have now come together, along with hundreds of other SDOs, companies, and academic institutions, to form the Metaverse Standards Forum, a venue for cooperation among metaverse stakeholders to foster the development of interoperability standards. The Forum has begun engaging in pragmatic, short-term projects such as building a standards registry, coordinating use cases and requirements as input to SDOs, and creating interoperability prototypes, with more to come.
In this panel, leaders of Metaverse Standards Forum member organizations will discuss the interoperability challenges the group is prioritizing today, and how the forum is working to accelerate solutions. Attendees will gain insight into the nuances of topics including 3D asset interoperability; digital fashion; real/virtual world integration; and privacy, cybersecurity, and identity. The panel will explore current possibilities, challenges, and limitations within these domains, and discuss how attendees can contribute to the formation of an open metaverse.
02:00 PM - 02:25 PM
"AWS Spatial: Building Blocks of the Metaverse" is a session for developers and tech enthusiasts interested in the potential of XR to revolutionize various industries. It is focused on helping those who have moved past the pilot or prototype phase and are now looking to scale and adopt XR in a meaningful way. In this session, attendees will learn how to take disparate 3D workloads and XR proofs of concept and scale them into enterprise-level applications using the tools, building blocks, and patterns available right now. The session will cover the complete XR workload lifecycle, including supporting asset creation, ingestion, transformation, and distribution/deployment.
The session will be led by three AWS Spatial Computing prototype architects, each representing different emerging technology teams within AWS. They will demonstrate how to leverage CAD engineering models, photogrammetry, and AI/ML in 3D asset pipelines to create high-quality XR experiences. Attendees will have the opportunity to get started with the tools, building blocks, and patterns available right now, and learn how to use AWS Spatial to build XR workloads that are scalable, secure, and reliable.
02:30 PM - 02:55 PM
This session showcases how real-time 3D visualization can improve workflows across different industries.
03:00 PM - 03:55 PM
It was recently announced that the Quest Store has generated over $1.5 billion in sales of games and apps. With new headsets steadily entering the market, how do VR game creators adapt to the ever-expanding VR ecosystem? Technologies that greatly affect game feel are released on a regular basis, such as hand tracking, in-headset haptics, and mixed reality. How can creators stay ahead of the curve when designing a game that won’t be released for months or years? Even as the player base expands and the potential for ROI improves, there are still only a few active publishers in the VR space; who are they, and what are they looking for? Is self-publishing VR titles sustainable?
04:00 PM - 04:25 PM
Learn how to quickly create real-time, cloud-connected 3D/AR/VR apps in a few simple steps. Join to see how you can build apps for the metaverse on Android, iOS, HoloLens, Magic Leap, Unity, Unreal, and web browsers, all connected to the same cloud. In this session, we will learn how to change 3D content across all platforms simultaneously through the cloud without rebuilding the apps. We will also look at why metaverse developers are seeking new cloud solutions and why 3D-focused cloud infrastructure is needed. Participants are encouraged to bring a laptop with Unity installed or a mobile device, or they can simply follow the live demonstration.
04:30 PM - 04:55 PM
Join NVIDIA and Varjo as we unveil the next giant leap in immersive computing.
Real-time ray tracing is the holy grail of 3D visualization: it allows engineers, designers, and creators to experience 3D scenes and objects in high fidelity, seeing every reflection, shadow, and light as they would in real life. Until now, rendering true-to-life mixed reality scenes with real-time ray tracing has not been possible due to its demanding compute requirements.
In this talk, NVIDIA and Varjo will demonstrate for the first time real-time ray tracing in human-eye-resolution mixed reality, with never-before-seen visual quality made possible by technological breakthroughs from both companies. With NVIDIA’s Omniverse, the leading platform for 3D collaboration and workflows, and the Varjo XR-3, the most immersive XR headset on the planet, users can unlock real-time ray tracing for mixed reality environments thanks to a powerful multi-GPU setup combined with Varjo’s photorealistic visual fidelity.
In a joint presentation, NVIDIA and Varjo will dive deep into this technical achievement and explain how this new level of mixed reality visualization can unlock unprecedented value for industrial metaverse users across design, engineering, entertainment, and virtual collaboration.
05:00 PM - 05:25 PM
This presentation will discuss the tools and strategies we developed to bridge the “Valley of Death” that many optics developers struggle with, using as an example the development of the world’s first pocket-sized AR device by our customer and partner.
The developers - Bobak Tavangar, CEO of Brilliant Labs, and Olga Resnik, co-founder of Joya Team - will share insights from the process and show how, through extensive experience and expertise in AR systems, we created a unique, holistic development process that helps define and reach development goals. They’ll discuss calculated trade-offs, how to optimize for the most important market parameters, the use of engineering tools to create “virtual prototypes,” and finally the manufacture of “quick and valuable” prototypes in a well-established way, leading to an affordable, manufacturable, use-case-optimal product ready for market adoption.