Holo-Light provides XR streaming technology and a platform for enterprises to scale XR use cases. Susanne Haspinger, co-founder and COO, will be joined by Michael Hinckley, Senior Manager of Manufacturing Technology at Northrop Grumman, to discuss the challenges, requirements, and a solution approach for enterprises looking to centralize and scale their XR initiatives. In this session, you'll learn what the industrial metaverse looks like when it combines XR use cases with simplicity, data security, and performance.
The JPEO-CBRND partnered with MRIGlobal and ForgeFX Simulations to produce the CBRND HoloTrainer, a networked, multiuser Microsoft HoloLens 2 augmented reality training simulator for Chemical, Biological, Radiological, and Nuclear detection device operators. This presentation will cover the entire project lifecycle, including government requirements and the development of holographically projected interactive virtual equipment, culminating in deployment to soldiers. The CBRND HoloTrainer is a groundbreaking spatial computing application that significantly increases the effectiveness of its users.
Did you know that collaboration in the metaverse can benefit greatly from haptics? Learn how realistic, multiuser touch feedback can be cost-effective and fun for teamwork in networked, immersive environments. We’ll also announce preliminary results of exciting new partnerships. Join us as we explore insights and developments which indicate an exciting 5-year path ahead for haptics-enabled training and industrial environments.
Stakeholders, schools, students, and investors come together to fix a problem that has defeated the educational system for decades. Now, with the creator economy, there's a bright future for all!
Learning and earning through educational, collaborative, and fun games is the wave of the investment future, with staking for long-term wins!
• The creator economy as a path to training and upskilling
• Smart contracts for certification and rewards
• Staking and funding early-stage student projects
In large healthcare systems and academic institutions, there are often different departments utilizing similar VR, AR, and/or XR technology to perform specialized functions. As ideas, research, and use cases grow, the number of projects utilizing similar technology increases. This can lead to an array of hardware and software along with inconsistent deployments and support. This presentation will discuss the benefits of coordinating resources and stakeholders to identify systemic barriers and share solutions.
The AR landscape allows us to create unique experiences, but sometimes we face platform limitations and constraints. Over the years, we've learned that constraints don't have to mean a lesser result. Your AR experiences can even be greater if you pick the right battles to fight.
We will go over a few projects from The Electric Factory that showcase how shaders, UV tricks, and procedural animations can add more value to your experiences.
We will dive into Talita Hoffman's project, a 20-foot mural at Meta's Brazil office with nine fully animated spaces living together in a single AR filter. With the help of custom-built shaders and procedural animation, we delivered this experience in under 3.7 MB, sound design included.
We will also revisit the SpaceJam AR project, a fully animated 360° panorama with 15 animated characters. With procedural animations and a series of UV tricks, we delivered an engaging experience in under 1.7 MB with outstanding results.
Let us show you how to augment your reality with the best hacks in the business.
This panel features both the client and the agency that created a compelling AR activation leveraging storytelling, serialized content, and both in-person and virtual events to connect with developers.
This innovative campaign used web-based AR to help drive engagement at a virtual event, offering up a gamified and educational interactive experience featuring 'Chuck' - a sarcastic and snarky block character that led the user through the experience. The campaign further evolved by engaging in-person conference attendees with booth-activated AR experiences that continued Chuck's story.
Hear how PwC is experimenting with the metaverse and implementing these new technologies into the workplace – to bring teams together, connect employees across the globe, and create a shared sense of culture across their entire organization.
Artificial Intelligence (AI) is already having a significant influence on the Metaverse and the intersection of AI and the Metaverse will become increasingly evident as AI evolves and the Metaverse grows.
AI can be used to create more realistic and engaging experiences in virtual worlds, intelligent avatars that can engage with users in a more natural way, and personalized content in real-time. All of this can improve the overall experience for users. Furthermore, AI can be used to create more efficient and scalable infrastructure for the Metaverse, enabling more users to participate and interact with each other.
However, there will be challenges as well, such as biases in algorithms that perpetuate harmful stereotypes, false or manipulated content leading to misinformation, and questions about privacy and security.
This expert panel will discuss the issues and how AI might potentially transform how we interact with others in immersive spaces.
Doublepoint builds gesture-based touch interfaces to enable efficient, robust, and intuitive AR interaction. In this talk, they will showcase how these solutions can work as standalone input modalities or be combined to supercharge eye tracking and hand tracking.
User input is one of the many challenges standing in the way of the mass adoption of AR. How will the everyday person interact in Augmented Reality? What's the role of touch in interaction? At Doublepoint we research different ways for people to interact in AR and develop the best technologies to detect these interactions efficiently and reliably.
Currently, there are a lot of input methods for AR using built-in sensors of the headset, such as hand tracking, eye tracking, and voice input. However, if we want it to be as transformative as the graphical user interface or the capacitive touch screen, we need to put deliberate thought into building the ideal input paradigm and the needed hardware that might not be in the headset itself.
In this talk:
• We’ll demonstrate how a machine learning algorithm on existing smartwatches can already significantly improve AR interaction.
• We’ll show how it can be combined with eye tracking and hand tracking sensors in the headset to improve interactions even more.
• Lastly, we'll show some of our future custom hardware dedicated to sensing advanced micro-gestures in a small and convenient form factor.
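As a loose illustration of the underlying idea (Doublepoint's actual detectors are proprietary machine-learning models; the function, threshold, and data below are invented for this sketch), wrist-gesture sensing can be thought of as spotting characteristic spikes in a watch's motion signal:

```python
# Toy sketch only: flags a "tap" when accelerometer magnitude spikes above a
# threshold, with a refractory window to avoid double-counting one tap.
# Real systems use trained ML models, not a fixed threshold.

def detect_taps(samples, threshold=2.5, refractory=10):
    """samples: list of (ax, ay, az) in g; returns sample indices of detected taps."""
    taps = []
    last = -refractory
    for i, (ax, ay, az) in enumerate(samples):
        mag = (ax * ax + ay * ay + az * az) ** 0.5
        if mag > threshold and i - last >= refractory:
            taps.append(i)
            last = i
    return taps

# Quiet signal (resting wrist, ~1 g from gravity) with one sharp spike.
data = [(0.0, 0.0, 1.0)] * 20
data[10] = (2.0, 2.0, 1.0)  # magnitude = 3.0 > threshold
print(detect_taps(data))  # → [10]
```

The appeal of running such detection on the watch rather than the headset is that the sensor sits where the gesture happens, so the signal is strong even when the hand is outside the headset's camera view.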
Business, social, and educational organizations are just beginning to navigate the potential of Web3 for creating communities, including virtual worlds, both screen-based and in virtual reality. And while the drive to develop photorealistic avatars continues, the visual biases that persist in our cultures make the case for flexible and individualized avatar design that may allow for greater inclusivity, creativity, and engagement. This presentation will touch on current research into the opportunities and challenges of avatar design as a tool for diversity, equity, and inclusion.
There are billions of computing devices on Earth filled with millions of apps and hundreds of billions of media files. How do we bring their users into the future? 3D-aware AI technology is a stepping stone that provides a clear upgrade path for these users, creators, and developers, ensuring full forward and backward compatibility of apps and media content. This talk will discuss free and paid tools powered by machine learning, as well as specific yet generalizable workflows for content capture, creation, development, editing, and distribution of software and media content. It will also highlight effective solutions for displaying XR content to all those people who are reluctant to put on a headset.
Realistic-looking materials are essential to creating virtual worlds that are tactile and inviting. Creating believable materials for 3D objects is a complex problem with several approaches, many of which are either prohibitively difficult to master or unsuited for transmission over the web. Over the past decade, however, Physically Based Rendering (PBR) has emerged as an artist-friendly, intuitive, expressive, and robust technique for adding material attributes to 3D objects.
As part of its stewardship of the glTF standard, the file format known as the “JPEG of 3D,” The Khronos Group has created formal specification extensions for a variety of PBR materials. In this session, members of the 3D formats working group will present the latest waves of PBR extensions, explaining what they can achieve and best practices for using them to create 3D assets. They’ll also discuss how to test for model performance consistency across 3D viewers.
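As a rough illustration of what these extensions look like in practice (the material name and factor values below are invented for this sketch), a glTF material layers ratified KHR PBR extensions such as KHR_materials_transmission and KHR_materials_ior on top of the core metallic-roughness model:

```json
{
  "extensionsUsed": ["KHR_materials_transmission", "KHR_materials_ior"],
  "materials": [
    {
      "name": "FrostedGlass",
      "pbrMetallicRoughness": {
        "baseColorFactor": [1.0, 1.0, 1.0, 1.0],
        "metallicFactor": 0.0,
        "roughnessFactor": 0.4
      },
      "extensions": {
        "KHR_materials_transmission": { "transmissionFactor": 0.9 },
        "KHR_materials_ior": { "ior": 1.5 }
      }
    }
  ]
}
```

Because each extension is declared in `extensionsUsed` and scoped to the material, viewers that don't support a given extension can still fall back to the core PBR parameters, which is part of what makes the approach robust for web transmission.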
Delta Air Lines is the U.S. global airline leader in safety, innovation, reliability, and customer experience. Powered by its employees around the world, Delta has, for a decade, led the airline industry in operational excellence while building on its reputation for award-winning customer service.
Creating a safe environment for its employees and customers is one of Delta's top priorities. As such, it is imperative that new employees go through rigorous training to do their job safely and effectively to achieve the high-quality service that Delta is known for.
This talk delves into the benefits and challenges of using a virtual-reality-first design approach for creating interactive training simulations. We explore the advantages of utilizing VR technology to enhance learning and development and how designing with full VR interactivity in mind leads to more effective training outcomes. We will then discuss the impact of adding moderator capabilities into an iPad companion app to enhance the experience for the person in virtual reality and test their skills at a higher level.
Find out how Delta identified the intended means of distribution and deployment, as well as the process for determining whether WebXR (a desktop version of the VR simulation) is a viable solution. We will share the process of adapting the virtual reality simulations for use on desktop platforms to increase accessibility for a wider audience while still preserving immersion. The presentation will cover topics such as the process of designing simulations that are both immersive and effective and the technical considerations involved in adapting VR simulations for use on desktop platforms.
This session will leave attendees with the following takeaways:
• Deeper understanding of how to design and develop training simulations that take full advantage of the unique capabilities of VR technology
• Benefits of intentional design
• How to translate VR simulations to other platforms for maximum reach and impact
• Considerations for prototyping and production before you start your own project
• Technical considerations for designing simulations that can be used in both virtual reality and desktop environments
Join us for a thought-provoking panel discussion on the impact of extended reality (XR) on the fashion industry and its implications for brands and user engagement. Our panel of experts will delve into the ways in which XR technology is changing the fashion landscape and how it is being used to enhance the shopping experience for consumers. We will also explore the potential for XR to increase sustainability in the industry by reducing the need for physical products, as well as its impact on supply chain management and production processes. Don't miss this opportunity to gain insights and understanding of the future of fashion in the XR era.
News organizations were among the first to embrace 360° VR content. Now Paramount's CBS News Bay Area and CreatorUp have taken immersive storytelling back into the spotlight. In the last five months, five customized storytelling projects set engagement and viewership records as Scott Warren from Paramount and Dan Krolczyk from CreatorUp merged teams to remove obstacles. Join the session for an opportunity to see the fifth-season finale of stories that range from the snowcapped mountains of Tahoe to the raging Pacific outside the Golden Gate Bridge. Learn more about how Insta360, HTC Vive, and other partners helped along the way.
Incorporating compassion and empathy in metaversive learning can be an effective strategy for increasing the retention of discipline-specific lessons in immersive XR learning environments. Morehouse College is exploring the use of emotive experiences to increase student engagement in their Metaversity. By integrating virtual reality simulations, storytelling, and perspective-taking exercises, students are able to connect with the learning material on a deeper, emotional level.
Virtual reality simulations allow students to experience real-world scenarios that demonstrate the importance of the lesson in a more tangible way. The storytelling aspect engages students' emotions and helps them to retain the material better. Perspective-taking exercises encourage students to think outside of their own perspective and see issues from multiple angles, fostering empathy and understanding. These emotive experiences have resulted in increased student engagement and a deeper understanding of the material, which can be applied in real-world settings.
Morehouse College's approach to incorporating emotive experiences in its Metaversity has the potential to improve student outcomes and better prepare them for future success. Take a walk around its digital twin campus, journey back in time to the La Amistad slave ship and World War II, and discover the lives of others through the lens of storytelling.
Dr. Paul Chapman demonstrates our novel XR patient leaflets for cleft lip, allowing parents to better understand and visualise their child's surgery. This is a collaboration between the Canniesburn Plastic Surgery Unit, School of Simulation and Visualisation (SimVis), Glasgow Children's Hospital Charity, Cleft Care Scotland and West of Scotland Innovation Hub. The work is published in January’s issue of The Cleft Palate Craniofacial Journal.
In addition, we demonstrate XR technology for working with complex pharmaceutical equipment by creating a digital twin of a typical laboratory.