Agenda

Jun 2

02:00 PM - 02:25 PM

Description

How hologram telecommunication delivers the parts that every other technology leaves out — the parts that make us human.

Speakers

Founder, Proto
Jun 3

10:00 AM - 10:25 AM

Description

Coming Soon!

Speakers

Business Developer, MAXST
Jun 2

03:40 PM - 04:35 PM

Description

How does augmented reality overlap with the shift from web2 to web3?
Amy will moderate a panel of experts on the convergence of #AR with the growing adoption of blockchain, NFTs, and cryptocurrency.

Speakers

Unity Developer, Jadu
VP Accelerator & Portfolio, BoostVC
Managing Partner, WXR Fund
Founder / Executive Creative Director, &Pull
Spatial Interaction Designer, LATE-FX
Jun 1

04:05 PM - 05:00 PM

Description

Haptics is the next major unlock for augmented and virtual reality, much as “HD” and “4K” were for audio and video. This panel will explore the current state of haptics technology in XR, ethics, accessibility, and future-forward use cases, including live events, training, and esports. The discussion will cover technologies, innovations, standards, advice on adoption, and hands-on development.

Speakers

Strategic Partnerships, TITAN Haptics
Strategic Partnership Manager, Contact CI
XR Producer & Content Creator, RealityCheckVR
Developer, OWO
Jun 1

03:25 PM - 03:50 PM

Description

Recently, there has been, and continues to be, a flurry of activity around AR and the Metaverse. How these domains intersect and unfold over time is still very much in the early stages. What is clear, however, is that the “on-ramp,” or gateway, into the Metaverse starts with the ability to perceive the physical and digital worlds simultaneously. Many technologies and devices are needed to enable true immersion, and first and foremost is the ability to overlay the digital domain onto the physical space. In this talk we will discuss these aspects and delve deeply into the near-to-eye display technologies that allow users to coexist in the physical and digital domains.

Speakers

Director, Strategic Marketing, STMicroelectronics
Jun 2

02:30 PM - 02:55 PM

Description

Headset-based AR/VR offers an immersive dive into these new digital worlds, but to many it still feels cumbersome and unfamiliar. As a result, mass adoption is still relatively slow.
3D Lightfield displays offer a naturally immersive, “window-like” 3D look into the Metaverse while leaving users’ faces unencumbered. They can be readily deployed on familiar terminals, from smartphones / tablets to laptops or automotive displays.
Better still, this method ensures compatibility with much of the existing digital content ecosystem, hence democratizing access to the Metaverse and potentially accelerating its deployment.
In this talk, I will review our efforts at Leia to commercialize Lightfield-based mobile devices and our take on how to steadily ramp consumer adoption of the Metaverse.

Speakers

CTO, Leia Inc
Jun 1

12:00 PM - 12:25 PM

Description

Volumetric video technology captures full-body, dynamic human performance in four dimensions. An array of 100+ cameras points inward at a living subject (a person, an animal, or a group of people) and records its movement from every possible angle. Processed and compressed video data from each camera becomes a single 3D file – a digital twin of the exact performance that transpired on stage – for use on virtual platforms. Finished volcap assets are small enough to stream on mobile devices but deliver the visual detail of 100+ cameras, making them a go-to solution for bringing humans into the Metaverse.
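
As a rough illustration of the kind of asset described above (the data structures and file names below are hypothetical, not any vendor's actual format), a processed volumetric capture can be thought of as a sequence of timestamped, textured meshes that a player looks up by playback time:

```python
# A minimal sketch of a volcap sequence: one compressed, textured mesh per
# captured frame, packaged as a single streamable asset.
from dataclasses import dataclass
from bisect import bisect_right

@dataclass
class VolcapFrame:
    timestamp_s: float        # time of this frame within the performance
    mesh_uri: str             # compressed geometry for this instant
    texture_uri: str          # corresponding texture atlas

@dataclass
class VolcapSequence:
    frames: list              # VolcapFrame entries, sorted by timestamp

    def frame_at(self, t: float) -> VolcapFrame:
        """Return the frame to display at playback time t (seconds)."""
        times = [f.timestamp_s for f in self.frames]
        i = max(bisect_right(times, t) - 1, 0)
        return self.frames[i]

# 30 fps playback lookup over a tiny hypothetical sequence.
seq = VolcapSequence(frames=[VolcapFrame(i / 30.0, f"mesh_{i:04d}.drc", f"tex_{i:04d}.jpg")
                             for i in range(90)])
print(seq.frame_at(1.5).mesh_uri)   # mesh_0045.drc
```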

The volumetric video market is expected to grow from $1.5B USD in 2021 to $4.9B USD by 2026 as holographic imaging becomes increasingly crucial for the development of compelling, human-centric immersive content and Metaverse creators strive to solve the “uncanny valley” problem.

The session dives into the latest and greatest applications of volcap in augmented reality across multiple sectors – including fashion, entertainment, AR marketing and branding, enterprise training, and more…

We’ll examine the ground-breaking potential this technology holds for augmented and mixed reality as well as some of the challenges that may face this burgeoning industry.

Speakers

Stage Hand, Departure Lounge Inc.
General Manager, Metastage
Jun 1

05:05 PM - 05:30 PM

Description

In this talk, Luis will provide a structured overview of the key delivery bottlenecks, the technology advancements being made, and case examples of recent metaverse implementations spanning transportation, education, and entertainment, in order to answer the question: “Are we there yet?!”

Speakers

Founder and CEO, Mawari
Jun 3

11:35 AM - 12:00 PM

Description

The Open AR Cloud is working to democratize the AR Cloud with infrastructure based on open and interoperable technology, and we are building city-scale AR testbeds that are being experienced in cities around the world. These are real-world use cases that combine the digital with the physical – rich experiences that are synchronous, persistent, and geospatially tied to a specific location. Content in situ allows the user to explore the world, connect with others, and have a shared experience.

We will discuss new types of content activations based on proximity, gaze, voice, sensor data, and algorithmic spatial ads. Partners will present use cases such as wayfinding and NFT exhibits, as well as case studies that demonstrate how the technology is being used to build more diverse, equitable, and inclusive real-world communities that raise awareness of critical issues like climate change and public health.
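
As a minimal sketch of the proximity-based activation idea mentioned above (the anchor names, coordinates, and radii are hypothetical, and this is not the Open AR Cloud reference implementation), content tied to a geospatial anchor can be activated when the user's reported position falls within that anchor's radius:

```python
# Proximity-based activation of geospatially anchored content.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 coordinates."""
    r = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical anchored content items: (name, lat, lon, activation radius in meters).
anchors = [
    ("wayfinding_arrow", 37.7793, -122.4193, 30.0),
    ("nft_exhibit",      37.7790, -122.4180, 15.0),
]

def active_content(user_lat, user_lon):
    """Return the anchored items whose activation radius contains the user."""
    return [name for name, lat, lon, radius in anchors
            if haversine_m(user_lat, user_lon, lat, lon) <= radius]

print(active_content(37.7792, -122.4192))   # ['wayfinding_arrow']
```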

Speakers

CEO, XR Masters
Director of Spatial Experiences, UXXR Design
Founder, XR Masters
CEO, Creative Director, Novaby
Jun 2

01:30 PM - 01:55 PM

Description

Building digital twins of our environments is a key enabler for XR technology. In this session, we will cover several recent works from InnoPeak Technology on 3D reconstruction (building digital twins) from monocular images. We first present MonoIndoor, a self-supervised framework for training depth neural networks in indoor environments, and consolidate a set of good practices for self-supervised depth estimation. We then introduce GeoRefine, a self-supervised online depth refinement system for accurate dense mapping. Finally, we talk about PlaneMVS, a novel end-to-end method that reconstructs semantic 3D planes using multi-view stereo.
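
For readers unfamiliar with self-supervised depth estimation, the sketch below shows the general photometric-reprojection idea such frameworks build on; it is a simplified illustration (not the MonoIndoor, GeoRefine, or PlaneMVS code), and the function names are hypothetical:

```python
# Self-supervised monocular depth, in essence: back-project target-frame pixels
# with the predicted depth, transform them by the estimated relative pose,
# re-project them into a source frame, and use the photometric difference
# between the two views as the training signal.
import numpy as np

def reproject(depth, K, T_src_tgt):
    """Map target-frame pixels into the source frame.

    depth     : (H, W) predicted depth for the target frame
    K         : (3, 3) camera intrinsics (assumed shared by both frames)
    T_src_tgt : (4, 4) relative pose taking target-frame points to the source frame
    returns   : (H, W, 2) pixel coordinates in the source frame
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).astype(np.float64)   # (H, W, 3)

    rays = pix @ np.linalg.inv(K).T                  # back-project to unit-depth rays
    pts_tgt = rays * depth[..., None]                # 3D points in the target camera frame

    pts_h = np.concatenate([pts_tgt, np.ones_like(depth)[..., None]], axis=-1)
    pts_src = pts_h @ T_src_tgt.T                    # points in the source camera frame

    proj = pts_src[..., :3] @ K.T                    # project back to source pixels
    return proj[..., :2] / np.clip(proj[..., 2:3], 1e-6, None)

def photometric_loss(target, source_warped):
    """L1 photometric error between the target image and the source image
    resampled at the reprojected coordinates (the sampling step is omitted here)."""
    return np.abs(target - source_warped).mean()

# Toy check with an identity pose: reprojected coordinates equal the original pixels.
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
coords = reproject(np.full((480, 640), 2.0), K, np.eye(4))
```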

Speakers

Director, OPPO
Jun 3

02:35 PM - 03:00 PM

Description

Ray tracing techniques have dramatically improved graphics rendering quality in recent years. In the coming few years, mobile chipsets will also support hardware-accelerated ray tracing, which will bring more visually believable virtual environments with realistic lighting and shadowing effects. It will become a major technique in mobile gaming and in augmented and virtual reality devices.

OPPO, in collaboration with partners, began developing its ray tracing technology in early 2018 and initially adopted a hybrid rendering method to gradually introduce ray tracing to existing mobile devices. This talk will introduce the short history of mobile ray tracing, forecast its trends in mobile devices, and explore potential applications.
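
As background, the sketch below shows the basic primitive every ray tracer relies on, a ray/triangle intersection test (Möller-Trumbore). It is a generic illustration, not OPPO's implementation; hybrid pipelines of the kind described typically rasterize primary visibility and spawn rays like this only for selected effects such as reflections or shadows.

```python
# Moller-Trumbore ray/triangle intersection, the core operation of ray tracing.
import numpy as np

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    """Return the hit distance t along the ray, or None if there is no hit."""
    e1, e2 = v1 - v0, v2 - v0
    pvec = np.cross(direction, e2)
    det = np.dot(e1, pvec)
    if abs(det) < eps:               # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    tvec = origin - v0
    u = np.dot(tvec, pvec) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    qvec = np.cross(tvec, e1)
    v = np.dot(direction, qvec) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, qvec) * inv_det
    return t if t > eps else None

# Example: shoot a secondary ray from a rasterized surface point at the origin.
hit = ray_triangle(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
                   np.array([-1.0, -1.0, 5.0]), np.array([1.0, -1.0, 5.0]),
                   np.array([0.0, 1.0, 5.0]))
print(hit)  # 5.0: the triangle lies in the plane z = 5
```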

Speakers

Sr. Director, OPPO
Jun 3

02:05 PM - 02:30 PM

Description

As part of a DoD project, Deloitte recently built a successful 5G infrastructure, along with a multitude of technologies on top of it, adding measured value to one of the largest military branches in the US. In this talk we'll discuss AR, edge compute, and the impact 5G is actively having across dozens of military bases today, as well as where we see the future of XR going beyond the current 'Metaverse' buzz.

Speakers

Senior Manager, Deloitte Consulting
Jun 3

01:30 PM - 02:00 PM

Description

We are rapidly entering the world of augmented intelligent reality, where experiences are built at the intersection of the real and digital worlds. In this talk we will share some amazing success stories: discover how Passio is using Unity to build game-changing experiences by combining AR with AI that runs on user devices to transform fitness, healthcare, home remodeling, and other industries. Join us and be inspired to create the world of your dreams using the next wave of AI and AR technologies.

Speakers

Co-Founder & CEO, Passio
Jun 1

11:30 AM - 11:55 AM

Description

Digital Twins, The Metaverse, Game Engines, 3D Maps, and Computer Vision are all merging to realize a vision of "Sim City" in which the city is our real city. When will this happen? How will it happen? What will it mean for AR and VR? What will we use it for? Why will web3 & crypto be important?

Matt will explore what comes after the AR Cloud is built and we can finally connect The Metaverse to the Real World, showing technically ground-breaking, world-first demos from his new startup, which is making all this possible.

Speakers

CEO, Dejavu
CEO, Stealth
Jun 3

09:00 AM - 09:25 AM

Description

In this session we will explore how dynamic aberration correction can increase the apparent resolution, eliminate color fringing and pupil-swim effects, and enlarge the eye-box in the highest-end displays, such as the Oculus Quest, Varjo VR-1, and HP Reverb G2.

Having to achieve high resolution, a wide field of view, and a large eye-box, VR/AR head-mounted display makers face challenges that are impossible to overcome by hardware design alone. Even the latest and greatest devices retain common flaws that spoil the user experience: blur and color fringing outside of the small “sweet spot,” picture-quality degradation and geometry distortion at wide gaze angles, and a tiny eye-box.

In order to achieve realistic picture quality and a natural visual experience, the rendering pipeline has to include advanced image pre-processing beyond the standard geometry warp and channel scaling. Almalence created the Digital Lens, a computational solution that uses a precise characterization of the HMD's optical properties along with a dynamic aberration correction technique, adjusting on the fly to eye-tracking data.
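
The sketch below illustrates the general idea of gaze-dependent, per-channel pre-warping that such a pipeline stage performs. It is a simplified illustration under an assumed radial-distortion model and a hypothetical calibration table, not Almalence's Digital Lens code.

```python
# Per-channel, gaze-dependent pre-warp: each color channel gets its own
# correction coefficients, chosen from a calibration table indexed by gaze angle.
import numpy as np

def radial_prewarp(xy, k1, k2):
    """Apply a radial pre-warp to normalized image coordinates xy (N, 2)."""
    r2 = np.sum(xy ** 2, axis=1, keepdims=True)
    return xy * (1.0 + k1 * r2 + k2 * r2 ** 2)

def prewarp_frame(coords, gaze, calibration):
    """coords: (N, 2) normalized pixel coordinates; gaze: (yaw, pitch) from the
    eye tracker; calibration: map from a quantized gaze bin to per-channel
    (k1, k2) pairs measured for the headset's optics."""
    gaze_bin = tuple(np.round(np.asarray(gaze), 1))            # coarse lookup key
    per_channel = calibration.get(gaze_bin, calibration[(0.0, 0.0)])
    return {ch: radial_prewarp(coords, k1, k2) for ch, (k1, k2) in per_channel.items()}

# Hypothetical calibration: slightly different coefficients per color channel.
calib = {(0.0, 0.0): {"r": (-0.10, 0.01), "g": (-0.11, 0.01), "b": (-0.12, 0.02)}}
pts = np.array([[0.5, 0.5], [0.1, -0.2]])
print(prewarp_frame(pts, gaze=(0.02, -0.01), calibration=calib))
```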

Speakers

CEO, Almalence
Jun 1

01:55 PM - 02:20 PM

Description

After a full-house presentation at AWE 2017, our team will return with new hands-free 3D user interfaces for XR headsets that use the eyes alone. For the first time we will give insight into our technology and explain how eye-vergence-based 3D interaction was achieved and why it is as significant for XR headsets as the touch screen was for smartphones.

During the presentation we will demonstrate on the latest XR headsets how users can interact in their forward visual space – e.g., pick up, move, and rotate 3D content – using binocular eye-vergence movements, without having to engage other objects such as controllers or display elements. In summary, we will take a deep dive into hands-free 3D user interfaces using the eyes alone and share how they are useful in consumer and enterprise environments.
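
The geometric basis of vergence-driven interaction can be sketched as finding where the two gaze rays come closest. The code below is a generic illustration of that computation, not Pillantas' implementation.

```python
# Estimate a 3D fixation point from binocular eye tracking as the midpoint of
# the shortest segment between the left and right gaze rays.
import numpy as np

def vergence_point(p_left, d_left, p_right, d_right):
    """p_*: eye positions (3,); d_*: unit gaze directions (3,)."""
    w0 = p_left - p_right
    a = np.dot(d_left, d_left)
    b = np.dot(d_left, d_right)
    c = np.dot(d_right, d_right)
    d = np.dot(d_left, w0)
    e = np.dot(d_right, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-9:            # gaze rays (nearly) parallel: looking far away
        return None
    s = (b * e - c * d) / denom      # parameter along the left gaze ray
    t = (a * e - b * d) / denom      # parameter along the right gaze ray
    closest_left = p_left + s * d_left
    closest_right = p_right + t * d_right
    return 0.5 * (closest_left + closest_right)

# Eyes 6.4 cm apart, both converging on a point roughly 0.5 m straight ahead.
left, right = np.array([-0.032, 0.0, 0.0]), np.array([0.032, 0.0, 0.0])
target = np.array([0.0, 0.0, 0.5])
dl = (target - left) / np.linalg.norm(target - left)
dr = (target - right) / np.linalg.norm(target - right)
print(vergence_point(left, dl, right, dr))   # ~[0, 0, 0.5]
```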

Speakers

Co-Founder, CEO, Pillantas Inc.
Jun 3

09:30 AM - 09:55 AM

Description

Miniaturization is the trend toward manufacturing ever-smaller mechanical, optical, and electronic products, medical devices, and other high-value parts. This trend continues to be strong, with year-over-year growth in many markets. One of the limiting factors on miniaturization is the inability of traditional manufacturing methods like injection molding and CNC machining to effectively and economically produce smaller and smaller parts.

Additive Manufacturing (AM), or 3D printing, has been around for over 30 years. For a long time, there were only a few technologies available and applications were generally limited to prototyping. Past advances in AM have fallen short of meeting the needs of small parts: printing them at a resolution, accuracy, precision, and speed that would make them a viable option for end-use production parts. That has all changed. Additive Manufacturing and miniaturization are now converging – in a very meaningful and impactful way.

The growth of the AR/VR market and its pace of innovation are opening up applications that were not even imagined a decade ago. With this come challenges for manufacturing to scale at the same pace. Many leading AR/VR technology companies have started using micro 3D printing to produce various micro-precision components as an alternative to traditional fabrication methods – finding huge time and cost savings.

Learn how micro-precision 3D printing enables companies in this competitive space to address development challenges imposed by current microfabrication methods, and also allows them to push the limits of miniaturization by expanding boundaries otherwise thought impossible with 3D printing.

Speakers

Director of Sales, the Americas, Boston Micro Fabrication
Jun 1

02:55 PM - 03:20 PM

Description

A discussion of the emerging digital frontier of the Metaverse, and how it will change the way we use and interact with computers in both our personal and professional lives.

Speakers

Co-Founder & Chief Technology Officer, VISR Dynamics
Jun 3

03:05 PM - 03:30 PM

Description

The prize of functional, lightweight, all-day wearable smartglasses remains as elusive as ever. The missing linchpin component is the optical combiner, which must reconcile the conflicting requirements of see-through quality, display performance, light weight, efficiency, prescription compatibility, and aesthetics. In this talk I’ll describe META’s approach to AR optical combiners, based on a free-space architecture enabled by holographic optical elements. Combined with our ophthalmic-compatible ARfusion lens casting technology, the result is a monolithic combiner that provides extraordinary optical efficiency, ophthalmic-grade see-through clarity, and the prescription and aesthetic properties that opticians, consumers, and regulators expect from eyewear. I’ll present META’s One Stop Shop approach to individualization, which circumvents many of the concerns around eyebox size and manufacturability.

Speakers

CTO, META® Materials
Jun 1

05:35 PM - 06:00 PM

Description

Unlocking the potential of Metaverse-enabled devices involves contextual perception across multiple sensing domains like sound, vision, motion, and biometrics, and thus requires on-device signal processing with edge AI inferencing, as well as wireless connectivity and 5G for cloud immersion. In this session, you will learn how the multi-domain sensing and connectivity challenges in wearable and XR devices are overcome using a comprehensive platform approach.

Speakers

Senior Director of Customer Solutions, CEVA