16:05 - 16:30
The first full-time adopters of AR glasses will be people who already wear eyewear on a daily basis. Virtually everyone in this category requires prescription lenses.
Enabling personalized, prescription AR glasses hinges on mass customization of optics. However, this is only part of the picture. This talk will focus on the supply chain needed to deliver personalized AR glasses to the masses.
There is a lot to learn from the existing supply chain that already delivers personalized prescription eyewear to consumers. The eyewear industry solved the problem of delivering personalized glasses by separating the end product into two components: the frame and the lenses.
In the future, AR glasses will be designed in a comparable way to current eyewear: a smart frame and smart lenses, with the lenses containing the display technology.
During this talk we will take a deeper look at the AR ecosystem and how the development of smart lenses (with integrated light engines) will enable a supply chain with many parallels to the current eyewear industry.
Specifically, we will focus on how current players in the AR ecosystem will collaborate in the future. Which players will enable smart frames? Which industries will enable the smart lenses, and how? How does software play a role in this?
10:35 - 11:00
Join Yacine Achiakh, Founder of Wisear, as he delves into the crucial role of neural interfaces in driving the widespread adoption of augmented reality (AR) and demonstrates their benefits live on stage. In this captivating keynote, Yacine will explore the evolution of human-computer interfaces, from keyboards and mice to touchscreens, and highlight the need for a new generation of interfaces to propel AR into the mainstream. Discover how current controllers fall short in delivering seamless and immersive interactions, and why alternatives like voice and hand tracking have their limitations.
Yacine will unveil the game-changing potential of neural interfaces, which enable touchless and voiceless control through facial muscle, eye, and brain activity. Witness live how Wisear is at the forefront of building neural-interface-powered products that revolutionize the way we interact with AR and VR devices, paving the way for ubiquitous adoption in consumer and enterprise applications. Don't miss this enlightening presentation that will shape the future of human-computer interaction in XR.
11:05 - 11:30
Through real-world case studies from various industries, witness the remarkable achievements of organizations that have successfully decreased implementation time by 40% or more, while consistently achieving right-first-time results. Gain valuable insights into the transformative power of XR technology and its potential to revolutionize the manufacturing landscape.
Join us in this engaging session as we delve into the future of implementation practices, harnessing the potential of XR technology and new methodologies to drive unprecedented speed and success in the implementation of production lines and automation projects. Prepare to be inspired and equipped with practical strategies to enhance your organization's efficiency and ensure the seamless execution of your next implementation endeavor.
12:30 - 12:55
Dispelix develops and delivers lightweight, high-performance see-through waveguide combiners that are used as transparent displays in extended reality (XR) devices. Our full-color near-eye displays encompass all dimensions of XR comfort (social, wearable, and visual alike) in a simple eyeglass-lens form. A rich and pleasant XR experience calls for a seamless merger of display and light engine technologies. In her presentation, Dispelix Vice President Pia Harju discusses how advanced design contributes to XR comfort and at the same time helps draw the full potential from the light engine and display. We will showcase how built-in mechanical and optical compatibility fuses the aesthetic and functional aspects of design, paving the way for a mind-altering XR eyewear experience.
13:00 - 13:25
Testing the optical performance of components and the image quality of a completed XR headset is an important but not well-known part of the product development and mass production cycle; in fact, today's headsets would likely not exist without it.
In this presentation, we will give an overview of the various steps in the production process where optical testing comes into play and discuss new developments such as active alignment, where real-time test data is used to assemble XR modules for the best image quality.
Finally, we describe a new generation of test equipment that uses custom high-end optics specially tailored to XR and integrates both "big"- and "small"-scale image quality test capabilities in one instrument, bringing test technology to the next level to support tomorrow's high-resolution XR headsets.
13:30 - 13:55
Solving the vergence-accommodation conflict, the mismatch between perceived and focal distances in stereoscopic 3D displays, is a critical hurdle for augmented reality (AR). Overcoming it would smooth interactions with virtual content, blurring the line between the simulated and the real, yet the industry has yet to settle on an approach.
In this presentation, IDTechEx outlines the range of display and optical systems proposed to solve the vergence-accommodation conflict, weighing technological suitability against market forces to suggest likely candidates for wide deployment. Technologies including retinal projection, holographic and light field displays, and focus-tunable lenses are detailed and benchmarked on social acceptability, manufacturing feasibility, and other factors. Combined with an assessment of industry forces, this leads to adoption roadmaps that plot a path for integration into immersive consumer AR devices.
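For readers new to the conflict, the minimal numerical sketch below illustrates it, assuming a typical 63 mm interpupillary distance and a fixed 2 m focal plane for the display optics (both placeholder values, not figures from the talk): stereo disparity drives vergence to the virtual object's distance, while accommodation stays locked to the focal plane, and the difference in diopters is the conflict.

    import math

    # Illustrative only: quantify the vergence-accommodation mismatch for a headset
    # whose optics place the focal plane at a fixed distance, while stereo disparity
    # asks the eyes to converge on virtual objects at other distances.

    IPD_M = 0.063          # assumed typical interpupillary distance (~63 mm)
    FOCAL_PLANE_M = 2.0    # assumed fixed focal distance of the display optics

    def vergence_angle_deg(distance_m: float, ipd_m: float = IPD_M) -> float:
        """Angle (degrees) between the two eyes' lines of sight when fixating at distance_m."""
        return math.degrees(2.0 * math.atan(ipd_m / (2.0 * distance_m)))

    def diopters(distance_m: float) -> float:
        """Optical demand in diopters (1 / distance in metres)."""
        return 1.0 / distance_m

    for virtual_distance in (0.5, 1.0, 2.0, 10.0):
        conflict = abs(diopters(virtual_distance) - diopters(FOCAL_PLANE_M))
        print(f"virtual object at {virtual_distance:4.1f} m: "
              f"vergence angle {vergence_angle_deg(virtual_distance):5.2f} deg, "
              f"accommodation mismatch {conflict:4.2f} D")

The approaches surveyed in the talk (retinal projection, holographic and light field displays, focus-tunable lenses) can be read as different ways of driving that mismatch toward zero across the range of virtual distances.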