10:30 AM - 10:55 AM
Digitizing large physical spaces in Augmented Reality holds tantalizing possibilities across industries. Join us for lessons learned from the digitization of building construction projects. Learn how to leverage your enterprise's existing building assets to make project management and inventory data spatial, and how to use your physical space to design efficient AR workflows. Finally, we'll dive into the technical challenges of implementing AR in large areas and how to overcome them, including drift, repetitive elements, and vertical travel.
10:50 AM - 11:15 AM
In this session, participants will learn how to create real-time, cloud-connected augmented and virtual reality (AR/VR) apps in less than 15 minutes.
Participants will see how to build native apps for Android with ARCore, iOS with ARKit, HoloLens and Magic Leap with Unity, and web browsers with WebXR and AR.js, all connected to the same cloud. Participants will learn how to change AR content across all platforms simultaneously through the cloud without rebuilding the apps, and will see how a small change in one cloud-connected app can send updates to all other apps across the different AR SDKs and platforms.
Platforms for experimentation include Google ARCore, WebXR, Vuforia, Unity-based apps, and more. Participants are encouraged to bring a laptop with Unity installed or an ARCore-enabled mobile device, or simply follow along with the live demonstration.
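To make the cloud-sync idea concrete, here is a minimal sketch (not the presenters' actual stack) of the pattern the session describes: every client, whatever its AR SDK, subscribes to one shared channel, and a content change published by any client is applied by all the others without rebuilding the apps. The endpoint URL, message shape, and helper names below are illustrative assumptions.

```typescript
// Shape of a content update fanned out by the cloud (assumed for illustration).
type ContentUpdate = {
  objectId: string;                    // which AR object to modify
  position: [number, number, number];  // new world-space position, in meters
  modelUrl?: string;                   // optionally swap the 3D asset
};

// Hypothetical cloud endpoint; any realtime channel (WebSocket, pub/sub) works the same way.
const socket = new WebSocket("wss://example.com/ar-content");

// Apply updates pushed from the cloud to the local scene.
socket.onmessage = (event: MessageEvent) => {
  const update: ContentUpdate = JSON.parse(event.data);
  applyToLocalScene(update);
};

// Publish a change; the cloud relays it to every other connected client.
function moveObject(objectId: string, position: [number, number, number]): void {
  const update: ContentUpdate = { objectId, position };
  socket.send(JSON.stringify(update));
}

// Placeholder for the per-platform renderer hook (Unity, WebXR, ARKit, ...).
function applyToLocalScene(update: ContentUpdate): void {
  console.log(`Updating ${update.objectId} ->`, update.position);
}
```

The same publish/subscribe contract can back a Unity client, a native ARKit/ARCore app, or a WebXR page, which is what lets one edit propagate everywhere at once.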
10:50 AM - 11:15 AM
With Reality Faucet, Pantomime has created some of the first physically based augmented reality experiences -- where objects seem real because momentum, friction, collision sounds, gravity, dynamic lighting, and shadows conspire to make us believe virtual objects are real. In particular, the 3D LiDAR sensors in modern iPad Pro and iPhone Pro devices are enabling a new frontier in consumer augmented reality -- one where users can easily reach in with popular mobile devices.
Levitt will talk about this emerging frontier, how to have the best experiences with interactive LiDAR, and how to leverage this magical new level of realism with your own physically realistic content.
10:50 AM - 11:15 AM
3D creativity has been viewed by creatives as both exciting and daunting. While we’ve all been blown away by the intricate fantasy worlds that game developers build, and the immersive 3D visual effects in movies, many of us thought creating in 3D was too technically demanding. Our imaginations have run wild with the possibilities – yet we couldn’t imagine using it ourselves.
The pandemic has changed that. When we couldn’t get together for photo shoots and other creative projects, companies turned to 3D to create. Fashion brands used 3D to speed their design processes. Lowe’s used it to digitize their entire catalog for virtual shopping. And 3D virtual photography was the foundation for a Ben & Jerry’s marketing campaign.
What does this mean for creatives moving forward? New technologies and workflows are driving seamless and intuitive creation with end-to-end software integration, as many of our future experiences will likely come through a 3D, immersive interface. In this session, Sebastien Deguy, Vice President of 3D & Immersive at Adobe, will discuss how 3D goes beyond marketing campaigns and into creating entire immersive worlds in virtual or augmented reality that are purely the product of a creative person’s imagination.
10:50 AM - 11:15 AM
This presentation takes an in-depth look at the fundamental challenges of creating XR solutions and how PlugXR addresses them. This SaaS-based product is opening the XR industry to a diverse audience through its simple, code-free interface. The session also introduces PlugXR's brand-new features, including Geo Anchors, Area Targets, and WebXR, among others.
11:20 AM - 11:45 AM
Major companies are investing billions of dollars in efforts to create the future of the Spatial Web, the AR Cloud--that ubiquitous yet ethereal computing platform that we think of when imagining everyday interactions with holograms and content, literally at our fingertips. Smaller companies, researchers, and creators are also joining forces through the Open AR Cloud to ensure that no one company wins the AR localization war. Test beds are being built, and city-scale AR creations are being experienced in cities throughout Europe and the United States.
Join us as we share our journey mapping our first city blocks, placing our first virtual objects in the real world, then building rich experiences that are shared, persistent, and geospatially tied to a specific location. We'll also talk about AR Cloud basic principles, the services required, and how the Open Spatial Computing Platform allows for interoperability between service providers. We'll demonstrate our use of a variety of technologies, including Augmented City, a cloud and platform ecosystem that allows you to capture, enrich, and visualize content on location; Zappar's award-winning SDKs, computer vision libraries, and AR creative tools; as well as universally accessible WebXR solutions. Additionally, we will present how we are pushing the boundaries of the Spatial Web by creating multi-user, synchronized experiences that are activated by proximity, gaze, voice command, or click.
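As a rough illustration of the proximity-activation idea mentioned above, the sketch below checks whether a user's coarse GPS fix has entered the activation radius of a geospatially anchored experience. The anchor schema and threshold are assumptions for this example, not the Open Spatial Computing Platform's actual data model; precise placement in a real deployment would then be refined by on-device visual positioning.

```typescript
// Assumed geo-anchor record for an experience tied to a specific location.
interface GeoAnchor {
  id: string;
  latitude: number;     // degrees
  longitude: number;    // degrees
  radiusMeters: number; // activation radius around the anchor
}

// Haversine great-circle distance between two lat/lon points, in meters.
function distanceMeters(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371000; // mean Earth radius in meters
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Activate the shared experience once the user is inside the anchor's radius.
function shouldActivate(anchor: GeoAnchor, userLat: number, userLon: number): boolean {
  return distanceMeters(anchor.latitude, anchor.longitude, userLat, userLon) <= anchor.radiusMeters;
}
```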
11:20 AM - 11:45 AM
This talk will address several common problems associated with traditional systems in medicine and how Extended Reality technologies are solving them. This will include the use of Virtual Reality for preoperative planning and surgical simulation in neurosurgery and the impact of virtual reality on patient consultation and informed consent. Further attention will be given to the deployment of Augmented Reality technologies to help make brain and spine surgery safer and less invasive, including recent published data on early outcomes. Finally, we will cover broad scale development and deployment of Immersive Therapeutics for pain and anxiety as well as a comprehensive program for maternal mental health.
11:20 AM - 11:45 AM
Aerospace manufacturing and maintenance, in even the most modern contexts, is dependent upon the successful execution of often complex and technically challenging processes by talented and properly trained technicians. Traditionally, learning these skills has required extensive classroom, lab, and on-the-job training based on static presentations and reference manuals. Once developed, deployment of these skills on the assembly line or in the maintenance hangar has further relied on “rectangular” reference information to provide guidance as to locations, dimensions, or process sequences critical to achieving quality results. What could a new approach using XR-based, contextually aware, and interactive presentations of technical information, in-situ, contribute to support initial learning and development for new employees, recurrent and differences training, and to provide “heads-up” guidance to the technician on process flow and critical reminders? How might additional information fed from integrated IoT-enabled hand tools to this presentation provide further knowledge to the technician, enabling greater autonomy, decision making, and higher quality outcomes at lower cost? And how could aggregation of this data into modern QMS and ERP systems efficiently record as-built quality data and unlock opportunities for process optimization? As aerospace manufacturing and maintenance endeavors to become more agile and the industry solves the simultaneous challenges of generational change and scaling the workforce necessary to enable new markets like Advanced Aerial Mobility (AAM), we believe that XR and IoT have much to offer the industry, and particularly the talent, of tomorrow.
11:20 AM - 12:15 PM
The basis of human creativity is to discover new possibilities for expression. Today, emerging technologies empower artists to find new, accessible ways to showcase their creations and inspire their audiences by deviating from conventional forms. Many artists are finding holographic interfaces to be the ideal canvas for making and sharing 3D art. In this panel, we'll learn how 3D artists are adapting their work and creating new pieces for cutting edge holographic technologies today.
11:20 AM - 12:15 PM
Working across different time zones with overseas partners is nothing new to designers and engineers. During the pandemic shutdown, many of them learned to perfect these remote protocols, powered by affordable AR/VR hardware and software. As U.S. businesses get ready to return to the conference rooms and cubicles that have been sitting idle for the past 12-15 months, we ask a panel of AR/VR veterans to envision the role of AR/VR in the post-pandemic world.
11:20 AM - 12:15 PM
One of the key challenges for augmented reality is the development of ultra-compact, lightweight, low-power near-to-eye display solutions with good image quality. Laser Beam Scanning (LBS) technology shows great promise to meet these key requirements and deliver form factors that enable lightweight, fashionable, all-day wearable AR smart glasses with the ability to scale resolution and field of view (FoV) at low power consumption.
However, to enable the development of devices, products, and solutions, a broad range of technologies needs to converge holistically, which necessitates an ecosystem. Toward that end, the LaSAR Alliance was recently created. This non-profit organization seeks to create the environment, platform, and structure for companies and organizations to collaborate on solutions that facilitate the growth of the augmented reality market.
In this panel, experts from leading companies and LaSAR Alliance members (including ST, OSRAM, AMAT, Dispelix, and Oqmented) who are developing the critical technology bricks come together to discuss the challenges, opportunities, and outlook for LBS solutions enabling near-to-eye displays for integration into all-day wearable devices.
11:20 AM - 12:15 PM
Does it matter whether a momentary event is physical, digital, or virtual -- or is an experience simply an experience?
Our panel brings together leading designers from both the traditional experience world (e.g., theme parks) and the immersive experience world (e.g., VR) to debate this exact topic and determine whether each needs its own tools or whether -- and how -- the same definitions, toolsets, taxonomies, and frameworks might be applied to both. Where do they fall short? Where do they reach too far? Is “experience” already the same but merely without a shared language? How can we best use what already exists so that, as we create in XR, we’re standing on the shoulders of giants? Our panelists will conclude with concrete answers, providing tools you can use today to create the experiences of tomorrow.
11:50 AM - 12:15 PM
The Virtual Dimension Center (VDC) is Germany's leading network for virtual and augmented reality. Last year, VDC began setting up a V/AR Measurement Lab and creating a benchmarking environment for headsets, tracking devices, and AR marker tracking libraries. The published results include, among other things, measurements of image sharpness, image contrast, field of view, and stereoscopic range for headsets, as well as the precision and working range of tracking devices.
01:00 PM - 02:30 PM
BRCvr World Tours take you on an interactive experience of five worlds, featuring the creative talent and ingenuity of the BRC community of World Builders and 3D Modelers, all part of the 2021 Virtual Burn. Join Kallisti Dawn and MisterMeta4 on two tours organized for AWE 2021! We’ll meet at the BRCvr 2021 Hub World, then go on a playa wander through five BRCvr Zones and Worlds. Come play in the digital dust with us!
BRCvr 2021 - HUB WORLD:
https://account.altvr.com/worlds/1436590711179837804/spaces/1546071944607040464
Where: Enter Code TAS579
BRCvr.org/RSVP/
PLAN ahead. First time visiting BRCvr? Give yourself at least 30 minutes before our BRCvr World Tour to set up your AltspaceVR account, follow the tutorial, and customize your avatar - all FREE. Be sure to enable Worlds Beta and Early Access in your settings. Then meet BRCvr’s KallistiDawn and MisterMeta4 in the BRCvr 2021 - HUB WORLD. See you there!
Join on VR headsets, Windows Mixed Reality, or in 2D mode on a desktop or laptop.
01:15 PM - 01:40 PM
The push towards integrative medicine and preventative care is changing the way we think about what healthcare is and how we access it. In this presentation, OVR Technology founder Aaron Wisniewski will talk about the role Olfactory Virtual Reality plays in integrative health and the recently published study outlining the benefits, capabilities, and limitations of this exciting new platform for health and wellness.
01:15 PM - 01:40 PM
Construction is a wasteful industry ripe for improvement: half of the energy consumed by buildings is wasted, and half of construction costs are waste. This is exacerbated by a chronic shortage of skilled construction labor. Our attempts to mitigate these three crises in sustainability, affordability, and equity lead us to seek new solutions to enable more effective decision-making as we digitize our construction processes. Augmented Reality enables our construction installation crews to leverage spatially-contextual data just-in-time and just-in-place, at the point of action where and when it matters most.
We will discuss a recent joint pilot project by McKinstry and Spectar, in which we used immersive AR hardware and software to support several AR use cases and objectively measure their impact on commercial construction. In addition to empowering our construction managers to increase engagement with the owner and other project stakeholders, we leveraged AR to improve installation and quality control. Our AR-based hanger installation effort is especially noteworthy, as it enabled us to address sustainability through paperless construction, affordability by significantly reducing time to complete, and equity by empowering lower-skilled labor crews. Finally, we have identified other new use cases for AR, supporting prefabrication in the shop as well as facility operations and maintenance.
01:15 PM - 02:10 PM
"Where Am I? Why am I here? What does this mean?" Literary theorist Nicholas Burbules posed these as essential questions for those experiencing interactive and immersive media experiences. He argued that audiences should not only be asking themselves this, but should also be motivated to "do the work" of finding answers as they negotiate the spatially arranged content of interactive media experiences. This discussion panel will explore the use of space in XR storytelling. Particularly, how can we leverage new and emerging technologies to harness the potential of real, conceptual and virtual spaces to tell engaging ethnographic stories in new modes? This inter-disciplinary discussion will touch on insights from fields as diverse as anthropology, human-centered design, filmmaking, and emotional geography to investigate how XR authorship must not only navigate the spatial dimensions of human relationships and communities within our narratives, but also the spatial relationships between viewers/users ("vusers") and the ethnographic content they engage in XR experiences. As an emergent field, XR storytelling is at a critical juncture in its development, and this conversation will hopefully help map out the contours of what matters - and what makes current trends in immersive XR experiences such a unique opportunity in the history of telling stories about people and communities.
01:45 PM - 02:10 PM
In this talk I will describe how AI-driven 3D avatars will help us enable the metaverse. I will discuss how digital humans will impact the future of communication, human-machine interaction, and content creation. I will present Pinscreen's latest technology for digitizing a 3D avatar from a single photo, and give a live demonstration. I will also showcase how we use hybrid CG and neural rendering solutions for real-time applications in next-generation virtual assistant and virtual production pipelines, along with a demo of a fully autonomous virtual human that we have developed at Pinscreen. I will then present a real-time teleportation system that uses only a single webcam as input, and our latest efforts at UC Berkeley on real-time AI synthesis of entire scenes using NeRF representations.