10:30 AM - 10:55 AM
Through countless partnerships, ZEPETO is home to virtual versions of celebrities like Selena Gomez, BLACKPINK, Twice, and more. From avatar-driven official music videos to virtual merchandise and virtual meet-and-greets, real-world celebrities are embracing ZEPETO as they double down on metaverse strategies. However, there is also a rapidly growing community of influencers in ZEPETO who are avatar natives: from a virtual fashion designer who earns a six-figure income selling clothes on ZEPETO, to virtual influencers who land deals with MCNs that discovered them through their ZEPETO activity. This community of creators, nearly 1 million strong, uses ZEPETO exclusively or near-exclusively for their online output.
10:00 AM - 10:25 AM
During the live session, we will dive into the meaning and application of algorithmic diversity in technology, including cases such as neurodiversity. Drawing on the latest experiences, cases, and research, we will analyze the current state and problems of inclusive innovation and technology: representation and criteria, inclusive research and design thinking, the building of inclusive products (AI-driven platforms, devices, apps, social and emotional robotics), ethical considerations and concerns (the "black-box" and "double-check" problems, transparency, explainability, fairness, surveillance), and the shortcomings of current technology ecosystems, policies, and human-rights frameworks.
11:20 AM - 11:45 AM
Conversational AI systems now have the power to give intelligent, empathetic voices to all types of consumer electronics, including XR devices, vehicles, robots, and more. SapientX powers state-of-the-art, photo-realistic, purpose-driven characters and virtual assistants that can be embodied directly in popular game engines such as Unity and Unreal, including support for Epic's new MetaHumans and other high-resolution characters. Find out what it's like to have an intelligent conversation with a hologram (Meta-Ori!) and what's next for AI-powered digital humans.
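For intuition, here is a minimal sketch of the recognize, understand, respond, animate loop behind an embodied voice assistant of this kind. All class and method names are illustrative stand-ins, not the SapientX SDK.

```python
# Hypothetical sketch of an embodied voice-assistant loop. The names here
# are illustrative assumptions, not any vendor's actual API.
from dataclasses import dataclass


@dataclass
class Reply:
    text: str
    visemes: list[str]  # mouth shapes the game engine uses for lip-sync


class DialogueEngine:
    """Toy keyword matcher standing in for the real NLU/dialogue layer."""
    INTENTS = {"hello": Reply("Hi there! How can I help?", ["HH", "AY"])}
    FALLBACK = Reply("Sorry, could you rephrase that?", ["S", "AA"])

    def respond(self, utterance: str) -> Reply:
        for keyword, reply in self.INTENTS.items():
            if keyword in utterance.lower():
                return reply
        return self.FALLBACK


class AvatarBridge:
    """Stand-in for the link to a Unity/Unreal character (socket, plugin, ...)."""

    def speak(self, reply: Reply) -> None:
        # A real system would synthesize speech and stream the audio plus the
        # viseme track to the engine so the character's face animates in sync.
        print(f"[avatar] {reply.text} (visemes: {reply.visemes})")


if __name__ == "__main__":
    engine, avatar = DialogueEngine(), AvatarBridge()
    avatar.speak(engine.respond("Hello, hologram!"))
```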
01:15 PM - 01:40 PM
Technology has revolutionized entire industries. But despite advancements in computing, commerce, and connectivity, healthcare has lagged behind. To democratize the healthcare industry, address the impending shortage of skilled physicians, and reduce surgical error for improved patient outcomes, immersive and intelligent technologies must be integrated into the operating room.
Discover how Proprio is collaborating with groundbreaking industry leaders at world-class medical centers like UCSF to democratize healthcare using XR, AI, and real-time volumetric imaging to recode and redesign the practice of medicine. The future of surgery is finally here.
* Joint presentation with Dr. Rajiv Saigal
01:45 PM - 02:10 PM
Should robots feel emotions? More importantly, would these emotions help them perform their tasks better? In humans, emotions have historically played a vital role in the way we survived and evolved as a species. Moreover, emotions are an indispensable part of the fabric of intelligence and critical functioning. No wonder it has been a long-held view among leading experts in artificial intelligence that infusing robots and intelligent systems with emotions would greatly heighten their capabilities, both in the roles they were designed for and beyond.
"I don't personally believe we can design or build autonomous intelligent machines without them having emotions." Yann LeCun - Chief Ai Scientist at Facebook silver professor winner of the Turing award in 2019.
Affective computing, born in 1995, relies on big data to recognize, process, and simulate human emotions; however, it does not provide machines with their own emotional response. In contrast, Emoshape aims to encode real-time emotions in AI using Emoshape's EPU (Emotion Processing Unit) through Emotion Synthesis. Instead of the bootstrap approach of Google's DeepMind (1, 2), Emoshape's vision is to teach the machine to preserve biological life above mechanical life.
Through the combination of cloud learning and a novel microchip, Emoshape's patented technology can be ported into any existing AI or robot. The EPU synthesizes an emotional state within the machine, enabling real-time language appraisal, emotion-based reasoning, and emotional personality development. This makes possible intelligent machines that understand how words are associated with feelings and can respond with the natural empathy of a human. That response is also reflected in their vocal and visual expression, which plays an important role in audiovisual speech communication, increasing comprehension and creating a soothing environment or a trust and loyalty connection, with wide applications in the entertainment, education, healthcare, and industrial sectors. Moreover, emotion improves situational body-consciousness, which is particularly relevant to real-world autonomous intelligent machines, whether in robots, the metaverse, games, or elsewhere.
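As a rough illustration of what real-time language appraisal feeding a persistent emotional state could look like, here is a toy sketch. The emotion set, lexicon, and decay model are assumptions made for illustration only; they are not Emoshape's actual algorithm or the EPU's API.

```python
# Toy model of word-level emotional appraisal with a decaying internal state.
# The emotions, lexicon values, and decay factor are invented for illustration.
EMOTIONS = ("joy", "fear", "anger", "trust")

# Hypothetical appraisal lexicon: word -> per-emotion stimulus strengths.
LEXICON = {
    "wonderful": {"joy": 0.8, "trust": 0.3},
    "danger": {"fear": 0.9},
    "liar": {"anger": 0.7, "trust": -0.5},
}


class EmotionState:
    """Emotional state that is excited by appraised words and fades over time."""

    def __init__(self, decay: float = 0.9):
        self.decay = decay
        self.levels = {e: 0.0 for e in EMOTIONS}

    def appraise(self, utterance: str) -> None:
        # Each recognized word nudges the corresponding emotion levels.
        for word in utterance.lower().split():
            for emotion, strength in LEXICON.get(word, {}).items():
                self.levels[emotion] += strength
        # Clamp so repeated stimuli saturate rather than grow without bound.
        for e in EMOTIONS:
            self.levels[e] = max(-1.0, min(1.0, self.levels[e]))

    def tick(self) -> None:
        # Emotions fade between stimuli: a mood persists but is not permanent.
        for e in EMOTIONS:
            self.levels[e] *= self.decay

    def dominant(self) -> str:
        return max(self.levels, key=self.levels.get)


if __name__ == "__main__":
    state = EmotionState()
    state.appraise("what a wonderful day")
    print(state.dominant(), state.levels)  # joy dominates
```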