Holographic headsets are a central theme of augmented reality (AR) today, but the recent AR in Action Conference demonstrated the diversity of the field and the potential for many more technologies to augment humans.
The AR in Action Conference, held at the MIT Media Lab, expanded the definition of AR through a TED conference-like lens, delivering 70 diverse curated talks and 32 panels over two days to over 1,000 experts and practitioners in the field. Panelists included Mark Sage, AR for Enterprise Alliance Corp.; David Smith, Wearality; Patrick Ryan, Newport News Shipbuilding; Mark Kirby, Liberty Mutual Insurance Group; and Christine Perry, Perry Consulting--pictured left to right above.
As Chris Croteau, general manager of Intel’s Wearable Device Group, said:
“A liberal definition of AR focuses on the way data is presented to users and how they interact with it. The popular definition of the AR platform as a holographic projection system like the HoloLens, Meta and ODG headsets limits what AR can be.”
The value of these holographic projection headsets was well represented in talks like the one by Harvard Medical School professor Jayender Jagadeesan, who spoke about surgical AR applications, and by combat fighter pilot Patrick Guinee, who spoke about an enhanced 360-degree field of view that lets a pilot look down, through his own body and the fuselage underfoot, to see adversaries below.
“The conference widened my myopic aperture to the universal possibilities,” said Guinee, who has 25 years of experience with holographic AR military applications.
He pointed out that AR can be visual, fitting the popular paradigm of holographic projections overlaid onto reality through a headset. But in essence, AR is the overlay of data sources that affect human perception. That overlay need not be visual: it could be sound, or the haptic perception of electromechanical stimuli, and even that list understates the breadth of this emerging field.
Digitally augmented humans
AR can also change human perception of the surrounding space when a digitally augmented human interacts with a data source, such as digitally enhanced senses or accumulated data from surrounding IoT infrastructure.
Juan Enriquez, genomic and life sciences visionary, writer and frequent TED conference speaker, and Carl Byers, chief strategy officer of Contextere, explained the seemingly infinite possibilities of overlaid data, augmented perception and enhanced human capability that define AR.
“The potential of AR reminds me of the early definition of the internet: thousands of people loosely connected. A few decades later AR is a similar network effect that augments people’s cognitive capabilities and perceptions through their interconnecting data acquired from biosensors, worn sensors and the space around them.”
“The talks reinforced my view of AR as a broader concept. It is a suite of technologies that extend human capability and understanding using a variety of approaches across industries and applications.”
Pattie Maes, head of the MIT Media Lab’s Fluid Interfaces group, wrapped AR in a multidisciplinary explanation of the evolution of AR design and the development of assistive-technology user interfaces and interactions that integrate with a user’s mind, body and behavior.
The talk about the evolution from human to cyborg made a striking point about disabilities becoming augmented abilities with the example of double leg amputee Hugh Herr, an MIT professor of biomechatronics who has created breakthrough bionic limbs that enhance human mobility, like the blades Olympic runner Oscar Pistorius wears. The line between disability and ability begins to blur with augmentation, which at this point in time may only partially remedy a disability but, like Pistorius’ blades, could in the future exceed normal human capabilities to interact with the surrounding space. We could have bionic eyes that exceed human vision, and augmented cognition in which everything is remembered and everything in the environment is contextually understood.
Perhaps the term metasensory augmentation, coined by Steve Mann, the father of wearables whose work dates to the 1980s, might replace the term AR. Mann talked about transducing emitted energy into visual applications and about the brain-computer interface, and he also emphasized the history of wearables.
Other speakers gave balance to the conference’s enthusiasm, including Monique Morrow, co-chair of the IEEE Global Initiative on Ethical Considerations for Mixed Reality Committee. She describes herself as the elephant in the room who is both excited about the possibilities of AR and wary of the potential for abuse. She is concerned with ethical considerations in an AR world driven by intelligent systems observing behaviors and sensing physical phenomena continuously to provide individuals with appropriate content. Could inequities arise if monopolies such as Facebook and Google emerge that monetize this data? What are the ethical surveillance considerations?
Enriquez and Ethernet inventor Bob Metcalfe also said a killer AR app has not emerged, unless the definition is stretched to include Pokémon Go. And Croteau explained that systems designers do not yet have all the tools and components to build light, unobtrusive and highly functional AR wearables with good battery life that he equated to the early smartphones. But AR applications are being developed to solve a broad range of problems in fields such as emergency response, architecture and the Internet of Things.
Metcalfe applied his many years of experience as an inventor, entrepreneur and advisor in summing up the conference:
“AR technology is on the verge of happening. The timing of the conference fits with the stage of the development of AR because at times like this, people get together to give the industry a vector for direction, at least for a while.”
Fifty videos will be posted to the conference website soon. Readers interested in learning more should check the conference website: arinaction.org.