Augmented Reality: Beneath the Radar
Design Trends and the Transformation of
Everything
By Darryl Wright for Mouser Electronics
AR at a Glance
Key Facts
- In a recent article by In’saneLab, Antoni Zolciak cites data showing that the number of VR/AR
users has increased from less than a million to more than 150 million in the past four years.
- A May 3, 2018, Information Age article projects that the top applications for AR over the next few years
will be in the gaming, health care, and engineering spheres.
- A recent study by Capgemini Research Institute, focusing on the use of AR and VR in the manufacturing,
automotive, and utilities industries, found that 31 percent of companies use VR or AR to view digital
technical content, 30 percent use them to remotely connect users with experts, and 30 percent use them to
view components digitally.
Many people still think of augmented reality (AR) and virtual reality (VR) as gaming technologies, but
increasingly, AR/VR applications are appearing in other places. Health care is seeing increased use of AR tools to
support remote consultations, remote surgery, and education. Automotive, aerospace, and other engineering-intensive
industries are adopting AR solutions for design visualization and collaboration. A growing number of AR applications
are being used in retail as visualization aids for consumers. Many see AR as eventually playing a key role in how we
interact with the world around us—to the point that we take personalized AR enhancements for granted.
Challenges exist in reaching that goal, including the data and compute-intensive nature of many AR applications, the
design challenge of optimizing AR experiences for their intended purpose, and the convergence of technologies
required to make AR’s utility broadly scalable.
Virtual and augmented reality are not new, but they are more capable and more pervasive in our everyday lives than
ever before. Although we’re not on the verge of routinely walking around in a real-time, digitally enhanced
world, the convergence of technologies needed to make this kind of enhancement a normal part of our lives is
happening piece by piece.
Flavors of Digital Enhancement
To understand where the technology is now and where it’s going, it’s useful to review the three
different flavors of digital enhancement:
Virtual Reality
This is a totally immersive reality. It replaces everything that’s around you, and it immerses you in a new
reality, whether you’re moving and interacting with objects or just looking around. Typically, it offers a
360-degree view of a virtual world. The most common applications of virtual reality (VR) today are in gaming,
although VR apps are beginning to appear, such as Google Earth VR, that show potential in education.
Augmented Reality
Augmented reality (AR) adds an informational overlay of digital elements to your live view, enhancing the
real-world scene around you. AR applications typically augment real-world views with data, but the data does not
interact with the real world. One example is a heads-up display (HUD) in a car, which provides speed, navigation,
and other driving information so you don’t have to take your eyes off the road. Snapchat photo filters,
phone-based stargazing applications, and furniture apps that scan a room and drop in properly scaled images of
furniture from the IKEA catalog are other familiar examples. AR applications are also beginning to see wide use in
engineering, education, and health care. For example, AR applications allow a surgeon to see blood pressure and
other vital patient information in real time without having to look up from a surgical procedure, and design
engineers use AR to see objects as they would appear in the real world.
Mixed Reality
Mixed reality (MR) is a combination of VR and AR in which virtual objects are anchored in the real world. Examples
include interior design apps and automobile apps that let users configure their dream car on the showroom floor.
MR is also used in medical applications where manipulating a real-world object also manipulates data associated with
that object, such as a 3D-image overlay. Other applications include engineering and equipment maintenance. In
some applications, the distinction between AR and MR is blurred, and many consider MR a variation of AR.
AR: A Convergence of Technologies
Mouser Manufacturers Leading the Way
- Intel’s Augmented Reality brings Red Bull Rampage to your living room.
- TDK acquires sonar-on-a-chip inventor Chirp Microsystems to expand its portfolio of VR/AR sensors.
- Bosch wins Automechanika innovation award for its augmented reality application in technical training.
AR is currently the most widely used non-gaming technology in this family of enhanced realities. VR, AR, and MR all
share certain requirements, including a computing platform capable of combining image data, motion sensing,
real-time display data, and audio data (if audio is part of the application), along with a means of viewing the
virtual or augmented reality images. Today’s AR solutions run on a variety of visualization hardware,
including computer screens, smartphones, and wireless smart glasses, such as HoloLens and Magic Leap, that are used
in AR and MR applications. HUD systems project data overlays on a transparent medium, such as the windshield of a
car or, in civilian aircraft, a small viewing screen in front of the pilot’s eyes.

It’s important to recognize that AR is not one technology. It is a convergence of technologies, and the more
sophisticated the application, the more technical integration must take place. For example, an AR retail
application that involves price comparisons might incorporate Global Positioning System (GPS) data, stored database
data, machine learning, and analytics in addition to the computing platform and visualization technology. Likewise,
the total package needs to be mobile to be practically useful, which in most cases means it must work on a
smartphone.

The engineering constraints of any AR, VR, or MR application are generally dictated by the application’s
functionality and the system hardware’s capabilities. For example, a game running on a computer typically
runs at 45 or 60 frames per second. To avoid motion sickness in a VR version of that game, however, the frame rate
must increase: A VR headset has a screen for each eye, and each screen must display a minimum of 45 frames per
second, so the application now requires a total frame rate of at least 90 frames per second. To get equivalent
image quality in VR, you need to nearly double the frame rate.
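The frame-rate arithmetic above can be sketched in a few lines. This is purely illustrative back-of-the-envelope math, not code from any VR runtime; the 45-frames-per-eye minimum is the figure cited in this article.

```python
# Illustrative sketch of the VR frame-rate math described above.
MIN_PER_EYE_FPS = 45   # minimum per-eye frame rate cited to avoid motion sickness
EYES = 2               # a VR headset has one screen per eye

# Total frames the application must render each second.
total_fps = MIN_PER_EYE_FPS * EYES

# Time budget available to produce each rendered frame, in milliseconds.
frame_budget_ms = 1000.0 / total_fps

print(total_fps)                      # 90
print(round(frame_budget_ms, 1))      # 11.1
```

The roughly 11 ms per-frame budget is what makes VR rendering so much more demanding than the same game at 45 or 60 frames per second on a conventional screen.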
The convergence of technology and performance requirements places a heavy load on computing platforms. Yet relying
on cloud- or network-based components can create latency issues that degrade a real-time AR or MR experience.
That’s one reason designers want as much processing done locally as possible. It’s also why
there’s so much interest in building smartphone-based AR applications: Smartphones are powerful computing
platforms with built-in motion and location functions, cameras, and image processors. The major operating system
(OS) providers all offer development kits for building AR and VR applications: Apple provides ARKit, Google
provides ARCore for Android, and Microsoft recommends Unity 2D and 3D for AR and MR applications that use
HoloLens.

The technological convergence demanded of AR solutions means more than software and hardware integration;
developing these apps requires a breadth of skills. AR and MR solution developers follow the practices of game
developers, working in teams that typically include software developers, hardware specialists, database
specialists, designers, user experience specialists, possibly sound specialists, and production managers. The team
size and skills required depend on the nature of the application and what it must do. It’s worth noting that
the user experience specialist is critical to the success of AR and MR applications: This is the person who makes
the HUD in your car a valuable driving aid rather than a dangerous distraction. AR solutions must be carefully
optimized for their intended purposes.
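The local-versus-cloud argument can be made concrete with a simple budget check. The render time and network round-trip figures below are hypothetical assumptions chosen for illustration, not measurements; only the 90 fps target comes from this article.

```python
# Illustrative latency-budget check: local vs. cloud-assisted processing.
# All millisecond figures except the 90 fps target are assumed for illustration.
FRAME_BUDGET_MS = 1000.0 / 90   # per-frame budget at 90 fps (~11.1 ms)
LOCAL_RENDER_MS = 6.0           # assumed time to process a frame on-device
CLOUD_RTT_MS = 40.0             # assumed round trip to a network-based component

local_total = LOCAL_RENDER_MS                 # everything done locally
cloud_total = LOCAL_RENDER_MS + CLOUD_RTT_MS  # same work plus one network hop

print(local_total <= FRAME_BUDGET_MS)   # True  - fits within the frame budget
print(cloud_total <= FRAME_BUDGET_MS)   # False - misses the frame deadline
```

Even a modest network round trip dwarfs the per-frame budget, which is why designers push processing onto the local device.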
Conclusion
Advances in AR and MR will continue as hardware densities increase, local processing platforms become even more
powerful, wireless connectivity improves, and as development kits continue to improve. In the coming year, keep an
eye on progress in these areas:
- Continual updates to ARKit, ARCore, and Unity
- Smaller, more compact smart glasses for AR and MR applications
- More research into headset controls, such as eye tracking and voice control
- More smartphone-based AR applications
- Efforts to create a more hands-free smartphone-based AR experience by linking more streamlined smart glasses to
phones, with the ultimate goal of making phone-based AR applications usable without holding the phone.
Although 2019 is not likely to see dramatic breakthroughs in AR and MR technologies, these systems will steadily
improve and be more widely applied in many fields.
Darryl Wright is a research engineer at
Georgia Tech Research Institute with over 20 years of industry experience. Mr. Wright created and leads the
“Immersive Computing” group within GTRI, which focuses on augmented and virtual reality (AR/VR) development,
military wargaming simulations, and user experiences and design. He has been the lead designer on several
user-experience focused projects, including projects for the DoD, DARPA, and other governmental and non-governmental
organizations, and developed mobile applications for military use. During the summer months, he mentors high school
students in AR and VR design and application development.