www.pwc.com/technologyforecast
Technology Forecast: Augmented reality
Issue 1, 2016
Toward natural
interfaces
Barry Po of NGRAIN forecasts how authoring solutions will
redefine AR interfaces in the future.
Interview conducted by Vinod Baya
PwC: Barry, can you please
introduce yourself and your
company?
BP: Sure. I’m the senior director of
product and business development at
NGRAIN, which is an industrial
augmented reality [AR] and virtual reality
[VR] company.
NGRAIN started as an interactive 3-D
volumetric rendering engine company,
offering solutions for training and
simulation to the aerospace and defense
industries. During the last five or six
years, NGRAIN has transitioned into the
emerging area of AR and VR for
enterprise applications.
We saw the opportunity for NGRAIN to
take all the technology and IP
[intellectual property] it created for
virtual maintenance training and
technical support in the field and then
apply that to wearable technology like
smartglasses, as well as functions such as
visual recognition and tracking.
PwC: In your marketing material,
NGRAIN makes reference to
“voxel” as something only you use.
What is a voxel and how is it
different from other methods?
BP: In short, a voxel is a 3-D pixel. One
of the ways that NGRAIN’s technology is
unique compared with most 3-D graphics
technologies is that we build our 3-D
content in voxels or as a collection of 3-D
pixels. You can think of voxels as tiny
grains of sand. The same way that you can
build a sand castle out of grains of sand,
you can build 3-D content in voxels.
The use of voxels has many benefits for
people working in 3-D graphics. One is
that voxels are naturally suited for 3-D
printing a design. A 3-D scanner will
typically capture a physical object in the
form of a 3-D point cloud. Generally you
must translate the point cloud to a mesh,
a process called tessellation in 3-D
graphics. We instead translate the point
cloud directly into voxels, a format that
can be rendered on a PC in real time
without any geometric conversion. That
helps retain all the features that were
present in the original geometry.
“For AR to take off, the ability to rapidly create,
manage, edit, and deploy 3-D content is one of the
problems that should be addressed.”
Also, volumetric rendering represents
not just the surfaces of objects but also
their interiors, so objects are fully solid
from surface to core. Polygonal models
in general will model only surfaces. We
can model any equipment or material
that has an interior density of
some kind.
PwC: What are some use cases
that your solutions support?
BP: One use case we support is part
familiarization. For a novice technician
who needs to work on a piece of
equipment, our technology will provide
access to information about individual
parts that the technician might have
never seen or touched before.
That’s very valuable in the field, as
workers now have the ability to perform
at a level of expertise that’s much higher
than what they otherwise would be able
to do. Also, they don’t need to carry
technical manuals or rely on outdated
publications. Current information is
available to them on demand.
also collect and document information
as they’re working on the equipment.
Another use case we support is
automated damage assessment and
visual inspection. This capability is
particularly useful in manufacturing
situations where products require very
tight levels of tolerance. Lockheed
Martin is one of NGRAIN’s customers,
and the company deployed damage
assessment technology operationally for
use in the F-35 and the F-22 fighter
programs. The maintenance folks can
perform diagnostics and damage
assessment in real time after fighter jets
come in from a flight.
PwC: Are these solutions on
tablets or smartglasses?
BP: Today they are used on tablets. We
have been exploring how to bring them
to smartglasses. Data processing and
overall functionality present no real
barriers to porting the software.
Porting interaction is the interesting
issue, because the way people work or
annotate on a tablet will not be the same
as the way they annotate using a pair of
smartglasses. That’s certainly one of the
areas where further work must be done.
PwC: What is the challenge with
authoring AR content?
BP: If you look at the history of 3-D
computer graphics, you’ll see that
creating 3-D content is a very difficult
problem. You don’t need to look much
further than work in game development
or visual special effects in the
entertainment industry. The amount of
production effort that’s required to
create meaningful content is enormous.
For AR to take off, the ability to rapidly
create, manage, edit, and deploy 3-D
content is one of the problems that
should be addressed.
PwC: How are you addressing
this challenge?
BP: Our authoring solution, NGRAIN
Vergence, pulls from our years of
experience building 3-D content for
simulation and training systems. Our
objective was to create a tool that people
other than 3-D graphics authoring
experts and 3-D modelers can use. The
reality is that enterprises probably don’t
have very many 3-D graphics experts,
“The system will know what you are doing, and it will
anticipate what you will do next. This capability will
move the interaction from deterministic to
probabilistic scenarios.”
but they do have a lot of domain experts
or subject matter experts.
Vergence allows you to create and
deploy AR and VR content without
writing any code. You can import 3-D
content from a variety of sources, such
as a CAD [computer-aided design]
model, 3-D point cloud scans, and other
methods, and then link it to enterprise
knowledge. This knowledge can be
instructional content, step-by-step
directions, checklists, videos, charts, or
real-time sensor information. All of that
can be tied into a 3-D asset.
Then you can deploy the application to
the device of your choice, whether that’s
a mobile device, a desktop, or perhaps a
pair of smartglasses.
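A minimal sketch of what linking a 3-D asset to enterprise knowledge might look like, using hypothetical names and plain Python dataclasses (Vergence's actual data model is not described in this interview):

```python
from dataclasses import dataclass, field

@dataclass
class Knowledge:
    """One piece of enterprise knowledge tied to a part of a 3-D asset."""
    kind: str      # e.g. "instructions", "checklist", "video", "sensor"
    source: str    # document ID, URL, or sensor channel

@dataclass
class AssetPart:
    name: str
    knowledge: list = field(default_factory=list)

    def link(self, kind, source):
        self.knowledge.append(Knowledge(kind, source))

# A pump impeller that carries step-by-step directions and a live sensor feed.
impeller = AssetPart("impeller")
impeller.link("instructions", "doc/impeller-replacement-steps")
impeller.link("sensor", "telemetry/pump-7/vibration")

print([k.kind for k in impeller.knowledge])  # ['instructions', 'sensor']
```

The point of the structure is that the 3-D geometry and the knowledge attached to it stay separate, so the same asset can be deployed to a tablet, a desktop, or smartglasses without re-authoring the content.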
PwC: You mentioned that porting
interfaces is one sticky point. How
do you expect the interfaces for
AR to evolve?
BP: Fundamentally, the UI [user
interface] has not changed very much
since the mouse and keyboard. There
has been movement toward touch
interfaces during the last five or six
years. There has been some
advancement in speech interfaces
as well.
There has been some work in the use of
gesture interfaces in AR and VR, such as
the six degrees of freedom input or
multidimensional input methods. But
the problem of building an interaction
language that can be used on a device
that works in the real world is not fully
solved yet. In addition, the solution
must be robust enough to work in a
general-purpose setting, including low
light or bright light, indoors or outdoors,
and so on.
I believe we as an industry will evolve to
a point where the interface isn’t necessarily
about an interaction language. Instead,
we will create systems that are smart
enough or intelligent enough to
interpret the context in which you’re
working. The system will know what you
are doing, and it will anticipate what you
will do next. This capability will move
the interaction from deterministic to
probabilistic scenarios.
Ultimately, the deterministic and
probabilistic will combine, so the system
will rely on not just gesture alone, but a
combination of context, voice, gestures,
touch, and so forth as the means of rich
interaction. The interactions will
become truly natural. I think such
technology is on its way.
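One toy way to picture this shift from deterministic to probabilistic interaction, purely as an illustration and not any shipping system, is a first-order Markov model that ranks a technician's likely next actions from previously observed work sequences:

```python
from collections import Counter, defaultdict

class NextActionModel:
    """Rank a user's likely next actions given the current one.

    A deterministic interface maps each input to one fixed command;
    here the mapping becomes a ranked, probabilistic guess learned
    from observed action sequences.
    """
    def __init__(self):
        self.transitions = defaultdict(Counter)

    def observe(self, sequence):
        # Count every (current action -> next action) pair.
        for current, nxt in zip(sequence, sequence[1:]):
            self.transitions[current][nxt] += 1

    def predict(self, current):
        counts = self.transitions[current]
        total = sum(counts.values())
        if not total:
            return []
        return [(action, n / total) for action, n in counts.most_common()]

model = NextActionModel()
# Two observed maintenance sessions.
model.observe(["open_panel", "inspect_part", "log_damage"])
model.observe(["open_panel", "inspect_part", "replace_part"])
print(model.predict("open_panel"))   # [('inspect_part', 1.0)]
```

In a real system the "current action" would be replaced by richer context, such as voice, gesture, gaze, and the state of the equipment, but the structural move is the same: the system proposes what you probably want next instead of waiting for an exact command.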
PwC: Where is the industry today
on this journey?
BP: We’re getting there. Today we have
the ability to do what is called optically
transparent AR—the ability to use the
tracking technology in conjunction with
our rendering engine to overlay graphics
on top of the physical pieces of
equipment. That is possible today.
However, we don’t quite yet have the
ability to overlay information in context
and dynamically respond to the
environment as it changes. As the
tracking capability gets better—such as
tracking a 3-D object in 3-D space—then
the ability to figure out the state the
object is in gets better and so does our
ability to render the right content on top
of the objects.
“For AR, the industry will move away from the notion
of apps or bite-sized solutions that work in isolation
from one another, and toward a new paradigm of a
fully integrated system that is interconnected and
available through common natural interfaces.”
PwC: NGRAIN creates an
authoring tool for building AR
applications. What are you
learning about the nature of AR
applications? Will they be like
mobile apps or will they be very
different?
BP: I think the very nature of what an
app is probably will change. The reality
is that for people working in the field,
there is no notion of an app. What
they’re really looking for is a
personal assistant.
For AR, the industry will move away
from the notion of apps or bite-sized
solutions that work in isolation from one
another, and toward a new paradigm of
a fully integrated system that is
interconnected and available through
common natural interfaces.
Obviously, the engineering work that’s
required to make that happen will be a
significant investment. But the tools and
technologies that are necessary to build
such an integrated system already exist.
To have a deeper conversation about augmented reality,
please contact:
Gerard Verweij
Principal and US Technology
Consulting Leader
+1 (617) 530 7015
[email protected]
Vinod Baya
Director, Center for
Technology and Innovation
+1 (408) 817 5714
[email protected]
Chris Curran
Chief Technologist
+1 (214) 754 5055
[email protected]
About PwC’s Technology Forecast
Published by PwC’s Center for Technology
and Innovation (CTI), the Technology
Forecast explores emerging technologies
and trends to help business and technology
executives develop strategies to capitalize
on technology opportunities.
Recent issues of the Technology Forecast
have explored a number of emerging
technologies and topics that have
ultimately become many of today’s leading
technology and business issues. To learn
more about the Technology Forecast, visit
www.pwc.com/technologyforecast.
© 2016 PwC. All rights reserved. PwC refers to the US member firm or one of its
subsidiaries or affiliates, and may sometimes refer to the PwC network. Each member
firm is a separate legal entity. Please see www.pwc.com/structure for further details. This
content is for general information purposes only, and should not be used as a substitute
for consultation with professional advisors.