Augmenting Reality
Transforming communication for the first responder
- By Jeff Schneider
- Nov 01, 2010
Collaboration is an essential feature of all office environments. However, what happens when your office is not a cubicle, but an emergency operations center that has sprung up amidst the tremendous devastation of a hurricane, tsunami or battlefield? What tools do you need when your decisions are guided not by the eight-hour workday but by the minutes or seconds that remain before your team is to be deployed?
In these situations, collaboration quickly transforms from a business best practice into a means of survival for fellow human beings. The brave men and women around the world who serve as first responders carry immense responsibility, and access to immediate and secure collaboration is vital to command and control -- and ultimately to the success of these missions.
My job is to provide the tools that first responders need to successfully complete these missions. The most substantial step we've taken in developing command and control solutions for first responders is adopting the Natural User Interface, or NUI, focusing on current and emerging technologies and platforms such as Microsoft Surface and Microsoft Windows 7, as well as mobile development on devices like the iPhone and BlackBerry.
What is NUI?
NUI allows users to interact directly with content using common hand gestures. It removes the constraints of the application's graphical framework, replacing them with a user interface that is both physical and invisible. Users perform tasks through direct touch, and the content becomes the interface.
NUI eliminates the proxy controls of the keyboard and the mouse, and it increases the user's overall immersion. NUI applications are also navigated differently: users find their way intuitively through assumption, trial and error, and learned interactions. NUI combines the concepts of reality and super-reality. For instance, if the user has a photograph lying on the table, he or she can move it, rotate it and flip it. That is reality. However, with NUI you can perform those actions, as well as other actions that seem intuitive, in a computer-based environment -- for instance, increasing or decreasing the size of the image with certain hand gestures. That is super-reality.
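To make the super-reality idea concrete, here is a minimal sketch of how a two-finger pinch can be mapped to a scale factor for an on-screen photograph. It is purely illustrative (Python, not the Surface or mobile SDKs the article's platforms would actually use), and the function names are my own:

```python
import math

def distance(p1, p2):
    """Euclidean distance between two touch points (x, y)."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def pinch_scale(start_touches, current_touches):
    """Map a two-finger pinch to a scale factor.

    start_touches / current_touches: [(x, y), (x, y)] for the two fingers.
    A scale > 1 grows the image (fingers spread apart); < 1 shrinks it.
    """
    d0 = distance(*start_touches)
    d1 = distance(*current_touches)
    if d0 == 0:  # degenerate start; leave the image unchanged
        return 1.0
    return d1 / d0

# Fingers start 100 px apart and spread to 150 px: the photo grows 1.5x.
print(pinch_scale([(100, 100), (200, 100)], [(75, 100), (225, 100)]))  # 1.5
```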
Command and Control
One of NUI's greatest strengths is its ability to take large amounts of raw data from diverse sources and present it to the user in a simplified, easy-to-understand, visual manner. Because NUI removes the constraints of a GUI-based application and allows direct interaction with content, users can process more information faster, and in proper context. These collaborative tools have the ability to transform communications for first responders.
With a cutting-edge application module developed specifically for first responders, emergency personnel could maintain a constant communication link. With a GPS feed coming back to the command center from the farthest edge of the response, decision makers would have real-time information on the location of staff and equipment assets, and could minimize the time it takes to relocate those assets to the areas where they are needed most. Chat and instant messaging would enable a constant flow of imagery from an iPhone in the field back to the command center, where it could be evaluated immediately and acted upon, reducing the time it takes for orders to flow from decision makers.
These orders could be accompanied by supporting data sets that also flow in real time back to the iPhone in the field. The fluid transfer of information in these scenarios ensures first responders can act quickly and decisively, ultimately saving lives.
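As a rough sketch of this two-way message flow -- the field names and structure here are my own assumptions, not a published protocol -- a field device might periodically report its GPS position, and the command center might answer with an order and its accompanying data set:

```python
import json
import time

def position_report(unit_id, lat, lon):
    """A field device's periodic GPS report back to the command center."""
    return json.dumps({
        "type": "position",
        "unit": unit_id,
        "lat": lat,
        "lon": lon,
        "timestamp": time.time(),
    })

def order_with_dataset(unit_id, order_text, dataset):
    """An order from decision makers, with its supporting data set attached."""
    return json.dumps({
        "type": "order",
        "unit": unit_id,
        "order": order_text,
        "dataset": dataset,  # e.g. hazard zones, ETAs of other assets
    })

# A responder's phone reports in; the command center replies with an order.
print(position_report("medic-7", 29.9511, -90.0715))
print(order_with_dataset("medic-7", "Reroute to staging area B",
                         {"avoid": ["Canal St"], "eta_support_min": 12}))
```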
Challenges
The NUI environment changes the social dynamic. It is gestural, multi-touch and multi-user; it offers a 360-degree canvas and environment; and it supports direct interaction with objects. All of these concepts present new challenges for us as developers. We must work through numerous questions in order to provide a proper social and collaborative environment.
For instance, how do we handle user identification? In a command and control scenario, it is essential to know exactly whose hand is interacting with the application. And after we verify the users, how do we track them, and the specific functions they use, throughout their entire session?
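One way to frame the problem is sketched below: bind each touch contact to a verified user, then attribute every interaction to that user for the life of the session. This is a hypothetical illustration of mine, not a description of our actual implementation:

```python
class SessionTracker:
    """Hypothetical sketch: associate touch contacts with verified users
    and record which functions each user invokes during a session."""

    def __init__(self):
        self.contact_owner = {}  # touch-contact id -> user name
        self.activity = {}       # user name -> list of functions used

    def verify(self, contact_id, user):
        """Bind a touch contact to a user (e.g. via badge, token or login)."""
        self.contact_owner[contact_id] = user
        self.activity.setdefault(user, [])

    def record(self, contact_id, function_name):
        """Attribute an interaction to whoever owns that contact."""
        user = self.contact_owner.get(contact_id, "unknown")
        self.activity.setdefault(user, []).append(function_name)

tracker = SessionTracker()
tracker.verify(contact_id=3, user="watch-officer")
tracker.record(contact_id=3, function_name="move_asset")
tracker.record(contact_id=3, function_name="open_map_layer")
print(tracker.activity)  # {'watch-officer': ['move_asset', 'open_map_layer']}
```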
Session management has the potential to be a big problem in a multi-user environment. Orientation becomes important, and controls need to be fluid and mobile as well as accessible from 360 degrees. We are now designing for both the interaction of the system and the direct interaction of its users.
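One common tabletop technique for the 360-degree orientation problem -- again my own illustration, not necessarily how our applications do it -- is to rotate a control so it faces whichever edge of the table the touching user is standing at:

```python
import math

def face_user(control_xy, touch_xy):
    """Return the rotation (degrees) that orients a tabletop control
    toward the user who touched it, by pointing the control's 'up'
    direction along the vector from the control to the touch."""
    dx = touch_xy[0] - control_xy[0]
    dy = touch_xy[1] - control_xy[1]
    return math.degrees(math.atan2(dy, dx))

# A touch arriving from the far side of the table flips the control around.
print(face_user((500, 500), (500, 100)))  # -90.0: face the "top" edge
```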
In future emergency response scenarios, augmented reality will likely play a vital role in command and control. Command center personnel will be able to overlay data sets directly on top of a live video feed or other imagery. They will be able to send real-time information to a first responder miles away, who will perhaps be wearing a high-tech pair of sunglasses that places those data sets in his field of vision along with whatever is in front of him.
This data, which could conceivably include the arrival time of additional resources or dangerous areas to avoid, will give the first responder vital information as he makes decisions that are literally a matter of life or death.
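A simplified sketch of how a geo-tagged data point could be placed in that field of vision: compute the bearing from the responder to the point, then turn its difference from the wearer's heading into a horizontal screen offset. The field-of-view and screen-width parameters are assumptions for illustration:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def hud_x(bearing, heading, fov_deg=60, screen_w=800):
    """Horizontal pixel position of a marker in a heads-up display,
    or None if the target is outside the wearer's field of view."""
    offset = (bearing - heading + 180) % 360 - 180  # signed angle, -180..180
    if abs(offset) > fov_deg / 2:
        return None
    return screen_w / 2 + (offset / (fov_deg / 2)) * (screen_w / 2)

# A hazard 10 degrees to the right of the wearer's heading lands
# right of center on an 800 px wide display.
b = bearing_deg(29.95, -90.07, 29.96, -90.06)
print(hud_x(b, heading=b - 10))  # ~533.3
```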
Technology evolves at a rapid pace. Data sets are becoming increasingly complex, and the demand for more intelligent systems is growing with them. Looking to the future, the distance between users and content will continue to narrow, and because of this we will see platforms that offer richer tactile and visual functionality. Augmented reality will become prevalent and part of our day-to-day work environment.