Future Technology Trends: What lies ahead for Emergency Services?

In part 2 of this learning series, we take a look at Augmented, Virtual and Mixed Reality.

People often confuse virtual reality (VR) with augmented reality (AR), and to muddy the waters further, mixed reality (MR) has now started to appear on the scene.

Virtual Reality is often used as an umbrella term for technologies that are similar to, but distinct from, a true Virtual Reality experience. So what's the difference between Augmented Reality and Mixed Reality? We've outlined each in more detail below.

Virtual Reality


VR is the most widely known of these technologies. It is fully immersive, which tricks your senses into thinking you’re in a different environment or world apart from the real world. Using a head-mounted display or headset, you’ll experience a computer-generated world of imagery and sounds in which you can manipulate objects and move around using controllers whilst connected to a console or PC.

Augmented Reality

AR overlays digital information on real-world elements. Pokémon GO is probably amongst the best-known examples. Augmented reality keeps the real world central but enhances it with other digital details, layering information onto the screen and supplementing your reality or environment.

Mixed Reality

MR brings together real world and digital elements. In mixed reality, you interact with and manipulate both physical and virtual items and environments, using next-generation sensing and imaging technologies. Mixed Reality allows you to see and immerse yourself in the world around you even as you interact with a virtual environment using your own hands—all without ever removing your headset.

With developments in virtual, augmented and mixed reality continuing to offer new opportunities across a multitude of applications, it is unsurprising that their potential for use within the emergency services is being explored across the globe.

One such example is AUGGMED (Automated Serious Game Scenario Generator for Mixed Reality Training), an online multi-user training platform developed for joint first responder and counter-terrorism training.

AUGGMED

The aim of AUGGMED is to develop a platform that enables single and team-based training of end-users with different levels of expertise, from different organisations, responding to terrorist and organised crime threats.

The platform will automatically generate non-linear scenarios tailored to suit the needs of individual trainees with learning outcomes that will improve the acquisition of emotional management, analytical thinking, problem solving and decision making skills. The game scenarios will include advanced simulations of operational environments, agents, telecommunications and threats, and will be delivered through VR and AR environments with multimodal interfaces.

Virtual reality allows trainees to perform exercises within virtual reconstructions of the real world while interacting with virtual civilians and terrorists, whereas augmented reality allows trainees to see and interact with virtual terrorists and civilians within the real world. Both technologies enable trainees to improve their decision making and give them experience of performing in stressful situations.

In March 2018, security officers with the Piraeus Port Authority in Greece used AUGGMED to train for potential terrorist-related threats. Using augmented reality, on-site trainees in Piraeus worked alongside other trainees working remotely who were experiencing and responding to the same scenario through virtual reality. Together they had to effectively respond to a terrorist incident. This meant they had to assess the nature of the incident, before ensuring the safety of nearby civilians and neutralising the threat.

Trainees from multiple agencies can train simultaneously, enabling collaborative exercises between different disciplines such as police, security personnel and paramedics. AUGGMED has been deployed to improve emergency service work across Europe and has been used by British police officers for critical incident response training.

Leicestershire Fire and Rescue and Virtual Reality

Leicestershire Fire and Rescue Service and RiVR, a virtual reality company, have developed a detailed simulation which allows firefighters to train on site.

The simulations recreate real arson sites in Leicester in exact detail, with some scenes captured using as many as 166 cameras.

The software allows trainees to walk around a warehouse and inspect the street outside, pick up objects, find evidence, assess casualties and even listen and feel for a pulse. While this is happening, a trainer can watch the trainee’s every move on a tablet or desktop computer, viewing the trainee in first-person, third-person and even bird’s-eye perspectives. This means the trainer can give real-time feedback, which can be key to ensuring effective training.

The work by RiVR in Leicestershire has led to the National Fire Chiefs Council recommending that the software be adopted by all UK fire and rescue services; to date, 30 of the 47 services have signed up.

I-REACT - Improving Resilience to Emergencies through Advanced Cyber Technologies

I-REACT is a European project funded within the Horizon 2020 Programme. It builds on the outcomes of the FLOODIS project, which ended in 2015 and focused on a crowdsourcing approach to support emergency response to flooding. FLOODIS implemented a smartphone application to collect real-time reports from both citizens and civil protection agents, and to provide short- and long-term projections of flood extent to support in-field emergency rescue units.

I-REACT exploits the same approach while multiplying the data sources: in addition to photos taken with smartphones, it will also mine social media, capturing messages and images from Instagram and Twitter, and will collect satellite imagery as well as reports from wearable technologies (bands, smart glasses) worn by on-site operators.

I-REACT recently demonstrated its full suite of disaster-response technologies at a flood simulation in Ipswich, hosting a two-day event at which the team engaged with a range of potential end-users and tested its technologies in a realistic scenario.

In collaboration with the Environment Agency and Suffolk Fire & Rescue Service, the team successfully tested all of its tools together for the very first time, including a fully functional mobile app, wearables and smart glasses for first responders, multiple information layers for decision-makers, and an app for citizens.

A Look to the Future

Imagine emergency services first responders navigating to and through incident scenes in emergency vehicles equipped with AR head-up windscreen displays that provide route guidance and real-time sensor data on environmental and hazardous conditions, and wearing helmet-mounted AR devices and visors that allow them to see and hear through smoke, fire, rubble, poor weather and other conditions.

Imagine AR disaster applications that provide visual and audio guidance for citizens seeking refuge, evacuation routes, or emergency assistance in a disaster situation.

Imagine real-time, data-driven AR applications that allow law enforcement officers to access location-specific information on dangerous situations via smart glasses, in-vehicle displays and other wearables.

These scenarios are within reach. As mobile connectivity continues to improve and system interoperability between public sector organisations increases, newer and more efficient ways of working could lead to a safer environment for both the public and the emergency services.

Find out more

If you'd like to find out more about how APD can integrate with your AR applications let us know at hello@apdcomms.com or by calling us on 01482 808300.