AR set to be a cornerstone of BAE’s future naval combat systems

BAE Systems has unveiled a host of new technologies for future naval combat, including augmented reality (AR) glasses for the watch officer on the bridge.


At a media briefing in London on November 22, 2018, the company provided details on the AR system currently under development, as well as a demonstration of the technology in action. According to Frank Cotton, BAE’s head of Technology for Naval Combat Systems, it could be ready for use by the Royal Navy within 12 months.

“In many ways, augmented reality – in an operational sense – has been a solution looking for a problem,” said Cotton. “Finding a safe and secure way of enabling the technology to add value at a reasonable cost is the challenge we picked up. The use case we’re looking at is for a bridge watch officer.”

The watch officer on the bridge of a warship is the person ultimately responsible for the safety of the vessel. While on the bridge, this officer receives audio information from naval personnel in the ship’s operations room, located deep within the heart of the ship. An officer might ask for positional, heading or identification data relating to a specific object or vessel in his or her field of view. The ops team duly provides it, so that the information reaching the officer’s ear should match what the eye can physically see.

Screen representation of how information will be displayed via the AR glasses

“It's a tried and trusted method that the Navy developed over the years, and it works very, very well,” said Cotton. “But it’s not hugely efficient, and the bridge watch officer is quite often overloaded when he can see lots of things out the window and he’s trying to tie them up – via voice – with a set of operators in the bowels of the ship.”

The solution that BAE is developing is a set of lightweight AR glasses that bypass the need for objects in the field of view to be confirmed by the ops team. Based on the same system found in the Striker II helmet-mounted display (HMD) used by Typhoon fighter pilots, the AR glasses deliver information on ships and aircraft direct to the officer, who also has the ability to interact with this data via a simple clicker. BAE is currently using Microsoft’s HoloLens AR headset platform to develop the system, but the plan is to scale the technology down to a more lightweight piece of hardware that can be used untethered.

“Rather than just giving (the officer) a headset with comms on it…let’s give him a set of lightweight augmented reality glasses,” said Cotton. “So that when he looks out the window at a real-world object, instead of having to ask the operators whether or not what he’s seeing is friendly or hostile, whether or not it’s an inflatable boat or a fishing vessel, he can use his glasses and a clicker…to interact with the combat management system directly.

“He can display video of what the object is, he can get classification data…and if what he’s seeing doesn’t match the data that the system is telling him, he’s got the ability to change that data. If he suspects that something is potentially a hazard, he can flag that in the system directly, without having to tell the operators to do that for him.”

According to BAE, the Navy is hoping to trial the technology in operation as early as next year, but the system will need to be refined before it is combat ready. HoloLens isn’t designed to operate in bright sunlight or on a moving platform such as a ship in a swell. Incorporating elements of the Striker II electronics system into a scaled-down AR package should help address these issues. BAE says a prototype should be ready in the first half of 2019, with operational testing planned for the second half of the year.

“Of course once we’ve got the lightweight glasses in operation here (on the bridge), the potential to use them in different applications across the ship is massive,” said Cotton.

Another area that BAE is targeting for change is the operations room itself. The hardware and software used there are functional and reliable, but much of the technology is decades old and ripe for a 21st-century makeover. According to Cotton, the ops room will see evolution rather than revolution, with different methods of human-computer interaction (HCI) trialled and introduced incrementally.


“So, virtual reality, augmented reality, touch control, gesture control, voice activation, these are the types of technology we’re exploring,” he said. “Not all of them will be appropriate for a Royal Navy operations room, but some of them will be.”

Unsurprisingly, artificial intelligence is also on the table. Some of the potential use cases include neural networks that can monitor ship movement and flag unusual patterns of behaviour, and target prioritisation that can maximise the ship’s chances of victory by engaging threats in a specific order.

Elsewhere, third parties including SMEs and academic institutions are being invited to develop software that could plug into BAE’s combat management system and harness the ship’s data. These apps would operate in a ‘sandbox’ environment where they could add value for the operations team without interfering with the function or integrity of the overall combat system. Where adding new software previously required months or even years of approvals, the sandbox approach would enable a much more streamlined integration of new technology.

“It needs to be cheaper and faster to introduce new capability than it is today,” said Cotton. “Typically…it can take 18 months plus to get a new capability onboard a ship.

“What we’re looking for is to define a baseline system that we can bring new capability in much faster, and our model is actually an iPad and the App Store. We’re looking to be able to draw down pre-approved software packages on to the combat system readily and easily.”
