Scalability

Supporting the free flow of data across the userbase to ensure a true common operating picture

  • Supporting data delivery at scale
  • A new approach to scalability

Supporting data delivery at scale

As communications, intelligence, and data become more critical to fulfilling military missions, the requirements for scaling C4ISR capabilities are increasing. Operations can range from discrete small-team actions to multidivisional deployments, and cover scenarios that include peacekeeping, drug interdiction, counterinsurgency, and conventional warfighting.

The resulting information requirements mean that scalable solutions to transferring critical data are growing in importance.

Scalability as a concept within the command-and-control environment is often seen through the prism of software deployment and user rights: how quickly can you increase a network’s size at an affordable cost, while ensuring that the appropriate users have the correct access rights to a software package?

However, when taking a more holistic approach to scalability within the modern military operational environment, a greater number of factors need to be considered, such as data delivery standards, volume, and user permissions. In deploying a C4ISR management system, you will need to ensure that it can handle high volumes of data from an ever-increasing number of sensor systems.

A new approach to scalability

  • Ensuring data transfer in complex communications environments
  • Supporting joint operations
  • Simplifying sensor integration
  • Covering multiple nodes

Key to understanding scalability is the operational environment that modern militaries face when deploying across a variety of static, fixed, and mobile locations. Modern warfighting requires flexibility, with those creating, consuming, and receiving data needing accurate information to make decisions at pace. When the challenges of joint and coalition interoperability are added, ensuring that your systems can scale appropriately becomes a more complex task.

Ensuring data transfer in complex communications environments

As the modern battlefield has come with greater requirements for data transmission and digitised communication, transferring data in more difficult environments has become a higher priority for network managers. These challenges can come in a variety of forms, from disruption and distortion caused by the natural environment to electronic warfare and spectrum fratricide.

Minimising latency and ensuring that the common operating picture is readily refreshed, maintained, and transmitted to fixed, static, and mobile users means that a solid communications backbone needs to be in place. High levels of redundancy are required to support constant connectivity and to ensure that data can move around the network even when individual links are degraded.

Operating a reliable communications backbone can ensure that network connectivity can be maintained in the most challenging of environments.

These backbone systems allow networks to scale across complex and disrupted environments, with redundancy links ensuring that data can still get through. Networks can readily flex and scale to support operations in a contested electromagnetic spectrum (EMS) environment without a detrimental impact on the userbase.
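As an illustration only, the failover behaviour described above can be sketched as preference-ordered path selection: if every link on the preferred path is unavailable, traffic falls back to a redundancy path. All link names and availability states below are hypothetical.

```python
# Hypothetical sketch: selecting an alternative path when primary links are
# degraded or jammed. Link names and availability flags are illustrative.

def best_available_path(paths, link_status):
    """Return the first path (in preference order) whose links are all up."""
    for path in paths:  # paths ordered by preference, e.g. bandwidth/latency
        if all(link_status.get(link, False) for link in path):
            return path
    return None  # no usable path

# Example: SATCOM as the primary bearer, HF and radio mesh as redundancy.
paths = [
    ["hq-satcom", "satcom-fwd"],               # preferred high-bandwidth path
    ["hq-hf", "hf-fwd"],                       # HF fallback
    ["hq-mesh1", "mesh1-mesh2", "mesh2-fwd"],  # multi-hop radio mesh
]
status = {
    "hq-satcom": False,  # SATCOM link jammed
    "hq-hf": True, "hf-fwd": True,
    "hq-mesh1": True, "mesh1-mesh2": True, "mesh2-fwd": True,
}

print(best_available_path(paths, status))  # falls back to the HF path
```

In practice this selection happens continuously at the routing layer rather than per message, but the principle is the same: the userbase keeps its data flow while the network swaps bearers underneath.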

Supporting joint operations

Operating across services, domains, and in alliance with other partners – both military and civilian – can mean that the volume of data being spread across a network can demand rapid scaling capabilities. Within this operating environment, the need to incorporate data transfer protocols into a common architecture means that appropriate API technology and layer filtering is in place to ensure that properly credentialed users can achieve a common level of situational awareness for their areas of responsibility.

API and data exchange technologies help to ensure the smooth-running of large operations and exercises, enabling the greatest distribution of data into a common operating picture.

Data throttling also works to ensure that outputs can support both the strongest and weakest systems in a battlefield network. Similarly, comprehensive layer filtering means that operational pictures can be appropriately sanitised for sharing with users who may have lower clearances.
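A minimal sketch of the layer-filtering idea, assuming a simple ordered classification scheme; the level names and track records are illustrative and do not describe any real product API.

```python
# Hypothetical sketch of layer filtering: release only the common operating
# picture entries that a user's clearance permits. Levels are illustrative.

CLEARANCE_ORDER = ["UNCLASSIFIED", "RESTRICTED", "SECRET", "TOP SECRET"]

def filter_layers(tracks, user_clearance):
    """Keep only tracks classified at or below the user's clearance level."""
    max_level = CLEARANCE_ORDER.index(user_clearance)
    return [t for t in tracks
            if CLEARANCE_ORDER.index(t["classification"]) <= max_level]

tracks = [
    {"id": "T1", "classification": "UNCLASSIFIED"},
    {"id": "T2", "classification": "SECRET"},
    {"id": "T3", "classification": "TOP SECRET"},
]

# A SECRET-cleared user receives a sanitised picture without T3.
print([t["id"] for t in filter_layers(tracks, "SECRET")])  # ['T1', 'T2']
```

Real systems add caveats, releasability markings, and per-coalition rules on top of this, but the core operation is the same: the picture is filtered before it leaves the network, not at the recipient's screen.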

Simplifying sensor integration

The volumes of data being generated by C4ISR systems can rapidly change between user environments. Smaller amounts of data may be transmitted between lower levels of command (brigade level and below), while division and above can demand significant data processing as greater numbers of data feeds are added and require processing and presentation to more users operating in a variety of roles. Increased demand for battlefield data and intelligence to support missions – both at the planning stage and during execution – means that the requirements to scale data delivery are constantly growing.

Demand for sensor outputs across the battlefield means that there is increasing pressure on computing power to capture, store, and disseminate generated data. Distributing the load throughout the network allows for the optimised processing of the data generated by sensors and users in the battlefield network, while supporting access for accredited users allows for greater situational awareness and intelligence sharing.
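One common way to distribute processing load, sketched here as an assumption rather than a description of any specific system, is to assign each sensor feed to a processing node by hashing its identifier, so no single node handles every feed and assignments stay stable as feeds come and go.

```python
# Hypothetical sketch: spreading sensor-feed processing across nodes by
# hashing feed identifiers. Feed and node names are illustrative.
import hashlib

def assign_node(feed_id, nodes):
    """Deterministically map a feed to one of the available nodes."""
    digest = int(hashlib.sha256(feed_id.encode()).hexdigest(), 16)
    return nodes[digest % len(nodes)]

nodes = ["node-a", "node-b", "node-c"]
for feed in ["uav-eo-1", "radar-2", "sigint-3", "uav-ir-4"]:
    print(feed, "->", assign_node(feed, nodes))
```

Because the mapping is deterministic, any node can work out which peer owns a given feed without a central broker; more sophisticated schemes (such as consistent hashing) reduce reshuffling when nodes join or leave.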

Covering multiple nodes

Conflicts and operations over the last 20 years have typically involved co-located headquarters units in secure locations, with counterinsurgency and peacekeeping operations focused on growing security from fixed sites. Future conflicts are expected to reverse this trend, with mobility returning to the battlefield to ensure command security. To ensure full situational awareness in a dynamic operating environment, scaling data delivery across multiple nodes is increasingly important when sharing a common operating picture.

As transmission and reception nodes move into disparate positions, the ability to transmit data on-the-move and at-the-pause means that sudden peaks in demand can be placed on the network as users come online to resynchronise and access data. A network’s scalability is defined by its ability to continue data delivery under rapidly evolving demands, as well as by its ability to rapidly self-heal when nodes are taken offline.
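The self-healing behaviour can be illustrated with a breadth-first search that recomputes a route while excluding offline nodes; the topology and node names below are hypothetical.

```python
# Hypothetical sketch of self-healing routing: when a relay node drops
# offline, recompute a path over the remaining nodes (breadth-first search).
from collections import deque

def find_path(links, src, dst, offline=frozenset()):
    """Return a shortest-hop path from src to dst avoiding offline nodes."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in links.get(node, []):
            if nxt not in seen and nxt not in offline:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable with the surviving nodes

links = {"hq": ["relay1", "relay2"], "relay1": ["fwd"], "relay2": ["fwd"]}
print(find_path(links, "hq", "fwd"))                      # via relay1
print(find_path(links, "hq", "fwd", offline={"relay1"}))  # reroutes via relay2
```

A real tactical network runs this kind of recomputation continuously in its routing protocol rather than on demand, but the effect is the one described above: losing a node degrades, rather than severs, data delivery.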