21ST ICCRTS
Twenty-First International Command and Control
Research and Technology Symposium
C2 in a Complex Connected Battlespace
September 6-8, 2016
London, UK
Topic: Interoperability/Integration and Security
Title: Towards Verification of NATO Generic
Vehicle Architecture-Based Systems
Author:
Name: Daniel Ota
Organization: Fraunhofer FKIE
Address: Fraunhofer Str. 20
53343 Wachtberg-Werthhoven, Germany
Phone: +49 228 9435 732
Fax: +49 228 9435 685
E-Mail: daniel.ota@fkie.fraunhofer.de
Towards Verification of NATO Generic Vehicle
Architecture-Based Systems
Daniel Ota
Abstract
Current military vehicles are equipped with a variety of sensors and effectors
which are not yet connected or are only linked via proprietary, vendor-specific
interfaces. Hence, changes and enhancements to IT-related vehicle equipment are
often only possible by the original manufacturer and a seamless integration of
new components is difficult and expensive. In order to unify the interfaces of
subsystems and enhance interoperability, the NATO Generic Vehicle Architecture
(NGVA) defines architecture concepts for future land vehicle electronics concern-
ing data and power infrastructure. To verify that NGVA-based systems meet the
defined requirements, as part of the NGVA, a generic framework for the verifica-
tion and validation of NGVA systems based on an early version of the UK GVA
Verification approach has been developed.
This paper describes the basic concepts of the proposed NGVA Verification and
Validation (V&V) specification. It mainly focuses on how to outline a detailed
verification plan tailored to the specific NGVA system to define a verification pro-
cess. Therefore, it provides details on organizational verification responsibilities;
verification, review and analysis methods; as well as methods for verification inde-
pendence. To assess the conformity of NGVA systems, three sequentially-related
compatibility levels have been developed to facilitate the evaluation of the specific
system requirements in a structured manner by arranging an order of their ver-
ification. These levels form the basis for a verification process consisting of five
steps ranging from verification planning to capturing of the results.
1 Introduction
For decades the development of military land vehicle systems followed a similar approach.
System capabilities were outlined, and a prime contractor to deliver the system was
chosen on the basis of a cost and feasibility analysis. System requirements
were derived from the demanded capabilities, and satisfying sub-systems were identified,
acquired and integrated by the prime. This approach led to the current situation that
military vehicles are equipped with a variety of sensors and effectors which are not yet
linked or are only connected via proprietary, vendor-specific interfaces. Hence, changes
and enhancements to IT-related vehicle equipment are often only possible by the original
prime and a seamless integration of new components is currently difficult and expensive.
In an era of asymmetric warfare, these monolithic or stove-pipe systems are problematic.
Due to the quickly changing nature of threats, long procurement processes may
even lead to systems which are no longer able to deal with current threats in
their entirety by the time they are delivered. Nowadays, rapid adaptivity of the systems is
indispensable. In the best case, a reconfiguration of the system should be possible in the
field based on the needs of the current mission. In order to achieve this, new system design
processes and system architectures have been proposed in recent years.
In 2004, the US Department of Defence published the Modular Open Systems Ap-
proach to Acquisition (MOSA) [1] as a technical and business strategy for developing
a new system or modernizing an existing one. Beside open systems efforts for the Air
Force [2], [3] and in the Navy [4], the MOSA led to new initiatives for the design and
integration of military land vehicle sub-systems. In the US, the Vehicle Integration for
C4ISR/EW interoperabilitY (VICTORY) [5] initiative was started to develop an open
combat system architecture. However, due to security classification, open access to
information about VICTORY is very limited.
Similar initiatives have commenced in Europe. In order to standardize the interfaces of
vehicle subsystems, the UK Ministry of Defence released the first version of the Generic
Vehicle Architecture (GVA). In the current issue 3 [6], the GVA defines specifications for
power supply, data distribution and data management as well as the design of controls.
To enhance interoperability across North Atlantic Treaty Organization (NATO) nations,
the GVA standard has been further developed in cooperation with European partners in
order to standardize it as a NATO Standardization Agreement (STANAG) since 2011.
The work on the initial version of the NATO Generic Vehicle Architecture (NGVA) was
completed in August 2015 and the STANAG was passed as a Study Draft [7] to the
NATO nations for the purpose of preparing the ratification.
Whilst Modular Open System Architectures potentially offer benefits such as the
facilitated addition of new capabilities through easier upgradability and reduced
life-cycle cost through improved maintainability, the verification, validation, and
certification of such systems is more extensive and complex. Since the system components
are no longer static, which can lead to changes in the system behaviour, new verification
approaches need to be designed.
Relatedly, MOSA [1] states in the Certify Conformance principle: "Openness of systems
is verified, validated, and ensured through rigorous and well-established assessment
mechanisms, well-defined interface control and management, and proactive conformance
testing. The program manager, in coordination with the user, should prepare validation
and verification mechanisms such as conformance certification and test plans to ensure
that the system and its component modules conform to the external and internal open
interface standards allowing plug-and-play of modules, net-centric information exchange,
and re-configuration of mission capability in response to new threats and evolving tech-
nologies.”
To verify that systems meet the defined requirements stated as part of the NGVA,
this paper presents a generic framework for the verification and validation. First, related
verification approaches are presented in section 2. Afterwards, background information
of the NGVA STANAG 4754 is given in section 3, to provide a notion of the requirements
which have to be verified. Section 4 introduces a common terminology which has been
derived. Then, a refined and more detailed verification plan based on an early version of
the UK GVA verification approach is presented in section 5, followed by the suggestion
in section 6 to use Compatibility Levels to structure the requirements verification
procedure. Section 7 presents a verification process consisting of five steps ranging from the
verification planning to the capturing of the results, before concluding in section 8 with
a proposal for future work.
2 Related Work
For the verification and validation of hardware and software many standards and guide-
lines have been developed over the years. This section provides an overview of relevant
approaches.
With respect to V&V processes, IEEE 1012 [8] is a process standard defining specific
V&V activities and related tasks. It describes the contents of the V&V plan and includes
example formats. Since there is strong coupling with life cycle processes, IEEE 1012
points out the relationships between V&V and life cycle processes.
This description of system life cycle processes is addressed by ISO/IEC 15288 [9]. By
defining the processes and associated terminology, it introduces a common framework to
describe the full life cycle of human-made systems from conception to retirement. The
outlined processes are applicable at any level in the hierarchy of a system’s structure.
Referring to ISO/IEC 15288, ISO/IEC/IEEE 29148 [10] gives guidelines for the execu-
tion of the requirements-related processes. It details the required processes necessary for
requirements engineering and gives recommendations for the format of the documentation
to be produced.
For conformity assessment and certification, ISO 17000 [11] provides general terms
and principles. Additionally, it provides a functional approach to conformity assessment
by specifying phases and activities to be carried out.
The standards listed so far are very generic to assure their applicability to a broad
range of hardware and software systems. To conduct actual V&V, they need to be
tailored to and implemented for the specific domain. For example, the NASA Systems
Engineering Handbook [12] gives a top-level overview of the NASA systems engineer-
ing approach addressing it over the entire life cycle – starting at mission requirements
through operations to disposal. Thus, it also covers verification and acceptance for the
aeronautics and space domain.
Dealing also with avionics, the Future Airborne Capability Environment (FACE™)
Conformance Policy [13] presents processes and policies for achieving FACE conformance
certification. Besides outlining the FACE verification process and the FACE certifica-
tion process, it explicitly addresses requirements for maintaining the certification when
modifying sub-systems.
For the land vehicle domain however, there are no specific verification and certification
standards released yet. Currently, it seems that system suppliers follow their own best
practices and procedures [14].
With the turn to modular open system architectures, naturally, the way of verifying
and certifying platforms has to be adapted due to the increased complexity in interactions
between sub-systems. Internal and external interface testing becomes a key aspect of
the verification process since matching interfaces ultimately decide if two systems are at
least syntactically interoperable.
Again, the avionics domain is a pathfinder and driver. Based on the integrated mod-
ular avionics concept, Rushby [15] describes a concept for the modular certification
of aircraft. This concept allows pre-certifying components based on assume-guarantee
reasoning to use them across many different air-planes.
In addition to new certification concepts, new metrics to assess the openness of systems
are under consideration. MOSA [1] proposes to measure the percentage of key interfaces
defined by open standards to determine the degree of system openness. Moreover,
the percentage of modules that can be changed without major redesign is given as a
further example of an openness measure.
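Both measures are simple ratios over inventories of interfaces and modules. The following sketch illustrates the computation; the interface and module records, their field names and the example values are illustrative assumptions and are not taken from MOSA:

```python
# Illustrative sketch of the two MOSA openness measures described above.
# All data records and field names are invented for this example.

def openness_ratio(items, predicate):
    """Percentage of items satisfying the given openness predicate."""
    if not items:
        return 0.0
    return 100.0 * sum(1 for item in items if predicate(item)) / len(items)

key_interfaces = [
    {"name": "power_connector", "open_standard": True},
    {"name": "video_stream",    "open_standard": True},
    {"name": "fire_control",    "open_standard": False},
]
modules = [
    {"name": "camera",  "replaceable_without_redesign": True},
    {"name": "gateway", "replaceable_without_redesign": True},
]

# Measure 1: percentage of key interfaces defined by open standards.
iface_openness = openness_ratio(key_interfaces, lambda i: i["open_standard"])
# Measure 2: percentage of modules changeable without major redesign.
module_openness = openness_ratio(modules, lambda m: m["replaceable_without_redesign"])
```

With the example data, two of three key interfaces are open and both modules are replaceable, so the two measures evaluate to roughly 66.7% and 100%.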
3 NGVA Background Information
The draft NGVA STANAG defines architecture concepts for future land vehicle elec-
tronics (vetronics). These concepts are outlined in seven Allied Engineering Publication
(AEP) volumes. The main focus of the STANAG is the power [16] and data infrastruc-
ture [17] in the vehicle. This effort is supported by guidance on a Crew Terminal Software
Architecture (CTSA) [18] and a Data Model (DM) [19] describing the data exchange of
the different vetronics sub-systems. Additionally, procedures to deal with safety [20] as well
as with verification and validation [21] have been outlined.
Since CTSA, DM and Safety are currently handled as guidance and not as specifi-
cations, these documents do not contain any specific requirements. Nevertheless, the
guidance documents will be updated and detailed in the next version of the NGVA and
may contain requirements in future revisions. Therefore, their potential contents have
to be considered appropriately in the design of the verification and validation framework.
To provide a fair understanding of the structure and content of the NGVA, the power
and data infrastructure volumes are briefly described in the next subsections. Both
volumes contain requirements related to vehicle sub-systems, of which two different types
are distinguished: Compulsory Requirements and Optional Enhancements. A Compulsory
Requirement (CR) must be implemented in order to conform to the NGVA and to gain
certification. An Optional Enhancement (OE) does not necessarily need to be implemented
in order to conform to STANAG 4754. However, if such a capability is present, it needs
to be implemented according to the stated specification in order to be compliant.
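This distinction between the two requirement types can be captured as a simple conformance rule: a CR must always be satisfied, while an OE only has to conform if the optional capability is present at all. A sketch making this logic explicit (the function name and flags are invented for illustration, not part of the STANAG):

```python
def requirement_conformant(req_type, capability_present, implemented_to_spec):
    """Evaluate one requirement according to the CR/OE rule described above.

    CR: must be implemented according to the specification in all cases.
    OE: only needs to conform if the optional capability is present at all.
    """
    if req_type == "CR":
        return implemented_to_spec
    if req_type == "OE":
        return (not capability_present) or implemented_to_spec
    raise ValueError(f"unknown requirement type: {req_type}")
```

For NGVA POW 027, for instance, a vehicle without any battery-status reporting capability would still be conformant, whereas a vehicle that reports battery status in a non-conforming way would not.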
3.1 Power Infrastructure AEP Volume
This volume specifies the power interfaces and requirements that form the NGVA Power
Infrastructure. This includes physical interfaces and connectors for a voltage range up
to nominal 28V DC as well as all components that distribute and manage electrical power.
The requirements comprise different levels of detail and abstraction.
NGVA POW 001 (CR): All vehicle platforms and vehicle platform sub-systems shall conform to the requirements contained within MIL-STD-1275D.
NGVA POW 008 (CR): The NGVA 28V DC 25 ampere low power connector shall be of type MIL-DTL-38999 series III Rev L Amdt (07/2009), D38999/XXαC98SA [...].
NGVA POW 027 (OE): The NGVA power [sub-system] shall inform the [vehicle crew] of the battery life remaining in hours and minutes at the current load.
NGVA POW 032 (OE): The NGVA Power Infrastructure shall provide controls to disable NGVA power outlets when running on battery only.
Table 1: NGVA Power Requirements (adapted from [16])
Table 1 provides four examples of the nature of the power requirements. The requirements
may relate to the whole platform (NGVA POW 001), to the power sub-system itself
(NGVA POW 027) or to vetronics sub-system connectors (NGVA POW 008). Also,
the implications and therefore the test procedures can range from checking the
manufacturer's statement of the correct connector (NGVA POW 008) to functional checks
(NGVA POW 032).
3.2 Data Infrastructure AEP Volume
This volume defines design constraints on the electronic interfaces forming the NGVA
Data Infrastructure. The Data Infrastructure is used for the interconnection of mission
or automotive sub-systems inside the vehicle. It consists of:
1. One or more Local Area Networks (LANs)
2. Data Exchange Mechanism based on Data Distribution Service (DDS) and Data
Distribution Service Interoperability (DDSI) wire protocol [22, 23] and the NGVA
Data Model [19] with the appropriate Quality of Service (QoS) Profiles
3. Network Services
4. Physical interfaces and network connectors
5. Audio and video streaming data and control protocol (based on STANAG 4697 -
PLEVID [24], extended by digital voice type specific control and codecs)
6. Gateways for NGVA external data communications, and for connection to legacy
and automotive systems (as required).
Figure 1 provides an overview of the electronic interfaces and protocols to be used
for the information exchange among all vehicle sub-systems. The information exchange
Figure 1: NGVA Data Infrastructure Layer
between vetronics sub-systems is primarily based on DDS which is a middleware us-
ing a publish-subscribe model to connect the consumers and providers of resources or
messages. The message structure is based on the NGVA Data Model [19]. From this
model, standardized messages, called Topics, are generated to be exchanged between the
various vetronics systems. These Topics define what data structures can be published
and subscribed using the primitive and user-defined data types. QoS profiles regulate
the message transfer by means of specific QoS parameters, e.g. stating that DDS
communication should be reliable to ensure that all messages are delivered to subscribers
of a particular Topic.
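The publish-subscribe pattern behind this data exchange can be illustrated with a deliberately simplified sketch. MiniBus, its topic names and the "reliable" flag are illustrative stand-ins and not the DDS API: a reliable Topic here merely queues messages for late subscribers, loosely echoing the DDS reliability and durability QoS ideas.

```python
from collections import defaultdict

class MiniBus:
    """Toy publish-subscribe bus mimicking the DDS Topic concept.

    Not the DDS API: a 'reliable' topic here simply queues messages
    published while no subscriber is attached and delivers them on
    subscription, loosely echoing the DDS reliability/durability QoS.
    """

    def __init__(self):
        self.subscribers = defaultdict(list)
        self.backlog = defaultdict(list)
        self.reliable = set()

    def create_topic(self, name, reliable=False):
        if reliable:
            self.reliable.add(name)

    def publish(self, topic, message):
        # Deliver to all current subscribers of the topic.
        for callback in self.subscribers[topic]:
            callback(message)
        # Queue for late subscribers if the topic is 'reliable'.
        if topic in self.reliable and not self.subscribers[topic]:
            self.backlog[topic].append(message)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)
        # Replay queued messages to the new subscriber.
        for message in self.backlog.pop(topic, []):
            callback(message)

bus = MiniBus()
bus.create_topic("BatteryStatus", reliable=True)
bus.publish("BatteryStatus", {"remaining_minutes": 95})  # before any subscriber
received = []
bus.subscribe("BatteryStatus", received.append)          # backlog is delivered
```

In a real NGVA system the Topic structure would be generated from the NGVA Data Model and the delivery guarantees would come from the DDS middleware's QoS profiles rather than from hand-written queueing.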
The Data Infrastructure AEP contains nearly 100 requirements specifying how ve-
hicle sub-system data should be transmitted. As characterised for the Power Infras-
tructure volume, the requirements vary in number of concerned entities, in the level
of abstraction and in the verification effort needed to assure conformity. Table 2 gives
an excerpt of five requirements. Depending on the specific requirement, it could affect
nearly all vetronics sub-systems (NGVA INF 002) or just a particular infrastructure el-
ement (NGVA INF 018). Verification might be simply conducted by checking of the
product specification (NGVA INF 009, NGVA INF 018) or might imply the extensive
use of software conformance test tools (NGVA INF 002, NGVA INF 032). In some
cases, the requirements are specified at a level at which they are not even directly verifiable
NGVA INF 002 (CR): NGVA ready sub-systems shall comply with the NGVA Arbitration Protocol as defined in the NGVA Data Model.
NGVA INF 004 (CR): The NGVA network topology shall be such that the required data rate and latency requirements can be achieved.
NGVA INF 009 (CR): Ethernet cabling and network infrastructure shall support data transfer at a minimum transmission speed of 1Gb/s.
NGVA INF 018 (OE): If DHCP is intended to be used, all switches shall be capable of DHCP Snooping.
NGVA INF 032 (CR): Vetronics Data shall be exchanged by DDS topics using the "QoS pattern" attached to it in the NGVA Data Model to assure assignment of DDS topics.
Table 2: NGVA Data Infrastructure Requirements (adapted from [17])
(NGVA INF 004) and instead have to be refined by specific platform requirements
depending on the actually needed platform capabilities. In the current NGVA draft, the
requirements are not yet associated with any verification method, measure of perfor-
mance or justification. However, this is planned to change in the next versions of the
AEP volumes.
4 Terminology
The field of electronic systems V&V has been widely explored in research over the last
decades. It is strongly associated with quality management and conformity assessment.
Therefore, numerous guidelines and widely recognized standards have been published
over the years. However, there is no single standard which is directly applicable to the
V&V of NGVA-based (sub-)systems. As indicated in section 3, the sub-systems and
the related requirements differ highly in complexity and abstraction. Thus, several ISO,
IEC, and military standards as well as best practices were analysed and combined to
form a basis specifically for the NGVA Verification and Validation AEP volume [21]. This
section provides the findings of the literature review related to the terminology proposed
for the V&V volume.
4.1 Verification
With respect to NGVA, verification confirms that the requirements defined in the AEP
volumes have been followed and met. This means that the characteristics and behaviour
of the equipment or sub-system comply with the requirements specified in STANAG
4754 which might be refined in an additional System Requirements Document (SRD) or
equivalent.
Verification is an assessment of the results of both the design/development processes
and verification process carried out by a supplier, system integrator, designer or an
independent assessment body. Verification is not simply testing, as testing alone cannot
always show the absence of errors. It is a combination of reviews, analysis and tests
based on a structured verification plan. Verification is usually performed at sub-system
as well as platform level.
For the NGVA V&V volume, ISO 9000 [25] and ISO/IEC 15288 [9] were consulted for
the definition of Verification.
Definition (Verification). Confirmation, through the provision of objective evidence,
that specified requirements have been fulfilled. [ISO 9000:2005]. NOTE: Verification
is a set of activities that compares a system or system element against the required
characteristics. This may include, but is not limited to, specified requirements, design
description and the system itself. [ISO/IEC/IEEE 15288]
4.2 Validation
Especially with respect to military platforms, validation generates objective evidence
that the capability enabled by the equipment or system satisfies the needs defined in
the user requirements document or equivalent. Therefore, validation is an assessment
to confirm that the requirements defining the intended use or application of the system
have been met.
The overall intention is to build a vehicle fit for purpose that operates correctly for all
the defined scenarios in the system concept of use, noting that the concept of use may
change through life. Validation must also address the ability of the system to cope with
various faults and failure modes.
Validation evaluates the correct operation of the complete system on specific use
cases. Therefore, an operational context is needed which varies with the particular
purpose of the system. These specifics are not considered in this part of the NGVA in
the first version. Nevertheless, the compliance with overarching NGVA concepts such as
openness, modularity, scalability, and availability should be validated.
In NGVA, ISO 9000 and ISO/IEC 15288 were again consulted for the definition of
Validation.
Definition (Validation). Confirmation, through the provision of objective evidence, that
the requirements for a specific intended use or application have been fulfilled. [ISO
9000:2005]. NOTE: Validation is the set of activities ensuring and gaining confidence
that a system is able to accomplish its intended use, goals and objectives (i.e., meet stake-
holder requirements) in the intended operational environment. [ISO/IEC/IEEE 15288]
4.3 Conformity Assessment and Accreditation
Verification and Validation encompasses the processes and activities of conformity
assessment concerning requirements. As stated in the introduction, the certification of
conformity after conducting the actual V&V is very important, as well. In order to define
the necessary terminology, ISO 17000 [11] was consulted since it provides a recognized
nomenclature for an accreditation chain.
Accreditation refers to the appointment of assessment bodies, for example independent
institutes or military test sites, which are authorized to conduct conformity assessment
of NGVA (sub-)systems. Thus, accreditation is by definition different from the issuing of
an NGVA conformity statement.
In the case of NGVA, the governments ratifying the STANAG therefore have to appoint
national accreditation bodies – usually governmental organizations – which have the
authority to perform accreditation of NGVA conformity assessment bodies. The national
accreditation bodies agree on procedures and aligned conditions to appoint conformity
assessment bodies. Thereby, the formation of a network of international conformity
assessment bodies is enabled, where a conformity assessment body can specialise in
particular verification contents (e.g. power) and is accepted by accreditation bodies
from several other nations. The appointed conformity assessment bodies perform the
assessment services which include demonstration, test, analysis, inspection as well as
certification.
5 Verification Plan
The first release of the UK GVA [26] contained a section outlining a potential Verification
Plan for GVA-based systems. Therein, the GVA Office states that a Verification Plan
shall include:
- Organisational responsibilities within the verification process
- Verification methods to be used, including review and analysis methods
- Methods for verification independence, where necessary
- Description of verification tools and hardware test equipment
- Re-verification guidelines in case of system/design modifications
- Guidelines for previously developed or off-the-shelf equipment.
Due to its missing level of detail, the V&V section was completely removed in the next
release of the UK GVA Defence Standard.
However, to support the verification process for NGVA systems, this verification plan
can be considered a sensible starting point. For this reason, the demand is picked up in
this section by improving and extending the originally proposed structure. The following
verification plan sections are written in a generic way and are therefore applicable
to single sub-systems or to a composition of sub-systems. Of course, the following
subsections have to be adapted to the specific system under verification.
5.1 Organizational Verification Responsibilities
For the development of a verification plan for an NGVA system, the different stakeholders
should be defined and their responsibilities should be determined. As given in Figure 2
these stakeholders may include:
1. The System Designer and Supplier; possibly represented by the same stakeholder.
The System Supplier is responsible for the Electronic Infrastructure of the NGVA
system by outlining and providing means for power distribution and data exchange
between the sub-systems forming the NGVA system.
2. The Sub-System Designer and Supplier; potentially subcontractors of the System
Designer. The Sub-System Suppliers are responsible for the provision of the indi-
vidual Sub-Systems.
3. The System Integrator may be the same player as the System Supplier initially,
but may change during the maintenance phase. The System Integrator delivers
the complete system.
4. The Customer, e.g. the Procurement Office, typically handles the acceptance of
the verification plan to ensure that it meets the initial (or refined) stakeholder
requirements.
5. The Conformity Assessment Authority is often a governmental institution or in-
dependent authority which provides verification and validation of the system.
Figure 2: NGVA Verification Stakeholders
The stakeholder roles may change during the systems development and procurement
process. Depending on the level of the verification activity, the same stakeholder or
NGVA (Sub-) System may have different roles. This can be illustrated by an example of
a camera C which should be integrated in a surveillance unit U that in turn is mounted
on a scout vehicle.
On the lowest level, the system to be verified is the camera C itself. In this case, C is
the NGVA (Sub-) System and provides internally "its infrastructure", while the camera
manufacturer CM is Designer and Integrator. Thus, C and its manufacturer CM adopt
the three leftmost roles of Figure 2. The surveillance unit manufacturer UM is the customer
and may even conduct conformity assessment according to the UM requirements.
Assuming that the surveillance unit U is directly procured by the government to be
deployed in several vehicles and U should be verified, U is the NGVA System that
provides the infrastructure for the NGVA Sub-System C. This results in CM being
the Sub-System Supplier and UM taking the roles of System Integrator and System
Supplier. Hence, the government procurement office has the role of the Customer and
can be supported by an independent assessment authority in the verification of U.
The highest abstraction level is the integration of the components in an actual scout
vehicle. Here, a Platform Manufacturer acts as the System Supplier providing
the foundation with the power and data distribution Infrastructure. UM from the last
paragraph would provide U, being one of the Sub-Systems to be integrated, as a Sub-
System Supplier. The integration of all Sub-Systems on the platform is conducted by
the System Integrator, which can be the Platform Manufacturer again or a different prime
contractor. The System Integrator is responsible for delivering the entire NGVA System to
be verified. The System is checked for acceptance by the Customer, e.g. the Procurement
Office, according to the Verification Plan possibly with the help of an Independent
Assessment Authority.
5.2 Verification Methods
With respect to potential verification methods, the literature analysis yielded
ISO/IEC/IEEE 29148 [10] as a candidate providing appropriate methods. The detailed
description of verification methods in ISO/IEC/IEEE 29148 ideally complements the
method definitions provided by MIL-STD-498, which were adapted for the NGVA
V&V terminology.
This section gives a detailed overview of the four standard verification methods in that
standard: inspection, analysis or simulation, demonstration, and test. Additionally,
it indicates how they benefit the verification to obtain objective evidence that
NGVA requirements have been fulfilled.
5.2.1 Inspection
According to [10], Inspection proves the item against applicable documentation to verify
properties best determined by examination and observation (e.g., paint color, weight).
Inspection is generally non-destructive and typically includes the use of sight,
hearing, smell, touch, and taste; simple physical manipulation; mechanical and electrical
gauging; and measurement.
Regarding NGVA, Inspection is appropriate to verify requirements related to the
fulfilment of other standards (cf. NGVA POW 001 in table 1), to the use of appropriate
connectors (cf. NGVA POW 008, table 1), or to check whether the equipment supports
relevant features (NGVA INF 018 in table 2).
5.2.2 Analysis (including modelling and simulation)
Analysis uses analytical data or simulations under defined conditions to show theo-
retical compliance where testing to realistic conditions cannot be achieved or is not
cost-effective. Analysis may be based on ”similarity” by reviewing a similar item’s prior
verification and confirming that its verification status can be legitimately transferred to
the present system element [10].
Many requirements related to the correct implementation of the NGVA Data Model
should be proved using Analysis. Especially with respect to the verification of sub-
system implementations, an analysis in a test lab with simulated counterpart equipment
makes it possible to detect issues with the correct usage of DDS Topics and QoS settings
(e.g. NGVA INF 002, NGVA INF 032 in table 2) before integration in the actual
operational environment.
5.2.3 Demonstration
Demonstration is a qualitative exhibition of functional performance, usually accom-
plished with no or minimal instrumentation or test equipment [10]. Demonstration uses
a set of test activities with system stimuli selected by the supplier to show that system
or system element response to stimuli is suitable or to show that operators can perform
their allocated functions when using the system.
Demonstration is useful to prove functional NGVA requirements that request informing
the vehicle crew or letting the crew members physically control a sub-system. Two
examples are NGVA POW 027 and NGVA POW 032 in table 1.
5.2.4 Test
Test quantitatively verifies the operability, supportability, or performance capability
of an item when subjected to controlled conditions that are real or simulated. These
verifications often use special test equipment or instrumentation to obtain very accurate
quantitative data for analysis [10].
Tests are especially needed to ensure that the vehicle infrastructure provides the required
latencies and data rates in the interaction of all vehicle systems. With respect to
NGVA INF 004 in table 2, for example, evaluating the network load under operational
conditions with network tools allows possible bottlenecks in the vehicle architecture
to be detected.
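A first-order estimate of such network load can be computed before any measurement, as sketched below. The topic sizes, rates, and the 100 Mbit/s link capacity are assumptions for illustration only; actual NGVA traffic would have to be measured with network tools as described above.

```python
# Illustrative steady-state load estimate for a vehicle network link.
# Topic sizes, rates, and link capacity are assumed example values.
def offered_load_bps(topics):
    """Sum the periodic load of (size_bytes, rate_hz) topics in bit/s."""
    return sum(size_bytes * 8 * rate_hz for size_bytes, rate_hz in topics)

topics = [
    (1500, 100),  # e.g. a video-related topic: 1500-byte frames at 100 Hz
    (64, 50),     # e.g. a position topic: 64-byte samples at 50 Hz
]
load = offered_load_bps(topics)
capacity = 100_000_000  # assumed 100 Mbit/s link
# Utilisation approaching 1 would indicate a potential bottleneck.
print(f"utilisation: {load / capacity:.2%}")
```

Such a back-of-the-envelope check cannot replace measurement under operational conditions, but it helps to select which links and topics deserve instrumented testing.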
5.3 Review Methods
Throughout the verification process, formal system reviews and audits should be per-
formed at different phases of the verification, e.g. Test Readiness Reviews [27, Section
3.6]. The verification plan should include necessary reviews as well as corresponding
review methods.
For example, these reviews should ensure that all relevant NGVA requirements for the
specific system are captured by the verification plan, appropriate verification methods
are used, and the verification is conducted properly. To this end, check-lists or other
aids should be used.
5.4 Analysis Methods
The verification plan should include means to assure traceability and coverage analysis
of requirements. All requirements must be traceable to an implementation/realization in
the system. This allows comprehensive proof that all relevant requirements are fulfilled.
Additionally, provisions to link requirements to verification activities or test cases
should be described in the verification plan. For this purpose, a requirements traceability
matrix, also known as a requirements coverage matrix, can be used.
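A traceability matrix can be kept as simple data, as in the sketch below; the requirement IDs and test case names are hypothetical examples, not actual NGVA requirement identifiers.

```python
# Minimal requirements traceability (coverage) matrix sketch.
# Requirement IDs and test case names are hypothetical examples.
trace = {
    "NGVA_INF_002": ["TC_DDS_TOPIC_01"],
    "NGVA_INF_004": ["TC_NET_LOAD_01", "TC_NET_LOAD_02"],
    "NGVA_POW_008": [],  # not yet linked to any verification activity
}

def uncovered(matrix):
    """Requirements with no linked verification activity or test case."""
    return sorted(req for req, cases in matrix.items() if not cases)

print(uncovered(trace))  # -> ['NGVA_POW_008']
```

Coverage analysis then reduces to checking that this list is empty before verification is declared complete.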
5.5 Verification Tools and Techniques
Usually, hardware and software tools are used to assist and automate verification
processes, for example software tools supporting test coverage analysis and regression
testing. The verification plan should include guidelines for these tools and for any
hardware test equipment. This includes a detailed description of the needed tools and
explanations of each tool's performance, required inputs, and generated outputs.
Additionally, the verification plan should address test facilities and integration and
system test laboratories supporting the verification effort, e.g. specific conformance test
systems or interoperability test labs.
5.5.1 Conformance and Interoperability Tests
The main objective of NGVA is the assurance of interoperability between NGVA (sub-)
systems. To evaluate system conformity to standards, conformance and interoperability
testing are typically used. Both techniques are complementary; often, conformance
testing addresses protocols and lower-layer communication aspects, while
interoperability testing is used for entire systems and applications.
According to [28, Section 4.2], Conformance testing is conducted by a Test System
which stimulates a System under Test. This System under Test often contains an
Implementation under Test, which is the subject of conformance testing (cf. Figure 3).
Conformance testing is a formal process, deterministic and repeatable, and ensures that
a system meets a defined set of requirements, for example a correctly implemented
protocol stack. This is especially relevant for the data exchange specified in the NGVA
Data Infrastructure and Data Model.
Figure 3: Conformance Testing (adapted from [28, Section 4.2])
Interoperability testing [28, Section 4.1], in contrast, is performed at system interfaces,
which offer only normal user control and observation (cf. Figure 4). It is therefore
based on functionality as experienced by a user and is not specified at the protocol
level. The purpose of interoperability testing is to prove that the end-to-end
functionality between at least two NGVA (sub-) systems, the Equipment under Test and
the Qualified Equipment, is as defined in the NGVA STANAG.
Figure 4: Interoperability Testing (adapted from [28, Section 4.1])
Interoperability testing as well as Conformance testing should be addressed by the
NGVA verification process. Therefore, the verification plan should describe which of the
following tests should be conducted.
5.5.2 NGVA Data Model Conformance Testing
One key aspect of verification should be NGVA Data Model Conformance Testing. This
evaluation may be conducted by independent conformity assessment bodies which
provide appropriate test systems, potentially accessible via a Virtual Private Network
(VPN) or the Internet.
These Conformance Test Systems have to verify the NGVA Data Model conformity
of NGVA systems. To this end, NGVA sub-systems (the System under Test) are treated
as black boxes, and system-specific tests are run to evaluate the system response to
valid, inopportune, and invalid input.
These formalized test suites should support the automatic execution of test cases as
well as the automatic and unbiased assignment of test verdicts. Centrally maintained
NGVA Data Model Conformance Test Systems have the advantage that all (sub-) system
vendors can always access the latest release of the test suite. However, due to their
spatial distribution, they cannot reflect a real vehicle bus and are not suitable for
real-time testing.
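The unbiased verdict assignment mentioned above can be sketched as follows: each test case pairs a stimulus class (valid, inopportune, or invalid) with the expected reaction, and the verdict is derived mechanically from the observed reaction. The reaction labels and cases are illustrative assumptions, not part of any NGVA test suite.

```python
# Hypothetical sketch of automatic verdict assignment in a conformance
# test suite; stimulus classes and reactions are illustrative only.
def assign_verdict(expected, observed):
    """PASS iff the System under Test reacted as the test case expects."""
    return "PASS" if observed == expected else "FAIL"

cases = [
    # (stimulus class, expected reaction, observed reaction)
    ("valid",       "accept", "accept"),  # valid input must be accepted
    ("invalid",     "reject", "reject"),  # malformed input must be rejected
    ("inopportune", "ignore", "accept"),  # out-of-state input wrongly accepted
]
verdicts = [(kind, assign_verdict(exp, obs)) for kind, exp, obs in cases]
print(verdicts)
```

Because the verdict is computed rather than judged, repeated runs against the same System under Test yield repeatable, deterministic results.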
5.5.3 Test Laboratories and Test Beds
For overarching conformance and interoperability tests, vendors as well as
vendor-independent authorities could maintain test beds to conduct tests prior to the
initial release of products or upgrades. These test beds allow collocated testing to
verify that all real-time, safety, and security requirements are met.
In particular, the test beds should provide the infrastructure with which NGVA systems,
the Equipment under Test, have to be interoperable in order to be verified. Therefore,
the test beds may consist of components that can control and request data from the
Equipment under Test, or gateway components might be necessary.
5.5.4 Demonstrators and Experiments
Especially for the confirmation of functional and operational requirements,
demonstrators and experiments should be used. They can serve verification as well as
validation by proving the intended use of the system: the defined concept of use is
validated in predefined operational scenarios.
5.6 Verification Independence
Verification by independent authorities may be necessary for requirements that are
safety- or security-critical. This is currently of limited relevance for NGVA since the
Safety AEP volume does not yet contain requirements, but this may change in the next
release of the NGVA STANAG. Additionally, a separate Security volume is under
consideration. Therefore, the verification plan should include provisions to take an
appropriate degree of independence into account.
According to [8], Independent Verification and Validation (IV&V) is defined by three
parameters:
1. Technical Independence: Requires effort to use personnel who are not involved in
the development of the system or its elements.
2. Managerial Independence: Requires that the responsibility for the IV&V effort be
vested in an organization separate from the development and program management
organizations.
3. Financial Independence: Requires that control of the IV&V budget be vested in
an independent organization.
Depending on the complexity of the NGVA system to be verified, different forms of
independence have to be adopted for a verification organization. The five most prevalent
are Classical, Modified, Integrated, Internal, and Embedded IV&V [8]. The verification
plan should state the appropriate form for the addressed NGVA system.
5.6.1 Classical IV&V
Classical IV&V embodies all three independence parameters. It is generally required for
Safety Integrity Level (SIL) 4 (i.e., loss of life, loss of mission, significant social loss, or
financial loss) through regulations and standards imposed on the system development.
Further information on Safety will be given in the NGVA Safety AEP volume.
5.6.2 Modified IV&V (No managerial independence)
Modified IV&V is used in many large programs where the system prime integrator is
selected to manage the entire system development including the IV&V. Because the
prime integrator performs all or some of the development, managerial independence
is compromised by having the IV&V effort report to the prime integrator. Modified
IV&V would be appropriate for systems with SIL 3 (i.e., an important mission and
purpose). Further information on Safety will be given in the NGVA Safety AEP volume.
5.6.3 Integrated IV&V (No technical independence)
This type focuses on providing rapid feedback of V&V results into the development
process and is performed by an organization that is financially and managerially
independent of the development organization, in order to minimize compromises with
respect to independence. The close integration of the IV&V organization facilitates
this rapid feedback but has a potential impact on technical independence.
5.6.4 Internal IV&V
Internal IV&V exists when the developer conducts the IV&V with personnel from within
its own organization, although preferably not the same personnel involved directly in the
development effort. Technical, managerial, and financial independence are compromised.
This form of IV&V is used when the degree of independence is not explicitly stated and
the benefits of pre-existing staff knowledge outweigh the benefits of objectivity. This
approach is preferable as long as the NGVA (sub-) system under development has no
safety obligations.
5.6.5 Embedded V&V
This type is similar to Internal IV&V. The embedded V&V organization, however, works
side by side with the development organization and attends the same inspections,
walkthroughs, and reviews as the development staff. Embedded V&V is not specifically
intended to independently assess the original solution or to conduct independent tests.
5.7 Re-Verification Guidelines
After modifications of the design or implementation, NGVA equipment has to be
re-verified. Depending on the extent of the change, the complete system may need to be
re-verified. Thus, the verification plan should describe re-verification guidelines
depending on the type and extent of (sub-) system changes. If no guidelines are given,
the complete system must undergo the full verification process.
5.8 Legacy Equipment Guidelines
For any previously developed or off-the-shelf equipment, a description of the methods
used to satisfy the objectives of the NGVA STANAG should be given. These methods can
incorporate the development of software and hardware adapters as well as descriptions
of how to deal with safety and power issues. In addition, a roadmap for long-term NGVA
adoption should be outlined. If no descriptions are given, all legacy and off-the-shelf
equipment is treated as genuine NGVA systems.
6 NGVA Compatibility Level
As described in section 3, NGVA requirements vary considerably in their level of
abstraction and in their impact on the (sub-) system under test and the related vehicle
infrastructure. Additionally, since NGVA is a fairly new specification, not all future
vehicle sub-systems may address all NGVA requirements from the beginning. Thus, this
section proposes so-called NGVA Compatibility Levels that allow the different system
requirements to be verified in a structured manner by arranging the order of
verification. In addition, the Compatibility Levels permit NGVA conformity to be
certified up to a certain level. Therefore, an incremental certification is proposed
based on three levels: Connectivity Compatibility, Communication Compatibility, and
Functional Compatibility (cf. Figure 5). These levels are sequentially related:
Communication Compatibility includes Connectivity Compatibility, and Functional
Compatibility includes all others.
Figure 5: NGVA Compatibility Levels
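The sequential relationship between the levels can be sketched as a simple gate: a level is only reached if all lower levels have been passed. The boolean results below are placeholders for actual test outcomes, not real certification data.

```python
# Sketch of the sequential NGVA Compatibility Levels; the pass/fail
# results are placeholder values, not actual certification outcomes.
LEVELS = ["Connectivity", "Communication", "Functional"]

def certified_level(results):
    """Highest level reached, given per-level pass/fail in LEVELS order."""
    reached = None
    for level in LEVELS:
        if not results.get(level, False):
            break  # a failed (or untested) level blocks all higher ones
        reached = level
    return reached

print(certified_level({"Connectivity": True,
                       "Communication": True,
                       "Functional": False}))  # -> Communication
```

A system passing only the first two levels could thus be certified up to Communication Compatibility, matching the incremental certification idea above.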
6.1 Connectivity Compatibility
The first level, Connectivity Compatibility, ensures that the (sub-) system can be
physically integrated into the NGVA architecture without any negative impact on
existing NGVA components. Physical power and network interfaces comply with the
requirements of the Power and Data Infrastructure AEP volumes.
Thus, this level applies to requirements that concern the electrical and physical
specifications of the connectors as well as low-level means to transfer data between
NGVA (sub-) systems, e.g. OSI Layer 1-4 protocols. Additionally, the first level
contains requirements that may compromise other services, for example requirements
related to EMC safety and power supply. These first-level requirements are mainly
verified by physical inspection or testing. In some cases, such as EMC, even the
inspection of vendors' conformity statements may be sufficient.
6.2 Communication Compatibility
If applicable to a (sub-) system, Communication Compatibility refers to the correct
implementation of the NGVA Data Model and video streaming standards. On the basis
of the achieved Connectivity Compatibility, the data interfaces (e.g. Data Distribution
Service, Video/Audio Protocols) and the associated NGVA Data Model implementation
(e.g. Topic Types, Quality of Service) need to comply with the NGVA Data Model AEP
volume.
Based on the requirements stating which services are provided by the system and which
NGVA-related sub-systems are integrated into it, the relevant parts of the NGVA Data
Model covered by the equipment are derived and tested. These tests cover the system's
data exchange as specified in the NGVA Data Model, e.g. the correct publishing of
specified topics and the correct response to mode changes.
6.3 Functional Compatibility
Underpinned by Communication Compatibility, the Functional Compatibility evaluation
ensures that data flows conform to data exchange, performance, and specific functional
requirements. Concerning data exchange, NGVA Data Model tests are conducted which
cover the system or component response to valid, inopportune, and invalid inputs. This
includes tests for the publishing of correct information and data formats (e.g. for
the current GPS position) and the proper behaviour for commands (for example mount
movements).
Additionally, at this level it should be evaluated whether real-time and bandwidth
requirements are met and whether specific functional requirements, e.g. regarding
security, are fulfilled. If further operational requirements are provided, they should
be tested here as well.
7 Verification Process
The actual evidence that NGVA (sub-) systems fulfil the requirements specified in the
NGVA STANAG is established in the Verification Process. Thus, the following subsections
propose an NGVA Verification Process consisting of five steps based on the NASA
Systems Engineering Handbook [12, Section 5.3]. Typically, the Verification Process is
performed by the developer that realized the NGVA end-system, with the participation of
the end users and independent conformity assessment bodies. The Verification Plan of
section 5 and the System Requirements Document (SRD) of the system under test are
the key inputs for the Verification Process.
7.1 Verification Planning
Planning is the first key step of the verification process. Based on the SRD and the
requirements of the NGVA AEP volumes, the specific requirements are collected and
verification types (e.g., analysis, demonstration, inspection, or test) are established
for them. Additionally, the verification plan should be reviewed for any specific
procedures, constraints, or further measures that have to be considered prior to the
actual verification.
7.2 Verification Preparation
In preparation for verification, the system requirements are reviewed, confirmed, and
allocated to the different NGVA Compatibility Levels. The NGVA system to be verified
is acquired, as are any enabling products and support resources that are necessary for
verification. The verification preparation also includes setting up the verification
environment. For Connectivity Compatibility, this may cover tools or measuring devices
for a particular pin-out. In the case of Communication Compatibility, an account for
the NGVA Data Model Conformance Test System has to be requested, or further measures
to connect to the system have to be considered. To test Functional Compatibility,
simulations may have to be prepared. The particular measures depend on the specific
system requirements.
7.3 Verification Performance
In this step, the verification of NGVA systems is conducted and conformity to each
relevant verification requirement is tested. The responsible stakeholder should ensure
that the procedures are followed and performed as specified in the verification plan
and that the data is collected and recorded for verification analysis. In this phase,
the tests for the three NGVA Compatibility Levels are conducted in sequential order
from Connectivity via Communication to Functional Compatibility. The different test
procedures and outcomes are linked to the requirements by appropriate means, e.g. a
requirements traceability matrix.
7.4 Verification Outcomes Analysis
Once the verification activities have been completed, the collected results are analysed,
in particular for quality and correctness. Based on this analysis and possible defects,
it may be necessary to re-realize the system, or to re-engineer the sub-systems
assembled and integrated into the verified system, and to re-perform the NGVA
verification process.
Additionally, verification test outcomes can be unsatisfactory for other reasons,
including poor conduct of the verification process (e.g., procedures not properly
followed or use of un-calibrated equipment). This would also require the affected
verification steps to be re-performed.
7.5 Capturing of Verification Results
As a last step, verification results shall be produced from the verification process
activities. The verification results should:
- Identify the verified system including its configuration or version number
- State the verifier and the verification date
- Specify the tools used, including their configuration and version numbers
- Indicate each procedure that passed or failed during the activities
- Contain any corrective actions taken and the lessons learned (including feedback to
  improve the NGVA specification)
- Include a traceability analysis
- Capture the final pass/fail result for each requirement
- Document proof that the realized system did (or did not) satisfy the requirements
- Include conclusions and recommendations for further verification activities
- Mention consequences for the validation of the system
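A record capturing several of the items above can be sketched as a simple data structure; the field names and the sample values are illustrative assumptions, not a mandated NGVA results format.

```python
# Illustrative verification-results record; field names and sample
# values are assumptions, not a mandated NGVA reporting format.
from dataclasses import dataclass, field

@dataclass
class VerificationResult:
    system: str    # verified system incl. configuration/version number
    verifier: str
    date: str
    tools: list    # tools used, incl. their versions
    passed: dict   # requirement ID -> final pass/fail result
    corrective_actions: list = field(default_factory=list)

    def satisfied(self):
        """True only if every requirement received a pass verdict."""
        return all(self.passed.values())

result = VerificationResult(
    system="ExampleSubSystem v1.2", verifier="Test Lab A", date="2016-09-06",
    tools=["ConformanceTestSystem 0.9"],
    passed={"NGVA_INF_002": True, "NGVA_POW_008": False})
print(result.satisfied())  # -> False
```

Keeping the per-requirement verdicts in one structure also supports the traceability analysis the results should include.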
8 Conclusion and Future Work
This paper provides guidance on the verification of systems designed according to the
emerging NGVA STANAG. The approach is intentionally kept very generic in order to
deal with any type of NGVA (sub-) system.
Based on a common terminology, a detailed verification plan was introduced. To
facilitate the conformity assessment of NGVA systems, three sequentially-related
compatibility levels have been developed which allow the evaluation of the requirements
in a structured manner. Additionally, a verification process consisting of five steps,
ranging from planning through preparation, conduct, and analysis to the capturing of
the verification results, has been proposed. The presented verification framework has
been discussed and agreed in the NGVA community and was accepted as the study draft
for the NGVA Verification and Validation Volume [21].
While this work concentrates on verification, validation and accreditation are only
covered at a basic level in the current study draft. Thus, additional work is needed to
improve the volume in the next version. Especially for validation, research on how to
measure openness, modularity, and scalability is necessary. In software development,
progress in designing such metrics has been made, and their applicability in the NGVA
context needs to be assessed.
Additionally, more emphasis on modular verification could improve the V&V
specification. Since the NGVA Safety volume will probably contain requirements in its
next release, the effort of re-certifying the whole NGVA system whenever a (sub-)
system changes should be avoided. Modular Safety Cases are one approach to mitigating
this complexity, and it should be evaluated whether similar contract approaches or
service level agreements are applicable to NGVA as a whole.
The Compatibility Levels need further examination. Introduced to facilitate the
verification, their definition might need to be refined, and their effective benefit and
applicability in an actual verification process need examination. In addition, all final
NGVA requirements have to be checked and a Compatibility Level should be proposed for
each.
References
[1] Open Systems Joint Task Force. Modular Open Systems Approach to Acquisition,
Version 2. US Department of Defence, Sept. 2004.
[2] Command and Control Initiative. Open Mission Standard, Version 1.0. Apr. 2014.
[3] Thomas Gaska. “Lockheed Martin Common Cockpit Avionics Suite – A Modular
Open Systems Approach (MOSA) Driven Architecture”. In: American Helicopter
Society (AHS) Forum 63 (May 2007).
[4] The Open Group. Technical Standard for Future Airborne Capability Environment
(FACE), Edition 2.1. Reference C145. The Open Group, May 2014.
[5] Mike Southworth. Delivering VICTORY and PNT Hub Services for Tactical Ground
Vehicle Architecture. White Paper. Curtiss Wright Defense Solutions, Oct. 2015.
[6] Ministry of Defence. Defence Standard 23-09 – Generic Vehicle Architecture. Is-
sue 3. Defence Equipment and Support, UK Defence Standardization, Mar. 2013.
[7] NATO. STANAG 4754, NATO Generic Systems Architecture (NGVA) for Land
Systems, Edition 1, Study Draft. NATO Standardization Office (NSO), Aug. 2015.
[8] IEEE. “IEEE Standard for System and Software Verification and Validation”. In:
IEEE Std 1012-2012 (Revision of IEEE Std 1012-2004) (May 2012), pp. 1–223.
doi: 10.1109/IEEESTD.2012.6204026.
[9] ISO. ISO/IEC/IEEE 15288:2008, Systems and software engineering – System life
cycle processes. International Organization for Standardization, 2008. url:
http://www.iso.org/iso/catalogue_detail?csnumber=63711.
[10] ISO. ISO/IEC/IEEE 29148:2011, Systems and software engineering – Life cycle
processes – Requirements engineering. International Organization for Standardization,
2011. url: http://www.iso.org/iso/catalogue_detail.htm?csnumber=45171.
[11] ISO. ISO/IEC 17000:2004, Conformity assessment – Vocabulary and general
principles. International Organization for Standardization, 2004. url:
http://www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.htm?csnumber=29316.
[12] National Aeronautics and Space Administration. NASA Systems Engineering Hand-
book. SP-610S. PPMI, June 1995.
[13] The Open Group. Future Airborne Capability Environment (FACE): Conformance
Policy, Version 1.1. Reference X1406. The Open Group, July 2014.
[14] Stephen Walters and Cara Perks. Test, Evaluation & Acceptance Best Practice.
White Paper. BMT Group, 2011.
[15] John Rushby. Modular Certification. NASA Contractor Report CR-2002-212130.
NASA Langley Research Center, 2002.
[16] NATO. AEP-4754 NATO Generic Systems Architecture (NGVA) for Land Sys-
tems, Volume 2: NGVA Power Infrastructure, Edition A, Version 1, Study Draft
1. NATO Standardization Office (NSO), Aug. 2015.
[17] NATO. AEP-4754 NATO Generic Systems Architecture (NGVA) for Land Sys-
tems, Volume 3: NGVA Data Infrastructure, Edition A, Version 1, Study Draft 1.
NATO Standardization Office (NSO), Aug. 2015.
[18] NATO. AEP-4754 NATO Generic Systems Architecture (NGVA) for Land Sys-
tems, Volume 4: NGVA Crew Terminal Software Architecture, Edition A, Version
1, Study Draft 1. NATO Standardization Office (NSO), Aug. 2015.
[19] NATO. AEP-4754 NATO Generic Systems Architecture (NGVA) for Land Sys-
tems, Volume 5: NGVA Data Model, Edition A, Version 1, Study Draft 1. NATO
Standardization Office (NSO), Aug. 2015.
[20] NATO. AEP-4754 NATO Generic Systems Architecture (NGVA) for Land Sys-
tems, Volume 6: NGVA Safety, Edition A, Version 1, Study Draft 1. NATO Stan-
dardization Office (NSO), Aug. 2015.
[21] NATO. AEP-4754 NATO Generic Systems Architecture (NGVA) for Land Sys-
tems, Volume 7: NGVA Verification and Validation, Edition A, Version 1, Study
Draft 1. NATO Standardization Office (NSO), Aug. 2015.
[22] OMG. Data Distribution Service (DDS), Version 1.4. Object Management Group,
2015. url: http://www.omg.org/spec/DDS/1.4.
[23] OMG. The Real-time Publish-Subscribe Protocol (RTPS) DDS Interoperability Wire
Protocol Specification, Version 2.2. Object Management Group, 2014.
[24] NATO. STANAG 4697, Platform Extended Video Standard (PLEVID), Edition 1,
Ratification Draft. NATO Standardization Office (NSO), 2013.
[25] ISO. ISO 9000:2005, Quality management systems – Fundamentals and vocabulary.
International Organization for Standardization, 2005. url:
http://www.iso.org/iso/home/store/catalogue_ics/catalogue_detail_ics.htm?csnumber=42180.
[26] Ministry of Defence. Defence Standard 23-09 – Generic Vehicle Architecture. Is-
sue 1. Defence Equipment and Support, UK Defence Standardization, Aug. 2010.
[27] US Department of Defence. MIL-STD-1521B, Technical Reviews and Audits for
Systems, Equipments and Computer Software. US Department of Defence, June
1995.
[28] ETSI. Methods for Testing and Specification (MTS); Internet Protocol Testing
(IPT); Generic approach to interoperability testing. ETSI EG 202 237 V1.2.1
(2010-08). European Telecommunications Standards Institute, Aug. 2010. url:
http://www.etsi.org/deliver/etsi_eg/202200_202299/202237/01.02.01_60/eg_202237v010201p.pdf.
Article
Full-text available
This paper outlines the development of an onboard computer prototype for data registration, storage, transformation, and analysis. The system is intended for health and use monitoring systems in military tactical vehicles according to the North Atlantic Treaty Organization Standard Agreement for designing vehicle systems using an open architecture. The processor includes a data processing pipeline with three main modules. The first module captures the data received from sensor sources and vehicle network buses, performs a data fusion, and saves the data in a local database or sends them to a remote system for further analysis and fleet management. The second module provides filtering, translation, and interpretation for fault detection; this module will be completed in the future with a condition analysis module. The third module is a communication module for web serving data and data distribution systems according to the standards for interoperability. This development will allow us to analyze the driving performance for efficiency, which helps us to know the vehicle’s condition; the development will also help us deliver information for better tactical decisions in mission systems. This development has been implemented using open software, allowing us to measure the amount of data registered and filter only the relevant data for mission systems, which avoids communication bottlenecks. The on-board pre-analysis will help to conduct condition-based maintenance approaches and fault forecasting using the on-board uploaded fault models, which are trained off-board using the collected data.
Conference Paper
Full-text available
Nowadays, military land vehicles are equipped with various sensors and effectors. These subsystems are often either not connected or are only connected via manufacturer-specific interfaces. This situation complicates efficient changes and extensions of vehicle equipment since modifications are only possible with the help of the original manufacturer. In order to unify subsystem interfaces and to increase interoperability, various open architecture initiatives have been started by national and multinational organisations. Open architectures offer many advantages such as simplified integration. However, it also requires new approaches to verify that an implementation complies with the architecture specification. This issue has already been identified by some initiatives such as the NATO Generic Vehicle Architecture (NGVA) but considerations are at the conceptual level at the moment. No tools for interface verification are implemented though. This paper describes the realisation of a conformance test system for the data exchange of NGVA-based vehicle subsystems. The test system allows verifying NGVA Data Model compliance of subsystems with respect to their interfaces and behaviour by means of stimulation with valid and inopportune messages. For this purpose, the test system architecture and test categories are derived from the NGVA specification. Furthermore, test cases for an example NGVA system are discussed.
Conference Paper
NATO. AEP-4754 NATO Generic Vehicle Architecture (NGVA) for Land Systems, Volume 2: NGVA Power Infrastructure, Edition A, Version 1, Study Draft 1. NATO Standardization Office (NSO), Aug. 2015.
Ministry of Defence. Defence Standard 23-09, Generic Vehicle Architecture, Issue 3. Defence Equipment and Support, UK Defence Standardization, Mar. 2013.
Mike Southworth. Delivering VICTORY and PNT Hub Services for Tactical Ground Vehicle Architecture. White Paper. Curtiss Wright Defense Solutions, Oct. 2015.
ISO. ISO/IEC/IEEE 29148:2011, Systems and software engineering – Life cycle processes – Requirements engineering. International Organization for Standardization, 2011. url: http://www.iso.org/iso/catalogue_detail.htm?csnumber=45171.
Ministry of Defence. Defence Standard 23-09, Generic Vehicle Architecture, Issue 1. Defence Equipment and Support, UK Defence Standardization, Aug. 2010.
Stephen Walters and Cara Perks. Test, Evaluation & Acceptance Best Practice. White Paper. BMT Group, 2011.
ISO. ISO/IEC/IEEE 15288:2008, Systems and software engineering – System life cycle processes. International Organization for Standardization, 2008. url: http://www.iso.org/iso/catalogue_detail?csnumber=63711.
NATO. STANAG 4754, NATO Generic Vehicle Architecture (NGVA) for Land Systems, Edition 1, Study Draft. NATO Standardization Office (NSO), Aug. 2015.