Global Sources
EE Times-Asia

Adopting aerospace verification standards (Part 1)

Posted: 19 Sep 2013

Keywords: DO-178C, avionics, MISRA, verification, high-level languages

Functional testing (figure) is only as good as the requirements against which the tests have been developed. Studies such as the Chaos Report [6] repeatedly show that a huge number of software projects fail (meaning they overran on cost or time, or did not fully meet the user's needs), and that a major cause of failure is problems with requirements, whether through overwhelming complexity, ambiguous and imprecise definition, or scope creep across the life of the project. The other disadvantage of using only functional testing is the obvious precondition that the system (or sub-system) under test must be coded and functional before testing can begin.

Figure: Typical functional testing harness.

To gain the benefits of early testing, many aerospace companies now employ iterative development processes, a shift that has contributed to an improvement in failure rates in recent years. Iterative development focuses on system subsets and is modular in design. An appropriate technique, typically called unit testing, is a bottom-up process that focuses on system internals, such as classes and individual functions. Not only does unit testing facilitate early stage or prototype development, it can also be used to cover the paths and branches in the software that may be unpredictable or are otherwise impractical to exercise from a functional testing perspective (e.g., error handlers).

By definition, unit testing aims to verify a small portion of the whole system, an incomplete portion that cannot execute independently. Therefore, test drivers and harnesses are required to deliver input values, record outputs, stub missing functionality, and build an executable environment encompassing everything. Immediately, we begin to understand why unit testing is under-used by up to 90% of software engineers:

- There is a huge overhead associated with manually creating test scripts as well as maintaining these elements whenever there are changes to requirements, design or code.
- The test scripts, harnesses and drivers are also software and are thus prone to the same failings of any manually created software.
- The component to be tested has been implemented using language features, such as data hiding, which make it very difficult to provide input values or verify outputs.
- The lack of a unified and structured method means that techniques are applied on a project-by-project basis with little opportunity for reuse via industry-wide standards.

Many of the problems associated with the implementation of traditional manual unit testing processes are concerned with the high skill levels required and the considerable additional overhead that such techniques can impose.

Automating these processes with tools makes the techniques more standardised yet intuitive, goals that are highly desirable for their potential benefits of increased efficiency and reduced costs. Automation also permits the development of repeatable processes and the standardisation of testing practices. Often tools capture and store complete test information that can be held in a configuration management system with the corresponding source code, then retrieved and imported at a later date for instant regression testing.

What's gained from functional and unit testing is proof that software satisfies its requirements and that errors have been removed. What we don't yet know is how complete the testing effort has been. That's where source and object code verification comes in.

References
1. RTCA Inc. (originally the Radio Technical Commission for Aeronautics) is a private, not-for-profit corporation that develops consensus-based recommendations regarding communications, navigation, surveillance, and air traffic management (CNS/ATM) system issues.
2. EUROCAE, the European Organisation for Civil Aviation Equipment, is a nonprofit organisation which provides a European forum for resolving technical problems with electronic equipment for air transport.
3. The Motor Industry Software Reliability Association (MISRA) is a collaboration between vehicle manufacturers, component suppliers and engineering consultants which seeks to promote best practice in developing safety-related electronic systems in road vehicles.
4. "Guidelines for the use of the C language in critical systems", first published by MISRA Limited in October 2004 and again, after comprehensive revision, in March 2013. These standards are complete reworks of the original set published in 1998.
5. "Joint Strike Fighter (JSF) Air Vehicle (AV) C++ Coding Standards for the System Development and Demonstration Program", document number 2RDU00001 Rev D, June 2007. These standards build on relevant portions of the MISRA-C standards with an additional set of rules specific to the appropriate use of C++ language features (e.g., inheritance, templates, namespaces) in safety-critical environments.
6. The Chaos Report from the Standish Group has been regularly published since 1994. The 2006 report revealed that 35% of software projects could be categorised as successful, meaning they were completed on time, on budget and met user requirements. This is a marked improvement over 1994 when only 16.2% of projects were labelled as successful.
7. The Orion Crew Exploration Vehicle (CEV) is a spacecraft currently under development by NASA; the contract for its design and construction was awarded to Lockheed Martin in August 2006.

About the authors
Mark Pitchford has over 25 years' experience in software development for engineering applications. He has worked on many significant industrial and commercial projects in development and management, both in the UK and internationally, including extended periods in Canada and Australia. Since 2001, he has specialised in software test, and works throughout Europe and beyond as a Field Applications Engineer with LDRA Ltd.

Bill St. Clair is currently Director, US Operations for LDRA Technology and LDRA Certification Services and has more than 25 years in embedded software development and management. He has worked in the avionics, defence, space, communications, industrial controls, and commercial industries as a developer, verification engineer, manager, and company founder. He holds a U.S. patent for a portable storage system and is inventor of a patent-pending embedded requirements verification system. Bill's leadership was instrumental in adapting requirements traceability into LDRA's verification process.


