Kobus’ musings

The ramblings of an embedded fool.

A More Agile DO-178

When looking at software development methodologies, there is in my mind a spectrum, with strict waterfall development at one end and pure agile development at the other. It looks something like this:

So what’s wrong with the extremes of waterfall or agile development, especially for safety-critical products? Well, I think Dilbert may have some advice to share here. Waterfall development typically suffers from the following:

DILBERT © 2001 Scott Adams. Used By permission of UNIVERSAL UCLICK. All rights reserved.


With waterfall development, a lot of unnecessary requirements get specified, because the client is scared that if he doesn’t name every possible use case his product could be used for, he won’t have another chance. The result is bloat, increased cost and increased product complexity, none of which is great for safety-critical products.

Ok, so agile should be perfect then?

DILBERT © 1997 Scott Adams. Used By permission of UNIVERSAL UCLICK. All rights reserved.


Nope: agile struggles to guarantee that the product is safe. Agile tries to shorten the feedback cycle, to “fail fast” or “move fast and break things”. Well, when you are developing software for an aircraft, you can’t exactly crash an airplane every time you release and then quickly fix the bug, even if you do it fast…

Ok, so up to now most DO-178 development has been done with waterfall methodologies, but what might a more agile DO-178 development process look like?

To answer this question, we have to look at the deliverables required for DO-178 certification, and at what stages of the waterfall development model they are typically produced:

Organisational:
(These can be re-used across multiple projects, if the projects are similar enough of course)

  • Software configuration management plan (SCMP)
  • Software quality assurance plan (SQAP)
  • Software requirements standards (SRS)
  • Software design standards (SDS)
  • Software code standards (SCS)
  • Software verification plan (SVP)

At the start of a project - Specification phase:

  • Plan for software aspects of certification (PSAC)
  • Software development plan (SDP)
  • Software requirements data (SRD)
  • Design description (DD)

During development - Implementation phase:

  • Source code
  • Object code
  • Software Verification Cases and Procedures (SVCP)
  • Problem reports
  • Software Quality Assurance Records (SQAR)
  • Software Configuration Management Records (SCMR)

At the end of a project - Testing and verification phase:

  • Software Verification Results (SVR)
  • Software Life Cycle Environment Configuration Index (SECI)
  • Software Configuration Index (SCI)
  • Software Accomplishment Summary (SAS)

The problem here is that a lot of the deliverables are generated at the start of a project, before the lessons have been learned, and a lot of the deliverables are generated at the end of a project, where they do not keep pace with the development of the software. As such they represent a significant source of cost, because they have to be produced and verified manually.

Most of these deliverables also usually take the form of documentation, with the exception of the source code and object code. DO-178 does not specifically state that the outputs have to be in the form of documentation, and where possible we will try to replace traditionally labour-intensive documentation with other artefacts, saving us effort and cost. Of course we must prove that there is no reduction in the reliability of the final product when making these changes.

The organisational deliverables are not my concern here, as once these have been generated they can be re-used across multiple projects. But let’s see if we can get some of the project-specific deliverables generated and verified continuously and automatically during development, using the following:

Scrum

During sprint planning and review sessions, we can review and update the Design Description (DD) document detailing the software architecture. At the end of a sprint, the implemented user stories will form the Software Requirements Data (SRD). It will look something like this…

During sprint planning we ensure that the user stories, i.e. the high-level requirements we are planning to implement, are consistent with the previously implemented requirements. During sprint review we update the requirements with the implemented user stories, and ensure that the newly created functional tests and unit tests, i.e. the low-level requirements, are consistent with the high-level requirements (user stories).

This turns the traditional waterfall model on its head. How can you write code and generate the requirements only afterwards? Well, a lot of the time you only realise what the true requirements are when you are writing the code, so we only set the requirements in stone once we are sure. We still generate user stories as candidate requirements before writing code.
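As a concrete, purely illustrative sketch of how a user story could live next to the code as an executable high-level requirement, consider the following, assuming a Python project tested with pytest. The `hlr` marker, the requirement identifiers and the tiny stand-in function are all made up for this example; none of it is prescribed by DO-178.

```python
import pytest

# Stand-in implementation, only to make the sketch self-contained:
# a hypothetical altitude-alert predicate.
def altitude_alert(agl_ft: float) -> bool:
    return agl_ft < 500.0

# Hypothetical "hlr" marker (registered in conftest.py, sketched further below):
# the user story and its identifier are attached to the functional test that
# demonstrates it, so the high-level requirements section of the SRD can be
# assembled automatically at sprint review.
@pytest.mark.hlr(
    id="HLR-012",
    story="As a pilot, I want an altitude alert when descending below 500 ft AGL.",
)
def test_altitude_alert_triggers_below_500_ft():
    assert altitude_alert(agl_ft=450.0)

@pytest.mark.hlr(
    id="HLR-012",
    story="As a pilot, I want no altitude alert at or above 500 ft AGL.",
)
def test_no_altitude_alert_at_cruise_altitude():
    assert not altitude_alert(agl_ft=3000.0)
```

Only once such a test passes and survives sprint review would the user story be promoted into the SRD as a high-level requirement.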

Continuous integration (CI) and Continuous deployment (CD)

The CI and CD servers themselves are effectively the Software Life Cycle Environment Configuration Index (SECI), the Software Configuration Index (SCI) and parts of the Software Configuration Management Records (SCMR) deliverables. For this to be possible, the CI server must include a copy of the version control database when it is duplicated for certification.

This means an agile setup might look as follows:

If all the tests pass, the CI/CD server will automatically generate a snapshot of itself (VMs or some other means of duplication) and of the version control database, to serve as the SECI and SCI. It will also generate reports of the tests run and their results to serve as the SVCP and SVRs, and generate reports that can serve as SCMR items (baselines, commit histories, etc.). This is of course highly idealistic and represents no small amount of additional work, but the purpose here is to show that these required DO-178 deliverables are fairly repetitive and thus highly automatable.
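To make that slightly more concrete, below is a minimal sketch of the kind of post-test step meant here: a small script a CI job might run after a green build to capture SCI and SECI raw material and tag a baseline for the SCM records. It assumes a git repository and a Python tool chain; the file names, JSON layout and tagging scheme are invented for the illustration, and snapshotting the CI server itself (the VMs mentioned above) would sit on top of this.

```python
"""Sketch of a post-test CI/CD step, run only after all tests have passed:
record the exact source revision (SCI material) and the build environment
(SECI material), and tag a baseline for the SCM records. File names, JSON
layout and the tag scheme are invented for this illustration."""
import json
import platform
import subprocess
import sys
from datetime import datetime, timezone

def git(*args: str) -> str:
    # Thin wrapper around the git command line.
    return subprocess.run(["git", *args], check=True,
                          capture_output=True, text=True).stdout.strip()

def main() -> None:
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")

    # Software Configuration Index material: which sources were built and tested.
    sci = {
        "commit": git("rev-parse", "HEAD"),
        "branch": git("rev-parse", "--abbrev-ref", "HEAD"),
    }

    # Software Life Cycle Environment Configuration Index material: the tools used.
    seci = {
        "platform": platform.platform(),
        "python": sys.version,
        "packages": subprocess.run([sys.executable, "-m", "pip", "freeze"],
                                   capture_output=True, text=True).stdout.splitlines(),
    }

    with open(f"sci-{stamp}.json", "w") as report:
        json.dump(sci, report, indent=2)
    with open(f"seci-{stamp}.json", "w") as report:
        json.dump(seci, report, indent=2)

    # Baseline for the SCM records: an annotated tag marking the passing build.
    git("tag", "-a", f"baseline-{stamp}", "-m", "CI baseline: all tests passed")

if __name__ == "__main__":
    main()
```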

Test driven development

Test-driven development can be used to generate large parts of the Software Verification Cases and Procedures (SVCPs) and Software Verification Results (SVRs). High-level requirements will be developed and tested with feature-driven development, and unit tests will be used to develop and test low-level requirements. But not all unit tests are really low-level requirements, for instance a test checking whether a function can handle null pointer parameters. As such, we will mark in the unit testing code itself which unit tests are indeed low-level requirements.
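A minimal sketch of what that marking could look like, again assuming pytest: a hypothetical `llr` marker names the unit tests that are low-level requirements and the high-level requirement they trace to, while robustness-only tests (such as the null-parameter example above) stay unmarked. The function under test and all identifiers are made up for the illustration.

```python
import math
import pytest

# Stand-in unit under test, purely to make the sketch self-contained:
# a hypothetical barometric pressure-to-altitude conversion.
def pressure_to_altitude_ft(pressure_hpa):
    if pressure_hpa is None or pressure_hpa <= 0.0:
        raise ValueError("invalid pressure sample")
    # Simplified ISA barometric formula.
    return 145366.45 * (1.0 - (pressure_hpa / 1013.25) ** 0.190284)

# This unit test IS a low-level requirement: it carries the hypothetical "llr"
# marker and names the high-level requirement it traces to.
@pytest.mark.llr(id="LLR-ALT-003", traces_to="HLR-012")
def test_standard_sea_level_pressure_gives_zero_altitude():
    assert math.isclose(pressure_to_altitude_ft(1013.25), 0.0, abs_tol=1.0)

# This unit test is NOT a low-level requirement: it is only a robustness check
# on a "null" input (the example from the text), so it is left unmarked.
def test_missing_pressure_sample_is_rejected():
    with pytest.raises(ValueError):
        pressure_to_altitude_ft(None)
```

At sprint review, only the marked tests would feed the low-level requirements section of the DD; the unmarked ones remain ordinary robustness tests.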

The relationship between the continuous integration (CI) server and the continuous deployment (CD) server follows the popular test pyramid developed by Mike Cohn: the CI server is responsible for making sure the source code compiles at all times and passes all unit tests, and the CD server is responsible for making sure the automated functional tests pass at all times. One would expect to develop a lot more unit tests than functional tests, thereby limiting (but not eliminating) the need for expensive manual testing.
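Because both kinds of tests carry requirement markers, the traceability evidence that the tables below rely on (low-level requirements traced to high-level requirements, and both traced to tests) could be harvested on every CI run instead of being maintained by hand. The `conftest.py` sketch below shows one way this might look; it assumes pytest, reuses the hypothetical `hlr`/`llr` markers from the earlier examples, and the CSV format is purely illustrative.

```python
# conftest.py: register the hypothetical markers and dump a traceability matrix
# on every test run. The CSV layout is invented for this illustration.
import csv

def pytest_configure(config):
    # Register the custom markers so pytest does not warn about unknown marks.
    config.addinivalue_line(
        "markers", "hlr(id, story): functional test demonstrating a high-level requirement")
    config.addinivalue_line(
        "markers", "llr(id, traces_to): unit test that is a low-level requirement traced to an HLR")

def pytest_collection_modifyitems(config, items):
    # Walk the collected tests and record which requirement each one carries.
    rows = []
    for item in items:
        hlr = item.get_closest_marker("hlr")
        llr = item.get_closest_marker("llr")
        if hlr is not None:
            rows.append(["HLR", hlr.kwargs.get("id", ""), "", item.nodeid])
        if llr is not None:
            rows.append(["LLR", llr.kwargs.get("id", ""),
                         llr.kwargs.get("traces_to", ""), item.nodeid])
    with open("traceability_matrix.csv", "w", newline="") as report:
        writer = csv.writer(report)
        writer.writerow(["level", "requirement_id", "traces_to", "test"])
        writer.writerows(rows)
```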

And what will the workflow differences look like between a waterfall and an agile DO-178 project? The following represents a very simplified project workflow, but will hopefully give you an idea.

Every blue item represents a stage gate that has to be satisfied before the team can continue to the next set of items.

*Sprint review includes a retrospective, a code review, and a review of the functional and unit tests.

Going back to the DO-178 specification: it lists, in its annex, the objectives to be satisfied by each process and the deliverables that result. The purpose of an agile process would then be to automate the verification of as many of these objectives as possible, in order to speed up the certification and re-certification of the product. The difference between how a waterfall and an agile workflow satisfy these objectives then looks as follows (DO-178C wording used):

Table A-1: Software Planning Process

Objective 1: The activities of the software life cycle processes are defined.
Output: Plan for Software Aspects of Certification; Software Development Plan; Software Verification Plan; Software Configuration Management Plan; Software Quality Assurance Plan
Agile strategy: An agile process is defined.

Objective 2: The software life cycle(s), including the inter-relationships between the processes, their sequencing, feedback mechanisms, and transition criteria, is defined.
Agile strategy: An agile process is defined.

Objective 3: Software life cycle environment is selected and defined.
Agile strategy: The CI server is defined as the software life cycle environment and is itself a final deliverable.

Objective 4: Additional considerations are addressed.
Agile strategy: No difference.

Objective 5: Software development standards are defined.
Output: Software Requirements Standards; Software Design Standards; Software Code Standards
Agile strategy: No difference.

Objective 6: Software plans comply with this document.
Output: Software Quality Assurance Records; Software Verification Results
Agile strategy: The SVRs are now automatically generated by the unit tests and the CD server, with a small section still produced manually through manual testing.

Objective 7: Development and revision of software plans are coordinated.
Output: Software Quality Assurance Records; Software Verification Results
Agile strategy: No difference.


Table A-2: Software Development Processes

Objective 1: High-level requirements are developed.
Output: Software Requirements Data
Agile strategy: At the end of each sprint, the implemented user stories generate the high-level requirements section of the SRD.

Objective 2: Derived high-level requirements are defined and provided to the system processes, including the system safety assessment process.
Output: Software Requirements Data
Agile strategy: At the end of each sprint, the implemented derived user stories generate the high-level requirements section of the SRD. There is a problem here, in that the system safety assessment process requires the high-level requirements as input and determines the DO-178 level required, yet in an agile process the high-level requirements are not all defined at the beginning of the project.

Objective 3: Software architecture is developed.
Output: Design Description
Agile strategy: At the beginning of each sprint the software architecture is reviewed; at the end of each sprint the software architecture document (DD) is updated.

Objective 4: Low-level requirements are developed.
Output: Design Description
Agile strategy: At the end of each sprint, the implemented unit tests marked as low-level requirements generate the low-level requirements section of the DD.

Objective 5: Derived low-level requirements are defined and provided to the system processes, including the system safety assessment process.
Output: Design Description
Agile strategy: At the end of each sprint, the implemented unit tests marked as low-level requirements generate the low-level requirements section of the DD.

Objective 6: Source code is developed.
Output: Source Code
Agile strategy: No difference.

Objective 7: Executable Object Code and Parameter Data Item Files, if any, are produced and loaded in the target computer.
Output: Executable Object Code
Agile strategy: No difference.


Table A-3: Verification of Outputs of Software Requirements Process

Objective 1: High-level requirements comply with system requirements.
Output: Software Verification Results
Agile strategy: At the beginning of each sprint, the suitability of the user stories to be implemented is evaluated against the system requirements.

Objective 2: High-level requirements are accurate and consistent.
Output: Software Verification Results
Agile strategy: User stories are reviewed for accuracy and consistency.

Objective 3: High-level requirements are compatible with target computer.
Output: Software Verification Results
Agile strategy: User stories verified with continuous deployment and functional tests.

Objective 4: High-level requirements are verifiable.
Output: Software Verification Results
Agile strategy: User stories verified with continuous deployment and functional tests.

Objective 5: High-level requirements conform to standards.
Output: Software Verification Results
Agile strategy: User stories are reviewed for conformance to standards.

Objective 6: High-level requirements are traceable to system requirements.
Output: Software Verification Results
Agile strategy: No difference.

Objective 7: Algorithms are accurate.
Output: Software Verification Results
Agile strategy: No difference.


Table A-4: Verification of Outputs of Software Design Process

Objective 1: Low-level requirements comply with high-level requirements.
Output: Software Verification Results
Agile strategy: Newly written unit tests marked as low-level requirements are annotated with the high-level requirement they trace to, and are reviewed at every sprint review.

Objective 2: Low-level requirements are accurate and consistent.
Output: Software Verification Results
Agile strategy: Unit tests are reviewed for accuracy and consistency.

Objective 3: Low-level requirements are compatible with target computer.
Output: Software Verification Results
Agile strategy: Unit tests verified with continuous integration.

Objective 4: Low-level requirements are verifiable.
Output: Software Verification Results
Agile strategy: Unit tests verified with continuous integration.

Objective 5: Low-level requirements conform to standards.
Output: Software Verification Results
Agile strategy: Unit tests are reviewed for conformance to standards.

Objective 6: Low-level requirements are traceable to high-level requirements.
Output: Software Verification Results
Agile strategy: Unit tests marked as low-level requirements are annotated with the high-level requirement they trace to.

Objective 7: Algorithms are accurate.
Output: Software Verification Results
Agile strategy: Accuracy can be verified with unit tests.

Objective 8: Software architecture is compatible with high-level requirements.
Output: Software Verification Results
Agile strategy: Software architecture is reviewed and updated with every sprint.

Objective 9: Software architecture is consistent.
Output: Software Verification Results
Agile strategy: Software architecture is reviewed and updated with every sprint.

Objective 10: Software architecture is compatible with target computer.
Output: Software Verification Results
Agile strategy: Software architecture verified with continuous deployment and functional tests.

Objective 11: Software architecture is verifiable.
Output: Software Verification Results
Agile strategy: Software architecture verified with continuous deployment and functional tests.

Objective 12: Software architecture conforms to standards.
Output: Software Verification Results
Agile strategy: Software architecture is reviewed and updated with every sprint.

Objective 13: Software partitioning integrity is confirmed.
Output: Software Verification Results
Agile strategy: Software partitioning integrity verified with continuous deployment and functional tests.


Table A-5: Verification of Outputs of Software Coding and Integration Process

Objective 1: Source code complies with low-level requirements.
Output: Software Verification Results
Agile strategy: Verified with unit tests.

Objective 2: Source code complies with software architecture.
Output: Software Verification Results
Agile strategy: Can be confirmed with sprint code reviews or pair programming.

Objective 3: Source code is verifiable.
Output: Software Verification Results
Agile strategy: Verified with unit tests and functional tests.

Objective 4: Source code conforms to standards.
Output: Software Verification Results
Agile strategy: Can be confirmed with sprint code reviews or pair programming.

Objective 5: Source code is traceable to low-level requirements.
Output: Software Verification Results
Agile strategy: Verified with continuous integration and unit tests.

Objective 6: Source code is accurate and consistent.
Output: Software Verification Results
Agile strategy: Can be confirmed with sprint code reviews or pair programming.

Objective 7: Output of software integration process is complete and correct.
Output: Software Verification Results
Agile strategy: Software integration process verified with continuous deployment and functional tests.

Objective 8: Parameter Data Item File is correct and complete.
Output: Software Verification Cases and Procedures; Software Verification Results
Agile strategy: Parameter Data Item File verified with continuous deployment and functional tests.

Objective 9: Verification of Parameter Data Item File is achieved.
Output: Software Verification Results
Agile strategy: Parameter Data Item File verified with continuous deployment and functional tests.


Table A-6: Testing of Outputs of Integration Process

Objective 1: Executable Object Code complies with high-level requirements.
Output: Software Verification Cases and Procedures; Software Verification Results
Agile strategy: User stories verified with continuous deployment and functional tests.

Objective 2: Executable Object Code is robust with high-level requirements.
Output: Software Verification Cases and Procedures; Software Verification Results
Agile strategy: User stories verified with continuous deployment and functional tests.

Objective 3: Executable Object Code complies with low-level requirements.
Output: Software Verification Cases and Procedures; Software Verification Results
Agile strategy: Verified with continuous integration and unit tests.

Objective 4: Executable Object Code is robust with low-level requirements.
Output: Software Verification Cases and Procedures; Software Verification Results
Agile strategy: Verified with continuous integration and unit tests.

Objective 5: Executable Object Code is compatible with target computer.
Output: Software Verification Cases and Procedures; Software Verification Results
Agile strategy: User stories verified with continuous deployment and functional tests.


Table A-7: Verification of Verification Process Results

Objective 1: Test procedures are correct.
Output: Software Verification Cases and Procedures
Agile strategy: Sprint review of unit tests and functional tests.

Objective 2: Test results are correct and discrepancies explained.
Output: Software Verification Results
Agile strategy: Sprint review of unit tests and functional tests.

Objective 3: Test coverage of high-level requirements is achieved.
Output: Software Verification Results
Agile strategy: Sprint review of unit tests and functional tests.

Objective 4: Test coverage of low-level requirements is achieved.
Output: Software Verification Results
Agile strategy: Sprint review of unit tests and functional tests.

Objective 5: Test coverage of software structure (modified condition/decision coverage) is achieved.
Output: Software Verification Results
Agile strategy: No difference.

Objective 6: Test coverage of software structure (decision coverage) is achieved.
Output: Software Verification Results
Agile strategy: No difference.

Objective 7: Test coverage of software structure (statement coverage) is achieved.
Output: Software Verification Results
Agile strategy: No difference.

Objective 8: Test coverage of software structure (data coupling and control coupling) is achieved.
Output: Software Verification Results
Agile strategy: No difference.

Objective 9: Verification of additional code, that cannot be traced to source code, is achieved.
Output: Software Verification Results
Agile strategy: No difference.


Table A-8: Software Configuration Management Process

Objective 1: Configuration items are identified.
Output: SCM Records
Agile strategy: No difference.

Objective 2: Baselines and traceability are established.
Output: Software Configuration Index; SCM Records
Agile strategy: Baselines are generated by cloning the CI/CD server; traceability is checked not with manually documented traceability matrices as is usual, but by verifying the traceability between annotated unit tests, functional tests and user stories.

Objective 3: Problem reporting, change control, change review, and configuration status accounting are established.
Output: Problem Reports; SCM Records
Agile strategy: No difference.

Objective 4: Archive, retrieval, and release are established.
Output: SCM Records
Agile strategy: No difference.

Objective 5: Software load control is established.
Output: SCM Records
Agile strategy: No difference.

Objective 6: Software life cycle environment control is established.
Output: Software Life Cycle Environment Configuration Index; SCM Records
Agile strategy: No difference.


Table A-9: Software Quality Assurance Process

Objective 1: Assurance is obtained that software plans and standards are developed and reviewed for compliance with this document and for consistency.
Output: Software Quality Assurance Records
Agile strategy: No difference.

Objective 2: Assurance is obtained that software life cycle processes comply with approved software plans.
Output: Software Quality Assurance Records
Agile strategy: No difference.

Objective 3: Assurance is obtained that software life cycle processes comply with approved software standards.
Output: Software Quality Assurance Records
Agile strategy: No difference.

Objective 4: Assurance is obtained that transition criteria for the software life cycle processes are satisfied.
Output: Software Quality Assurance Records
Agile strategy: No difference.

Objective 5: Assurance is obtained that software conformity review is conducted.
Output: Software Quality Assurance Records
Agile strategy: No difference.


Table A-10: Certification Liaison Process

Objective 1: Communication and understanding between the applicant and the certification authority is established.
Output: Plan for Software Aspects of Certification
Agile strategy: No difference.

Objective 2: The means of compliance is proposed and agreement with the Plan for Software Aspects of Certification is obtained.
Output: Plan for Software Aspects of Certification
Agile strategy: No difference.

Objective 3: Compliance substantiation is provided.
Output: Software Accomplishment Summary; Software Configuration Index
Agile strategy: No difference.


So there you have it: a proposal for what a more agile DO-178 development process might look like. I want to make it clear that none of this was developed in a vacuum, nor is it all my own work; it was cherry-picked from various sources, which I will attribute as part of the literature study of my thesis.

The question now is: will this pass certification, and can this agile process deliver software of at least the same robustness and quality as a waterfall process? To answer this, I will be guiding two three-man student groups to complete the same software project, one group following the waterfall model and the other following the agile model. More on that in the next post (experimental design).

A lot of this post has been quite abstract, not mentioning any specific software solutions to be used during development. The next post will detail the exact solutions the students will use, in the form of the PSAC and SDP documents, giving a clearer picture of an agile DO-178 development process.

If I have missed anything or you would like to make a suggestion, kindly do so in the discussion on HN and Reddit. Comments and suggestions are very welcome.

If you are currently working on, or have in the past worked on, DO-178 projects, it would be appreciated if you would be so kind as to take part in a quick survey about the state of DO-178 development. I will release the results of this survey shortly. Thank you to everyone who has completed the survey already.