Spending Too Much $$ on UAV DO-178C & DO-254 Compliance? New AFuzion Paper Explains and Simplifies
In the “Good Old Days”, the avionics guideline DO-178B was only informally applied to unmanned systems. But first, let’s clarify: in technology, the “Good Old Days” means 10-20 years ago; in the world of unmanned systems, it means 5-10 years ago. However, the Good Old Days were rarely as good as they seemed, and the Good Old Days of Unmanned Systems brought disadvantages along with those seeming advantages.
AFuzion recently hired four new engineers whose sole focus is UAV certification/compliance. The full 12-page technical paper is available for free download here: Click Here to Download Free 12-Page UAV Certification Paper from AFuzion
DO-178B has been replaced by DO-178C, which is now increasingly applied to (and in a growing number of cases, required for) airborne avionics within Unmanned Systems. DO-178C is never cheap, certainly not on the first project. And in clear cases outlined herein, DO-178C can increase costs above DO-178B, which itself already increased initial avionics development costs by 20-40%. But is DO-178C really “too” expensive? Doesn’t it reduce costs over DO-178B for companies that were doing it “right”? Do DO-178C’s benefits outweigh its costs over the lifetime of an Unmanned System? Does DO-178C reduce long-term unmanned costs at the expense of increased development cost? Will DO-178C improve UAV safety and reliability and, if so, to what degree? Exactly what benefits are received from complying with DO-178C? These important questions are answered in the AFuzion paper (available by download); an all-too-brief introduction follows.
A popular myth is that DO-178C is expensive when applied to UAVs. Indeed, DO-178C is not “cheap,” as the additional costs shown above make clear. Level D certified software still generally requires full planning, high-level requirements, implementation, reviews, and full functional testing of all high-level requirements with traceability applied. Additionally, configuration management, quality assurance, and DER liaison are applied at Level D. Yet the cost of Level D is just 15% higher than that of medium-quality consumer/financial software developed per typical CMMI Level 2-3 processes. Why? Because Level D consists almost entirely of normal, industry-standard software engineering principles. Also, many companies new to DO-178C believe their prior, existing efforts, including planning, requirements, designs, tests, and reviews, must be redone: not true. In fact, a DO-178 Gap Analysis examines the existing processes and work products, reuses them where possible, and identifies the “gaps” that must be closed to fulfill DO-178’s requirements. (The author has performed dozens of successful DO-178 Gap Analyses, each taking 2-4 weeks, and they form the basis of the data herein.)
DO-178 also requires reviews (generally independent for Level A and largely independent for Level B) of all software development processes and artifacts. DO-178 reviews must follow approved DO-178 checklists to ensure all required aspects are properly reviewed. To save money and time, many companies use automated project management/review tools.
Another myth is that the most significant software cost escalation occurs when moving from DO-178 Level B criticality to Level A. Untrue, but for an interesting reason. The single largest difference between a Level A system and a Level B system is the 100x greater reliability required of the Level A system per ARP-4761A. However, that 100x reliability must come primarily from the system/hardware architecture, not from the software. How? Added redundancy: the only way to meet a Level A function’s reliability target is via increased hardware/system redundancy, which of course greatly increases total UAV cost. But the software cost difference, the subject herein, between Level A and Level B is quite minor, as seen in the figures above.
The cost differential within DO-178C is most significant between Level D and Level C. Why? Level C requires the following key objectives which Level D does not, and these result in Level C requiring 35% more effort than Level D:
Testing of low-level software requirements
Ensuring 100% coverage of all source code statements (see the statement-coverage sketch after this list)
Assessment of requirements, design, and code to standards
Greater rigor placed upon reviews
In many cases more rigorous configuration management
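As a minimal sketch of what the statement-coverage objective means in practice, consider the hypothetical C fragment below. It is not taken from the AFuzion paper; the function name, the VNE_KNOTS constant, and its value are assumptions made purely for illustration.

    /* Hypothetical overspeed monitor, used only to illustrate the Level C
     * objective of executing 100% of source code statements during
     * requirements-based testing. */
    #include <stdbool.h>

    #define VNE_KNOTS 120.0f   /* assumed never-exceed speed for this example */

    bool overspeed_warning(float airspeed_knots)
    {
        bool warn = false;                     /* statement 1 */
        if (airspeed_knots > VNE_KNOTS) {
            warn = true;                       /* statement 2 */
        }
        return warn;                           /* statement 3 */
    }

    /* A single high-level-requirement test with airspeed_knots = 130.0f
     * executes statements 1, 2, and 3, i.e., 100% statement coverage,
     * even though the decision's false outcome is never exercised.
     * Exercising that false outcome is what Level B's decision coverage adds. */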
Level B requires additional structural coverage (decision coverage, i.e., every branch/decision outcome in the source code), additional independence in reviews, and tighter configuration management. At first glance, it would seem that Level B should be significantly more expensive than Level C, e.g., 50-70% more. In theory that makes sense, but as in many areas of life, common sense overcomes theory. At Level B (and Level C) there must be detailed, low-level software requirements, and they must be thoroughly tested. Remember, DO-178C requires detailed low-level requirement verification beginning at Level C, and those low-level requirements cover the vast majority of software logic decisions. During requirements-based testing, most (80-90%+) of the branches in the source code are already covered and hence require no additional structural coverage testing, provided test capture and coverage tools are appropriately used during that functional testing. Therefore, the seemingly significant cost increase of Level B structural coverage versus Level C is already mitigated by DO-178C’s greatly enhanced requirements-based testing. Also, quality software engineering organizations already incorporate a semi-automated and streamlined process that includes independent reviews and tight configuration management; ergo the added cost of those aspects at Level B is largely absorbed. The reader is well-advised to undergo upfront DO-178 Training and DO-178 Process Improvement to leverage these cost-reduction techniques.
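To make that point concrete, here is a hedged sketch, with hypothetical low-level requirements and C code that are not from the paper, showing how two requirements-based test cases already drive the lone decision to both outcomes, so the Level B decision-coverage objective is met as a by-product when a coverage tool records the functional test runs.

    /* Hypothetical low-level requirements, for illustration only:
     *   LLR-1: If airspeed exceeds VNE, the overspeed warning shall be set.
     *   LLR-2: If airspeed is at or below VNE, the warning shall remain clear.
     * Testing both LLRs exercises the decision below as true and as false,
     * yielding 100% decision (branch) coverage with no added structural tests. */
    #include <assert.h>
    #include <stdbool.h>

    #define VNE_KNOTS 120.0f                   /* assumed example limit */

    static bool overspeed_warning(float airspeed_knots)
    {
        bool warn = false;
        if (airspeed_knots > VNE_KNOTS) {      /* decision: observed true AND false at Level B */
            warn = true;
        }
        return warn;
    }

    int main(void)
    {
        assert(overspeed_warning(130.0f) == true);   /* LLR-1 test: decision taken true  */
        assert(overspeed_warning(110.0f) == false);  /* LLR-2 test: decision taken false */
        return 0;
    }

Built with a coverage-instrumenting option (for example, gcc’s --coverage) and run, these two requirements-based tests already record both branch outcomes, which is exactly why the Level B versus Level C structural-coverage delta is smaller than it first appears.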
Now, “efficient” followers of DO-178B for UAVs will find even greater cost impact in adhering to DO-178C for the newer generation of UAVs. Their “efficiency” may have been due to taking liberties with DO-178B’s intended, but less enforced, low-level requirement detail. Such shortcuts enabled less detailed functional testing with many fewer logic branches verified. While acceptable for DO-178B Level C, that is unacceptable for DO-178C Level C, which mandates greater detail in low-level requirements. Because of that greater detail, DO-178C inadvertently reduced the difference between Level C and Level B: the decision-coverage objective of Level B is largely satisfied already by Level C’s more detailed, requirements-based testing. Also, developers making extensive use of Parameter Data Items (configuration data, such as databases or parameter tables, that influences software behavior without modifying the executable code) are now required under DO-178C to fully document, review, trace, and test all of that data, something they “should” have been doing under the intent of DO-178B. Voilà.
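For readers unfamiliar with Parameter Data Items, the following is a loose sketch of the kind of configuration table DO-178C expects to be documented, reviewed, traced, and verified as its own life-cycle data item. The structure name, fields, and values are hypothetical, chosen only to illustrate the concept.

    /* Hypothetical Parameter Data Item: a configuration table that tailors the
     * application's behavior without modifying its executable code. Under
     * DO-178C such data carries its own requirements, reviews, traceability,
     * and verification rather than riding along informally with the source. */
    #include <stdint.h>

    typedef struct {
        float    vne_knots;          /* never-exceed speed limit      */
        float    max_bank_deg;       /* maximum commanded bank angle  */
        uint16_t telemetry_rate_hz;  /* downlink rate                 */
    } flight_limits_pdi_t;

    /* One verified instance of the PDI for a specific UAV configuration;
     * the values are illustrative only. */
    const flight_limits_pdi_t FLIGHT_LIMITS = {
        .vne_knots         = 120.0f,
        .max_bank_deg      = 30.0f,
        .telemetry_rate_hz = 10u,
    };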
To better understand how to apply reuse to UAVs for DO-178C and DO-254 compliance, you need to understand your current DO-178C and DO-254 gaps by performing a UAV Gap Analysis per DO-178C/DO-254. AFuzion shows you how in a 1-minute video and also explains UAV DO-178C/DO-254 Gap Analysis; free info and video here: Click here for UAV DO-178C and DO-254 Gap Analysis and 1-minute Video
GET IN TOUCH
Jack Jones
AFuzion Inc
Release ID: 262567