F-35 C++ coding standard [pdf] (www.stroustrup.com)
20 points by azth | 2014-04-22 16:48:40+00:00 | 72 comments




Note that military requirements are typically less stringent than those of commercial aerospace.

How are the requirements less stringent?

There are fewer regulations, many of them informal, and they answer not to the FAA but to some branch of the DoD. This makes sense, since military aircraft don't usually transport civilians or operate over populated areas.

See the software section of this document, which says that military airborne software isn't, in general, acceptable to the FAA; that implies the military is bound to weaker standards. http://www.faa.gov/documentLibrary/media/Order/ND/8110.101.p...


Previous submission and comments:

https://news.ycombinator.com/item?id=3967316


I feel like if I were a pilot with even a modicum of engineering and language experience, I'd be pretty frustrated that they aren't using something like Haskell that provides a certain level of runtime security. C++ seems like it has enough ways to generate a memory leak to sink a boat, or drop a fighter out of the sky.

While this initially seems like a good idea (and I thought things like this for a while), I'm sure they are using a real-time operating system. Imagine the garbage collector kicking in while you're trying to fire a missile. This may seem inconsequential, but reliability is everything.

https://en.wikipedia.org/wiki/Real-time_operating_system

edit: also, time/space leaks exist in Haskell.


C++ may not be the best choice, but a language like Haskell would be worse for an aviation application for a number of reasons:

1) Memory on the hardware is tightly constrained and controlled, and you need some visibility into its usage at any given time, even if that exposes you to null pointer errors and the like.

2) Along the same lines, throughput must be predictable and is tightly constrained. Anything happening at runtime that can't be predicted at compile time is, for a safety critical aviation application, a huge risk and is usually explicitly forbidden by the requirements.

3) Haskell is relatively new. C++ has stood the test of time and the DoD can be sure plenty of C++ programmers will still be around in 30+ years.

All of these are dealbreakers from the DoD's perspective. The technical risks associated with handling your own memory and not having provably safe code can be addressed with enough time and expense. Since the DoD is not short on money and operates on time scales of years or decades, this is acceptable. In a startup or academic environment, the balance of risks and resources is completely different. But believe it or not, the DoD and commercial aviation entities actually do look at the tradeoffs associated with the tools they choose and don't simply choose them out of inertia.

Source: I develop commercial jet engine software, where many of the same resource/risk tradeoffs exist.


No disagreement, but wanted to point out that groups like Galois have been using Haskell-embedded, compile-to-C DSLs in some interesting ways recently:

http://smaccmpilot.org/images/dronecon-talk.pdf

This has the potential for nice compile-time guarantees without the runtime cost (and potentially at much lower cost than full formal verification)

(also, fwiw, Haskell has been around since 1990, making it about 7 years younger than C++)


Haskell cannot give hard real-time guarantees, to my knowledge. C++ can (if you don't use some features like exceptions and dynamic_cast [1]).

[1] I should point out that dynamic_cast is a little tricky -- http://www.stroustrup.com/fdc_jcse.pdf
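As an aside, when dynamic_cast is off the table, embedded C++ commonly falls back to an explicit kind tag so that a checked downcast has a fixed, predictable cost. A generic sketch (the class names are invented, and this is not taken from the JSF standard):

    // Common pattern when RTTI/dynamic_cast is banned: each class carries an
    // explicit kind tag, so a checked downcast is a constant-time comparison.
    enum class SensorKind { Radar, Lidar };

    class Sensor {
    public:
        explicit Sensor(SensorKind k) : kind_(k) {}
        virtual ~Sensor() {}
        SensorKind kind() const { return kind_; }
    private:
        SensorKind kind_;
    };

    class RadarSensor : public Sensor {
    public:
        RadarSensor() : Sensor(SensorKind::Radar) {}
        int sweep_rate_hz() const { return 10; }
    };

    // Checked downcast with predictable cost, no RTTI required.
    inline RadarSensor* as_radar(Sensor* s) {
        return (s != nullptr && s->kind() == SensorKind::Radar)
                   ? static_cast<RadarSensor*>(s)
                   : nullptr;
    }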


You also can't call new or malloc (unless your heap has deterministic worst-case timing).
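For context, the usual alternative is to carve everything out of static storage at initialization, so any "allocation" after startup is a bounded, constant-time operation. A minimal sketch (generic illustration with made-up names, not from any particular standard):

    #include <cstddef>

    // Hypothetical fixed-block pool: every byte is reserved at initialization,
    // so allocation after startup is a constant-time pop with a known worst case.
    template <std::size_t BlockSize, std::size_t BlockCount>
    class BlockPool {
    public:
        BlockPool() : top_(BlockCount) {
            for (std::size_t i = 0; i < BlockCount; ++i) {
                free_list_[i] = &storage_[i * BlockSize];
            }
        }
        void* allocate() {               // O(1); returns nullptr when exhausted
            return (top_ > 0) ? free_list_[--top_] : nullptr;
        }
        void release(void* block) {      // O(1); caller must pass a block from this pool
            free_list_[top_++] = static_cast<unsigned char*>(block);
        }
    private:
        alignas(std::max_align_t) unsigned char storage_[BlockSize * BlockCount];
        unsigned char* free_list_[BlockCount];
        std::size_t top_;
    };

    // Usage: pool sizes are fixed at compile time, e.g. BlockPool<64, 128> message_pool;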

In addition to the other replies, Haskell lacks a proper ISO or ANSI standard.

I think that current safety-critical best practices (what real software engineers actually do) do a good job at correcting the warts inherent to the C family while respecting the need for real-time execution. Not clear what Haskell brings to the table here (or on other embedded systems in general).


That's exactly what the DoD created Ada for. Unfortunately, programmers that know Ada seem to be in short supply.

Did the poster have some point in mind, or do we just post random PDF documents on HN now?

I thought the military uses Ada?

Edit: Link, in case anyone from the discussion below is interested.

http://www.seas.gwu.edu/~mfeldman/ada-project-summary.html


The DoD has not used, and has in fact strongly dissuaded the use of, Ada for almost 20 years.

Ummm.... tell that to the people I know who work at Lockheed.

Edit: did some research, and the people I know work on projects started before the change: F-22/C-130. And it wasn't 20 years ago, more like 15, although I couldn't find an exact date.


Ada is something of a running gag in the aerospace industry at this point. Only really, really old programs still use it.

Must be really old programmers working on the Boeing 777 flight control system.

So the "backbone" of a 787 is considered a really, really old program?

http://www.adacore.com/customers/787-dreamliner-common-core-...


It was 1997 that DoD started their COTS push.

I actually worked on some of the Ada code in the F-22. The code and the SBC it ran on were designed in 1993. My understanding is the code was rewritten in C++ in about 2009. I believe the Ada code in the F-22 PICC module went the same direction as the module design evolved.


Interesting. I always figured they used C or C++, but when I found out about Ada's use I was confused because I had never heard of it.

I actually was able to watch the F-22's maiden flight back in 1997, which I guess was the same year as the C++ change.


Out of curiosity, what was the stated reason for the change, and is it all C/C++ nowadays?

As I mentioned above, it was part of the COTS kick that DoD went on sometime in 1997. They decided that having processing architectures (like the MIL-STD-1750A processor) and a programming language that were basically used only by DoD programs was too expensive and made DoD programs too dependent on a shrinking number of suppliers. It also made recruiting and retaining engineers harder. COTS, by contrast, promised to make system upgrades easier and to reduce the number of systems that needed to be replaced wholesale. Thus, they decided to start mandating that programs use COTS hardware and software as much as feasible.

C and C++ are the languages of choice for hard and firm real-time code. Java is popular for UI code, though C and C++ are still used on some older Solaris-based front-ends.


The Ada mandate got dropped in 1997, but I don't think it's fair to say they've "strongly dissuaded" its use. Defense contractors are free to keep using it, and many do (Boeing, for example).

It's not used 100%. As a mandate it disappeared a while ago; anyone that's using it is using it because of institutional momentum (rare), some cost/risk analysis, projects they've inherited, or a rare mandate from the program office. I'd be happy if it were used more often, but it's not. And I'm not sure where vonmoltke gets that its use is dissuaded; that's not been my experience. It's often that the language chosen isn't mandated and companies just don't choose it (there was too big a backlash in the 80s about the original mandate).

--------------

EDIT

The OS [1] for the F-35 and the 787 is tightly integrated with development environments for Ada and C/C++. I really don't get the Ada knocking going on; it's a solid language and a hell of a lot better than C/C++. The reason it loses is corporate culture, lack of familiarity among developers, and an inherited hate/dislike that has persisted since the 80s. For these applications it really is the better language.

[1] http://www.ghs.com/news/archive/211031l.html


They are going to downvote us because we are not discussing the newest flavor of ABC. And as we all know, ABC can do everything, and since their startup uses it, everything else is obsolete or dead.

No wonder this thing doesn't fly (as expected). In all seriousness, perhaps the choice of the language has affected the software release dates?

I'm surprised folks are downvoting my comment. See http://rt.com/usa/f35-jet-software-delay-233/ for example. And as someone who has programmed in C++ for 20 years, I think I know what I'm talking about.

You are being voted down because your comment was completely unsubstantiated. Maybe if you had thought about it a little longer you'd have figured out there aren't actually that many (any?) viable alternatives to C++ for the kind of systems programming you'll find in a fighter jet. Or, if you disagree, you could have offered some insight as to what these languages would be.

If you can't have non-deterministic behavior such as GC, you can't waste gobs of heap or stack memory, you have to be able to interface with C libraries and embedded components, you need hard real-time guarantees, and your whole development stack needs to be stable and well supported, then you end up with C or C++; it's that easy. You don't gamble by picking the hipster language du jour all the cool kids use to write websites or phone apps, but proven technology that (if used correctly) gives you reliable and predictable results, and will still be around and supported in 40 years' time when these jets are still in service.

Blaming the failure to deliver the F-35 on time and within budget on C++ is really quite a cheap shot.


C/C++ aren't the only languages that lack a GC (see Ada) or in which you can severely restrict the GC (see Rust, though it's not suitable for this sort of project yet). The OS that most of the software runs on (that is, not the software running on LRUs with their own embedded OS) offers the same APIs to Ada and C and C++. It was a choice to go with the non-Ada languages. Ada is hardly a hipster language, it's been around for 35 years and will likely be around the rest of this century (for better or for worse).

Well, that may be true, but what advantages does Ada offer over C++? I understand it may be one of the (very few) workable alternatives for this kind of system, but I can imagine it has many disadvantages compared to C++. Where do you find enough developers with Ada expertise, for example? How current (up-to-date) are Ada toolchains? How many commercial vendors can provide and support Ada toolchains? Can you expect weird incompatibilities linking between internally developed Ada code and externally provided C++ code?

Questions like this are probably why the DoD decided to move away from Ada, not problems with the language itself. Almost nobody uses Ada for anything anymore, so sticking with it would have been a risk. I agree that it's not a hipster language; that qualification was aimed at languages like Go, Rust, JavaScript, etc., which are very popular right now but completely unsuitable for writing the control software of a fighter jet.


  Well, that may be true, but what advantages does Ada offer over C++?
http://extranet.eu.adacore.com/articles/Ada%20Cpp.pdf

I had a longer section here, but then came across this. Check out the section on the type system to get a clearer version of what I wanted to post. I really think that the type system alone is (for a lot of the embedded systems I've seen/worked on) a good enough reason to switch. The rest of the language is simple enough to learn in an afternoon if you're just wanting to replace C. I don't know how many bugs I've seen that were caused by everything being an int. Even with typedefs in C, they don't restrict the range of values or give significant compile/runtime guarantees when one type is stored into another and they both happen to be ints or chars or something.
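To make the "everything is an int" point concrete in C++ terms: a plain typedef gives the compiler nothing to check, while a thin wrapper type (a very rough stand-in for Ada's distinct and ranged types; the names below are invented for illustration) turns the mix-up into a compile error:

    typedef int AltitudeFt;                 // plain typedefs: both are still just int
    typedef int AirspeedKts;

    void set_altitude(AltitudeFt) {}

    // Hypothetical "strong typedef": a distinct type the compiler can check.
    struct Altitude { int feet; };
    struct Airspeed { int knots; };

    void set_altitude_checked(Altitude) {}

    int main() {
        AirspeedKts speed = 250;
        set_altitude(speed);                    // compiles silently: wrong quantity accepted
        set_altitude_checked(Altitude{3000});   // OK: only the intended type is accepted
        // set_altitude_checked(Airspeed{250}); // would not compile: no conversion exists
        return 0;
    }

Ada gives you that distinct-type checking (plus actual range constraints) without the boilerplate, which is the point.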

  I understand it may be one of the (very few) workable alternatives
  for this kind of systems, but I can imagine has many disadvantages
  over C++. Where do you find enough developers with Ada expertise,
  for example?
Perhaps the greatest problem for Ada, I don't know. There are some colleges that use Ada in introductory courses. There are others with strong ties to the defense/aerospace industry that happen to offer Ada courses as electives. The rest is going to be recruiting from curious individuals or companies that happened to recognize its value or that got stuck with a substantial Ada codebase from the 80s/90s.

  How current (up-to-date) are Ada toolchains?

  How many commercial vendors can provide and support Ada toolchains?
Very, it's an actively developed language. Ada 2012 [1] is the most recent version, though given that I'm really encouraging this for embedded use you'd be using a subset of that. AdaCore is the primary developer of GNAT, which is current through Ada 2012. Green Hills and Wind River also support through Ada 2012.

  Can you expect weird incompatibilities linking between internally
  developed Ada code, and externally provided C++ code?
Not sure; FFI is not something that would've been used in any of the embedded systems I'd have recommended using Ada for. The language reference does specify how calls to C/C++, Fortran and COBOL code should function or be exposed within an Ada implementation. See the above PDF, it speaks a bit about FFI with C.

[1] http://www.ada-auth.org/standards/ada12.html

-----------------

A couple of years ago I could've probably written up a better response; I've all but given up on Ada in my workplace.

-----------------

EDIT: I should probably also confess a bit of an infatuation with good type systems and formal verification tools/processes. I've seen what results when these are absent, I've quit jobs because these were absent and the culture prevented any improvements. I'm not working on safety critical systems anymore, but if I ever end up back in those projects I will insist on using the right tools/processes.

I don't want to wake up one day, turn on the news and read about a crash or malfunction that was caused, or should have been prevented, by my systems. Poor processes and tools enable these failures, and we should have zero tolerance for them. We have certifications/licensing for engineers (mechanical, civil, aerospace, electrical, etc.). If they sign off on a system/design that fails catastrophically and the failure should have been detected (via models or industry standards or whatever), they are held liable. Software developers in the safety-critical systems space have too cavalier an attitude; we're too well insulated from the aftermath of our defects. That attitude may never change, but a change in tools/processes to ones that contain, minimize or eliminate large classes of failures is certainly worth the effort.


Not my downvote, but to me it's no wonder you got downvoted.

Why is it no wonder these things don't fly? Are the rules too strict? Too loose? Too broad? Too stupid? Or is it the choice of language? Your post provides no insight into your opinion at all and adds no value for anyone else, quite the contrary.


I thought the avionics software is written in Java these days.

Strongly real-time embedded systems and garbage collection don't play well together.

Thanks. I was just joking and it seems like a lot of people did not get it :(. My joke was aimed at the people who think that if there were only one programming language left in the world, it should be Java.

Ah. I'll adjust my sarcasm detector...

I stopped reading after the very first rule: "Any one function (or method) will contain no more than 200 logical source lines of code." Seriously?

What of it? It's a lot of code, but it's C. Simple things usually take more lines than in higher-level languages.

No. Actually it's C++. A very different language.

True enough, but in any case, C++ is still fairly noisy.

C++ is also not Java. When you're working with low-level hardware on real-time systems, C++ ends up looking a lot like C.

Besides, most commercial applications of C++ tend to devolve into "C with objects." IMO this is because well-written blocks of C are actually very readable, and you're a lot more likely to mess up memory management when you start passing data around just to break up large code blocks. The whole reason Java has garbage collection and only one way to pass objects is because they help write more readable code.


I do not think there is a viable alternative to C++ as a systems language, thus its use in a complex project combining many systems developed by different parties makes sense. That being said, no humanly comprehensible number of rules can make the average engineer churn out good C++ code, and no number of good engineers can save a project with bad management.

Have you checked out Rust yet? It's too unstable to be a viable alternative yet, but I have high hopes.

I don't think Rust is aiming for hard real-time systems. Other than that, I too am quite excited for it.

It is definitely aiming for hard real-time systems. It's not quite there yet, but the Rust developers have been hard at work removing obstacles to its use in those situations.

From the little I read about the F-35, the problems started with "bad requirements".

Galois and others have had success using Haskell, or using Haskell tooling to generate formally assured C/C++. I think there's plenty more industry could be doing today, even when tied to C/C++.

You said viable, so I guess you mean production ready, which excludes still-experimental or "in gestation" languages like Rust or ATS, but they are indeed worth a look (note that ATS in its first incarnation targets C).

I'd like to point out that there was a post several weeks ago on /r/haskell about someone having implemented a BSD kernel module. There's also MirageOS, an OCaml-implemented framework providing all the services of an OS, thus letting people boot their apps in a VM very easily (that's the aim of the project afaik, given that it comes from the same lab as Xen). While not specifically tuned for OS development, they seem able to cope well with the task. Note that both languages also have a native SSL library under development, which is part of the MirageOS framework in the case of OCaml.


Don't get too wrapped up with these standards. When asked about them, Stroustrup himself said they aren't a general C++ standard, but something very specific to the embedded and critical systems of fighter jets. I can't remember the exact talk, but it was from one of his Going Native talks:

http://channel9.msdn.com/Events/GoingNative/GoingNative-2012...

or

http://channel9.msdn.com/Events/GoingNative/2013/Opening-Key...

(both are _well_ worth a watch if you like C++ or program in it)


"4.13.4 Function Invocation AV Rule 119 (MISRA Rule 70) Functions shall not call themselves, either directly or indirectly (i.e. recursion shall not be allowed)."

That's too bad.


That's a good rule for embedded software on a jet though. I can see why they'd do that.

Not if you want provable worst-case stack sizes. [Edit: or function call times]


Unexpectedly running out of stack space at Mach 1 while pulling a 5g maneuver is a little different than unexpectedly running out of stack space while playing tetris. Most recursion can be rolled into loops, so it's not completely limiting.

The rationale is sensible though. C/C++ do not guarantee tail call optimization, and recursion consumes stack memory, which is a limited resource. By barring recursion (note they also bar malloc after initialization), they allow the program's memory usage to be determined at compile time rather than at runtime. This lets them satisfy requirements like "shall not use more than 50% of memory" (NB: this doesn't mean the program will never be allowed to use more than 50%; the spare memory lets them modify the program in the future and relax the constraint as needed without having to modify the hardware).
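To illustrate why the "no recursion" rule is less limiting than it sounds, here is a generic sketch (not taken from the standard) of a naturally recursive traversal rewritten with an explicit, fixed-size stack, so the worst-case memory use is visible up front:

    struct Node {
        int value;
        Node* left;
        Node* right;
    };

    // Recursive form: worst-case stack depth depends on the tree shape at runtime.
    int sum_recursive(const Node* n) {
        if (n == nullptr) return 0;
        return n->value + sum_recursive(n->left) + sum_recursive(n->right);
    }

    // Iterative form: the "stack" is a fixed array, so the memory bound is explicit.
    int sum_iterative(const Node* root) {
        const Node* stack[32];            // bound chosen from the known maximum tree depth
        int top = 0;
        int total = 0;
        if (root != nullptr) stack[top++] = root;
        while (top > 0) {
            const Node* n = stack[--top];
            total += n->value;
            if (n->left  != nullptr) stack[top++] = n->left;
            if (n->right != nullptr) stack[top++] = n->right;
        }
        return total;
    }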

I couldn't help but search the pdf for brace rules, and lo:

AV Rule 60: Braces ("{}") which enclose a block will be placed in the same column, on separate lines directly before and after the block.

Example:

    if (var_name == true)
    {
    }
    else
    {
    }


Hey, nothing wrong with that. As long as there's a consistent standard, it does make things more readable.

I agree about consistency but I don't find that style more readable. It wastes vertical space to little benefit.

I agree with you stylistically and don't personally use that brace placement. However, given the number of people writing C++ code for the F-35, it makes sense to settle on a standard that is never ambiguous. When nesting multiple control statements, it can sometimes require good judgment to know where to place braces for maximum readability. Giving each one its own line avoids requiring your developers to exercise good judgment in all the myriad cases.

Yeah, but that's a stylistic choice. IMO this is one of those cases where one choice isn't better than another, and it's just better to have made a decision.

I think he's referring to Apple's "goto fail" bug, to show it wouldn't happen under a project with strict guidelines :)
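For anyone who hasn't seen it, the shape of that bug was roughly the following (paraphrased from memory with placeholder function names, not the actual Apple source); a brace-every-block rule makes the duplicated line harmless, or at least obvious:

    static int step_one(void)    { return 0; }
    static int final_check(void) { return -1; }   // the check that ends up skipped

    int handshake(void) {
        int err;
        if ((err = step_one()) != 0)
            goto fail;
            goto fail;                 /* duplicated line: always jumps, so final_check()
                                          never runs and err still holds "success" */
        if ((err = final_check()) != 0)
            goto fail;
    fail:
        return err;
    }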

They require that pointers be declared as

    int32* p;
and not

    int32 *p;
and I know that is the convention in C++, but it still makes my eyes bleed. It's a gross violation of the Law of Least Astonishment, since, of course,

    int32* p,q;
doesn't do what you might think it would, based on the syntax. This is not a problem for the F-35 code, since multiple declarators per declaration are forbidden.
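A minimal illustration for anyone who hasn't been bitten by this (assuming int32 is just a typedef for a 32-bit integer):

    typedef int int32;      // assumption: int32 is a plain 32-bit integer typedef

    int32* p, q;            // p is an int32*, but q is a plain int32
    int32 *r, *s;           // r and s are both pointers: the * binds to each declarator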

Sorry, I'm an old C guy and I guess there are some new tricks I just can't learn...


It seems that this standard definitely goes for clarity of code over syntactic shortcuts.

Which, you know, is probably a good idea for critical system software on an aircraft. It's better for your intermediate developers to know exactly what your code is going to do than to be able to use some greybeard tricks that can lead to buffer overflows if the person modifying them doesn't know exactly how they work.


"AV Rule 1: Any one function (or method) will contain no more than 200 logical source lines of code"

Are they crazy? 200 SLOC is a huge beast. With our non-mission-critical software, we typically aim not to exceed 20 lines of code per function, and that includes comments and whitespace. Typically it is not very hard to keep functions under 10 SLOC.


With your non-mission-critical software, you also probably don't have hard limits on recursion. Writing iterative versions of naturally recursive functions can blow up function sizes fast.

I'm not sure if they're defining "logical" lines of code as all non-comment lines, but if so the requirement to have all curly braces ({}) on a line of their own would add to the vertical line count.

Many of these standards are pretty reasonable.

Legal | privacy