|
On 12/05/2019 18:37, Philippe Verdy wrote:
Interesting article: it finally shows that compilers are made by people who are interested only in compilers and pure performance, and don't even care about making it useful for any language. Now if C99 allows this, then C99 is not a language defined for programmers, just for the interests of compiler makers. It just kills the language and makes it completely useless (and malicious) to anyone: programmers, and final users (which are not even given any right to know what is in the program they buy or use; these programs are also best categorized as malware). In summary, C is a malware to eradicate, unless it is implemented by replacing all undefined behaviors from the so-called "standard" with defined behavior.
[snip] I'm not a big fan of C (or of C++, which has almost the same definition of UB as C), but I had to reconsider my views in the last 5 years, since I had to cope with microcontroller (MCU) software development.
I agree that, in an ideal world, no high-level application should be developed in C (or C++), because they are intrinsically unsafe, thanks to UB. C++ mitigates that a little by providing high-level constructs that a programmer can use to avoid the most common cases where a mistake can cause UB (e.g. dynamic memory management).
There could be, and probably there are, better, and safer, alternatives for high level application development that don't sacrifice performance.
The problem is that at this stage there is no real alternative in the following cases:
(1). You must heavily interact with an OS API.
(2). You must write system-level software (e.g. device drivers).
(3). You must write SW for the bare metal (no OS, as on MCUs or DSPs - Digital Signal Processors).
(4). You must absolutely squeeze the last drop of performance from a machine (without resorting to assembly), e.g. lots of heavy math computations.
(5). Economics: the sheer number of /well tested/ code lines written in C and C++ is a huge investment that firms are not going to replace unless they are compelled by reasons of the same "economic weight".
(1) and (2) stem from the fact that no OS I'm aware of is written in a standardized language other than C/C++. So even if an OS provides an API for other languages, the C/C++ API is going to be /the/ real native API (often the only way to do everything the OS allows you to do, as far as API operations go).
(3) AFAIK C/C++ are the only widely used standardized languages that interface nicely (through compiler extensions) with embedded assembly code. This is essential for low-level programming to access specific CPU/MCU features that cannot be represented in C/C++ code (e.g. switching memory banks, accessing bit-level instructions, etc.).
Moreover, it seems that writing a C compiler is not that hard, so a CPU/MCU manufacturer can come up with a decent C compiler fairly quickly once they develop a new architecture (in the high-end CPU world we are accustomed to a handful of architectures, but in the embedded world there are dozens of architectures with thousands of different MCU models, from 8-bit to 64-bit, not counting DSP chips!).
If you want to go OSS it's even easier: you write a new back-end for GCC and you have a good C optimizing compiler without much effort (see GCC for AVR architecture, for example)!
In the embedded world there are also examples of "mini C" implementations, i.e. stripped-down, non-standard, C dialects, for which a barebone compiler is even easier to write!
I know of only three "real" attempts to build a system language with the same range of applications as C/C++, the same performance, and better safety: Go, Rust and Ada. I said "real" because (AFAIK) they are backed by either big money or real effort and have gained enough momentum to be, at least remotely, a viable alternative.
I know almost nothing about Rust, I've just read their FAQ some time ago, just for curiosity, so I cannot comment further.
I had a stab at Go some years ago, but I think it failed its "mission" to replace C as a system language. It is a nice language, but the biggest drawback is that its runtime support system (AFAIR) is huge (hundreds of MBs, IIRC), so that rules out using it on most embedded systems. I don't know how it is now, since I haven't followed its evolution, but I doubt they ever tried to target the non-PC world.
The third attempt, which was somewhat successful, was Ada, a language sponsored by the US Department of Defense (DoD). At a certain point in time (IIRC in the 90s) the DoD mandated that any software that would run on its systems (from database SW to avionics firmware) be written in Ada.
I read that after about 10 years they had to reconsider that position: the cost of military SW development had skyrocketed because of that constraint. They realized that it cost less to write SW in C/C++ and then put money into more testing and safety frameworks. In fact, it turned out that an expert Ada programmer was extremely expensive to hire and also to train (AFAIK Ada is a very complex and extremely fussy language) compared to a C programmer of the same expertise. Moreover, Ada programmers were much harder to find (who writes SW in Ada outside the US government?). And on top of that, the supporting tools for Ada were worse or non-existent.
The last point, (5), is the most compelling though: economics. Sadly, economics trumps engineering almost always in the real world (as the DoD Ada example shows). Companies have pumped billions of dollars into software written in C/C++, so it makes sense for them to continue to support their tools.
I've been told that there are some companies that still support their old SW written in Cobol, because the sheer amount of debugging/testing put into those programs (banking SW) is worth millions of dollars: the code base is now virtually bug-free, so they are happy to overpay the now-rare Cobol programmers just to do some enhancement here and there.
Add to all this that C is an easy language to learn (alas, it's also easy to learn wrong!). It's like a jackhammer: it is fairly easy to understand how it works, so anyone can try to use it. And like a jackhammer, it takes time to learn how to use it correctly without risking smashing your own feet!
So, in the end, I don't think we will see any real, practical alternative to C any time soon (I'd say, sadly).
I'd love to hear something about Rust from people on this list, though. Has anyone used it? Is it usable in the embedded world, and how does it compare to C?
Cheers! -- Lorenzo