Opinion I must be a glutton for punishment. Not only was my first programming language IBM 360 Assembler, but my second was C. Programming anything in them wasn't easy. Programming safely in either is much harder.
So when the US Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Bureau of Investigation (FBI) announced they were doubling down on their efforts to persuade software manufacturers to abandon "memory-unsafe" programming languages such as C and C++, it came as no surprise.
The report on Product Security Bad Practices warns software manufacturers that developing "new product lines for use in service of critical infrastructure or NCFs [national critical functions] in a memory-unsafe language (e.g., C or C++) where there are readily available alternative memory-safe languages that could be used is dangerous and significantly elevates risk to national security, national economic security, and national public health and safety."
In short, don't use C or C++. Yeah, that's going to happen.
If this sounds familiar, it's because CISA has been preaching on this point for years. Earlier in 2024, CISA, along with partner agencies including the FBI, Australian Signals Directorate's Australian Cyber Security Centre, and the Canadian Centre for Cyber Security, aka the Five Eyes, published a report, Exploring Memory Safety in Critical Open Source Projects, which analyzed 172 critical open source projects. The findings revealed that over half of these projects contain code written in memory-unsafe languages, accounting for 55 percent of the total lines of code across the examined projects.
Specifically, "Memory-unsafe languages require developers to properly manage memory use and allocation. Mistakes, which inevitably occur, can result in memory-safety vulnerabilities such as buffer overflows and use after free. Successful exploitation of these types of vulnerabilities can allow adversaries to take control of software, systems, and data."
Tell us something we didn't know.
CISA added that memory-safety vulnerabilities account for 70 percent of security vulnerabilities. To address this concern, CISA recommends that developers transition to memory-safe programming languages such as Rust, Java, C#, Go, Python, and Swift. These languages incorporate built-in protections against common memory-related errors, making them more secure from the code up.
Sounds good, doesn't it?
If only it were that easy to snap your fingers and magically transform your code base from C to Rust. Spoiler alert: It's not.
Take Rust in Linux, for example. Even with support from Linux's creator, Linus Torvalds, Rust is moving into Linux at a snail's pace.
The problem is, as Torvalds said at Open Source Summit Europe 2024, "The whole Rust versus C discussion has taken almost religious overtones" with harsh arguments that have led to one Rust in Linux maintainer throwing up his hands in disgust and walking away. You see, people who've spent years and sometimes decades mastering C don't want to master the very different Rust. They don't see the point. After all, they can write memory-safe code in C, so why can't you?
Well, because they don't have those years of experience, for one thing.
It's more than just old, grumpy developers. Converting existing large codebases to memory-safe languages can be an enormous undertaking. It's time-consuming, resource-intensive, requires careful planning to maintain functionality, and, frankly, it's a pain in the rump.
Another problem is that memory-safe languages may introduce performance slowdowns compared to C and C++. There's a reason we're still using these decades-old, difficult languages; with them, developers can produce the fastest programs. Given a choice between speed and security, programmers and the companies that employ them go for the fastest code every time.
Besides the sheer migration cost, companies also face the expense of replacing existing development tools, debuggers, and testing frameworks to support the new languages. Then, of course, there's integrating the new programs with the old code and libraries.
CISA is insisting that this be done. Or, at the least, that companies come up with roadmaps for migrating their existing codebases by January 1, 2026. CISA argues that the long-term benefits in reduced vulnerabilities and improved security outweigh the initial investment.
I know businesses. They're not going to buy this argument. In the modern corporate world, it's all about maximizing the profits for the next quarter. Spending money today to save money in 2027? It's not going to happen.
Eventually, painfully, slowly, we'll move to memory-safe languages. It really is a good idea. Personally, though, I don't expect it to happen this decade. In the 2030s? Yes. The 2020s? No.
Neither businesses nor programmers have sufficient reason to make the jump. Sorry, CISA, that's just the way it is. ®