As I wrote yesterday, we need some way to “persuade” the deep-pocketed companies free-riding on free software like OpenSSL that it is really in their interests to support it properly. I was discussing this thorny issue on Google+ with Alan Cox, early Linux kernel hacker and now one of the free software community’s wisest elders, who saw two fundamental problems that need addressing here:
1. Software dodges most liability law, so there is a quality race to the bottom (which is getting worse). Industry massively underspends on computer security versus the real cost.
2. The software industry is still in denial and thinks that things like code inspection will fix the problem. It won’t. We’ve known that for years.
The question that should be asked is:
“Why was the vulnerable routine able to access an object it did not need rights to?”
That goes to the heart of the problem – wrong programming languages, wrong tools, gaps in architectural features. Had stronger languages and related tools been used in the first place, the attack would have been reduced to a slightly annoying DoS, with the service blowing up with a bounds exception of some form.
The sad thing is the computing industry cannot easily fix this itself. Because there is no liability and customers cannot properly evaluate security quality, there is a lemon market at work.
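Cox’s point about stronger languages can be made concrete. Heartbleed happened because nothing checked a peer’s claimed payload length against the buffer’s real size, so OpenSSL copied adjacent memory back to the attacker. In a bounds-checked language the same mistake fails loudly instead of leaking. A minimal sketch in Rust (the buffer and lengths here are hypothetical, purely for illustration):

```rust
fn main() {
    // Five bytes of "payload", sitting next to memory the
    // caller should never see.
    let payload: [u8; 5] = *b"hello";

    // Heartbleed-style bug: the peer claims the payload is
    // 64 bytes long (a made-up number for this sketch).
    let claimed_len: usize = 64;

    // In C, a memcpy of claimed_len bytes would silently copy
    // 59 bytes of adjacent memory back to the attacker. Here the
    // bounds check refuses the over-long read: the worst case is
    // one failed request (a DoS), not an information leak.
    match payload.get(..claimed_len) {
        Some(echo) => println!("echoing {} bytes", echo.len()),
        None => println!("bounds check failed: refusing to echo"),
    }

    // Even an unchecked slice like payload[..claimed_len] would
    // not leak data -- it panics with a bounds exception instead,
    // which is exactly the "slightly annoying DoS" Cox describes.
    assert!(payload.get(..claimed_len).is_none());
}
```

The design point is that the failure mode is chosen by the language, not by the attacker: the request dies at the bounds check rather than reading whatever happens to sit next to the buffer.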
As you can see, Cox seems to be suggesting that those providing software should accept liability for the problems it causes. This is not a new idea, and it has always seemed problematic for free software: after all, how can coders who receive little or nothing for writing software be expected to pay the possibly huge liability claims made against them? Cox thinks he has the answer:
I don’t think it is a huge problem for open source. The law already deals with liability for analogous things like lending your neighbour your lawnmower, or cooking dinner for friends. In fact 3D printing to an extent already has to deal with this.
It does need to have an impact on the vendors to some extent. If you can “give the software away” and sell a giant support contract for it then the liability needs to attach to that support deal.
It’s part of a general problem that products have proper protection for buyers, yet services and “licenses” don’t. The results of this are bizarre.
Some years ago I pointed out to the EU that this includes gems such as being able to sue a manufacturer for a GPS which went wrong, but not being able to sue them if they provided the same software to install on my PC, even if the software caused the fault, let alone being able to do anything if I used the same software via a web based mapping site.
A lot of it also comes down to correctly setting expectations. If you supply software for free which clearly states it’s “an amusing hack I wrote while drunk last week” and someone uses it in air traffic control, then the blame shouldn’t land on the writer.
You could certainly roll it out in steps – start by applying liability to software in closed boxes and work outward from there. The key will be to make the transition smooth so companies adapt.
Of course the vendors will scream and claim it will be the end of the world. The car industry did the same with road safety in its early days too. The excuses will even be the same: “it’s the operator’s fault”, “we can’t determine what 3rd-party changes the user will make”, etc.
That does seem to be a neat approach: only attach liability where money is involved. The people who write the code – even if they are paid for doing so – don’t need to worry about being held personally responsible. Only those that create a business with the code are affected. Of course, that might be a company like Red Hat that also employs some of the coders, but it’s clearly in their interests to winkle out the bugs, so this additional incentive wouldn’t change much.
Moreover, such a regime might well encourage companies currently producing closed-source code to open it up: after all, doing so provides essentially “free” scrutiny – all those extra eyeballs – that is simply not available when the code is closed. The chance that this might save them from ruinously expensive litigation is a strong incentive to adopt this approach, which obviously has many other benefits too.
Unlike pretty much any other kind of commercial venture, the deployers of software are able to disclaim all liability for harm caused by their code. Fix that, and the magic of market forces will fix everything else.
This is not to say the authors of the code should be personally liable. Open source projects don’t exist to create code for nonparticipants; they exist as the locus of collaboration for developers. It’s not reasonable to hold a project liable for its code quality (even if we should ask serious questions about code quality and governance). But the entities that deploy the code do have a responsibility. They need an incentive to pay developers, pay auditors, and promote quality and accountability. We’ll only get that when we fix the liability issue.
He also reminds us that a report on network and information security for the European Network and Information Security Agency, produced by a group of researchers including Ross Anderson, explored precisely this area in some depth. It was entitled “Security Economics and The Internal Market”; here’s part of the relevant chapter:
We believe that, as with the motor industry, a patient and staged approach will be necessary. While it might have been feasible to impose stricter rules on software liability as late as the 1970s or even 1980s, by now there is software in too many products and services for a one-size-fits-all approach to be practical. In particular, where software from dozens of vendors is integrated into a single consumer product, such as a car, the sensible approach (taken by current EU law) is to hold the car maker (or primary importer) liable for faults that cause harm; this ensures that the maker has the right incentives to ensure that the software in their product is fit for purpose. Thus, for the time being at least, liability for failures of software in embedded systems should continue to rest with the maker or importer and be dealt with by safety, product-liability and consumer regulation. However, where devices are connected to a network, they can cause harm to others.
The whole report is well worth reading, even though it dates back to 2008. As that indicates, nothing much happened with its many sensible recommendations. Perhaps the Heartbleed episode and its wide-ranging consequences will be enough to make people think seriously about imposing liability on software companies. Provided an approach along the lines suggested by Alan Cox is adopted, that could also be great news for open source, since it would encourage all those currently free-riding on free software to start supporting it with a generosity commensurate with its value.