Why Open Source is Replacing Open Standards

On Monday, I wrote about how the Linux Foundation was founding the Dronecode project, which it will be helping to run, as it does with a growing number of other collaborative projects. The announcement was made at LinuxCon Europe, which took place in Düsseldorf this year.

I've attended two previous LinuxCons: the one in 2011, when I interviewed the Linux Foundation's Executive Director, Jim Zemlin, for this column, and the one in 2012, when I talked to Zemlin again and also had a long and interesting chat with Linus Torvalds - key quote: "I don't read code any more." This year, I interviewed Zemlin once more, and he had plenty of interesting things to say about the world of open source (disclosure: the Linux Foundation paid for my travel and accommodation while I was there).

Although Linux still lies at the heart of what the Linux Foundation does, its home page proclaims "open collaboration powers everything...", and that's a reflection of how much it has broadened its base. Today, its collaborative projects include the following:

AllSeen Alliance - Internet of Things (IoT)

Code Aurora Forum - mobile wireless

Dronecode - drones

OPNFV - Open Platform for Network Functions Virtualisation (NFV)

Open Virtualization Alliance - KVM

OpenBEL - Open Biological Expression Language

OpenDaylight - software-defined networking (SDN)

OpenMAMA - Open Middleware Agnostic Messaging API

Tizen - mobile and connected devices

Xen - virtualisation

Yocto - embedded systems

Here's Zemlin's perspective on why the Foundation is becoming involved in so many collaborative industry projects:

"Companies are now as the norm using open source to shed comunity R&D, to do collective innovation, particularly at the infrastructure layer, for almost every aspect of technology, not just Linux - SDN, IOT, network functions virtualisation, cloud computing, etc. What you have seen as a result is this proliferation of organisations who facilitate that development, on a very large professional scale. That's a permanent fixture of how the tech sector operates. We launch a new one of these about every 3 months. Next year we'll have many many more of these type of projects."

As I wrote last year, when it began announcing more of these collaborations, the Linux Foundation is uniquely well placed to manage these new groups. First, the key feature of the trend that Zemlin describes - industry competitors working together on infrastructural elements that can form a common, non-competitive basis on top of which they innovate - is exactly how the Linux ecosystem has evolved. There, we have seen erstwhile rivals working with each other to support projects that they would then build on in different ways in order to compete in the marketplace. Second, because the Linux world essentially invented this approach - albeit largely by accident - the Linux Foundation has more experience than anyone in acting as what I described eighteen months ago as the "honest broker" for industry groups.

So far, so conventional. But Zemlin went on to make an important point that I think is both new and significant:

"The largest form of collaboration in the tech industry for 20 years was at standards development organisations - IEEE, ISO, W3C, these things - where in order for companies to interoperate, which was a requirement in tech, they would create a specification, and everyone would implement that. The tech sector is moving on to a world where, in the Internet of things [for example], do you want to have a 500-page specification that you hand to a light bulb manufacturer, or do you want source code that you can hand to that manufacturer that enables interoperability? I think that's a permanent fixture. People have figured out for a particular non-differentiating infrastucture they want to work on that through open source, rather than creating a spec."

This is a major shift in the way that the technology world operates. Instead of trying to pin down in a specification how a new set of common standards will operate, leaving each company to implement those specifications as it sees fit - perhaps with variable compatibility among them - we are moving to a world where the new standard is represented by open source code that both defines that standard and does 99% of the work of implementing it.

That brings two huge advantages. First, it ensures that competitors really are working from a common foundation, and that compatibility is baked in. Second, and perhaps even more importantly for those working with the open source code, it saves them much time and money, since they do not need to write an implementation from scratch, but can simply tweak the code that is freely available. That not only cuts costs, it speeds up development and the pace of innovation. It also widens the market, since even relatively small companies can take that code and use it in their products - something that was impossible in the age of complex standards.
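
To make that concrete, here is a toy sketch - my own, entirely hypothetical example, not code from any of the projects above - of how a reference implementation bakes compatibility in. Every vendor that reuses the shared encoder and decoder below interoperates by construction; a vendor re-implementing them from a 500-page prose spec might pick the wrong byte order and silently break against everyone else.

    import struct

    PROTOCOL_VERSION = 1  # hypothetical constant, defined once for everyone

    def encode_status(device_id, brightness):
        """Reference encoder for a made-up IoT status message (big-endian)."""
        return struct.pack(">BIB", PROTOCOL_VERSION, device_id, brightness)

    def decode_status(payload):
        """Reference decoder: rejects unknown versions instead of guessing."""
        version, device_id, brightness = struct.unpack(">BIB", payload)
        if version != PROTOCOL_VERSION:
            raise ValueError("unsupported protocol version: %d" % version)
        return device_id, brightness

    # Round-trip check: the shared code itself is the interoperability contract.
    assert decode_status(encode_status(42, 200)) == (42, 200)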

The benefits of moving from open standards to open source are so great that a legitimate question arises: why isn't it happening more widely? Zemlin says the explanation is very simple:

"The big problem that we see right now, is there are not enough people in between the engineers who get it, and maybe the business decision-maker who gets it a senior level, who really are experts on what the licences mean, how to share what they want to share with people, the social coding practices, the product development cycle, people who understand how to pull code from an open source project, commercialise it, make fixes, contribute those fixes back to the upstream project, stay in compliance with the open source licences."

Senior managers understand the efficiency of defining industry standards using open source; engineers naturally understand that open source is a better way to write code. It's the middle managers who can find it difficult to grasp such an apparently different way of doing things. And it's not just in small companies that this is an issue:

"The top ten tech companies spend about $64 billion on R&D, yet some of these companies have not a single person in their organisations who manages open source, which represents, in many cases, 80% of the code in their products, and billions of dollars of value."

In other words, many companies depend completely on open source in one way or another, yet are still unaware of this fact and simply take it for granted. This lack of appreciation means that they are not really engaged with open source in the way they should be - a perilous situation for them. Zemlin says that the Linux Foundation is trying to train more middle managers who understand the importance of working with the open source approach in order to maximise the benefit for companies, but there's still a long way to go.

As well as giving this fascinating perspective on the shift from open standards to open source as a result of these industry-wide collaborative projects, Zemlin also discussed the Linux Foundation's Core Infrastructure Initiative (CII):

The Core Infrastructure Initiative is a multi-million dollar project housed at The Linux Foundation to fund open source projects that are in the critical path for core computing functions. Inspired by the Heartbleed OpenSSL crisis, the Initiative's funds are administered by the Linux Foundation and directed by a steering group comprised of industry backers. The steering group works with an advisory board of esteemed open source developers and related experts to identify and fund open source projects in need.

Zemlin explained the background:

"If the claim is that [open source] is more secure and more aligned with privacy due to code transparency, that at a fundamental level is correct, but there's certainly specific steps the open source community can take to make that even more true. That's really where CII, I hope, can provide value.

"One guy working on bash for twenty years, two guys named 'Steve' maintaining OpenSSL, Harlan Stenn working largely on his own on NTP, which keeps Internet time, a couple of guys working on OpenSSH - that's about it. We have taken them for granted - all of us; no one was really paying attention."

As well as helping these and other key projects directly with extra resources, the Linux Foundation is also carrying out a wide-ranging and rather novel audit:

"We have a list of projects that we're researching right now. You're not looking for a zero-day, we're looking for who's working on it, under what licence, how do they collaborate, where is this code physically housed, who has access to that data set? - all these different structural things that you look at and can say: hey, is this project in trouble or not?

Once more, this shows how the Linux Foundation is meeting a need that is not otherwise addressed. Just as there are few other organisations that could hope to bring together competitors to collaborate on large-scale software projects, so the Foundation is probably alone in having the will and the means to carry out this kind of stocktaking of open source's core infrastructure.
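
As a purely illustrative aside - my own sketch, not the Foundation's actual tooling or criteria - the structural signals Zemlin lists could be collected as a simple per-project health record:

    # Hypothetical audit record; every field name here is invented for illustration.
    project_health = {
        "name": "example-project",             # hypothetical project
        "active_maintainers": 2,               # who's working on it
        "licence": "BSD-2-Clause",             # under what licence
        "collaboration": "mailing list + git", # how do they collaborate
        "hosting": "self-hosted server",       # where the code is physically housed
        "repo_admins": 1,                      # who has access to that data set
    }

    # A crude red flag, in the spirit of "is this project in trouble or not?"
    at_risk = (project_health["active_maintainers"] <= 2
               and project_health["repo_admins"] <= 1)
    print("needs attention" if at_risk else "looks healthy")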

In the post-Snowden world, such support is more crucial than ever, and we should be grateful that we have the Linux Foundation to persuade deep-pocketed players in the computing industry that what's good for the rest of us is also good for them. With this programme under way, we can hope that at least a few of the security problems lurking in key open source applications will be winkled out and fixed. That won't make us 100% safe, but it will certainly be better than what we have now - and miles better than anything that closed-source code, with all its hidden backdoors, can ever offer.

Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+