EU Copyright Dialogue: The Great Sham(e)

In an implicit acknowledgement that the Europe-wide protests against ACTA indicated that there was a problem with copyright in the digital age, the European Commission announced back in December what it called "an orientation debate on content in the digital economy." This is what that meant, apparently:

A structured stakeholder dialogue will be launched at the start of 2013 to work to address six issues where rapid progress is needed: cross-border portability of content, user-generated content, data- and text-mining, private copy levies, access to audiovisual works and cultural heritage. The discussions will explore the potential and limits of innovative licensing and technological solutions in making EU copyright law and practice fit for the digital age.

This process will be jointly led by Michel Barnier, Neelie Kroes and Androulla Vassiliou. By December 2013 the College will take stock of the outcome of this dialogue which is intended to deliver effective market-led solutions to the issues identified, but does not prejudge the possible need for public policy action, including legislative reform.

Well, that sounded fair enough: involving all the stakeholders, including the public, was obviously a good idea, and it was encouraging to see user-generated content explicitly mentioned. Also welcome was the assurance that the outcome of this dialogue would not "prejudge the possible need for public policy action, including legislative reform", because that is obviously an important option that needs to be considered.

The next thing we heard was on the home page of this initiative, where the "structured stakeholder dialogue" had mysteriously morphed into "Licences for Europe." Now, licensing is obviously an important part of this dialogue, but by no means the only part. Indeed, arguably, it is precisely because of an obsession with licensing that the copyright regime has become so dysfunctional in the online world, where licensing is simply not workable for all the acts of copying that take place there – think of how content is copied multiple times as it traverses the Internet, all without any kind of licence at all.

And it wasn't just the title that had changed. We also had this statement about the stakeholder dialogue:

Its main purpose is to seek to deliver rapid progress in bringing content online through practical industry-led solutions.

The original discussion was about making copyright law and practice fit for the digital age, not about imposing "industry-led solutions", for the simple reason that some of the solutions – notably for things like user-generated content and data- and text-mining – don't require the industry to be involved. Indeed, involving them simply recapitulates the problems we have today.

So what began as a stakeholder dialogue that would not "prejudge" a possible need for public policy action, has turned into an effort to shoe-horn everything into the tired old solutions of licensing and more licensing.

It gets worse. When the session on user-generated content took place – the one where the public was obviously most central to the argument – what do we find, but that fully 78% of the participants were "Copyright industries, and collecting societies and lobbies working for them", as La Quadrature du Net rightly describes them. In other words, the idea that this was a serious discussion about the issues affecting end-users was a total sham: this was essentially just the same old bunch of copyright companies and their lobbyists agreeing with each other.

Inevitably, then, the "dialogue" was nothing of the kind, but a regurgitation of the failed solutions that have been tried for the last decade or more. It is beyond ironic, then, that according to one report on the session, when people tried to raise key issues like "fair use", this was batted away with the contemptuous comment "fair use is from the 20th century."

And when the crucially important topic of "exceptions" – situations where users do not require licences to use copyright material – was discussed, the massed ranks of the copyright industries seemed offended that something so distasteful had even been mentioned. Luckily, the session's moderator extinguished this errant train of thought immediately, so as to protect the delicate sensibilities of the maximalists who hate having to consider the idea that copyright ought to be fair and balanced.

Now, you might be thinking that this session on user-generated content was just an unfortunate exception; that something had gone wrong with all the invitations that were going to be sent out to the representatives of digital rights groups; and that we shouldn't really judge the entire "Licences for Europe" initiative on the basis of one biased session with a packed audience that refused to allow the public's point of view even to be discussed.

So let's look at what happened in another important group, that exploring data- and text-mining. This is a crucially important new area, because it is about finding extra information in data by analysing and comparing on a large scale. That's only become possible thanks to the availability of low-cost computing power (usually running free software), which makes these kinds of analyses feasible for the first time.
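
To make the idea concrete, here is a minimal sketch of one common text-mining technique – counting how often pairs of terms co-occur in the same sentence across a corpus, the kind of analysis behind the drug and gene–disease discoveries mentioned later in this piece. The directory name, term list and naive sentence splitter are illustrative assumptions; real mining pipelines use proper NLP tokenisation and entity recognition.

```python
# Minimal text-mining sketch: count sentence-level co-occurrences of terms
# across a corpus of plain-text documents. "papers/" and TERMS are
# hypothetical; swap in your own corpus and vocabulary.
import itertools
import re
from collections import Counter
from pathlib import Path

TERMS = {"aspirin", "inflammation", "p53", "apoptosis"}  # illustrative only

def sentences(text):
    # Naive splitter on sentence-ending punctuation followed by whitespace.
    return re.split(r"(?<=[.!?])\s+", text)

def cooccurrences(corpus_dir):
    counts = Counter()
    for path in Path(corpus_dir).glob("*.txt"):
        for sentence in sentences(path.read_text(encoding="utf-8").lower()):
            # Which of our terms appear in this sentence?
            found = sorted(TERMS & set(re.findall(r"[a-z0-9]+", sentence)))
            for pair in itertools.combinations(found, 2):
                counts[pair] += 1
    return counts

if __name__ == "__main__":
    # Term pairs that keep turning up together are candidate hidden links.
    for pair, n in cooccurrences("papers").most_common(10):
        print(pair, n)
```

Run over thousands of papers, even a toy counter like this starts to surface associations that no single human reader would spot – which is precisely why the legal status of mining lawfully accessed texts matters so much.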

But to do this, researchers (and businesses) need to have the ability to carry out such data- and text-mining without needing further licences – if the material is commercial, then they would already have paid a fee in order to access it. But, as usual, the copyright industries are not satisfied with getting paid once, but want to get paid again for this kind of large-scale use of data. In effect, this is creating yet another form of "right", alongside copyright and the database right.

It was naturally important for those most affected by that attitude to be able to discuss it, and to point out that it simply adds an extra cost layer to this kind of activity – one that academics cannot afford, and which therefore closes off work that could lead to all kinds of new knowledge. But here's what happened in the European Commission's stakeholder dialogue on this area, as reported by Communia:

Unfortunately the first meeting of this working group which took place on the 4th of February in Brussels did not live up to the expectations raised by the Commission's earlier announcement. It quickly became evident that the stakeholder dialogue is based on a flawed assumption ('more licensing will bring copyright in line with the requirements of the digital economy') and that the process was designed to prevent a serious discussion about how to unlock the potential of scientific text and data mining.

So, there it was again: "the process was designed to prevent a serious discussion about how to unlock the potential of scientific text and data mining" - just as happened in the user-generated content session. That is, once again, the European Commission had pre-judged what the solution was, and was not prepared to hear from the experts what the problems on the ground were. It just wanted to impose the old solution again: licensing and more licensing.

Communia goes on:

Chief among these concerns is the belief that in order to have an open discussion about the reform, possible solutions cannot be limited to licensing. From our perspective text and data mining cannot be solved by re-licensing texts to libraries, researchers or the public. What Europe needs is clarity that text and data mining of works that are lawfully available does not require permission from rights holders. A stakeholder dialogue that simply declares this position off limits can hardly be called a dialogue at all. In the case of Public Domain content, there is a risk that a focus upon licensing will lead to unlawful re-licensing of content that is out of copyright.

As a result of the European Commission's arrogant refusal even to listen to the opinion of anyone other than the industries that would most benefit from imposing even more fees, an impressive list of public organisations representing academia, the researcher community and civil society felt compelled to address an open letter to the European Commissioners responsible for this charade, appended to the Communia post discussed above. Here are some of the key points:

The potential of TDM [text- and data-mining] technology is enormous. If encouraged, we believe TDM will within a small number of years be an everyday tool used for the discovery of knowledge, and will create significant benefits for industry, citizens and governments. McKinsey Global Institute reported in 2011 that effective use of 'big data' in the US healthcare sector could be worth more than US$300 billion a year, two-thirds of which would be in the form of a reduction in national health care expenditure of about 8%. In Europe, the same report estimated that government expenditure could be reduced by €100 billion a year. TDM has already enabled new medical discoveries through linking existing drugs with new medical applications, and uncovering previously unsuspected linkages between proteins, genes, pathways and diseases. A JISC study on TDM found it could reduce "human reading time" by 80%, and could increase efficiencies in managing both small and big data by 50%. However at present, European researchers and technology companies are mining the web at legal and financial risk, unlike their competitors based in the US, Japan, Israel, Taiwan and South Korea who enjoy a legal limitation and exception for such activities.

Given the life-changing potential of this technology, it is very important that the EU institutions, member state governments, researchers, citizens, publishers and the technology sector are able to discuss freely how Europe can derive the best and most extensive results from TDM technologies. We believe that all parties must agree on a shared priority, with no other preconditions – namely how to create a research environment in Europe with as few barriers as possible, in order to maximise the ability of European research to improve wealth creation and quality of life. Regrettably, the meeting on TDM on 4th February 2013 had not been designed with such a priority in mind. Instead it was made clear that additional relicensing was the only solution under consideration, with all other options deemed to be out of scope. We are of the opinion that this will only raise barriers to the adoption of this technology and make computer-based research in many instances impossible.

In other words, the selfish obstinacy of the copyright industries, in their refusal to consider any approach other than yet more licensing, is likely to cause serious economic losses to Europe, alongside the cultural ones. Once again, we have a clear demonstration of how copyright maximalists are happy to harm an entire continent rather than concede that their scarcity-based approach is outdated and needs to be re-thought in a world of digital abundance.

That the copyright industries should cling to this retrogressive approach is no surprise – it's what they've done at every turn for the last fifteen years or more. That the European Commission still hasn't learned its lesson from the street demonstrations against ACTA last year, and tries to shore up decrepit intellectual monopolies to the disadvantage of the 500 million citizens of Europe it supposedly serves, is not just extraordinary, but truly shameful.

Follow me @glynmoody on Twitter or identi.ca, and on Google+
