33C3 – Proposed EU Copyright Law – Not Fit for Anything

Another highlight from 33C3 was Julia Reda’s talk about the proposed EU copyright law, Copywrongs 2.0. I say highlight only because it was an interesting and compelling talk; the law itself is an absolute lowlight. To say that the proposed law is not fit for purpose is an understatement, and there is a question as to whether its design has less to do with protecting creators and more to do with protecting an industry struggling with an outdated business model.

The reform is a final parting shot by the outgoing EU commissioner Günther Oettinger. His proposed reform to EU copyright threatens freedom of expression by making simple things like linking to content (a central tenet of the internet) a breach of copyright. This is obviously madness.

The proposals seem to be the product of some intensive lobbying by what are often referred to as ‘old media’. Some news publishers, mostly those who are struggling to adapt their business models to the 21st century, want to charge search engines and social networks for the links displayed in searches or embedded in users’ posts: essentially, charging for the traffic sent their way. The other culprit is the music industry, struggling in the world of YouTube. Personally, I don’t particularly want to see the newspaper industry disappear, especially in the world we live in today, but this isn’t the answer.

So what does the proposed law prohibit? As written, sharing small sections of news articles, e.g. on a blog or a personal website (such as this one), without a license from the publisher will be an infringement for as long as 20 years after the article was originally published. This is crazy: the point of sharing a snippet is to drive traffic to the original story, so the newspaper industry seems to be shooting itself in the foot.

As it stands the EU Commission has not proposed any exceptions based on the size of the snippet, or for individuals, or for non-commercial purposes, and providing a link to the source isn’t enough. This essentially means you need a license to reference or attribute a quote. What this means for newspapers quoting each other, or for academic work, I don’t know.

Not only can you not link on social media, it would also seem that indexing the web in general would require a license from every publisher, and would thus be essentially impossible. In fact, any and every site in existence would have to have ways of filtering out copyright infringements.

What about collaboration? The effect such a law would have on sites that foster collaboration is also not clear, but it is likely to be bad. For example, GitHub would have to put in place filtering technology to search for source code that someone wants kept off the site, even if that code was released under an open source license. Also in trouble would be Wikipedia, and anyone using data from the web for training AI or similar.

So what is Günther Oettinger trying to do? Does he just have no understanding of the internet, and, it would seem, of copyright? He is known to be in favour of big business, and seems to be close to the publishing industry. At best it’s a misguided attempt at protecting an outmoded business model. What happens now is down to us doing a bit of lobbying of our own. Is there any point in Brits getting involved? Yes: for one, there is a chance that the UK will mirror some EU laws, at least initially, and we don’t want this one. Also, we can do our bit to help out our EU neighbours.

Here comes the two-tier internet?

The government want to control what we do on the internet. They want to make us all safe by controlling what we can see and recording everything that we do. It’s for our own good; we need to be kept safe. Think of the children!

OK, so that’s a rather flippant response to two big issues, but I think there is a sinister truth within. Leaving aside the apparent fact that much of our online lives are either accessible to the security services or being recorded, I want to focus on David Cameron’s war on pornography, or rather, the idea and practicalities of filtering the internet. Internet pornography is a particularly difficult subject in a number of ways. Child exploitation and abuse is without question horrible, and it’s right that every effort should be made to remove it from society and the Internet. Consensual pornography is a less clear-cut issue, one that has been debated a lot and will perhaps be debated forever. Is pornography always exploitative? If all parties are consenting, what is the problem with its production and viewing? Should pornography be protected as a form of expression? Where is the line drawn between pornography and art? I want to leave these questions for others and move on instead to filtering the internet.

To me the Internet (or perhaps the whole World Wide Web) is a decentralised, uncontrolled bastion of freedom, one that seems to be under threat in a number of different ways. More and more of the infrastructure of the Internet is controlled by a small number of large companies. Google, Facebook, Amazon, Yahoo!, Microsoft, Rackspace and others host, route or provide access to increasing amounts of the content on the Internet. The dream of a truly decentralised Internet of millions of servers providing hosting, mail, or search facilities seems to be over, or at least under threat. Filtering seems to be the next big assault on my bastion of freedom.

Filtering the Internet entails blocking users’ access to certain websites, either via host-name blocking or perhaps IP blocking. This is what David Cameron is proposing to do: by default he wants UK ISPs to block porn sites and only allow access if people actively choose to opt out of filtering. The problem with filtering (leaving aside the issues around freedom of expression etc.) is that it is rubbish and doesn’t work very well. Filtering can operate at the level of the Domain Name System (DNS). DNS servers are essentially an index: when I try to reach google.co.uk, I request the IP address of the computer hosting google.co.uk from a DNS server. If filtering is in place, instead of getting the IP address of that computer I would be redirected to a page telling me that I was trying to reach naughty content that I am not allowed to see.
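To make that lookup-and-redirect idea concrete, here is a minimal Python sketch. It is an illustration only, not a real resolver; all hostnames and addresses are invented (the addresses come from ranges reserved for documentation):

```python
# Toy illustration of DNS-level filtering. Not a real resolver;
# hostnames and addresses are invented for the example.

BLOCKLIST = {"naughty-site.example"}   # hostnames the ISP wants blocked
BLOCK_PAGE_IP = "192.0.2.1"            # address of the "you can't see this" page
DNS_INDEX = {                          # the 'index' a DNS server holds
    "google.co.uk": "198.51.100.10",
    "naughty-site.example": "203.0.113.50",
}

def filtered_lookup(hostname: str) -> str:
    """Return the block page's address for blocked names, the real one otherwise."""
    if hostname in BLOCKLIST:
        return BLOCK_PAGE_IP
    return DNS_INDEX.get(hostname, "NXDOMAIN")  # unknown names: 'no such domain'

print(filtered_lookup("google.co.uk"))          # the real address
print(filtered_lookup("naughty-site.example"))  # silently redirected
```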

The other possibility is blocking access to particular IP addresses. I suspect that this would be done at the point of Network Address Translation (NAT), or while routing traffic. Essentially, an ISP could keep a list of IP addresses that are banned and refuse to route you to those computers. This is more problematic because the IP address of a website can change independently of the host name, so a site could move to a new address and escape the block, or a ‘safe’ site could be blocked accidentally. Also, it is possible to host a number of websites on one computer, all under the same IP address: block one and you block them all. So IP filtering is a very blunt instrument.
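A small sketch shows just how blunt. Suppose three unrelated sites share one hosting server, and so one IP address (all names and addresses below are invented):

```python
# Sketch of IP-level blocking at an ISP router, and its collateral damage.
# All names and addresses are invented for the example.

BANNED_IPS = {"203.0.113.50"}

# Three unrelated websites hosted on the same shared server (one IP address).
HOSTING = {
    "naughty-site.example": "203.0.113.50",    # the intended target
    "parish-news.example": "203.0.113.50",     # innocent, same server
    "knitting-club.example": "203.0.113.50",   # also innocent
}

def route(destination_ip: str) -> str:
    """Refuse to forward traffic headed for a banned address."""
    return "dropped" if destination_ip in BANNED_IPS else "forwarded"

for host, ip in HOSTING.items():
    print(host, "->", route(ip))  # all three are dropped
```

Block the one address and all three sites disappear, two of them by accident.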

These methods of filtering can be circumvented in a number of different ways. DNS filtering, for example, can easily be sidestepped by changing your DNS server. By default your home router will be set up to use your ISP’s DNS servers, filtering and all. Don’t like it? Change it to one of the free public DNS servers. This can be done on your router, or just for individual computers or even browsers (see the sketch below). You could also use a Virtual Private Network (VPN). This is slightly more difficult, but not much. Here you establish an encrypted link between your computer and another one elsewhere (perhaps in a different country). All the network traffic out of your computer is then sent to this other computer, which deals with it by forwarding it on to DNS servers etc. It looks like you are that other computer, and if that computer is outside of the filtering then you are unfiltered. You could also use Tor. Tor was designed to protect against network surveillance and traffic analysis. How it works is an article (at least) on its own, but needless to say, if you use it you wouldn’t be filtered.
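As an example of how little effort the DNS sidestep takes, here is a sketch using the third-party dnspython library to ask a public resolver directly, bypassing whatever the ISP’s servers say. The choice of 8.8.8.8 (Google’s public DNS server) is just one of several free public resolvers:

```python
# Query a public DNS server directly instead of the ISP's filtered one.
# Requires the third-party dnspython library: pip install dnspython
import dns.resolver

resolver = dns.resolver.Resolver(configure=False)  # ignore the system/ISP settings
resolver.nameservers = ["8.8.8.8"]                 # Google's public DNS server

answer = resolver.resolve("example.com", "A")      # look up an IPv4 (A) record
for record in answer:
    print(record.address)                          # the unfiltered address
```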

So filtering is rubbish… so what next? Perhaps governments will give up on filtering, trusting adults to make their own minds up about what they want to access, focus instead on catching the people who are breaking the law, and leave filtering children’s access to the internet to parents? Not likely. My fear is that in a quest for increasing control of the Internet (to make us all safe), they will create a two-tier internet.

What is this two-tier Internet? An island analogy will work here. If you live on an island you can easily drive around it and visit all the shops, people etc. However, if you want to go off the island you have to drive over the one bridge to the rest of the world. It would be possible to build an Internet island. By default this is what everyone is given access to, and you aren’t allowed to cross the bridge. Under this model all the websites accessible on the island would have to be pre-registered and vetted; if you attempted to access any other website (or indeed computer) you would be blocked. Public DNS wouldn’t work, as that would require crossing the bridge; the same goes for a VPN outside the island. You just can’t get there. Unless, that is, you have requested to be able to cross that bridge. It would be easy then to monitor who wanted to get off the island, and in some cases what they brought back with them. Just asking to get off would make you a person of ‘interest’. Why would any normal, law-abiding person want to get off our safe internet island?

How likely is this? Well, you would not need to alter the existing infrastructure of the internet; it could be implemented with existing technology, and some countries have already attempted to unplug their entire population from the rest of the internet for periods of time. The model changes from one of allow-with-exceptions to block-with-exceptions: ISPs could be forced to connect to what would essentially be a subnet of the WWW, with a router controlling that bridge to the mainland. The only way round the filtering would then be building your own bridge!
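The shift is easier to see side by side. A minimal sketch (with invented names) of the two models:

```python
# Default-allow with a blocklist (roughly today's model) versus
# default-deny with a whitelist (the internet island). Names are invented.

BLOCKLIST = {"naughty-site.example"}
WHITELIST = {"vetted-news.example", "vetted-shop.example"}

def allow_with_exceptions(host: str) -> bool:
    return host not in BLOCKLIST   # reachable unless someone bans it

def block_with_exceptions(host: str) -> bool:
    return host in WHITELIST       # unreachable unless someone vets it

for host in ["my-new-blog.example", "vetted-news.example"]:
    print(host, allow_with_exceptions(host), block_with_exceptions(host))
# A brand-new site is reachable under the first model,
# invisible under the second until it makes the safe list.
```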

International safe zones could be connected, with some sort of international body acting as oversight, deciding what makes the safe list. How is this different from what we have now? Well, currently anyone can set up a DNS server and add hostnames to the index. There is no central control: DNS servers are distributed, and hosting servers for websites pop up all the time. Anyone can get an IP address for an internet device and get access to other computers. That could easily change; it’s already changing.

What troubles me is that these changes could creep up on us. Some people will like the idea of a filtered internet. They would be happy to think that their children can’t get to unsavoury content while they are surfing in their bedrooms or on their phones. What is more, they could start getting upset when the filters fail, and start putting pressure on the government to change things, to force the ISPs to do it better. Could this lead to vetting and a two-tier internet? I think it could, and it might not look like a bad thing for a while. Those who want to cross the bridge are allowed to do so, and what they do isn’t recorded without good reason. The problem is that a good reason today might look very different in 10 years, and future governments might not be so benign (how benign are present-day governments?).