On February 16, 2010, the Assemblée Nationale, the lower house of the French legislature, approved the draft Loi d’Orientation et de Programmation pour la Sécurité Intérieure (Law on the Orientation and Programming for Internal Security, or “LOPPSI”). After the DADVSI law of 2007, which criminalized Digital Rights Management (DRM) circumvention, and the controversial HADOPI law of 2009, which sought to enact a “three strikes” disconnection policy against online copyright infringers, the latest bill has been described as conferring on the French government “unprecedented control over the Internet” (Der Spiegel; see also The Register, Le Monde (in French)). Le Monde sees in LOPPSI a “true arsenal for cyber security,” which is being pushed as a matter of legislative priority by President Nicolas Sarkozy.
Ragbag security legislation
The bill is a ragbag of security-related provisions, spanning a diverse range of issues such as online identity theft, video surveillance, stadium violence, and dangerous driving. The law apparently also authorizes the French authorities to use malware to obtain evidence on criminal suspects, for example by covertly installing software on their PCs to log their keystrokes. While the express purpose of the bill is to set out the framework for the operations of law enforcement agencies for the next five years, it focuses particularly on the technical means that can be employed by the police and judges.
The provision that has proven most controversial is draft article 4, which provides for the filtering, on the authority of ministerial orders, of websites hosting child pornography. The 312 to 214 vote in favor by the Assemblée is unlikely to mark the end of the controversy, as the upper house, the Sénat (Senate), has yet to debate and approve the law. This post considers the text of the provision and the debates surrounding it, before comparing the proposal to similar proposals and existing filtering systems around the world.
Filtering by ministerial order
Draft article 4 is explicitly targeted at, and limited to, the “requirements of the fight against images or representations of minors” prohibited by the Code Pénal (Criminal Code), i.e. child pornography. There is no leeway under the current wording of the article for blocking sites other than those which provide access to child pornography. In terms of procedure, as pointed out by the Ministry of the Interior’s press release on the law, “the rule is simple: the Minister for the Interior communicates to ISPs a blacklist of sites and online content to be blocked, and it is the ISPs who prevent access to those sites and content from computers located in France.”
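The press release describes the architecture only in outline: the ministry distributes a blacklist, and each ISP enforces it. LOPPSI does not prescribe a blocking technique, and real deployments vary (DNS tampering, null-routing, proxy filtering). Purely as an illustration of the domain-matching step an ISP-side filter would need, here is a minimal sketch; the domain names are invented placeholders, not real blacklist entries.

```python
# Illustrative sketch only: LOPPSI specifies no technique; this models just
# the "is this hostname on the blacklist?" check an ISP filter would perform.

def normalize(host: str) -> str:
    """Lower-case and strip a trailing dot (DNS names are case-insensitive)."""
    return host.lower().rstrip(".")

def is_blocked(host: str, blacklist: set[str]) -> bool:
    """Return True if the host, or any parent domain of it, is blacklisted."""
    labels = normalize(host).split(".")
    # Check the name itself and every parent suffix, e.g. "a.blocked.example"
    # is caught by a blacklist entry for "blocked.example".
    return any(".".join(labels[i:]) in blacklist for i in range(len(labels)))

blacklist = {normalize(d) for d in ["blocked.example"]}  # placeholder entry
print(is_blocked("sub.blocked.example", blacklist))  # True
print(is_blocked("innocent.example", blacklist))     # False
```

Matching parent domains as well as exact names is one design choice among several; a real deployment would also have to decide whether to block at the DNS, IP, or URL level, each with different over- and under-blocking trade-offs of the kind discussed below.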
Article 4 was one of the main points of contention in the legislative debates over the bill. According to Le Monde, the (right wing) majority accused the left, which opposed the bill, of turning a blind eye to the kind of materials easily available online. The left, on the other hand, protested against the “demonization” of the internet, a hostility which Green députée (Representative) Martine Billard sees as rooted in the government’s frustration with its inability to control the internet. The opposition further attacked the bill on the grounds that it addresses neither the victims of the crimes at issue nor those who create the images, but focuses only on the means of transmission.
One crucial amendment to the bill was introduced during the debates in the Assemblée Nationale by député Lionel Tardy, a member of the majority UMP party. The amendment requires the approval of a judge before the ministerial order to block a given site can be put into effect. According to Le Monde, the bill sponsors expressed their reservations regarding this amendment (and in particular its potential to slow down the enforcement procedure), but in the end chose not to oppose it. This decision may reflect the opinion of the Commission des Lois (Law Commission), which took the view that the absence of this procedural safeguard could lead to the law being struck down as unconstitutional (as happened to the HADOPI law last year).
None of LOPPSI’s critics dispute that child pornography should be fiercely cracked down on. Rather, a leading theme of criticism of the bill is a concern that, by enshrining a ministerial power to order the blocking of internet sites, LOPPSI lays the foundations for a system of internet filtering that could easily outgrow its original purpose. French cybercrime expert Guillaume Lovet (quoted here) notes that the legislation gives the French government a “foot in the door,” and observes that it reflects a growing international trend of “legislate first, address accountability later.”
Blogger Jean-Michel Planche notes that, if the law is passed, the internet will become the first infrastructure network (e.g. roads, electricity, gas, postal services) to come under the control of the Ministry of the Interior, and wonders what implications this may have as the internet’s role as a platform for all kinds of social and economic exchanges grows.
A number of critics have also questioned the effectiveness of the bill, remarking that this type of ISP-level filtering would do little to prevent the determined and tech-savvy from accessing offending websites, for example through virtual private networks (VPNs) (see e.g. this online LOPPSI forum).
The explanatory notes to LOPPSI mention the fact that “neighboring democracies” such as Denmark, the Netherlands, Norway, Sweden and the United Kingdom have put in place technical measures enabling the blocking of access to specified sites from within their territories (though these have not been formalized in LOPPSI-like legislation; Le Monde provides a useful map which identifies various countries around the world which have adopted targeted filtering of child pornography sites). The experience of filtering in these countries is not encouraging with regard to the accountability of blacklisting systems.
The blacklists maintained by a number of countries, including Denmark, Norway, Australia and Thailand, have been leaked through WikiLeaks over the last few years. The Thai government’s blacklist, aimed at child pornography, allegedly included 1,203 political sites which were thought to criticize the Thai king, in breach of Thailand’s strict lèse majesté laws (see ZeroPaid post here). But even in the case of western democracies, blacklists have been accused of being open to abuse: Forbes reported that the blacklist compiled by the Australian Communications and Media Authority, which is meant to target child pornography and terrorist websites, was found to include the websites of a tour operator and a Queensland dentist’s practice. The U.K. filtering system came under fire in 2008 when it was found that six major British ISPs had blocked access to a Wikipedia page which contained an image reproducing a controversial Scorpions album cover (see report from The Register).
An interesting contrast to LOPPSI is the fate of a recent German filtering proposal, the Gesetz zur Erschwerung des Zugangs zu kinderpornographischen Inhalten in Kommunikationsnetzen (Law on the Restriction of Access to Child Pornography Content in Communication Networks), which was initially approved in the summer of last year by the German lower house, the Bundestag (see Deutsche Welle report). Unlike the French bill, the German law would not have blocked access to the offending sites but would have thrown up a warning page displaying a large red stop sign. The stop sign would notify web users of the nature of the content they were seeking to access, but nevertheless allow the users to proceed if they so chose. The proposal met with considerable public opposition, including an online petition signed by more than 130,000 people (the biggest online petition in Germany to date). Elections in September 2009 resulted in changes to the governing coalition, and the liberal FDP made it clear, during the talks that led to it joining the government, that it would not support the filtering provisions. The filtering strategy was formally dropped on February 8, 2010, in favor of a policy targeted at deleting offending websites rather than blocking them (see Opennet report).
Looking at the wording of article 4 of LOPPSI alone, the concerns of some of the bill’s critics may seem overblown. Few dispute the pressing need to fight the dissemination of child pornography online. Even if ISP-level filtering is unlikely to deter the most resourceful seekers of such content, what limiting effect it does have must surely be welcomed. Regarding the criticism that the bill focuses only on intermediaries, it is clear that other legislation targets the creators of child pornography. Furthermore, in many areas of law enforcement, targeting intermediaries often proves to be the most effective means of enforcement. Regarding blacklists, there is a valid argument that releasing the blacklist publicly could compromise the aim of suppressing access to the sites concerned, as it would provide potential offenders with an “address book” of prohibited sites, which the more tech-savvy could then easily access. However, the patchy record even of liberal democracies suggests a strong need for accountability mechanisms in the administration of any kind of blacklist system. In this respect, the amendment introduced by Mr. Tardy is a welcome and necessary procedural safeguard. Nevertheless, there is little doubt that its sufficiency, and indeed the legitimacy of any kind of filtering strategy, will be much debated as LOPPSI makes its way through the French legislative process.
 In fact, the current bill should more accurately be referred to as “LOPPSI 2,” as a law of the same name was adopted in 2002 (see French Wikipedia article here).
 Note that French criminal judges can be much more intimately involved in investigation and evidence gathering than their common law counterparts.