The Fundamental Rights Agency
of the European Union (FRA) has finished its opinion on the proposed directive for an EU-PNR system
for the retention and mass analysis of flight passenger data. It had been asked by the Civil Liberties Committee of the European Parliament in March 2011, on initiative of the Greens/EFA group.
I summarise the most important findings below; a summary in the FRA's own words is on page 20.
Further reading: In the meantime, the legal service of the EU Council has also shredded the proposed directive to pieces (German version only, sorry!).
The FRA opinion criticises the proposed PNR directive on the following grounds:

1) Data Protection Violations
The FRA shares the concerns published by the European Data Protection Supervisor (EDPS) and the Article 29 Working Party. The FRA opinion therefore sees itself as complementing them and only touches on issues that are not addressed by the data protection bodies:
"In general, the FRA shares these analysis and opinions and takes them as a point of departure. This FRA opinion complements and adds to the opinions of the EDPS and the Article 29 Working Group by focusing on topics from a broader fundamental rights perspective." (p. 5)

2) Ban on Discrimination not sufficiently respected

a) Discriminatory Profiling based on sensitive Data:
The directive would have to exclude many more data categories than the ones listed in Articles 5 and 11. The Commission did not cover the following categories in its proposal, even though they are protected under EU law:
"[I only list the ones not covered by the proposed directive, RB] sex, colour, social origin, genetic features, language, any other opinion (beyond political views), membership of a national minority, property, birth, disability, age” (p. 7)

b) Indirect Discrimination based on Profiling of Other Data:
This, too, would be prohibited under EU law, but is not excluded by the proposed directive. It covers all data categories not already addressed under a) (p. 9). To me this reads like a cautiously worded general ban on profiling, because any data category can be used for discrimination. Surveillance studies scholars called profiling "digital discrimination" years ago.
An example by analogy: Discrimination based on language, nationality or religion is banned, but if someone travels from Islamabad to Mecca once a year, one can infer that he or she is Muslim. Profiling on that basis would therefore also be prohibited.

3) Clarity of the law is not given:
"Individual passengers may be generally aware that their flight details are being recorded and exchanged but will typically know neither the assessment criteria applied nor whether or not they have been flagged by the system for further scrutiny. Therefore, any measure giving the authorities power to interfere with fundamental rights should contain explicit, detailed provisions" (p. 12)
This clarity is lacking because of:

a) Generic clauses
such as “general remarks (...) such as” in the description of the data transmitted, retained and analysed (item 12 in the annex to the proposed directive, see p. 13 of the FRA opinion). The types of data are also not limited:
"The explanatory text within the brackets also indicates solely what kind of information is included, but does not limit the data to be collected. This might possibly permit unlimited information gathering and transfer and, therefore, might not be justified by the purpose of the PNR system" (p. 13)

b) Purpose Limitation is lacking:
"The definition of serious crime included in Article 2 (h) includes an open formulation: (...) the discretion the proposal grants Member States to decide which crimes are covered and which are not seems unnecessarily broad." (p. 14)

c) Data Matching is unspecified:
"Article 4 (2) (b) states that “the Passenger Information Unit may compare PNR data against relevant databases, including international or national databases or national mirrors of Union databases, where they are established on the basis of Union law, on persons or objects sought or under alert, in accordance with Union, international and national rules applicable to such files.” This provision allows for matching PNR data ‘with undetermined databases’. Because the databases are not specified, the use of PNR data might not reach the required level of foreseeability" (p. 14)

4) No Proof of Necessity:
"The FRA is aware that further evidence proving the necessity of a PNR system might exist beyond what was disclosed." (p. 15)
In plain English: Do your homework! (Fun fact: The Commission currently has the same problem with regard to the evaluation of the Data Retention Directive 2006/24/EC, where it was not able to prove necessity based on hard data.)

5) False Positives / Repression against Innocent People
"The examples provided by the European Commission relate only to cases in which PNR data were successfully used in the course of investigations. For a more complete picture, it would also be necessary to analyse those cases in which the use of data proved to be misleading and led to the investigation of innocent people. Such a case is included by the European Union Committee of the UK House of Lords in its 2007 report on the EU/US Passenger Name Record (PNR) Agreement: the case of Maher Arar." (p. 16)

6) Proportionality of Applying the Measures to all Passengers:
The FRA quotes at length from rulings by the German Constitutional Court and other courts, and then concludes:
"The FRA suggests for proportionality reasons to include an explicit obligation in the proposal to make every reasonable effort to define assessment criteria in a manner which ensures that as few innocent people as possible are flagged by the system. This aspect could also play an important role for the review envisaged in Article 17 of the proposal which states that special attention should be given in the course of the review to “the quality of the assessments”." (p. 18)

7) Effective Oversight unclear:
Any data protection oversight body must be fully independent and must have powers of investigation as well as the power to issue binding rulings, which apparently is not made clear in the proposed directive draft. (p. 19f)