thoughts and observations of a privacy, security and internet researcher, activist, and policy advisor

Wednesday, April 16, 2008

Deep Packet Inspection, or: The end of the net as we’ve known it?

My new research project, which just started at TU Delft and is supervised by Milton Mueller and Harry Bouwman, has produced a first short description:

"Like a daydreaming postal worker, the network simply moves the data and leaves interpretation of the data to the applications at either end. This minimalism in design is intentional. It reflects both a political decision about disabling control and a technological decision about the optimal network design."
(Lawrence Lessig: Code and Other Laws of Cyberspace,
New York: Basic Books 1999, p. 32)

Technological advances in routers and network monitoring equipment now allow Internet Service Providers (ISPs) to monitor the content of TCP/IP packets in real time and decide accordingly how to handle them. If rolled out widely, this technology, known as deep packet inspection (DPI), would turn the internet into something completely new. Almost ten years ago, Lawrence Lessig reminded us that the internet's design is not a natural given, but the outcome of political and technological decisions and trends. DPI thus has the potential to affect the fundamental properties of the internet as a global public infrastructure, and with them the capacity of global internet governance.
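The core mechanism can be sketched in a few lines of Python (the signatures and policy actions below are illustrative assumptions, not any ISP's actual configuration): a DPI engine matches packet payloads against byte-pattern signatures for known application protocols and applies a handling policy to whatever it recognizes.

```python
# Minimal sketch of the DPI idea: classify a packet's payload against
# byte-pattern signatures and return a handling decision.
# Signatures and policy names here are hypothetical examples.

SIGNATURES = {
    # A BitTorrent handshake begins with chr(19) + "BitTorrent protocol"
    b"\x13BitTorrent protocol": "throttle",
    # Plain HTTP request methods
    b"GET ": "pass",
    b"POST ": "pass",
}

def inspect(payload: bytes) -> str:
    """Return a traffic-handling decision based on payload content."""
    for signature, action in SIGNATURES.items():
        if payload.startswith(signature):
            return action
    return "pass"  # default: forward unclassified traffic unchanged

print(inspect(b"\x13BitTorrent protocol" + b"\x00" * 8))  # throttle
print(inspect(b"GET /index.html HTTP/1.1"))               # pass
```

Real DPI appliances do this at line rate in hardware and use far richer classifiers, but the privacy point is the same: the decision depends on reading the payload itself, not just the packet headers.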

DPI is reportedly motivated by three considerations on the ISPs’ side:
  1. They are under regulatory or public pressure from intellectual property owners and government agencies to control and filter the flow of illegal content.
  2. They pursue a strategy of vertical integration with specific content providers, slowing down competitors’ content or inserting ads into content served by third parties.
  3. They try to allocate bandwidth more efficiently and fairly among users, especially in the more bandwidth-constrained last mile and in the mobile internet.
The research project will examine the deployment of DPI by ISPs and its actual and prospective impact on internet users and internet governance. It will proceed in four steps:
  1. Empirical phase: It will examine the technological and design trends and true scope of implementation of DPI capabilities by ISPs, and the economic and regulatory drivers and barriers promoting as well as constraining its use. The data for this phase will be gathered in several case studies (different countries and ISPs) through desk research and interviews. Relevant indicators will include: design and deployment of DPI technologies; design, availability and deployment of DPI circumvention technologies such as encryption; bandwidth supply and demand for backbone and mobile internet; regulatory and other legal obligations for ISPs; economic indicators like ISPs’ market development and revenue trends.
  2. Explanatory phase: It will then attempt to explain these empirical developments. Drawing on political, economic, and socio-technological theories, it will derive more specific hypotheses and models and test them against the data.
  3. Normative phase: The project will then assess the implications of DPI for human rights, such as the privacy and freedom of expression of internet users; for market failures and competition policy; and for norms of good infrastructure governance such as the “common carrier” concept or “network neutrality”.
  4. Praxeological phase: Based on the explanatory models developed before, it will derive recommendations on how to most efficiently rectify the normative problems identified.
Any comments and feedback are welcome, especially on the theory/explanatory part and on where to get the relevant data.

Wednesday, April 02, 2008

Privacy, Forgetting and Information Ecology

I am at the re:publica conference in Berlin this week, just listening to Viktor Mayer-Schönberger's keynote on forgetting and remembering. His speech is about "information ecology": he reminds us that throughout human history, forgetting has always been the norm, while remembering was the exception, one that took effort and was costly. This is changing with computers and hard drives, creating new problems in terms of privacy and out-of-context judgements based on outdated information. He suggests an expiry date for personal information. Read his full argument here.

A similar idea was developed by the Identity Futures Working Group last year. If forgetting is so difficult nowadays, we should at least display which information is older and may therefore be less relevant:
The Older Posts By And About People Appear More ‘Aged’ When Viewed. 2010±. It is now the norm for ‘digital aging’ to be visually displayed on documents as they age. Usenet posts from 20 years ago, although still viewable, have grey age spots and cracks by default when first viewed. Myspace posts from 2 years ago are yellow-tinged.
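As a toy illustration of that scenario, such digital aging could be driven by a simple mapping from a post's age to a visual treatment. The thresholds and style names below are hypothetical, loosely echoing the examples in the quote:

```python
from datetime import date

def aging_style(posted: date, today: date) -> str:
    """Map a post's age to a (hypothetical) visual 'aging' treatment,
    mirroring the Identity Futures scenario: the older the post,
    the more visibly aged it should appear."""
    years = (today - posted).days / 365.25
    if years >= 20:
        return "grayscale, age spots and cracks"  # like old Usenet posts
    if years >= 2:
        return "yellow tinge"                     # like old Myspace posts
    return "no aging"                             # recent posts look fresh

print(aging_style(date(1988, 4, 1), date(2008, 4, 2)))
```

A real implementation would translate these labels into display styles, but the essential point survives even in this sketch: relevance decay becomes visible without the information ever being deleted.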
Ed Felten at Freedom to Tinker has a new idea on how to create incentives for forgetting, based on the idea of a market for carbon dioxide emissions:
We all want more and bigger hard drives, but what is going to be stored on those drives? Information, probably relating to other people. The equation is simple: more storage equals more privacy invasion. That’s why I have pledged to maintain a storage-neutral lifestyle. From now on, whenever I buy a new hard drive, I’ll either delete the same amount of old information, or I’ll purchase a storage offset from someone else who has extra data to delete. By bidding up the cost of storage offsets, I’ll help create a market for storage conservation, without the inconvenience of changing my storage-intensive lifestyle.