Deep Packet Inspection puts its stamp on an evolving Internet

There is a “perfect storm” of Internet usage, disruptive technologies, and political sentiment that promises to shape the Internet’s future. In this month’s column we’ll explore the elements of this perfect storm and consider its implications.

Internet usage

It has been widely publicized that Internet bandwidth usage will more than quadruple over the next five years. This tremendous bandwidth boom represents both a challenge and an opportunity for network service providers. Increased demand means additional revenues and profits to be had, but it also means capital expenditures to deploy a network that can support that demand.

On one hand, service providers can spend money on infrastructure that carries multimedia traffic faster and at higher capacity. At the same time, even the latest network infrastructure products do not appear able to keep pace with projected demand. This pushes service providers to consider pricing based on bandwidth usage. Traffic management techniques and tiered service-level pricing are options being considered both to maintain service quality acceptable to all users and to drive revenue growth. This kind of “access- and usage-based pricing” model has been talked about for years but has never really taken hold. The main reason is that for Service Level Agreements (SLAs) to be enforced, the network infrastructure must take on the additional burden of enforcing those policies, which further bogs down an already oversubscribed infrastructure.
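To give a sense of what enforcing these policies actually involves, here is a deliberately simplified sketch (in Python, with invented tier names and caps) of per-subscriber usage metering against tiered caps. A real implementation would run in the data plane at wire speed rather than in application code.

    # Hypothetical tiered caps in gigabytes per month (illustrative values only).
    TIER_CAPS_GB = {"basic": 100, "plus": 500, "premium": 2000}

    class SubscriberMeter:
        """Tracks one subscriber's monthly usage against the cap for their tier."""

        def __init__(self, subscriber_id, tier):
            self.subscriber_id = subscriber_id
            self.cap_bytes = TIER_CAPS_GB[tier] * 10**9
            self.used_bytes = 0

        def record(self, packet_bytes):
            """Account for a forwarded packet; return True once the cap is exceeded."""
            self.used_bytes += packet_bytes
            return self.used_bytes > self.cap_bytes

    # Every forwarded packet must be attributed to a subscriber and counted,
    # which is the extra per-packet work described above.
    meter = SubscriberMeter("subscriber-42", "basic")
    print(meter.record(1500))   # False until cumulative usage passes 100 GB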

However, the ever-increasing demand for bandwidth, coupled with the deployment of disruptive technologies such as multicore processing and Deep Packet Inspection (DPI), will enable network infrastructure to become more intelligent while still operating at wire speed. This gives network service providers the ability to roll out new bandwidth-based pricing models as soon as they can no longer satisfactorily accommodate the demands of their most important subscribers.

Disruptive technology

Two key disruptive technologies are at work: one is the move toward multicore processors, and the other is DPI. Multicore processors are supplanting single-core processors within network infrastructure and end equipment as a way to continue the significant performance improvements required to deliver and display next-generation multimedia services.

On the software side, DPI is a disruptive technology whose adoption the proliferation of multicore processors is accelerating. DPI has been much discussed with regard to network neutrality and privacy issues, and it promises to change the Internet’s philosophy and landscape forever. The original Internet design philosophy cast the Internet as a “dumb pipe” focused on reliable end-to-end delivery of data, without regard to the content being carried or the endpoints exchanging it.

Now, however, the coupling of multicore processors and deep packet inspection software is enabling network infrastructure to become more intelligent and more aware of the content it carries. Ralf Bendrath describes a large number of applications for DPI and their implications for network neutrality and Internet governance in his March 2009 paper, “Global Technology Trends and National Regulation: Explaining Variation in the Governance of Deep Packet Inspection” [1].
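To make the contrast concrete, the sketch below (in Python, with simplified signatures and port mappings chosen purely for illustration) shows the difference between “dumb pipe” forwarding, which consults only packet headers, and a DPI-style classifier that also examines the payload. A production DPI engine would use optimized pattern-matching hardware or libraries; this is only a minimal illustration of the idea.

    # Header-only forwarding vs. payload-aware (DPI) classification.
    # The signatures below are simplified stand-ins, not a real protocol database.
    PAYLOAD_SIGNATURES = {
        b"\x13BitTorrent protocol": "bittorrent",  # BitTorrent handshake prefix
        b"GET ": "http",                           # start of an HTTP request
        b"\x16\x03": "tls",                        # TLS record header
    }

    def classify_by_header(dst_port):
        """The 'dumb pipe' view: only the TCP/UDP header is consulted."""
        well_known = {80: "http", 443: "tls", 25: "smtp"}
        return well_known.get(dst_port, "unknown")

    def classify_by_payload(payload):
        """The DPI view: the packet payload itself is inspected for signatures."""
        for signature, app in PAYLOAD_SIGNATURES.items():
            if payload.startswith(signature):
                return app
        return "unknown"

    # A file-sharing client running over port 80 looks like HTTP to the
    # header-only check but is identified once the payload is examined.
    payload = b"\x13BitTorrent protocol" + bytes(48)
    print(classify_by_header(80))        # -> http
    print(classify_by_payload(payload))  # -> bittorrent

The example is contrived, but it captures why DPI changes the picture: a flow that is opaque to a header-only network becomes identifiable, and therefore manageable, once its payload is examined.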

Deep packet inspection coupled with multicore processor technology has the potential to change the very nature of the Internet from a “dumb pipe” to an “intelligent network,” giving Internet service providers the ability to assume a new role as potential gatekeepers of all their users’ traffic. These gatekeeping roles include content filtering, network monitoring and surveillance, and identification and classification of traffic, which enables content subscription models. This capability represents a significant paradigm shift from the current Internet usage model of technical simplicity, political freedom, and economic openness described in Ralf Bendrath’s paper.

Political sentiment

Recent events in Egypt and elsewhere in the Middle East drive home the kind of political and social reform that Internet social media can help bring about. Ralf Bendrath’s paper on deep packet inspection and its ramifications is referenced below, and I recommend reading it in its entirety if you’re interested in the political ramifications of disruptive technology. One section outlines the characteristics now at stake:

“To summarize, the internet has so far been a loose network of interconnected data networks that share few central characteristics (see also Carpenter 1996):

  1. Technical Simplicity: Because of the layered approach, they are only connected through the TCP/IP protocol suite and a shared address space. Therefore, they are highly open to new transportation methods (WiMax, UMTS etc.) as well as new applications (e.g. Twitter, Bittorrent, or XMPP/Jabber).
  2. Political Freedom: Because the higher-layer payloads are encapsulated for the lower layers, the users have end-to-end communication channels at the application layer, which are normally not interfered with in transport.
  3. Economic Openness: Because of the openness for new applications, they do not discriminate traffic according to its source, therefore treating all innovations at the application layer equally and giving them a fair chance to succeed at the market.”

Note that adding intelligence within the infrastructure begins to break down all three of the characteristics above. DPI applications range from relatively innocuous to quite intrusive. Ad injection, for example, is a fairly new concept in which ads matched to the specific interests of each user are injected into the websites that user visits, based on detailed analysis of the user’s Internet traffic. While innocuous on the surface, it still requires monitoring the user’s activities, the sites the user visits, and potentially the user’s buying habits on the Internet.
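As a rough illustration of why ad injection depends on this kind of monitoring, the hypothetical Python sketch below builds a per-user interest profile simply by counting the categories of sites a user is observed visiting. The site-to-category table and the category names are invented for the example; a real system would maintain a far larger, continuously updated database.

    from collections import Counter

    # Hypothetical mapping from observed hostnames to interest categories.
    SITE_CATEGORIES = {
        "sports.example.com": "sports",
        "cars.example.net": "automotive",
        "recipes.example.org": "cooking",
    }

    def build_interest_profile(observed_hostnames):
        """Count the interest categories of the sites a user has visited."""
        profile = Counter()
        for host in observed_hostnames:
            category = SITE_CATEGORIES.get(host)
            if category:
                profile[category] += 1
        return profile

    def pick_ad_category(profile):
        """Choose the ad category matching the user's strongest interest."""
        return profile.most_common(1)[0][0] if profile else "generic"

    visits = ["sports.example.com", "cars.example.net", "sports.example.com"]
    print(pick_ad_category(build_interest_profile(visits)))   # -> sports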

Another application is network security, where network operators filter malware and other dangerous traffic before it reaches their subscribers. This kind of service is valuable and protects subscribers against malicious threats they may not be experienced enough to avoid.

The lawful surveillance area is simply an extension of the lawful wiretapping laws for analog telephone systems. The Communications Assistance for Law Enforcement Act (CALEA) requirements, extended to broadband Internet providers in the mid-2000s, govern similar kinds of lawful surveillance on the Internet and require that Internet service providers be able to provide this information when presented with the proper warrant. This kind of lawful surveillance also arms the network with the ability to intrusively monitor and intercept any kind of Internet interaction: emails, VoIP calls, websites visited, you name it. For those interested, here’s a link to FBI testimony on the need for new surveillance laws: http://judiciary.house.gov/hearings/hear_02172011.html

A related area where law enforcement and the Internet intersect is copyright enforcement. The Recording Industry Association of America (RIAA) cites an analysis by the Institute for Policy Innovation that concludes global music piracy causes $12.5 billion in economic losses every year. There is a significant push by the content industry to require service providers to deploy filtering equipment that can detect and block copyrighted material passing between subscribers, in an attempt to rein in peer-to-peer downloading of pirated content.

Then, of course, there is the network management side of DPI for dealing with bandwidth usage. These techniques allow service providers to throttle or block various applications based on their assigned priority within the network. This aspect of DPI may be the most controversial because it leaves the decision of what to throttle up to the service provider. Throttling bandwidth for a specific application could reduce consumer demand for that application, so the practice opens up potential issues involving preferential bandwidth access and delivery for certain applications based on the highest bid.
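As a rough sketch of how such throttling might work once traffic has been classified (for example, by the DPI classifier sketched earlier), the Python fragment below applies a token-bucket rate limit per application class. The class names, rates, and burst sizes are invented for the example, and a real implementation would live in the forwarding plane rather than in application code.

    import time

    class TokenBucket:
        """Tokens accumulate at rate_bps bytes per second; a packet is forwarded
        only if enough tokens are available to cover its size."""

        def __init__(self, rate_bps, burst_bytes):
            self.rate_bps = rate_bps
            self.burst_bytes = burst_bytes
            self.tokens = burst_bytes
            self.last_refill = time.monotonic()

        def allow(self, packet_bytes):
            now = time.monotonic()
            self.tokens = min(self.burst_bytes,
                              self.tokens + (now - self.last_refill) * self.rate_bps)
            self.last_refill = now
            if self.tokens >= packet_bytes:
                self.tokens -= packet_bytes
                return True
            return False

    # Hypothetical per-class policy: peer-to-peer traffic is throttled hard,
    # video gets a generous allowance, and unlisted classes are left alone.
    POLICY = {
        "bittorrent": TokenBucket(rate_bps=125_000, burst_bytes=64_000),    # ~1 Mb/s
        "video": TokenBucket(rate_bps=2_500_000, burst_bytes=1_000_000),    # ~20 Mb/s
    }

    def handle_packet(app_class, packet_bytes):
        bucket = POLICY.get(app_class)
        if bucket is None or bucket.allow(packet_bytes):
            return "FORWARD"
        return "DROP"   # or queue/delay, depending on operator policy

    print(handle_packet("bittorrent", 1500))
    print(handle_packet("web", 1500))        # unlisted class -> FORWARD

A token bucket is a common way to express an “average rate plus burst” limit, which is why it shows up in many traffic-management implementations; the controversial part is not the mechanism but who decides which classes get which rates.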

Conclusion

The conclusion is that there is really no conclusion. The point is not to comment on whether any or all of these developments influencing the evolution of the Internet are “good” or “bad.” Rather, my point is to raise awareness that this shift is happening. Organizations are examining DPI’s implications and exactly what kinds of regulations should be put in place to ensure the proper and healthy evolution of the Internet, which is now interwoven into the fabric of our society. Its evolution is tied to our evolution as a society and to how we live, learn, and work. Whatever your philosophy on these matters, I encourage you to make your thoughts known. However the Internet evolves, it should be shaped by popular consensus, not special interests.

References

[1] Ralf Bendrath, “Global Technology Trends and National Regulation: Explaining Variation in the Governance of Deep Packet Inspection,” paper prepared for the International Studies Association Annual Convention, New York City, 15-18 February 2009; updated 3 March 2009.

For more information, contact Curt at cschwaderer@opensystemsmedia.com.