Internet Freedom Part II: Regulating the Panopticon

This is Part II of a two-part series, “The Declaration of the Separation of Internet and State.” Part I is here.

A summary of Part I: Our current internet implementation is a product of the “rule of convention,” not the “rule of law.” The application of “the rule of law,” whether from the typical conservative or progressive perspective, will erode the internet (the small network) until it serves no end other than surveillance and control.

Secondly, our “rule of convention” is not a pure “spontaneous order.” It’s more of a hybrid “constructivist order” along the lines of Hayek’s “Planning for Competition.” A question, then, is whether the degree of efficiency gained from this constructivist contribution turns out to be a blessing or a curse. In other words, there is evolutionary artificiality in our current degree of network efficiency. With this level of efficiency, what we do want is rules begetting rules (or conventions); what we want to avoid is rules begetting laws.

No Rule of Law for Network Neutrality

By design, the TCP/IP protocol suite does not rely on the transport layer for routing. Routers can simply process packets at the link layer (the physical medium of the link, known as “the wire”) and the IP layer. The transport layer is processed only by the end hosts (or terminal nodes). In principle, then, the network doesn’t need to know the details of the upper-layer protocols being transported. A routing node need only strip off the link-layer encapsulation of a packet, examine the destination IP address, look up its routing tables to figure out how to route the packet (that is, what link or interface it should re-place the packet on), and re-encapsulate it in a frame for the given physical link. Routing nodes do not need to know the type of payload being delivered (whether web, mail, etc.) or the TCP socket details that govern the implementation of reliable network transmission.
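To make the layering concrete, here is a minimal sketch in Python of the forwarding step just described. The routing table, interface names, and 14-byte Ethernet framing are invented for illustration; the point is that nothing past the IP header is ever read:

```python
import ipaddress

# Hypothetical routing table: prefix -> outgoing interface name.
ROUTES = {
    ipaddress.ip_network("10.0.0.0/8"): "eth1",
    ipaddress.ip_network("192.168.1.0/24"): "eth2",
    ipaddress.ip_network("0.0.0.0/0"): "eth0",  # default route
}

def forward(frame: bytes) -> tuple[str, bytes]:
    """Strip the link layer, read only the IP header, pick an interface."""
    ip_packet = frame[14:]  # drop the 14-byte Ethernet header ("the wire")
    # The destination address sits at bytes 16-19 of the IPv4 header;
    # nothing past the IP header (the TCP/UDP payload) is ever examined.
    dst = ipaddress.ip_address(ip_packet[16:20])
    best = max(
        (net for net in ROUTES if dst in net),
        key=lambda net: net.prefixlen,  # longest-prefix match wins
    )
    # Re-encapsulation in a frame for the chosen link is omitted here.
    return ROUTES[best], ip_packet
```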

But this is a convention that does not hold up in the real world. The ideal prototype of “network neutrality” was shattered by the 1988 “Morris Worm,” which exploited a buffer overflow in the Unix fingerd daemon and a default-option security flaw in the sendmail daemon to clog up the then-nascent internet backbone. The threat of a rogue email worm was prescient, because today it is still largely email that makes “transport layer neutrality” unworkable. To me, the very term “network neutrality” is a misnomer; the more accurate term is “transport layer neutrality.” And any ISP or telco that practices “transport layer neutrality” as a network policy will soon be out of business, if only because of the simple problem of spam. Imagine you are inundated with spam (you may already be, but imagine the problem orders of magnitude worse) and you call up your ISP to complain, only to be rebutted with the response: “Sorry, customer, but we practice a policy of Network Neutrality.” That answer would be unacceptable. And it would be stupid.

The very usability, reliability, and security of the public network depend on a robust implementation of “transport layer heuristic discrimination.” But our heuristic rules are not the product of any rule of law. Indeed, it is silly to think of adjudicating the problem of spam, rogue computer programs, “phishing,” and other “violations” of unwanted network traffic by calling up your local sheriff, your local cops, the FBI, etc. What the fuck are they going to do, other than perhaps subject your own system to a “forensic analysis” that can potentially land your ass in jail (remember, everyone is guilty of committing 3-5 federal crimes a day)? Simply put, the traditional model of the State as a formal compliance mechanism to ensure the “regularity” of a system does not apply here (does it ever apply?). The assurance of the “regularity” and usability of the system derives from the heuristic network rules adopted and implemented by the telcos, network providers, ISPs, datacenters, etc. The source of the rules is convention and practice, not legislation.[1]
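For a flavor of what such heuristic discrimination looks like in practice, here is a toy scoring filter in Python. The specific tests, weights, and threshold are invented for illustration; real operators run hundreds of such rules (SpamAssassin-style scoring, DNS blocklists), all maintained by convention rather than statute:

```python
# Illustrative blocklist; real operators query DNS blocklists (DNSBLs).
KNOWN_BAD_SENDERS = {"203.0.113.7"}

def spam_score(msg: dict) -> float:
    """Score a message against a few heuristic conventions."""
    score = 0.0
    if msg.get("sender_ip") in KNOWN_BAD_SENDERS:
        score += 5.0  # sender appears on a blocklist
    if msg.get("helo_hostname") != msg.get("reverse_dns"):
        score += 2.0  # HELO hostname doesn't match reverse DNS
    if msg.get("subject", "").isupper():
        score += 1.0  # ALL-CAPS subject line
    if msg.get("recipient_count", 0) > 100:
        score += 3.0  # mass-addressed delivery attempt
    return score

def should_deliver(msg: dict) -> bool:
    # The threshold is the operator's convention, not anyone's statute.
    return spam_score(msg) < 5.0
```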

Given the reality and necessity of “transport layer discrimination” in ensuring the regular functioning of the network, it seems a bit farcical to talk of the need for “legislation” to enforce “network neutrality.” What is really at stake here is a political definition of “allowable traffic” and the erection of centralized compliance mechanisms to enforce a policy of political economy. Why in the world would you want to transplant a working heuristic rules regime that operates perfectly fine by convention over to one governed by a legal regime spawned from a rent-seeking game? There is no “necessary” reason for this. The supposed threat of the integration of access and platform is a nonexistent problem. I discussed this in some detail in Part I. Recall that the early dominant players in the commercial phase of the internet played a strategy of leveraging access to control platform. It was a losing strategy. Those early players are mostly gone now. The mega-deal of that era, the merger of AOL and Time Warner, which was supposed to be the ultimate execution of this strategy, ranks as perhaps the worst business transaction in corporate history.

But I can say, without hesitation, that if a heuristic network policy of traffic discrimination is replaced by a formal policy emanating from a rent-seeking legal regime, then the only real human freedom likely to remain will simply be the freedom to obey. Liberalism would have engineered a final panopticon social order for the human race.

Data Analytics Regime Change

In a recent debate between Peter Thiel and Google’s Eric Schmidt, Thiel argued that Google had more or less become a public utility company. Thiel’s argument was that Google was out of ideas and was now using the monopoly dominance attained from its network effect in “search” to invest its cash reserves in US Government paper. Thiel then wove this observation into a broader argument that technology has failed to dramatically improve the general standard of living of the average person over the past generation, because the State has outlawed innovation in everything but information and bits, meaning computers and finance. In particular, Thiel bemoaned the outlawing of innovation in energy production.

Schmidt, of course, disagreed. He stressed the need for more education to maintain “competitiveness” and for the State to manage the demographics of entitlement spending, and he played up the wonderful information-technology opportunities presented by “data analytics.”

The over-arching theme of the debate was perhaps the attempt to explain the current “stagflation.” I generally have no problem explaining the stagflation, because I’m not aware of any political economic theory that can seriously argue for a police state as a model of economic prosperity. So I would have to come down on the side of Thiel. Indeed, Thiel’s argument reminded me of an older post, Modeling Capitalist Regime Change, that I wrote a couple of years ago. In that post I used “complexity theory” as a model of political economy, noting: (1) technological singularities for human progress are tied to infrequent (countably infrequent, namely “two” so far: fire and fossil fuels) energy advancements; (2) the information age is not a singularity because binary state machines are still tied to fossil fuels as an energy source; (3) capitalism itself can be modeled as a complex system that periodically has to undergo “phase transitions,” or “regime transitions,” because political economic rent-seeking (creating moral hazard) results in a dampening of negative feedback loops. A predominance of positive feedback loops creates the need for ever-increasing intervention measures to maintain resiliency. The system eventually has to transition to a new regime to avoid consuming every last resource protecting the old brittle, non-resilient regime from the catastrophic consequences of even minor disturbances (the “price of tea in China”).

In this complexity model we find that the political economy of capitalism isn’t really the engine of human progress it is often given credit for. It is really just a top-layer progression of regime transitions driven by political economic rent-seeking. Each successive transition is not a measure of human progress. Rather each is a necessity to avoid the catastrophic regime consequences of enforcing the previous regime.

So our “stagflation” can perhaps be better explained by the fact that we have no new regime to transition to. We are still propping up the old one. One of Thiel’s points was that, now that “finance” has been outlawed, we are left only with “computer technology” as the basis of our new regime, and this isn’t the foundation for anything other than serving a narrow economic interest. Of course, he was pointing at Eric Schmidt. And, of course, the narrow economic interest is “data analytics.”

There are two essential directions of computing. One is the cloud, and we are not merely talking about data; we are also talking about the application service layer. The second is the move from IPv4 to IPv6. The former is a 32-bit addressing scheme that gives us roughly 4×10^9 nodes on the network. The latter is a 128-bit addressing scheme that would allow roughly 3.4×10^38 addressable nodes on the network. That exponential increase in addressable network nodes can hardly be comprehended or visually grasped by the human brain. The conceptual model (the thing that can be intuitively grasped) simply reduces to a statement that everything produced will be network addressable (with a permanent address). A number that large exceeds the total number of molecules on the surface of the earth, so we are certainly talking about a future of network-addressable nanotechnology.
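The arithmetic is easy to state and hard to intuit. A quick back-of-the-envelope check, assuming nothing beyond the 32-bit and 128-bit address widths in the protocol specs:

```python
ipv4_space = 2 ** 32    # 4,294,967,296 addresses, roughly 4 x 10^9
ipv6_space = 2 ** 128   # roughly 3.4 x 10^38 addresses

print(f"IPv4:  {ipv4_space:.2e}")                # 4.29e+09
print(f"IPv6:  {ipv6_space:.2e}")                # 3.40e+38
print(f"ratio: {ipv6_space // ipv4_space:.1e}")  # ~7.9e+28 IPv6 addresses
                                                 # for every single IPv4 address
```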

A reason for great pessimism is that a political economic regime built over data analytics results in a prison. Innovation in this area, within the context of a complexity capitalist model that is “regulated” only by positive feedback mechanisms, is probably the type of innovation you want to avoid. The prospect of an IPv6-networked world should rightly scare the shit out of you. The consequences of this world really depend on its implementation. Obviously, the prospect of everything being permanently network addressable and subject to relentless data analytics is orders of magnitude beyond even Orwell. However, this reality could be avoided, or highly mitigated, by implementing the privacy features of IPv6, including data anonymization policies. The trick is that these anonymization algorithms have to be implemented by your network provider, and this algorithmic masking occurs at the network layer. So, as we see, IPv6, in terms of any privacy consideration, requires the network provider to operate algorithmically at the network layer. The notion of “network neutrality” is further seen to be a complete straw-man concept.
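To make “algorithmic masking at the network layer” concrete, here is a minimal sketch of one assumed policy: zero out the 64-bit interface identifier before an address is logged or analyzed, so only the routing prefix survives. Real deployments would more likely use keyed hashing or RFC 4941-style temporary addresses; this shows only the layer at which the masking operates, not a production scheme:

```python
import ipaddress

def anonymize(addr: str, prefix_bits: int = 64) -> ipaddress.IPv6Address:
    """Keep the routing prefix; zero the interface identifier that would
    otherwise follow a device permanently across the network."""
    ip = ipaddress.IPv6Address(addr)
    # Mask that keeps the top `prefix_bits` bits and zeroes the rest.
    mask = ((1 << prefix_bits) - 1) << (128 - prefix_bits)
    return ipaddress.IPv6Address(int(ip) & mask)

print(anonymize("2001:db8:85a3::8a2e:370:7334"))  # -> 2001:db8:85a3::
```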

Why the US Government is Militarizing the Internet

By now it should be clear that the actual threat to the internet is not the integration of access and platform but rather the subjugation of convention and practice to the “rule of law.” That is to say, the greatest threat to the internet is the de-civilization of network administration under a formal legal compliance regime. IPv6 gives us the panopticon outcome if network policy is placed under such a formal regime. Your networked existence boils down to what type of competition determines the algorithmic masking at the network layer. The political competition solution paves the way for a “data-analytics” regime change.

If we take our theory seriously, it is not surprising that the US Military is actively engaged in militarizing the internet. It is actively and brazenly involved in disseminating rogue computer programs and in remote system penetration, for the express purpose of heightening the threat of the public network as an instrument of war. This in turn will result in some legal overhaul that puts the civil administration of the public network under a formal legal compliance regime, under the guise of National Security.

Conclusion

We proffer A Declaration of the Separation of Internet and State because a political economy applied to the public network results in a panopticon social order. The original implementation may have been a hybrid, constructivist example of Hayek’s “Planning for Competition,” but the subsequent order demonstrates the threat of “planning against competition.” A glimpse of this future world can be had by observing this recent action by Cisco. That action was beaten back and reversed by the outcry of Cisco’s customer base. But when such an action is enforced by a formal “rule of law,” there isn’t much recourse. For now, consider that episode just a dry run…

[1] Unfortunately, this is not entirely true. Most providers will at least give lip service and pretense to enforcing copyright and pornography statute violations with respect to the network. But it is mostly pretense. After all, tech-support call centers aren’t there to process customer complaints of copyright violations over the network.
