Another side of net neutrality: The case in favor of Title II

Original publication ©2014 Fierce Networks

In 2014 for FierceEnterpriseCommunications — perhaps the longest-titled newsletter ever to circulate in the hundreds of thousands — I chose to focus the publication on many of the aspects of communications that fell (at that time) outside the FierceNetworks newsletter’s bailiwick.  One of them was the net neutrality debate in Congress and throughout various agencies of the Obama administration.  A federal appeals court had ruled that the Federal Communications Commission lacked the authority to enforce net neutrality principles for the Internet, since the Internet wasn’t really (in its view) a telecommunications service.  That could change, however, if the Internet were reclassified as a Title II “common carrier.”  In an interview with me, the man who wrote the words of the law the court ruled upon told me the court got it wrong.

When Federal Communications Commission Chairman Tom Wheeler first raised the specter of Title II in late April, it was with reference to the common carrier provisions of the Communications Act of 1934, as amended by the Telecommunications Act of 1996, which were intended to provide a framework for regulating the Internet’s underlying infrastructure.  And it was introduced the way a substitute teacher threatens to assign seating in a classroom.  It’s not what anyone wants, evidently, but if Wheeler has to resort to it in order to restore some order to net neutrality, he’s prepared to swallow his pride.

Lost amid that drama was the question of what Title II really is.  There’s a common perception (and I’ve certainly done my part to advance it) that the principle of a “common carrier” applies best to the regulation of telephone service.  Now, a former staffer in the office of the late Sen. Ted Stevens (R – Alaska), one of the lawmakers who drafted the ’96 Act — a staffer who literally wrote the words on the bill’s draft pages — tells FierceEnterpriseCommunications that lawmakers knew full well at the time that the phrase “common carrier” applied to the transport mechanism at the infrastructure layer of the Internet.

Earl Comstock went on to serve as CEO of the COMPTEL trade organization for seven years, including testifying before Congress in hearings on net neutrality and other legislation.  His term having recently ended, Comstock now serves as a practicing attorney with the law firm Eckert Seamans Cherin & Mellott, LLC, focusing on federal legislative affairs.  In an extended interview with us, Comstock said he agrees with advocates of a system where the applications and functions that the Internet provides are unregulated, by the FCC or anyone else.  But the underlying network of networks deserves oversight, and he believes Title II is better suited to that task than anything the FCC has attempted since.

“What the basic tenet of ‘common carriage’ deals with is, essentially, unreasonable discrimination,” says Comstock.  “Everybody seems to think that, if I say it’s common carriage, I can never discriminate or do anything.  No, you can.  But you can’t do it on the basis that you don’t like people who are wearing black shoes, and you do like people wearing white shoes.  That kind of arbitrary distinction doesn’t fly.  On the other hand, if I happen to be somebody who’s using a million minutes and therefore you’re going to give me a discount of a penny a minute, because I’m buying a lot, that’s a normal sort of reasonable distinction that could be made.”

The key difference, Comstock continues, is that the carrier cannot then deny the same discount to some other customer buying at the same volume.  Buying products in bulk or in high quantities at a discount has almost never been prohibited by law, unless those products are deemed hazardous substances or dangerous chemicals.

In Verizon v. FCC, the Appeals Court ruled that the FCC lacked the authority to regulate Internet service as a common carrier so long as the Commission continued to classify that service as an information service.  But as Comstock points out, the court provided a roadmap by which the Commission could enforce common carriage: by essentially striking the classification that such enforcement would inherently contradict.

Back in 1980, the FCC implemented a concept of structural separation between telecommunications facilities and telecom functions or applications.  This was necessary to artificially create a system whereby carriers — including, after the 1984 divestiture, the “Baby Bells” broken off from the original AT&T Corp. — could compete in what was then called the “enhanced services” market.  That FCC regulatory regime, today known as Computer II, gave birth in law to the notion of infrastructure separate from applications (a notion that had been part of telecom architecture for some time, of course).

What the basic tenet of ‘common carriage’ deals with is, essentially, unreasonable discrimination.  Everybody seems to think that, if I say it’s common carriage, I can never discriminate or do anything.  No, you can.  But you can’t do it on the basis... of arbitrary distinction.
— Earl Comstock, Former CEO, COMPTEL

That same regime was carried forth in the ’96 Act, says Comstock, which as he interprets it states that, as long as all service providers can obtain access to the underlying transport layer on equal terms with one another, the terms themselves need not be regulated.  As time passed, he says, successive Congresses and successive Commissions lost track of the fact that new technologies (broadband, Wi-Fi, cloud) were, from a legal perspective, scaled-up versions of the same basic constructs.

Earl Comstock presently serves as Senior Policy Counsel in the International Trade Group at White & Case LLP. Previously, he served as the Policy and Strategic Planning Director for the US Commerce Dept.

“Essentially the FCC made a decision, for whatever reason, that they did not like the framework that Congress had adopted.  So they started defining people as ‘information service providers’ — a term not found in the statute — in order to create a situation for which Congress had supposedly provided no rules...  Using their status as the ‘expert agency,’ they got away with it...  The FCC decided to overturn their prior decisions, but they also, frankly, presented a lot of false information to the courts.  They made blatantly wrong statements about the nature of the services being offered.”

Those statements were formal interpretations of how the FCC should carry out its Congressional regulatory mandate — interpretations to which, the law says, even the courts must defer when the statute is ambiguous (“Chevron deference”).  In Part 2 of our conversation with Earl Comstock next week, we’ll talk about how he believes the original legal building blocks for sensible Internet regulation — legal precedent dating back to 1956 — were obscured by a more populist perspective of the Internet as both an inviolable structure and a human right.


Part 2:

Earl Comstock: Legal Precedent for the Cloud Dates Back to 1956

Cloud computing is typically presented in the context of pages such as these, as a completely new and often foreign phenomenon.  It alters the complexion of data centers and converts software into services.  One very seldom heard argument is that cloud computing — effectively the merger between information and telecommunications systems on a colossal scale — could not have come into existence without an effective legal precedent.

The date of that precedent is January 24, 1956, says former congressional staffer and former Capitol Hill telco industry advocate Earl Comstock, now a practicing attorney with Eckert Seamans Cherin & Mellott, LLC, in Washington, D.C.  It was on this date that the original AT&T Corp. signed its first consent decree with the U.S. Dept. of Justice (the second would come in 1982), enabling it to keep its monopoly over the nation’s telephone system so long as it stayed out of any other industry.

In Part 1 of FierceEnterpriseCommunications’ interview with Comstock last week, he made a case for the FCC ending its Open Internet debate now and resuming its prior policy of regulating the Internet somewhat like a telecommunications service — as it had been doing prior to Chairman Michael Powell’s first articulation of net neutrality principles.  Now, Comstock tells us, nearly six decades ago AT&T floated the notion that large-scale computers would require telecommunications systems — and therefore long lines — to share processing power and data storage.  It was purchasing computers for this purpose, and wanted to recoup some of its costs.  Even back then, AT&T foresaw the possibility of selling excess computing capacity from its switching stations, using a utility billing model.

“They didn’t call it ‘cloud computing,’” says Comstock, “but essentially having your computer processing ride in the cloud, that’s an idea that goes back to the original Computer I decision.

“People don’t realize this, but in the 1960s, when they started moving to digital switching equipment and using computers to switch telephone calls,” he continues, “the telephone companies said, ‘Wait a minute.  I’ve got this excess processing capacity in these big switches that businesses could use to do other things.  I’m using it to switch telephone calls, but they could use it to do something else when I don’t need it.’  That’s what started this whole process, in the late 1960s, on through the ‘70s, to the 1980 [Computer II] decision.”

Two of AT&T’s competitors in this ancient prototype of the cloud market were IBM and EDS.  They relied on the telephone network to perform maintenance on their mainframes remotely — remote administration.  These two competitors became AT&T’s first customers for raw compute capacity.  One clear reason this business was created in the first place was to enable AT&T to gain a foothold in the emerging computer market, while adhering on the surface to the terms of its 1956 DOJ consent decree.

“This has been part of the problem all along, and both the Internet community and, frankly, the FCC have fallen victim to it:  People have simply made up new names for old services,” remarks Comstock.  “Nobody talks about mainframes anymore; they talk about cloud computing.  If you look at what cloud computing is doing, it’s just a reincarnation of the old mainframe system.”

But lawmakers today clearly perceive cloud computing service as residing on an entirely different infrastructure, both structurally and conceptually.  Is this simply because the names were changed?  “If people don’t realize they’re talking about essentially the same thing,” he responds, “then obviously it becomes easier for them to be given the impression that they need new legislation.  And that obviously works to certain people’s advantage.  But for [FCC Chairman] Wheeler, who is certainly a student of history himself, it’s not hard for him to go back and find these documents.  I pull them up on the Web all the time.  They’ll show you, yes, Congress was thinking about, talking about this, and the public was talking about this.

“What I find comical — and frankly, a tragedy,” Earl Comstock continues, “is that the ‘expert agency’ is an active participant in this misinformation to the public.  Take the current net neutrality debate:  They talk about a ‘broadband Internet service provider.’  That’s not found in the statute anywhere.  They don’t use the terms that are already there.”  But transmission speeds on the order of 45 Mbps — which is still faster than many broadband customers experience today — were under open discussion as far back as the 1960s.
