Archive for the ‘Net Neutrality’ Category

The FCC Tries Yet Again

Wednesday, February 19th, 2014

FCC Chairman Tom Wheeler’s official response to the DC Appeals Court decision on the Commission’s “net neutrality” rules promises to keep the issue on the table for the foreseeable future.  That is unfortunate, because there are better ways for the Commission and its staff to spend their time.

The Appeals Court took away from the Commission with one hand, while giving back with the other:  It struck down the more onerous provisions of the net neutrality rules—the “anti-discrimination” and “anti-blocking” provisions—because they imposed common carrier obligations and broadband is not classified as a Title II common carrier service.  However, the Court affirmed the Commission’s argument that it has general authority (under section 706 of the Communications Act) to regulate in order to encourage broadband deployment.

Since the Appeals Court decision came down, the FCC has been under considerable pressure from net neutrality proponents to reclassify broadband as a Title II common carrier service.  In today’s announcement, the Commission declined to do that.  However, it also declined to close the Title II docket, keeping alive the threat of reclassification and the regulatory burdens and oversight that go with it.

In addition, the Commission announced its intention to start yet another net neutrality rulemaking under its section 706 authority, in order to pursue its no-blocking and non-discrimination goals and to enhance the transparency rule (the one major provision that the court upheld).

Amid all the activity aimed at asserting legal authority for its net neutrality rules, it is easy to lose sight of the fact that the FCC never had a convincing economic or consumer-welfare justification for the rules in the first place.

While there is widespread agreement that the Internet should be open and provide consumers with access to the content, applications, and services of their choice, the rules were always a solution in search of a problem, a sentiment echoed today by FCC Commissioner Pai.  The Commission never provided the data and analysis necessary to show that the rules would address a significant market failure, did not identify harms to users that the rules would remedy, and did not demonstrate that the benefits of the rules would exceed their costs.  In other words, the Commission neglected to explain why the broadband market, which has generally thrived under minimal regulation, should now be subject to an enhanced regulatory regime.  Indeed, a good argument can be made that, by making the adoption of innovative business models more difficult, the rules would have hindered rather than encouraged the deployment of broadband infrastructure, notwithstanding the Commission’s assertions to the contrary.

There is now substantial concern that the Appeals Court has expanded the Commission’s authority to include the entire Internet ecosystem—including potentially content, applications, and service providers—as long as it can make some plausible argument that its actions encourage broadband deployment.  Expanding the Commission’s domain in this way would be a serious mistake and would compound the harm.

A major goal of the Commission in initially promulgating its net neutrality rules was to “provide greater predictability.”  It clearly has not achieved that goal.  Starting yet another proceeding, and keeping the Title II docket open, will create even more uncertainty for the entire Internet ecosystem.

Internet Hysteria – Are We Losing Our Edge?

Thursday, December 15th, 2011

Scott Wallsten and Amy Smorodin

From Anthony Weiner’s wiener to the FCC’s brave stand on Americans’ shameful inability to turn down the damn volume by themselves, 2011 has been a big year for tech and communications policy. But how has one of the Washington tech crowd’s most important products—Internet hype—fared this year?  In this post, we seek to answer this crucial question.

The Internet Hysteria Index

The Internet is without doubt the most powerful inspiration for hyperbole in the history of mankind. Some extol the Internet’s greatness, like Howard Dean, who called the Internet “the most important tool for re-democratizing the world since Gutenberg invented the printing press.”[1] Others fret about the future, like Canada’s Office of Privacy Commissioner, who claimed, “Nothing in society poses as grave a threat to privacy as the Internet Service Provider.”[2]

Sometimes the hyperbole is justified. For example, thanks to Twitter, attendees at this past summer’s TPI Aspen Summit were privy to a steady stream of misinformation even before the DC-area earthquake stopped.[3]

In the same spirit, we present the Internet Hysteria Index (IHI). The IHI, which the DOJ and FCC should take care not to confuse with the HHI, is the most rigorous and flexible tool ever conceived for gauging the Internet’s “worry zeitgeist”. It’s rigorous[4] because it uses numbers and flexible[5] because you can interpret it in so many different ways that it won’t threaten your preconceived ideas no matter what you believe.

The IHI has two components. The first tracks fears of an unrecognizable, but certainly Terminator-esque, future Internet. We count the number of times the exact phrases “the end of the internet as we know it” and “break the internet” appear in Nexis news searches each year since 2000.
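For anyone who wants to replicate the spirit of the tally, a minimal sketch is below. We do not have programmatic access to Nexis, so the sketch assumes a local collection of (year, text) records; the sample records and the function name are illustrative assumptions, not part of the actual search.

```python
# A minimal sketch of the tally behind the first IHI component.
# Assumes a local iterable of (year, text) records standing in for Nexis.
from collections import Counter

PHRASES = [
    "the end of the internet as we know it",
    "break the internet",
]

def hysteria_counts(articles):
    """Count, by year, the articles whose text contains any tracked phrase."""
    counts = Counter()
    for year, text in articles:
        lowered = text.lower()
        if any(phrase in lowered for phrase in PHRASES):
            counts[year] += 1
    return counts

# Illustrative usage with made-up records:
sample = [
    (2006, "Net neutrality fight could mean the end of the Internet as we know it"),
    (2011, "Critics warn that SOPA would break the Internet"),
    (2011, "Spectrum auction rules announced"),
]
print(sorted(hysteria_counts(sample).items()))  # [(2006, 1), (2011, 1)]
```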

Figure 1: The End of the Internet as we Know It!


Figure 1 shows that 2011 produced a bumper crop of “break the internet” stories, mostly related to the Stop Online Piracy Act and the Protect IP Act. The spike in 2006 reflects a wave of Net Neutrality stories after AT&T’s then-CEO proclaimed that “what they [content providers] would like to do is use my pipes free, and I ain’t going to let them do that because we have spent this capital and we have to have a return on it.”

As our research illustrates, the “End of the Internet” hyperbole shows a healthy, generally upward trend, reflecting the effectiveness of our collective fretting and hand-wringing. Our data do not allow us to identify[6] whether the trend is due to clever Washington PR, lazy hacks retreading old lines, real concerns, or collusion among interest groups simply ensuring they can all stay in business by responding to each other.

The second component of our index measures the incidence of hand-wringing regarding the state of broadband in the U.S. In particular, this measure counts the number of times phrases suggesting lagging U.S. broadband performance show up in Nexis since 2000.[7] Figure 2 shows the results of our analysis.
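For concreteness, the search in footnote [7] reduces to a simple predicate: a “falling behind” or “lagging” phrase plus the word “broadband.”  The sketch below encodes that logic against hypothetical local text; it only approximates what Nexis does server-side.

```python
# A sketch of the footnote-7 search encoded as a predicate over raw text.
# Nexis applies this server-side; here we assume plain lowercase matching
# over locally stored article text, which is only an approximation.
LAG_PHRASES = [
    "u.s. falling behind",
    "u.s. lagging",
    "united states falling behind",
    "united states lagging",
]

def matches_lagging_query(text):
    """True if the text pairs a 'falling behind/lagging' phrase with 'broadband'."""
    lowered = text.lower()
    return "broadband" in lowered and any(p in lowered for p in LAG_PHRASES)

print(matches_lagging_query("Is the U.S. falling behind on broadband?"))  # True
print(matches_lagging_query("U.S. lagging in chip manufacturing"))        # False
```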

Figure 2: The Grass is So Much Greener on the Other Side of the Pond: U.S. Broadband Sucks


The big spike in 2010 is related to the release of the National Broadband Plan.  The prior high, in 2007, saw stories focusing on the OECD rankings, broadband mapping, and the beginnings of broadband plan discussions.

Unfortunately, 2011 was not a good year for misinterpreting shoddily gathered statistics. Figure 2 shows a dramatic drop-off in bemoaning the dire state of U.S. broadband, possibly after everyone just got really, really tired of talking about the National Broadband Plan. We’re extremely concerned that as a result, the U.S. may have fallen dramatically in the OECD worry rankings. In fact, in a warning shot across our bow, on December 14 the BBC reported that “the UK remains in danger of falling behind when it comes to next-generation mobile services” and superfast broadband.[8] We’re hopeful American fretting will pick up once analysts actually read the FCC’s USF order, which was promulgated under the cover of 23 days between approval and publication. On the other hand, there is a risk that the sheer volume of the Order—the equivalent of more than 4 million tweets—might dissuade people from talking about it ever again.

For generations, Americans have taken a back seat to nobody on the important issue of Internet hyperbole. Let’s hope the inside-the-beltway crowd pulls itself together and breathes some life back into the speech economy. Happy New Year.


[1] http://motherjones.com/politics/2007/06/interview-howard-dean-chairman-democratic-national-committee

[2] http://dpi.priv.gc.ca/index.php/essays/the-greatest-threat-to-privacy/

[3] Picture from Funny Potato, http://www.funny-potato.com/blog/august-23rd-2011-east-coast-quake.

[4] It’s not.

[5] In other words, “probably pretty meaningless.”

[6] Actually, they do, but we don’t want to do the work.

[7] Specifically, the search is ((“U.S. falling behind” OR “U.S. lagging”) AND broadband) OR ((“United States falling behind” OR “United States lagging”) AND broadband).

[8] http://www.bbc.co.uk/news/technology-16174745

Net Neutrality Regulation’s First Target: Small Wireless Competitors?

Friday, January 14th, 2011

Telecommunications regulations have a long history of protecting incumbents, often because incumbents are able to use the regulatory process to insulate themselves from competition.  Unfortunately, we already see the seeds of that outcome in the response to a restrictive data plan offered by MetroPCS, although in this case the pressure comes not from incumbents but from some public interest groups.

MetroPCS, a regional mobile provider, offers a number of service plans with different voice and data combinations.  Its cheapest plan is $40 per month and offers unlimited voice, messaging, and web access.  The unlimited web access, however, does not allow access to certain sites like Netflix and Skype, but does allow access to YouTube.  Access to the full Internet requires a more expensive plan.

Net neutrality advocates argue that the restricted plans violate at least the spirit, if not the letter, of the new regulations.  The advocates may very well be correct, and that’s the problem.

MetroPCS is a small player in the mobile market, as the table from the FCC below demonstrates. It has no market power. Subscribers are not “locked in” when they sign up because they don’t have to sign contracts.

Table: Wireless Subscribers, Year-End 2009

Source: FCC, 14th Annual Report and Analysis of Competitive Market Conditions With Respect to Mobile Wireless, Including Commercial Mobile Services, 2010, p. 9.  Note that these are voice subscribers.

MetroPCS must believe that this combination of unlimited voice and unlimited use of a restricted set of web services will appeal to some people, and that walling off certain parts of the Internet will reduce its costs.

As an entrant in a high fixed-cost market, MetroPCS must find ways to differentiate itself from the larger carriers and reduce costs if it is to succeed. While it sounds appealing on its face to make the entire web accessible to MetroPCS subscribers, requiring MetroPCS to offer precisely the same services as larger carriers could leave it with no sustainable business model.

Allowing MetroPCS to experiment with business plans does not, however, mean that it should mislead consumers.  Our perusal of its website and calls to customer service left us confused about which services, exactly, it excludes from the plan.  Presumably MetroPCS uses a well-defined algorithm for deciding which sites it excludes. It should be able to explain that algorithm to potential subscribers, though any harm is limited due to the absence of contracts, meaning that consumers can switch plans or cancel if they find the restrictions too onerous.
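To make concrete the kind of rule the previous paragraph has in mind, here is a minimal, purely hypothetical sketch of a domain allowlist check of the sort a carrier could publish and explain to subscribers. MetroPCS has not disclosed its actual filtering logic; every domain and name below is our own illustrative assumption.

```python
# A purely hypothetical sketch of a disclosable restriction rule: an explicit
# allowlist of destinations reachable on the restricted plan. The domains and
# names are illustrative assumptions, not anything MetroPCS has published.
ALLOWED_DOMAINS = {"youtube.com", "metropcs.com"}  # hypothetical entries

def allowed_on_restricted_plan(host):
    """True if the host matches an allowlisted domain or one of its subdomains."""
    host = host.lower().rstrip(".")
    return any(host == d or host.endswith("." + d) for d in ALLOWED_DOMAINS)

print(allowed_on_restricted_plan("www.youtube.com"))  # True
print(allowed_on_restricted_plan("netflix.com"))      # False
```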

Despite this (hopefully soon-to-be-rectified) transparency issue, the plan represents a business model that one of the smallest players in the mobile industry hopes will help it compete successfully against its much bigger rivals.

Prohibiting MetroPCS from offering its new plan would benefit the large, incumbent carriers, not consumers.  Let MetroPCS experiment.  It would be a shame if the Commission’s first enforcement action under the new regulation were to reduce wireless competition.

Peering or the End of the Internet as We Know It?

Tuesday, December 7th, 2010

One of the top tech stories in the headlines of late is the dispute between Comcast and Level 3.  For those of you who were ignoring mass media last week: the dispute is over traffic handling agreements, apparently spurred by the announcement of a deal between Level 3 and Netflix to carry the latter’s streaming video.

Depending on whom you listen to, the Comcast / Level 3 dispute is either a simple peering disagreement blown out of proportion or a gross violation of network neutrality principles and the beginning of the “tiered Internet.”  So where does the truth lie?  According to TPI’s Scott Wallsten, probably somewhere in the middle.

Scott shared his views on the Comcast / Level 3 dispute on the IEEE Spectrum podcast, “This Week in Technology,” hosted by Steven Cherry.  In short, he identified the conflict as a dispute over who will pay for the increased costs resulting from the video traffic.  He also explained that such a deal could create incentives to erect barriers to competition, but that the DOJ can adequately address those concerns through antitrust enforcement.  Scott also discussed his predictions about the outcome of the dispute and the future of the video delivery business.  The podcast can be found here.

Antitrust and Vertical Integration in ‘New Economy’ Industries

Monday, November 8th, 2010

Where does a firm end and a market begin? 

This existential query is discussed by Bruce Owen in “Antitrust and Vertical Integration in ‘New Economy’ Industries” released today by TPI.

Referring to Adam Smith’s famous example of the division of labor in 18th-century pin-making, Bruce illustrates how vertical integration is inherent in all firms, and therefore cannot be viewed on its own as a predictor of market problems.

So what is the take-away for regulators regarding tech and media policy?

As Owen so eloquently explains: “Toadying to uninformed populist fears of vertical integration between network providers and content creators by imposing investment-dampening ex ante regulatory constraints is likely to be far less useful to the public than steps to ensure effective competition among network providers.”

My press release is here, and the entire paper can be found here.

The FCC Tries to Find Its Way

Monday, June 21st, 2010

Three months after the Comcast decision, the FCC issued a Notice of Inquiry (NOI) asking, basically, “what should we do now?”  Not being a lawyer, I have a difficult time understanding, let alone caring, whether the FCC’s regulatory authority derives from Title I or Title II.  As an economist, however, I do care about the content of proposed regulations.

So what problem does this NOI seek to solve?  It does not directly propose any new rules that industry must follow.  Instead, it seeks a framework within which the FCC can regulate broadband in the future.

In other words, this NOI does not address how industry should behave, but rather how the FCC itself should behave.  Somewhat ironically, therefore, this NOI asks how to regulate the FCC, and reveals an existential problem for the Commission: what is it supposed to do, and does it have the authority to do it?

The Commission is to be commended for laying out the legal issues, but the road forward it proposes is inherently flawed.  Consider that scholars of new institutional economics generally agree that, in order to be effective, regulatory institutions must meet several criteria: they must be independent of short-term political influence, transparent, and accountable, and they must have clear limits on the extent of their jurisdiction.[1]

The NOI highlights the problem the FCC now faces as an institution—it does not know the extent or limits of its jurisdiction.  Ultimately, the FCC cannot set those boundaries itself.  Instead, it is up to Congress to define the FCC’s mission and to the courts to define the extent of its authority within that legislative mandate.

The current confusion is not the FCC’s fault.  Our telecommunications laws are antiquated and no longer appropriate for the fast-changing world of broadband and information technologies.  No amount of reclassification, forbearance, or other fancy footwork can change that basic fact.

It is time for Congress to rewrite our telecommunications laws in ways that do not rely on arbitrary definitions of services, and instead create an analytical framework flexible enough to accommodate these rapidly changing industries.  A new telecommunications law could recognize, for example, the inherent antitrust issues in many current debates, such as the question of vertical relationships underlying net neutrality.

Such a rewrite involves risks, to be sure.  Every interest group will fight to influence the process for good and ill, nobody will end up entirely happy, and we could end up with laws worse than those we have today.  Regardless, we could at least rest assured that a new law would better reflect the will of the people, as expressed through their elected representatives, than would the FCC’s current attempt to fit a square peg in a round hole.

Congress already appears to be taking the first steps toward rewriting the 1996 Telecommunications Act.  It should view the NOI as a cry for help and further evidence that it should take action.  A regulatory agency simply cannot function properly when it has to ask in a public notice what it is allowed to do and how.  The courts have attempted to define boundaries in recent decisions, and the Commission believes it must act to meet Congress’s objectives.  But only Congress can define those objectives, and the time has arrived for it to do so.


[1] See Noll (2000) or Wallsten et al. (2004) for discussions (Noll, Roger. 2000. Telecommunications Reform in Developing Countries. SIEPR Policy Paper; Wallsten, Scott, George Clarke, Luke Haggarty, Rosario Kaneshiro, Roger G. Noll, Mary Shirley, and Lixin Colin Xu. 2004. New Tools for Studying Network Industry Reforms in Developing Countries: The Telecommunications and Electricity Regulation Database. Review of Network Economics 3, no. 3: 248-282).  See also Weiser (2009) for a discussion of institutional features of Internet regulation (Weiser, Philip J. 2009. The Future of Internet Regulation. Legal Studies Research Paper Series 09, no. 02).

Regulating the Internet

Thursday, May 6th, 2010

So much for data-driven policy:

The Third Way