Archive for the ‘Broadband’ Category

Comcast and Netflix—What’s the Big Deal?

Wednesday, February 26th, 2014

Netflix and Comcast recently announced an agreement whereby Netflix will pay Comcast for direct access to its network.  This agreement addresses congestion that has been slowing delivery of Netflix videos to Comcast’s broadband subscribers and resolves a dispute between the two companies over how to pay for the needed network upgrades.  Netflix and Verizon are currently working through a similar dispute.  While some commentators think deals such as the one between Netflix and Comcast are problematic, the reality is that the agreement reflects a common market transaction that yields a more efficient outcome, more quickly, than any regulatory intervention could have.

The following series of stylized figures illustrates how the growth of Netflix and other streaming video services has affected the volume and flow of Internet traffic, and the corresponding payments, in recent years.  Traditionally (Figure 1), Internet backbone providers and ISPs entered into “peering” agreements, which did not call for payments on either side, reflecting a relatively balanced flow of traffic.  Content distributors paid backbone providers for “transit,” reflecting the unbalanced flow of traffic along that route.

[Figure 1]

With the growth of online video and with Netflix accounting for 30 percent of traffic at some times of the day, this system was bound to become strained, as we are now seeing and as shown in Figure 2.  The flow of traffic between the backbone provider and the ISP is unbalanced and has grown enormously, requiring investments in additional capacity.

[Figure 2]

One way to address this problem is for the backbone provider to pay the ISP, reflecting the greater amount of traffic (and greater capacity needed) going in that direction (see Figure 3).  In fact, that is what happened following a dispute between Level 3 and Comcast in late 2010.

[Figure 3]

Another solution is the just-announced Comcast-Netflix deal, reflected in Figure 4.  In this case, Netflix and Comcast are bypassing the intermediate backbone provider (either partially or completely), presumably because it is more efficient to do so.  One or both of them is investing in the needed capacity.  Regulatory interference with such a deal runs the risk of blocking an advance that would lower costs and/or raise quality for consumers.

[Figure 4]

The Wall Street Journal has described the debate as being “over who should bear the cost of upgrading the Internet’s pipes to carry the nation’s growing volume of online video:  broadband providers like cable and phone companies, or content companies like Netflix, which make money by sending news or entertainment through those pipes.”  Ultimately, of course, consumers pay one way or the other.  When Netflix pays Comcast, the cost is passed through to Netflix subscribers.  This is both efficient and fair, because the consumer of Netflix services is paying for the cost of that service.

In the absence of such an agreement, quality would suffer or the ISP would bear the cost.  The ISP might recover these costs by increasing prices to subscribers generally.  This would involve a cross-subsidy of Netflix subscribers by non-subscribers, which would be neither efficient nor fair.  Alternatively, Comcast could increase prices for those subscribers who consume a lot of bandwidth, which might have similar effects to the just-announced deal, but would probably lose some efficiencies.  In any event, it is difficult to see how such an arrangement would be better for consumers than the announced agreement.

 

 

The FCC Tries Yet Again

Wednesday, February 19th, 2014

FCC Chairman Tom Wheeler’s official response to the DC Appeals Court decision on the Commission’s “net neutrality” rules promises to keep the issue on the table for the foreseeable future.  That is unfortunate, because there are better ways for the Commission and its staff to spend their time.

The Appeals Court took away from the Commission with one hand, while giving back with the other:  It struck down the more onerous provisions of the net neutrality rules—the “anti-discrimination” and “anti-blocking” provisions—because they imposed common carrier obligations and broadband is not classified as a Title II common carrier service.  However, the Court affirmed the Commission’s argument that it has general authority (under section 706 of the Telecommunications Act of 1996) to regulate in order to encourage broadband deployment.

Since the Appeals Court decision came down, the FCC has been under considerable pressure from net neutrality proponents to reclassify broadband as a Title II common carrier service.  In today’s announcement, the Commission declined to do that.  However, the Commission also declined to close the Title II docket, keeping alive the threat of reclassification and the regulatory burdens and oversight that go with it.

In addition, the Commission announced its intention to start yet another net neutrality rulemaking, under its section 706 authority, in order to pursue its no-blocking and non-discrimination goals and to enhance the transparency rule (the one major provision that the court upheld).

Amid all the activity aimed at establishing legal authority for its net neutrality rules, it is easy to lose sight of the fact that the FCC never had a convincing economic or consumer-welfare justification for the rules in the first place.

While there is widespread agreement that the Internet should be open and provide consumers with access to content, applications and services of their choice, the rule was always a solution in search of a problem, a sentiment echoed today by FCC Commissioner Pai.  The Commission never provided the necessary data and analysis to show that the rules would address a significant market failure, did not identify harms to users that the rules would remedy, and did not demonstrate that the benefits of the rules would exceed their costs.  In other words, the Commission neglected to explain why the broadband market, which has generally thrived under minimal regulation, should now be subject to an enhanced regulatory regime.   Indeed, a good argument can be made that, by making the adoption of innovative business models more difficult, the rules would have hindered rather than encouraged the deployment of broadband infrastructure, notwithstanding the Commission’s assertions to the contrary.

There is now substantial concern that the Appeals Court has expanded the Commission’s authority to include the entire Internet ecosystem—including potentially content, applications, and service providers—as long as it can make some plausible argument that its actions encourage broadband deployment.  Expanding the Commission’s domain in this way would be a serious mistake and would compound the harm.

A major goal of the Commission in promulgating its net neutrality rules initially was to “provide greater predictability.”  It clearly has not achieved that goal.  Starting yet another proceeding, and keeping the Title II docket open, will create even more uncertainty for the entire Internet ecosystem.

Where do vendors to cable think the industry is heading? Evidence from 2013 Cable Show data

Tuesday, June 11th, 2013

For the past four years (2010 – 2013) I have been collecting data about exhibitors at the Cable Show. Key observations based on the most recent data:

  • The number of exhibitors continues to decline, down to 251 in 2013 from 345 in 2010 (Figure 1).
  • Programming is the most popular exhibitor category, and has been steadily increasing in popularity since 2010. In 2013 nearly one-third of exhibitors classify themselves under programming. Multi-screen content, HDTV, video on demand, and IPTV are the second, third, fourth, and fifth most popular categories (Figure 2).
  • The categories with the biggest increases in representation since 2010 are multi-screen content, programming, HDTV, new technology, and cloud services (Figure 3).
  • The categories with the biggest decreases in representation since 2010 include telecommunications equipment, services, and VOIP (Figure 4).

Exhibitor attendance

This year, the website listed 251 exhibitors, continuing a steady decline from 2010 (Figure 1). The number is biased upwards because an exhibitor can be counted multiple times if it appears in multiple booths.

Figure 1: Number of Cable Show Exhibitors, 2010-2013


Hot or Not?

The website shows the categories of products, services, or technologies each exhibitor selects to describe itself. An exhibitor can select several categories. To evaluate the prevalence of each category I total the number of times each category is selected, and then divide that by the number of exhibitors to make it comparable across years.
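To make the calculation concrete, here is a minimal sketch of how such a prevalence measure can be computed. The exhibitor names, categories, and counts below are invented for illustration; the real inputs are the category selections listed on the Cable Show website.

```python
from collections import Counter

# Hypothetical exhibitor lists: each exhibitor maps to the set of
# self-selected categories it listed on the Cable Show website.
exhibitors_by_year = {
    2010: {"Acme Nets": {"programming", "video on demand"},
           "FiberCo":   {"telecommunications equipment", "VOIP"}},
    2013: {"Acme Nets": {"programming", "multi-screen content"},
           "StreamCo":  {"programming", "HDTV", "cloud services"}},
}

def category_prevalence(exhibitors):
    """Share of exhibitors selecting each category in a given year."""
    counts = Counter(cat for cats in exhibitors.values() for cat in cats)
    n = len(exhibitors)
    return {cat: count / n for cat, count in counts.items()}

for year, exhibitors in sorted(exhibitors_by_year.items()):
    shares = category_prevalence(exhibitors)
    top = sorted(shares.items(), key=lambda kv: kv[1], reverse=True)
    print(year, top[:5])
```

Dividing by the number of exhibitors rather than reporting raw counts is what allows comparisons across years in which total attendance changed.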

The table below shows the top 20 categories for 2010 – 2013. Programming has remained the top category in all four years. However, multi-screen content jumped to second place, followed by HDTV, pushing video on demand and IPTV down to fourth and fifth.

[Table: Top 20 exhibitor categories, 2010 – 2013]

Hot

Figure 2 shows how the top five exhibitor categories for 2013 have evolved over the past four years. Fully one-third of all exhibitors classify themselves under programming, nearly twice the share in 2010. Multi-screen content did not exist as a category in 2010, yet 16 percent of all exhibitors placed themselves in it in 2013.

Figure 2: Share of Exhibitors with Products in Top 5 2013 Categories Over Time


Consistent with the above figure, from 2010 – 2013 cable programming increased in representation more than any other category. Multi-screen content saw the second-largest increase, followed by mobile apps, new technology, and cloud services.

Figure 3: Categories with Biggest Increases in Representation Since 2010

Not

Telecommunications services and equipment have seen the biggest decrease in representation since 2010, followed by VOIP, program guides, and optical networking. However, because “program guides” was not included as a category in 2013, it is not clear whether the category truly became less popular or is now simply called something else.

Figure 4: Categories with Biggest Decreases in Representation Since 2010


What does this mean?

The data themselves have certain problems that make drawing strong conclusions difficult. For example, counting exhibitors and categories implicitly assumes that each exhibitor is identical in size and importance, which clearly is not true (Figure 5). Additionally, the categories are self-reported by the exhibitors and do not appear to have strict definitions. Exhibitors have no incentive to select grossly inaccurate categories, since that would attract people unlikely to purchase their products, but they probably tend to be overly inclusive so as not to miss potential clients. This tendency might bias the counts towards especially popular technologies. For example, perhaps exhibitors take liberties in claiming they offer “cloud services” because the label is a popular buzzword rather than because their products truly offer much in the way of those services.

Despite these shortcomings in the data, they provide one source of information on where economic actors with money at stake think the industry is headed over the next year. And, according to them, this year the industry is trending more towards its traditional role as video provider, focusing on programming and multi-screen content.

Figure 5: Exhibitor Map, 2013 Cable Show


Unleashing the Potential of Mobile Broadband: What Julius Missed

Thursday, March 7th, 2013

In yesterday’s Wall Street Journal op-ed, FCC Chairman Genachowski correctly focuses on the innovation potential of mobile broadband.  For that potential to be realized, he points out, the U.S. needs to make more spectrum available.  A spectrum price index developed by my colleague, Scott Wallsten, demonstrates what most observers believe – that spectrum has become increasingly scarce over the last few years.

The Chairman’s op-ed highlights three new policy initiatives the FCC and the Obama Administration are taking in an attempt to address the spectrum scarcity:  (1) the incentive auctions designed to reclaim as much as 120 MHz of high-quality broadcast spectrum for flexibly licensed – presumably, mobile broadband – uses;   (2) freeing up the TV white spaces for unlicensed uses; and (3) facilitating sharing of government spectrum by private users.

There are two notable omissions from the Chairman’s list.  First, he does not mention the 150 MHz of mobile satellite service (MSS) spectrum, which has been virtually unused for over twenty years due to gross government mismanagement.  A major portion of this spectrum, now licensed to three firms – LightSquared, Globalstar, and Dish – could quickly be made available for mobile broadband uses. The FCC is now considering a proposal from LightSquared that would enable at least some of its spectrum to be productively used.  That proposal should be approved ASAP.  The MSS spectrum truly represents the low-hanging fruit and making it available should be given the same priority as the other items on the Chairman’s list.

Second, if the FCC and NTIA truly want to be innovative with respect to government spectrum, they should focus on the elusive task of developing a system that requires government users to face the opportunity cost of the spectrum they use.  This is currently not the case, which is a major reason why it is so difficult to get government users to relinquish virtually any of the spectrum they control.  To introduce opportunity cost into government decision making, Larry White and I have proposed the establishment of a Government Spectrum Ownership Corporation (GSOC). A GSOC would operate similarly to the General Services Administration (GSA).  Government agencies would pay a market-based “rent” for spectrum to the GSOC, just as they do now to the GSA for the office space and other real estate they use.  Importantly, the GSOC could then sell surplus spectrum to the private sector (as the GSA does with real estate). The GSOC would hopefully give government agencies appropriate incentives to use spectrum efficiently, just as they now have that incentive with real estate.  This would be a true innovation.

In the short run, administrative mechanisms are probably a more feasible way to make more government spectrum available.  For example, White and I also proposed cash prizes for government employees who devise ways their agency can economize on its use of spectrum.  This would be consistent with other government bonuses that reward outstanding performance.

Sharing of government spectrum is a second-best solution.  It would be far better if government used its spectrum more efficiently and more of it was then made exclusively available to private sector users.  This is, admittedly, a difficult task, but worth the Administration’s efforts.

Unintended—But Not Necessarily Bad—Consequences of the 700 MHz Open Access Provisions

Tuesday, November 6th, 2012

Wireless data pricing has been evolving almost as rapidly as new wireless devices are entering the marketplace. The FCC has mostly sat on the sidelines, watching developments but not intervening.

Mostly.

Last summer, the FCC decided that Verizon was violating the open access rules of the 700 MHz spectrum licenses it purchased in 2008 by charging customers an additional $20 per month to tether their smartphones to other devices. Verizon paid the fine and allowed tethering on all new data plans.[1]

Much digital ink has been spilled regarding how to choose the shared data plan best tailored to a family with a myriad of wireless devices and varying demands for data. Very little, however, appears to have been said about individual plans and, more specifically, about those targeted to light users.

One change that has gone largely unnoticed is that Verizon effectively abandoned the post-paid market for light users after the FCC decision.

Verizon no longer offers individual plans. Even consumers with only a single smartphone must purchase a shared data plan. That’s sensible from Verizon’s perspective since mandatory tethering means that Verizon effectively cannot enforce a single-user contract. The result is that Verizon no longer directly competes for light users.

The figure below shows the least amount of money a consumer can pay each month on a contract at the major wireless providers. As the table below the figure highlights, the figure does not present an apples-to-apples comparison, but that’s not the point—the point is to show the choices facing a user who wants voice and data, but the smallest possible amount of each.

Note: Assumes no data overages.

The figure shows that this thrifty consumer could spend $90/month at Verizon, $60/month at AT&T, $70/month at T-Mobile, and $65/month at Sprint if the consumer is willing to purchase voice/text and data plans separately. Even Verizon’s prepaid plan, at $80/month, costs more than the others’ cheapest postpaid plans.

Moreover, prior to the shift to “share everything” plans, this consumer could have purchased an individual plan from Verizon for $70/month—$20/month less than he would pay today. At AT&T the comparable price was $55/month and increased by only $5/month. Again, the point is not to show that one plan is better than another. Verizon’s cheapest plan offers 2 GB of data, unlimited voice and texts, and tethering, while AT&T’s cheapest plan offers 300 MB of data, 450 voice minutes, and no texts or tethering. Which plan is “better” depends on the consumer’s preferences. Instead, the point is to show the smallest amount of money a light user could spend on a postpaid plan at different carriers, and that comparison reveals that Verizon’s cheapest option is significantly more expensive than other post-paid options and, moreover, became significantly more expensive with the introduction of the shared plans.

Is the FCC’s Verizon Tethering Decision Responsible for this Industry Price Structure?

There’s no way to know for sure. The rapidly increasing ubiquity of households with multiple wireless devices means that shared data plans were probably inevitable. And carriers compete on a range of criteria other than just price, including network size, network quality, and handset availability, to name a few.

Nevertheless, Verizon introduced its “share everything” plans about a month before the FCC’s decision. If we make the not-so-controversial assumption that Verizon knew it would be required to allow “free” tethering before the decision was made public and that individual plans would no longer be realistic for it, then the timing supports the assertion that “share everything” was, at least in part, a response to the rule.

How Many Customers Use These “Light” Plans?

Cisco estimated that in 2011 the average North American mobile connection “generated” 324 megabytes per month. The average for 2012 will almost surely be higher, and higher still among those with higher-end phones. Regardless, even average use close to 1 GB would imply a large number of consumers who could benefit from buying light-use plans, regardless of whether they do.

Did the FCC’s Tethering Decision Benefit or Harm Consumers?

It probably did both.

The consumer benefits: First, Verizon customers who want to tether their devices can do so without an extra charge. Second, AT&T and Sprint followed Verizon in offering shared data plans, with AT&T’s shared plans also including tethering. Craig Moffett of AllianceBernstein noted recently that “Family Share plans are not, as has often been characterized, price increases. They are price cuts…”[2] because the plans allow consumers to allocate their data more efficiently. As a result, he notes, investors should worry that the plans will reduce revenues. In other words, the shared plans on balance probably represent a shift from producer to consumer surplus.

The consumer costs: Verizon is no longer priced competitively for light users.

The balance: Given that other carriers still offer postpaid plans to light users and that a plethora of prepaid and other non-contract options exist for light users, the harm to consumers from Verizon’s exit is probably small, while the benefits may be nontrivial. In other words, the net effect was most likely positive for consumers.

What Does This Experience Tell Us?

The FCC’s decision and industry reaction should serve as a gentle reminder to those who tend to favor regulatory intervention: even the smallest interventions can have unintended ripple effects. Rare indeed is the rule that affects only the firm and activity targeted and nothing else. More specifically, rules that especially help the technorati—those at the high end of the digital food chain—may hurt those at the other end of the spectrum.

But those who tend to oppose regulatory intervention should also take note: not all unintended consequences are disastrous, and some might even be beneficial.

Is That a Unique Observation?

Not really.

Could I Have Done Something Better With My Time Instead of Reading This?

Maybe. Read this paper to find out.


[1] The FCC allowed Verizon to continue charging customers with grandfathered “unlimited” data plans an additional fee for tethering.

[2] Moffett, Craig. The Perfect Storm. Weekend Media Blast. AllianceBernstein, November 2, 2012.

Is a broadband tax a good idea?

Thursday, August 30th, 2012

The FCC recently asked for comments on a proposal to raise money for universal service obligations by taxing broadband connections. Let’s set aside, for the moment, the question of whether the universal service program has worked (it hasn’t), whether it is efficient (it isn’t), and whether the reforms will actually improve it (they won’t). Instead, let’s focus on the specific question of whether taxing broadband is the best way to raise money for any given program telecommunications policymakers want to fund.

The answer, in typical economist fashion, is that it depends.

A tax is generally evaluated on two criteria: efficiency and equity. The more “deadweight loss” the tax causes, the more inefficient it is. Deadweight loss results from people changing their behavior in response to the tax and, in principle, can be calculated as a welfare loss.
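For readers who want the mechanics, the standard textbook approximation of deadweight loss from a small per-unit tax is sketched below, assuming roughly linear demand near the current price. Here t is the tax, p the pre-tax price, q the quantity, and ε the absolute price elasticity of demand; none of these are estimated for broadband here.

```latex
% Harberger-triangle approximation for a small per-unit tax t on a service
% with pre-tax price p, quantity q, and absolute demand elasticity \varepsilon.
\Delta q \approx \varepsilon \, q \, \frac{t}{p},
\qquad
\mathrm{DWL} \approx \tfrac{1}{2}\, t \, \Delta q
            = \tfrac{1}{2}\, \varepsilon \left(\frac{t}{p}\right)^{2} p \, q
```

The takeaway for what follows is that, for a given tax and market size, deadweight loss scales with ε, which is why taxing a highly price-sensitive service such as long distance was so costly and why taxing a relatively inelastic one may be less so.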

Closely related to efficiency is how the tax affects policy goals. This question is particularly relevant here because the service being taxed is precisely the service the tax is supposed to support, making it possible that the tax itself could undo any benefits of the spending it funds.

Equity—in general, how much people at different income levels pay—is simple in concept but difficult in practice, since there is no objective way to say what share of the tax any given group “should” pay.

Perhaps surprisingly to some, a broadband tax may actually be efficient relative to some other options, including income taxes (i.e., coming from general revenues). Historically, universal service funds were raised by taxes on long distance service, which is highly price sensitive, making the tax quite inefficient.

By contrast, for the typical household, fixed (and, increasingly, mobile) broadband has likely become quite inelastic. In 2010, one study estimated that the typical household was willing to pay about $80 per month for “fast” broadband service, while the median monthly price for that service was about $40. Since then, the number of applications and available online services has increased, meaning that consumer willingness to pay has presumably also increased, while according to the Bureau of Labor Statistics broadband prices have remained about the same.

Consumer Price Index, Internet Services and Electronic Information Providers

Source: Bureau of Labor Statistics, Series ID CUUR0000SEEE03, adjusted so January 2007=100

While no recent study has specifically evaluated price elasticity, the large gap between prices and willingness to pay suggests that a tax of any size likely to be considered might not be hugely inefficient overall.

The problem is that even if the tax does not affect subscription decisions by most people, it can still affect precisely the population policymakers want to help. Even though only 10 percent of people who do not have broadband cite price as the barrier, there is some lower price at which people will subscribe. A tax effectively increases the price consumers pay, meaning that it puts people at that margin—people who may be on the verge of subscribing—that much further away from deciding broadband is worthwhile. Similarly, people on the other side of that margin—those who believe broadband is worthwhile, but just barely—will either cancel their subscriptions or subscribe to less robust offerings.

To be sure, people who would be eligible for low-income support would probably receive more in subsidies than they pay in taxes, but this is an absurdly inefficient way to connect more people. As one astute observer noted, it is not merely like trying to fill a leaky bucket, but perhaps more like trying to fill that bucket upside-down through the holes.[1]

Higher prices for everyone highlight the equity problem. A connection tax is the same for everyone regardless of income, making it regressive. The tax becomes even more regressive because much of the money goes to rural residents regardless of their income while everyone pays regardless of income, meaning the tax includes a transfer from the urban poor to the rural rich.

Even without an income test, methods exist to mitigate the equity problem. Unfortunately, the methods the FCC proposes are likely to undermine other policy goals. In particular, the FCC asks about the effects of taxing by tier of service, presumably with higher tiers of service paying more (paragraph 249). The FCC does not specifically mention equity in its discussion, but if higher-income people are more likely to have faster connections, then taxing by tier would help mitigate equity concerns.

This tiered tax approach is commonly used for other services, including electricity and water, where a low tier of use is taxed at a low rate and higher usage is taxed at incrementally higher rates. In the case of water, for example, a family that uses water only for cooking and cleaning will pay a lower tax rate than a family that also waters its lawn and fills a swimming pool. And while it is not a perfect measure of income, in general wealthier people are more likely to have big lawns and pools.
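To illustrate the mechanics only, a tiered (marginal-rate) tax can be computed as in the sketch below. The tier boundaries and rates are invented for the example; nothing here reflects an actual FCC proposal.

```python
def tiered_tax(usage, tiers):
    """Compute a tax where each successive block of usage is taxed at a higher rate.

    `tiers` is a list of (block_size, rate) pairs; a block_size of None means
    the block covers all remaining usage. Units are arbitrary (GB, gallons, etc.).
    """
    tax, remaining = 0.0, usage
    for block_size, rate in tiers:
        taxed = remaining if block_size is None else min(remaining, block_size)
        tax += taxed * rate
        remaining -= taxed
        if remaining <= 0:
            break
    return tax

# Hypothetical schedule: first 100 units at $0.01, next 200 at $0.03, the rest at $0.06.
schedule = [(100, 0.01), (200, 0.03), (None, 0.06)]
print(tiered_tax(80, schedule))    # light user pays only the lowest rate
print(tiered_tax(500, schedule))   # heavy user pays more per unit on the upper blocks
```

Because only the usage inside each block is taxed at that block's rate, light users face a low average rate while heavy users face a higher one, which is the feature that makes the structure less regressive.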

The problem with this approach in broadband is that while willingness to pay for “fast” broadband is relatively high, most people are not yet willing to pay much more at all for “very fast” broadband. Thus, taxing higher tiers of service at a higher rate, while more equitable, may create other efficiency problems if it reduces demand for those higher tiers.

So what’s the solution?

The FCC should decide which objective is most important: efficiency, equity, or another policy goal such as inducing more people to subscribe or upgrade their speeds, and then design the tax that best achieves it. It should then compare this “best” tax to simply funding the program from general revenues and determine which approach would lead to the better outcome.

But no tax is worthwhile if the program it supports is itself inefficient and inequitable. The real solution is to dramatically reduce spending on ineffective universal service programs in order to minimize the amount of money needed to fund them. Unfortunately, the reforms appear to do just the opposite. In 2011, the high cost fund spent $4.03 billion and had been projected to decrease even further. The reforms, however, specified that spending should not fall below $4.5 billion (see paragraph 560 of the order), meaning that the first real effect of the reforms was to increase spending by a half billion dollars. And, as the GAO noted, the FCC “has not addressed its inability to determine the effect of the fund and lacks a specific data-analysis plan for carrier data it will collect” and “lacks a mechanism to link carrier rates and revenues with support payments.”

The right reforms include integrating true third-party evaluation mechanisms into the program and, given the vast evidence of inefficiency and ineffectiveness, a future path of steady and significant budget cuts. Those changes, combined with an efficient tax-collection method, might yield a program that efficiently targets those truly in need.


[1] This excellent analogy comes from Greg Rosston via personal communications.

Hey, FCC: Stop Counting!

Friday, June 1st, 2012

By June 2011, nearly one-third of American households relied solely on wireless voice service, with lower-income households more likely to be wireless-only. This information doesn’t come from the FCC, as you might expect. Instead, it comes from the twice-yearly National Health Interview Survey, conducted by the U.S. Census Bureau for the Centers for Disease Control and Prevention (CDC).[1] The example highlights three points policymakers should take to heart for data collection relevant to telecommunications:

  • The FCC does not always produce the most relevant telecommunications data.
  • Careful, representative surveys—not population counts, which the FCC uses for measuring voice and broadband markets—are usually the most effective and efficient way to gather data.
  • Policymaking agencies like the FCC can obtain relevant data from other agencies like the U.S. Census that specialize in data collection but have no vested interest in any particular policy outcome.

Counting telephones began at the turn of the 20th Century

The U.S. Census began to collect data on telephones as they became an increasingly important part of American life. In 1922 the Bureau noted, “The census of telephones has been taken quinquennially since 1902, and statistics of telephones were compiled and published in the decennial censuses of 1880 and 1890.”[2]

The FCC has largely continued this tradition, attempting to count each line or connection for communications technologies. (Some—not me, of course—might say delays in producing some reports indicate a desire to revert to the quinquennial release schedule).

Maintaining a consistent approach to data-gathering has certain advantages, such as facilitating comparisons over time. However, that advantage diminishes as it becomes less clear what, exactly, we should measure and as market changes make any particular count less relevant.

Counting is inefficient and misses the most important data

Most economic and social policy is based on surveys conducted by agencies such as the Census Bureau and the Bureau of Labor Statistics. We rely on surveys because gathering information on an entire population is typically not feasible. For the constitutionally mandated decennial census, for example, the Census Bureau spends about $11 billion and hires about one million temporary workers.[3] By contrast, in a non-census year, the Census Bureau spends about $1 billion on all its data collection efforts.[4] Additionally, surveys make it possible to gather data about particular groups and to estimate how closely different measures reflect the actual population.
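To see why a sample can substitute for a count, consider the precision of a survey-based estimate. The sketch below computes an approximate 95 percent confidence interval for a proportion under simple random sampling; the sample size and the 32 percent wireless-only share are illustrative stand-ins, not the NHIS design, which uses a more complex sample.

```python
import math

def proportion_ci(p_hat, n, z=1.96):
    """Approximate 95% confidence interval for a proportion from a simple random sample."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)  # standard error shrinks with sqrt(n)
    return p_hat - z * se, p_hat + z * se

# Illustrative only: a survey of 15,000 households finding 32% wireless-only
low, high = proportion_ci(0.32, 15_000)
print(f"Estimated wireless-only share: 32% (95% CI roughly {low:.1%} to {high:.1%})")
```

A sample of that size pins the national share down to within about a percentage point, which is why a well-designed survey can be far cheaper than a census-style count while still being accurate enough for policymaking.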

The FCC attempts to count all lines, connections, and other factors related to telecommunications by requiring companies to provide certain data. Large firms spend significant resources providing these data. Small firms often do not have the resources to provide this information, and the FCC’s skilled data staff then must spend enormous time and effort trying to gather this information from firms who either will not or cannot respond.

The result is that the FCC has the least reliable count data in precisely the topical and geographic areas where it most needs data for sound decisionmaking. For example, counts of broadband connections provide some measure of the intersection of supply (availability) and demand, but not good information on either separately. The counts provide no information on how those connections are used or on how they break down demographically.[5]

This telecommunications counting fetish has spread to other parts of the government, as well. The National Broadband Map is based on the same flawed premise: the belief that the best dataset comes from observing every detail of every broadband connection. The effort cost about $350 million and still apparently yields inaccurate results in rural areas where policymakers want to direct resources.[6]

The FCC Should Stop Counting and Start Contracting with the Census Bureau to Do Surveys

Nearly all other areas of economic policy are informed by surveys, many of which are conducted monthly to provide real-time information to markets and policymakers. Nothing in particular about telecommunications requires a total population count rather than survey data.

Additionally, there is no reason why the FCC itself should be responsible for data collection. The U.S. Census Bureau is much better equipped to design and implement surveys. It is not uncommon for Census to do survey work for (and funded by) other agencies. In addition to the CDC survey mentioned above, Census also does surveys for the Department of Justice,[7] the National Center for Education Statistics,[8] and State Library Agencies[9] to name a few.

Embracing surveys conducted by other agencies would have several advantages:

  • Surveys are almost certain to be cheaper than counts both to the government and to the private sector.
  • Surveys of users, rather than counts submitted by providers, are more likely to yield data not influenced by providers’ incentives to game the data collection process to their own benefit.
  • Data collection by outside agencies would reduce any inherent conflict of interest the FCC might face when gathering data related to its agenda.

Surveys by other agencies, of course, are not a silver bullet for obtaining better and more timely data. They can be done poorly. And the FCC should remain involved. As the expert agency it should largely determine the questions it needs answered and the type of information necessary for policymaking and provide the resources necessary to do it. Additionally, the FCC needs the ability to compel data from regulated companies for specific decisions when necessary.

Today, unfortunately, surveys are being subjected to attacks by Congressional Republicans, who want to reduce the ability of the Census Bureau to collect data.[10] These attacks have been roundly and correctly criticized by conservative and liberal commentators alike, who note that these data are crucial to good policymaking.[11]

Despite the Congressional statistical ignorance du jour, surveys by agencies expert in data collection will yield far better data at lower cost than today’s methods. Hopefully the FCC will take note and begin to move our ability to study telecommunications out of the 19th Century.


[1] http://www.cdc.gov/nchs/nhis.htm

[2] http://www2.census.gov/prod2/decennial/documents/13473055ch1.pdf

[3] http://usgovinfo.about.com/od/censusandstatistics/a/aboutcensus.htm

[4] http://www.osec.doc.gov/bmi/budget/12CJ/Census_Bureau_FY_2012_Congressional_Submission.pdf

[5] It is possible to merge geographic counts with demographic data from the Census, but even this approach would be more effective if done in a way that explicitly incorporates connections to the Current Population Survey.

[6] http://www.govtech.com/wireless/Study-National-Broadband-Map-Inaccurate.html

[7] http://www.census.gov/econ/overview/go2300.html, http://www.census.gov/econ/overview/go2500.html

[8] http://www.census.gov/econ/overview/go1600.html,  http://www.census.gov/econ/overview/go2000.html

[9] http://www.census.gov/econ/overview/go1900.html

[10] http://www.nytimes.com/2012/05/20/sunday-review/the-debate-over-the-american-community-survey.html

[11] See, for example, Matthew Yglesias’s discussion: http://www.slate.com/articles/business/moneybox/2012/05/american_community_survey_why_republican_hate_it_.html

What Cable Monopoly?

Thursday, May 3rd, 2012

“The future is in fiber optic high-speed Internet access, as compared to DSL and cable modem service.”

“Many new business models are made possible by high-speed access, and fiber access in particular. By contrast, DSL and cable modem access are subject to sharp capacity limitations which are rapidly rendering them obsolete for the types of activities Americans want to engage in online.”

– Crawford, Susan P. “Transporting Communications.” Boston University Law Review 89, no. 3 (2009): 871–937, pp. 928 & 930.

“…the broad consensus seems to be that the long-term fixed platform will likely be fiber, and cable plant too will likely become increasingly fiber-based over time, as the theoretical and long-term practical capacity of fiber to the home systems will be orders of magnitude larger than for cable systems.”

– Benkler, Yochai, Rob Faris, Urs Gasser, Laura Miyakawa, and Stephen Schultze. Next Generation Connectivity: A Review of Broadband Internet Transitions and Policy from Around the World. The Berkman Center for Internet & Society, 2010, p. 63.

What a difference a few years makes! As late as 2009 Susan Crawford was arguing that cable broadband was becoming obsolete and Harvard’s Berkman Center believed the only long-term answer to increasing broadband demand was fiber.

Today, Crawford is warning of a looming cable monopoly. To be sure, DOCSIS 3.0 technology has given cable a relatively low-cost upgrade path while traditional telcos generally have to invest far more in fiber to achieve similar performance.[1]

So, what is really happening in the market? As the chart below shows, data on fixed broadband subscriptions contradict the claims of monopoly. The most recent FCC data go only through December 2010, so we extend the figure to June 2011 using data from the OECD.[2] The data show that cable has always held the majority of connections, peaking around 2003 when it held close to 60 percent of the fixed broadband market.

Sources: FCC reports on local telephone competition and broadband deployment, and OECD http://www.oecd.org/document/23/0,3746,en_2649_34225_33987543_1_1_1_1,00.html

The share of cable connections is trending upwards, but, at least as of last year, did not appear to be significantly different from the past.

More recent data come from companies’ financial reports. The following chart shows the quarter-to-quarter percentage change in the number of high-speed Internet subscribers for Comcast, Time Warner Cable, Verizon, and AT&T. Cable companies have been doing well in terms of net additions for several quarters, but not significantly better than Verizon, and even AT&T is reporting net gains from its U-verse platform.

Sources: Company quarterly and trending reports.

Note: Time Warner Cable reported 10.716 million HSI subscribers in Q1 2012, which represented close to a 7 percent increase over Q4 2011. However, 550,000 of that increase came from TWC’s acquisition of Insight Communications and 42,000 from the acquisition of NewWave Communications. The percentage shown in the figure deducts increases due to acquisitions.[3]

None of this evidence means that Crawford’s warnings are necessarily wrong, of course. Whether cable’s cost advantage will ultimately translate into a monopoly or any increased market power, however, will depend not just on technological differences but also on changes in demand.

When an HD video stream from Netflix arrives at less than 5 Mbps, there is little advantage to cable’s DOCSIS 3.0 relative to a DSL connection of at least 5 Mbps. But demand will surely change over time, and cable’s cost advantage will be an important point in its favor. That’s one reason why AllianceBernstein analyst Craig Moffett is so bullish on cable stocks.

Even as critics pivot from demanding that we focus only on fiber to warning of a cable monopoly, the market is shifting under their feet again. Today, consumers are adopting smartphones and tablets in droves. The trend towards wireless is already affecting the development of Internet innovation (think mobile apps). Cable still has some advantages in that area—wireless providers need to offload their data somewhere, after all—but it still may not end up as the dominant technology.

More generally, this market changes quickly. A few years ago policymakers were being urged to focus on fiber. Now they are being warned about a cable monopoly even as wireless broadband is taking center stage, as the FCC data shown in the figure below demonstrate. And surely in a few years technology and demand will have moved us in directions we can’t yet predict.

Source: FCC Internet Access Services Report, October 2011, Table 7 http://transition.fcc.gov/wcb/iatd/comp.html

Policymakers should, without a doubt, keep a close eye on market conditions and work to ensure an environment conducive to competition. But if this fast-changing market teaches us anything, it’s that we should think twice before we conclude we know the endgame.


[1] Christopher Yoo has pointed out the fiber-cable worry flipflop, so I can’t claim credit for noticing it.

[2] I have been very critical of the OECD rankings. However, data for a single country over time should be reliable if the within-country definitions remain constant. Judging from how closely the OECD data track the FCC data it is likely they come from similar sources.

[3] http://www.fiercecable.com/story/insight-leadership-team-departs-following-completion-time-warner-cable-acqu/2012-03-01 and http://www.businesswire.com/news/home/20110613005676/en/Time-Warner-Cable-Acquire-Cable-Systems-NewWave

Internet Hysteria – Are We Losing Our Edge?

Thursday, December 15th, 2011

Scott Wallsten and Amy Smorodin

From Anthony Weiner’s wiener to the FCC’s brave stand on Americans’ shameful inability to turn down the damn volume by themselves, 2011 has been a big year for tech and communications policy. But how has one of the Washington tech crowd’s most important products—Internet hype—fared this year?  In this post, we seek to answer this crucial question.

The Internet Hysteria Index

The Internet is without doubt the most powerful inspiration for hyperbole in the history of mankind. Some extol the Internet’s greatness, like Howard Dean, who called the Internet “the most important tool for re-democratizing the world since Gutenberg invented the printing press.”[1] Others fret about the future, like Canada’s Office of the Privacy Commissioner, which claimed, “Nothing in society poses as grave a threat to privacy as the Internet Service Provider.”[2]

Sometimes the hyperbole is justified. For example, thanks to Twitter, attendees at this past summer’s TPI Aspen Summit were privy to a steady stream of misinformation even before the DC-area earthquake stopped.[3]

In the same spirit, we present the Internet Hysteria Index (IHI). The IHI, which the DOJ and FCC should take care not to confuse with the HHI, is the most rigorous and flexible tool ever conceived for gauging the Internet’s “worry zeitgeist”. It’s rigorous[4] because it uses numbers and flexible[5] because you can interpret it in so many different ways that it won’t threaten your preconceived ideas no matter what you believe.

The IHI has two components. The first tracks fears of an unrecognizable, but certainly Terminator-esque, future Internet. We count the number of times the exact phrases “the end of the internet as we know it” and “break the internet” appear in Nexis news searches each year since 2000.
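For the methodologically curious, the tallying step is nothing more than counting, per year, the articles that contain one of the tracked phrases. The sketch below assumes the search results have already been exported as (year, text) records; it illustrates the counting only and is not a Nexis interface.

```python
from collections import Counter

PHRASES = ("the end of the internet as we know it", "break the internet")

def hysteria_index(records):
    """Count, per year, the articles containing any of the tracked phrases.

    `records` is an iterable of (year, text) pairs exported from a news search.
    """
    counts = Counter()
    for year, text in records:
        if any(phrase in text.lower() for phrase in PHRASES):
            counts[year] += 1
    return dict(sorted(counts.items()))

# Toy example
sample = [
    (2006, "Critics say the bill would break the Internet"),
    (2011, "SOPA could mean the end of the Internet as we know it"),
    (2011, "Another op-ed warns the rules would break the internet"),
]
print(hysteria_index(sample))  # {2006: 1, 2011: 2}
```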

Figure 1: The End of the Internet as we Know It!


Figure 1 shows that 2011 produced a bumper crop of “break the internet” stories, mostly related to the Stop Online Piracy Act and the Protect IP Act. The spike in 2006 reflects a wave of Net Neutrality stories after AT&T’s then-CEO proclaimed that “what they [content providers] would like to do is use my pipes free, and I ain’t going to let them do that because we have spent this capital and we have to have a return on it.”

As our research illustrates, the “End of the Internet” hyperbole shows a healthy, generally upward trend, reflecting the effectiveness of our collective fretting and hand-wringing. Our data do not allow us to identify[6] whether the trend is due to clever Washington PR, lazy hacks retreading old lines, real concerns, or collusion among interest groups simply ensuring they can all stay in business by responding to each other.

The second component of our index measures the incidence of hand-wringing regarding the state of broadband in the U.S. In particular, this measure counts the number of times phrases suggesting lagging U.S. broadband performance show up in Nexis since 2000.[7] Figure 2 shows the results of our analysis.

Figure 2: The Grass is So Much Greener on the Other Side of the Pond: U.S. Broadband Sucks


The big spike in 2010 is related to release of the National Broadband Plan. The prior high, in 2007, saw stories focusing on the OECD rankings, broadband mapping, and the beginnings of broadband plan discussions.

Unfortunately, 2011 was not a good year for misinterpreting shoddily-gathered statistics. Figure 2 shows a dramatic drop-off in bemoaning the dire state of U.S. broadband, possibly after everyone just got really, really tired of talking about the National Broadband Plan. We’re extremely concerned that as a result, the U.S. may have fallen dramatically in the OECD worry rankings. In fact, in a warning shot across our bow, on December 14 the BBC reported that “the UK remains in danger of falling behind when it comes to next-generation mobile services” and superfast broadband.[8] We’re hopeful American fretting will pick up once analysts actually read the FCC’s USF order that was promulgated under the cover of 23 days between approval and publication. On the other hand, there is a risk that the sheer volume of the Order—the equivalent of more than 4 million tweets—might dissuade people from talking about it ever again.

For generations, Americans have taken a back seat to nobody on the important issue of Internet hyperbole. Let’s hope the inside-the-beltway crowd pulls itself together and breathes some life back into the speech economy. Happy New Year.


[1] http://motherjones.com/politics/2007/06/interview-howard-dean-chairman-democratic-national-committee

[2] http://dpi.priv.gc.ca/index.php/essays/the-greatest-threat-to-privacy/

[3] Picture from Funny Potato, http://www.funny-potato.com/blog/august-23rd-2011-east-coast-quake.

[4] It’s not.

[5] In other words, “probably pretty meaningless.”

[6] Actually, they do, but we don’t want to do the work.

[7] Specifically, the search is (("U.S. falling behind" OR "U.S. lagging") AND broadband) OR (("United States falling behind" OR "United States lagging") AND broadband).

[8] http://www.bbc.co.uk/news/technology-16174745

The AT&T/T-Mobile Merger Conundrum: Increase Efficiency AND Create Jobs?

Friday, December 2nd, 2011

How did the proposed AT&T and T-Mobile merger, which many viewed as so certain when announced, end up on life support? Is it because of the decision by the Department of Justice (DOJ) to challenge the merger in court? Or maybe because of skeptics’ claims regarding the likelihood of the merger “creating jobs?”

Those factors certainly played a role, but another reason the merger reached the brink of collapse is arguably that the current jobs crisis made it impossible for AT&T to justify the merger to antitrust authorities while also making it palatable to politicians and to the FCC, with its broader “public interest” standard.

For antitrust purposes, AT&T had to demonstrate that the merger would not substantially reduce competition and that, if it did, the increased efficiency of the merged company would greatly outweigh those costs. For political purposes, in an era of persistent unemployment, AT&T decided it had to demonstrate that the merger would create jobs.

Horizontal mergers between large competitors, such as the proposed one between AT&T and T-Mobile, are generally subject to tough antitrust scrutiny. Antitrust policy is indifferent to the effect of a merger on jobs, instead focusing on the effects of the merger on competition and consumers while weighing those effects against the potential economic benefits of a more efficient merged firm.

As the DOJ-FTC Horizontal Merger Guidelines note, “Competition usually spurs firms to achieve efficiencies internally. Nevertheless, a primary benefit of mergers to the economy is their potential to generate significant efficiencies and thus enhance the merged firm’s ability and incentive to compete, which may result in lower prices, improved quality, enhanced service, or new products” (p.29).

The efficiency argument is always a high bar in a merger case since “the antitrust laws give competition, not internal operational efficiency, primacy in protecting customers” (p.31). One way the merged company might increase efficiency would be to lay off large numbers of workers if it believed it could maintain service quality while doing so. By appearing to take that option off the table and arguing that the merger was, in fact, good for jobs, AT&T raised the efficiency bar even higher than it normally is.

It is, of course, possible to increase employment and efficiency if the firm increases output by more than it increases costs. AT&T made an argument consistent with that outcome in its filings by contending that spectrum constraints are distorting investment decisions at both AT&T and T-Mobile.

AT&T’s biggest claim regarding jobs was that the merger would lead to more jobs through better mobile broadband. However, the empirical link demonstrating that broadband increases employment—rather than simply being correlated with higher employment—has not been rigorously established, as Georgetown Professor John Mayo and I demonstrate in a paper published earlier this year.

As a result, even if DOJ were willing to consider effects external to the firms, industry, and direct consumers, the speculative nature of the claims would probably cause the DOJ to disregard them. As the Merger Guidelines note,

Efficiency claims will not be considered if they are vague, speculative, or otherwise cannot be verified by reasonable means. Projections of efficiencies may be viewed with skepticism, particularly when generated outside of the usual business planning process. (p.30)

The FCC is more sympathetic to the effect on jobs than DOJ, but the staff report made it clear that it expected the merger to result in a net loss of direct employment and was highly skeptical of the claims regarding the indirect effects on employment (see Section V(G), beginning at paragraph 259 for the jobs discussion).

In short, even setting aside the substantive questions of the net effects on competition, consumers, and broadband availability, the merger was always going to be an especially tough sell in the current economic and political climate.

To win the day, AT&T had to convince antitrust authorities that improved efficiencies at the merged firm would outweigh any resulting reduction in competition while simultaneously convincing politicians that the merger was good for jobs. But arguing publicly that the merged company would increase employment risked signaling to DOJ that the merger was not really about efficiency, and arguing that the merger was about efficiency risked signaling to politicians and the FCC that it would not produce jobs.

Unable to thread that needle, AT&T’s strategy collapsed. Whether it will succeed with a new strategy remains to be seen.