Archive for the ‘Broadband’ Category

Dispatch from the TPI Aspen Forum

Monday, August 18th, 2014

Sunday, August 17

Last night, we kicked off our 2014 Aspen Forum in lovely Aspen, Colorado.

Congressman Scott Tipton welcomed attendees to his home state (and his home district).  In his remarks, Tipton discussed the importance of tech in growing small business and the economic impact of regulations, which he estimated to cost $1.8 billion a year.  Rep. Tipton also discussed the importance of broadband penetration in rural areas.

Video of his speech, and short remarks from TPI President Thomas Lenard and TPI Board Member Ray Gifford, can be found here.

Monday, August 18

The first full day of the TPI Aspen Forum began with a discussion on “The Political Economy of Telecom Reform,” moderated by TPI’s Scott Wallsten.

Former Congressman Rick Boucher, now a Partner at Sidley Austin, explained that when the 1996 telecom act was being drafted, the issues were not partisan in nature.  However, he identified a sticking point that now seems to be drawn along party lines: network neutrality.  He would like to see net neutrality dealt with separately, prior to the start of any real push for telecom reform in Congress, in hopes that lawmakers will have an easier time finding common ground.

Peter Davidson from Verizon stated that there does not seem to be as much consensus among players in the communications industry as there was during the last push for telecom reform.  However, he did express that the threat of Title II regulation may drive many to band together.

Roger Noll from Stanford University declared the big winners in the ‘96 Act “were people who make a living manipulating regulatory processes.”  He also said such a thing was less likely to happen with any new telecom reform act because there are many more players – not just traditional wired communications companies – who know how to mobilize politically.

Philip Weiser, Dean of the University of Colorado Law School, stated that the communications sector will see a great deal of innovation in the next few years despite a static telecom statute. In any new reform act, Congress should stick to high-level principles to enable ongoing innovation.  In other words, Congress needs to show restraint.

Video of the entire discussion can be viewed here.

More summaries of today’s panels, and of tonight’s keynote dinner speech by Comcast’s David Cohen, will be posted soon. Videos of everything will also be posted on the TPI YouTube page as soon as we can get them up.

Stay tuned!

Where Do Vendors To Cable Think The Industry Is Heading? Evidence From Cable Show Data In 2014

Friday, April 25th, 2014

Scott Wallsten and Corwin Rhyan

For the past five years we have collected data about the exhibitors at the annual NCTA Cable Show from its website.  Each year we analyze trends in the industry through the categories used to classify the exhibitors.  Key observations this year include:

  • The number of exhibitors continues to fall, as it has in each of the past four years, from 345 in 2010 to 241 in 2014 (Figure 1).
  • Cable Programming, Video on Demand, IPTV, and Multi-Screen Content are the four most popular categories in 2014. The top three increased in popularity since last year, while Multi-Screen Content decreased slightly (Figure 2).
  • New categories this year include RDK (Reference Design Kit)[1] and Content Search/Navigation Systems, each with over 10 exhibitors in its first year. Fiber was absent in 2013 but has a few exhibitors in 2014 (Figure 4).
  • Games, Consultants, and Research & Development show some of the largest year-over-year increases from 2013 to 2014, after Video on Demand and IPTV (Figure 5).
  • Four previously popular categories are absent from the 2014 show: HDTV, New Networks, tru2way, and VOIP. Other notable decliners include 3D TV and Mobile Apps. These highlight some difficulties in interpreting the results without other information: HDTV likely disappeared because it is now ubiquitous, while 3D TV disappeared because it has generally been a market disappointment (Figure 7).

Number of Participants

The number of exhibitors in 2014 is down over 30% from 2010. After a large drop in 2011, the count has declined at a more moderate 3% annual rate over the last three years. The number of exhibitors is biased slightly upwards because an exhibitor with multiple booth locations is counted as two separate exhibitors in our data. However, the number of duplicates over the years is relatively small and consistent.

Figure 1: Number of Exhibitors 2010-2014


Hot Tech This Year

The Cable Show allows its exhibitors to describe their companies with categorical labels that signal to potential customers the types of products and services offered.  An exhibitor can select multiple categories for its products.  In 2014, the average number of categories per exhibitor was 3.87, down slightly from 4.33 in 2013.  In general, we expect exhibitors to classify their products as broadly as possible, in the hope of attracting interested attendees to their booths.  To normalize the data for year-over-year comparisons, we divide the number of exhibitors in each category by the total number of exhibitors, yielding the percentage of exhibitors that select each category.  The top 20 categories are listed below for the last 3 years, with Cable Programming describing over a third of all exhibitors this year.
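The normalization step can be sketched in a few lines. The exhibitor totals (251 in 2013, 241 in 2014) are from our data; the per-category counts below are made-up placeholders, purely for illustration.

```python
# Normalize category counts for year-over-year comparison:
# divide each category's exhibitor count by that year's total
# number of exhibitors, yielding a share comparable across years.

exhibitor_totals = {2013: 251, 2014: 241}   # totals from the show data

# Hypothetical per-category counts, for illustration only.
category_counts = {
    2013: {"Cable Programming": 80, "Video on Demand": 55},
    2014: {"Cable Programming": 85, "Video on Demand": 60},
}

def category_shares(year):
    """Return each category's share of that year's exhibitors."""
    total = exhibitor_totals[year]
    return {cat: n / total for cat, n in category_counts[year].items()}

shares_2014 = category_shares(2014)
print(f"Cable Programming share, 2014: {shares_2014['Cable Programming']:.1%}")
```

Dividing by each year's total is what makes a shrinking show comparable across years: a category can lose exhibitors in absolute terms while still gaining in representation.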

Figure 2: Most Popular Categories 2012 – 2014


In graphical form below, we plot the trends of this year’s top 5 most popular categories over the past 5 years.  While many of these categories have traditionally been near the top, most have grown over the past 5 years. 

Figure 3: Top 5 Most Popular Categories


While we cannot rule any particular hypothesis in or out based on these data, it is worth noting that the large increase in programming-related exhibitors coincides with unprecedented increases in the retransmission fees cable companies pay to programmers. It would be consistent with economic theory to see entry into this market as prices increase.

What’s In and Out in 2014

The categories used to classify products and services change regularly.  The new categories used in 2014 are listed in Figure 4. Some are similar to previous categories, such as Content Search/Navigation Services, which likely evolved from the separate category of Content Navigation, while others come with little previous background like RDK (Reference Design Kit).

Figure 4: New 2014 Categories


Many of the most popular 2013 categories continued to gain ground in 2014, with Video on Demand, IPTV, and Cable Programming showing strong gains.  Games and Consultants showed a strong increase in representation as well.  A complete list of the top gainers in 2014 is shown in Figure 5.  Some gainers declined in 2013 but returned with stronger showings in 2014.  This list includes categories such as Video on Demand, Billing, Internet TV Providers, IPTV, and Telecommunications Providers.  A chart of these categories is shown in Figure 6.

Figure 5: Biggest 2013-2014 Gainers


Figure 6: Categories that switched to growth in 2014


At the same time, some categories disappear between years.  In 2014 some notable categories are no longer present.

The categories that declined in 2014 included many that disappeared from the list completely, such as HDTV, New Networks, tru2way, and VOIP. Several theories could explain these disappearances: perhaps a category became so ubiquitous as to be meaningless in the context of a trade show (e.g., HDTV or VOIP); perhaps show organizers dropped a category because it overlapped too heavily with others (e.g., was “New Networks” the same as “Program Networks”?); or perhaps the technology is simply no longer relevant.

Other notable decliners include once “up and coming” technologies such as 3D TV, Mobile Apps, and Social TV. A decrease in a category is probably easier to interpret than an outright disappearance. 3D television, for example, has been a notable market disappointment and it is no surprise to see it disappearing from the show.

Figure 7: Biggest 2013-2014 Losers


Figure 8: Categories that switched to decline in 2014


Conclusion

Using data from the Cable Show’s exhibitors is advantageous because it is representative of the actors in the industry who have real money on the line.  In a tech world that loves to exaggerate the next “big thing”, using data directly from industry members might help provide a better understanding of where the industry is headed.  However, these data must be used with caution.  First, the categories are self-reported by exhibitors, and while they have a clear incentive to categorize their products and services accurately, some might also see advantages in identifying with hyped industry technologies to attract customers. Second, the analysis weights each exhibitor identically, which clearly isn’t accurate: some booths are massive and staffed by dozens of people while others are little more than a table and the company owner (Figure 9).

Despite these shortcomings, the data show a continued trend toward a cable industry more focused on its traditional role as a television service provider, with programming, television, video, and networks topping our list in 2014, while the hyped technologies that were set to revolutionize the cable industry in 2012 and 2013 fell back in 2014.

Figure 9: 2014 Cable Show Floor Plan



[1] According to www.rdkcentral.com, RDK is “a pre-integrated software bundle that provides a common framework for powering customer-premises equipment (CPE) from TV service providers, including set-top boxes, gateways, and converged devices.”

Comcast and Netflix—What’s the Big Deal?

Wednesday, February 26th, 2014

Netflix and Comcast recently announced an agreement whereby Netflix will pay Comcast for direct access to its network.  This agreement addresses congestion that is slowing delivery of Netflix videos to Comcast’s broadband subscribers and resolves a dispute between the two companies over how to pay for the needed network upgrades.  Netflix and Verizon are currently working through a similar dispute.  While some commentators consider deals such as the one between Netflix and Comcast problematic, the agreement in fact reflects a common market transaction that yields a more efficient outcome, more quickly, than any regulatory intervention could have.

The following series of stylized figures illustrates how the growth of Netflix and other streaming video services has affected the volume and flow of internet traffic, and the corresponding payments, in recent years.  Traditionally (Figure 1), Internet backbone providers and ISPs entered into “peering” agreements, which did not call for payments on either side, reflecting a relatively balanced flow of traffic.  Content distributors paid backbone providers for “transit,” reflecting the unbalanced flow of traffic along that route.


With the growth of online video and with Netflix accounting for 30 percent of traffic at some times of the day, this system was bound to become strained, as we are now seeing and as shown in Figure 2.  The flow of traffic between the backbone provider and the ISP is unbalanced and has grown enormously, requiring investments in additional capacity.


One way to address this problem is for the backbone provider to pay the ISP, reflecting the greater amount of traffic (and greater capacity needed) going in that direction (see Figure 3).  In fact, that is what happened following a dispute between Level 3 and Comcast in late 2010.


Another solution is the just-announced Comcast-Netflix deal, reflected in Figure 4.  In this case, Netflix/Comcast is bypassing the intermediate backbone provider (either partially or completely), presumably because it is more efficient to do so.  One or both of them is investing in the needed capacity.  Regulatory interference with such a deal runs the risk of blocking an advance that would lower costs and/or raise quality to consumers.


The Wall Street Journal has described the debate as being “over who should bear the cost of upgrading the Internet’s pipes to carry the nation’s growing volume of online video:  broadband providers like cable and phone companies, or content companies like Netflix, which make money by sending news or entertainment through those pipes.”  Ultimately, of course, consumers pay one way or the other.  When Netflix pays Comcast, the cost is passed through to Netflix subscribers.  This is both efficient and fair, because the consumer of Netflix services is paying for the cost of that service.

In the absence of such an agreement, quality would suffer or the ISP would bear the cost.  The ISP might recover these costs by increasing prices to subscribers generally.  This would involve a cross-subsidy of Netflix subscribers by non-subscribers, which would be neither efficient nor fair.  Alternatively, Comcast could increase prices for those subscribers who consume a lot of bandwidth, which might have similar effects to the just-announced deal, but would probably lose some efficiencies.  In any event, it is difficult to see how such an arrangement would be better for consumers than the announced agreement.


The FCC Tries Yet Again

Wednesday, February 19th, 2014

FCC Chairman Tom Wheeler’s official response to the DC Appeals Court decision on the Commission’s “net neutrality” rules promises to keep the issue on the table for the foreseeable future.  That is unfortunate, because there are better ways for the Commission and its staff to spend their time.

The Appeals Court took away from the Commission with one hand, while giving back with the other:  It struck down the more onerous provisions of the net neutrality rules—the “anti-discrimination” and “anti-blocking” provisions—because they imposed common carrier obligations and broadband is not classified as a Title II common carrier service.  However, the Court affirmed the Commission’s argument that it has general authority (under section 706 of the Communications Act) to regulate in order to encourage broadband deployment.

Since the Appeals Court decision came down, the FCC has been under considerable pressure from net neutrality proponents to reclassify broadband as a Title II common carrier.  In today’s announcement, the Commission declined to do that. However, the Commission also declined to close the Title II docket, keeping alive the threat of reclassification and the regulatory burdens and oversight that go with it.

In addition, the Commission announced its intention to start yet another net neutrality rulemaking, under its section 706 authority, in order to fulfill the Commission’s no-blocking and non-discrimination goals as well as to enhance the transparency rule (the one major provision that the court upheld).

With all the activity aimed towards asserting legal justification for its net neutrality rules, it sometimes gets lost that the FCC had no convincing economic or consumer welfare justification for the rules in the first place.

While there is widespread agreement that the Internet should be open and provide consumers with access to content, applications and services of their choice, the rule was always a solution in search of a problem, a sentiment echoed today by FCC Commissioner Pai.  The Commission never provided the necessary data and analysis to show that the rules would address a significant market failure, did not identify harms to users that the rules would remedy, and did not demonstrate that the benefits of the rules would exceed their costs.  In other words, the Commission neglected to explain why the broadband market, which has generally thrived under minimal regulation, should now be subject to an enhanced regulatory regime.   Indeed, a good argument can be made that, by making the adoption of innovative business models more difficult, the rules would have hindered rather than encouraged the deployment of broadband infrastructure, notwithstanding the Commission’s assertions to the contrary.

There is now substantial concern that the Appeals Court has expanded the Commission’s authority to include the entire Internet ecosystem—including potentially content, applications, and service providers—as long as it can make some plausible argument that its actions encourage broadband deployment.  Expanding the Commission’s domain in this way would be a serious mistake and would compound the harm.

A major goal of the Commission in promulgating its net neutrality rules initially was to “provide greater predictability.”  It clearly has not achieved that goal.  Starting yet another proceeding, and keeping the Title II docket open, will create even more uncertainty for the entire Internet ecosystem.

Where do vendors to cable think the industry is heading? Evidence from 2013 Cable Show data

Tuesday, June 11th, 2013

For the past four years (2010 – 2013) I have been collecting data about exhibitors at the Cable Show. Key observations based on the most recent data:

  • The number of exhibitors continues to decline, down to 251 in 2013 from 345 in 2010 (Figure 1).
  • Programming is the most popular exhibitor category, and has been steadily increasing in popularity since 2010. In 2013 nearly one-third of exhibitors classify themselves under programming. Multi-screen content, HDTV, video on demand, and IPTV are the second, third, fourth, and fifth most popular categories (Figure 2).
  • The categories with the biggest increases in representation since 2010 are multi-screen content, programming, HDTV, new technology, and cloud services (Figure 3).
  • The categories with the biggest decreases in representation since 2010 include telecommunications equipment, services, and VOIP (Figure 4).

Exhibitor attendance

This year, the website listed 251 exhibitors, continuing a steady decline from 2010 (Figure 1). The number is biased upwards because an exhibitor can be counted multiple times if it appears in multiple booths.

Figure 1: Number of Cable Show Exhibitors, 2010-2013


Hot or Not?

The website shows the categories of products, services, or technologies each exhibitor selects to describe itself. An exhibitor can select several categories. To evaluate the prevalence of each category I total the number of times each category is selected, and then divide that by the number of exhibitors to make it comparable across years.

The table below shows the top 20 categories for 2010 – 2013. Programming has remained the top category for all four years. However, multi-screen content jumped to second place, followed by HDTV, pushing video on demand and IPTV to numbers four and five.


Hot

Figure 2 shows how the top 5 exhibitor categories for 2013 have evolved over the past four years. Fully one-third of all exhibitors classify themselves as programming, nearly twice as many as in 2010. Multi-screen content did not exist as a category in 2010 while 16 percent of all exhibitors included themselves in this category in 2013.

Figure 2: Share of Exhibitors with Products in Top 5 2013 Categories Over Time


Consistent with the above figure, from 2010 – 2013 cable programming increased in representation more than any other category. Multi-screen content saw the second-largest increase, followed by mobile apps, new technology, and cloud services.

Figure 3: Categories with Biggest Increase in Representation Since 2010

Not

Telecommunications services and equipment have seen the biggest decreases in representation since 2010, followed by VOIP, program guides, and optical networking. However, because “program guides” was not included as a category in 2013, it is not clear whether the category truly became less popular or is now simply called something else.

Figure 4: Categories with Biggest Decreases in Representation Since 2010


What does this mean?

The data themselves have certain problems that make drawing strong conclusions difficult. For example, counting exhibitors and categories implicitly assumes that each exhibitor is identical in size and importance, which clearly is not true (Figure 5). Additionally, the categories are self-reported by the exhibitors and do not appear to have strict definitions. Exhibitors have no incentive to select grossly inaccurate categories, since that would attract people unlikely to purchase their products, but they probably tend toward being overly inclusive so as not to miss potential clients. This tendency might bias the data toward especially popular technologies. For example, perhaps exhibitors take liberties in claiming they offer “cloud services” because the term is a popular buzzword rather than because their products truly offer much in the way of such services.

Despite these shortcomings in the data, they provide one source of information on where economic actors with money at stake think the industry is headed over the next year. And, according to them, this year the industry is trending more towards its traditional role as video provider, focusing on programming and multi-screen content.

Figure 5: Exhibitor Map, 2013 Cable Show


Unleashing the Potential of Mobile Broadband: What Julius Missed

Thursday, March 7th, 2013

In yesterday’s Wall Street Journal op-ed, FCC Chairman Genachowski correctly focuses on the innovation potential of mobile broadband.  For that potential to be realized, he points out, the U.S. needs to make more spectrum available.  A spectrum price index developed by my colleague, Scott Wallsten, demonstrates what most observers believe – that spectrum has become increasingly scarce over the last few years.

The Chairman’s op-ed highlights three new policy initiatives the FCC and the Obama Administration are taking in an attempt to address the spectrum scarcity:  (1) the incentive auctions designed to reclaim as much as 120 MHz of high-quality broadcast spectrum for flexibly licensed – presumably, mobile broadband – uses;   (2) freeing up the TV white spaces for unlicensed uses; and (3) facilitating sharing of government spectrum by private users.

There are two notable omissions from the Chairman’s list.  First, he does not mention the 150 MHz of mobile satellite service (MSS) spectrum, which has been virtually unused for over twenty years due to gross government mismanagement.  A major portion of this spectrum, now licensed to three firms – LightSquared, Globalstar, and Dish – could quickly be made available for mobile broadband uses. The FCC is now considering a proposal from LightSquared that would enable at least some of its spectrum to be productively used.  That proposal should be approved ASAP.  The MSS spectrum truly represents the low-hanging fruit and making it available should be given the same priority as the other items on the Chairman’s list.

Second, if the FCC and NTIA truly want to be innovative with respect to government spectrum, they should focus on the elusive task of developing a system that requires government users to face the opportunity cost of the spectrum they use.  This is currently not the case, which is a major reason why it is so difficult to get government users to relinquish virtually any of the spectrum they control.  To introduce opportunity cost into government decision making, Larry White and I have proposed the establishment of a Government Spectrum Ownership Corporation (GSOC). A GSOC would operate similarly to the General Services Administration (GSA).  Government agencies would pay a market-based “rent” for spectrum to the GSOC, just as they do now to the GSA for the office space and other real estate they use.  Importantly, the GSOC could then sell surplus spectrum to the private sector (as the GSA does with real estate). The GSOC would hopefully give government agencies appropriate incentives to use spectrum efficiently, just as they now have that incentive with real estate.  This would be a true innovation.

In the short run, administrative mechanisms are probably a more feasible way to make more government spectrum available.  For example, White and I also proposed cash prizes for government employees who devise ways their agency can economize on its use of spectrum.  This would be consistent with other government bonuses that reward outstanding performance.

Sharing of government spectrum is a second-best solution.  It would be far better if government used its spectrum more efficiently and more of it was then made exclusively available to private sector users.  This is, admittedly, a difficult task, but worth the Administration’s efforts.

Unintended—But Not Necessarily Bad—Consequences of the 700 MHz Open Access Provisions

Tuesday, November 6th, 2012

Wireless data pricing has been evolving almost as rapidly as new wireless devices are entering the marketplace. The FCC has mostly sat on the sidelines, watching developments but not intervening.

Mostly.

Last summer, the FCC decided that Verizon was violating the open access rules of the 700 MHz spectrum licenses it purchased in 2008 by charging customers an additional $20 per month to tether their smartphones to other devices. Verizon paid the fine and allowed tethering on all new data plans.[1]

Much digital ink has been spilled regarding how to choose a shared data plan best-tailored for families with a myriad of wireless devices and demand for data. Very little, however, appears to have been said about individual plans and, more specifically, about those targeted to light users.

One change that has gone largely unnoticed is that Verizon effectively abandoned the post-paid market for light users after the FCC decision.

Verizon no longer offers individual plans. Even consumers with only a single smartphone must purchase a shared data plan. That’s sensible from Verizon’s perspective since mandatory tethering means that Verizon effectively cannot enforce a single-user contract. The result is that Verizon no longer directly competes for light users.

The figure below shows the least amount of money a consumer can pay each month on a contract at the major wireless providers. As the table below the figure highlights, the figure does not present an apples-to-apples comparison, but that’s not the point—the point is to show the choices facing a user who wants voice and data, but the smallest possible amount of each.

Note: Assumes no data overages.

The figure shows that this thrifty consumer could spend $90/month at Verizon, $60/month at AT&T, $70/month at T-Mobile, and $65/month at Sprint if the consumer is willing to purchase voice/text and data plans separately. Even Verizon’s prepaid plan, at $80/month, costs more than the others’ cheapest postpaid plans.

Moreover, prior to the shift to “share everything” plans, this consumer could have purchased an individual plan from Verizon for $70/month, $20/month less than he would pay today. At AT&T the price was $55/month and increased by only $5/month. Again, the point is not to show that one plan is better than another. Verizon’s cheapest plan offers 2 GB of data, unlimited voice and texts, and tethering, while AT&T’s cheapest plan offers 300 MB of data, 450 voice minutes, and no texts or tethering. Which plan is “better” depends on the consumer’s preferences. Instead, the point is to show the smallest amount of money a light user could spend on a postpaid plan at different carriers, and that comparison reveals that Verizon’s cheapest option is significantly more expensive than other postpaid options and, moreover, increased significantly with the introduction of the shared plan.

Is the FCC’s Verizon Tethering Decision Responsible for this Industry Price Structure?

There’s no way to know for sure. The rapidly increasing ubiquity of households with multiple wireless devices means that shared data plans were probably inevitable. And carriers compete on a range of criteria beyond price, including network size, network quality, and handset availability, to name a few.

Nevertheless, Verizon introduced its “share everything” plans about a month before the FCC’s decision. If we make the not-so-controversial assumption that Verizon knew it would be required to allow “free” tethering before the decision was made public and that individual plans would no longer be realistic for it, then the timing supports the assertion that “share everything” was, at least in part, a response to the rule.

How Many Customers Use These “Light” Plans?

Cisco estimated that in 2011 the average North American mobile connection “generated” 324 megabytes. The average for 2012 will almost surely be higher, and higher still among those with higher-end phones. Regardless, even average use close to 1 GB would imply a large number of consumers who could benefit from buying light-use plans, regardless of whether they do.

Did the FCC’s Tethering Decision Benefit or Harm Consumers?

It probably did both.

The consumer benefits: First, Verizon customers who want to tether their devices can do so without an extra charge. Second, AT&T and Sprint followed Verizon in offering shared data plans, with AT&T’s shared plans also including tethering. Craig Moffett of AllianceBernstein noted recently that “Family Share plans are not, as has often been characterized, price increases. They are price cuts…”[2] because the plans allow consumers to allocate their data more efficiently. As a result, he notes, investors should worry that the plans will reduce revenues. In other words, the shared plans on balance probably represent a shift from producer to consumer surplus.

The consumer costs: Verizon is no longer priced competitively for light users.

The balance: Given that other carriers still offer postpaid plans to light users and that a plethora of prepaid and other non-contract options exist for light users, the harm to consumers from Verizon’s exit is probably small, while the benefits to consumers may be nontrivial. In other words, the net effect was most likely a net benefit to consumers.

What Does This Experience Tell Us?

The FCC’s decision and industry reaction should serve as a gentle reminder to those who tend to favor regulatory intervention: even the smallest interventions can have unintended ripple effects. Rare indeed is the rule that affects only the firm and activity targeted and nothing else. More specifically, rules that especially help the technorati—those at the high end of the digital food chain—may hurt those at the other end of the spectrum.

But those who tend to oppose regulatory intervention should also take note: not all unintended consequences are disastrous, and some might even be beneficial.

Is That a Unique Observation?

Not really.

Could I Have Done Something Better With My Time Instead of Reading This?

Maybe. Read this paper to find out.


[1] The FCC allowed Verizon to continue charging customers with grandfathered “unlimited” data plans an additional fee for tethering.

[2] Moffett, Craig. The Perfect Storm. Weekend Media Blast. AllianceBernstein, November 2, 2012.

Is a broadband tax a good idea?

Thursday, August 30th, 2012

The FCC recently asked for comments on a proposal to raise money for universal service obligations by taxing broadband connections. Let’s set aside, for the moment, the question of whether the universal service program has worked (it hasn’t), whether it is efficient (it isn’t), and whether the reforms will actually improve it (they won’t). Instead, let’s focus on the specific question of whether taxing broadband is the best way to raise money for any given program telecommunications policymakers want to fund.

The answer, in typical economist fashion, is that it depends.

A tax is generally evaluated on two criteria: efficiency and equity. The more “deadweight loss” the tax causes, the more inefficient it is. Deadweight loss results from people changing their behavior in response to the tax and, in principle, can be calculated as a welfare loss.
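The standard textbook approximation for this welfare loss (the Harberger triangle) makes the point concrete; the symbols here are generic, not estimates from this post:

```latex
% Deadweight loss of a small per-unit tax t on a good with price p,
% quantity q, and (absolute) price elasticity of demand \varepsilon.
% The tax reduces quantity by roughly
% \Delta q = \varepsilon \frac{q}{p}\, t,
% and the welfare loss is the triangle with base \Delta q and height t:
\[
\mathrm{DWL} \;\approx\; \tfrac{1}{2}\, t\, \Delta q
  \;=\; \tfrac{1}{2}\, \varepsilon\, \frac{q}{p}\, t^{2}.
\]
```

The squared tax term is why deadweight loss grows faster than the tax itself, and the elasticity term is why taxing a relatively inelastic service generates comparatively little inefficiency.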

Closely related to efficiency is how the tax affects policy goals. This question is particularly relevant here because the service being taxed is precisely the service the tax is supposed to support, making it possible that the tax itself could undo any benefits of the spending it funds.

Equity—in general, how much people of different income levels pay—is simple in concept but difficult in practice, since it is not possible to say objectively what share of the tax any given group should pay.

Perhaps surprisingly to some, a broadband tax may actually be efficient relative to some other options, including income taxes (i.e., coming from general revenues). Historically, universal service funds were raised by taxes on long distance service, which is highly price sensitive, making the tax quite inefficient.

By contrast, for the typical household, fixed (and, increasingly, mobile) broadband has likely become quite inelastic. In 2010, one study estimated that the typical household was willing to pay about $80 per month for “fast” broadband service, while the median monthly price for that service was about $40. Since then, the number of applications and available online services has increased, meaning that consumer willingness to pay has presumably also increased, while according to the Bureau of Labor Statistics broadband prices have remained about the same.

Consumer Price Index, Internet Services and Electronic Information Providers

Source: Bureau of Labor Statistics, Series ID CUUR0000SEEE03, adjusted so January 2007=100

While no recent study has specifically evaluated price elasticity, the large gap between prices and willingness to pay suggests that a tax of any size likely to be considered might not be hugely inefficient overall.
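A back-of-the-envelope sketch makes the claim concrete. The elasticity value here is an assumption chosen to reflect inelastic demand; it does not come from any cited study:

```python
# All numbers assumed for illustration; the elasticity is not from any cited study.
price = 40.0        # median monthly price for "fast" broadband (2010 estimate)
tax = 5.0           # hypothetical monthly connection tax
elasticity = -0.1   # assumed price elasticity of demand for broadband

pct_price_change = tax / price                       # a 12.5% effective price increase
pct_quantity_change = elasticity * pct_price_change  # predicted change in subscriptions
print(f"{pct_quantity_change:.2%}")                  # prints -1.25%
```

Under these assumptions, even a fairly large tax shrinks total subscriptions by only about a percentage point, which is why the overall inefficiency may be modest.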

The problem is that even if the tax does not affect subscription decisions by most people, it can still affect precisely the population policymakers want to help. Even though only 10 percent of people who do not have broadband cite price as the barrier, there is some lower price at which people will subscribe. A tax effectively increases the price consumers pay, meaning that it puts people at that margin—people who may be on the verge of subscribing—that much further away from deciding broadband is worthwhile. Similarly, people on the other side of that margin—those who believe broadband is worthwhile, but just barely—will either cancel their subscriptions or subscribe to less robust offerings.

To be sure, people who would be eligible for low-income support would probably receive more in subsidies than they pay in taxes, but this is an absurdly inefficient way to connect more people. As one astute observer noted, it is not merely like trying to fill a leaky bucket, but perhaps more like trying to fill that bucket upside-down through the holes.[1]

Higher prices for everyone highlight the equity problem. A connection tax is the same for everyone regardless of income, making it regressive. The tax becomes even more regressive because much of the money goes to rural residents regardless of their income, while everyone pays regardless of theirs, meaning the tax includes a transfer from the urban poor to the rural rich.

Even without an income test, methods exist to mitigate the equity problem. Unfortunately, the methods the FCC proposes are likely to undermine other policy goals. In particular, the FCC asks about the effects of taxing by tier of service, presumably with higher tiers of service paying more (paragraph 249). The FCC does not specifically mention equity in its discussion, but if higher-income people are more likely to have faster connections, then taxing by tier would help mitigate equity concerns.

This tiered tax approach is commonly used for other services, including electricity and water, where a low tier of use is taxed at a low rate, and higher usage rates are taxed incrementally more. Therefore, in the case of water, for example, a family that uses water only for cooking and cleaning will pay a lower tax rate than a family that also waters its lawn and fills a swimming pool. And while it is not a perfect measure of income, in general wealthier people are more likely to have big lawns and pools.
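The water example above can be made concrete with a small increasing-block calculation. The tier boundaries and rates here are hypothetical:

```python
def tiered_tax(usage, tiers):
    """Tax owed under an increasing-block (tiered) schedule.

    tiers: list of (upper_bound, rate) pairs in ascending order; the last
    upper bound may be float('inf'). Each unit of usage is taxed at the
    rate of the block it falls in.
    """
    tax, lower = 0.0, 0.0
    for upper, rate in tiers:
        if usage > lower:
            tax += (min(usage, upper) - lower) * rate
        lower = upper
    return tax

# Hypothetical water-style schedule: first 10 units at $0.10 each,
# the next 10 at $0.25, and anything above 20 at $0.50 per unit.
schedule = [(10, 0.10), (20, 0.25), (float("inf"), 0.50)]
print(tiered_tax(8, schedule))    # light user: 8 * 0.10 = 0.80
print(tiered_tax(30, schedule))   # heavy user: 1.00 + 2.50 + 5.00 = 8.50
```

Because each block is taxed at its own rate, heavy users face a higher average rate than light users, which is the equity-improving feature of the design.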

The problem with this approach in broadband is that while willingness to pay for “fast” broadband is relatively high, most people are not yet willing to pay much more for “very fast” broadband. Thus, taxing higher tiers of service at a higher rate, while more equitable, may create other efficiency problems if it reduces demand for those higher tiers.

So what’s the solution?

The FCC should decide which objective is most important: efficiency, equity, or another policy goal such as inducing more people to subscribe or upgrade their speeds. It should then design the tax that best achieves that goal, compare this “best” tax with simply funding the program from general revenues, and determine which approach leads to the better outcome.

But no tax is worthwhile if the program it supports is itself inefficient and inequitable. The real solution is to dramatically reduce spending on ineffective universal service programs in order to minimize the amount of money needed to fund them. Unfortunately, the reforms appear to do just the opposite. In 2011, the high-cost fund spent $4.03 billion and had been projected to decline further. The reforms, however, specified that spending should not fall below $4.5 billion (see paragraph 560 of the order), meaning that the first real effect of the reforms was to increase spending by half a billion dollars. And, as the GAO noted, the FCC “has not addressed its inability to determine the effect of the fund and lacks a specific data-analysis plan for carrier data it will collect” and “lacks a mechanism to link carrier rates and revenues with support payments.”

The right reforms include integrating true third-party evaluation mechanisms into the program and, given the vast evidence of inefficiency and ineffectiveness, a future path of steady and significant budget cuts. Those changes, combined with an efficient tax-collection method, might yield a program that efficiently targets those truly in need.


[1] This excellent analogy comes from Greg Rosston via personal communications.

Hey, FCC: Stop Counting!

Friday, June 1st, 2012

By June 2011, nearly one-third of American households relied solely on wireless voice service, with lower-income households more likely to be wireless-only. This information doesn’t come from the FCC, as you might expect. Instead, it comes from the twice-yearly National Health Interview Survey, conducted by the U.S. Census Bureau for the Centers for Disease Control and Prevention (CDC).[1] The example highlights three points policymakers should take to heart for data collection relevant to telecommunications:

  • The FCC does not always produce the most relevant telecommunications data.
  • Careful, representative surveys—not population counts, which the FCC uses for measuring voice and broadband markets—are usually the most effective and efficient way to gather data.
  • Policymaking agencies like the FCC can obtain relevant data from other agencies like the U.S. Census that specialize in data collection but have no vested interest in any particular policy outcome.

Counting telephones began at the turn of the 20th century

The U.S. Census began to collect data on telephones as they became an increasingly important part of American life. In 1922 the Bureau noted, “The census of telephones has been taken quinquennially since 1902, and statistics of telephones were compiled and published in the decennial censuses of 1880 and 1890.”[2]

The FCC has largely continued this tradition, attempting to count each line or connection for communications technologies. (Some—not me, of course—might say delays in producing some reports indicate a desire to revert to the quinquennial release schedule).

Maintaining a consistent approach to data-gathering has certain advantages, such as facilitating comparisons over time. However, that advantage diminishes as it becomes less clear what, exactly, we should measure and as market changes make any particular count less relevant.

Counting is inefficient and misses the most important data

Most economic and social policy is based on surveys conducted by agencies such as the Census Bureau and the Bureau of Labor Statistics. We rely on surveys because gathering information on an entire population is typically not feasible. For the constitutionally mandated decennial census, for example, the Census Bureau spends about $11 billion and hires about one million temporary workers.[3] By contrast, in a non-census year, the Census Bureau spends about $1 billion on all its data collection efforts.[4] Additionally, surveys make it possible to gather data about particular groups and to estimate how closely different measures reflect the actual population.
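The statistical advantage of surveys is easy to quantify with the standard sampling-error formula for a proportion. The sample size and share below are illustrative assumptions, roughly in the spirit of a survey like the NHIS:

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """95% margin of error for a share estimated from a simple random sample."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Assumed numbers: a ~30% wireless-only share estimated from 15,000 households.
print(margin_of_error(0.30, 15000))  # about 0.0073, i.e. under one percentage point
```

A sample of a few thousand well-chosen households pins the national share down to within a point; attempting to count every connection buys little extra precision at enormous cost.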

The FCC attempts to count all lines, connections, and other factors related to telecommunications by requiring companies to provide certain data. Large firms spend significant resources providing these data. Small firms often do not have the resources to provide this information, and the FCC’s skilled data staff then must spend enormous time and effort trying to gather this information from firms who either will not or cannot respond.

The result is that the FCC has the least reliable count data in precisely the topical and geographic areas that it needs data for sound decisionmaking. For example, counts of broadband connections provide some measure of the intersection of supply (availability) and demand, but not good information on either separately. The counts provide no information on how those connections are used nor on how they break down demographically.[5]

This telecommunications counting fetish has spread to other parts of the government, as well. The National Broadband Map is based on the same flawed premise: the belief that the best dataset comes from observing every detail of every broadband connection. The effort cost about $350 million and still apparently yields inaccurate results in rural areas where policymakers want to direct resources.[6]

The FCC Should Stop Counting and Start Contracting the Census Bureau to Do Surveys

Nearly all other areas of economic policy are informed by surveys, many of which are conducted monthly to provide real-time information to markets and policymakers. Nothing in particular about telecommunications requires a total population count rather than survey data.

Additionally, there is no reason why the FCC itself should be responsible for data collection. The U.S. Census Bureau is much better equipped to design and implement surveys. It is not uncommon for Census to do survey work for (and funded by) other agencies. In addition to the CDC survey mentioned above, Census also does surveys for the Department of Justice,[7] the National Center for Education Statistics,[8] and State Library Agencies[9] to name a few.

Embracing surveys conducted by other agencies would have several advantages:

  • Surveys are almost certain to be cheaper than counts both to the government and to the private sector.
  • Surveys of users, rather than counts submitted by providers, are more likely to yield data not influenced by providers’ incentives to game the data collection process to their own benefit.
  • Data collection by outside agencies would reduce any inherent conflict of interest the FCC might face when gathering data related to its agenda.

Surveys by other agencies, of course, are not a silver bullet for obtaining better and more timely data. They can be done poorly. And the FCC should remain involved. As the expert agency it should largely determine the questions it needs answered and the type of information necessary for policymaking and provide the resources necessary to do it. Additionally, the FCC needs the ability to compel data from regulated companies for specific decisions when necessary.

Today, unfortunately, surveys are being subjected to attacks by Congressional Republicans, who want to reduce the ability of the U.S. Census Bureau to collect data.[10] These attacks have been roundly and correctly criticized by both conservative and liberal commentators, who note that these data are crucial to good policymaking.[11]

Despite the Congressional statistical ignorance du jour, surveys by agencies expert in data collection will yield far better data at lower cost than today’s methods. Hopefully the FCC will take note and begin to move our ability to study telecommunications out of the 19th century.


[1] http://www.cdc.gov/nchs/nhis.htm

[2] http://www2.census.gov/prod2/decennial/documents/13473055ch1.pdf

[3] http://usgovinfo.about.com/od/censusandstatistics/a/aboutcensus.htm

[4] http://www.osec.doc.gov/bmi/budget/12CJ/Census_Bureau_FY_2012_Congressional_Submission.pdf

[5] It is possible to merge geographic counts with demographic data from the Census, but even this approach would be more effective if done in a way that explicitly incorporates connections to the Current Population Survey.

[6] http://www.govtech.com/wireless/Study-National-Broadband-Map-Inaccurate.html

[7] http://www.census.gov/econ/overview/go2300.html, http://www.census.gov/econ/overview/go2500.html

[8] http://www.census.gov/econ/overview/go1600.html,  http://www.census.gov/econ/overview/go2000.html

[9] http://www.census.gov/econ/overview/go1900.html

[10] http://www.nytimes.com/2012/05/20/sunday-review/the-debate-over-the-american-community-survey.html

[11] See, for example, Matthew Yglesias’s discussion: http://www.slate.com/articles/business/moneybox/2012/05/american_community_survey_why_republican_hate_it_.html

What Cable Monopoly?

Thursday, May 3rd, 2012

“The future is in fiber optic high-speed Internet access, as compared to DSL and cable modem service.”

“Many new business models are made possible by high-speed access, and fiber access in particular. By contrast, DSL and cable modem access are subject to sharp capacity limitations which are rapidly rendering them obsolete for the types of activities Americans want to engage in online.”

-      Crawford, Susan P. “Transporting Communications.” Boston University Law Review 89, no. 3 (2009): 871–937, pp. 928 & 930.

“…the broad consensus seems to be that the long-term fixed platform will likely be fiber, and cable plant too will likely become increasingly fiber-based over time, as the theoretical and long-term practical capacity of fiber to the home systems will be orders of magnitude larger than for cable systems.”

-       Benkler, Yochai, Rob Faris, Urs Gasser, Laura Miyakawa, and Stephen Schultze. Next Generation Connectivity: A Review of Broadband Internet Transitions and Policy from Around the World. The Berkman Center for Internet & Society, 2010, p.63

What a difference a few years makes! As late as 2009 Susan Crawford was arguing that cable broadband was becoming obsolete and Harvard’s Berkman Center believed the only long-term answer to increasing broadband demand was fiber.

Today, Crawford is warning of a looming cable monopoly. To be sure, DOCSIS 3.0 technology has given cable a relatively low-cost upgrade path while traditional telcos generally have to invest far more in fiber to achieve similar performance.[1]

So, what is really happening in the market? As the chart below shows, data on fixed broadband subscriptions contradict the claims of monopoly. The most recent FCC data go only through December 2010, so we extend the figure to June 2011 using data from the OECD.[2] The data show that cable has always held the majority of connections, peaking around 2003, when it held close to 60 percent of the fixed broadband market.

Sources: FCC reports on local telephone competition and broadband deployment, and OECD http://www.oecd.org/document/23/0,3746,en_2649_34225_33987543_1_1_1_1,00.html

The share of cable connections is trending upwards, but, at least as of last year, did not appear to be significantly different from the past.

More recent data comes from companies’ financial reports. The following chart shows the quarter-to-quarter percentage change in the number of high-speed Internet subscribers for Comcast, Time Warner Cable, Verizon, and AT&T. Cable companies have been doing well in terms of net additions for several quarters, but not significantly better than Verizon, and even AT&T is reporting net gains from its U-Verse platform.

Sources: Company quarterly and trending reports.

Note: Time Warner Cable reported 10.716 million HSI (high-speed Internet) subscribers in Q1 2012, which represented close to a 7 percent increase over Q4 2011. However, 550,000 of that increase came from TWC’s acquisition of Insight Communications and 42,000 from the acquisition of NewWave Communications. The percentage shown in the figure deducts increases due to acquisitions.[3]

None of this evidence means that Crawford’s warnings are necessarily wrong, of course. Whether cable’s cost advantage will ultimately translate into a monopoly or any increased market power, however, will depend not just on technological differences but also on changes in demand.

When an HD video stream from Netflix requires less than 5 Mbps, cable’s DOCSIS 3.0 offers little advantage over a DSL connection delivering at least 5 Mbps. But demand will surely change over time, and cable’s cost advantage will be an important point in its favor. That’s one reason why AllianceBernstein’s analyst Craig Moffett is so bullish on cable stocks.

Even as critics pivot from demanding that we focus only on fiber to warning of a cable monopoly, the market is shifting under their feet again. Today, consumers are adopting smartphones and tablets in droves. The trend toward wireless is already affecting the development of Internet innovation (think mobile apps). Cable still has some advantages in that area—wireless providers need to offload their data somewhere, after all—but it may still not end up as the dominant technology.

More generally, this market changes quickly. A few years ago policymakers were being urged to focus on fiber. Now they are being warned about a cable monopoly even as wireless broadband is taking center stage, as the FCC data shown in the figure below demonstrate. And surely in a few years technology and demand will have moved us in directions we can’t yet predict.

Source: FCC Internet Access Services Report, October 2011, Table 7 http://transition.fcc.gov/wcb/iatd/comp.html

Policymakers should, without a doubt, keep a close eye on market conditions and work to ensure an environment conducive to competition. But if this fast-changing market teaches us anything, it’s that we should think twice before we conclude we know the endgame.


[1] Christopher Yoo has pointed out the fiber-cable worry flipflop, so I can’t claim credit for noticing it.

[2] I have been very critical of the OECD rankings. However, data for a single country over time should be reliable if the within-country definitions remain constant. Judging from how closely the OECD data track the FCC data it is likely they come from similar sources.

[3] http://www.fiercecable.com/story/insight-leadership-team-departs-following-completion-time-warner-cable-acqu/2012-03-01 and http://www.businesswire.com/news/home/20110613005676/en/Time-Warner-Cable-Acquire-Cable-Systems-NewWave