Comcast and Netflix—What’s the Big Deal?

By Tom Lenard
February 26th, 2014

Netflix and Comcast recently announced an agreement whereby Netflix will pay Comcast for direct access to its network.  The agreement addresses congestion that has been slowing delivery of Netflix videos to Comcast's broadband subscribers and resolves a dispute between the two companies over how to pay for the needed network upgrades.  Netflix and Verizon are currently working through a similar dispute.  While some commentators think deals such as the one between Netflix and Comcast are problematic, the reality is that the agreement reflects a common market transaction that yields a more efficient outcome, more quickly, than any regulatory intervention could have.

The following series of stylized figures illustrates how the growth of Netflix and other streaming video services has affected the volume and flow of Internet traffic, and the corresponding payments, in recent years.  Traditionally (Figure 1), Internet backbone providers and ISPs entered into "peering" agreements, which did not call for payments on either side, reflecting a relatively balanced flow of traffic.  Content distributors paid backbone providers for "transit," reflecting the unbalanced flow of traffic along that route.

[Figure 1]

With the growth of online video and with Netflix accounting for 30 percent of traffic at some times of the day, this system was bound to become strained, as we are now seeing and as shown in Figure 2.  The flow of traffic between the backbone provider and the ISP is unbalanced and has grown enormously, requiring investments in additional capacity.

[Figure 2]

One way to address this problem is for the backbone provider to pay the ISP, reflecting the greater amount of traffic (and greater capacity needed) going in that direction (see Figure 3).  In fact, that is what happened following a dispute between Level 3 and Comcast in late 2010.

[Figure 3]

Another solution is the just-announced Comcast-Netflix deal, reflected in Figure 4.  In this case, Netflix and Comcast are bypassing the intermediate backbone provider (either partially or completely), presumably because it is more efficient to do so.  One or both of them is investing in the needed capacity.  Regulatory interference with such a deal runs the risk of blocking an advance that would lower costs and/or raise quality for consumers.

[Figure 4]

The Wall Street Journal has described the debate as being “over who should bear the cost of upgrading the Internet’s pipes to carry the nation’s growing volume of online video:  broadband providers like cable and phone companies, or content companies like Netflix, which make money by sending news or entertainment through those pipes.”  Ultimately, of course, consumers pay one way or the other.  When Netflix pays Comcast, the cost is passed through to Netflix subscribers.  This is both efficient and fair, because the consumer of Netflix services is paying for the cost of that service.

In the absence of such an agreement, quality would suffer or the ISP would bear the cost.  The ISP might recover these costs by increasing prices to subscribers generally.  This would involve a cross-subsidy of Netflix subscribers by non-subscribers, which would be neither efficient nor fair.  Alternatively, Comcast could increase prices for those subscribers who consume a lot of bandwidth, which might have similar effects to the just-announced deal, but would probably lose some efficiencies.  In any event, it is difficult to see how such an arrangement would be better for consumers than the announced agreement.

 

 

The FCC Tries Yet Again

By Tom Lenard
February 19th, 2014

FCC Chairman Tom Wheeler’s official response to the DC Appeals Court decision on the Commission’s “net neutrality” rules promises to keep the issue on the table for the foreseeable future.  That is unfortunate, because there are better ways for the Commission and its staff to spend their time.

The Appeals Court took away from the Commission with one hand, while giving back with the other:  It struck down the more onerous provisions of the net neutrality rules—the “anti-discrimination” and “anti-blocking” provisions—because they imposed common carrier obligations and broadband is not classified as a Title II common carrier service.  However, the Court affirmed the Commission’s argument that it has general authority (under section 706 of the Communications Act) to regulate in order to encourage broadband deployment.

Since the Appeals Court decision came down, the FCC has been under considerable pressure from net neutrality proponents to reclassify broadband as a Title II common carrier service.  In today's announcement, the Commission declined to do that.  However, the Commission also declined to close the Title II docket, keeping alive the threat of reclassification and the regulatory burdens and oversight that go with it.

In addition, the Commission announced its intention to start yet another net neutrality rulemaking, under its section 706 authority, in order to fulfill the Commission's no-blocking and non-discrimination goals as well as to enhance the transparency rule (the one major provision the court upheld).

Amid all the activity aimed at establishing legal justification for its net neutrality rules, it sometimes gets lost that the FCC had no convincing economic or consumer-welfare justification for the rules in the first place.

While there is widespread agreement that the Internet should be open and provide consumers with access to content, applications and services of their choice, the rule was always a solution in search of a problem, a sentiment echoed today by FCC Commissioner Pai.  The Commission never provided the necessary data and analysis to show that the rules would address a significant market failure, did not identify harms to users that the rules would remedy, and did not demonstrate that the benefits of the rules would exceed their costs.  In other words, the Commission neglected to explain why the broadband market, which has generally thrived under minimal regulation, should now be subject to an enhanced regulatory regime.   Indeed, a good argument can be made that, by making the adoption of innovative business models more difficult, the rules would have hindered rather than encouraged the deployment of broadband infrastructure, notwithstanding the Commission’s assertions to the contrary.

There is now substantial concern that the Appeals Court has expanded the Commission’s authority to include the entire Internet ecosystem—including potentially content, applications, and service providers—as long as it can make some plausible argument that its actions encourage broadband deployment.  Expanding the Commission’s domain in this way would be a serious mistake and would compound the harm.

A major goal of the Commission in promulgating its net neutrality rules initially was to “provide greater predictability.”  It clearly has not achieved that goal.  Starting yet another proceeding, and keeping the Title II docket open, will create even more uncertainty for the entire Internet ecosystem.

Chairman Rockefeller and Data Brokers

By Amy Smorodin
September 26th, 2013

Chairman Rockefeller recently sent letters to a dozen different companies seeking information on how they share consumer information with third parties.  The letters are an extension of previous requests sent to "data brokers" asking for clarification of the companies' "data collection, use and sharing practices."  In the letters, the Chairman opines that the privacy policies on many websites "appear to leave room for sharing a consumer's information with data brokers or other third parties who in turn may share with data brokers."  He also stresses the importance of transparent privacy practices for consumers.

While a call for more information and data is certainly commendable, one should ask, "Where is this all going?"  Is the Chairman suddenly seeing the need for some data to inform policymaking in this area?

While we would hope so, the Chairman's letter implies that there is something inherently harmful about data collection and sharing, although this harm is not explicitly described.  He also posits that consumers may not be aware that their information is being collected or how it is being used.  Again, no information is offered on how this conclusion was reached.

Overall, more data to inform privacy policymaking would be a good thing.  As Tom Lenard has pointed out in filings, Congressional testimony, and a recent book chapter submission, the last comprehensive survey of privacy policies was back in 2001, a lifetime ago in the technology industry.  Ideally, any privacy proposals from Congress or the FTC should be based on a survey of actual current practices on the ground, as opposed to opinions and assumptions.  Only with relevant data can policies be drafted that are targeted at specific harms.  Additionally, data-driven policymaking can be evaluated to ensure that a specific policy is performing as intended and that the benefits derived outweigh the costs of the regulation.

Data collection is burdensome and time-consuming for the companies involved.  Any other government entity (besides Congress) would be required under the Paperwork Reduction Act to have such a request assessed, since agencies are required to "reduce information collection burdens on the public."  Since it does not appear that Rockefeller's recent requests for information are part of any systematic study or plan, it is understandable why some companies would bristle at the thought of spending time and resources answering a list of questions.

The FTC recently conducted its own inquiry in preparation for a study on "big data" and the privacy practices of data brokers.  One hopes the study, expected to be out by the end of the year, is well designed and takes an objective look at the industry without predetermined results.  Such a study would be useful going forward.

Dispatch from the TPI Aspen Forum – Monday Keynotes, Panels and Beyond

By Amy Smorodin
August 20th, 2013

(With help from Corey Rhyan)

The first full day of the TPI Aspen Forum began with a keynote speech by Bob Crandall, TPI Adjunct Senior Fellow and Nonresident Senior Fellow in the Economic Studies Program at the Brookings Institution.  Crandall's remarks covered how broadband policy should be informed by an accurate assessment of current market conditions.  Despite what Crandall described as a pessimistic tone in recent reports on US broadband, a relaxed regulatory environment has led to a broadband penetration rate of over 98% in the US (including wireless options), and US broadband speeds have been steadily increasing.  The US also leads the globe in deployment of 4G wireless services.  As a result of robust competition between cable and copper, US cable companies have deployed super-fast DOCSIS 3.0 technology to 85% of households, and incumbent telecom providers have exceeded the cable companies' capital investment in recent years in an effort to match their services.  While super-fast 100 Mbps speeds are often the topic of policy discussions, Crandall pointed to evidence that households do not want to pay for extremely high-speed service even when it is available.  Crandall's remarks can be viewed here.

Next up was the panel "Communications and IT – What Can We Expect From Congress?," which featured former members of Congress Rick Boucher, Cliff Stearns, and Tom Tauke and was skillfully moderated by Brendan Sasso of The Hill.  The free-wheeling discussion began with an open question: what is the most important tech issue Congress is likely to address?  While the answers varied from privacy, to spectrum and the upcoming incentive auctions, to cybersecurity, to NSA surveillance, each panelist opined that the current Congress has a "productivity problem" when it comes to passing legislation.  One development some found particularly encouraging was the recently created working groups on spectrum and privacy.  When asked about the current nominees for the FCC, all agreed confirmation should be easy, particularly because the Republican and Democratic nominees are paired, but concern was voiced over the current nomination process.  Watch the entire (very entertaining) panel here.

The next panel, "Deconstructing Creative Destruction," was a nod to the overall theme of the conference and featured Danny Boice (Speek), Chris Ciabarra (Revel Systems), Joshua Gans (University of Toronto), Laura Martin (Needham & Company LLC), and Hal Varian (Google), and was moderated by TPI's Scott Wallsten.  The two startup representatives, or "real-world doers" according to Wallsten, discussed how their companies have become disruptive forces in their industries.  The key, each entrepreneur proclaimed, was solving a problem, particularly one that affects consumers and end-users.  The panel also discussed hurdles such as the use of H-1B visas in startups, obstacles in hiring, and financing issues in innovative technologies.  Martin discussed today's tax and investment environment, especially for the media and communications industries.  The video can be viewed online here.

The third and final panel of the day, "Competition, Regulation, and the Evolution of Internet Business Models," focused on potential innovations in the pricing of broadband services and featured Kevin Leddy (Time Warner Cable), Robert Quinn (AT&T), Joshua Wright (FTC), and Christopher Yoo (University of Pennsylvania Law School), and was moderated by TPI's Tom Lenard.  Much of the panelists' discussion focused on new pricing models that could make broadband networks more efficient and create value for consumers.  A common theme, however, pitted these innovations against the open Internet rules currently under review by the DC Circuit Court.  In fact, attempts so far to implement usage-based pricing have been called "discrimination" and have resulted in quick backlash.  FTC Commissioner Wright stated that he believes the FTC is more than capable of protecting consumers in this space, that many of the proposed innovations and vertical agreements are procompetitive, and that the FTC can prevent those that may harm consumers.  The video can be viewed online here.

Monday lunch featured a speech from FTC Chairwoman Edith Ramirez, who focused her talk on the future of Big Data and the FTC's role as a lifeguard for consumers.  Media coverage of the speech can be found here and here, and video of Chairwoman Ramirez's remarks can be viewed here.

Last night's dinner keynote was delivered by the Hon. Mitch Daniels, President of Purdue University and former Governor of Indiana.  Daniels opined on "creative destruction" in higher education.  Video is here.

Tuesday's panels and keynotes will be posted throughout today on the TPI YouTube channel.  They include a keynote by Randal Milch, Executive Vice President of Public Policy and General Counsel at Verizon, and the discussion panels "Who Pays for the Internet – A Global Perspective," "Privacy, Data Security and Trade – Policy Choices," and "The FCC's Incentive Auctions – How Can They Succeed?"

The conference concludes this afternoon with “A Conversation with the Commissioners,” moderated by Politico’s Tony Romm.  Video of the talk will be up later this afternoon.

Thanks to all attendees and speakers who came out to the TPI Aspen Forum this year!  All of us at TPI are now taking a little break.  Hope to see you next year!

Dispatch from the TPI Aspen Forum – Sunday Opening Reception

By Amy Smorodin
August 19th, 2013

(With help from Corey Rhyan)

The 2013 Technology Policy Institute Aspen Forum started out this year with a little rain but plenty of good conversation.  Welcoming remarks were given by TPI President Tom Lenard and TPI Board Member Ray Gifford, who emphasized that the Forum was a great way to end the summer.

Every year, TPI secures a Colorado-based speaker to welcome attendees to the Forum.  This year's speaker was R. Stanton Dodge, Executive Vice President, General Counsel and Secretary of Dish Network.  In keeping with the Forum theme, Dodge opined on creative destruction, both past and present, in the video delivery industry.  From the adoption of smaller satellite dishes to the rise of the DVR, the changing expectations of consumers have dictated change in the industry, which must now transition to providing content on demand.

Dodge also urged attendees to take time to watch the US Pro Cycling Challenge, which happens to be going through Aspen this year – after attending the afternoon breakout sessions, of course.

Video of last night’s remarks will be posted shortly on the TPI YouTube page, and you can follow along with the pithy and insightful tweets from attendees at #TPIAspen.

Highlights of today’s panels and keynotes will be coming soon.

Where do vendors to cable think the industry is heading? Evidence from 2013 Cable Show data

By Scott Wallsten
June 11th, 2013

For the past four years (2010 – 2013) I have been collecting data about exhibitors at the Cable Show. Key observations based on the most recent data:

  • The number of exhibitors continues to decline, down to 251 in 2013 from 345 in 2010 (Figure 1).
  • Programming is the most popular exhibitor category, and has been steadily increasing in popularity since 2010. In 2013 nearly one-third of exhibitors classify themselves under programming. Multi-screen content, HDTV, video on demand, and IPTV are the second, third, fourth, and fifth most popular categories (Figure 2).
  • The categories with the biggest increases in representation since 2010 are multi-screen content, programming, HDTV, new technology, and cloud services (Figure 3).
  • The categories with the biggest decreases in representation since 2010 include telecommunications equipment, services, and VOIP (Figure 4).

Exhibitor attendance

This year, the website listed 251 exhibitors, continuing a steady decline from 2010 (Figure 1). The number is biased upwards because an exhibitor can be counted multiple times if it appears in multiple booths.

Figure 1: Number of Cable Show Exhibitors, 2010-2013


 

Hot or Not?

The website shows the categories of products, services, or technologies each exhibitor selects to describe itself. An exhibitor can select several categories. To evaluate the prevalence of each category I total the number of times each category is selected, and then divide that by the number of exhibitors to make it comparable across years.
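As a rough illustration of that normalization, here is a minimal sketch in Python using made-up exhibitor records (the field names and data are hypothetical, not the actual Cable Show listings):

```python
from collections import Counter

# Hypothetical exhibitor records: each exhibitor lists the categories it selected.
# Illustrative data only; the real listings come from the Cable Show website.
exhibitors_2013 = [
    {"name": "Acme Video", "categories": ["Programming", "Multi-Screen Content"]},
    {"name": "StreamCo",   "categories": ["Programming", "HDTV", "Video on Demand"]},
    {"name": "PipeWorks",  "categories": ["IPTV"]},
]

def category_prevalence(exhibitors):
    """Total the number of times each category is selected, then divide by the
    number of exhibitors so the measure is comparable across years."""
    counts = Counter(cat for ex in exhibitors for cat in ex["categories"])
    n = len(exhibitors)
    return {cat: count / n for cat, count in counts.items()}

for category, share in sorted(category_prevalence(exhibitors_2013).items(),
                              key=lambda kv: kv[1], reverse=True):
    print(f"{category:22s} {share:.0%}")
```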

The table below shows the top 20 categories for 2010 – 2013. Programming has remained the top category for all four years. However, multi-screen content jumped to second place, followed by HDTV, pushing video on demand and IPTV to numbers four and five.

[Table: Top 20 exhibitor categories, 2010 – 2013]

 

Hot

Figure 2 shows how the top 5 exhibitor categories for 2013 have evolved over the past four years. Fully one-third of all exhibitors classify themselves as programming, nearly twice as many as in 2010. Multi-screen content did not exist as a category in 2010 while 16 percent of all exhibitors included themselves in this category in 2013.

Figure 2: Share of Exhibitors with Products in Top 5 2013 Categories Over Time


 

Consistent with the above figure, from 2010 – 2013 cable programming increased in representation more than any other category. Multi-screen content saw the second-largest increase, followed by mobile apps, new technology, and cloud services.

Figure 3: Categories with Biggest Increase in Representation Since 2010

Not

Telecommunications services and equipment have seen the biggest decrease in representation since 2010, followed by VOIP, program guides, and optical networking. However, because "program guides" was not included as a category in 2013, it is not clear whether the category truly became less popular or is now simply called something else.

Figure 4: Categories with Biggest Decreases in Representation Since 2010


 

What does this mean?

The data themselves have certain problems that make drawing strong conclusions difficult. For example, counting exhibitors and categories implicitly assumes that each exhibitor is identical in size and importance, which clearly is not true (Figure 5). Additionally, the categories are self-reported by the exhibitors and do not appear to have strict definitions. Exhibitors have no incentive to select grossly inaccurate categories, since that would attract people unlikely to purchase their products, but exhibitors probably tend towards being overly inclusive so as not to miss potential clients. This tendency might bias the counts towards especially popular technologies. For example, perhaps exhibitors take liberties in claiming they offer "cloud services" because that category contains a popular buzzword rather than because their products truly offer much in the way of those services.

Despite these shortcomings in the data, they provide one source of information on where economic actors with money at stake think the industry is headed over the next year. And, according to them, this year the industry is trending more towards its traditional role as video provider, focusing on programming and multi-screen content.

Figure 5: Exhibitor Map, 2013 Cable Show


 

Unleashing the Potential of Mobile Broadband: What Julius Missed

By Tom Lenard
March 7th, 2013

In yesterday’s Wall Street Journal op-ed, FCC Chairman Genachowski correctly focuses on the innovation potential of mobile broadband.  For that potential to be realized, he points out, the U.S. needs to make more spectrum available.  A spectrum price index developed by my colleague, Scott Wallsten, demonstrates what most observers believe – that spectrum has become increasingly scarce over the last few years.

The Chairman’s op-ed highlights three new policy initiatives the FCC and the Obama Administration are taking in an attempt to address the spectrum scarcity:  (1) the incentive auctions designed to reclaim as much as 120 MHz of high-quality broadcast spectrum for flexibly licensed – presumably, mobile broadband – uses;   (2) freeing up the TV white spaces for unlicensed uses; and (3) facilitating sharing of government spectrum by private users.

There are two notable omissions from the Chairman’s list.  First, he does not mention the 150 MHz of mobile satellite service (MSS) spectrum, which has been virtually unused for over twenty years due to gross government mismanagement.  A major portion of this spectrum, now licensed to three firms – LightSquared, Globalstar, and Dish – could quickly be made available for mobile broadband uses. The FCC is now considering a proposal from LightSquared that would enable at least some of its spectrum to be productively used.  That proposal should be approved ASAP.  The MSS spectrum truly represents the low-hanging fruit and making it available should be given the same priority as the other items on the Chairman’s list.

Second, if the FCC and NTIA truly want to be innovative with respect to government spectrum, they should focus on the elusive task of developing a system that requires government users to face the opportunity cost of the spectrum they use.  This is currently not the case, which is a major reason why it is so difficult to get government users to relinquish virtually any of the spectrum they control.  To introduce opportunity cost into government decision making, Larry White and I have proposed the establishment of a Government Spectrum Ownership Corporation (GSOC). A GSOC would operate similarly to the General Services Administration (GSA).  Government agencies would pay a market-based “rent” for spectrum to the GSOC, just as they do now to the GSA for the office space and other real estate they use.  Importantly, the GSOC could then sell surplus spectrum to the private sector (as the GSA does with real estate). The GSOC would hopefully give government agencies appropriate incentives to use spectrum efficiently, just as they now have that incentive with real estate.  This would be a true innovation.

In the short run, administrative mechanisms are probably a more feasible way to make more government spectrum available.  For example, White and I also proposed cash prizes for government employees who devise ways their agency can economize on its use of spectrum.  This would be consistent with other government bonuses that reward outstanding performance.

Sharing of government spectrum is a second-best solution.  It would be far better if government used its spectrum more efficiently and more of it was then made exclusively available to private sector users.  This is, admittedly, a difficult task, but worth the Administration’s efforts.

Life on the Dark Side of Network Effects: Why I Ditched My Windows Phone

By Scott Wallsten
January 2nd, 2013

For consumers, 2012 was a great year in wireless. Carriers rolled out 4G networks in earnest and smartphone competition heated up. Apple’s iPhone 5 release was no surprise. But no longer was Android relegated primarily to low-end phones. Ice Cream Sandwich received strong reviews and Samsung launched high end Android devices like the Galaxy S3 that rivaled the iPhone. Microsoft kept plugging away at the margins and introduced Windows Phone 8 with a new partner in Nokia, which had seen better days. For its part, RIM provided investors with numerous opportunities to short its stock.

I love gadgets. Especially new gadgets. So I eagerly awaited the day my wireless contract expired so I could participate in the ritual biennial changing of the phone. (I wish I could change it more frequently, but I wait to qualify for a subsidized upgrade because we also have to do things like occasionally buy food for the kids). But what phone to choose?

The iPhone 5 was mostly well-received, and even early skeptics like Farhad Manjoo wrote that once you held it you realized how awesome it was. Still, even though it had become a cliche critique, to me it just looked like a taller iPhone, not a newer iPhone, and I wanted to get something that felt really new, not really tall. Am I a little shallow for rejecting an upgrade for that reason? Yes, yes I am.

So after reading rave reviews and talking to friends who had already upgraded, I got the Samsung Galaxy S3.

I hated it.

The Android lock screen customizations and widgets should have made me happy, but they didn’t. I couldn’t find a setup I liked. The Samsung’s hardware didn’t work for me. Buttons on both the right and on the left sides of the phone meant that every time I tried to press the button on the right I would also press the button on the left, screwing up whatever important task I was doing (OK, maybe that “important task” was Angry Birds, but still). Those aren’t inherent criticisms of Android or the Galaxy S3. They’re just my own quirks. (It isn’t you, it’s me).

Finally I got so frustrated with my phone that one day I hopped off the Metro on my way to work, went to the nearest AT&T store, returned it, and re-activated my old iPhone 4.

My first reaction to reanimating my 4 was relief that I could once again operate the phone properly. My second reaction was, “holy **** this screen is tiny!” I was sure my iPhone 4 had turned itself into a Nano out of spite while languishing unused.

After that, the Nokia Lumia 920 with Windows Phone 8 caught my eye. Great reviews (including this thoughtful and thorough review by a self-professed “iPhone-loving Apple fangirl” at Mashable who switched to the Lumia for two weeks), beautiful phone. And those “Live Tiles” on the home screen! No more old-fashioned grid-style icons. This, finally, was something new.

I wanted to love it. I tried to love it. I brought it home to meet my family. Some features are wonderful. The People Hub, in particular, combines Facebook, Twitter, and LinkedIn feeds in a nicely readable format. Nokia helped by developing a suite of apps for it and making great hardware. The phone is a nice size, has an excellent camera and a two-LED flash (which makes it the most versatile, if not the most powerful, $450 flashlight on the market). And while some reviews have complained about its heft, I appreciate a phone that can be used for self-defense.

But at the end of the day — and after the return period, natch — I just couldn’t handle being on the wrong side of the network effects.

Network Effects

Network effects come in two flavors: direct and indirect. With direct network effects, every user benefits as other users adopt the technology. Old-fashioned voice telephones are the classic example. If you own the only phone it is worthless because you can’t call anybody. But when the next person buys a phone you immediately benefit because now you can call him or her. (Unless you can’t stand that person, in which case his phone reduces the value of your phone to you, especially since with only two phones in the world it’s not like you can just change your number).

Direct network effects aren’t a big issue with smartphones for most people. You can call any number from any device. (Though for curmudgeons like me that’s increasingly a cost rather than a benefit. Why am I expected to drop everything I’m doing and answer the phone just because someone else decided it was time to chat?) Popular apps like Facebook and Twitter, whose value derives from the size of their networks, are platform-agnostic, at least with respect to hardware and operating systems, so each user gets the benefit of additional users regardless of the (hardware and OS) platform.[1]

But, like The Force, indirect network effects are all-powerful among mobile operating systems. To paraphrase Obi-Wan Kenobi, “…indirect network effects are what give a smartphone its power….They surround us and penetrate us; they bind the users and app developers together.”

In other words, because the vast majority of all potential customers are on iOS or Android devices, it makes sense for developers to build apps for those platforms. If apps are successful there, then maybe it’s worth building apps for a small platform like Windows Phone. Those general incentives are true whether you are the proverbial kid in the garage or Google.

These incentives apparently even affect developers at Microsoft. While Microsoft seems to be putting significant resources into the Windows Phone operating system, it’s not clear that other Microsoft developers share the love. For example, although Microsoft owns Skype, the Windows Phone 8 Skype app was not available when the first phones went on sale. Skype is still only a “preview” app in the Windows Phone store.

As a result, Windows Phone users get the short end of the app stick.

To be sure, the Windows Phone store is far from empty, and some people will find everything they need. Certain apps I rely on, like Evernote and Expensify, are there and work well.

But, overall, the Windows Phone store feels like a dollar store in Chinatown. It has a lot of stuff–75,000 new apps added in 2012, according to Microsoft–but when you look closely you realize they're selling Sherple pens rather than Sharpie pens. Sometimes the Sherple pen works fine. For example, Microsoft promised to deliver a Pandora app sometime in 2013, but in the meantime users can rely on the "MetroRadio" app, which somehow manages to play Pandora stations. God bless those third-party developers for stepping in and making popular services available to those of us who love them so much we're willing to pay any price, as long as the price is zero. But third-party apps can stop working anytime the original source changes something, and it feels like being a second-class citizen in the app world.

Small platforms also have problems at the high and low end of the app ecosystem. Windows Phone is missing certain hugely popular apps like Instagram.  At the same time, because of the small customer base, the odds of this month's hot new app being readily available on (much less originating on) Windows Phone are tiny.

http://www.youtube.com/watch?v=Nn-dD-QKYN4

Relying on a competitor can be OK if you have some power

Even worse, not only does Microsoft need to overcome its network effects disadvantage in order to succeed, it must also have good access to products developed by its arch-nemesis, Google.

Relying on a competitor isn’t inherently disastrous. Apple clearly benefits from the excellent products Google makes for iOS. Recent stories have even suggested that some of Google’s iOS products are better than its companion Android products. There is no love lost between Google and Apple, but Google apparently needs Apple’s huge customer base as much as Apple needs Google.

That's not to say such cooperation is easy or without risk. Apple buys chips for its mobile devices from archrival Samsung, but has become wary of relying on a competitor for such a crucial part of its golden goose. Similarly, Netflix relies on Amazon's AWS data facilities for its video streaming, even though they compete in the video delivery market. That relationship, too, makes some uneasy, as when Netflix service went down over Christmas and Amazon's did not. Nevertheless, Amazon and Netflix apparently believe each has enough to gain by working with the other that the relationship continues despite such hiccups.

But with only about two percent of the market, Microsoft is but a fart in a mobile windstorm. Even if Windows Phone were not a potential competitor to Android, it’s hard to make a business case for Google to care one whit about Windows Phone today. That is, Google faces the same lack of incentive to develop apps for Windows Phone that all developers face. And, given that Windows Phone is trying to compete with Android, it’s hard to come up with a good reason why Google should invest in the Windows Phone platform. In other words, Microsoft needs Google but Google doesn’t need Microsoft.

And Google's lack of need for Windows Phone shows. YouTube doesn't work well on Windows Phone, Gmail in the Windows Phone web browser looks like it was designed for an old feature phone, and Google itself offers only one lonely app–a basic search app–in the Windows Phone store.[2]

This isn’t anti-competitive behavior by Google by a long shot. The small number of Windows Phone users means that Google is unlikely to earn much of a return on investments in Windows Phone. And given that those returns are likely to be even lower if the investments help the Windows platform succeed, it becomes difficult, indeed, to see a reason for Google to invest much. If Windows Phone acquires enough users to generate sufficient ad revenues, however, you can bet Google will develop apps for it.

A New Hope

A third mobile platform could still succeed, despite these obstacles. Overcoming them will require enormous resources, and Microsoft, with an estimated $66 billion in cash, clearly has them. Whether it will deploy those resources effectively remains to be seen. IMO, more resources spent on developing apps and fewer on embarrassingly bad ads might be an effective approach.

Like I said, I wanted to love my Lumia 920. And I want this new platform to succeed–more competition is good. I just don’t want to see it enough to suffer on the wrong side of the network effects in the meantime.

My iPhone 5 comes tomorrow. Don’t tell my wife I used her upgrade.

____________

[1] There are exceptions, of course. For example, Apple's FaceTime and Find Friends apps work only on Apple devices, but–much to Apple's dismay, I'm sure–these do not appear to have had much effect on aggregate sales, at least in part because of close cross-platform substitutes like Skype and Google Latitude.

[2]Again, some third-party developers come partly to the rescue. Gmaps Pro, for example, provides a wonderful Google Maps experience on Windows Phones.

Unintended—But Not Necessarily Bad—Consequences of the 700 MHz Open Access Provisions

By Scott Wallsten
November 6th, 2012

Wireless data pricing has been evolving almost as rapidly as new wireless devices are entering the marketplace. The FCC has mostly sat on the sidelines, watching developments but not intervening.

Mostly.

Last summer, the FCC decided that Verizon was violating the open access rules of the 700 MHz spectrum licenses it purchased in 2008 by charging customers an additional $20 per month to tether their smartphones to other devices. Verizon paid the fine and allowed tethering on all new data plans.[1]

Much digital ink has been spilled regarding how to choose the shared data plan best tailored to families with a myriad of wireless devices and varying demands for data. Very little, however, appears to have been said about individual plans and, more specifically, about those targeted to light users.

One change that has gone largely unnoticed is that Verizon effectively abandoned the post-paid market for light users after the FCC decision.

Verizon no longer offers individual plans. Even consumers with only a single smartphone must purchase a shared data plan. That’s sensible from Verizon’s perspective since mandatory tethering means that Verizon effectively cannot enforce a single-user contract. The result is that Verizon no longer directly competes for light users.

The figure below shows the least amount of money a consumer can pay each month on a contract at the major wireless providers. As the table below the figure highlights, the figure does not present an apples-to-apples comparison, but that’s not the point—the point is to show the choices facing a user who wants voice and data, but the smallest possible amount of each.

Note: Assumes no data overages.

The figure shows that this thrifty consumer could spend $90/month at Verizon, $60/month at AT&T, $70/month at T-Mobile, and $65/month at Sprint if the consumer is willing to purchase voice/text and data plans separately. Even Verizon’s prepaid plan, at $80/month, costs more than the others’ cheapest postpaid plans.

Moreover, prior to the shift to “share everything” plans, this consumer could have purchased an individual plan from Verizon for $70/month—$20/month less than he could today. At AT&T the price was $55/month but increased by only $5/month. Again, the point is not to show that one plan is better than another. Verizon’s cheapest plan offers 2 GB of data, unlimited voice and texts, and tethering while AT&T’s cheapest plan offers 300 MB of data, 450 voice minutes, and no texts or tethering. Which plan is “better” depends on the consumer’s preferences. Instead, the point is to show the smallest amount of money a light user could spend on a postpaid plan at different carriers, and that comparison reveals that Verizon’s cheapest option is significantly more expensive than other post-paid options and, moreover, increased significantly with the introduction of the shared plan.
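As a rough illustration, the before-and-after comparison above can be summarized with the dollar figures cited in this post (these are the post's numbers as of late 2012, not a complete price survey, and the plans differ in what they include):

```python
# Cheapest postpaid voice + data option for a light user, using the figures cited above.
# "before" refers to the old individual plans; "after" to the current cheapest option.
cheapest_monthly = {
    "Verizon":  {"before": 70, "after": 90},
    "AT&T":     {"before": 55, "after": 60},
    "T-Mobile": {"before": None, "after": 70},  # earlier figure not cited in the post
    "Sprint":   {"before": None, "after": 65},  # earlier figure not cited in the post
}

for carrier, p in cheapest_monthly.items():
    change = f"+${p['after'] - p['before']}" if p["before"] is not None else "n/a"
    print(f"{carrier:9s} ${p['after']}/month  (change vs. individual plans: {change})")
```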

Is the FCC’s Verizon Tethering Decision Responsible for this Industry Price Structure?

There's no way to know for sure. The rapidly increasing ubiquity of households with multiple wireless devices means that shared data plans were probably inevitable. And carriers compete on a range of criteria other than just price, including network size, network quality, and handset availability, to name a few.

Nevertheless, Verizon introduced its “share everything” plans about a month before the FCC’s decision. If we make the not-so-controversial assumption that Verizon knew it would be required to allow “free” tethering before the decision was made public and that individual plans would no longer be realistic for it, then the timing supports the assertion that “share everything” was, at least in part, a response to the rule.

How Many Customers Use These “Light” Plans?

Cisco estimated that in 2011 the average North American mobile connection "generated" 324 megabytes. The average for 2012 will almost surely be higher, and higher still among those with higher-end phones. Still, even average use close to 1 GB would imply a large number of consumers who could benefit from light-use plans, whether or not they actually buy them.

Did the FCC’s Tethering Decision Benefit or Harm Consumers?

It probably did both.

The consumer benefits: First, Verizon customers who want to tether their devices can do so without an extra charge. Second, AT&T and Sprint followed Verizon in offering shared data plans, with AT&T's shared plans also including tethering. Craig Moffett of AllianceBernstein noted recently that "Family Share plans are not, as has often been characterized, price increases. They are price cuts…"[2] because the plans allow consumers to allocate their data more efficiently. As a result, he notes, investors should worry that the plans will reduce revenues. In other words, the shared plans on balance probably represent a shift from producer to consumer surplus.

The consumer costs: Verizon is no longer priced competitively for light users.

The balance: Given that other carriers still offer postpaid plans to light users and that a plethora of prepaid and other non-contract options exist for light users, the harm to consumers from Verizon’s exit is probably small, while the benefits to consumers may be nontrivial. In other words, the net effect was most likely a net benefit to consumers.

What Does This Experience Tell Us?

The FCC’s decision and industry reaction should serve as a gentle reminder to those who tend to favor regulatory intervention: even the smallest interventions can have unintended ripple effects. Rare indeed is the rule that affects only the firm and activity targeted and nothing else. More specifically, rules that especially help the technorati—those at the high end of the digital food chain—may hurt those at the other end of the spectrum.

But those who tend to oppose regulatory intervention should also take note: not all unintended consequences are disastrous, and some might even be beneficial.

Is That a Unique Observation?

Not really.

Could I Have Done Something Better With My Time Instead of Reading This?

Maybe. Read this paper to find out.


[1] The FCC allowed Verizon to continue charging customers with grandfathered “unlimited” data plans an additional fee for tethering.

[2] Moffett, Craig. The Perfect Storm. Weekend Media Blast. AllianceBernstein, November 2, 2012.

Is a broadband tax a good idea?

By Scott Wallsten
August 30th, 2012

The FCC recently asked for comments on a proposal to raise money for universal service obligations by taxing broadband connections. Let’s set aside, for the moment, the question of whether the universal service program has worked (it hasn’t), whether it is efficient (it isn’t), and whether the reforms will actually improve it (they won’t). Instead, let’s focus on the specific question of whether taxing broadband is the best way to raise money for any given program telecommunications policymakers want to fund.

The answer, in typical economist fashion, is that it depends.

A tax is generally evaluated on two criteria: efficiency and equity. The more “deadweight loss” the tax causes, the more inefficient it is. Deadweight loss results from people changing their behavior in response to the tax and, in principle, can be calculated as a welfare loss.
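For reference, a standard textbook approximation (not derived in the post, and assuming for simplicity that supply is perfectly elastic so the tax is fully passed through to consumers) ties that deadweight loss to the price elasticity of demand:

$$\text{DWL} \;\approx\; \tfrac{1}{2}\,\varepsilon\,\frac{t^{2}}{p}\,q,$$

where $t$ is the per-unit tax, $p$ and $q$ are the pre-tax price and quantity, and $\varepsilon$ is the absolute price elasticity of demand. Holding the tax fixed, a less price-sensitive (smaller $\varepsilon$) service generates less deadweight loss, which is why the elasticity of broadband demand matters for the argument below.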

Closely related to efficiency is how the tax affects policy goals. This question is particularly relevant here because the service being taxed is precisely the service the tax is supposed to support, making it possible that the tax itself could undo any benefits of the spending it funds.

Equity—in general, how much people of different income levels pay—is simple in concept but difficult in practice since it is not possible to say what the “right” share of the tax any given group should pay.

Perhaps surprisingly to some, a broadband tax may actually be efficient relative to some other options, including income taxes (i.e., coming from general revenues). Historically, universal service funds were raised by taxes on long distance service, which is highly price sensitive, making the tax quite inefficient.

By contrast, for the typical household, fixed (and, increasingly, mobile) broadband has likely become quite inelastic. In 2010, one study estimated that the typical household was willing to pay about $80 per month for “fast” broadband service, while the median monthly price for that service was about $40. Since then, the number of applications and available online services has increased, meaning that consumer willingness to pay has presumably also increased, while according to the Bureau of Labor Statistics broadband prices have remained about the same.

Consumer Price Index, Internet Services and Electronic Information Providers

Source: Bureau of Labor Statistics, Series ID CUUR0000SEEE03, adjusted so January 2007=100

While no recent study has specifically evaluated price elasticity, the large gap between prices and willingness to pay suggests that a tax of any size likely to be considered might not be hugely inefficient overall.

The problem is that even if the tax does not affect subscription decisions by most people, it can still affect precisely the population policymakers want to help. Even though only 10 percent of people who do not have broadband cite price as the barrier, there is some lower price at which people will subscribe. A tax effectively increases the price consumers pay, meaning that it puts people at that margin—people who may be on the verge of subscribing—that much further away from deciding broadband is worthwhile. Similarly, people on the other side of that margin—those who believe broadband is worthwhile, but just barely—will either cancel their subscriptions or subscribe to less robust offerings.
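To make the marginal-subscriber point concrete, here is a toy simulation; the willingness-to-pay distribution, the $5 tax, and the subscriber counts it produces are all illustrative assumptions, not estimates from the post or the studies it cites:

```python
import random

random.seed(0)

# Toy model: each household has a willingness to pay (WTP) for broadband, and it
# subscribes if WTP >= the monthly price. The WTP distribution is assumed, not data.
N = 100_000
wtp = [random.gauss(80, 30) for _ in range(N)]  # centered on the $80 figure cited above

def subscribers(price):
    return sum(w >= price for w in wtp)

price, tax = 40.0, 5.0  # $40 median price from the study cited above; $5 tax is arbitrary

base = subscribers(price)
taxed = subscribers(price + tax)
print(f"Subscribers at ${price:.0f}: {base:,}")
print(f"Subscribers at ${price + tax:.0f}: {taxed:,}")
print(f"Households priced out by the tax: {base - taxed:,} ({(base - taxed) / base:.1%})")
```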

To be sure, people who would be eligible for low-income support would probably receive more in subsidies than they pay in taxes, but this is an absurdly inefficient way to connect more people. As one astute observer noted, it is not merely like trying to fill a leaky bucket, but perhaps more like trying to fill that bucket upside-down through the holes.[1]

Higher prices for everyone highlight the equity problem. A connection tax is the same for everyone regardless of income, making the tax regressive. The tax becomes even more regressive because much of the money collected goes to rural residents regardless of their income, while everyone pays regardless of income, meaning the tax includes a transfer from the urban poor to the rural rich.

Even without an income test, methods exist to mitigate the equity problem. Unfortunately, the methods the FCC proposes are likely to undermine other policy goals. In particular, the FCC asks about the effects of taxing by tier of service, presumably with higher tiers of service paying more (paragraph 249). The FCC does not specifically mention equity in its discussion, but if higher-income people are more likely to have faster connections, then taxing by tier would help mitigate equity issues.

This tiered tax approach is commonly used for other services, including electricity and water, where a low tier of use is taxed at a low rate, and higher usage rates are taxed incrementally more. Therefore, in the case of water, for example, a family that uses water only for cooking and cleaning will pay a lower tax rate than a family that also waters its lawn and fills a swimming pool. And while it is not a perfect measure of income, in general wealthier people are more likely to have big lawns and pools.
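A minimal sketch of how such a tiered levy might be computed, mirroring the water-bill example; the tier boundaries and rates here are invented for illustration, since the FCC's notice does not specify any:

```python
# Hypothetical tiered levy on monthly broadband data usage. Each tier's rate applies
# only to the usage that falls within that tier, as with tiered water or electricity bills.
TIERS = [
    (50.0, 0.02),          # first 50 GB taxed at $0.02 per GB
    (150.0, 0.05),         # next 100 GB (up to 150 GB) at $0.05 per GB
    (float("inf"), 0.10),  # usage above 150 GB at $0.10 per GB
]

def tiered_tax(usage_gb):
    tax, lower = 0.0, 0.0
    for upper, rate in TIERS:
        if usage_gb <= lower:
            break
        tax += (min(usage_gb, upper) - lower) * rate
        lower = upper
    return tax

for usage in (20, 100, 300):  # a light, a typical, and a heavy household
    print(f"{usage} GB/month -> ${tiered_tax(usage):.2f} tax")
```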

The problem with this approach in broadband is that while willingness to pay for "fast" broadband is relatively high, most people are not yet willing to pay much more at all for "very fast" broadband. Thus, taxing higher tiers of service at a higher rate, while more equitable, may lead to other efficiency problems if it reduces demand for higher tiers of service.

So what’s the solution?

The FCC should decide which objective is most important: efficiency, equity, or other policy goals such as inducing more people to subscribe or upgrade their speeds, and then design the tax system that best achieves that goal. It should then compare this "best" tax to simply funding the program from general revenues and determine which would lead to the better outcome.

But no tax is worthwhile if the program it supports is itself inefficient and inequitable. The real solution is to dramatically reduce spending on ineffective universal service programs in order to minimize the amount of money needed to fund them. Unfortunately, the reforms appear to do just the opposite. In 2011, the high cost fund spent $4.03 billion and had been projected to decrease even further. The reforms, however, specified that spending should not fall below $4.5 billion (see paragraph 560 of the order), meaning that the first real effect of the reforms was to increase spending by a half billion dollars. And, as the GAO noted, the FCC “has not addressed its inability to determine the effect of the fund and lacks a specific data-analysis plan for carrier data it will collect” and “lacks a mechanism to link carrier rates and revenues with support payments.”

The right reforms include integrating true, third-party evaluation mechanisms into the program and, given the vast evidence of inefficiency and ineffectiveness, a future path of steady and significant budget cuts. Those changes, combined with an efficient tax-collection method, might yield a program that efficiently targets those truly in need.


[1] This excellent analogy comes from Greg Rosston via personal communications.