Using Search Results to Fight Piracy

By Michael Smith
September 15th, 2014

With the growing consensus in the empirical literature that piracy harms sales, and emerging evidence that increased piracy can affect both the quantity and quality of content produced (here and here, for example), governments and industry partners are exploring a variety of ways to reduce the harm caused by intellectual property theft. In addition to graduated response efforts and site shutdowns, Internet intermediaries such as Internet Service Providers, hosting companies, and web search engines are increasingly being asked to play a role in limiting the availability of pirated content to consumers.

However, for this to be a viable strategy, it must first be the case that these sorts of efforts influence consumers’ decisions to consume legally. Surprisingly, there is very little empirical evidence one way or the other on this question.

In a recent paper, my colleagues Liron Sivan, Rahul Telang and I used a field experiment to address one aspect of this question: Does the prominence of pirate and legal sites in search results impact consumers’ choices for infringing versus legal content? Our results suggest that reducing the prominence of pirate links in search results can reduce copyright infringement.

To conduct our study, we first developed a custom search engine that allows us to experimentally manipulate which results are shown in response to user search queries. We then studied how changing which sites are listed in search results affected the consumption behavior of a panel of users drawn from the general population and a separate panel of college-aged participants.

In our experiments, we first randomly assigned users to one of three groups: a control group shown the same search results they would receive from a major search engine, and two treatment groups in which pirate sites were artificially promoted or artificially demoted in the displayed search results. We then asked users to obtain a movie they were interested in watching, using our search engine instead of the one they would normally use. We observed the queries each set of users issued to search for their chosen movie, and surveyed them about which site they used to obtain it.
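
The paper describes the manipulation only at a high level. As a rough, hypothetical sketch (the domain list, function names, and assignment scheme below are illustrative assumptions, not the study's actual implementation), an experiment like this needs two pieces: random assignment of each participant to a condition, and re-ranking of the underlying engine's results according to that condition.

```python
import random

# Hypothetical sketch: assign each panelist to a condition, then re-rank the
# underlying engine's results according to that condition.
PIRATE_DOMAINS = {"pirate-example.com", "another-pirate.example"}  # assumed labels

def assign_condition(user_id, seed=42):
    """Randomly place a user in the control, pirate-promoted, or pirate-demoted group."""
    rng = random.Random(f"{seed}-{user_id}")
    return rng.choice(["control", "promote_pirate", "demote_pirate"])

def rerank(results, condition):
    """Reorder (url, domain) result pairs based on the user's assigned condition."""
    if condition == "control":
        return results  # pass through the unmodified search results
    pirate = [r for r in results if r[1] in PIRATE_DOMAINS]
    legal = [r for r in results if r[1] not in PIRATE_DOMAINS]
    return pirate + legal if condition == "promote_pirate" else legal + pirate
```

The sketch only makes the promote/demote logic concrete; in the study itself, the control condition mirrored the results of a major search engine.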

Our results suggest that changing the prominence of pirate and legal links has a strong impact on user choices: relative to the control condition, users are more likely to consume legally (and less likely to infringe copyright) when legal content is more prominent in search results, and more likely to consume pirated content when pirate content is more prominent.

By analyzing users’ initial search terms we find that these results hold even among users with an apparent predisposition to pirate: users whose initial search terms indicate an intention to consume pirated content are more likely to use legal channels when pirated content is harder to find in search results.

Our results suggest that reducing the prominence of pirate links in search results can reduce copyright infringement. We also note that there is both precedent and available data for this sort of response. In terms of precedent, search engines are already required to block a variety of information, including content from non-FDA approved pharmacies in the U.S. and content that violates an individual’s “right to be forgotten” in a variety of EU countries. Likewise, the websites listed in DMCA notices give search engines some of the raw data necessary to determine which sites are most likely to host infringing content.

Thus, while more research and analysis is needed to craft effective policy, we believe that our experimental results provide important initial evidence that users’ choices for legal versus infringing content can be influenced by what information they are shown, and thus that search engines can play a role in the ongoing fight against intellectual property theft.


Does Piracy Undermine Product Creation?

By Michael Smith
September 5th, 2014

(Below is a guest post by my colleague, Rahul Telang from Carnegie Mellon University)

That piracy undermines demand for products in copyright industries is intuitive and well supported by data. Music, movies, books, and software have all seen demand degradation due to various forms of piracy. What is not so well supported by data is whether piracy undermines product creation. For example, does piracy reduce the number of movies made, the quality of those movies, or investment in movies? Common sense suggests that it must. After all, this is the core principle of copyright: large-scale copyright infringement should affect revenues, which in turn should affect producers' incentives to create.

Despite this compelling argument, the data do not readily support the claim. The reasons are many. For one, while the drop in demand due to infringement happens quickly, production adjustments take time, so unless the infringement persists for a period of time, the contraction in production is not readily visible. Moreover, the technology that leads to widespread infringement (say, P2P networks and the broadband infrastructure that facilitates online piracy) may arrive during a period when the costs of production and distribution are declining or new markets are opening up. All we can see in the data is the net effect of these opposing forces, and that net effect could very well be that production has actually increased. That is not evidence that piracy does not matter. Finally, there may be distributional bottlenecks (say, the number of theaters) that prevent growth in the number of productions but lead to larger investments in movies or, in some cases, higher input costs (actors and directors become more expensive).

In short, to see the effects of piracy in the data, we need a setting where other factors are largely unchanged. My co-author Joel Waldfogel and I explore the Indian movie industry around the diffusion of cable television and the VCR, which took place during 1985-2000. The paper is here. The story of our paper, from the abstract, is essentially this:

The diffusion of the VCR and cable television in India between 1985 and 2000 created substantial opportunities for unpaid movie consumption. We first document, from narrative sources, conditions conducive to piracy as these technologies diffused. We then provide strong circumstantial evidence of piracy in diminished appropriability: movies’ revenues fell by a third to a half, conditional on their ratings by movie-goers and their ranks in their annual revenue distributions. Weaker effective demand undermined creative incentives. While the number of new movies released had grown steadily from 1960 to 1985, it fell markedly between 1985 and 2000, suggesting a supply elasticity in the range of 0.2-0.7.

Even quality, as measured by IMDb ratings, declined substantially. Thus, our study provides affirmative evidence on a central tenet of copyright policy: that stronger effective copyright protection encourages more creation. For empirical research, sometimes you have to look at the historical context to see the effect of a policy. Doing a similar study in the post-2000 era for another country might be tricky because the other competing factors have changed, and there will be a need to be more creative in defining and measuring product creation in this new context. I am sure we will see such efforts in the near future. Needless to say, a lot more research is needed to settle this issue. However, our paper does provide evidence that, in an appropriate setting, the effects of copyright infringement on product creation can be measured.
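
As a back-of-the-envelope illustration of what the quoted elasticity range implies (rough arithmetic for intuition, not a calculation from the paper), the revenue decline reported in the abstract can be translated into an implied decline in the number of movies produced:

```python
# Rough arithmetic, not from the paper: implied drop in the number of movies
# for a given revenue decline, assuming a constant supply elasticity.
revenue_declines = [0.33, 0.50]   # revenues fell "by a third to a half"
elasticities = [0.2, 0.7]         # supply elasticity range from the abstract

for rev in revenue_declines:
    for eps in elasticities:
        implied_output_drop = eps * rev   # %ΔQ ≈ elasticity × %Δ(revenue)
        print(f"revenue down {rev:.0%}, elasticity {eps}: "
              f"output falls roughly {implied_output_drop:.0%}")
```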

2014 TPI Aspen Forum has Ended, but the Videos Live On…

By Amy Smorodin
August 22nd, 2014

Did you miss the Aspen Forum this year?  Or, do you just want to watch some of the panels again?  Videos of the panels and keynotes from the 2014 event are now up on the TPI website.

Some highlights from Monday night and Tuesday:

Comcast’s David Cohen was the Monday night dinner speaker.  In front of a packed room, Cohen spoke about the benefits of the Comcast/TWC deal, vertical and horizontal integration in the industry in general, and even revealed what keeps him up at night (hint: it’s not the communications industry).  His speech can be viewed here.

First up on Tuesday morning was a panel on copyright moderated by Mike Smith, TPI Senior Adjunct Fellow and Professor at Carnegie Mellon.  “Copyright Protection: Government vs. Voluntary Arrangements” featured Robert Brauneis from GW Law School, the Center for Copyright Information’s Jill Lesser, Jeff Lowenstein from the Office of Congressman Schiff, Shira Perlmutter from USPTO, and NYU’s Chris Sprigman. Panelists discussed the copyright alert system, the state of the creative market in general, and the perennial question of what can be done to reduce piracy.  Video of the spirited panel can be viewed here.

Next up was the panel, “Internet Governance in Transition:  What’s the Destination?” moderated by Amb. David Gross.  The pretty impressive group of speakers discussed issues surrounding the transition of ICANN away from the loose oversight provided by the U.S. Dept. of Commerce.  Participants were ICANN Chair Steve Crocker, Reinhard Wieck from Deutsche Telekom, Shane Tews from AEI, Amb. Daniel Sepulveda, the U.S. Coordinator for International Communications and Information Policy, and NYU’s Lawrence White.  Video is here.

Finally, the Forum concluded with a panel on “Data and Trade,” moderated by TPI’s Scott Wallsten.  The panelists discussed how cybersecurity, local privacy laws, and national security issues are barriers to digital trade.  Speakers were USITC Chairman Meredith Broadbent, Anupam Chander from University of CA, Davis, PPI’s Michael Mandel, Joshua Meltzer from Brookings, and Facebook’s Matthew Perault.  Video of the discussion is here.

We hope all attendees and participants at the TPI Aspen Forum found it interesting, educational, and enjoyable.  We hope to see you next year!

Dispatch from the TPI Aspen Forum

By Amy Smorodin
August 18th, 2014

Sunday, August 17

Last night, we kicked off our 2014 Aspen Forum in lovely Aspen, Colorado.

Congressman Scott Tipton welcomed attendees to his home state (and his home district).  In his remarks, Tipton discussed the importance of tech in growing small business and the economic impact of regulations, which he estimated to cost $1.8 billion a year.  Rep. Tipton also discussed the importance of broadband penetration in rural areas.

Video of his speech, and short remarks from TPI President Thomas Lenard and TPI Board Member Ray Gifford, can be found here.

Monday, August 18

The first full day of the TPI Aspen Forum began with a discussion on “The Political Economy of Telecom Reform,” moderated by TPI’s Scott Wallsten.

Former Congressman Rick Boucher, now a Partner at Sidley Austin, explained that during the debates over the 1996 telecom act, the issues were not partisan in nature.  However, he identified a sticking point that now seems to be drawn along party lines: network neutrality.  He would like to see net neutrality dealt with separately, prior to the start of any real push for telecom reform in Congress, in hopes that lawmakers will have an easier time finding common ground.

Peter Davidson from Verizon stated that there does not seem to be as much consensus among players in the communications industry as there was during the last push for telecom reform.  However, he did express that the threat of Title II regulation may drive many to band together.

Roger Noll from Stanford University declared the big winners in the ‘96 Act “were people who make a living manipulating regulatory processes.”  He also said such a thing was less likely to happen with any new telecom reform act because there are many more players – not just traditional wired communications companies – who know how to mobilize politically.

Philip Weiser, Dean of the University of Colorado Law School, stated that the communications sector is going to see a lot of innovation in the next few years despite the static telecom reform act. In any new reform act, Congress should stick to high-level principles to enable ongoing innovation.  In other words, Congress needs to show restraint.

Video of the entire discussion can be viewed here.

More summaries of today’s panels and tonight’s keynote dinner speech by Comcast’s David Cohen will be posted soon. Videos of everything will also be posted on the TPI YouTube page just as soon as we can get them up.

Stay tuned!

The Expendables 3 Leak and the Financial Impact of Pre-Release Piracy

By Michael Smith
July 25th, 2014

This past week a DVD-quality copy of the movie The Expendables 3 leaked online three weeks before its planned U.S. theatrical release. According to Variety, the film was downloaded 189,000 times within 24 hours. As researchers, we immediately asked: how much of a financial impact could movie-makers face from such pre-release piracy?

The effect of piracy on the sales of movies and other copyrighted works has long been scrutinized, with the vast majority of peer-reviewed academic papers concluding that piracy negatively impacts sales. Indeed, in a recent National Bureau of Economic Research book chapter, my co-authors and I reviewed the academic literature, and showed that 16 of the 19 papers published in peer-reviewed academic journals find that piracy harms media sales.

But less well understood is the impact of pre-release movie piracy, which could be particularly harmful to box office revenue because it appears at a time when there are no legal channels for anxious fans to consume the movie. Because of this, seeing a movie appear online before it appears in theaters sends chills down the spines of studio executives given the investment in human and financial capital necessary to produce the typical studio film.

To better understand the impact of this particular form of piracy, my colleagues and I conducted a study to measure the impact of pre-release piracy on box office revenue. Our study was accepted for publication last month in the peer-reviewed journal Information Systems Research, making it the first peer-reviewed journal article we are aware of to analyze the impact of pre-release movie piracy.

In our study we applied standard statistical models for predicting box office revenue, but added a variable for whether a movie leaked onto piracy networks prior to its theatrical release, using data obtained from the site VCDQ.com. Our analysis concluded that, on average, pre-release movie piracy results in a 19% reduction in box office revenue relative to what would have occurred if piracy were only available after the movie’s release. As we discuss in the paper, this result is robust to a variety of different empirical approaches and sensitivity tests.
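
The paper's actual specification is richer than this, but as a hypothetical sketch of the general approach (the variable names and the handful of data rows below are invented for illustration, not the study's data), a pre-release leak indicator can be added to an otherwise standard box-office regression:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative only: each row is a movie, with invented values for (log) box
# office revenue, standard predictors, and a pre-release leak dummy.
movies = pd.DataFrame({
    "log_box_office":   [17.2, 16.1, 18.0, 15.3, 16.8, 17.5, 15.9, 16.4],
    "log_budget":       [18.1, 16.9, 18.5, 16.0, 17.4, 18.0, 16.5, 17.0],
    "screens":          [3200, 2100, 3600, 1500, 2800, 3300, 1900, 2400],
    "sequel":           [1, 0, 1, 0, 0, 1, 0, 1],
    "pre_release_leak": [0, 0, 1, 0, 1, 0, 1, 0],
})

# OLS of log box office on standard controls plus a dummy for whether the
# title leaked before its theatrical release.
model = smf.ols(
    "log_box_office ~ log_budget + screens + sequel + pre_release_leak",
    data=movies,
).fit()
print(model.params["pre_release_leak"])
```

The coefficient on the leak dummy captures the revenue gap associated with a pre-release leak, holding the other predictors constant.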

The growing consensus in the academic literature regarding financial harm from digital piracy provides an important backdrop to active policy debates about the best options for addressing this threat. We have seen governments and industry adopt various anti-piracy measures in recent years, from government-sponsored graduated response laws, site blocking, and site shutdowns, to market-based responses by rights holders and industry-level partnerships such as the Copyright Alert System in the United States.

At next month’s TPI Aspen Forum I am pleased to be chairing a panel of industry, legal, and policy experts to discuss the effectiveness and appropriateness of these initiatives to better serve the interests of the creative sector, the technology industries, and society as a whole. However, what seems to require no discussion is that digital piracy of this type can dramatically reduce sales.

Takeaways from the White House Big Data Reports

By Tom Lenard
May 5th, 2014

On May 1, the White House released its two eagerly-awaited reports on “big data” resulting from the 90-day study President Obama announced on January 17—one by a team led by Presidential Counselor John Podesta, and a complementary study by the President’s Council of Advisors on Science and Technology (PCAST).  The reports contain valuable detail about the uses of big data in both the public and private sector.  At the risk of oversimplifying, I see three major takeaways from the reports.

First, the reports recognize big data’s enormous benefits and potential.  Indeed, the Podesta report starts out by observing that “properly implemented, big data will become an historic driver of progress.”  It adds, “Unprecedented computational power and sophistication make possible unexpected discoveries, innovations, and advancements in our quality of life.”  The report is filled with examples of the value of big data in medical research and health care delivery, education, homeland security, fraud detection, improving efficiency and reducing costs across the economy, as well as in providing targeted information to consumers and the raw material for the advertising-supported internet ecosystem.  The report states that the “Administration remains committed to supporting the digital economy and the free flow of data that drives its innovation.”

Second, neither report provides any actual evidence of harms from big data.  While the reports provide concrete examples of beneficial uses of big data, the harmful uses are hypothetical.  Perhaps the most publicized conclusion of the Podesta report concerns the possibility of discrimination—that “big data analytics have the potential to [italics added] eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace.”  However, the two examples of discrimination cited turn out to be almost non-examples.

The first example involves StreetBump, a mobile application developed to collect information about potholes and other road conditions in Boston.  Even before its launch the city recognized that this app, by itself, would be biased toward identifying problems in wealthier neighborhoods, because wealthier individuals would be more likely to own smartphones and make use of the app.  As a result, the city adjusted accordingly to ensure reporting of road conditions was accurate and consistent throughout the city.

The second example involves the E-verify program used by employers to check the eligibility of employees to work legally in the United States.  The report cites a study that “found the rate at which U.S. citizens have their authorization to work be initially erroneously unconfirmed by the system was 0.3 percent, compared to 2.1 percent for non-citizens.  However, after a few days many of these workers’ status was confirmed.”  It seems almost inevitable that the error rate for citizens would be lower since citizens automatically are eligible to work, whereas additional information is needed to confirm eligibility for non-citizens (i.e., evidence of some sort of work permit).  Hence, it is not clear this is an example of discrimination.

It is notable that both these examples are of government activities.  The reports do not present examples of commercial uses of big data that discriminate against particular groups.  To the contrary, the PCAST report notes the private-sector use of big data to help underserved individuals with loan and credit-building alternatives.

Finally, and perhaps most importantly, both reports indicate that the Fair Information Practice Principles (FIPPs) that focus on limiting data collection are increasingly irrelevant and, indeed, harmful in a big data world.  The Podesta report observes that “these trends may require us to look closely at the notice and consent framework that has been a central pillar of how privacy practices have been organized for more than four decades.”  The PCAST report notes, “The beneficial uses of near-ubiquitous data collection are large, and they fuel an increasingly important set of economic activities.  Taken together, these considerations suggest that a policy focus on limiting data collection will not be a broadly applicable or scalable strategy—nor one likely to achieve the right balance between beneficial results and unintended negative consequences (such as inhibiting economic growth).”  The Podesta report suggests examining “whether a greater focus on how data is used and reused would be a more productive basis for managing privacy rights in a big data environment.”  The PCAST report is even clearer:

Policy attention should focus more on the actual uses of big data and less on its collection and analysis.  By actual uses, we mean the specific events where something happens that can cause an adverse consequence or harm to an individual or class of individuals….By contrast, PCAST judges that policies focused on the regulation of data collection, storage, retention, a priori limitations on applications, and analysis…are unlikely to yield effective strategies for improving privacy.  Such policies would be unlikely to be scalable over time, or to be enforceable by other than severe and economically damaging measures.

In sum, the two reports have much to like:  their acknowledgement of the importance and widespread use of big data and their attempt, particularly in the PCAST report, to refocus the policy discussion in a more productive direction.  The reports also, however, suffer from a lack of evidence to substantiate their claim of harms.

Where Do Vendors To Cable Think The Industry Is Heading? Evidence From Cable Show Data In 2014

By Scott Wallsten
April 25th, 2014

Scott Wallsten and Corwin Rhyan

For the past five years we have collected data about the exhibitors at the annual NCTA Cable Show from its website.  Each year we analyze trends in the industry through the categories used to classify the exhibitors.  Key observations this year include:

» The number of exhibitors continues to fall, as it has in each of the past four years, from 345 in 2010 to 241 in 2014 (Figure 1).

» Cable Programming, Video on Demand, IPTV, and Multi-Screen Content are the four most popular categories in 2014. The top three increased in popularity since last year, while Multi-Screen Content decreased slightly (Figure 2).

» New categories this year include RDK (Reference Design Kit)[1] and Content Search/Navigation Systems, each with over 10 exhibitors in their first year. Fiber was absent in 2013 but has a few exhibitors in 2014 (Figure 4).

» Games, Consultants, and Research & Development show some of the largest year-over-year increases from 2013 to 2014, after Video on Demand and IPTV (Figure 5).

» Four previously popular categories are absent from the 2014 show—HDTV, New Networks, tru2way, and VOIP. Other notable decliners include 3D TV and Mobile Apps. These highlight some of the difficulties in interpreting the results without other information: HDTV likely disappeared because it is now ubiquitous, while 3D TV disappeared because it has generally been a market disappointment (Figure 7).

Number of Participants

The number of exhibitors in 2014 is down over 30% from 2010. After a large drop in 2011, the count has declined at a more moderate 3% annual rate over the last three years. The number of exhibitors is biased slightly upward because an exhibitor with multiple booth locations is classified as two separate exhibitors in our data. However, the number of duplicates over the years is relatively small and consistent.

Figure 1: Number of Exhibitors 2010-2014


Hot Tech This Year

The Cable Show allows its exhibitors to describe their companies with categorical labels that signal to potential customers the types of products and services offered.  An exhibitor can select multiple categories for its products.  In 2014, the average number of categories per exhibitor was 3.87, down slightly from 4.33 in 2013.  In general, we expect exhibitors to classify their products as generally and widely as possible, in the hope of attracting interested attendees to their booths.  To normalize the data for year-over-year comparisons, we divide the number of exhibitors in each category by the total number of exhibitors, yielding the percentage of exhibitors that select each category.  The top 20 categories are listed below for the last three years, with over a third of all exhibitors selecting Cable Programming this year.
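
As a simple illustration of that normalization (the category counts and the 2013 total below are made up for the example, not our actual figures), each category's share is just its exhibitor count divided by the total number of exhibitors in that year:

```python
import pandas as pd

# Hypothetical counts of exhibitors selecting each category, by year.
counts = pd.DataFrame(
    {"2013": {"Cable Programming": 85, "Video on Demand": 70, "IPTV": 60},
     "2014": {"Cable Programming": 84, "Video on Demand": 72, "IPTV": 63}}
)
total_exhibitors = pd.Series({"2013": 249, "2014": 241})  # 2013 total is assumed

# Share of exhibitors selecting each category = category count / total exhibitors.
shares = counts.div(total_exhibitors, axis="columns")
print(shares.round(3))
```

Dividing by the yearly totals keeps the categories comparable even as overall attendance shrinks.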

Figure 2: Most Popular Categories 2012 – 2014


In graphical form below, we plot the trends of this year’s top 5 most popular categories over the past 5 years.  While many of these categories have traditionally been near the top, most have grown over the past 5 years. 

Figure 3: Top 5 Most Popular Categories


While we cannot rule any particular hypothesis in or out based on these data, it is worth noting that the large increase in programming-related exhibitors coincides with unprecedented increases in the retransmission fees cable companies pay to programmers. Entry into this market as prices increase would be consistent with economic theory.

What’s In and Out in 2014

The categories used to classify products and services change regularly.  The new categories used in 2014 are listed in Figure 4. Some are similar to previous categories, such as Content Search/Navigation Services, which likely evolved from the separate Content Navigation category, while others, like RDK (Reference Design Kit), arrive with little previous background.

Figure 4: New 2014 Categories


Many of the most popular 2013 categories continued to gain ground in 2014, with Video on Demand, IPTV, and Cable Programming showing strong gains.  Games and Consultants showed a strong increase in representation as well.  A complete list of the top gainers in 2014 is shown in Figure 5.  Some gainers declined in 2013 but returned with stronger showings in 2014.  This list includes categories such as Video on Demand, Billing, Internet TV Providers, IPTV, and Telecommunications Providers.  A chart of these categories is shown in Figure 6.

Figure 5: Biggest 2013-2014 Gainers


Figure 6: Categories that switched to growth in 2014


At the same time, some categories disappear between years.  In 2014 some notable categories are no longer present.

The categories that declined in 2014 included many that disappeared from the list completely, such as HDTV, New Networks, tru2way, and VOIP. Several theories could explain these disappearances: perhaps a category became so ubiquitous as to be meaningless in the context of a trade show (e.g., HDTV or VOIP); perhaps show organizers dropped it because it overlapped too heavily with other categories (was “New Networks” the same as “Program Networks”?); or perhaps it is simply no longer relevant.

Other notable decliners include once up-and-coming technologies such as 3D TV, Mobile Apps, and Social TV. A decline in a category is probably easier to interpret than an outright disappearance: 3D television, for example, has been a notable market disappointment, and it is no surprise to see it fading from the show.

Figure 7: Biggest 2013-2014 Losers


Figure 8: Categories that switched to decline in 2014


Conclusion

Using data from the Cable Show’s exhibitors is attractive because the exhibitors are actors in the industry with real money on the line.  In a tech world that loves to exaggerate the next “big thing,” data drawn directly from industry members may provide a better sense of where the industry is headed.  However, these data must be used with caution.  First, the categories are self-reported by exhibitors, and while exhibitors have a clear incentive to categorize their products and services accurately, some might also see an advantage in identifying with hyped technologies to attract customers. Second, the analysis weights each exhibitor identically, which clearly isn’t accurate, as some booths are massive and staffed by dozens of people while others are little more than a table and the company owner (Figure 9).

Despite these shortcomings, the data show a continued trend toward a cable industry more focused on its traditional role as a television service provider, with programming, television, video, and networks topping our list in 2014, while the hyped technologies that were supposed to revolutionize the cable industry in 2012 and 2013 fell back in 2014.

Figure 9: 2014 Cable Show Floor Plan



[1] According to www.rdkcentral.com, RDK is “a pre-integrated software bundle that provides a common framework for powering customer-premises equipment (CPE) from TV service providers, including set-top boxes, gateways, and converged devices.”

Comcast and Netflix—What’s the Big Deal?

By Tom Lenard
February 26th, 2014

Netflix and Comcast recently announced an agreement whereby Netflix will pay Comcast for direct access to its network.  This agreement addresses congestion that is slowing delivery of Netflix videos to Comcast’s broadband subscribers and resolves a dispute between the two companies over how to pay for the needed network upgrades.  Netflix and Verizon are currently working through a similar dispute.  While some commentators think deals such as the one between Netflix and Comcast are problematic, the reality is that the agreement reflects a common market transaction that yields a more efficient outcome, more quickly, than any regulatory intervention could have.

The following series of stylized figures illustrates how the growth of Netflix and other streaming video services has affected the volume and flow of internet traffic, and the corresponding payments, in recent years.  Traditionally (Figure 1), Internet backbone providers and ISPs entered into “peering” agreements, which did not call for payments on either side, reflecting a relatively balanced flow of traffic.  Content distributors paid backbone providers for “transit,” reflecting the unbalanced flow of traffic along that route.

[Figure 1]

With the growth of online video and with Netflix accounting for 30 percent of traffic at some times of the day, this system was bound to become strained, as we are now seeing and as shown in Figure 2.  The flow of traffic between the backbone provider and the ISP is unbalanced and has grown enormously, requiring investments in additional capacity.

[Figure 2]

One way to address this problem is for the backbone provider to pay the ISP, reflecting the greater amount of traffic (and greater capacity needed) going in that direction (see Figure 3).  In fact, that is what happened following a dispute between Level 3 and Comcast in late 2010.

[Figure 3]

Another solution is the just-announced Comcast-Netflix deal, reflected in Figure 4.  In this case, Netflix and Comcast are bypassing the intermediate backbone provider (either partially or completely), presumably because it is more efficient to do so.  One or both of them is investing in the needed capacity.  Regulatory interference with such a deal runs the risk of blocking an advance that would lower costs and/or raise quality for consumers.

[Figure 4]

The Wall Street Journal has described the debate as being “over who should bear the cost of upgrading the Internet’s pipes to carry the nation’s growing volume of online video:  broadband providers like cable and phone companies, or content companies like Netflix, which make money by sending news or entertainment through those pipes.”  Ultimately, of course, consumers pay one way or the other.  When Netflix pays Comcast, the cost is passed through to Netflix subscribers.  This is both efficient and fair, because the consumer of Netflix services is paying for the cost of that service.

In the absence of such an agreement, quality would suffer or the ISP would bear the cost.  The ISP might recover these costs by increasing prices to subscribers generally.  This would involve a cross-subsidy of Netflix subscribers by non-subscribers, which would be neither efficient nor fair.  Alternatively, Comcast could increase prices for those subscribers who consume a lot of bandwidth, which might have similar effects to the just-announced deal, but would probably lose some efficiencies.  In any event, it is difficult to see how such an arrangement would be better for consumers than the announced agreement.


The FCC Tries Yet Again

By Tom Lenard
February 19th, 2014

FCC Chairman Tom Wheeler’s official response to the DC Appeals Court decision on the Commission’s “net neutrality” rules promises to keep the issue on the table for the foreseeable future.  That is unfortunate, because there are better ways for the Commission and its staff to spend their time.

The Appeals Court took away from the Commission with one hand, while giving back with the other:  It struck down the more onerous provisions of the net neutrality rules—the “anti-discrimination” and “anti-blocking” provisions—because they imposed common carrier obligations and broadband is not classified as a Title II common carrier service.  However, the Court affirmed the Commission’s argument that it has general authority (under section 706 of the Communications Act) to regulate in order to encourage broadband deployment.

Since the Appeals Court decision came down, the FCC has been under considerable pressure from net neutrality proponents to reclassify broadband as a Title II common carrier service.  In today’s announcement, the Commission declined to do that. However, the Commission also declined to close the Title II docket, keeping alive the threat of reclassification and the regulatory burdens and oversight that come with it.

In addition, the Commission announced its intention to start yet another net neutrality rulemaking, under its section 706 authority, to fulfill its no-blocking and non-discrimination goals and to enhance the transparency rule (the one major provision the court upheld).

With all the activity aimed at asserting legal justification for its net neutrality rules, it sometimes gets lost that the FCC had no convincing economic or consumer welfare justification for the rules in the first place.

While there is widespread agreement that the Internet should be open and provide consumers with access to content, applications and services of their choice, the rule was always a solution in search of a problem, a sentiment echoed today by FCC Commissioner Pai.  The Commission never provided the necessary data and analysis to show that the rules would address a significant market failure, did not identify harms to users that the rules would remedy, and did not demonstrate that the benefits of the rules would exceed their costs.  In other words, the Commission neglected to explain why the broadband market, which has generally thrived under minimal regulation, should now be subject to an enhanced regulatory regime.   Indeed, a good argument can be made that, by making the adoption of innovative business models more difficult, the rules would have hindered rather than encouraged the deployment of broadband infrastructure, notwithstanding the Commission’s assertions to the contrary.

There is now substantial concern that the Appeals Court has expanded the Commission’s authority to include the entire Internet ecosystem—including potentially content, applications, and service providers—as long as it can make some plausible argument that its actions encourage broadband deployment.  Expanding the Commission’s domain in this way would be a serious mistake and would compound the harm.

A major goal of the Commission in promulgating its net neutrality rules initially was to “provide greater predictability.”  It clearly has not achieved that goal.  Starting yet another proceeding, and keeping the Title II docket open, will create even more uncertainty for the entire Internet ecosystem.

Chairman Rockefeller and Data Brokers

By Amy Smorodin
September 26th, 2013

Chairman Rockefeller recently sent letters to a dozen companies seeking details on how they share consumer information with third parties.  The letters are an extension of previous requests sent to “data brokers” asking for clarification of the companies’ “data collection, use and sharing practices.”  In the letters, the Chairman opines that the privacy policies on many websites “appear to leave room for sharing a consumer’s information with data brokers or other third parties who in turn may share with data brokers.”  He also stresses the importance of transparent privacy practices for consumers.

While a call for more information and data is certainly commendable, one should ask, “Where is this all going?”  Is the Chairman suddenly seeing the need for data to inform policymaking in this area?

While we would hope so, the Chairman’s letter implies an assumption that there is something inherently harmful about data collection and sharing, although this harm is not explicitly described.  He also posits that consumers may not be aware that their information is being collected or how it is being used.  Again, no information is offered on how this conclusion was reached.

Overall, more data to inform privacy policymaking would be a good thing.  As Tom Lenard has pointed out in filings, Congressional testimony, and a recent book chapter submission, the last comprehensive survey of privacy policies was conducted back in 2001, a lifetime ago in the technology industry.  Ideally, any privacy proposals from Congress or the FTC should be based on a survey of actual current practices on the ground, as opposed to opinions and assumptions.  Only with relevant data can policies be drafted that are targeted toward specific harms.  Additionally, data-driven policymaking can be evaluated to ensure that a specific policy is performing as intended and that the benefits derived outweigh the costs of the regulation.

Data collection is burdensome and time consuming for the companies involved. Any other government entity (besides Congress) would be required under the Paperwork Reduction Act to have such a proposal assessed, as agencies are required to “reduce information collection burdens on the public.” Since it doesn’t appear that Rockefeller’s recent requests for information are part of any systematic study or plan, it is understandable why some companies would bristle at the thought of spending time and resources answering a list of questions.

The FTC recently conducted its own inquiry in preparation for a study on “big data” and the privacy practices of data brokers.  One hopes the study, expected out by the end of the year, is well designed and offers an objective look at the industry without a predetermination of results. Such a study would be useful going forward.