2014 TPI Aspen Forum has Ended, but the Videos Live On…

By Amy Smorodin
August 22nd, 2014

Did you miss the Aspen Forum this year?  Or, do you just want to watch some of the panels again?  Videos of the panels and keynotes from the 2014 event are now up on the TPI website.

Some highlights from Monday night and Tuesday:

Comcast’s David Cohen was the Monday night dinner speaker.  In front of a packed room, Cohen spoke about the benefits of the Comcast/TWC deal, vertical and horizontal integration in the industry in general, and even revealed what keeps him up at night (hint: it’s not the communications industry).  His speech can be viewed here.

First up on Monday morning was a panel on copyright moderated by Mike Smith, TPI Senior Adjunct Fellow and Professor at Carnegie Mellon.  “Copyright Protection: Government vs. Voluntary Arrangements” featured Robert Brauneis from GW Law School, the Center for Copyright Information’s Jill Lesser, Jeff Lowenstein from the Office of Congressman Schiff, Shira Perlmutter from USPTO and NYU’s Chris Sprigman. Panelists discussed the copyright alert system, the state of the creative market in general, and the perennial question of what can be done to reduce piracy.  Video of the spirited panel can be viewed here.

Next up was the panel, “Internet Governance in Transition:  What’s the Destination?” moderated by Amb. David Gross.  The pretty impressive group of speakers discussed issues surrounding the transition of ICANN away from the loose oversight provided by the U.S. Dept. of Commerce.  Participants were ICANN Chair Steve Crocker, Reinhard Wieck from Deutsche Telekom, Shane Tews from AEI, Amb. Daniel Sepulveda, the U.S. Coordinator for International Communications and Information Policy, and NYU’s Lawrence White.  Video is here.

Finally, the Forum concluded with a panel on “Data and Trade,” moderated by TPI’s Scott Wallsten.  The panelists discussed how cybersecurity, local privacy laws, and national security issues are barriers to digital trade.  Speakers were USITC Chairman Meredith Broadbent, Anupam Chander from University of CA, Davis, PPI’s Michael Mandel, Joshua Meltzer from Brookings, and Facebook’s Matthew Perault.  Video of the discussion is here.

We hope all attendees and participants at the TPI Aspen Forum found it interesting, educational, and enjoyable.  We hope to see you next year!

Dispatch from the TPI Aspen Forum

By Amy Smorodin
August 18th, 2014

Sunday, August 17

Last night, we kicked off our 2014 Aspen Forum in lovely Aspen, Colorado.

Congressman Scott Tipton welcomed attendees to his home state (and his home district).  In his remarks, Tipton discussed the importance of tech in growing small business and the economic impact of regulations, which he estimated to cost $1.8 billion a year.  Rep. Tipton also discussed the importance of broadband penetration in rural areas.

Video of his speech, and short remarks from TPI President Thomas Lenard and TPI Board Member Ray Gifford, can be found here.

Monday, August 18

The first full day of the TPI Aspen Forum began with a discussion on “The Political Economy of Telecom Reform,” moderated by TPI’s Scott Wallsten.

Former Congressman Rick Boucher, now a Partner at Sidley Austin, explained that when the 1996 telecom act was crafted, the issues were not partisan in nature.  However, he identified a sticking point that now seems to be drawn along party lines: network neutrality.  He would like to see net neutrality dealt with separately prior to the start of any real push for telecom reform in Congress, in hopes that lawmakers will have an easier time finding common ground.

Peter Davidson from Verizon stated that there does not seem to be as much consensus among players in the communications industry as there was during the last push for telecom reform.  However, he did express that the threat of Title II regulation may drive many to band together.

Roger Noll from Stanford University declared the big winners in the ‘96 Act “were people who make a living manipulating regulatory processes.”  He also said such a thing was less likely to happen with any new telecom reform act because there are many more players – not just traditional wired communications companies – who know how to mobilize politically.

Philip Weiser, Dean of the University of Colorado Law School, stated that the communications sector is going to see a lot of innovation in the next few years despite the static telecom act. In any new reform act, Congress should stick to high-level principles to enable ongoing innovation.  In other words, Congress needs to show restraint.

Video of the entire discussion can be viewed here.

More summaries of today’s panels and tonight’s keynote dinner speech by Comcast’s David Cohen will be posted soon. Videos of everything will also be posted on the TPI YouTube page just as soon as we can get them up.

Stay tuned!

The Expendables 3 Leak and the Financial Impact of Pre-Release Piracy

By Michael Smith
July 25th, 2014

This past week a DVD-quality copy of the movie The Expendables 3 leaked online three weeks before its planned U.S. theatrical release. According to Variety, the film was downloaded 189,000 times within 24 hours. As researchers, our immediate question was: how much of a financial impact could movie-makers face from such pre-release piracy?

The effect of piracy on the sales of movies and other copyrighted works has long been scrutinized, with the vast majority of peer-reviewed academic papers concluding that piracy negatively impacts sales. Indeed, in a recent National Bureau of Economic Research book chapter, my co-authors and I reviewed the academic literature, and showed that 16 of the 19 papers published in peer-reviewed academic journals find that piracy harms media sales.

But less well understood is the impact of pre-release movie piracy, which could be particularly harmful to box office revenue because it appears at a time when there are no legal channels for anxious fans to consume the movie. Because of this, seeing a movie appear online before it appears in theaters sends chills down the spines of studio executives given the investment in human and financial capital necessary to produce the typical studio film.

To better understand the impact of this particular form of piracy, my colleagues and I conducted a study to measure the impact of pre-release piracy on box office revenue. Our study was accepted for publication last month in the peer-reviewed journal Information Systems Research, making it the first peer-reviewed journal article we are aware of to analyze the impact of pre-release movie piracy.

In our study we applied standard statistical models for predicting box office revenue, but added a variable for whether a movie leaked onto piracy networks prior to its release, using data obtained from the site VCDQ.com. Our analysis concluded that, on average, pre-release movie piracy results in a 19% reduction in box office revenue relative to what would have occurred if piracy were only available after the movie’s release. As we discuss in the paper, this result is robust to a variety of different empirical approaches and sensitivity tests.
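As a rough illustration of this kind of specification (not the paper’s actual model or data; every variable and number below is made up except the 19% headline estimate), a leak indicator can be added to a standard log-revenue regression:

```python
# Illustrative sketch, NOT the paper's actual model or data: a log-revenue
# regression with a dummy for whether the film leaked before release.
import numpy as np

rng = np.random.default_rng(0)
n = 200
budget = rng.uniform(10, 200, n)        # hypothetical production budgets ($M)
screens = rng.uniform(500, 4000, n)     # hypothetical opening screen counts
leaked = rng.integers(0, 2, n)          # 1 = DVD-quality copy leaked pre-release

# Simulate revenue with a built-in ~19% penalty for leaked films
# (19% is the paper's headline estimate; everything else is invented).
log_rev = (1.0 + 0.8 * np.log(budget) + 0.5 * np.log(screens)
           + np.log(1 - 0.19) * leaked + rng.normal(0, 0.3, n))

# OLS via least squares: coefficients on [const, log budget, log screens, leaked]
X = np.column_stack([np.ones(n), np.log(budget), np.log(screens), leaked])
beta, *_ = np.linalg.lstsq(X, log_rev, rcond=None)

effect = np.exp(beta[3]) - 1            # % change in revenue implied by a leak
print(f"Estimated revenue effect of a pre-release leak: {effect:.1%}")
```

On the simulated data, the recovered coefficient on the leak dummy implies a revenue reduction close to the built-in 19%, which is the sense in which such a dummy-variable specification measures the effect of pre-release piracy.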

The growing consensus in the academic literature regarding financial harm from digital piracy provides an important backdrop to active policy debates about the best options for addressing this threat. We have seen governments and industry adopt various anti-piracy measures in recent years, from government-sponsored graduated response laws, site blocking, and site shutdowns to market-based responses by rights holders and industry-level partnerships such as the Copyright Alert System in the United States.

At next month’s TPI Aspen Forum I am pleased to be chairing a panel of industry, legal, and policy experts to discuss the effectiveness and appropriateness of these initiatives to better serve the interests of the creative sector, the technology industries, and society as a whole. However, what seems to require no discussion is that digital piracy of this type can dramatically reduce sales.

Takeaways from the White House Big Data Reports

By Tom Lenard
May 5th, 2014

On May 1, the White House released its two eagerly-awaited reports on “big data” resulting from the 90-day study President Obama announced on January 17—one by a team led by Presidential Counselor John Podesta, and a complementary study by the President’s Council of Advisors on Science and Technology (PCAST).  The reports contain valuable detail about the uses of big data in both the public and private sector.  At the risk of oversimplifying, I see three major takeaways from the reports.

First, the reports recognize big data’s enormous benefits and potential.  Indeed, the Podesta report starts out by observing that “properly implemented, big data will become an historic driver of progress.”  It adds, “Unprecedented computational power and sophistication make possible unexpected discoveries, innovations, and advancements in our quality of life.”  The report is filled with examples of the value of big data in medical research and health care delivery, education, homeland security, fraud detection, improving efficiency and reducing costs across the economy, as well as in providing targeted information to consumers and the raw material for the advertising-supported internet ecosystem.  The report states that the “Administration remains committed to supporting the digital economy and the free flow of data that drives its innovation.”

Second, neither report provides any actual evidence of harms from big data.  While the reports provide concrete examples of beneficial uses of big data, the harmful uses are hypothetical.  Perhaps the most publicized conclusion of the Podesta report concerns the possibility of discrimination—that “big data analytics have the potential to [italics added] eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace.”  However, the two examples of discrimination cited turn out to be almost non-examples.

The first example involves StreetBump, a mobile application developed to collect information about potholes and other road conditions in Boston.  Even before its launch the city recognized that this app, by itself, would be biased toward identifying problems in wealthier neighborhoods, because wealthier individuals would be more likely to own smartphones and make use of the app.  As a result, the city adjusted accordingly to ensure reporting of road conditions was accurate and consistent throughout the city.

The second example involves the E-Verify program used by employers to check the eligibility of employees to work legally in the United States.  The report cites a study that “found the rate at which U.S. citizens have their authorization to work initially erroneously unconfirmed by the system was 0.3 percent, compared to 2.1 percent for non-citizens.  However, after a few days many of these workers’ status was confirmed.”  It seems almost inevitable that the error rate for citizens would be lower, since citizens automatically are eligible to work, whereas additional information is needed to confirm eligibility for non-citizens (i.e., evidence of some sort of work permit).  Hence, it is not clear this is an example of discrimination.

It is notable that both these examples are of government activities.  The reports do not present examples of commercial uses of big data that discriminate against particular groups.  To the contrary, the PCAST report notes the private-sector use of big data to help underserved individuals with loan and credit-building alternatives.

Finally, and perhaps most importantly, both reports indicate that the Fair Information Practice Principles (FIPPs) that focus on limiting data collection are increasingly irrelevant and, indeed, harmful in a big data world.  The Podesta report observes that “these trends may require us to look closely at the notice and consent framework that has been a central pillar of how privacy practices have been organized for more than four decades.”  The PCAST report notes, “The beneficial uses of near-ubiquitous data collection are large, and they fuel an increasingly important set of economic activities.  Taken together, these considerations suggest that a policy focus on limiting data collection will not be a broadly applicable or scalable strategy—nor one likely to achieve the right balance between beneficial results and unintended negative consequences (such as inhibiting economic growth).”  The Podesta report suggests examining “whether a greater focus on how data is used and reused would be a more productive basis for managing privacy rights in a big data environment.”  The PCAST report is even clearer:

Policy attention should focus more on the actual uses of big data and less on its collection and analysis.  By actual uses, we mean the specific events where something happens that can cause an adverse consequence or harm to an individual or class of individuals….By contrast, PCAST judges that policies focused on the regulation of data collection, storage, retention, a priori limitations on applications, and analysis…are unlikely to yield effective strategies for improving privacy.  Such policies would be unlikely to be scalable over time, or to be enforceable by other than severe and economically damaging measures.

In sum, the two reports have much to like:  their acknowledgement of the importance and widespread use of big data and their attempt, particularly in the PCAST report, to refocus the policy discussion in a more productive direction.  The reports also, however, suffer from a lack of evidence to substantiate their claim of harms.

Where Do Vendors To Cable Think The Industry Is Heading? Evidence From Cable Show Data In 2014

By Scott Wallsten
April 25th, 2014

Scott Wallsten and Corwin Rhyan

For the past five years we have collected data about the exhibitors at the annual NCTA Cable Show from its website.  Each year we analyze trends in the industry through the categories used to classify the exhibitors.  Key observations this year include:

       »      The number of exhibitors continues to fall as it has in each of the past 4 years, from 345 in 2010 to 241 in 2014 (Figure 1).

       »      Cable Programming, Video on Demand, IPTV, and Multi-Screen Content are the first, second, third, and fourth most popular categories in 2014. The top three increased in popularity since last year, while Multi-Screen Content decreased slightly (Figure 2).

       »      New popular categories this year included RDK (Reference Design Kit)[1] and Content Search/Navigation Systems, with each having over 10 exhibitors in their first year. Fiber was absent in 2013, but has a few exhibitors in 2014 (Figure 4).

       »      Games, Consultants, and Research & Development show some of the largest year over year increases from 2013-2014 after Video on Demand, and IPTV (Figure 5).

       »      Four previously popular categories are absent from the 2014 show—HDTV, New Networks, tru2way, and VOIP. Other notable decliners include 3D TV and Mobile Apps. These highlight some difficulties in interpreting the results without other information. HDTV likely disappeared because it is so ubiquitous while 3D TV disappeared because it has generally been a market disappointment (Figure 7).

Number of Participants

The number of exhibitors in 2014 is down over 30% from 2010. After a large drop in 2011, the count has declined at a more moderate 3% annual rate over the last three years. The number of exhibitors is biased slightly upward because an exhibitor with multiple booth locations is counted as two separate exhibitors in our data. However, the number of duplicates over the years is relatively small and consistent.
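The headline figures can be checked with a few lines of arithmetic, using only the 2010 and 2014 totals from the text:

```python
# Check the decline cited above: 345 exhibitors in 2010 vs. 241 in 2014.
exhibitors_2010, exhibitors_2014 = 345, 241

total_drop = 1 - exhibitors_2014 / exhibitors_2010
print(f"Total decline, 2010-2014: {total_drop:.1%}")    # just over 30%

# Compound annual rate over the full four-year span; this is larger than
# the ~3% rate of the last three years because of the big 2011 drop.
annual_rate = 1 - (exhibitors_2014 / exhibitors_2010) ** (1 / 4)
print(f"Implied average annual decline: {annual_rate:.1%}")
```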

Figure 1: Number of Exhibitors 2010-2014


Hot Tech This Year

The Cable Show allows its exhibitors to define their companies by categorical labels which signal to potential customers the types of products and services offered.  An exhibitor can select multiple categories for its products.  In 2014, the average number of categories per exhibitor was 3.87, down slightly from 4.33 in 2013.  In general, we expect exhibitors to classify their products as broadly as possible, in the hope of attracting interested attendees to their booths.  To normalize the data for year-over-year comparisons, we divide the number of exhibitors in each category by the total number of exhibitors, yielding the percentage of exhibitors that select each category.  The top 20 categories are listed below for the last 3 years, with Cable Programming defining over a third of all exhibitors this year.
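The normalization is a simple share calculation; a minimal sketch with hypothetical category counts (only the 241 exhibitor total comes from the text):

```python
# Sketch of the normalization described above: raw category counts divided
# by the total number of exhibitors, so shares are comparable across years.
# The 241 total is from the text; the category counts are hypothetical.
total_exhibitors = 241

category_counts = {
    "Cable Programming": 85,   # illustrative: "over a third of all exhibitors"
    "Video on Demand": 60,
    "IPTV": 55,
}

shares = {cat: count / total_exhibitors for cat, count in category_counts.items()}
for cat, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{cat}: {share:.1%}")
```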

Figure 2: Most Popular Categories 2012 – 2014


In graphical form below, we plot the trends of this year’s top 5 most popular categories over the past 5 years.  While many of these categories have traditionally been near the top, most have grown over the past 5 years. 

Figure 3: Top 5 Most Popular Categories


While we cannot rule any particular hypothesis in or out based on these data, it is worth noting that the large increase in programming-related exhibitors coincides with unprecedented increases in the retransmission fees cable companies pay to programmers. It would be consistent with economic theory to see entry into this market as prices increase.

What’s In and Out in 2014

The categories used to classify products and services change regularly.  The new categories used in 2014 are listed in Figure 4. Some are similar to previous categories, such as Content Search/Navigation Services, which likely evolved from the separate category of Content Navigation, while others come with little previous background like RDK (Reference Design Kit).

Figure 4: New 2014 Categories


Many of the most popular 2013 categories continued to gain ground in 2014, with Video on Demand, IPTV, and Cable Programming showing strong gains.  Games and Consultants showed a strong increase in representation as well.  A complete list of the top gainers in 2014 is shown in Figure 5.  Some gainers declined in 2013 but returned with stronger showings in 2014.  This list includes categories such as Video on Demand, Billing, Internet TV Providers, IPTV, and Telecommunications Providers.  A chart of these categories is shown in Figure 6.

Figure 5: Biggest 2013-2014 Gainers


Figure 6: Categories that switched to growth in 2014


At the same time, some categories disappear between years.  In 2014 some notable categories are no longer present.

The categories that declined in 2014 included many that disappeared from the list completely, such as HDTV, New Networks, tru2way, and VOIP. Several theories could explain these disappearances: perhaps some categories became so ubiquitous as to be meaningless in the context of a trade show (e.g., HDTV or VOIP); perhaps show organizers dropped a category because it overlapped too heavily with others (e.g., was “New Networks” the same as “Program Networks”?); or perhaps the category is simply no longer relevant.

Other notable decliners include once “up and coming” technologies such as 3D TV, Mobile Apps, and Social TV. A decrease in a category is probably easier to interpret than an outright disappearance. 3D television, for example, has been a notable market disappointment and it is no surprise to see it disappearing from the show.

Figure 7: Biggest 2013-2014 Losers


Figure 8: Categories that switched to decline in 2014



Using data from the Cable Show’s exhibitors is advantageous because it is representative of the actors in the industry who have real money on the line.  In a tech world that loves to exaggerate the next “big thing”, using data directly from industry members might help provide a better understanding of where the industry is headed.  However, this data must be used with caution.  First, the categories are self-reported by exhibitors, and while they have a clear incentive to categorize their products and services accurately, some might also see advantages in identifying with certain hyped industry technologies to attract customers.  Second, the analysis weights each exhibitor equally, which clearly isn’t accurate, as some booths are massive and staffed by dozens of people while others are little more than a table and the company owner (Figure 9).

Despite these shortcomings, the data show a continued trend toward a cable industry more focused on its traditional role as a television service provider, with programming, television, video, and networks topping our list in 2014, while the hyped technologies that were set to revolutionize the cable industry in 2012 and 2013 fell in 2014.

Figure 9: 2014 Cable Show Floor Plan


[1] According to www.rdkcentral.com, RDK is “a pre-integrated software bundle that provides a common framework for powering customer-premises equipment (CPE) from TV service providers, including set-top boxes, gateways, and converged devices.”

Comcast and Netflix—What’s the Big Deal?

By Tom Lenard
February 26th, 2014

Netflix and Comcast recently announced an agreement whereby Netflix will pay Comcast for direct access to its network.  This agreement addresses congestion that is slowing delivery of Netflix videos to Comcast’s broadband subscribers and resolves a dispute between the two companies over how to pay for the needed network upgrades.  Netflix and Verizon are currently working through a similar dispute.  While some commentators think deals such as the one between Netflix and Comcast are problematic, the reality is that the agreement reflects a common market transaction that yields an outcome more efficiently and more quickly than any regulatory intervention could have.

The following series of stylized figures illustrates how the growth of Netflix and other streaming video services has affected the volume and flow of internet traffic and the corresponding payments in recent years.  Traditionally (Figure 1), Internet backbone providers and ISPs entered into “peering” agreements, which did not call for payments on either side, reflecting a relatively balanced flow of traffic.  Content distributors paid backbone providers for “transit,” reflecting the unbalanced flow of traffic along that route.


With the growth of online video and with Netflix accounting for 30 percent of traffic at some times of the day, this system was bound to become strained, as we are now seeing and as shown in Figure 2.  The flow of traffic between the backbone provider and the ISP is unbalanced and has grown enormously, requiring investments in additional capacity.


One way to address this problem is for the backbone provider to pay the ISP, reflecting the greater amount of traffic (and greater capacity needed) going in that direction (see Figure 3).  In fact, that is what happened following a dispute between Level 3 and Comcast in late 2010.


Another solution is the just-announced Comcast-Netflix deal, reflected in Figure 4.  In this case, Netflix/Comcast is bypassing the intermediate backbone provider (either partially or completely), presumably because it is more efficient to do so.  One or both of them is investing in the needed capacity.  Regulatory interference with such a deal runs the risk of blocking an advance that would lower costs and/or raise quality to consumers.


The Wall Street Journal has described the debate as being “over who should bear the cost of upgrading the Internet’s pipes to carry the nation’s growing volume of online video:  broadband providers like cable and phone companies, or content companies like Netflix, which make money by sending news or entertainment through those pipes.”  Ultimately, of course, consumers pay one way or the other.  When Netflix pays Comcast, the cost is passed through to Netflix subscribers.  This is both efficient and fair, because the consumer of Netflix services is paying for the cost of that service.

In the absence of such an agreement, quality would suffer or the ISP would bear the cost.  The ISP might recover these costs by increasing prices to subscribers generally.  This would involve a cross-subsidy of Netflix subscribers by non-subscribers, which would be neither efficient nor fair.  Alternatively, Comcast could increase prices for those subscribers who consume a lot of bandwidth, which might have similar effects to the just-announced deal, but would probably lose some efficiencies.  In any event, it is difficult to see how such an arrangement would be better for consumers than the announced agreement.



The FCC Tries Yet Again

By Tom Lenard
February 19th, 2014

FCC Chairman Tom Wheeler’s official response to the DC Appeals Court decision on the Commission’s “net neutrality” rules promises to keep the issue on the table for the foreseeable future.  That is unfortunate, because there are better ways for the Commission and its staff to spend their time.

The Appeals Court took away from the Commission with one hand, while giving back with the other:  It struck down the more onerous provisions of the net neutrality rules—the “anti-discrimination” and “anti-blocking” provisions—because they imposed common carrier obligations and broadband is not classified as a Title II common carrier service.  However, the Court affirmed the Commission’s argument that it has general authority (under section 706 of the Communications Act) to regulate in order to encourage broadband deployment.

Since the Appeals Court decision came down, the FCC has been under considerable pressure from net neutrality proponents to reclassify broadband as a Title II common carrier service.  In today’s announcement, the Commission declined to do that. However, the Commission also declined to close the Title II docket, keeping alive the threat of reclassification and the regulatory burdens and oversight that go with it.

In addition, the Commission announced its intention to start yet another net neutrality rulemaking, under its section 706 authority, in order to fulfill the Commission’s no blocking and non-discrimination goals as well as to enhance the transparency rule (the one major provision that the court upheld).

With all the activity aimed towards asserting legal justification for its net neutrality rules, it sometimes gets lost that the FCC had no convincing economic or consumer welfare justification for the rules in the first place.

While there is widespread agreement that the Internet should be open and provide consumers with access to content, applications and services of their choice, the rule was always a solution in search of a problem, a sentiment echoed today by FCC Commissioner Pai.  The Commission never provided the necessary data and analysis to show that the rules would address a significant market failure, did not identify harms to users that the rules would remedy, and did not demonstrate that the benefits of the rules would exceed their costs.  In other words, the Commission neglected to explain why the broadband market, which has generally thrived under minimal regulation, should now be subject to an enhanced regulatory regime.   Indeed, a good argument can be made that, by making the adoption of innovative business models more difficult, the rules would have hindered rather than encouraged the deployment of broadband infrastructure, notwithstanding the Commission’s assertions to the contrary.

There is now substantial concern that the Appeals Court has expanded the Commission’s authority to include the entire Internet ecosystem—including potentially content, applications, and service providers—as long as it can make some plausible argument that its actions encourage broadband deployment.  Expanding the Commission’s domain in this way would be a serious mistake and would compound the harm.

A major goal of the Commission in promulgating its net neutrality rules initially was to “provide greater predictability.”  It clearly has not achieved that goal.  Starting yet another proceeding, and keeping the Title II docket open, will create even more uncertainty for the entire Internet ecosystem.

Chairman Rockefeller and Data Brokers

By Amy Smorodin
September 26th, 2013

Chairman Rockefeller recently sent letters to a dozen different companies seeking details on how they share consumer information with third parties.  The letters are an extension of previous requests sent to “data brokers” asking for clarification of the companies’ “data collection, use and sharing practices.”  In the letters, the Chairman opines that the privacy policies on many websites “appear to leave room for sharing a consumer’s information with data brokers or other third parties who in turn may share with data brokers.”  He also stresses the importance of transparent privacy practices for consumers.

While a call for more information and data is certainly commendable, one should ask, “Where is this all going?”  Is the Chairman suddenly seeing the need for some data to inform policymaking in this area?

While we would hope so, the Chairman’s letters imply an assumption that there is something inherently harmful about data collection and sharing, although this harm is never explicitly described.  He also posits that consumers may not be aware that their information is being collected or how it is being used.  Again, no information is offered on how this conclusion was reached.

Overall, more data to inform privacy policymaking would be a good thing.  As Tom Lenard has pointed out in filings, Congressional testimony, and a recent book chapter submission, the last comprehensive survey of privacy policies was back in 2001, a lifetime ago in the technology industry.  Ideally, any privacy proposals from Congress or the FTC should be based upon a survey of actual current practices on the ground, as opposed to opinions and assumptions.  Only with relevant data can policies be drafted that are targeted toward specific harms.  Additionally, data-driven policymaking can be evaluated to ensure that a specific policy is performing as intended and that the benefits derived outweigh the costs of the regulation.

Data collection is burdensome and time-consuming for the companies involved. Any other government entity (besides Congress) would be required under the Paperwork Reduction Act to have such a proposal assessed, since agencies are required to “reduce information collection burdens on the public.” Since it doesn’t appear that Rockefeller’s recent requests for information are part of any systematic study or plan, it is understandable why some companies would bristle at the thought of spending time and resources answering a list of questions.

The FTC recently conducted its own query in preparation for a study on “big data” and the privacy practices of data brokers.  One hopes the study, expected to be out by the end of the year, is a well-designed and objective look at the industry, without a predetermination of results. Such a study would be useful going forward.

Dispatch from the TPI Aspen Forum – Monday Keynotes, Panels and Beyond

By Amy Smorodin
August 20th, 2013

(With help from Corey Rhyan)

The first full day of the TPI Aspen Forum began with a keynote speech by Bob Crandall, TPI Adjunct Senior Fellow and Nonresident Senior Fellow in Economic Studies at the Brookings Institution.  Crandall’s remarks covered how broadband policy should be informed by an accurate assessment of current market conditions.  Despite what Crandall described as a pessimistic tone in recent reports on US broadband, a relaxed regulatory environment has led to a broadband penetration rate of over 98% in the US (including wireless options), and U.S. broadband speeds have been steadily increasing.  The US also leads the globe in deployment of 4G wireless services.  As a result of robust competition between cable and copper, US cable companies have deployed super-fast DOCSIS 3.0 technology to 85% of households, and incumbent telecom providers have exceeded the cable companies’ capital investment in recent years to match their services.  While super-fast 100 Mbps speeds are often the topic of policy discussions, Crandall pointed to evidence that households do not want to pay for extremely high-speed service even when it’s available. Crandall’s remarks can be viewed here.

Next up was the panel “Communications and IT – What Can We Expect From Congress?,” which featured ex-members of Congress Rick Boucher, Cliff Stearns, and Tom Tauke, and was skillfully moderated by Brendan Sasso from The Hill.  The free-wheeling discussion began with an open question: what is the most important tech issue Congress is likely to address? While the answers varied from privacy, to spectrum and the upcoming incentive auctions, to cybersecurity, to NSA surveillance, each panelist opined that the current Congress has a “productivity problem” when it comes to passing legislation.  One item some found particularly encouraging was the recently-created working groups on spectrum and privacy.  When asked about the current nominees for the FCC, all agreed confirmation should be easy, particularly because the nominees are paired Republican and Democratic picks, but concern was voiced over the current nomination process.   Watch the entire (very entertaining) panel here.

The next panel, “Deconstructing Creative Destruction,” was a nod to the overall theme of the conference and featured Danny Boice (Speek), Chris Ciabarra (Revel Systems), Joshua Gans (University of Toronto), Laura Martin (Needham & Company LLC), and Hal Varian (Google), and was moderated by TPI’s Scott Wallsten. The two startup representatives, or “real-world doers” as Wallsten called them, discussed how their companies have become disruptive forces in their industries.  Each entrepreneur proclaimed that the key was solving a problem, particularly one that affects consumers and end-users.  The panel also discussed hurdles such as H1B visa use in start-ups, obstacles in hiring, and financing issues in innovative technologies.  Martin discussed today’s tax and investment environment, especially for the media and communications industries. The video can be viewed online here.

The third and final panel of the day, “Competition, Regulation, and the Evolution of Internet Business Models,” focused on potential innovations in the pricing of broadband services and featured Kevin Leddy (Time Warner Cable), Robert Quinn (AT&T), Joshua Wright (FTC), and Christopher Yoo (University of Pennsylvania Law School), and was moderated by TPI’s Tom Lenard.  Much of the panelists’ discussion focused on new pricing models that could make broadband networks more efficient and create value for consumers.  However, a common theme pitted these innovations against the open internet rules currently under review by the DC Circuit Court.  In fact, attempts so far to implement usage pricing have been called “discrimination” and have resulted in quick backlash. FTC Commissioner Wright stated that he believes the FTC is more than capable of protecting consumers in this space, that many of the proposed innovations and vertical agreements are procompetitive, and that the FTC can prevent those that may harm consumers.  The video can be viewed online here.

Monday lunch featured a speech from FTC Chairwoman Edith Ramirez, who focused her talk on the future of Big Data and the FTC’s role as a lifeguard for consumers.  Media coverage of the speech can be found here and here, and video of Chairwoman Ramirez’s remarks can be viewed here.

Last night’s dinner keynote speaker was the Hon. Mitch Daniels, President of Purdue University and former Governor of Indiana.  Daniels opined on “creative destruction” in higher education.  Video is here.

Tuesday’s panels and keynotes will be posted throughout today on the TPI YouTube channel.  They include: a keynote by Randal Milch, Executive Vice President of Public Policy and General Counsel at Verizon, and the discussion panels “Who Pays for the Internet – A Global Perspective,” “Privacy, Data Security and Trade – Policy Choices,” and “The FCC’s Incentive Auctions – How Can They Succeed?”

The conference concludes this afternoon with “A Conversation with the Commissioners,” moderated by Politico’s Tony Romm.  Video of the talk will be up later this afternoon.

Thanks to all attendees and speakers who came out to the TPI Aspen Forum this year!  All of us at TPI are now taking a little break.  Hope to see you next year!

Dispatch from the TPI Aspen Forum – Sunday Opening Reception

By Amy Smorodin
August 19th, 2013

(With help from Corey Rhyan)

The 2013 Technology Policy Institute Aspen Forum started out this year with a little rain but plenty of good conversation.  Welcoming remarks were given by TPI President Tom Lenard and TPI Board Member Ray Gifford, who emphasized that the Forum was a great way to end the summer.

Every year, TPI secures a Colorado-based speaker to welcome attendees to the Forum.  This year’s speaker was R. Stanton Dodge, Executive Vice President, General Counsel and Secretary of Dish Network.  In keeping with the forum theme, Dodge opined on creative destruction, both past and present, in the video delivery industry.  From the advent of smaller satellite dishes to the rise of the DVR, the changing expectations of consumers have dictated change in an industry that must transition to provide content on demand.

Dodge also urged attendees to take time to watch the US Pro Cycling Challenge, which happens to be going through Aspen this year – after attending the afternoon breakout sessions, of course.

Video of last night’s remarks will be posted shortly on the TPI YouTube page, and you can follow along with the pithy and insightful tweets from attendees at #TPIAspen.

Highlights of today’s panels and keynotes will be coming soon.