Author Archive

Chairman Rockefeller and Data Brokers

Thursday, September 26th, 2013

Chairman Rockefeller recently sent letters to a dozen different companies seeking information on how they share information with third parties.  The letters are an extension of previous requests sent to “data brokers” asking for clarification of the companies’ “data collection, use and sharing practices.”  In the letters, the Chairman opines that the privacy policies on many websites “appear to leave room for sharing a consumer’s information with data brokers or other third parties who in turn may share with data brokers.”  He also stresses the importance of transparent privacy practices for consumers.

While a call for more information and data is certainly commendable, one should ask, “Where is this all going?”    Is the Chairman suddenly seeing the need for some data to inform policy making in this area?

While we would hope so, the Chairman’s letter implies an assumption that there is something inherently harmful about data collection and sharing, although this harm is never explicitly described.  He also posits that consumers may not be aware that their information is being collected or how it is being used.  Again, no information is offered on how this conclusion was reached.

Overall, more data to inform privacy policy-making would be a good thing.  As Tom Lenard has pointed out in filings, Congressional testimony, and a recent book chapter submission, the last comprehensive survey of privacy policies was conducted back in 2001, a lifetime ago in the technology industry.  Ideally, any privacy proposals from Congress or the FTC should be based on a survey of actual current practices on the ground, as opposed to opinions and assumptions.  Only with relevant data can policies be drafted that are targeted at specific harms.  Additionally, data-driven policymaking can be evaluated to ensure that a specific policy is performing as intended, and that the benefits derived outweigh the costs of the regulation.

Data collection is burdensome and time consuming for the companies involved. Any other government entity (besides Congress) would be required under the Paperwork Reduction Act to have such a proposal assessed, as agencies are required to “reduce information collection burdens on the public.” Since it doesn’t appear that Rockefeller’s recent requests for information are part of any systematic study or plan, it is understandable why some companies would bristle at the thought of spending time and resources answering a list of questions.

The FTC recently conducted its own query in preparation for a study on “big data” and the privacy practices of data brokers.  One hopes the study, expected to be out by the end of the year, is well-designed and takes an objective look at the industry without a predetermination of results. Such a study would be useful going forward.

Dispatch from the TPI Aspen Forum – Monday Keynotes, Panels and Beyond

Tuesday, August 20th, 2013

(With help from Corey Rhyan)

The first full day of the TPI Aspen Forum began with a keynote speech by Bob Crandall, TPI Adjunct Senior Fellow and Nonresident Senior Fellow, Economic Studies Program at the Brookings Institution.  Crandall’s remarks covered how broadband policy should be informed by an accurate assessment of current market conditions.  Despite what Crandall described as a pessimistic tone in recent reports on US broadband, a relaxed regulatory environment has led to a penetration rate of over 98% for broadband in the US (including wireless options), and U.S. broadband speeds have been steadily increasing.  The US also leads the globe in deployment of 4G wireless services.  As a result of robust competition between cable and copper, US cable companies have deployed super-fast DOCSIS 3.0 technology to 85% of households, and incumbent telecom providers have exceeded the cable companies’ capital investment in recent years to match their services.  While super-fast 100 Mbps speeds are often the topic of policy discussions, Crandall pointed to evidence that households do not want to pay for extremely high-speed service even when it’s available. Crandall’s remarks can be viewed here.

Next up was the panel “Communications and IT – What Can We Expect From Congress?,” which featured ex-members of Congress Rick Boucher, Cliff Stearns and Tom Tauke, and was skillfully moderated by Brendan Sasso from The Hill.  The free-wheeling discussion began with the open question: what is the most important tech issue Congress is likely to address? While the answers varied from privacy, to spectrum and the upcoming incentive auctions, to cybersecurity, to NSA surveillance, each opined that the current Congress has a “productivity problem” when it comes to passing legislation.  One item some found particularly encouraging was the recently-created working groups on spectrum and privacy.  When asked about the current nominees for the FCC, all agreed there should be an easy confirmation, particularly because of the pair of Republican and Democratic nominees, but concern was voiced over the current nomination process.   Watch the entire (very entertaining) panel here.

The next panel, “Deconstructing Creative Destruction,” was a nod to the overall theme of the conference and featured Danny Boice (Speek), Chris Ciabarra (Revel Systems), Joshua Gans (University of Toronto), Laura Martin (Needham & Company LLC), and Hal Varian (Google), and was moderated by TPI’s Scott Wallsten. The two startup representatives, or “real-world doers” in Wallsten’s words, discussed how their companies have become disruptive forces in their industries.  The key, each entrepreneur proclaimed, was solving a problem, particularly one that affects consumers and end-users.  The panel also discussed hurdles such as H1B visa use in start-ups, obstacles in hiring, and financing issues in innovative technologies.  Martin discussed today’s tax and investment environment, especially for the media and communications industries. The video can be viewed online here.

The third and final panel of the day, “Competition, Regulation, and the Evolution of Internet Business Models,” focused on potential innovations in the pricing of broadband services and featured Kevin Leddy (Time Warner Cable), Robert Quinn (AT&T), Joshua Wright (FTC), and Christopher Yoo (University of Penn Law School), and was moderated by TPI’s Tom Lenard.  Much of the panelists’ discussion focused on new pricing models that could make broadband networks more efficient and create value for consumers.  However, a common theme pitted these innovations against the open internet rules currently under review by the DC Circuit Court.  In fact, attempts so far to implement usage pricing have been called “discrimination” and resulted in quick backlash. FTC Commissioner Wright stated that he believes the FTC is more than capable of protecting consumers in this space, that many of the proposed innovations and vertical agreements are procompetitive, and that the FTC can prevent those that may harm consumers.  The video can be viewed online here.

Monday lunch featured a speech from FTC Chairwoman Edith Ramirez, who focused her talk on the future of Big Data and the FTC’s role as a lifeguard for consumers.  Media coverage of the speech can be found here and here, and video of Chairwoman Ramirez’s remarks can be viewed here.

Last night’s dinner keynote was delivered by the Hon. Mitch Daniels, President of Purdue University and former Governor of Indiana.  Daniels opined on “creative destruction” in higher education.  Video is here.

Tuesday’s panels and keynotes will be posted throughout today on the TPI YouTube channel.  They include: a keynote by Randal Milch, Executive Vice President of Public Policy and General Counsel at Verizon, and the discussion panels “Who Pays for the Internet – A Global Perspective,” “Privacy, Data Security and Trade – Policy Choices,” and “The FCC’s Incentive Auctions – How Can They Succeed?”

The conference concludes this afternoon with “A Conversation with the Commissioners,” moderated by Politico’s Tony Romm.  Video of the talk will be up later this afternoon.

Thanks to all attendees and speakers who came out to the TPI Aspen Forum this year!  All of us at TPI are now taking a little break.  Hope to see you next year!

Dispatch from the TPI Aspen Forum – Sunday Opening Reception

Monday, August 19th, 2013

(With help from Corey Rhyan)

The 2013 Technology Policy Institute Aspen Forum started out this year with a little rain but plenty of good conversation.  Welcoming remarks were given by TPI President Tom Lenard and TPI Board Member Ray Gifford, who emphasized that the Forum was a great way to end the summer.

Every year, TPI secures a Colorado-based speaker to welcome attendees to the Forum.  This year’s speaker was R. Stanton Dodge, Executive Vice President, General Counsel and Secretary, Dish Network.  In keeping with the forum theme, Dodge opined on the creative destruction both past and present in the video delivery industry.  From the usage of smaller satellite dishes, to the rise of the DVR, the changing expectations of consumers have dictated change in the industry, which must transition to provide content on-demand.

Dodge also urged attendees to take time to watch the US Pro Cycling Challenge, which happens to be going through Aspen this year – after attending the afternoon breakout sessions, of course.

Video of last night’s remarks will be posted shortly on the TPI YouTube page, and you can follow along with the pithy and insightful tweets from attendees at #TPIAspen.

Highlights of today’s panels and keynotes will be coming soon.

Lenard to NTIA: Cost-Benefit Analysis can Ensure all Internet Users are Represented in Privacy Code of Conduct

Wednesday, April 4th, 2012

On Monday, Tom Lenard filed comments with the National Telecommunications and Information Administration (NTIA) regarding the proposed multistakeholder (MSH) process for developing a code of conduct.

Among the 80 comments filed with NTIA, many referenced the need to ensure both firms and Internet users were represented in the process.  In his comments, Tom identified one way to ensure the needs of all involved parties are taken into account: requiring a cost-benefit analysis of any proposed code of conduct.

Since the code will apply to many more consumers and firms than can be directly involved in the process, code provisions should be analyzed in much the same way as a regulation in order to assure that they produce benefits in excess of costs.  Tom also described the proposed code of conduct as similar to agency guidance, which is subjected to the regulatory review requirements of Executive Order 12866, including “a reasoned determination that its benefits justify its costs.”

In addition to urging a cost-benefit analysis of any proposed codes, Tom warns of the need to protect against anticompetitive behavior.  NTIA and the MSH process should ensure any privacy code is neutral with respect to technology, business models, and organizational structures.  In addition, procedures should guard against the process and resulting code being dominated by incumbents, which could raise the costs of entry and inhibit innovation in the Internet space.

Read more of Tom’s comments here.

FCC Reform Bills

Friday, November 4th, 2011

Politico’s Morning Tech reported Thursday that the release of the text of the already-approved USF order would be delayed, probably until next week.  The delay in releasing yet another adopted FCC order to the public makes recently introduced legislation all the more timely.

Wednesday, Rep. Walden and Sen. Heller released legislation aimed at improving agency transparency and process at the FCC.  Although some interest groups have voiced concern that the proposed reforms on transaction reviews would benefit telecom companies, or would curtail the agency’s overall ability to protect the public interest, the proposals concerning a cost-benefit analysis of regulations are sensible – and desperately needed.

The reforms, as described in Sen. Heller’s press release, would:

  • Require the Commission to survey the state of the marketplace through a Notice of Inquiry before initiating new rulemakings to ensure the Commission has an up-to-date understanding of the rapidly evolving and job-creating telecommunications marketplace.
  • Require the Commission to identify a market failure, consumer harm, or regulatory barrier to investment before adopting economically significant rules. After identifying such an issue, the Commission must demonstrate that the benefits of regulation outweigh the costs while taking into account the need for regulation to impose the least burden on society.
  • Require the Commission to establish performance measures for all program activities so that when the Commission spends hundreds of millions of federal or consumer dollars, Congress and the public have a straightforward means of seeing what bang we’re getting for our buck.
  • Apply to the Commission, an independent agency, the regulatory reform principles that President Obama endorsed in his January 2011 Executive Order.
  • Prevent regulatory overreach by requiring any conditions imposed on transactions to be within the Commission’s existing authority and be tailored to transaction-specific harms.

Identifying the actual market failure a regulation is attempting to address should be a given for policymakers but, unfortunately, the FCC rarely takes that approach. Even if attempts at pre-emptive regulation are well-intended, it is virtually impossible to analyze the effects of a regulation without some measurable outcome.  TPI President Tom Lenard echoed both the need for an identified market problem and a cost-benefit analysis before enacting regulation in comments to the FCC in response to the Open Internet Order NPRM and in comments to the FTC regarding its proposed privacy framework, illustrating that such principles can, and should, apply across regulatory agencies. Recently, Scott Wallsten showed how the FCC could incorporate cost-effectiveness analysis into its decision-making process in the context of universal service reforms.

I’m crossing my fingers that some iteration of Rep. Walden and Sen. Heller’s legislation actually passes.  It’s a great start at sensible, meaningful reform to the agency.

Use the Market to Allocate Spectrum

Wednesday, November 2nd, 2011

TPI President Tom Lenard has a post on The Hill’s Congress Blog discussing the benefits of allocating spectrum via voluntary incentive auctions.  Authorizing the FCC to hold auctions would not only make more spectrum available for the development of wireless broadband, but would also be a big step toward creating a more efficient, market-oriented spectrum regime.

Purchasers of spectrum through an FCC auction receive an “exclusive license” allowing them to use the spectrum for whatever purpose they want, so long as they don’t interfere with other licensees.  Those uses can change as new technologies emerge—e.g., as subscription TV overtakes over-the-air TV.  This is why this market-based system is flexible and can be expected to achieve an efficient allocation over time.  Moreover, these quasi-property rights are necessary for providers to invest the tens of billions of dollars necessary for advanced wireless services.

Lenard also addresses calls to allocate a significant portion of spectrum freed-up by incentive auctions to unlicensed uses.

Under the unlicensed model, the FCC establishes rules—such as power limits for approved devices—under which any device and any user can operate.  While this approach has yielded benefits—WiFi most notably—as with the legacy command-and-control model, there is no market mechanism in an unlicensed regime to move spectrum to its highest-valued uses.  It is also extremely difficult to determine the opportunity cost of allocating spectrum to unlicensed uses, and no way—other than relative lobbying clout—to determine how much, if any, should be so allocated.

Lenard warns that the amount of spectrum obtained from incentive auctions that is set aside for unlicensed uses would have a direct impact on the amount of funds available for reducing the federal deficit.

The Congressional Budget Office estimates that incentive auctions would yield about $16 billion assuming proposals on the table to allocate spectrum and money to a public safety network are adopted.  The net contribution of incentive auctions to deficit reduction would be reduced substantially if any significant part of the spectrum is not auctioned and instead is set aside for unlicensed uses.

Read the entire post on The Hill’s Congress Blog.

Privacy Bill of Rights Act – Not Terrifying but Still Cause for Concern

Wednesday, April 13th, 2011

Senators Kerry and McCain released their long-awaited privacy bill yesterday afternoon – the Commercial Privacy Bill of Rights Act of 2011. After scanning through the bill summary and text, I thought I would add my initial thoughts to the mountain of reaction the bill is sure to produce.

First of all, it is clear that the Senators made an attempt at addressing calls for privacy regulation while acknowledging the importance of the free flow of information in the marketplace.  Specifically, the Senators cite the importance of online advertising in funding the “free” online content and services we all enjoy.  Also encouraging is the bill’s call for an opt-out consent requirement for behavioral advertising or marketing, which would have much less impact on commerce than mandatory opt-in consent.

Still, the bill does raise some concerns, which I touch upon below.

The bill requires that firms provide individuals access to information collected and mechanisms to correct the information.  This has the potential to create a host of issues.  First of all, anytime an individual accesses such information, it is an opportunity for a security breach or even fraud, which runs counter to the bill’s intention to improve information security.  Second, it is unclear whether an individual is allowed to change any and all information a firm may have collected.  What about instances where an individual may want to remove something they deem negative, but still accurate?  Because of the vagueness of the language, these concerns are not addressed in the bill, but they should be considered.  CORRECTION: It was just pointed out to me that the bill does allow a firm to deny access and correction, as long as it allows an individual to request that the firm stop using or distributing that information.

Also of concern is the bill’s requirement to collect only information that is needed to deliver a specific service, while allowing the use of this information for research and development for a “reasonable amount of time.”  There are real trade-offs when the flow of information is restricted.  In this case, restricting information, including the length of time information can be held, will hinder innovation, especially in online services.  It is unclear if consumers value this restriction of information more than innovation in services, but their actual behavior in the marketplace suggests a willingness to give up information in return for services and content.

Finally, the bill raises an overarching concern that has been reiterated many times by TPI’s esteemed leader, Tom Lenard: “Where’s the data?”  Indeed, the influx of privacy bills and reports of late seems to be based largely on feelings and opinions, with no real analysis of costs and benefits among them.  Without a cost-benefit analysis of these proposed regulations and identification of the actual harms the regulation is trying to address, it’s impossible to tell if any of these proposals will actually make consumers better off.  Since the commercial use of information has been a vital component of the wide array of services offered on the internet, it is imperative that any policy regulating the use of this information be supported by real data and analysis going forward.

Where is the USF Really Going?

Friday, March 18th, 2011

Our own Scott Wallsten participated in a Heartland InfoTech and Telecom News podcast to discuss his recent paper, “The Universal Service Fund: What Do High-Cost Subsidies Subsidize?”  Just how much of the funds are going to expenses not directly related to providing telecommunications service?  According to Scott’s research, 59 cents of every dollar goes to administrative and overhead expenses.

The podcast covers a brief description of the policy goals of the Universal Service Fund, issues with the way the program is funded and distributed, and the incentives resulting from the subsidies for firms to increase costs.  In addition to discussing reforms underway to shift the program to subsidize broadband services, Scott also proposed specific reforms, including focusing on low-income assistance and distributing funds directly to consumers.

The podcast, hosted by Bruce Edward Walker, managing editor of Infotech & Telecom News, can be found on the Heartland Institute website.

We’re Number 13!

Thursday, February 17th, 2011

The Global Think Tanks Index, compiled by the Think Tanks and Civil Societies Program at the University of Pennsylvania, has ranked TPI as one of the top science and technology think tanks in the world.  Of the top 25 (table 18), we are very pleased to be number 13!

The rankings were determined by a large group of experts and peer institutions, including journalists, donors, intergovernmental organizations, academic institutions, and think tank leaders.  The full report is available at www.gotothinktank.com.

And, congrats to ITIF for making the list, too!

Commerce Department Green Paper – a lot of Opinion, not a lot of Data

Friday, January 28th, 2011

TPI President Tom Lenard filed comments with the Department of Commerce today regarding its proposed privacy framework.  His take: the Green Paper contains little data or analysis to show whether its framework will improve or reduce consumer welfare.  Moreover, the proposal “violates the spirit, if not the letter, of President Obama’s recent executive order on regulation, which stresses the need to evaluate both benefits and costs.”

Lenard strongly urges the agency to:

  • Collect current data on the privacy and data management practices of major web sites.  It is impossible to make an informed policy decision without an accurate understanding of current privacy practices.  The most recent available data appear to be from 2001.
  • Produce evidence showing that current practices are harming consumers. The agency’s privacy framework will only produce benefits to the extent it alleviates identified harms. 
  • Review what we know about how consumers value privacy. In addition to referring to current studies, the agency should also perform additional studies as a basis for estimating the benefits of a new privacy framework.
  • Estimate the costs of its privacy framework and alternative proposals. These estimates should include direct pecuniary costs to firms from devoting more resources to privacy and the indirect costs of having less information available.
  • Produce sufficient evidence of a reasonable expectation that the benefits of its proposal are greater than the costs.  Otherwise the proposal should not be adopted.

Tom’s brief comments can be found here.