Press Release

Regulating the Global Information Society

Published by Carnegie on October 5, 2000


About the speakers (L to R): Christopher T. Marsden is a Research Associate at the Globalization Center at the University of Warwick, and a 1999-2000 Research Fellow of the Information Infrastructure Project at the Kennedy School of Government, Harvard University. Monroe E. Price is Co-Director of the Program in Comparative Media Law and Policy at Oxford University and Professor of Law at the Benjamin N. Cardozo School of Law, Yeshiva University. Stefaan Verhulst is Director of the Program in Comparative Media Law and Policy at Oxford University and Resident Scholar at the Markle Foundation in New York City. Eli Noam is Professor of Economics and Finance at the Columbia Business School and Director of the Columbia Institute for Tele-Information.

Rapporteur's Report

On September 22, 2000, a panel of experts gathered at the Carnegie Endowment for International Peace for a discussion of the international politics of Internet regulation. The event also marked the release of a new book edited by Christopher T. Marsden, Regulating the Global Information Society (New York: Routledge, 2000), to which four of the five panelists contributed chapters.

The panel's moderator, William Drake of the Carnegie Endowment, began by noting that there are significant transatlantic differences on the role of regulation in the Internet environment. The issue of government regulation of Internet infrastructure and transactions is highly controversial in the United States, and policymakers thus far have agreed with industry that non-regulation or self-regulation are the best baselines from which to proceed. Given this consensus, pragmatic discussion of the matter can be difficult because many American observers are quick to brand any regulatory proposals as the out-of-date, "second wave" thinking of people who simply do not "get it."

However, the situation is different across the Atlantic. While European policymakers have moved from a tradition of strict regulation to an embrace of more open markets, they still believe that active government oversight of electronic networks is warranted in some cases. In the European view, government regulation or "co-regulation" with industry can be a legitimate tool for ensuring fair competition and promoting social objectives. This underlying difference of perspective is also reflected in Europe's preference for the concept of a "Global Information Society" (GIS) instead of the narrower American formulation of a "Global Information Infrastructure" (GII). Having noted these differences, Drake also pointed out that the United States and Europe are in agreement (or approaching broad consensus) on a number of Internet-related policy issues. The essays in the book explore the political dynamics of transatlantic divergence and convergence, as well as some of the larger policy issues raised by the Internet revolution. Against this backdrop, the panel would consider selected aspects of the same broad terrain.

Members of the Washington policy community consider the issues.

The first speaker, Christopher T. Marsden, noted that the growth of a global information economy has necessitated increasingly close relations between Europe and the United States. As a consequence, tensions have emerged over how to regulate the Internet, as evinced by disputes over issues such as privacy and domain name management. At the same time, the two regions exhibit certain commonalities and points of convergence, as a parallel trend toward deregulation attests. Behind both of these dynamics, however, lies a frequently overlooked factor: the growing influence of multinational firms on their home and host country governments. Marsden argued that in disputes over Internet governance, the United States and European Union have been largely reduced to proxies representing such corporations as Microsoft, Bertelsmann, and Nokia. Negotiations between governments that supposedly seek to defend the national interest are shaped significantly by the competing interests of transnational firms. This abdication of governmental responsibility reduces democratic accountability and complicates the handling of important Internet policy issues.

The second speaker, Stefaan Verhulst, used the example of negative content regulation (protection against pornography, hate speech, etc.) to illustrate how regulatory paradigms have shifted with the introduction of new technologies. The old model of traditional mass media was characterized by few broadcasters, one-to-many transmission of information, easily drawn lines between sectors, and national boundaries, all of which lent themselves to government regulation. Consequently, authorities sought to control negative content through such measures as licensing, scheduling offensive television programs late at night, and banning specific content from the airwaves. But the introduction of new technologies has shifted the context for content regulation, and few characteristics of the old model apply to today's world of multiple broadcasters, many-to-many communication, convergence of media, and transnational information flow. In such an environment, Verhulst argued, old models of content regulation have become obsolete. "There should be a shift of the regulatory paradigm because there is a shift of the regulatory subject."




In consequence, industry self-regulation has emerged as the new paradigm of choice, with such mechanisms as codes of conduct, content rating, and technical measures allowing parents to filter the information that reaches their children. Self-regulation has a number of advantages over governmental regulation: it is more efficient and flexible, offers greater incentives for compliance, and is transnational in scope. Contrary to those who equate self-regulation with deregulation or non-regulation, Verhulst argued that the codes of conduct adopted by industry may actually be more stringent than rules previously imposed by government. But several caveats are also in order. First, self-regulation is almost always a misnomer. It normally involves a relationship with the government, which should shadow industry's attempts to self-regulate and step in where it fails. This is why Europeans frequently refer to the concept as co-regulation. Second, self-regulation requires strong support from all parties involved to avoid the problem of free riders. Finally, self-regulation works best when there is a balance between carrot and stick. In sum, Verhulst argued, properly implemented self-regulation can be a real and effective way of responding to the regulatory challenges of the new media environment.
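To illustrate the kind of technical measure Verhulst had in mind, the sketch below shows how a rating label attached to a piece of content might be checked against a parent-configured policy before the content is displayed. The label categories, thresholds, and names are hypothetical and are not drawn from any actual rating scheme discussed at the event.

    # Minimal sketch of label-based filtering (all categories hypothetical).
    from dataclasses import dataclass

    @dataclass
    class ContentLabel:
        """Self-declared rating attached to a piece of content."""
        violence: int  # 0 (none) to 4 (extreme)
        nudity: int    # 0 (none) to 4 (explicit)

    @dataclass
    class FilterPolicy:
        """Maximum levels a household is willing to allow."""
        max_violence: int
        max_nudity: int

        def allows(self, label: ContentLabel) -> bool:
            return (label.violence <= self.max_violence
                    and label.nudity <= self.max_nudity)

    policy = FilterPolicy(max_violence=1, max_nudity=0)
    print(policy.allows(ContentLabel(violence=0, nudity=0)))  # True: displayed
    print(policy.allows(ContentLabel(violence=3, nudity=0)))  # False: filtered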

The next speaker was Monroe Price, who took up the issue of information intervention in the Internet era. States have long claimed the right to regulate information that threatens their national security, but such regulation normally took place on a national level. International norms afforded states control over their own information space and generally supported their efforts to protect against foreign intervention. But the balance has shifted with the advent of global information flows and transnational security issues, and there is now increasing international support and legal justification for intervening in another country's media, for example to prevent genocide, end an ongoing conflict, or speed reconstruction after a civil war. Borrowing a term from Jamie Metzl, Price referred to this activity as "information intervention." Information intervention shows that the issue of regulation and the role of the state are not dead; they may simply be rearticulated.



With traditional broadcast media, information intervention has taken a variety of forms, including monitoring of foreign broadcasts, jamming incendiary signals, and "peace broadcasting" from transmitters outside of the target country. If states wish to extend such information intervention to the Internet, however, they face a number of challenges. With traditional media, international broadcasters were largely state-controlled, e.g., Voice of America, the BBC, and Deutsche Welle. In the age of the Internet, barriers to entry are much lower and a number of substate or nonstate actors have a substantial international presence. Additionally, radio and television broadcasts reach millions throughout a target country; at present, the Internet is only available to a small elite in the developing world. For this reason, the Internet may be more effective as a tool of organization than a tool of mass persuasion. Another difference is that the Internet allows for destructive intervention and information warfare: disabling web servers or shutting down computer-controlled electricity, telephone, and water systems. Perhaps for this reason, the United States sees itself as a potential target of Internet intervention, whereas with traditional media it normally initiated the process. Its position on information intervention in the Internet age is likely to be informed by this vulnerability.

Similarly, other states are likely to resist foreign intervention in their Internet sphere, just as they have jammed radio and television broadcasts in the past. Price speculated that some may even delay the diffusion of new technologies, for example by maintaining high costs for Internet and other telecommunications services. Others will seek to filter Internet content through their control of the underlying network architecture, as China has tried to do. Some, like the United States, will invest in sophisticated anti-terrorism surveillance systems and attempt to monitor international data flows to detect damaging information. Still others will seek to influence international norms regarding new technologies, promoting privacy and control of national data space over openness and transparency.

Eli Noam discusses nanoregulation.

The final speaker, Eli Noam, presented a forward-looking vision of Internet regulation. Rejecting the popular cliché that the Internet cannot be controlled by government, Noam argued that electronic commerce is tending toward a new type of transaction, the nanotransaction, whose technological characteristics would indeed facilitate Internet regulation. The environment in which information exists and operates is becoming increasingly complex, with a large volume of automated transactions taking place between separate machines. Distributed storage networks (e.g., Napster and Gnutella), collaborative processing networks (such as those that analyze data from outer space), and interactions between processors and sensors (automobiles communicating with highways, etc.) all feature multiple information transactions but lack convenient payment mechanisms. To treat these transactions in the most economically efficient manner, Noam argued that

it is necessary … to create a method to conduct numerous transactions in information goods and services quickly, under rapidly changing circumstances, and involving numerous parties that do not know each other. … The logical way to do this is to push [payment] down from the human level, and even from the centralized machine level, to the actual level of the information itself, and to integrate information with money.

Noam described a system in which packets of information on the Internet would be equipped with electronic wallets, allowing them to pay for the nanotransactions in which they participate, all free from human interaction. Just as automobile drivers carry money to travel on toll roads, buy gas, and pay for parking, so packets of information will be able to enter toll gateways, pay for services, or even receive payment for services they perform. Users could instruct their computers to charge for advertisements, so that they see only ads that have paid a certain price for the inconvenience they impose; the packets that make up the advertisement could themselves purchase the right to be displayed. Music publishers could send out their latest releases to circulate on the Internet; listeners' computers could then automatically purchase the digital music if the artist were of interest and the price were right. A system of nanotransactions, argued Noam, "creates … money that is linked to processing intelligence: smart money. But it also creates information that is linked to means of payment, in other words, rich information."
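To make the wallet metaphor concrete, the following minimal sketch, written purely for illustration, shows a hypothetical packet that carries a small balance and a toll gateway that forwards the packet only if it can pay. The class names, fee, and amounts are invented and do not reflect any actual protocol Noam described.

    # Minimal sketch of a packet paying a toll gateway (all names hypothetical).
    from dataclasses import dataclass, field

    @dataclass
    class Packet:
        payload: bytes
        wallet: float = 0.0  # balance the packet can spend on its own behalf

        def pay(self, amount: float) -> bool:
            """Deduct a fee if the packet can afford it."""
            if self.wallet >= amount:
                self.wallet -= amount
                return True
            return False

    @dataclass
    class TollGateway:
        fee: float
        revenue: float = 0.0
        forwarded: list = field(default_factory=list)

        def handle(self, packet: Packet) -> bool:
            """Forward the packet only if it pays the toll."""
            if packet.pay(self.fee):
                self.revenue += self.fee
                self.forwarded.append(packet)
                return True
            return False  # packet dropped: it could not pay

    gateway = TollGateway(fee=0.001)
    print(gateway.handle(Packet(payload=b"song", wallet=0.05)))  # True: toll paid
    print(gateway.handle(Packet(payload=b"ad", wallet=0.0)))     # False: dropped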




Not only does a system of nanotransactions facilitate complex economic interaction, but it also opens up new possibilities for government regulation. "Information will become identifiable, and therefore targetable, and therefore regulable, and … the notion that you cannot regulate the Internet will turn out to be nonsense," argued Noam. In the future, for instance, governments might introduce new redistribution policies, sanctioning regulated price discrimination to charge less for educational data and more for pornography. Through a labeling system that specifies the national origin of packets, governments could privilege domestic over foreign information. Whatever the specific measures introduced to regulate nanotransactions, the prospect calls into question the assumption that the Internet cannot be controlled because "information wants to be free." Rejecting that maxim, Noam offered a new one instead: "What information really wants is to have its own credit card and to travel around the world."
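The redistribution and labeling schemes Noam raised could, in principle, look something like the following sketch: a per-packet charge that depends on a content-class label and a national-origin label. The categories, rates, and domestic discount below are invented purely for illustration and do not represent any proposed policy.

    # Hypothetical per-packet tariff based on content-class and origin labels.
    TARIFFS = {
        "educational": 0.0001,  # subsidized rate
        "pornography": 0.01,    # penalty rate
        "default": 0.001,
    }

    def toll_for(content_class: str, origin: str, home_country: str) -> float:
        """Charge for one packet under a made-up policy that discounts
        domestic traffic and prices content classes differently."""
        rate = TARIFFS.get(content_class, TARIFFS["default"])
        if origin == home_country:
            rate *= 0.5  # privilege domestic over foreign information
        return rate

    print(toll_for("educational", origin="US", home_country="US"))  # 5e-05
    print(toll_for("pornography", origin="DE", home_country="US"))  # 0.01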


"What information really wants is to have its own credit card and to travel around the world."

Following Noam's presentation, Drake asked each of the participants to speak to the international political implications of the topics they had addressed. Price mentioned the possibility that governments would agree on a common way to regulate the Internet across borders, and he suggested that such "regulation fixing" might be akin to corporate price fixing. Verhulst pointed to the growing power of international standard-setting bodies and argued that the regulation of meta-information, or information about information (ratings, labels, etc.), would be increasingly important in the future. Noam reiterated his prediction that nanoregulation would be the future method of choice, and he noted that governments (national and local) would not have to collaborate to impose regulation under such a schema. Consequently, a future of nanoregulation may bring with it the problem of too many contradictory authorities trying to control their slice of the Net. Finally, Marsden pointed to several of the arguments presented in his edited book. In particular, he noted, one chapter examines China's attempts to reap economic benefit from the Internet while avoiding its social impact, as well as its policy of tolerating some prohibited behavior while occasionally cracking down on the worst offenders.

In the questions that followed, several audience members picked up on the issue of Internet control in authoritarian regimes. In particular, one compared the current hype over the Internet and democratization to the now-discredited technological determinism argument, which held that countries adopting the same technologies would follow similar political paths. Noam acknowledged similarities in the issues, but he insisted that we have to revisit the debate because the Internet community often forgets lessons of the past and develops naïve ideas about freedom and democracy. The Internet may challenge authoritarian regimes, he argued, but it also offers them new mechanisms of control. Price and Marsden noted that democratic governments attempt to control the Internet as well. Both Britain and the United States, for instance, have sought to develop sophisticated technology to eavesdrop on electronic communications within their borders.

Report prepared by Taylor Boas.

Carnegie does not take institutional positions on public policy issues; the views represented herein are those of the author(s) and do not necessarily reflect the views of Carnegie, its staff, or its trustees.