THE MODERN LAW REVIEW
Volume 65 No 4, July 2002
Controlling the New Media: Hybrid Responses to
New Forms of Power
Andrew Murray* and Colin Scott**
The development of new media industries, stimulated by the technology of
digitalisation, has thrown up an important literature on mechanisms for regulation
and control. In this article we elaborate on and develop Lawrence Lessig’s
‘modalities of regulation’ analysis. As we reconceive them, the four basic control
forms are premised upon hierarchy, competition, community and design and can
be deployed in fifteen pure and hybrid forms. This analysis is enriched through
elaborating on the essential elements of control systems (standard-setting,
monitoring and behaviour modification) to demonstrate the importance and
variety of hybrid forms that real-world control systems take in the new media
domains. Although the article does not provide any universal prescriptions as to
which control forms are likely to be most appropriate in particular domains, it
does provide a richer analytical base both for understanding existing control
mechanisms and the potential for using greater variety. The development of
regulatory regimes which are both legitimate and effective in any given domain is
likely to require sensitivity to the particular context and culture of both the domain
and the jurisdiction within which it is located.
Introduction
The emergence and identification of the new media, premised upon the
development and application of digital technologies, has created new sources
and locations of power, many not fully documented or understood. Those new
configurations of power which have been identified have stimulated distinctive
literatures about the most appropriate mechanisms of control. Within much of this
literature classical or ‘command and control’ regulation is held to be either
undesirable or unfeasible in the face of the new policy challenges. For one school
of thought the changing market structures associated with the new media indicate a
reduced role for classical regulation and its virtually total displacement by
*Law Department, London School of Economics.
** Research School of Social Sciences, Australian National University and Centre for Analysis of Risk and
Regulation, London School of Economics.
Though we accept full responsibility for errors and infelicities we are grateful to the following for
comments on an earlier draft: Julia Black, John Braithwaite, Neil Duxbury, Peter Grabosky, Mathias
Klang, Robin Mansell, David Post, and participants in the Competition Law and the New Economy
Workshop, University of Leicester, July 2001.
competition law.1 For another school the emergence of the Internet presents
insuperable problems for classical regulation and alternative mechanisms of
control based on self-regulation and architecture are more likely to be effective.
In this article we draw together some of the regulatory problems presented by the
new media and apply a developed and modified version of Lawrence Lessig’s
‘modalities of regulation’2 analysis to thinking about the range of mechanisms
which have been developed to address these problems. Accordingly we first provide
a description of some of the key problems identified with controlling the new
media. Our modified version of Lessig’s analysis claims that there are four bases of
regulation – hierarchy, competition, community and design. We set the analysis to
work demonstrating that these four bases of regulation are observable as means of
addressing the range of regulatory problems of the new media. The tendency to
privilege one basis for regulation over others appears to us to be consistent neither
with empirical observation nor with the normative considerations of institutional
design for good regulation. What we observe is the prevalence of hybrid forms of
control which, when better understood, could provide the basis for a better informed
policy debate about the control of the new media.
Differences in approach may partly be explained by reference to the cultures and
preoccupations within different jurisdictions. The UK, and many European Union
states, have a strong tradition of self-regulation in the media generally and the
legitimacy of this form of governance is widely accepted.3 Private governance
forms are generally less well recognised and accepted in the United States and
have been the subject matter of fierce debate over their legitimacy.4 A related bias
in the US literature is the very high value placed on the constitutional ideal of
freedom of speech which feeds into a strong libertarian underpinning to much
discussion of regulation of new media.5 Though freedom of speech may have some
constitutional protection in EU states, the extent to which such a right is qualified
by other collective considerations is quite pronounced. Thirdly, American
scholarship on new media issues is dominated by the ‘legal centralist’ perspective
of law and economics which accords less recognition to the potential for pluralism
in the generation of norms than is true of some European scholarship.6
1 The Chicago School of Law and Economics supports market control where markets are competitive.
If the market is uncompetitive, competition law provides an adequate remedy. This premise has been
attacked in relation to layered communications networks. See L. Lessig, The Future of Ideas: The
Fate of the Commons in a Connected World (New York: Random House, 2001) 110; C. Salop &
R.C. Romaine, ‘Preserving Monopoly: Economic Analysis, Legal Standards and Microsoft’ (1999) 7
George Mason Law Review 617.
2 L. Lessig, Code and Other Laws of Cyberspace (New York: Basic Books, 1999) 88.
3 J. Black, ‘Constitutionalising Self-Regulation’ (1996) 59 Modern Law Review 24.
4 The suspicion of private governance institutions in American legal scholarship is forcefully
represented by Michael Froomkin’s critique of the Internet Corporation for Assigned Names and
Numbers (ICANN): M. Froomkin, ‘Wrong Turn in Cyberspace: Using ICANN to Route Around the
APA and the Constitution’ (2000) 50 Duke LJ 17; compare the (European) views of W. Kleinwächter, ‘The Silent Subversive: ICANN and the New Global Governance’ (2001) 3 Info 259
which are largely approving of the innovation in governance created by ICANN.
5 M. Castells, The Internet Galaxy (Oxford: Oxford University Press, 2001) 33. S. Venturelli,
‘Inventing E-Regulation in the US, EU and East Asia: Conflicting Social Visions of the Internet and
the Information Society’ paper presented to 29th Research Conference on Communication,
Information and Internet Policy, October 2001, Alexandria, Virginia available at < www.arxiv.org/ftp/cs/papers/0110/0110002.pdf > (visited 19 December 2001).
6 The analysis is succinctly made by R. Ellickson, ‘The Aim of Order Without Law’ (1994) 150
Journal of Institutional and Theoretical Economics 97, which provides a summary of the fuller
treatment in R. Ellickson, Order Without Law (Cambridge, MA: Harvard University Press, 1991). See
also R. Cooter, ‘Against Legal Centrism’ (1993) 81 California Law Review 417; L. Lessig, ‘The
Regulation of Social Meaning’ (1995) 62 University of Chicago Law Review 943.
Whatever the effects of intellectual biases we suggest that research and thinking
on control of new media sectors has generated novel insights on regulation which
are of wider application. In particular the current debate on how forces of control
may be used to shape the future development of networks is of wider interest to
researchers in the fields of law, economics and social policy. The debate is centred
upon the role of the commons in the fledgling third generation Internet. Sunstein’s
claim that ‘there is no avoiding “regulation” of the communications market’7 has
been met by an equally forceful counterclaim by Lessig that ‘[t]he issue for us will
not be which system of exclusive control – the government or the market – should
govern a given resource. The question for us comes before: not whether the market
or the state but, for any given resource, whether that resource should be controlled
or free’.8 Lessig’s call for a debate on this issue provides a powerful rallying call to
those lobbying for the deregulation of, in the sense of making free, all layers of the
Internet infrastructure. The debate called for by Lessig is not new. The open source
movement led by Richard Stallman and the Free Software Foundation has lobbied
for deregulation of the code level since the mid 1980s.9 Deregulation at the content
level was built into the original Internet infrastructure by network designers such as
Paul Baran, Jerome Saltzer, David Clark and David Reed.10 This has since been
substantially eroded by the development of intelligent networks such as Resource
Reservation Protocol (RSVP).11 There is no doubt many in the new media sector
will respond to Lessig’s analysis and over the next five years the wider dialogue on
the role of regulation within political science, media and economics will be
strongly influenced by this currently narrow legal debate. Thus, while we make
extensive use of examples drawn from new media, we suggest that the developed
models of control which we discuss are of interest to policy makers and researchers
with interests in governance and regulation generally.
New media and the problems of effective control
Processes of digitalisation associated with the development of new media have
brought about important reconfigurations of power. The Internet, for example,
provides widespread access to technology based on a network of networks and
addressing systems which connect computers globally.12 It is said to create a space
where users can engage in a variety of activities with a substantial autonomy from
state power which does not exist in non-digital media.13 Digitalisation of
broadcasting and mobile telecommunications creates niches for new forms of
service provider, shifting power away both from those who own the physical
infrastructure of networks and from those who own content. We identify in this
section three general problems of new media (that is, problems which apply
7 C. Sunstein, Republic.Com (Princeton NJ: Princeton University Press, 2001) 128.
8 Lessig, n 1 above, 12.
9 ibid 52–61.
10 ibid 34–44; J. Saltzer, D. Reed and D. Clark, ‘End-to-End Arguments in System Design’ (1984) 2
ACM Transactions in Computer Systems 277. Online version at < http://web.mit.edu/Saltzer/www/
publications/endtoend/endtoend.pdf > (visited 4 January 2002).
11 Discussed below, 499.
12 B. Leiner et al, ‘A Brief History of the Internet’ Internet Society available at < http://www.isoc.org/
internet/history/brief.shtml > (visited 7 January 2002); Castells, n 5 above, ch 1.
13 S. Sassen, ‘Digital Networks and the State: Some Governance Questions’ (2000) 17 Theory, Culture
and Society 19, 20. According to Lawrence Lessig much of this autonomy is hard-wired into the
network by its end-to-end architecture, Lessig, n 1 above, 26–41.
generally or to more than one medium) which arise from shifts in power. None of
these problems is exclusive of the new media, though each emerges with
interesting new features in this context. They are the problems of regulatory
arbitrage, anonymity and scarcity of resources. In each case once-prevalent
governance forms based on public ownership are no longer fashionable (and for
some no longer feasible), enhancing the urgency of investigating other forms of
control. We should be clear that these are not the only problems associated with the
new media. Among the other pressing policy problems are the issues relating to
accessibility of digital broadcasting and communications services to less
advantaged consumers (which can be defined both in economic and social
terms)14 and the extent to which content of digital broadcasting should be
controlled (in the manner that both negative and positive content controls apply to
analogue broadcasting).15 Discussion of these issues is precluded for reasons of
space and in the belief that the theoretical frame developed is sufficiently
addressed by the policy problems which we do discuss.
The regulatory arbitrage problem
The problem of regulatory arbitrage emerges wherever subjects of regulation have
sufficient mobility in their operations or activities that they can choose to be
regulated by one regime rather than another. The effect is to create a form of
market for regulation within which dissatisfied subjects can ‘exit’ one regime in
favour of another. Regulatory arbitrage, seen as a problem for authorities
attempting to capture activities within their web, can also be seen as a solution to
problems of excessive or inappropriate regulation as it limits the capacities of
authorities.16 The problem has an interesting double-edged character in the new
media, since options to relocate to avoid particular regulatory regimes may be
available both to service providers and consumers. Thus broadcasters can relocate
their operations to different jurisdictions to evade national regulation (and this
predates digitalisation) while listeners and viewers can relocate from the more
controllable forms of delivery to satellite and Internet. One of the problems raised
by regulatory arbitrage is the risk that competing standards for the new digital
broadcasting transmission services might develop. This is squarely addressed in the
EU by harmonised rules requiring all member states to legislate for common
standards, notably in respect of consumer equipment for conditional access to
services.17 Under the terms of European Union legislation the EU rules on
broadcasting regulation apply only to broadcasters established in a state to which
the applicable directive applies.18 The directive’s requirements that member states
apply their domestic broadcasting rules to all broadcasters established within the
14 M. Lemley and D. McGowan, ‘Legal Implications of Network Economic Effects’ (1998) 86
California Law Review 479; P. David, ‘The Evolving Accidental Information Super-Highway’ (2001)
17 Oxford Review of Economic Policy 159; M. Cave and R. Mason, ‘The Economics of the Internet:
Infrastructure and Regulation’ (2001) 17 Oxford Review of Economic Policy 188.
15 See. C. Sunstein, ‘Television and the Public Interest’ (2000) 88 California Law Review 499; D.
Goldberg, T. Prosser and S. Verhulst, Regulating the Changing Media: A Comparative Study (Oxford:
Oxford University Press, 1998).
16 See W. Bratton, J. McCahery, S. Picciotto and C. Scott (eds), International Regulatory Competition
and Coordination (Oxford: Oxford University Press, 1996).
17 Directive 95/47/EC OJ 1995 L281, 23.11.95 p 51, Art 4; Broadcasting Act 1996.
18 Council Directive 89/552/EEC OJ 1989, L298 p 23 as amended by Directive 97/36/EC, OJ 1997 L202
p 60, Art 2(1); B. Drijber, ‘The Revised Television Without Frontiers Directive: Is it Fit for the Next
Century’ (1999) 36 Common Market Law Review 87, 92.
state has been interpreted so as to require member states to apply their rules as
intensely to broadcasters directing their programming at other member states as to
those serving domestic audiences.19 This interpretation is intended to preclude
countries like the UK from establishing themselves as attractive locations for overseas broadcasters
through the application of a more liberal regime than would apply to domestic
broadcasters.20 This is a particular issue with broadcasters seeking to evade what
they regard as overly restrictive domestic rules, for example on advertising to
children or transmission of pornography.
Regulatory arbitrage in Cyberspace (that is, applying to the Internet) is a focal
point for two opposing schools of thought, the Cyberlibertarians and the Cyber-
paternalists. The primary argument of the Cyberlibertarians is that Cyberspace is
unregulable due to its design. Cyberspace is a unique jurisdiction as it has no
physicality or real-world existence. It is possible to conceive of Internet users
simultaneously in Cyberspace and in a grounded, real-world jurisdiction.21 It is this
duality and the non-physicality of Cyberspace which allows for regulatory
arbitrage. In the physical world sovereignty is exercised by governments over
defined physical territories. A user who wishes to be regulated by a different
regulatory structure may take steps to relocate either themselves or their activities.
In Cyberspace users may transcend physical borders with ease and may choose to
take on any guise or form desired (see below ‘The Anonymity Problem’). Users
who prefer a regulated environment where there are structured discussions on
carefully selected topics, and where content is closely monitored and censored, may
choose to join a regulated and monitored cybercommunity such as America Online
(AOL). Users seeking uncensored discussion and complete freedom of speech may
make use of a virtual chat room on the USENET system or may use an Internet
Service Provider (ISP) to enter unmonitored discussion boards on the Web. These
freedoms allow users to choose freely the regulatory structure they wish to follow
while in Cyberspace. Thus a citizen of Germany can enter a USENET discussion
group on the Holocaust and post denial messages, something he or she would be
unable to do freely in their home state. Similarly a UK citizen may post
information which is in breach of the Official Secrets Acts. Although strictly
speaking these citizens are still committing offences within their physical
jurisdiction, they can do so without fear of prosecution as in Cyberspace they
have taken on a different personality and thus are unlikely to be traced and
prosecuted.22 These citizens have effectively removed themselves from the
regulatory control of their sovereign government and have chosen to be regulated
by another set of regulatory values and norms. This is because, as dramatically put
by David Post, ‘Cyberspace . . . does not merely weaken the significance of
physical location it destroys it . . . they do not cross geographical boundaries (in the
way that say environmental pollution crosses geographical boundaries), they
ignore the existence of boundaries altogether’.23
19 Commission v United Kingdom Case C-222/94, [1996] ECR I-4025.
20 Broadcasting Act 1990, s 43.
21 n 2 above, 190.
22 With a degree of computer literacy they can ensure that it would be almost impossible for law
enforcement agencies in the physical world to track them down and prosecute. This is discussed
further below at 496–497.
23 D. Post, ‘Governing Cyberspace’ (1997) 43 Wayne Law Review 155. Online version at < http://
www.temple.edu/lawschool/dpost/Governing.html > (visited 4 January 2002).
The anonymity problem
The non-physicality of Cyberspace allows Internet users to choose to adopt a
different persona from their real-world personality (pseudonymity) or to hide all
details of their personality (anonymity). Pseudonymity and anonymity provide a
further set of problems for regulators. As well as facilitating regulatory arbitrage
by allowing citizens to conceal their identity, thereby inhibiting the application of
civil, administrative and criminal regimes while in Cyberspace, pseudonymity and
anonymity also allow Netizens to carry out transactions in an unregulated
manner.24 Two examples which may be given are the distribution of hate or
defamatory speech, and access to regulated content.
To begin with the latter, there are certain areas in our physical societies where
we regulate access to certain persons. Children are not permitted access to public
bars or licensed sex shops. In addition there are activities that are restricted to
certain persons. Only those with driving licences may legally drive and only those
who are members of the appropriate professional society may practise as a lawyer.
A lack of physical persona makes the regulation of such simple activities much
more complex in Cyberspace. A child may take on an adult personality and gain
access to pornographic content.25 In the physical world a child entering a licensed
sex shop would be removed by the manager, whereas in Cyberspace the elements
of physicality are lost and the ability to regulate is impaired. This is not to say the
anonymity problem renders regulation of access impossible. Community-based
control structures, supported by design-based elements, have met with a high
degree of success.26 More worryingly, the access control problem allows for the
potentially more harmful conduct of adults passing themselves off as children. In
the same way that children are prevented from accessing certain adult areas of the
physical world, there are areas where unauthorised adults are kept out to protect
children.27 Children nowadays are educated to keep away from strangers and to be
wary of any unusual adult contact. Again the lack of physicality in Cyberspace
raises problems. Users cannot discern the age of others in a chatroom intended
for children. As it is at the user’s discretion how much information he wishes to
reveal about himself, there is no practical method of ensuring that adults do not
pose as minors for as long as Cyberspace supports an anonymous culture. And
given that any attempt to remove the currently available culture of pseudonymity/
anonymity would probably lead to a high level of regulatory arbitrage there is no
apparent means to deal with such problems.
Further, the easy availability of anonymous messaging allows individuals to take
part in activities without being required to meet usual societal norms. Individuals
may make antisocial comments without fear of being ostracised by society at large.
The technology of anonymous remailers when coupled with encryption technology
can ensure an untraceable message source.28 This may be used to distribute
comments about an individual or organisation without fear of prosecution or social
24 Netizen is the universally accepted term for a ‘citizen of the Internet’.
25 n 2 above, 174.
26 See below ‘Other Forms of Control’.
27 Examples would be schools, children’s playgrounds, nurseries and other controlled environments.
28 The technology is described in some detail by Michael Froomkin in ‘The Internet as a Source of
Regulatory Arbitrage’, in B. Kahin and C. Nesson (eds), Borders in Cyberspace (Cambridge, MA:
MIT Press, 1997). Online version available at < http://www.law.miami.edu/~froomkin/articles/
arbitr.htm > (visited 4 January 2002).
exclusion.29 Anonymity in Cyberspace creates a unique culture where expression
free from the normal constraints of legal and social control is common. Even the
United States with its particular emphasis on the right to free speech cannot allow
completely unfettered or unrestricted freedom of expression.30 Cyberspace
uniquely offers a forum for unfettered free expression.31 Although it may be
argued that ISPs or other moderators of discussion groups may remove offending
messages, they may be reposted somewhere else in Cyberspace almost
immediately. Also Netizens may directly address others via e-mail. Again
although this practice, known as spamming, is regulated in Europe by the Distance
Selling and E-commerce Directives32 and by other enactments worldwide, the
availability of anonymous communications renders such enactments impotent
within Cyberspace. He who cannot be caught cannot be punished. Anonymity
therefore allows for perfect freedom of expression, which in the physical world has
been tempered by even the most liberal of regimes.
The scarce resources problem
Regulators in the new media are called upon to oversee systems of allocation of
scarce resources. All new media sectors draw heavily on limited resources,
whether these be natural resources such as spectrum for the telecommunications
or broadcasting sectors or man-made resources such as domain names in relation
to Cyberspace. Digital developments do, in some respects, reduce existing
scarcity problems. Thus digital broadcasting uses spectrum more efficiently and
thus enhances capacity.33 This may in turn create a problem for regulators
seeking to maintain controls designed to ensure pluralism in the broadcasting
sector.34
The spectrum scarcity problem is exemplified by the emergent market for third
generation (3G) mobile communications.35 3G mobile will make multimedia
29 Following the enactment of the Communications Decency Act 1996 a US-based ISP has no third
party liability for any libellous messages carried on their system (s 230). In the UK and the European
Union ISPs may have third party liability if they fail to act once the nature of a libellous message is
drawn to their attention. See Godfrey v Demon Internet [1999] 4 All ER 342 and the E-Commerce
Directive (Directive 2000/31/EC OJ L 178, 17/07/2000, 1–16) Art 12.
30 n 7 above, 151–153. For a fuller account of the philosophical foundations upon which restrictions on
the first amendment are justified see F. Schauer, ‘The Aim and Target in Free Speech Methodology’
(1989) 83 Northwestern University Law Review 562; R.K. Greenawalt, ‘Free Speech Justification’
(1989) 89 Col. LR 119.
31 Several commentators cited the success of the complainers in UEJF (L’Association Union des
Etudiants Juifs de France) et Licra (La Ligue Contre le Racisme et l’Antisémitisme) v Yahoo! Inc,
L’ordonnance du Tribunal de Grande Instance, 20 November 2000 as evidence of the ability of courts
to regulate expression in Cyberspace. This confidence has been substantially eroded following the
finding of Judge Fogel in Yahoo! Inc v Licra ND Cal, filed 7 November 2001, that the French order is
not enforceable in the United States as ‘[it] chills Yahoo’s First Amendment Rights . . . and that the
threat to its constitutional rights is real and immediate.’ (at 23) Decision available at < http://
www.cand.uscourts.gov/cand/tentrule.nsf/4f9d4c4a03b0cf70882567980073b2e4/daaf80f58b9fb3e188256b060081288f/$FILE/yahoo%20sj%20%5Bconst%5D.PDF > (visited 20 December 2001).
32 Distance Selling Directive, Directive 97/7/EC OJ L 144, 04/06/1997, pp 19–27; E-Commerce
Directive, Directive 2000/31/EC, OJ L 178, 17/07/2000 pp. 1–16. The Distance Selling Directive was
implemented in the UK through the Consumer Protection (Distance Selling) Regulations 2000 SI
2334. At the date of writing the E-Commerce Directive awaits implementation.
33 R. Collins, ‘Back to the Future: Digital Television and Convergence in the United Kingdom’ (1998)
22 Telecommunications Policy, 383, 384–385.
34 M. Cave, ‘Regulating Digital Television in a Convergent World’ (1997) 21 Telecommunications
Policy 575, 590.
35 Sometimes known as Universal Mobile Telecommunications System (UMTS).
services available to mobile phone users anywhere in the world, combining
satellite and terrestrial digital capacities. This development has the potential both
substantially to displace a number of current communications technologies,
notably second generation mobile and fixed link telephony, and to grow new
markets in mobile communications. Most EU Member States have concluded that
spectrum scarcity permits them to license between 4 and 6 network operators for
3G mobile.36 The objectives of the licence allocation processes have been to
promote the development of competitive markets, to allocate the spectrum to those
best placed to use it, and in many cases to secure windfall fee-income to the
finance ministry. Further policy making will be necessary to determine the terms
on which service providers who do not have network operators’ licences can have
access to the networks for the provision of services.
Scarce resources are also a problem in Cyberspace. The Internet is often seen as
a network without resource constraints. If more resources are needed, more
computers can be added to the network. This, though, only increases the available
processing power of the net; there are other key areas where resources remain
scarce. One area is bandwidth.37 Modern telecommunications networks rely on the
ability to transmit data from one source to another and in this respect the Internet is
no different from mobile telecommunications networks. Network content is
increasingly sophisticated. Consumers are demanding faster and more stable access
to the network, to allow them to listen to real time audio transmissions and to view
streaming video transmissions. These additional network demands are putting the
current network protocols under strain and commercial providers of such services
are calling for the current protocols to be substantially overhauled to provide for
the flow of such services free from the current problems of latency (delays in
transmission) and jitter (variations in delays).38 These problems are caused by the
current network protocol, Internet Protocol version 4 (IPv4), which employs a ‘best
effort’ quality of service.39 The best effort service is simply an onward
transmission service which routes packets of information based upon information
on congestion given to the sender from the next point or node in the network. This
means packets of information relating to a single transmission can become
separated and can arrive with delay variation causing jitter. Simple Internet
applications such as e-mail or web-browsing can tolerate these delays and
differentials, but streaming audio and video cannot: Internet telephony for example
cannot tolerate a delay of more than 250 milliseconds.40 To deal with these
problems network designers have suggested the creation of an intelligent network
which would allow for quality of service (QoS) solutions.41 The implementation of
QoS systems involves either the implementation of a complex virtual overlay
network (VON), which would allow traffic from a single network flow to pass
36 P. Curwen, ‘Next Generation Mobile: 2.5G or 3G?’ (2000) 2 Info 455, 461.
37 See J. Glasner, ‘Move Over, Pork Bellies’ Wired News May 20 1999 available at < http://www.wired.com/news/business/0,1367,19796,00.html > (visited 4 January 2002); Lessig, n 1 above, 47.
38 See for example C. Huitema (Microsoft Corporation), ‘How Will IPv6 Change the World?’ paper
presented to IPv6 2000, October 19–20, 2000 Washington DC available at < http://www.ipv6forum.com/navbar/events/xiwt00/presentations/html/huitema/ > (visited 20 December 2001); Y. Pouffary (Compaq), ‘The IPv6 Advantage’ paper presented to IPv6 2000, October 19–20, 2000 Washington DC available at < http://www.ipv6forum.com/navbar/events/xiwt00/presentations/html/pouffary/ >
(visited 20 December 2001).
39 David, n 14 above, 173.
40 Lessig, n 1 above, 46.
41 Lessig, n 1 above, 46–47. Generally Lessig is wary of such solutions as adding intelligence to the
network allows for control in the content layer.
through routers without competing with traffic from other network flows42 or, as
seems more likely, the implementation of a new network protocol, Internet Protocol
version 6 (IPv6).43 IPv6 offers many advances over IPv4. It allows for better
homogeneity of transmission. In the event of network queuing it allows for
streaming transmissions to be packaged together. This means time critical
transmissions such as streaming audio and video may be prioritised over less
time sensitive transmissions such as e-mails. Also, it crucially supports the
Resource Reservation Protocol (RSVP), developed by Cisco Systems and MCI
WorldCom, which allows service providers to sell bandwidth to users, enabling
them to prioritise their transmissions over other traffic using the same routers.44
This functionality comes at a cost. These developments will almost certainly lead
to the development of fragmented proprietary networks within the wider network
structure and an end to the current end-to-end infrastructure of the Internet.45
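To make the latency and jitter point concrete, the following is a minimal illustrative sketch, not actual protocol code and not drawn from the sources cited above: it simulates packet delays under a best-effort service and under a hypothetical reserved, prioritised flow of the kind RSVP-style QoS mechanisms are intended to provide. All delay figures are assumptions chosen for illustration only.

```python
# Illustrative sketch only: why best-effort routing produces jitter and why a
# reserved (QoS) flow bounds it. Delay figures are assumptions, not measurements.
import random

random.seed(1)

def best_effort_delays(n_packets, mean_congestion_delay_ms=100):
    # Each packet is routed hop by hop on current congestion information,
    # so packets from the same transmission can arrive with very different delays.
    return [10 + random.expovariate(1 / mean_congestion_delay_ms) for _ in range(n_packets)]

def reserved_delays(n_packets):
    # A flow with reserved capacity is forwarded ahead of competing traffic,
    # so its delay stays close to the fixed propagation time.
    return [10 + random.uniform(0, 5) for _ in range(n_packets)]

def jitter(delays):
    # Jitter is treated here simply as the spread of packet delays (ms).
    return max(delays) - min(delays)

for label, delays in (("best effort", best_effort_delays(20)),
                      ("reserved (QoS)", reserved_delays(20))):
    print(f"{label:>14}: worst delay {max(delays):6.1f} ms, jitter {jitter(delays):6.1f} ms")

# Interactive services such as Internet telephony degrade once one-way delay
# exceeds roughly 250 ms, which is why commercial providers press for QoS
# mechanisms rather than relying on best-effort delivery alone.
```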
Although bandwidth scarcity is not unique to Cyberspace, the scarcity of domain
names is.46 It may seem bizarre to claim that domain names are a scarce resource: the
permutations of domain names seem almost limitless. They may be made up of a
string of up to 61 characters47 in any permutation, together with a top level domain, of which
there are more than 250.48 Despite this there is a scarcity of usable domain names.
Usable domain names reside almost exclusively in the .com top level domain and
are made up of recognisable terms in major languages.49 There is a paucity of such
names because usable domain names are of a ‘one mark, one owner’ architecture whereas
previous trade mark systems had been of a ‘one mark, many owners’ architecture.50
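The gap between the theoretical name space and the stock of usable names can be illustrated with a rough calculation. The figures below are assumptions drawn from notes 47 and 49 (a 61-character lower-level label, 37 permitted characters, and roughly half a million English headwords); the sketch is illustrative only.

```python
# Rough illustration of domain name scarcity: the raw permutation space is
# astronomical, but under a one-mark-one-owner rule the stock of recognisable
# names in a single top level domain is tiny by comparison.
PERMITTED_CHARS = 26 + 10 + 1        # letters, digits and the hyphen (assumed)
MAX_LABEL_LENGTH = 61                # longest lower-level label (note 47)

theoretical_labels = sum(PERMITTED_CHARS ** n for n in range(1, MAX_LABEL_LENGTH + 1))
recognisable_terms = 500_000         # order of magnitude of OED headwords (note 49)

print(f"theoretical labels: roughly 10^{len(str(theoretical_labels)) - 1}")
print(f"recognisable English terms available for registration: about {recognisable_terms:,}")
```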
Competing demands for usable domain names quickly arose and the bodies
charged with overseeing the domain name system (initially the Internet Assigned
42 David, n 14 above, 173; Computer Science and Telecommunications Board, National Research
Council, The Internet’s Coming of Age (Washington DC: National Academy Press, 2001) 102–103.
Available at < http://bob.nap.edu/html/coming_of_age/ > (visited 20 December 2001).
43 IPv6 also solved the problem of a scarcity of Internet Protocol (IP) addresses. Currently there are just
under 4 billion available IPv4 addresses. Although this seems a healthy figure, large organisations such
as AT&T and MIT hold up to 16 million addresses each. Currently if you use dial-up access you will
be allocated a temporary IP address while connected. This allows several users to share the same IP
address. With new networked tools some analysts suggested the total fund of IP addresses would be
exhausted by 2004. IPv6 allows for 10^38 IP addresses, more than enough for the foreseeable future.
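By way of a short worked illustration of the figures in this note (the calculation is ours, not the cited analysts’):

```python
# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses.
ipv4_addresses = 2 ** 32    # just under 4.3 billion
ipv6_addresses = 2 ** 128   # roughly 3.4 x 10^38

print(f"IPv4 address space: {ipv4_addresses:,}")
print(f"IPv6 address space: about 10^{len(str(ipv6_addresses)) - 1}")
```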
44 ‘Traffic to Take the High Road – for a Price’ Wired News March 27 1997. Available at < http://
search.hotwired.com/search97/s97.vts?Action=FilterSearch&Filter=docs_filter.hts&ResultTemplate=news.hts&Collection=news&QueryMode=Internet&Query=IPv6 > (visited 20 December 2001).
45 David, n 14 above, 174–178. A. Odlyzko, The Economics of the Internet: Utility, Utilization, Pricing,
and Quality of Service AT&T Labs-Research 1998, 27–28. Available at <http://www.dtc.umn.edu/
~odlyzko/doc/internet.economics.pdf > (visited 20 December 2001).
46 M. Mueller, ‘Competing DNS Roots: Creative Destruction or Just Plain Destruction?’ Paper presented
to 29th Research Conference on Communication, Information and the Internet, October 2001,
Alexandria, Virginia, available at < http://www.arxiv.org/ftp/cs/papers/0109/0109021.pdf > (visited
19 December 2001).
47 The total length of a domain name (excluding root) may be up to 63 characters. As the shortest top
level domains are the two letter country code domains, this means the longest lower level domain
possible is 61 characters. See M. Galperin and I. Gordin, ‘The Domain Name System.’ Available at
< http://www.rad.com/networks/1995/dns/dns.htm > (visited 4 January 2002).
48 Currently there are 239 Country Code top level domains (ccTLDs) detailed in ISO-3166, 12 generic
top level domains (gTLDs) and two US Federal TLDs (.gov & .mil).
49 The latest Network Wizards Domain Name Survey (July 2001) records 37,502,747 .com domains. By
comparison the Oxford English Dictionary only contains ‘over half a million words’. (Source: About
the Oxford English Dictionary < http://www.oed.com/public/inside/ > (visited 4 January 2002)).
50 See for example A. Brunel and M. Laing, ‘Trademark Troubles with Internet Domain Names and
Commercial Online Service Screen Names’ [1997] 5 International Journal of Law and Information
Technology 1; A. Murray, ‘Internet Domain Names: The Trade Mark Challenge’ [1998] 6
International Journal of Law and Information Technology 285.
Numbers Authority (IANA) and Network Solutions Inc, and more latterly the
Internet Corporation for Assigned Names and Numbers (ICANN))51 were required
to develop a policy to deal with these competing claims. This policy, the Uniform
Domain-Name Dispute-Resolution Policy, attempts to balance the rights of trade
mark holders against the first-user policy previously applied. It is an extremely
controversial policy and will be examined in depth below when we analyse the
effectiveness of control mechanisms in the new media.
Extending the ‘modalities of regulation’ analysis
Lawrence Lessig’s Code and Other Laws of Cyberspace is widely regarded as one
of the most complete analytical attempts to capture the variety of forms which
regulation of new media does or may take.52 Lessig contends that there are four
distinct modalities of regulation. He attaches to these the labels law, markets,
norms and architecture. He thinks of these in terms of constraints on action.53 Thus
law constrains through the threat of punishment, social norms constrain through the
application of societal sanctions such as criticism or ostracism, the market
constrains through price and price-related signals, and architecture physically
constrains (examples include the locked door and the concrete parking bollard).
Lessig’s work is of great value for reminding us of the importance of
architecture as a basis for regulation. The potential for controls to be built into
architecture have long been recognised, as exemplified by Jeremy Bentham’s
design for a prison in the form of a panopticon (within which the architecture
permitted the guards to monitor all the prisoners) and the more recent observations
of the way in which visitors to Disney World are controlled by an architecture in
which nearly every aspect of the design has a disciplinary function.54 Lessig
observed the various constraints that are built into software by its designers.
Such architectural constraints in software code are chiefly used for commercial
purposes (such as restricting the user’s use to what they have paid for or
segmenting the market so as to charge higher prices in some segments without the
risk of arbitrage) but may also be used for other regulatory purposes (as with the
controls placed on users by Filterware).55 Lessig suggests that as a means of
regulation architecture is self-executing and thus different at least from norms and
51 IANA and ICANN are non-governmental not-for-profit agencies. Network Solutions Inc is a
subsidiary of Verisign Inc, a for-profit publicly listed company.
52 See n 2 above; cf the five way analysis of controllers in ordering society put forward by Ellickson, n 6
above, 131. Ellickson sees order as a product of first party (or self-control), second party (or
contractual) control, third party control (based on social forces and norms), organisation (with
associated institutional apparatus) and, lastly (significantly), government with the laws. There is a
substantial political science literature on alternative instruments of governance see J. Kooiman (ed),
Modern Governance (London: Sage, 1993) and C. Hood, The Tools of Government (London:
Macmillan, 1983).
53 n 2 above, 235–239.
54 M. Foucault’s research on the history of the prison has been responsible for generating new interest in
surveillance generally and Bentham’s panopticon in particular. Discipline and Punish: The Birth of
the Prison (Harmondsworth: Penguin, trans A. Sheridan, 1977) ch 3. C. Shearing and P. Stenning,
‘From the Panopticon to Disney World: The Development of Discipline’ in A. Doob and E.
Greenspan (eds), Perspectives in Criminal Law (Aurora: Canada Law Book Co, 1984). Crime control
through design is exemplified by A. Lester, Crime Reduction through Product Design (Australian
Institute of Criminology, Trends and Issues in Criminal Justice no 206, 2001); N. Katyal,
‘Architecture as Crime Control’ (2002) 111 Yale Law Journal (forthcoming).
55 Filterware is discussed further below. See ‘Other Forms of Control’.
law.56 This claim appears correct up to a point. However the analysis which
separates the functions of a control system shows that the standard-setting element
of architecture is not self-executing but is, by definition, designed by human hands.
Some architecture-based regimes may be self-executing as to monitoring and
behaviour modification. A parking bollard, for example, requires no further agency
on the part of a regulator to control parking. Other architectural controls do rely on
actions by the controller. For example, Bentham’s panopticon requires that prison
guards actively monitor prisoners and intervene to control deviance. The
panopticon can thus be seen as a hybrid of hierarchy and architecture.
The importance of Lessig’s analysis is to draw attention to the variety of bases
for control which can be deployed in the face of anxiety that technological change
(such as the Internet) and economic change (such as globalisation) tends to make a
variety of different forms of conduct unregulatable. The argument that variety in
forms of activity requires an equal or greater variety of bases for control if
regulation is to be effective has found formal expression in the cybernetics ‘law of
requisite variety’. It is expressed in other terms as the principle that ‘only variety
can destroy variety’.57 The sceptical position which Lessig challenges is premised
in part upon a myth that social and economic activity has traditionally been highly
amenable to regulation, conventionally defined. Recent scholarship on the limits to
control has emphasised the problems of trying to regulate social and economic
activity.58 This work has emphasised the importance of developing regulatory
regimes which seek to steer or stimulate activities within the target system
indirectly as an alternative to external command and control.59 Lessig’s work has
the potential to support efforts to reconceive regulation in a sense that is both more
modest in its claims and ambitions and more useful in providing mechanisms not
only, or perhaps mainly, of direct control but also of indirect control. A key method
of this new approach, which we deploy in this article, is to identify effective
regulation in whatever form it takes and to seek to support it, develop it or extend it
by analogy to other domains in which there are problems of regulation.
The concept of regulation deployed in Lessig’s analysis is a broad one,
extending beyond the narrowly defined ‘systematic oversight by reference to rules’
to encompass four ‘modalities of regulation’ which have the object or effect of
holding behaviour within one state among all the possible states which the
behaviour might take. Lessig refers to the ‘“net regulation” of any particular
policy . . .’ domain as the ‘sum of the regulatory effects of the four modalities
together’.60 Regulation in this expansive sense is conceptually closer to the usage
56 n 2 above, 236–237. Lessig claims that markets have in common with norms and law the fact that they
require human agency and are not self-executing. This claim is contentious (though Lessig does not
recognise this) as the control exerted by a market does not operate at the level of the individual seller
and buyer, but rather in an aggregate. In the perfectly competitive market model the decisions of no
individual buyer or seller can affect the operation of the market.
57 S. Beer, Decision and Control (London: Wiley, 1966) 279–280.
58 P. Grabosky, R. Smith and G. Dempsey, Electronic Theft: Unlawful Acquisition in Cyberspace
(Cambridge: Cambridge University Press, 2001) 5–11; P. Nonet and P. Selznick, Law and Society in
Transition (New York: Harper & Row, 1978); I. Ayres and J. Braithwaite, Responsive Regulation:
Transcending the Deregulation Debate (New York: Oxford University Press, 1992); N. Gunningham
and P. Grabosky Smart Regulation: Designing Environmental Policy (Oxford: Oxford University
Press, 1998).
59 This prescription has its origins in systems theory: n 57 above; and is found in both the political
science literature: A. Dunsire, ‘Tipping the Balance: Autopoiesis and Governance’ (1996) 28
Administration and Society 299, and legal literature: G. Teubner, ‘Juridification: Concepts, Aspects,
Limits, Solutions’ in G. Teubner (ed), Juridification of Social Spheres (Berlin: De Gruyter, 1987).
60 L. Lessig ‘The Law of the Horse: What Cyberlaw Might Teach’ (1999) 113 Harv L Rev 501, 508.
of biologists and sociologists than to that of lawyers.61 It refers to any control
system. To be viable, within the terms of control theory, a control system must
have some standard-setting element, some means by which information about the
operation of the system can be gathered, and some provision for modifying
behaviour to bring it back within the acceptable limits of the system’s standards.62
With regulation, information gathering is usually achieved through monitoring by
an agency, department or self-regulatory body and deviations addressed by the
application of formal and informal sanctions (see Figure 1 below).
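By way of illustration only, the three elements can be sketched as a toy control loop; the content-moderation scenario and all names below are hypothetical and are not drawn from the article or from any particular regime.

```python
# Toy sketch of a viable control system in control-theory terms: a standard,
# an information-gathering (monitoring) element and a behaviour-modification
# element. The moderation example is purely illustrative.
from dataclasses import dataclass, field

@dataclass
class ControlSystem:
    banned_terms: set                      # standard-setting: what counts as deviance
    removals: list = field(default_factory=list)

    def monitor(self, message: str) -> bool:
        # Information gathering: observe behaviour and detect deviations.
        return any(term in message.lower() for term in self.banned_terms)

    def modify(self, message: str) -> str:
        # Behaviour modification: bring conduct back within the standard.
        self.removals.append(message)
        return "[removed by moderator]"

    def apply(self, message: str) -> str:
        return self.modify(message) if self.monitor(message) else message

regime = ControlSystem(banned_terms={"spam"})
print(regime.apply("hello world"))   # within the standard, passes unchanged
print(regime.apply("buy my spam"))   # deviation detected and modified
```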
When locating Lessig’s description within the stricter analysis of control theory
some problems emerge both with the labels and the concepts which they describe. Put
simply, the conceptual schema, drawn from Lessig’s work in law and economics,
needs enriching if it is to capture the institutional variety in control. Our earlier
discussion of control theory suggests that the appropriate schema involves not only a
four way division between different bases of control, but also a further fine grained
analysis of the three different elements necessary to generate a control system
(standard-setting, information gathering and behaviour modification). This devel-
opment of the analysis provides a clearer descriptive framework for understanding
how control is or can be achieved and opens up the possibility for identifying the
wide range of control systems which appear as hybrids of two or more modalities of
regulation. To develop this analysis we draw not only on Lessig’s work, but also on
attempts to deploy cultural theory to identify variety in control systems.63 This
analytical frame has recently been put to work in analysing variety in risk regulation
regimes.64
The term ‘regime’ is apt to capture variety not only in standards and
standard-setting (which represents the bias in Lessig’s analysis) but also in the
institutional dimensions of information gathering and behaviour modification. The
regime analysis makes it transparent that the various functions which contribute to
viable control systems can be widely dispersed among state and non-state actors,
even within a single regime, and can be assembled in mixed or hybrid forms.
Lessig’s conceptualisation of ‘law as command’65 suffers from a weakness in
that it fails to capture all of the control systems which are within the set of
command based or, as we label it, hierarchical control. Law, in this conception,
refers only to state law (whether made by judges, or, more commonly in this
context, legislatures)66 and neglects the plurality of forms which hierarchical
control structures may take. The richer conception of hierarchy looks to the form of
control rather than its source. Thus the regime for developing Internet domain
names has important elements which are non-state in character and yet which are
distinctly hierarchical (and are discussed further below). The term law also suffers
from the difficulty that it is often deployed in a way which infers only standards
and not the institutional elements of a control system (viz information gathering
and behaviour modification). Law in Lessig’s terms is merely the constraint placed
upon the individual. Accordingly hierarchical control provides both a better label
and a substantively enriched conception of this modality of regulation.
61 R. Baldwin, C. Scott and C. Hood (eds), ‘Introduction’ in Socio-Legal Reader on Regulation (Oxford:
Oxford University Press, 1998); M. Clarke, Regulation (London: Macmillan, 2000).
62 C. Hood, H. Rothstein, and R. Baldwin, The Government of Risk: Understanding Risk Regulation
Regimes (Oxford: Oxford University Press, 2001) 21–27.
63 C. Hood, ‘Control Over Bureaucracy: Cultural Theory and Institutional Variety’ (1996) 15 Journal of
Public Policy 207.
64 n 62 above, 9–14.
65 n 2 above, 235.
66 n 60 above, 507.
The concept of norms as it is deployed in Lessig’s analysis follows a usage
developed in the social psychological literature – referring to shared patterns of
behaviour – but which is unconventional and unhelpful in the study of law. Even in
its psychological usage the term norm does not describe the institutional
dimensions of a control system, but rather a set of standards which exist within
a particular social group for the time being. We argue that the preferred meaning of
the word norm is as the generic term for standards, guidelines and legal and non-
legal rules.67 The control form which involves societal or group standards, peer-
based information gathering and behaviour modification based on social sanctions
such as ostracisation or disapproval, we refer to as community-based control. This
category includes not only the social norms which exist generally or between
particular groups, but also some elements of more formalised regimes, as where
self-regulatory standards are socially generated and written down and then
combined in a hybrid form with hierarchical elements to create a self-regulatory
control system which is a hybrid between community and hierarchical bases.
The concepts of markets and architecture as they are deployed by Lessig are each
under-inclusive. Rivalry and competition provide a form of control in environments
where there is no identifiable market. Indeed recent public sector reforms have
made widespread use of what we will call competition-based controls in non-market
situations.68 Additionally there is a marked element of regulatory competition
applying to the development of regulatory standards in some domains both in the
US and the EU.69 Where the conditions for such regulatory competition exist (a
topic of hot debate), and states are permitted to develop their own rules, competition
for client businesses is said to create a check on any tendency to ‘over-regulate’.70
The concept of architecture, referring in Lessig’s terms to the whole built
environment with and without intended effects,71 does not capture the whole set of
control mechanisms which are premised upon design as a basis of control. Thus
there are social and administrative systems which have design features which
create control in a way which the regulatee cannot affect. A key example is the
deployment of ‘contrived randomness’ in the oversight of taxpayers or employees
so as to reduce the scope of these groups to exploit a wholly predictable system of
opportunities and pay-offs.72 Accordingly we re-label this fourth modality of
regulation as design.73 The different elements of each of the four types of
regulation are illustrated in Figure 1.
67 P. Drahos and J. Braithwaite, Global Business Regulation (Oxford: Oxford University Press, 2000)
20.
68 P. Self, Government by the Market (London: Macmillan, 1993).
69 D. Esty and D. Gerardin (eds), Regulatory Competition and Economic Integration (Oxford: Oxford
University Press, 2001).
70 W. Bratton, J. McCahery, S. Picciotto and C. Scott, ‘Introduction: Regulatory Competition and
Institutional Evolution’ in Bratton, McCahery, Picciotto and Scott (eds), International Regulatory
Competition and Coordination (Oxford: Oxford University Press, 1996).
71 n 60 above, 507–508.
72 n 63 above, 211–214.
73 This concept of design has a loose affinity with the deployment of the term ‘technologies’ in the
Foucauldian literature on governmentality. It is possible that the term technologies ‘linking together
forms of judgement, modes of perception, practices of calculation, types of authority, architectural
forms, machinery and all manner of technical devices with the aspiration of producing certain
outcomes in terms of the conduct of the governed’ (N. Rose, ‘Government and Control’ (2000) 40
British Journal of Criminology 321, 323) infers a rather wider range of instrumentalities than are
inferred by the concept of design in this article. For deployment of the concept of technologies in
regulatory theory see J. Black, ‘Decentring Regulation: Understanding the Role of Regulation and
Self-Regulation in a ‘Post-Regulatory’ World’ (2001) 54 Current Legal Problems 103.
It is part of Lessig’s argument that there is scope for the use of hybrid forms of
regulation which link two or more of the ‘pure’ modalities of regulation noted
above.74 In particular he suggests there is scope to link what are in his terms law and
architecture, for example by mandating software designers to build certain elements
into software code in pursuit of public regulatory objectives.75 However we think he
underplays the extent to which contemporary control is already based on hybrid
regulatory forms and the extent to which a wide variety of regulatory hybrids may be
useful in developing regulatory control. Indeed, underlying Lessig’s argument is a
claim that there is considerable novelty to the nature of law in Cyberspace, a view
seemingly accepted by those Cyberlibertarians who contest the normative dimension
to Lessig’s work.76
Nowhere in the work of Lessig or his critics is this claim
substantiated. As Lessig himself recognises, features which we might call design or
architecture have long been fundamental to the way we are governed, whether by
features of the built environment (such as the Parisian boulevard system) or the
Byzantine systems of an obscure public bureaucracy or of commercial actors such as
banks and insurance companies. It is not clear that design of software is
fundamentally different from design in other aspects of social and economic
activity. Wherever it is deployed it has controlling effects and a potential for those
controlling effects to be turned towards different or modified effects.
If each of the four pure bases of regulation is theoretically capable of being
deployed on its own or in combination with any of the other three (giving four single
bases, six pairings, four threesomes and one foursome), then there are fifteen forms
of regulation in total (see the enumeration sketched below). There is no empty set since all domains are subject to some
form of regulation (or else, by definition, they could not be a domain since they
would not hold a recognisable shape). Even regimes which apparently exhibit a
pure basis of regulation may have the dominant form tempered by another. For
74 n 60 above, 511–514.
75 ibid, 514–522.
76 n 2 above, 5–6; D. Post, ‘What Larry Doesn’t Get: Code, Law and Liberty in Cyberspace’ (2000) 52
Stan L R 1439, 1443.
Element of a Control System | Hierarchical Control | Community-Based Control | Competition-Based Control | Design-Based Control
Standard Setting | Law or other formalised rules | Social norms | Price/quality ratio (and equivalents with non-market decisions) | Inbuilt design features and social and administrative systems
Information Gathering | Monitoring (by agencies or third parties) | Social interaction | Monitoring by dispersed buyers, clients, etc | Interaction of design features with environment
Behaviour Modification | Enforcement | Social sanctions (eg ostracism, disapproval) | Aggregate of decisions by buyers, clients, etc on purchase, take-up, location etc | As for information gathering (self-executing)

Figure 1: Elements of Control Systems
example, much hierarchical regulatory enforcement is tempered by more co-
operative relationships more characteristic of community, and where there is a
proliferation of hierarchical regulators in a particular domain (telecommunications
and competition authorities in the communications domain for example)77 then
hierarchy may be tempered by a form of institutional competition as regulators
jockey for position and custom.
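The arithmetic behind the fifteen forms is simply the number of non-empty combinations of the four bases, as the short sketch below enumerates (illustrative only; the labels match those used in this article).

```python
# Every non-empty combination of the four bases of regulation:
# 4 singles + 6 pairs + 4 triples + 1 foursome = 15 pure and hybrid forms.
from itertools import combinations

bases = ("hierarchy", "competition", "community", "design")

forms = [combo for size in range(1, len(bases) + 1)
         for combo in combinations(bases, size)]

print(len(forms))                                                          # 15
print([sum(1 for f in forms if len(f) == size) for size in range(1, 5)])   # [4, 6, 4, 1]
```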
Among the widely observed hybrid forms are competition law, co-regulation
and enforced self-regulation. Though competition law is often equated with
competition, in its control dimensions it exemplifies hierarchical
control, with elements of competition possible where third party actions are widely
deployed. Co-regulation and enforced self-regulation each link some of the
strengths of community-based control (notably within self-regulatory regimes)
with the use of hierarchy, for example by state approval of standards set by
industry groups (co-regulation) or mandating firms to establish and sometimes
enforce their own standards (enforced self-regulation). Other less prevalent forms
are observable but do not have widely accepted labels. Thus mandatory design
features (for example in product design) are hierarchy/design hybrids which we
could refer to as ‘enforced design’. The form taken by some self-regulatory efforts
to inhibit access to undesirable websites is a community/design hybrid.
One further set of remarks is necessary concerning the bases of control. Different
forms of control work differently in different contexts. Markets, hierarchies,
communities and design are each embedded in wider social practices.78 Key social
networks may be a factor in explaining relations of interdependence and thus how
power is played out in particular social settings.79 Similarly the effects of controls
may vary depending on how they are perceived in the cognition of those whom
they affect. Thus some individuals or societies may respond with resistance to
controls which are met with compliance by others or at other times. Thus an
analysis of modalities of regulation does not, by itself, provide a toolkit for
decisions on the design of controls, but rather a more limited analytical
understanding of controls which have been observed and might be deployed in
certain environments and which might be expected to be effective under
appropriate conditions.80
Putting controls to work
The importance of the reconfiguring and development of the modalities of
regulation argument further extends to institutional choices for seeking to use
controls for public policy objectives. Whereas Lessig places greater emphasis on
top-down institutional approaches, of which regulatory agency forms represent the
leading example, we contend that an emphasis on hybrid forms of control will tend
to lead to the deployment of hierarchical controls as instruments to steer organic or
bottom-up developments, whether in the form of competition, community or design-based control.
77 C. Scott, ‘Institutional Competition and Coordination in the Process of Telecommunications
Liberalization’ in McCahery, Bratton, Picciotto and Scott (eds), n 70 above.
78 J. Rogers Hollingsworth and R. Boyer (eds), Contemporary Capitalism: The Embeddedness of
Institutions (Cambridge: Cambridge University Press, 1997).
79 R. Rhodes, Understanding Governance (Buckingham: Open University Press, 1997) especially ch 3.
See also P. Drahos and J. Braithwaite, Global Business Regulation (Oxford: Oxford University Press,
2000) (discussion of ‘regulatory webs’, ch 23); C. Scott, ‘Analysing Regulatory Space: Fragmented
Resources and Institutional Design’ [2001] Public Law 329.
80 We are grateful to Julia Black for this point.
In some instances successful regimes have combined three or
even all four of the bases for regulation.
Hierarchy/community
Hierarchy and community-based controls are often combined either to ensure that
industries effectively collaborate on controlling their sector or to give sectoral self-
regulation greater authority. The hierarchy/community hybrid bases of regulation
are exemplified by the structures established to address scarcity in domain names.
By regulating the domain name system ICANN plays a key role in the regulation of
Cyberspace.81 ICANN and its predecessors, IANA and Network Solutions Inc,
have long provided regulatory control over the domain name system but have done
so not as a function of hierarchical control, but rather to assist in the development
of the domain name system as required by the community and to ensure the system
design remained intact.
A simple example of the deployment of hierarchical controls to assist in the
development of community based controls may be seen in the promulgation by both
Network Solutions and ICANN of Domain-Name Dispute-Resolution Policies.82
These procedures are used to counteract the primary problem of misappropriation of
scarce resources. The procedure appears to have been extremely successful in
countering the problem of ‘cybersquatting’. The practice of cybersquatting was
recognised at an early stage of development of the Web. In its simplest form it is the
ability of unscrupulous individuals to register valuable domains such as Disney.com
and then to offer them on at a profit to the rightful holder of the trademark in
question. Individuals who entered into such practices were quickly dubbed
‘cybersquatters’ by the Web community, a reflection of their standing within the
community as equivalent to persons who unlawfully misappropriate physical
property in the real world. Community opinion was brought to bear. These people
were acting antisocially but social sanctions failed to affect their actions; being
ostracised in Cyberspace did not affect their everyday lives. Their actions were,
though more than socially unacceptable, they were also a threat to the developing
architecture of the domain name system. By controlling domain names which
reflected well known identifiers from the real world they posed a threat to the
system. How could people navigate the Web if they couldn’t rely on the knowledge
they had developed in the physical world?83 Although courts could intervene in
cases where cybersquatters had misappropriated another’s trademark,84 regulatory
arbitrage meant enforcement of orders could sometimes prove problematic.
What was required was a regulatory regime which would apply to all
registrations and could be applied whatever the jurisdiction of the parties. This
led directly to the first Network Solutions Inc Domain-Name Dispute-Resolution
Policy, a policy which has now been adopted and refined by ICANN. The policy
has proven successful as it treats the domain name space as a separate jurisdiction,
thus preventing regulatory arbitrage. Anyone who resides in the ICANN domain
name space must contractually agree to be bound by the policy, and must agree to the arbitration procedure contained therein.
81 As discussed above ICANN controls the allocation of a scarce resource and therefore plays an
important regulatory role.
82 Kleinwächter, n 4 above, 271–272.
83 For example, if cybersquatters controlled domains such as disney.com, mcdonalds.com and
microsoft.com how would users navigate their way to the sites of these well known companies?
84 See eg Panavision v Toeppen 945 F.Supp. 1296 (1996); British Telecommunications plc and others v
One in a Million Ltd [1999] RPC 1.
Thus the values of the
cybercommunity may be upheld by ICANN through the arbitration process.
Secondly, the ICANN policy of using low-cost online arbitration in place of
court proceedings meets the needs of the community. One of the key problems
with usable domain names was that they were, unusually, an inexpensive scarce
commodity. Scarce commodities often carry a proportionately high price tag, as
demonstrated by the UK and German 3G mobile spectrum licence auctions.85 This
is a simple application of the economic model of demand, supply and equilibrium
pricing. Domain names though do not fit the economic model particularly well as
the market as a whole is oversupplied while a small percentage of that market is
undersupplied or scarce. As registrars cannot differentiate useful (and therefore
scarce) domain names from the majority, market-based controls may be
circumvented and a scarce and therefore valuable domain name may be had for as
little as $25. This allows for a high degree of speculation in domain names.
The previous Network Solutions Domain-Name Dispute-Resolution Policy
required the complainer to obtain a court order. This meant it was in many cases
cheaper to buy the disputed domain name from the defender than to pursue an
action to recover the name, especially if the dispute had an international element.
The present ICANN Uniform Domain-Name Dispute-Resolution Policy, through
its use of inexpensive arbitration procedures provides a regulatory process which
takes account of market conditions. This is not to say that the policy is without
its critics. There is strong criticism of the ICANN policy on the grounds that it now
favours trademark holders over domain name holders who fail, for whatever reason,
to comply with US trademark law.86 This has led to the emergence of a practice known
as ‘Reverse Domain Name Hijacking’.87 This is a potential flaw in the ICANN
policy. As discussed the policy was originally introduced to deal with
cybersquatters who were perceived as socially unacceptable and a potential threat
to continued utility of the architecture of the domain name system. The policy now
needs to develop to provide a more balanced approach between the competing
interests of parties. Fortunately there is evidence that the arbiters under the policy
may be developing such a mature and balanced approach. There were some initial
claims that the policy was being used to restrict free speech.88 Recently, though,
decisions of the arbitration panels have shown the policy has a degree of flexibility
which may allow them to develop the policy to meet the demands of the
community at large.89 Clearly the regulatory authority was implementing a
hierarchical control system to support the development of community-based and
design-based controls.
85 The UK raised US$35.4bn by auctioning five UMTS spectrum licences, while Germany raised $46.1bn
by auctioning twelve spectrum blocks. In both cases the number of interested bidders exceeded the
number of licences available, creating a scarcity of resources. This may be contrasted with the position
in the Netherlands, where the auctioning of five licences was met with five serious bidders and raised
only $2.5bn, or in Italy, where a similar situation to the Netherlands saw the Italian Government raise
only $10bn.
86 See eg Froomkin, n 4 above, 96–101; C. Perry, ‘Trademarks as Commodities: The Famous Roadblock
to Applying Trademark Dilution Law in Cyberspace’ (2000) 32 Conn L Rev, 1127, 1155–1157.
87 Examples involving American Express and QVC may be found at < http://www.ejacking.com/ >
(visited 4 January 2002).
88 These claims are based in the so-called ‘sucks’ cases. Domains such as directlinesucks.com (D2000-
0583) and freeservesucks.com (D2000-0585) were transferred to the trademarks holders following
arbitration. Claims followed that decisions such as these were restricting free speech.
89 Three recent decisions wallmartcanadasucks.com (D2000-1104), lockheedmartinsucks.com (D2000-
1015) and michaelbloombergsucks.com (FA0097077) have all found in favour of the respondent.
These cases may signal a new approach in relation to such free speech cases.
Hierarchy/competition
The combination of hierarchical with competition-based controls is well
established in the media and communications sectors. Thus regimes which apply
economic or content controls more intensely to some firms than to others
effectively create a continuum within which firms exerting dominance are often
located closer to the hierarchy end while smaller and/or less powerful firms are
located towards the market end. Within the ‘responsive regulation’ theory this
approach is labelled ‘partial industry regulation’.90 The logic of the approach is that
the benefits sought from regulation may be secured less intrusively by applying
regulation only to a proportion of the firms, whilst creating space for other firms to
be controlled more by market elements. Typical patterns of more intense regulation
of broadcast over print media are said to have reduced risks of censorship and
promoted pluralism.91 In the telecommunications sector ‘asymmetric regulation’
has been deployed to provide tighter controls over dominant incumbents both to
maintain service levels and to promote access to the market by new entrants.92
With the new media other forms of control which mix hierarchy and competition
have been developed.
With the scarcity issue related to spectrum, conventional hierarchical controls
have been displaced by a hierarchy/competition hybrid in some domains. With 3G
mobile, governments have attempted to use spectrum allocation mechanisms to
promote competitive markets, to promote efficient allocation of resources and in
some cases to secure fee-income windfalls for finance ministries. Attempting to set
policies that were friendly to the development of advanced infrastructure, the
European Commission initially recommended that Member States should allocate
licences to 3G mobile operators free of charge.93 Only Finland and Sweden, among
the first movers on Universal Mobile Telecommunications System (UMTS)
licensing, followed this policy course. All the other Member States decided to
charge for the licences. Cynical accounts claim that the decision to charge was
premised upon the greed of finance ministries. But there is a more principled
explanation for the policy which is posited as a solution to one of the key problems
of scarcity – that governments may fail to allocate scarce resources to those who
are best able to exploit them to the general benefit.
The conventional instrument for the allocation of scarce spectrum is the exercise
of government’s hierarchical authority to examine potential applicants and make a
decision along the lines of a ‘beauty contest’.94 This method was used in eight of the Member States.95
90 I. Ayres and J. Braithwaite, Responsive Regulation (Oxford: Oxford University Press, 1992), ch 5.
91 L. Bollinger, ‘Freedom of the Press and Public Access: Towards a Theory of Partial Regulation of the
Mass Media’ (1976) 75 Michigan Law Review 1.
92 A. Perucci and M. Cimatoribus, ‘Competition, Convergence and Asymmetry in Telecommunications
Regulation’ (1997) 21 Telecommunications Policy 493. Partial industry regulation in telecommunica-
tions is exemplified by US rules which apply greater restrictions to the commercial packaging of
digital subscriber lines (DSL) provided by telecommunications companies than apply to functionally
equivalent cable modems provided by cable communications (formerly tv) companies: M. Lemley
and L. Lessig, ‘The End of End-to-End: Preserving the Architecture of the Internet in the Broadband
Era’ (2001) 48 UCLA Law Review 925.
93 European Commission Communication from the Commission to the Council, the European
Parliament, the Economic and Social Committee and the Committee of the Regions: Strategy and
Policy Orientations with Regard to the Further Development of Mobile and Wireless Communications
(UMTS) COM (97) 513 Final.
94 The theoretical basis of the shift towards auctions, which lies in game theoretic approaches is
described in D. Salant, ‘Auctions and Regulation: Reengineering of Regulatory Mechanisms’ (2000)
17 Journal of Regulatory Economics 195.
The weakness of this method is said to lie in its dependence
on the knowledge and judgement of the applicable state bureaucracy both to guess
the appropriate fee to charge successful applicants and to judge which applicants are best
placed to exploit the spectrum. This ‘limited knowledge’ problem is perhaps more
acute in the 3G mobile sector where there is little consensus on the commercial
prospects for services which are made possible in the digital environment but
which have not yet been tested in the market place.
The alternative method for allocating spectrum used in the remainder of the
member states was to auction the licences, combining hierarchy with competition as
the basis of control. Deviating from the sealed bid method used in previous spectrum
auctions, the UK government and others decided to use a transparent (ie no sealed
bids) simultaneous multi-round ascending auction, under which bidders’ offers would
be revealed at the end of each round and whoever held the highest bid when the
number of bidders was reduced to equal the number of licences would win the
particular licence. In this way the price mechanism is used to determine which firms
should have access to the scarce resource controlled by government. The outcome of
the UK auction was that payments for licences, totalling £22 billion, were much
higher than was expected by commentators and government.96
Details of auction
rules and incentives resulted in less successful outcomes in some other member
states.97
The UK experience initially suggested the auction had been successful in
revealing a true value of the licences well above government estimates.
Commentators still do not agree on whether the high cost of licences, particularly
in the UK and Germany, will stifle the market as operators struggle to repay the
cost.98 The German regulator has already indicated that it may allow the operators to
share infrastructure costs and the same thing may happen in the UK.99 This
divergence between the actual operating conditions (and reduction in costs) and
those projected at the time of the auctions suggests that the injection of competition
in the licence allocation process has generally been less than successful.
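The mechanics of the simultaneous multi-round ascending format described above can be illustrated with a short simulation. The sketch below is a toy only: the private valuations, the fixed bid increment and the simple stopping rule are our simplifying assumptions for illustration, not a description of the actual UK auction rules.

import random

def ascending_auction(valuations, num_licences, increment=1.0):
    """Toy simultaneous multi-round ascending auction.

    Bidders outside the provisional winning set raise the lowest standing
    winning bid while it remains below their private valuation, or drop out.
    The auction ends when the number of active bidders equals the number of
    licences; those remaining win at their standing bids.
    """
    bids = {bidder: 0.0 for bidder in valuations}   # current offers
    active = set(valuations)                        # bidders still in the auction
    while len(active) > num_licences:
        ranked = sorted(active, key=lambda b: bids[b], reverse=True)
        provisional_winners = ranked[:num_licences]
        for bidder in ranked[num_licences:]:        # currently losing bidders
            new_bid = min(bids[b] for b in provisional_winners) + increment
            if new_bid <= valuations[bidder]:
                bids[bidder] = new_bid              # raise the offer
            else:
                active.discard(bidder)              # exit the auction
    return {bidder: bids[bidder] for bidder in active}

if __name__ == "__main__":
    random.seed(1)
    valuations = {f"operator_{i}": random.uniform(10, 100) for i in range(13)}
    print(ascending_auction(valuations, num_licences=5))

The point of the simulation is simply that the price mechanism, rather than a bureaucratic judgement, determines which bidders obtain the scarce resource and at what price.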
With the problem of regulatory arbitrage the solutions are often put in terms of
regulatory competition or coordination. In other words arbitrage may be overcome
by providing coordinated or harmonised rules across jurisdictions or arbitrage itself
may be seen as a solution to the problem of excessive regulation. Regulatory harmon-
isation was for a long time the favoured way of providing a level playing field for
competition in the internal market of the EU. However, this exercise of hierarchical
authority raises practical difficulties in terms of the scale of resource necessary to
achieve it, and is said to risk stultifying the very markets which are to be liberalised.
A partial response to the practical problems of harmonisation was the decision of the
European Court of Justice in the Cassis de Dijon case which gave judicial authority
to a principle of mutual recognition.100 Regulatory competition is said to provide the
flexibility for jurisdictions to develop standards to match the local requirements
(whether technical or political), the capacity to innovate in regulation
95 n 36 above, 461.
96 M. Cave and T. Valletti, ‘Are Spectrum Auctions Ruining Our Grandchildren’s Future?’ (2000) 2 Info
347.
97 n 36 above, 474–475.
98 n 96 above; J. Bauer, ‘Spectrum Auctions, Pricing and Network Expansion in Wireless Tele-
communications’ paper presented to 29th Research Conference on Communication, Information and
Internet Policy, October 2001, Alexandria, Virginia available at < http://www.arxiv.org/ftp/cs/papers/
0109/0109108.pdf > (visited 19 December 2001).
99 ‘MMO2 und T-Mobile schließen UMTS-Kooperationsvertrag’ [‘MMO2 and T-Mobile conclude UMTS cooperation agreement’] Frankfurter Allgemeine Zeitung, 22/
09/01.
100 Rewe-Zentral AG v Bundesmonopolverwaltung für Branntwein [1979] ECR 649.
while encouraging states to adopt rules of minimum necessary burden on business or others
(because of the threat that such regulatory clients might shift their business
elsewhere). A recent analysis suggests that the choice between competition and
coordination is a false one both in practice and normatively and that what we are
likely to see is elements of competition (for example between institutions) emerging
in domains that are notionally coordinated and vice-versa. Thus it is better to talk of
‘regulatory co-opetition’, a hierarchy/competition hybrid form of control, both as a
description of the phenomena and as a normative aspiration.101
Regulatory arbitrage is a well recognised phenomenon of Cyberspace, though
commentators reach different conclusions as to its significance.102 Cyber-
libertarians argue that regulatory arbitrage prevents hierarchical regulation of
Cyberspace. This is most clearly and famously put in David Johnson and David
Post’s seminal article, Law and Borders – The Rise of Law in Cyberspace.103 For
Johnson and Post the practical effect of regulatory arbitrage is that hierarchical
controls are rendered impotent. Netizens may choose to reject hierarchical controls
they find unpalatable by moving to another part of Cyberspace. As previously
outlined, Netizens may choose how they wish to be regulated much more freely
than citizens of physical jurisdictions. The only effective regulatory system
according to Cyberlibertarian theory is therefore one which is acceptable to all (or
the vast majority of) Netizens. Johnson and Post therefore suggest a bottom-up or
organic regulatory model. They envisage a self-regulatory governance system
along similar lines to that developed to regulate the domain name system. Lessig
disagrees with their conclusion. He agrees that Cyberspace is a separate space and
can be seen as a distinct jurisdiction. He disagrees though with the conclusion that
it is a jurisdiction which requires the organic development of regulatory regimes.
For Lessig, once you isolate Cyberspace as a distinct space you may use its unique
architecture to establish a hierarchical regulatory structure. The argument of the
Cyber-paternalists is therefore that once a recognised regulator emerges in any
given activity they may impose regulatory regimes on Netizens through the unique
man-made architecture of the Web, its code.104
To the extent that regulatory arbitrage is a problem with new media generally,
and usage of the Internet in particular, it remains an open question to what extent
the balance between competition and coordination might be deployed to resolve
issues. For many commentators the nature of Internet technology makes regulatory
arbitrage inevitable and difficult to forestall, whatever may be desirable from a
policy point of view. It is the high mobility both of providers and users within Cyberspace which makes it difficult to envisage coordinative solutions.
101 D. Esty and D. Geradin, ‘Regulatory Co-Opetition’ in Esty and Geradin (eds), Regulatory
Competition and Economic Integration (Oxford: Oxford University Press, 2001).
102 See for example, M. Froomkin, ‘The Internet as a Source of Regulatory Arbitrage’ n 28 above; D.
Johnson and D. Post, ‘Law and Borders – The Rise of Law in Cyberspace’ (1996) 48 Stan L Rev
1367; L. Lessig, ‘Zones in Cyberspace’ (1996) 48 Stan L Rev 1403; P. Samuelson, ‘Five Challenges
for Regulating the Global Information Society’ in C. Marsden (ed), Regulating the Global
Information Society (London: Routledge, 2001). An interesting side effect of regulatory arbitrage is a
regression to the least interventionist standard in a given area. This can most clearly be seen in relation
to freedom of speech following the decision of the US Supreme Court in ACLU v Reno 117 S. Ct.
2329 (1997) where an online movement towards the US free speech standard may be detected. For
further discussion on this see D. Vick, ‘Exporting the First Amendment to Cyberspace: The Internet
and State Sovereignty’ in N. Morris and S. Waisbord (eds), Media and Globalisation: Why the State
Matters (Lanham: Rowman & Littlefield, 2001).
103 ibid. Online version available at < http://www.temple.edu/lawschool/dpost/Borders.html > (visited 4
January 2002).
104 n 2 above, passim. See also L. Eko, ‘Many Spiders, One Worldwide Web: Towards a Typology of
Internet Regulation’ (2001) 6 Communications Law and Policy 445.
For some
this is a strength militating against excessive control of Cyberspace. It was argued
that regulatory arbitrage acted as a (limited) check on stringent UK legislation
governing state monitoring of electronic communications generally in the
Regulation of Investigatory Powers Act 2000.105 Arguably any solutions here are
likely to be a product of cooperation and community-based controls involving both
governments and businesses rather than of co-ordination between governments, as
through the EU or the World Trade Organisation (WTO).
Hierarchy, competition and design
In addition to the use of hierarchical/community controls discussed earlier, ICANN
is also applying a design/competition-based hybrid in an attempt to alleviate the
pressure on the domain name system. As domain names are a man made rather
than natural phenomenon they do not have to be rationed in the manner of natural
resources such as bandwidth. Whereas governments cannot simply create
additional bandwidth to meet the demand of mobile phone operators,106 ICANN
hopes to solve the domain name problem by creating additional resources. To this
end on 16 November 2000 ICANN announced seven new top level domains.107 It is
the hope of ICANN that by creating competition in new, more specialised domains,
demand will be lowered in the oversubscribed .com domain and a solution will be
found to the scarcity problem. There has been profound disquiet about allegedly
anti-competitive outcomes from ICANN’s allocation of new top-level domain
(TLD) names. Thus the allocation of some of these new resources (notably .pro and
.info) has been made to organisations already controlling other key TLDs such as
.com, .net and .org. The refusal to create other new top-level domain names (for
example .xxx for pornography) has been criticised for inhibiting design-based
controls over access or exploitation of particular sites. The solution to these
problems posited by one key critique is to open the domain name market to greater
competition between assignment organisations and use competition as a key form
of control.108 It may, though, already be too late for any competition-based
approach to work in relation to domain names. The scarcity problem appears to be
restricted to the .com TLD. As discussed
previously there are already a large number of alternative top-level domains
available.
Previous attempts to turn country code TLDs into generic TLDs have not
released useful domain names. The most concerted effort has been in relation to the
.ws (Western Samoa) domain, which is being promoted as a ‘World Site’ domain.
105 The House of Lords debated this possibility at some length at the Report Stage of the Bill (HL Deb
vol 615 cols 381–388; cols 400–452, 13 July 2000). See in particular the debate on Amendment No 64
at col 408–418.
106 Lessig offers an alternative solution to the problem of undersupply of bandwidth. Although
acknowledging supply of radiocommunications bandwidth is naturally limited he suggests we are
extremely wasteful of the resource available. He rejects the use of beauty contests or auctions to
propertise bandwidth as outlined above and suggests instead a design solution allowing a more
efficient use of bandwidth as a free or common resource. Lessig, n 1 above, ch 12.
107 They are .aero, .biz, .coop, .info, .museum, .name and .pro. For further details on these names
including who may apply for a name within these new domains see < http://www.icann.org/tlds/>.
108 M. Mueller, ‘Domains Without Frontiers’ (2001) Info 97, 99. See also M. Froomkin, ‘Is ICANN’s
New Generation of Internet Domain Name Selection Process Thwarting Competition?’ Presentation
to US House of Representatives Committee on Energy & Commerce, Subcommittee on
Telecommunications, February 8, 2001. Available online at < http://personal.law.miami.edu/
~froomkin/articles/commerce8Jan2001.htm > (visited 4 January 2002).
In many instances holders of current generic TLDs simply replicated their
registration in the new domain. There is little evidence that the creation of
manufactured additional resources deals with this particular scarce resource
problem. The availability of these alternatives has not encouraged sufficient
competition to affect the base of the .com domain. The relevant market appears,
therefore, not to be the market in TLDs as a whole, or even generic TLDs, but is
restricted to the .com TLD. The .com TLD is, it appears, too well established to be
affected by the creation of alternative domains. The creation of such alternatives
does not appear to introduce competition within the relevant market; it merely
creates alternative markets in which replication of registration occurs.
The only possible methodology which would appear to provide for a functioning
competition-based solution to the .com problem would be to increase the
marketability of competing TLDs. The current ICANN policy is for the creation
of alternative TLDs which they expect will increase in marketability through the
efforts of the registrars who deal in such names. They are relying upon a free
market rhetoric which states that those with saleable assets will work to increase
the marketability of their asset through advertising and marketing. ICANN believes
that the domain name system is thus a free market in which demand may be created
in new products through advertising and marketing. Unfortunately the free market
rhetoric does not apply to domain names in this manner. They are more than
simply saleable assets. Firstly valuable domain names are, in many cases, a
reflection of currently held trademarks. As has been previously alluded to, the
creation of alternative TLDs fails to release alternative resources due to replication
of registrations by current holders of trademarks and valuable .com domain names
to prevent any risk of cybersquatting. Secondly all domain names are a method of
indexing information and navigation. Thus they are streetnames not just
marketable assets. And as with all other communities the Web has its desirable
areas and its undesirable areas. In this virtual community .com is the business and
financial district. It is the Web’s equivalent to the City of London, Wall Street or
Rodeo Drive. And just as businesses in the real world will pay a premium for such
addresses, so the focal point for competition in relation to domain names will
remain in the .com domain. Due to these problems the scarcity issue in relation to
domain names may be as ingrained as the bandwidth problem in relation to
telecommunications and a more radical solution may be required in the future.
Other forms of control
The emphasis of current thinking on alternatives to hierarchical control is largely
focused on linking hierarchy to competition or to community-based methods of
control. This focus largely excludes two major classes of forms of control, one
defined in terms of excluding hierarchy and the other defined in terms of including
design.
Design-based regulation
A key example which is located in both sets (employing design and excluding
hierarchy) is the use of regional management codes by DVD producers and
equipment manufacturers. Producers and equipment manufacturers have
collaborated in a regional coding system which allows for market segmentation
within the DVD industry. Regional coding was developed to permit studios to
control the home release of movies within different geographical regions allowing
the staggering of cinematic releases.109 The studios required that DVD software
included a simple code that could be used to prevent playback of certain
discs in certain geographical regions. The equipment manufacturers assisted by
producing region-specific DVD players, each player being given a code for the
region in which it is sold. The player will refuse to play discs that are not encoded
for that region. This means that discs bought in one country may not play on
players bought in another country. The addition of regional management codes is
entirely optional for the maker of a disc; discs without codes will play on any
player in any country. These codes should not be confused with the DVD Content
Scramble System, discussed below, which acts as a copy-control measure.
Regional management codes are not an encryption system; they are merely one
byte of information on the disc, which denotes one of eight different DVD
regions.110 Thus an encoded DVD bought in the US will not be viewable on a
European DVD player. There is no hierarchical element to this. Customers are not
prevented by contract or any other laws from buying DVDs in other countries. The
control is effected by features of the diverse product standards which make a DVD
useless when paired with a player with a different coding.
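How little machinery this design-based control needs can be seen from a short sketch of a region check. It is a schematic toy only: the treatment of the region byte as a bitmask of permitted regions, and the function names, are our own illustrative assumptions rather than the actual DVD specification.

# Toy model of DVD regional management codes (illustration only; the real
# specification encodes the region byte differently in detail).

REGION_ALL = 0xFF  # a disc carrying no regional code plays everywhere

def disc_region_mask(*regions):
    """Build a one-byte mask of the regions (1-8) in which playback is permitted."""
    mask = 0
    for region in regions:
        mask |= 1 << (region - 1)
    return mask

def player_allows(disc_mask, player_region):
    """A player refuses discs whose mask does not include its own region."""
    return bool(disc_mask & (1 << (player_region - 1)))

if __name__ == "__main__":
    us_disc = disc_region_mask(1)        # Region 1: USA and Canada
    print(player_allows(us_disc, 2))     # False: a Region 2 (European) player refuses it
    print(player_allows(REGION_ALL, 2))  # True: an uncoded disc plays anywhere

The control, in other words, lies entirely in the product standard: no contract, statute or sanction is needed for the disc to be useless in a foreign player.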
Including design
A related example is the use of a hierarchy/design hybrid in an attempt to manage
the high levels of digital piracy which occur on the Web. Copy-control devices
have been employed by almost all copyright holders who trade in digital media.
These controls have met with varied degrees of success, but are supported not
only by industry groups such as the Motion Picture Association of America (MPAA)
and the Recording Industry Association of America (RIAA), but have also been
given the force of law through the actions of the World Intellectual Property
Organisation (WIPO)111 as enacted within the European Union by the Directive on
Certain Aspects of Copyright and Related Rights in the Information Society,112 and
in the United States through the Digital Millennium Copyright Act 1998.113 With
the legal support offered by these enactments several copy-control systems have
been developed and implemented by bodies representing copyright holders, mostly
against the wishes of the community at large. One such standard developed by the
MPAA for use on DVD releases is the Content Scramble System (CSS).
109 The coding also allows studios to sell exclusive distribution rights to a variety of foreign distributors.
See D. Marks and B. Turnbull, ‘Technical Protection Measures: The Intersection of Technology, Law
and Commercial Licences’ paper presented to World Intellectual Property Organisation Workshop on
Implementation Issues of the WIPO Copyright Treaty (WCT) and the WIPO Performances and
Phonograms Treaty (WPPT), December 6–7 1999, Geneva available at <http://www.wipo.org/eng/
meetings/1999/wct_wppt/pdf/imp99_3.pdf > (visited 7 January 2002). The European Commission is
currently investigating whether DVD producers are using the technology to illegally partition the
market. See Financial Times 11 June 2001.
110 These are as follows: Region 1 USA and Canada; Region 2 Japan, Europe and Middle East; Region 3
Southeast and East Asia; Region 4 Australasia, Central and South America and Caribbean; Region 5
Eastern Europe, India and Africa; Region 6 China ; Region 7 Reserved and currently unused; Region
8 Special Venues (Planes, Cruise Ships etc.)
111 Article 11 of the WIPO Copyright Treaty requires contracting parties to, ‘provide adequate legal
protection and effective legal remedies against the circumvention of effective technological measures
that are used by authors in connection with their exercise of rights under this Treaty or the Berne
Convention and that restrict acts, in respect of their works, which are not authorised by the authors
concerned or permitted by law.’
112 Directive 2001/29/EC, Art. 6.
113 §§ 1201(a)(1) and 1201 (a)(2).
CSS was developed by two hardware companies, Matsushita Electric and Toshiba, for the
motion picture industry and was adopted as an industry standard in 1996. The system
involves a dual-key encryption system which encrypts all sound and graphic files
contained on a DVD release. The files may be decrypted by the appropriate
decryption algorithm which is made up of a series of keys stored on both the DVD
and the DVD player. This means that only players and discs containing the
appropriate keys may decrypt the necessary files and play the movies stored on the
DVDs.114 The CSS system did not prevent direct copying of DVD discs: the
contents of a DVD (while encrypted) could be copied directly from one DVD to
another. CSS did, though, prevent the uploading of the contents of a DVD on to hard
disc or a web server. The concern of some users was that CSS systems were only
licensed for use on Macintosh and Windows-based operating systems (and for
dedicated DVD players). Users of open source operating systems such as GNU/
Linux could not play a CSS encoded DVD on their system. This led to a campaign
of civil disobedience leading to the development of a decryption code for CSS
which would allow the playing of CSS-encrypted DVDs on any platform. The CSS
code was a quite weak 40-bit encryption system and in September 1999 it was
successfully hacked independently by an anonymous German hacker and a
member of the ‘Drink or Die’ cracking community.115 This development meant
that CSS-encrypted DVDs could now be used on unlicensed DVD players and that
DVD material could be placed directly onto the Web. Such a development was an
obvious threat to the continued use of CSS by DVD producers. Action was taken
immediately in Norway, where Jon Johansen, who had been erroneously identified
as the author of DeCSS, was prosecuted, and in the United States, where Universal
Studios successfully obtained injunctions under the Digital Millennium Copyright
Act against several individuals who were distributing the DeCSS code from US-
based websites.116 The decision in this case has been extensively criticised by
many commentators, including Lessig, who argues that ‘DeCSS didn’t increase the
likelihood of piracy. All DeCSS did was (1) reveal how bad an existing encryption
system was; and (2) enable disks presumptively legally purchased to be played on
Linux (and other) computers’.117 Lessig is extremely critical of the use of law to
support these design controls, arguing that they create an ‘imbalance where
traditional rights are lost in the name of perfect control by content holders.’118
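The licensing logic on which the CSS dispute turns, namely that content decrypts only on players holding an approved key, can be sketched in a few lines. The toy below uses a trivial XOR ‘cipher’ purely to show the shape of a dual-key arrangement of the kind described above; it bears no relation to the real CSS algorithm, and every key, name and function is our own illustrative assumption.

import os

def xor_bytes(data, key):
    """Toy 'cipher': XOR the data with a repeating key (illustration only)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Keys held by licensed players (in reality issued under the CSS licence).
LICENSED_PLAYER_KEYS = {"living_room_player": os.urandom(5)}

def author_disc(content, player_keys):
    """Encrypt content under a per-disc key; store that key wrapped by each player key."""
    disc_key = os.urandom(5)
    return {
        "payload": xor_bytes(content, disc_key),
        "wrapped_disc_keys": {name: xor_bytes(disc_key, pk) for name, pk in player_keys.items()},
    }

def play(disc, player_name, player_key):
    """A player can recover the disc key, and hence the content, only if licensed."""
    wrapped = disc["wrapped_disc_keys"].get(player_name)
    if wrapped is None:
        raise PermissionError("player not licensed for this disc")
    disc_key = xor_bytes(wrapped, player_key)
    return xor_bytes(disc["payload"], disc_key)

if __name__ == "__main__":
    disc = author_disc(b"movie data", LICENSED_PLAYER_KEYS)
    print(play(disc, "living_room_player", LICENSED_PLAYER_KEYS["living_room_player"]))

The point of the sketch is simply that the control sits in who holds the licensed player keys; a tool such as DeCSS removes the need for such a key, which is why the industry fell back on hierarchical legal protection.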
This view, taken by Lessig in his new book The Future of Ideas, may, though, prove
to be unduly pessimistic. There is as yet no evidence of content holders attaining the
perfect control he fears in Cyberspace. Indeed the victory of Universal Studios and
the MPAA has proved to date to be Pyrrhic. As is often the case in Cyberspace, when
hierarchical/design controls are used to regulate the community at large the
community will rally in an attempt to defeat the regulatory control mechanisms. The
DeCSS code may currently be obtained from any one of hundreds of websites which remain out of the reach of the US authorities.119
114 For more detail on CSS see the opinion of Judge Kaplan in Universal Studios Inc v Reimerdes et al
111 F.Supp 2d 294 (2000). Affirmed Universal Studios Inc v Corley et al 28 November 2001, Second
Circuit Court of Appeals Docket No. 00-9185 Available at < http://eon.law.harvard.edu/openlaw/
DVD/NY/appeals/opinion.pdf > (visited 4 January 2002).
115 The media wrongly attributed the development of DeCSS to a fifteen-year-old Norwegian, Jon
Johansen. Although Mr. Johansen was a member of the ‘Masters of Reverse Engineering’ community
which released DeCSS he was not the author of the program. This is made clear in a text file which
accompanied the release of the program. The text file is available at < http://www.lemuria.org/
DeCSS/dvdtruth.txt > (visited 4 January 2002).
116 Universal Studios Inc v Reimerdes et al, n 114 above.
117 Lessig, n 1 above, 189. See further 187–190.
118 ibid 200.
Currently the producers of DVD
titles and the hacking community are involved in a war of code. The motion picture
industry has updated the CSS code, which means the DeCSS code no longer decrypts
the latest DVD releases. This has simply encouraged hackers to produce new, more
powerful, second-generation decryption codes such as DVD-Decrypter. Both parties
continue to battle for the control of DVD encryption/decryption codes. The producers
of DVD titles and the community at large are both using design tools to attempt to
protect their position. The producers presently have the advantage, due primarily to a
weakness of current technology: the lack of widely available
broadband access prevents distribution of decrypted movie data over the Web, and so
the producers hold the upper hand. As distribution technology improves, the movie
industry may find that its design solutions cannot effectively function without
either the support of the community at large or far greater reliance upon the
hierarchical control elements introduced by the Digital Millennium Copyright Act
and the Directive on Copyright and Related Rights in the Information Society.
Producers of DVDs will need to decide within the next few years whether they wish
to rely on a hierarchy/design hybrid or a community/design hybrid.
120
Excluding hierarchy
A successful example of a community using design tools to effect a regulatory
scheme is the community-based approach to protecting children in Cyberspace. As
discussed above the anonymity problem raises two distinct dangers for minors in
Cyberspace. One is that they gain access to materials which are unsuitable for
minors and the other is that adults take advantage of anonymity to forge improper
relationships with minors. Hierarchical controls fail to remedy these problems but
a community-based solution has proved extremely successful, especially when
linked with design-based solutions. Within organised cybercommunities children
may be supervised by the community. Communities such as AOL encourage
family membership where parents register the details of the family as a whole and
each individual member has their own password. Unless the child were to
compromise an adult password, their status can be made known to the
community, and the community can supervise and protect the child while he or she is
online. Children cannot be watched all the time and the community cannot take
over all parenting responsibilities. To assist, additional design-based tools may be
used. In addition to the community supervision, parents may employ software
solutions such as CYBERsitter and Net Nanny. These products allow parents to set
acceptable parameters for their children when in Cyberspace.121
119 The website operated by Shawn Reimerdes (one of the defendants in the MPAA action) contains the
following advice: ‘A Federal Judge removes this link by court order! We are fighting for the right to put
this link back up for you! I am not allowed to have this decryption information anymore, so I will just tell
you the obvious: Go to your favorite search engine and enter ‘DeCSS’. You will find one of thousands of
websites that has decided to post this information’. Doing so will allow you to locate sites such as the
DeCSS mirror site at <http://heavymusic.8m.com/ > (visited 20 December 2001); Download.com
< http://www.download.cnet.com/ > (visited 4 January 2002) and < http://www.lemuria.org> (visited 4
January 2002) all of whom currently have the DeCSS program available for downloading.
120 See further Lessig, n 1 above, chapter 11.
121 Such software programs are alternatively called Filterware or Censorware (depending very much upon
your political viewpoint). Many programs such as the ones listed use stand-alone value judgements to
categorise websites based on their content. The software provider will review sites and will put them on
either an ‘allowed’ or a ‘not allowed’ list. Other programs rely upon the Platform for Internet Content
Selection (PICS), a standardised industry system which allows content to be rated in various categories
including: topics such as ‘sexual content’, ‘race’, and ‘privacy’, under the control of the user.
Combined, the role of the community and the security provided by these products appear to
provide a relatively successful solution to the access problem.
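The kind of design-based parameter-setting that products such as CYBERsitter and Net Nanny offer can be reduced to a very small filter. The sketch below is a generic illustration of an allow/deny list applied to a child's account; the account model, list contents and function names are our own assumptions and do not describe any particular product.

from urllib.parse import urlparse

# Per-account parameters a parent might set (illustrative values only).
ACCOUNT_POLICIES = {
    "parent": {"filtered": False, "denied": set(), "allowed": set()},
    "child":  {"filtered": True,
               "denied": {"example-casino.com"},
               "allowed": {"example-homework.org", "example-kids.net"}},
}

def may_visit(account, url):
    """Return True if the account's policy permits fetching this URL."""
    policy = ACCOUNT_POLICIES[account]
    if not policy["filtered"]:
        return True                      # unfiltered (adult) account
    host = urlparse(url).hostname or ""
    if host in policy["denied"]:
        return False                     # explicitly blocked site
    return host in policy["allowed"]     # default-deny for filtered accounts

if __name__ == "__main__":
    print(may_visit("child", "http://example-homework.org/essay"))  # True
    print(may_visit("child", "http://example-casino.com/"))         # False
    print(may_visit("parent", "http://example-casino.com/"))        # True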
Conclusions
New and unpredictable configurations of power are among the hallmarks of the
new media. It is not surprising that the problem of control has attracted such a high
degree of interest among scholars. Not only are there interesting problems of
designing regimes to provide appropriate constraints on undesirable activities,
there are also challenges in securing the maximum benefit to the community of
new technologies such as the Internet and 3G mobile (each of which is said to be
subject to ‘network effects’ such that the more users there are, the greater the benefit to
the community generally). The new media phenomena present scholars with at
least two temptations. One is to overstate the novelty of the problems presented,
with a consequent tendency to reject ‘old’ forms of control.122 The second is to
overstate the extent to which the media themselves ‘hardwire’ or constrain the
possible means of addressing the problems. Both tendencies are prevalent in
analyses of the control problem as it applies to the Internet.
The alternative, which we have argued for, is to locate problems of controlling
the new media squarely within well established analyses of problems of regulatory
control. Such analysis encourages us to look at the mechanisms of control which
already subsist within the target system and to find ways to stimulate or steer those
indigenous mechanisms towards meeting the public interest objectives of
regulation. Thus a central role for hierarchy is to steer systems which involve
other forms of control based in community, market or design (or some combination
thereof). This does not exclude the possibility that effective control may occur
through competition, design or community, together or separately, without
hierarchical involvement.
A key challenge presented by such novel governance mechanisms is how to
deploy them in such a way that they are perceived as legitimate. The legitimacy of
democratic government is linked to processes of representation and open decision
making. Though other governance mechanisms may be legitimated in similar ways,
in many cases it will be alternative process elements and/or outcomes which
are more important in generating legitimacy. Judgements on the appropriate
balance between democratic and other forms of legitimation are likely to differ
within different political cultures. This is evidenced in markedly different
responses in Europe and the United States to the creation of ICANN. For some
it represents an unacceptable delegation of government authority to a private
body.123 For others it is an efficient technical solution to a pressing problem, even
if its decision making is not wholly technical. A key challenge in deploying ideas
about the mixture of control forms advanced in this article is to balance these twin
concerns about efficiency and legitimacy. The conditions for achieving an
acceptable balance are likely to vary in different places and different times.
122 A useful early consideration of the ‘newness’ issue in Cyberspace is I. Trotter Hardy, ‘The Proper
Legal Regime for “Cyberspace”’ (1994) 55 University of Pittsburgh Law Review 993. More recently
see M. Price, ‘The Newness of New Technology’ (2001) 22 Cardozo Law Review 1885.
123 Froomkin, ‘Wrong Turn in Cyberspace’ n 4 above; J. Weinberg, ‘Geeks and Greeks’ (2001) 3 Info 313;
cf R. Marlin-Bennett, ‘ICANN and Democracy: Contradictions and Possibilities’ (2001) 3 Info 299.
