
THE ADAPTIVE MOMENT: A FRESH APPROACH TO CONVERGENT MEDIA IN AUSTRALIA

Associate Professor Kate Crawford, Journalism and Media Research Centre, UNSW
Professor Catharine Lumby, Journalism and Media Research Centre, UNSW

Author Details
Associate Professor Kate Crawford is the Deputy Director of the Journalism and Media Research Centre at the University of New South Wales. She is an internationally recognised researcher of internet technologies, and recently conducted Australia's largest study of mobile and social media use by 18-30 year olds, funded by the Australian Research Council. Catharine Lumby is Professor of Journalism and Director of the Journalism and Media Research Centre at the University of New South Wales. She is the author of seven books and numerous book chapters and journal articles, and is an international expert on media and gender studies. She has been awarded five Australian Research Council grants and was a member of the Advertising Standards Board.

Acknowledgements
James West and Hannah Withers provided invaluable research assistance in the preparation of this report and we would like to thank them for their careful work. We also thank Peter Coroneos, Chief Executive of the Internet Industry Association; Peter Leonard, a partner at Gilbert + Tobin; David Simon, former member of the Classification Board; and Dr Peter Chen of the University of Sydney for contributing their expert advice in the preparation of this report. We also acknowledge Google Australia, which provided a contribution towards the research funding for this report.

Table of Contents

Author Details
Acknowledgements
Table of Contents
Executive Summary
Introduction
Section 1: The State of Play
  1.1 The Rise of Convergent Media
  1.2 More, Faster: Australia Over The Next Five Years
  1.3 Regulating The Convergent Environment: The Current Picture
    Regulatory Inconsistencies
    Case Study 1: Facebook and the Queensland Government
    Case Study 2: Games, Online And Offline
  Conclusion
Section 2: The International Arena
  2.1. International Comparisons
    2.1.1. Japan's Safer Internet Program
      Deregulation & the Need For a Unifying Act
      Regulating Content
      Aims of the Japan Safer Internet Program
      Content Regulation = Self-Regulation
      Poster and Slogan Competition: Case Study
      Response From Industry: NTT DoCoMo
    2.1.2. New Zealand's Digital Strategy
      Background
      Confidence: An Integrated Response
      Case Study: Hector's World
      Content and Access Regulation: An Opt-in Filter
    2.1.3. Digital Britain
      Digital Britain & Digital Economy Bill 2009-2010
      UK Digital Content Self-Regulation
      Other Internet Content Regulation
      Mobile Content Self-Regulation
      Case Study: Click Clever Click Safe
  2.2. Regional and Transnational Efforts
    2.2.1. The European Union
      The European Union Audiovisual Media Services Directive (AVMSD)
    2.2.2. The Internet Governance Forum (IGF)
    2.2.3. The Organisation for Economic Co-operation and Development (OECD)
  Conclusion
Section 3: Towards A New Policy Framework
  Introduction
  3.1. Clear Objectives And Guiding Principles
    3.1.1 Who Is Responsible For Managing Convergent Media Content?
      Government
      Industry
      The End-User
      Which Mode of Governance is Preferable?
  3.2. Content & Classification
    3.2.1 Redefining Content
    3.2.2 Rethinking Classification
    3.2.3 Community Standards
  3.3. A Research-Led Approach
Section 4: Conclusion
  4.1. Recommendations
Appendix One: The History of Australia's Regulatory Environment

Executive Summary
The purpose of this report is to consider the following questions:

- How can Australia position itself to cope with the risks and opportunities presented by our current media environment?
- What principles would guarantee a flexible and balanced 21st century system of media content regulation in an evolving media ecology?
- What challenges lie ahead in adapting existing regulatory approaches to media silos - vertically defined media regulation for television, film, newspapers and radio - to a convergent media environment?
- What happens if we consider how content moves horizontally - from radio and television to podcasts and YouTube, to mobiles and tablet devices?
- What is the role of users in defining our media environment: from flagging problematic content, to shaping platforms, to contributing to media policy?
- What can Australia learn from international approaches to convergent media content regulation?
- What role should governments, industry and users play in media governance? How can we facilitate dialogue between nation-state governments, industry regulation, user communities and international laws?

This report visits these issues in detail through a consideration of the history of online media governance, a comparison of international approaches, a series of case studies that highlight current challenges in managing online content, and a consideration of where the regulatory balance should fall between government, industry and user communities. In considering these issues, we acknowledge that the current media environment poses extraordinary new challenges for governments, industry stakeholders and media users. While we canvass a broad range of issues and approaches to managing these challenges, we recognise the complexity of adapting existing regulatory approaches and of ensuring that people are given information and resources to enable them to navigate this new media landscape.

In preparing this report we welcome the announcement of the Federal Government's Convergence Review by the Minister for Broadband, Communications and the Digital Economy, Senator Stephen Conroy. We present our findings as a research-based contribution to this process and its remit: to take a fresh look at Australia's existing regulatory frameworks with a view to modernising them.1

The communications sector in Australia now reaches across an unprecedented array of sectors in the private and public spheres. Communications technologies are the backbone of our health, education, government, finance and culture sectors. The information revolution is critical to Australia's economic competitiveness and ongoing social and cultural development. Yet Australia's laws have not kept up with this technological evolution or with the changes in the diverse modes of media production and consumption.

1. http://www.minister.dbcde.gov.au/media/media_releases/2010/115

Australia is moving away from the legacies of a vertical media environment, in which different networks such as telephone networks and radio networks were regulated and operated for clearly distinct purposes, to a convergent network environment. When we use the term convergence, we are referring to this collapse of borders between various media silos, where content can easily move horizontally across platforms. In this new horizontal environment, it is critical to pay attention to the different roles played by networks, platforms and content providers. Different regulatory and governance solutions need to be examined in each instance, rather than bundling together old media legislation.

At the network layer, we argue, policy makers should focus on ensuring network openness, innovation and user choice. At the platform and content provider layers, government should work with industry and users, including in global fora, to encourage self-regulation while facilitating referral of genuinely disturbing material to national and international government regulatory instruments and agents. Community education about internet use, online security and legal obligations should be a priority in this area. There needs to be an ongoing commitment to researching international approaches, emerging tools and community expectations.

Concerns about the production and distribution of harmful or criminal material in the convergent media environment have dominated much of the public debate about new technologies and platforms. This is understandable given that protecting the community, and particularly children, from inappropriate material has been and continues to be a core principle of media regulation and content management. This focus on harmful material, however, has often come at the cost of a broader appreciation of the benefits of the convergent media environment. The influential UK Byron Review2 noted that public policy and regulation that is genuinely and empirically grounded in an ethic of care for children and young people will fail if it relies too heavily on a simplistic block and control strategy. In Australia, the focus of debate on internet filtering has come at the cost of thinking about the wider media environment, leading us to ignore broader issues of care and user participation. In reconceptualising media content management and regulation it is critical to recognise that our networked media era offers unparalleled opportunities for innovation, entrepreneurship and the growth of knowledge, and that it has the potential to extend these opportunities to all Australians.

Traditionally, Australian media content regulation has worked in a top-down manner. Governments regulate or require media content providers to cooperate with them in co-regulatory or self-regulatory schemes. In this framework everyday media users had only a small capacity for direct input through complaints mechanisms or through the judicial system. This framework was based on a model of media audiences as largely passive bodies of consumers, with little need to interact with content producers or regulatory authorities. In contrast to this model, contemporary media users are not just consumers: they are highly active, and are often media producers and distributors. Within social networking services that host extraordinary quantities of data, it is users who are the most likely to identify offensive material and to notify the relevant host or government agency. In this report we suggest that it is critical to see government, industry and media users as key stakeholders who must work together in the future governance of media content. By cooperating, the three groups can increase opportunities for the identification of truly harmful material and the enforcement of criminal law. They can share the responsibility of governance of media content in an era where the sheer volume of material outstrips the capacity of any government or corporation to pre-vet all material.

2. Byron Review. 2008. Safer Children in a Digital World. The Report of the Byron Review: Department for Children, Schools and Families, and the Department for Culture, Media and Sport.

Convergent media governance must take the full spectrum of stakeholders into account: from the end-user to the parent, from the school to the wider community, to industry and government. A key plank of this cooperation is the need for government and industry to educate consumers and provide them with resources to work in online communities to identify problematic content and to notify relevant organisations or authorities. Media literacy is vital. Education about opportunities and risks online is a particularly critical component of any strategy that aims to protect children, as well as maximise the potential for innovation and creative engagement.

At a global level, it is important that Australia, as a robust democracy, inclusive society and cultural innovator, takes a leading role in furthering international cooperation between industry, governments and media user communities. The convergent media environment describes a globally networked media ecology in which no single national government or industry group can work alone to manage or regulate content. This report explores models for international cooperation, which we argue will be central to future governance schemes.

In the convergent media environment, traditional media platforms, content and audiences now coexist with their new media counterparts. It is an environment marked by an unprecedented diversity of users and usage, which spans the spectrum from amateur videos made for friends and family to high-end professional content produced for international consumer markets, all circulating on the same platforms. This diversity does not render the core values that have underpinned Australia's approach to managing and regulating media content redundant. On the contrary, we argue that it is time to think carefully about how those core values can inform a set of principles and approaches to content management in the 21st century: that we need a fresh and adaptive approach to ensure that we balance the opportunities and complexities posed by the convergent media era. We argue that these ten key principles should underpin a new system of media content regulation in Australia:

1. An adaptive approach. We are part of a rapidly evolving media landscape, where media are not silos but are closely intertwined. Our policy frameworks have not kept pace. It is essential that Australia conducts a first principles review of media content policy and develops a flexible system.

2. Working together. The regulation and management of convergent media is best done by industry, government and end-users sharing approaches and concerns and acting collaboratively.

3. Recognising layers. In the convergent environment, there are distinct layers: the infrastructure level of the networks, then platforms and content. Governance solutions need to be attuned to the different issues at each level, while keeping the network layer open and interoperable.

4. Rethinking content. Content is not media-specific; it moves fluidly through multiple spaces, being repurposed and recirculated. Frameworks for acceptable content are more effective and have fewer chilling effects than network-level filters.

5. The importance of users. Public policy and law must recognise the critical role that user communities now play as participants and creators of online environments, as well as in notifying industry and government of offensive content.

6. Consistency. All states and territories should have a uniform approach to the sale, distribution and possession of prohibited or restricted content.

7. A new Classification Scheme. We strongly support the review of the Classification Scheme. It is time for a purpose-built system for a 21st century media environment, relying on comprehensive and empirical research into community standards in relation to media use.

8. Media literacy. A national plan needs to be developed, based on empirical research, to foster digital literacy. Industry, government and community members should participate in the formulation of a plan, which will include community education about safety and security on the internet.

9. Committing to the big picture. A broad set of national principles needs to be developed that supports technology neutrality, a commitment to the free flow of information, the protection of vulnerable users, and an innovative digital economy.

10. An international perspective. Australia needs to be an active participant working with other governments and industry within international fora on media governance issues.

Introduction
The historical principles, relevant to the scope of this report, that have traditionally informed media content regulation in Australia include:

- The need to promote self-regulation and competition in the communications sector while protecting consumers and other users.3
- Ensuring that adults are able to read, hear and see what they want while minors are protected from material likely to harm or disturb them.4
- The public interest in ensuring that news and current affairs media provide balanced, objective and accurate information.5
- Regard for community standards in relation to material that condones or incites violence, particularly sexual violence, or portrays people in a demeaning manner.6
- The need to ensure a balance between local and international media content and to ensure culturally diverse content.7
- The protection of privacy.8
- The need to balance the right to free speech with the responsibility of government to protect citizens from content deemed harmful.9
- The protection of national security.10

The Productivity Commission's report on broadcasting contains a clear summary of the values that underpin media content regulation and management in Australia:

Important social and cultural objectives of broadcasting policy include ensuring diversity of sources of information and opinion, adequate levels of Australian content and appropriate program standards. Freedom of expression is also important and should be added to the objectives of the Broadcasting Services Act 1992. Diversity of sources of information and opinion is most likely to be served by diversity in ownership of media companies, and by competition.11

The report also notes that controlling the potentially harmful consequences of media influence must be weighed against the benefits of independent and open media in a democratic society, and that as new media proliferate and media organisations converge with other businesses, regulatory restrictions on freedom of expression will have an increasingly important place in media law.

Media content is currently regulated through a combination of direct regulation (laws, government regulatory bodies and licences), co-regulation (industry-based codes of practice with government approval and potential sanctions) and self-regulation (industry-endorsed codes of practice). This model privileges two actors: government and industry.
3. About Communications and Media Regulation, The Australian Communications and Media Authority, www.acma.gov.au/WEB/STANDARD/pc=PUB_REG_ABOUT_BCAST
4. Classification (Publications, Films and Computer Games) Act 1995.
5. See industry codes of practice at The Australian Communications and Media Authority, http://www.acma.gov.au/WEB/STANDARD/pc=IND_REG_CODES_
6. The National Classification Code.
7. See The Future for Local Content? Options for emerging technologies, 2001, Australian Broadcasting Authority, available at http://www.acma.gov.au/webwr/_assets/main/lib100068/future_local_cont.pdf, viewed 25/03/11.
8. Productivity Commission, 2000, Report into Broadcasting, available http://www.pc.gov.au/projects/inquiry/broadcasting/docs/finalreport, viewed 15/4/10.
9. Ibid.
10. See Productivity Commission, 2000, Report into Broadcasting, available http://www.pc.gov.au/projects/inquiry/broadcasting/docs/finalreport, viewed 15/4/10, pp. 332-333; Robert Albon and Franco Papandrea, Media Regulation in Australia and the Public Interest, November 1998, Current Issues.
11. Ibid.

It reflects an era in which media consumption was dominated by the production of messages by the few for the many. While media users have historically had some role in media content regulation in Australia through the capacity to notify regulatory agencies or media organisations of their concerns about content, their role in regulation has been limited. In a media environment characterised by the rapid growth of online and mobile media, in which media users are often media producers and the distinction between these activities is increasingly blurred, the potential role of media users in regulation assumes greater importance than it has ever historically been accorded.

It is an era in which the one-to-many model of media content production and distribution has been fundamentally and permanently altered. In the space of a decade, our media environment has transformed into one where average Australians actively produce and distribute their own media content through blogs, social networking sites, and videos uploaded to platforms such as Facebook and YouTube. The technological means to produce sophisticated media content have been domesticated, and the distribution channels are also growing. We live in a rapidly evolving media environment, one in which technological and business models are volatile. It is apparent that industry and government stakeholders are still coming to economic, cultural and political terms with this global media environment. Users are increasingly challenging existing models of consumption and production, and of regulation. This rapid evolution requires us to think carefully and critically about the focus and scope of media content governance.

An important example here is the issue of ensuring a balance between local and international media content. Historically, local media content was promoted and protected by systems that sought to find a balance between industry development and commercial considerations. As the Productivity Commission noted in a 2000 report on broadcasting policy, 'barriers to entry are balanced against programming obligations'.12 In the convergent media environment, both the sources of media content and the means of distribution and consumption are multiplying exponentially. While there remains an important role for governments to play in promoting and assisting the development of local media content and content creation, it is clear that the conventional frameworks for addressing concerns about maintaining the profile and diversity of local content do not automatically apply to online spaces.

This report aims to put the convergent environment into context and to generate broad principles to guide the necessarily more detailed debates about the applicability of the historical principles which have underpinned media content policy in the past. At present, Australian approaches to internet content regulation are still based largely on traditional media regulatory models and assumptions. The internet is not a new form of media: it is a new media environment in which media users have unprecedented agency in consumption and production. We note that, unlike in other nations and transnational groupings, notably the UK and the EU, the risks and opportunities of online media consumption and production have not, to date, been the subject of broad, rigorous empirical research to guide government in its approach to media content governance and management.

12. Productivity Commission (2000), Broadcasting, Report no. 11, Canberra: Ausinfo, p. 254.

Section 1: The State of Play


1.1 The Rise of Convergent Media
Australia possesses an advanced digital economy. Australians are connecting, sharing, locating and creating in the digital world more than ever before. There are 8.4 million internet subscribers in Australia (the vast majority on broadband), and 24.22 million mobile phone subscriptions (up over 2 million from 2008). Over half of these are 3G subscribers.13 These mobile adoption rates beat the UK and US; Japan leads Australia. The current plan for the National Broadband Network is to connect 93% of Australians to fibre-based internet services.14 A high-speed, convergent media landscape is rich with potential, but it also raises new questions about regulation, innovation, diversity of voices, user literacy and community standards. Recognising these issues, the Australian government is conducting a Convergence Review in 2011 to consider how media technologies have changed since the 1990s, when many of the current communications regulations were established.15

From the outset, four critical observations can be made.

First, the increased functionality of smart mobile phones and broadband mobile internet allows individuals to access the web wherever there is reception. In April 2010 an ACMA report found that as many as 40% of mobile phone subscribers were using the internet on their phones,16 with this figure now rising to 50% according to a new Nielsen study.17

Second, the one-to-many model of traditional broadcasting is being challenged by the internet and mobiles: audiences do not merely passively receive content, they may also be directly commenting on, altering or adapting mainstream media content.

Third, social networking has increased the volume of online transactions engaged in by Australians. As we gain more experience online, we are participating more in the online world. Australians are signing up to faster internet deals, spending more time online, and using that time more intensively in order to stay connected, maintain relationships and conduct business. According to the latest Nielsen online ratings, 73% of Australians are now regularly engaged in activities on social networking sites.18 Facebook retains its dominance of the online world, with three-quarters of all Australian internet users reporting that they had visited the site.19 However, the use of micro-blogging site Twitter also surged 400 per cent in 2009. Nearly one quarter of Australians claimed to have read a tweet that year.20 The figures paint a clear picture of Australians' desire to connect, share and create online.

Last, but not least, the proliferation of platforms, tools and high-speed access has resulted in many Australians uploading content to a wide range of online spaces. This can include photos (on services such as Flickr, Picasa and Instagram), video (YouTube, Vimeo), music (MySpace, SoundCloud) and documents (in online storage spaces such as Google Docs). Cloud computing, whether collaborating with others in online spaces or using distributed systems for delivering services online, is increasingly popular.

These four significant shifts feed each other. Social networking and content uploading have benefitted from the wide-scale adoption of internet-enabled smart phones. Sixty-five per cent of online Australians now own a phone capable of accessing the internet.21

13. ACMA (2009a), Communications Report 2008-09, pp 15-16.
14. See NBNCO: http://www.nbnco.com.au
15. Convergence Review Announcement, December 14, 2010. http://www.dbcde.gov.au/digital_economy/convergence_review
16. ACMA (2010), Communications Report 2009-2010, report 2, p.13
17. Nielsen (2011), Media Release: Nielsen's State of the Online Market: Evolution or Revolution?, http://au.nielsen.com/site/documents/AustralianOnlineConsumersReportMediaRelease.pdf, viewed 23 April 2011.
18. The Nielsen Company (2011), Nielsen's State of the Online Market: Evolution and Revolution?, http://au.nielsen.com/site/documents/AustralianOnlineConsumersReportMediaRelease.pdf, viewed 23 April 2011.
19. The Nielsen Company (2010b), Australia Getting More Social Online as Facebook Leads and Twitter Grows, March 23, available http://blog.nielsen.com/nielsenwire/global/australia-getting-more-social-online-as-facebook-leads-and-twitter-grows/, viewed 3/24/10.
20. The Nielsen Company (2010c), Nine million Australians now interacting via social media sites, Media Release, 15 March 2010, available http://www.nielsen-online.com/pr/social_media_report-mar10.pdf, viewed 3/23/10.

Relaxed cap plans offered by service providers have seen mobile social networking soar in recent years. A quarter of self-described social networkers now do so on their phones, as well as on their home and work computers.22 Many online spaces have applications (apps) specifically designed for mobile phones. Facebook's iPhone app, for example, uses the iPhone's camera and gallery, enabling users to snap a picture and upload it to their wall for friends to instantly see and comment on, on the run. A similar app exists for YouTube.

Mobile phones, in concert with social media services, have acted as windows onto a range of key international crisis events, including protests, natural disasters and terrorist attacks. This is an example of what researchers Axel Bruns and Mark Bahnisch call the hyperlocal benefits of social media trumping professional content makers:

[...] local participation may be harnessed to report on local events or record local histories, or to capture local insider knowledge which is available only to long-standing members of the (offline) local community. Here, particularly, there is also an important role for the use of mobile devices to capture such information on the spot and virtually in real time - such uses range from the use of Flickr or Twitter to report high-profile events such as the 2005 London bombs or the 2008 Mumbai attack through to comparatively more mundane activities such as sharing information about traffic jams, potholes, restaurants, or travel destinations.23

The functionality of mobile devices will continue to collapse categories of communication. An 'always on' culture of mobile internet use in Japan, discussed in Section 2 of this report, has been described as creating an 'ambient virtual co-presence'.24 Ubiquity, always-on connectivity, context sensitivity and the central role of the user's identity make mobile internet devices powerful media tools. Ralph Schroeder argues that 'there will continue to be denser, more extensive, more time-consuming and more non-location-specific ties'.25 Australian researchers have analysed this 'always on' culture as a driver of online media cultures, finding that with the increase of mobile devices come opportunities for different gradations of social connectedness.26 Mobile phones have also created a geo-mobile web, where a massive trove of information about location is tagged to RSS feeds, web pages and comments on social media sites. This is transforming the relationship between data space and physical space, creating new ways for users to create a sense of place or belonging.27

21. The Nielsen Company (2011), Nielsen's State of the Online Market: Evolution and Revolution?, http://au.nielsen.com/site/documents/AustralianOnlineConsumersReportMediaRelease.pdf, viewed 23 April 2011.
22. The Nielsen Company (2010b), Australia Getting More Social Online as Facebook Leads and Twitter Grows, March 23, available http://blog.nielsen.com/nielsenwire/global/australia-getting-more-social-online-as-facebook-leads-and-twitter-grows/, viewed 3/24/10.
23. Bruns, A. & Bahnisch, M. (2009), Social Media: Tools for User-Generated Content: Social Drivers behind Growing Consumer Participation in User-Led Content Generation, Smart Services CRC, Volume 1 State of the Art, March 2009.
24. Ito, M. and Okabe, D. (2005), Technosocial situations: emergent structuring of mobile e-mail use. In Ito, M., Okabe, D. and Matsuda, M. (eds) Personal, Portable, Pedestrian: Mobile Phones in Everyday Life. Cambridge, MA: MIT Press, 447-51.
25. Schroeder, R. (2010), Mobile phones and the inexorable advance of multimodal connectedness, New Media & Society, 12: 75.
26. Michael Bittman, Judith E. Brown and Judy Wajcman (2009), The Cell Phone, Constant Connection and Time Scarcity in Australia, Social Indicators Research, Volume 93, Number 1, August 2009.
27. Crawford, K. & Goggin, G. (2009), Geomobile web: locative technologies and mobile media [Paper in special issue: Placing Mobile Communications. Lloyd, Clare; Rickard, Scott and Goggin, Gerard (eds).] Australian Journal of Communication, v.36, no.1, 2009: 97-109.

Traditional media companies are also changing how they offer content to their audiences. Newspapers such as The Australian and the Sydney Morning Herald are offering mobile and iPad-based content. Television networks are developing stronger links between web and mobile content and traditional programming, offering mobisodes and alternate reality games (such as the TV series Lost, which used an extensive alternate reality game between seasons). New intersections between broadcast and participatory media are emerging. For example, the ABC has established a service called Pool, a social media space where ABC employees and audiences can collaborate and co-create content.

In the background, Australians are changing the type and combination of devices that they use as a way of accommodating change. Australians are opting for multiple services, including traditional fixed phones, 3G, broadband and wireless (most adults use three forms of communications technology regularly).28 Some Australians are substituting old phone lines with mobile handsets, with subscribers to 3G networks jumping 44 per cent between 2008 and 2009. Young consumers are leading this charge, according to ACMA.29 The level of mobile uptake for Australians aged between 24 and 35 is 95 per cent, the highest in the country.30 Fixed telephone line use dropped over the same period, to just one in ten people over 14 subscribing. Moreover, as prices fall, companies are simultaneously offering more services.31 There is a growing, if tentative, use of VoIP, especially Skype. Two and a half million Australians accessed a VoIP service in the middle of 2009.32 Once the National Broadband Network is in place, this trend will markedly increase.33 This is the commencement of a wide-scale investment in a post-web internet environment, and the development of new technologies and social configurations.

1.2 More, Faster: Australia Over The Next Five Years


Important changes are in store for Australia over the next five years. Key factors include:

1. Future government investment in network infrastructure under the planned National Broadband Network.34
2. Government plans to free up spectrum and the subsequent growth in new services.
3. An increase in 3G or higher networks, giving faster access to the net via mobile devices.
4. Australia switching from analogue TV to full digital by 2013.
5. The delivery of TV over IP: research by the Australian Communications and Media Authority notes this is now growing rapidly.35

These developments will result in a wide range of innovation, both technical and cultural, potentially including: advanced cloud computing; faster video and VoIP conferencing (and a reduction of travel to and from face-to-face meetings, as well as the greening benefits of new telecommunications); reliable streaming media and live, interactive content; e-health initiatives such as remote check-ups and health records; and Web 3.0, or a highly personalised web experience.36 Andy Oram of O'Reilly Media, quoted in a recent Pew Research Centre report, The Future of The Internet, predicts a significant number of developments, including:
28. ACMA (2009), Convergence and Communications. Available at http://www.acma.gov.au/webwr/_assets/main/lib100068/convergence_comms_rep-1_household_consumers.pdf, viewed 3/24/10.
29. Ibid., p. 25.
30. Ibid.
31. Ibid.
32. Ibid.
33. Scott Rickard. 2009. Are you there? Encouraging users to move from peer-to-peer to Voice over Broadband. Telecommunications Journal of Australia. 59 (3): pp. 42.1 to 42.7.
34. It is worth noting that while funding models underpinning the National Broadband Network have been the focus of disagreement between the major parties in Australia, there is bipartisan support for the need to ensure better broadband access across Australia.
35. IPTV and internet video delivery models: video content services over IP in Australia, ACMA, June 2010.
36. ACCAN (2009), Future Consumer, p 5.

More-powerful mobile devices, ever-cheaper netbooks, virtualization and cloud computing, reputation systems for social networking and group collaboration, sensors and other small systems reporting limited amounts of information, do-it-yourself embedded systems, robots, sophisticated algorithms for slurping up data and performing statistical analysis, visualization tools to report results of analysis, affective technologies, personalized and location-aware services, excellent facial and voice recognition, electronic paper, anomaly-based security monitoring, self-healing systems...37

The manner in which people communicate with each other on the internet and the way that they produce content, now and in the future, requires a rethinking of the way traditional media policies work. 'Institutional forms that have sufficed for regulating some of these functions in other media no longer work,' writes Australian media studies scholar Sal Humphreys. 'It requires a breaking down and revisioning of policy areas and strategies. It requires a new form of literacy in users, and the development of new skills and strategies.'38

1.3 Regulating The Convergent Environment: The Current Picture


The inherent complexity of the current Australian content regulation regime is partly a consequence of the allocation of powers in the Constitution between the State and Federal Governments. The Federal Government has the power to legislate with respect to the internet (derived from section 51(v) of the 1901 Constitution, the telecommunications power, which allows the federal government to regulate both online services and broadcast media such as television and radio), and with respect to the import and export of audiovisual recordings, computer games and printed matter (derived from section 90 of the Constitution, the customs power). The States, however, are solely responsible for regulating the production and sale of audiovisual recordings, computer games and printed matter.

In a bid to ensure some consistency across the board, the States, Territories and Federal Government agreed to the establishment of the Office of Film and Literature Classification (OFLC), the body that until 2007 was responsible for classifying material. From July 2007 the OFLC was absorbed into the Attorney-General's Department and now operates as the Classification Board, under the provisions of the Classification (Publications, Films and Computer Games) Act 1995 (Cth).39 The Classification Board is primarily responsible for classifying movies, home videos, computer games and publications, and also reviews individual classifications upon request. The Classification Act, Code and Guidelines underpin any classification decisions.

A separate body, the Australian Communications and Media Authority (ACMA), formed in 2005 to replace the Australian Broadcasting Authority (ABA) and the Australian Communications Authority (ACA), oversees the regulation of broadcasting, online content, radiocommunications and telecommunications. ACMA's responsibilities and powers are derived from the Broadcasting Services Act 1992 (discussed in greater detail below) and include setting legally binding standards in relation to Australian content and children's television, investigating complaints, and issuing take-down notices for potentially illegal material hosted online.

The Federal Government's inability to criminalise the production and sale of prohibited material within States or Territories means that the States and Territories are responsible for doing so themselves, creating disparities between different regulatory models across Australia. Appendix 1 addresses the key acts in this area, the history of their development, and the criticisms that have been made of their design and operation. The relevant legislation includes the Telecommunications Act 1997, the Telecommunications (Consumer Protection and Service Standards) Act 1999, the Broadcasting Services Act 1992 (BSA), the Broadcasting Services Amendment (Online Services) Act 1999 and the Communications Legislation Amendment (Content Services) Act 2007.
37. For more predictions from industry professionals see Pew Research Center's Internet & American Life Project (2010), The Future of The Internet, available at http://www.imaginingtheinternet.org, viewed 30/04/10.
38. Humphreys, S. (2009), op. cit., p. 79.
39. Available at http://www.comlaw.gov.au/Details/C2005C00359, accessed 4/04/11.

Regulatory Inconsistencies
The internet is a multifaceted, distributed network with no centralised gatekeeper. The vast range of communication options it contains was once governed by distinct policy areas. This raises two significant problems in relation to future regulation. The first is that existing media become digitised and are distributed differently, which brings the existing rules pertaining to each medium into question. The scarcity of broadcast spectrum, for example, resulted in strict transmission rules that are now challenged by the relative abundance of digital transmission. The second problem is that separate media silos are now interchangeable on different digital networks. Television content can be seen on a TV set, delivered online, or sent to a mobile. The service is the same, but the rules for each medium are different.

Speaking from the US experience, communications scholars François Bar and Christian Sandvig write that policy responses to convergence end up being 'ramshackle and jerry-built', a story of evolutionary inertia and incrementalism; changes in media policy have rarely appealed to an underlying driving truth - the 'why' of regulation - rather, they are knee-jerk responses to existing regimes:

As a result, policy-makers looking to resolve convergence challenges have favoured incremental adaptation of past rules rather than fundamental redesign of the policy regime. They have chosen either to treat a new medium with the policy previously applied to whatever it seemed to resemble, or to adjust through the accretion of exceptions and additions. Thus, policy treats cable television as an extension of broadcast, itself viewed as an extension of radio.40

A similar situation exists in Australia, where content is regulated differently state by state and platform by platform. Australia has yet to fashion laws that fully amalgamate broadcasting and telecommunications with an understanding of the ways the internet is changing the end user's experience. Communications is poised to become truly cross-sectoral, reaching across health, government, commerce and, of course, the media.41 Australia's laws have not kept up with this technological evolution and are ill-prepared to absorb such rapid change. In the area of consumer protection, there has been little opportunity to understand the complexities of user experience, despite wider community concern about protecting children from inappropriate content. This is especially the case for social networking, user-generated content, gaming and online immersive worlds. As the following case studies will show, there is now a distinct need for industry to work with government and users to build media literacy and to expand cooperation with one another.

Social networking platforms exist in a grey area of regulation: they are constantly changing the types of content they offer, combining ephemeral and stored content, the great majority of which is user-generated. Services like Facebook and Twitter argue that they cannot be held legally responsible for any of the content, as they are not the publishers: they just provide the platform. Laws designed to regulate traditional publishers are often ill-suited to online services because they are based on traditional media production and use practices, according to which a proprietor can be assumed to take responsibility for published content.

The lifespan of online content is also relevant here. On the one hand, online conversations through social networking sites, bulletin boards, email lists and instant messaging services are transitory; on the other, they have the capacity to exist well beyond their intended initial time-frame and purpose, and can be mined for information and bought and sold commercially, even after the death of the original author.42 Current regulations are not able to respond well to situations that collapse the public and private spheres, as well as stored and transient content.
40. Bar, F., & Sandvig, C. (2008). US communication policy after convergence. Media, Culture & Society, 30(4), 531.
41. Gerard Goggin and Claire Milne. 2009. Great expectations? Regulating for Users in the United Kingdom and Australia. Telecommunications Journal of Australia. 59 (3): pp. 47:12.
42. ACCAN (2009), Future Consumer, p. 63.

Further regulatory inconsistencies exist beyond social networking services. Online games provide one such example. Although regulated in the same manner as any other game (by the Classification Board), online games are also subject to regulation by ACMA, since they are considered to be online media. It is possible to access X18+ videos in the ACT but not online from Australian-hosted sites. Content that would receive an R rating in a film is refused classification if it appears in a game, though at the time of writing this was under government review. Gamers can simply circumvent the system by buying R18+ games online from stores offshore or by downloading them illegally. The possibility of inconsistent regulation between platforms is a serious concern for policy makers and legislators.

Humphreys concludes that, however daunting, a new approach is needed: one that can cater to the many ways in which people use the internet, to the various platforms for delivering content, and to the manner and locations in which content is consumed, either publicly or privately:

With more conventional media well-worn pathways of distribution made it easy to control dissemination of restricted content, and the limited amount of content published made it possible to implement reviewing processes. Because these processes and conventions have been disrupted by the new structures and practices of the internet and its users, new conventions need to be established in order to achieve a balanced set of protocols that take account of freedom of expression as well as community standards.43

Increasingly, social networking and user-generated content are producing challenges for the current regime. Two case studies help to illustrate these challenges: Facebook's self-regulation, and the current lack of an R18+ rating for games in Australia. Each example provides insight into the contradictions and gaps in current regulation.

Case Study 1: Facebook and the Queensland Government


In February and March 2010, public recriminations grew in Australia when it was perceived that Facebook had failed to take strong action after memorial sites for two Queensland children, 12-year-old Elliott Fletcher and eight-year-old Trinity Bates, were defaced with pornographic images and offensive comments. One vigilante Facebook page was set up to vilify the man who allegedly murdered Trinity Bates. In another instance, a Facebook user claimed to be able to deliver the long-missing Queensland boy Daniel Morcombe if his Facebook-based group attracted 1 million members. The Queensland Premier Anna Bligh wrote to Facebook demanding to know what steps would be taken to prevent this kind of abuse happening again in the future, and warned Facebook that it would lose popularity if it failed to do more to reduce accessibility to offensive material on the site. Facebook stood accused of not acting quickly enough and of failing to adopt a sufficiently strict approach to the monitoring and policing of its content.

In response, Facebook wrote a letter to Premier Bligh, which included an explanation of the privacy controls and alert settings that can be used to inform Facebook about the existence of inappropriate content.44 They also expressed their regret: 'These vandals have cast a shadow on an already tragic experience, and we are disappointed and disgusted that anyone would turn a tribute Page into anything but a place of respect and honour,' wrote communications executives Debbie Frost and Elliot Schrage. Facebook argued that such content violates its terms of use. Facebook also took the unprecedented action of reaching out directly to Australian users to emphasise the ways in which Page and Group administrators (often those who set up the sites) can remove content themselves.
43. Humphreys, S. (2009), op. cit.
44. Facebook letter to Queensland Premier Anna Bligh, 25 February 2010.

Facebook said that it continues to develop industry tools that enable it to respond more quickly to what it calls the '400-million strong police force that our Facebook users represent'. The executives concluded that the complete prevention of inappropriate content or eradication of tasteless material is 'not something we or any society can deliver'.

Facebook users are required to adhere to a Statement of Rights and Responsibilities,45 including points 3.6 and 3.7: 'You will not bully, intimidate, or harass any user' and 'you will not post content that is hateful, threatening, pornographic, or that contains nudity or graphic or gratuitous violence'. Facebook makes it clear that it can remove anything posted that it believes violates this statement, or terminate the service. Like many social networking sites, Facebook resists being defined as a publisher. As Facebook spokesperson Debbie Frost explained to The Australian in March 2010:

We didn't build a site to be a publisher, we built it to be a platform. We built it to give people tools to share information with each other and I think the enormous success we've seen is testament to the fact that human people do want to do that and the vast, vast majority - hundreds of millions of people - are not behaving the way these few people did in Australia, so it seems to be going OK as a system.

Frost admitted to some procedural slowness; however, she argued that the problem is worse in the real world: 'If I put up nude graffiti on the side of a church, how do you report that, how do you get it taken down in a way that's good enough for you? It takes time in real life and on the web and we think our system is actually more responsive ... If I phone you up and say really offensive things, does that mean the mobile phone operator is liable for that?' The report also stated that Facebook keeps all of its records, and would cooperate completely with any police investigations.46

This being the case, it could be argued that Facebook in fact presents a stricter regulatory environment than that existing in the offline world. While questions can be asked about Facebook's responsiveness to particular acts of vandalism, it nonetheless puts considerable effort into removing obscene and vulgar content that may actually be legal to host, out of deference to its community and the standards amenable to advertisers. To the extent that illegal acts may be undertaken by users on Facebook, these can still be investigated by relevant authorities. For example, Queensland police have conducted investigations into the vandalism case cited above, with a particular focus on claims that some of the obscene materials posted included images of child abuse. Additionally, the pages set up to vilify the alleged murderer of Trinity Bates may constitute contempt of court, as this case is currently sub judice.47

Such intentional misuses of the platform present significant challenges to media regulators. Mandatory filtering and complaints-based approaches to media regulation would face precisely the same issues of responsiveness encountered by Facebook itself. Given that Facebook is an ever-changing, user-driven environment, automated filtering systems are unable to correctly differentiate between pages displaying legitimate and illegal content.
A system built on complaints to a regulator would require impossibly large resources in order to evaluate submissions and assess content posted by users, even if the security and privacy issues related to accessing users' personal content could be successfully overcome. Only an approach based on collaboration between the platform provider and its users, including rapid responses to user feedback, can adequately address questions of the suitability of content on a site such as Facebook.
45. http://www.facebook.com/terms.php
46. Fitzsimmons, C. (2010), Faceless no more: Facebook admits errors, The Australian, March 01, 2010, available http://www.theaustralian.com.au/business/media/faceless-no-more-facebook-admits-errors/story-e6frg996-1225835350571, viewed 3/24/10.
47. Op. cit.

Community interests need to be heard, and this is a responsibility that privately-owned services need to take very seriously.

Case Study 2: Games, Online And Offline


Australia is the only country in the developed world without an R18+ rating for video games, although this is due to be considered by the Standing Committee of Attorneys-General (SCAG) in July. Currently, a DVD may be rated anywhere between G and R18+, but the highest rating for a game is MA15+, suitable for players aged 15 years or older. This means games with adult content are often refused classification, or edited by games developers to try to fit the MA15+ category.

In the lead-up to the introduction of the Classification Act of 1995, a Senate Select Committee on Community Standards recommended that an R18+ rating be omitted from any classification decisions about games, for the following reasons: games, the Committee argued, are more interactive and are therefore more likely to harm children; computer games are primarily used by children and adolescents; and parents are insufficiently educated about technology and software to make informed decisions about how to guide their kids. For 14 years there has been growing debate surrounding the classification of games in Australia, particularly in light of mounting evidence that suggests that gamers are often older - averaging 30 years of age - and given that 70 per cent of parents in homes with gaming devices also play them and are therefore familiar with the technology.48

The difficulty of the Australian situation was highlighted with the release of Grand Theft Auto IV in 2008. Released in other territories as an R18+ game, it was edited by Rockstar Games for its Australian release in order to obtain an MA15+ rating. However, the changes were minor, and did not substantially change the violent capabilities given to players. As one games academic observed, it was a case of 'sneaking in to a lower category', despite its depictions of murder, blackmail, extortion and sex with prostitutes.49 Despite the aim of having a more restrictive, safer system, Australia was the only country allowing 15-18 year olds to play Grand Theft Auto IV when other democracies had decided it was only suitable for adults. As campaigners for the R18+ rating have observed:

If Grand Theft Auto IV were a movie, it would have been rated R18+ and kept out of the hands of children. In other countries such as New Zealand, Britain and across Europe it is impossible for children to purchase this game. However under Australian law, there is nothing to prevent children aged 15 and up or lower, with their parents' consent from purchasing this game and playing it, simply because our ratings system does not have the capability to keep high-impact games like this out of their hands.50

On April 28, 2011, the South Australian Attorney-General John Rau announced that the state would introduce the R18+ classification regardless of the outcome of the SCAG meeting in July.51 At the same time, the state plans to phase out the MA15+ category, on the view that there should be a clear gap between content for adults and content for children. Should the federal government not gain the approval of all states to introduce a national R18+ rating, this could mean an inconsistent ratings system from state to state. While South Australia's decision could be viewed as a step in the right direction, it introduces further classification confusion; it could create an uneven system where game buyers can purchase R18+ games in South Australia but not elsewhere.
48. Brand, J. E. (2010), The Violence Game, Paper presented at the Eighth Annual Somerset International Conference for Librarians and Teachers, 15th March, Gold Coast, Australia.
49. McCrea, Christian (2011), Fair games: Why we should back an R18+ Classification, The Conversation, 19th April, http://theconversation.edu.au/articles/fair-game-why-we-should-back-an-r18-classification-1025
50. From the campaign site, Australia needs an 18+ rating for video games: http://www.r18games.com.au/gta/
51. Parker, Laura (2011), South Australia to introduce R18+ for games, GameSpot AU, 28 April, http://au.gamespot.com/pages/news/story.php?sid=6310534&skipmc=1, viewed 28 April 2011.

Online games such as World of Warcraft reveal an even more complex side of gaming classification. These networked games are doubly regulated under the Australian system, both as online content and as traditional games; however, neither system of regulation truly demonstrates a deeper understanding of the textual or community meanings of the games. Online games enable, and require, user-generated content and social networking to take place. Defying traditional ideas of a finished product, online games continue beyond any one user's experience, with millions of authors constantly interacting and competing. Such networked production, argues Sal Humphreys, highlights the need to approach such media not merely as texts, but more as dynamic sets of relations and processes:

Moves to force this new genre of participatory media into the strictures of old conventions seem unwise, yet the power and influence wielded by established media interests mean policy and regulation continue for the most part to act to preserve the old rather than facilitate the new. The interests of users, now participators and producers, need to be thought about alongside those of corporate publishers, not only in terms of their access to cultural and social capital, but in terms of what their rights, risks and obligations might reasonably be in such a system.52

We agree that the current regulation of games in Australia is problematic, and at odds with that of similar countries. One pressing issue is to introduce a national R18+ rating to prevent games from being either miscategorised, and thus available to younger audiences than intended, or inappropriately banned from adult use.

Conclusion
In this section we have mapped the ways in which Australia's content regulation policies are struggling to keep pace with current media production and consumption. Social networking and user-generated content operate with a different production model in mind: they are both dynamic and networked. Traditional policy silos that governed individual media forms are no longer relevant. The extant classificatory regime is subject to significant blind spots when it comes to platforms such as the mobile internet, games, social networking and user-generated content. More sophisticated, flexible and broad-based 21st century principles of media regulation are required in order to assess the impact of new technologies, and to determine how Australia might sensibly adapt its laws to better reflect the realities of a convergent media environment.

52 Humphries, S. (2009), Discursive constructions of MMOGs and some implications for policy and regulation, Media International Australia: Incorporating Culture and Policy (130), pp. 63-64.


Section 2: The International Arena


As a global phenomenon, the internet requires an approach to governance that is global, or at the very least transnational, in scale and impact. International governance, the new frontier in media convergence policy, is, however, inherently problematic in the sense that it must confront various barriers to implementation that span culture, economics, trade, language, politics and law.53 While the initiation and maintenance of dialogue between government and industry on a transnational scale (including the framing and adoption of international or transnational treaties) is a complex and time-consuming undertaking, and one which may require the involvement of high-level actors such as the UN, this dialogue is critical to the success of future governance efforts in the new convergent media environment. The following section highlights three distinct overseas governance models (Japan, New Zealand and the United Kingdom), followed by a selection of international and transnational efforts, including the European Union, the Internet Governance Forum, and the OECD. The aim of this section is to examine in detail how key comparable nations have dealt with governance and content management issues in order to identify useful strategies for Australia to consider in the Convergence Review. As many of these approaches are recent, or still developing, they are offered for consideration. We argue that these regulatory systems need ongoing study to assess their effectiveness and suitability for Australia, so that we may learn from the successes and failures of other nations.

2.1. International Comparisons


Governments around the world are grappling with the same issues as Australia when it comes to managing their digital economies. How can ICT policy remain flexible enough to deal with a rapidly changing market? How do governments work with industry and media users to manage the existence of harmful or offensive content in this environment? This section draws on examples from around the globe to show how governments are responding to these challenges. The policies selected for analysis cover a range of approaches, from self-regulation to primary legislation. This section will discuss how governments are coping across four key areas of digital management (a summary can be found in Figure 2):
- National legislation for convergent ICT;
- National digital strategies to outline and coordinate government responses to issues of regulation;
- National strategies for user-confidence and digital literacy; and
- Content filters and moves to regulate harmful online content.

53 Wilske, Stephan; Schiller, Teresa, International Jurisdiction in Cyberspace: Which States May Regulate the Internet, 50 Fed. Comm. L.J. 117 (1997-1998).


The table below contextualises Australia's attempts to regulate the ICT market with international examples, and draws out concepts that can help build evidence for a best-practice approach to regulation.

Australia
- National legislation for convergent ICT: No. Complex web of laws (see Section 1).
- National digital strategy: Yes. Australia's Digital Economy: Future Directions.
- National strategies for user-confidence and digital literacy: No. No overarching strategy; media literacy falls under the remit of ACMA; Digital Education Revolution (2009).
- Content filter: Not yet (mandatory/government filter proposed). Current: voluntary consumer Family Friendly Filters administered by IIA and ACMA.

Japan
- National legislation for convergent ICT: Yes. Information and Communications Law.
- National digital strategy: Yes. u-Japan Policy (ubiquitous internet); under this comes the Act on Development of an Environment That Provides Safe and Secure Internet Use for Youth; Japan Safer Internet Program.
- National strategies for user-confidence and digital literacy: Yes. The Japan Safer Internet Program legislates minimum standards for private industry in media literacy and education.
- Content filter: Yes (voluntary/industry). The Japan Safer Internet Program encourages private industry to develop content filters; operators tailor filters and parental controls themselves.

New Zealand
- National legislation for convergent ICT: No.
- National digital strategy: Yes. Digital Strategy 2.0 (2008).
- National strategies for user-confidence and digital literacy: Yes. NetSafe, funded under the Digital Strategy 2.0.
- Content filter: Yes (voluntary/government). Digital Child Exploitation Filtering System.

UK
- National legislation for convergent ICT: Yes. Communications Bill (2003).
- National digital strategy: Yes. Digital Britain (2009).
- National strategies for user-confidence and digital literacy: No. Complementary self-regulatory approaches include Click Clever Click Safe: The first UK Child Internet Safety Strategy (UKCCIS) and ThinkUKnow (CEOP).
- Content filter: Yes (voluntary/industry). The IWF maintains a list given to ISPs.

Figure 2. International comparison of digital strategies


2.1.1. Japan's Safer Internet Program


Japan is often regarded as a technology leader for its high rates of broadband and mobile internet access. Extensive and early adoption of mobile internet, along with native mobile protocols, have created a protected market environment in which a unique Japanese mobile internet has flourished. It has been termed a technological "Galapagos Island": many unique products have evolved that rarely spread to international markets.54

The heavy use of the keitai (mobile phone) internet and text messaging, as well as a widespread interest in gadgets, has made Japan distinctive in the transnational arena, writes Mizuko Ito in Personal, Portable, Pedestrian: Mobile Phones in Japanese Life:

The view of Japan as a curiously urbanized incubator for the future of mobile technology is based on an international appreciation of how Japan has pushed the envelope on mobile technology design, business practice, and use, an appraisal that seems well-placed given Japan's unparalleled levels of adoption of the keitai internet and its steady march into new areas such as camera and video phones, location-based services, broadband keitai internet, and m-commerce.55

Broadband has thrived since its launch in the late 1990s. Japan enjoys near-total fibre-to-the-home broadband coverage, and the number of broadband subscribers (excluding mobile) now exceeds 30 million.56 Figures from the Telecommunications Carrier Association show that Japan ended 2009 with 110.62 million mobile subscribers.57 These telecommunications services have become increasingly IP-based: more voice calls are being made via IP, for example, and telephone numbers had been allocated to more than 20 million VoIP users by the end of March 2009.58 Japanese teenagers are more likely to use the internet on their phones than on any other device: nearly 60% of junior high students and nearly 90% of high school students own a mobile phone and use the internet on it.59

Deregulation & the Need For a Unifying Act

Japan's ICT policies have developed since the 1990s with two aims in mind: the deregulation of the industry, and the regulation of harmful content, with a focus on self-regulation and user-empowerment. Like Australia, Japan grapples with two systems of law regulating media and communications: broadcasting (housou) and telecommunications (tsuushin). The Telecommunications Business Law regulates the telecommunications industry and ensures a universal telephone service with three key requirements: essentiality, affordability and availability.60 Until the mid-1980s, the telecommunications industry was a monopoly run by Nippon Telegraph and Telephone (NTT). In 1985, NTT was privatised, and in the early 1990s mobile carriers began to compete in an open market. In the other corner, the Broadcast Law, enacted in 1950, set up a two-tiered system: the NHK (Japan Broadcasting Corporation), a publicly owned corporation funded by viewers' payments of a television licence fee, and a commercial broadcasting sector.

Interactions between the laws began in June 2001, with the Law Concerning Broadcast on Telecommunications Service. This opened the door for operators with no broadcast infrastructure to make use of existing pipes. It also paved the way for telecommunications companies to use IP services.61 A new business model emerged for
54 Aizu, I. & Bayer, J. (2009), Beyond network neutrality: The state of play in Japanese telecommunication competition, Telecommunications Journal of Australia, 59 (2), pp. 25.1-25.26.
55 Ito, M. (2005), Introduction, Personal, Portable, Pedestrian, MIT Press, p. 2.
56 Mitomo, H. & Tajiri, N. (2010), Provision of universal service and access over IP networks in Japan, Telecommunications Policy 34 (2010), p. 101.
57 Telecompaper (2010), Japan ends 2009 with 110.62 mln mobile subs, Monday 11 January 2010, available at http://www.telecompaper.com/news/article.aspx?cid=712163, viewed 28/04/2010.
58 Mitomo, H. & Tajiri, N. (2010), op. cit.
59 Tano, H. (Consumer Services Department, NTT DOCOMO) (2009), DOCOMO's Filtering Service Initiatives, presentation to ITU/MIC Strategic Dialogue for Safer Internet Environment for Children, Tokyo, Japan, 2-3 June 2009, available at http://www.itu.int/osg/csd/cybersecurity/gca/cop/meetings/june-tokyo/documents/S1-DOCOMOs_Filtering_Service_Initiatives.ppt, viewed 3/15/10.
60 Mitomo, H. & Tajiri, N. (2010), op. cit.
61 Shiotani, S. (2008), Challenges Facing the Cable Television (CATV) Industry in an Effort to Create Survival Business Models, Keio Communication Review, No. 30, 2008, p. 51.


telecommunications companies: triple play (or bundled) services, combining telephone, TV over IP, and internet access.62

Despite these changes, Japan, like Australia, is facing the challenges of rapid convergence. There is no current regulation (apart from copyright law) for digital content designed for IP-TV, video on demand, or catch-up TV programs on the internet. There is also no policy for what the Japanese government calls "terminal convergence" (mobile TV phones or TV sets that can receive IP-TV), and no competition regulation covering cross-ownership of broadcast and telecommunications by the same entity.63 Regardless, Japan continues to produce a remarkable number of innovations in what has been dubbed "Content Fusion Technologies": live TV-based chat during programs, integrated search engines for both recorded and web content, automatic conversion of internet news into video via wEE (Web2TV with emotional expression), and even a system that automatically converts web content into cartoons for kids (called Interaction e-Hon).64

In August 2006, the Ministry of Internal Affairs and Communication set up a Study Group to take an in-depth look at the policy gaps that existed between telecommunications and broadcasting law. The Study Group released its final report in 2008.65 It concluded that there is a need to undertake a fundamental revision of the legal system for communications and broadcasting. In response to technological convergence, the Study Group was guided by these principles:
- Free flow of information
- Universal service
- Maintenance of safety and reliability of information and communications networks
- Technical neutrality

We argue that these are important principles to consider when devising Australia's way forward. The report also recommends a shift to horizontal management of the ICT sector, which would enable free combinations of content and networks. Instead of being applied along the lines of the distribution model (e.g. phone lines, cable, internet), a new system would apply according to three layers:

1. Content: regulated in categories, for example "Media Service" or "Open Media Content" such as blogs and personal websites. The regulations would cover user-generated content.
2. Transmission infrastructure: consolidating regulations on communications and broadcasting transmission services; maintaining flexible use of spectrum across all platforms.
3. Platform: social media sites, e-government sites, etc.

62 Seki, K. (Director, International Economic Affairs, Ministry of Internal Affairs and Communications, MIC) (2010), Network Paradigm Shift: Deployment of Ultra-high Speed Access, available at http://www.bundesnetzagentur.de/media/archive/5471.pdf, viewed 17/03/2010.
63 Ibid, p. 33.
64 For a more detailed discussion of emerging technologies, see Miyamori, H. et al. (2007), Fusion of Communication Content and Broadcast Content, Journal of the National Institute of Information and Communications Technology, Vol. 54, No. 3, pp. 66-69.
65 Ministry of Internal Affairs and Communications (Japan) (2008), Final Report from the Study Group on a Comprehensive Legal System for Communications and Broadcasting, Biweekly Newsletter, Vol. 18, No. 21, February 8, 2008.


Regulating Content

Market deregulation in Japan has occurred alongside moves to regulate harmful content. In 2002, the Provider Responsibility Law encouraged online providers to remove potentially harmful content, and in response ISPs drafted guidelines for self-regulation. In 2008, the government released its Final Report on a Comprehensive Legal System for Communications and Broadcasting,66 which acknowledged the need for a complete restructuring of the communications and broadcasting legislative regime, from a vertical hierarchy to a horizontal one. Although the report stated that "for the present moment, the need to regulate the platform layer as a separate individual layer is not great" because it is a newly evolving service, it noted that the government would need to keep watch on the evolution of the platform layer in order to ensure that it does not become a bottleneck. The report argued that there was a need for softer, less prescriptive regulatory measures, while opposing the view made by some parties that the regulation of the platform layer should be left entirely to the Anti-Monopoly Act and general competition laws. In 2011, early signs indicate that the Japanese government is increasing its interest in regulating the platform layer, citing concerns that platforms are becoming too powerful. Overall, the restructuring of the regulatory regime is still being developed and debated.

On the issue of child safety, Japan has the Act on Development of an Environment that Provides Safe and Secure Internet Use for Youth,67 which came into effect in April 2009. The Act calls on all stakeholders to ensure safety online, starting with children themselves, and working up through content providers, retailers, manufacturers, ISPs, teachers, parents, community groups and the government. Japan also formulated the Japan Safer Internet Program in 2009. The government calls this a comprehensive policy package regulating illegal and harmful online content. Industry is obliged to provide parental controls and mobile internet filtering, tailored by companies to the age of the child, as well as being obliged to take measures aimed at educating the public about media literacy.

Aims of the Japan Safer Internet Program68

1. Development of a Basic Framework to Provide Safety and Security
- Improving the basic legal system for a safer internet (e.g. encouraging filtering software)
- Promoting international cooperation
- Promoting actions with local public authorities
- Supporting public-private partnerships

2. Promotion of Voluntary Efforts in the Private Sector
- Tackling illegal and harmful information
- Exploring effective access prevention measures against child abuse material
- Encouraging the dissemination of content rating
- Supporting technical development

66 Japan International Affairs Department, Telecommunications Bureau (2008), Final Report on a Comprehensive Legal System for Communications and Broadcasting, available at www.soumu.go.jp/main_sosiki/joho_tsusin/eng/...21/Vol18_21.pdf, accessed 20/03/11.
67 Government of Japan (Cabinet Office) (2009), Act on Development of an Environment That Provides Safe and Secure Internet Use for Young People (English translation), available http://www8.cao.go.jp/youth/youth-harm/pdf/neteng.pdf, viewed 15/3/10.
68 APEC (2009), Japan Policy and Regulatory Update, 39th Telecommunications and Information Working Group Meeting Plenary Session, Singapore, 16-18 April 2009, available http://ec.europa.eu/external_relations/japan/docs/2008_japan_rrd_proposals_en.pdf, viewed 3/16/10.


3. Supporting Parent-Child ICT Media Literacy
- Information on moral education in the family, community, and schools
- Promoting user educational activities from third-party organisations
- Investigating the impacts of harmful information on children

Content Regulation = Self-Regulation

In Japan, obscene online content is defined as including:69
- Information about undertaking or mediating, or inducing a crime, or that induces a suicide;
- Obscene depictions of sexual conduct or genitals, or other information that considerably stimulates sexual desire;
- Grisly depictions of murder, execution, torture or other extremely cruel content.

It is important to note that the acts of rating content and instituting criteria for filtering of content have been left to the private sector. The policy does set minimum standards for parental controls for private industry,70 and companies can then tailor products directly to consumers. With regard to illegal content, users are encouraged to report illegal and offensive content to the Internet Hotline Center, which is a member of INHOPE (the International Association of Internet Hotlines).71 Commencing in April 2011, a newly created organisation called the Internet Content Safety Association (http://www.netsafety.or.jp) will develop a blacklist of all child pornography sites located by the National Police Agency, which will then be referred on to search engines, ISPs and filtering companies, who are expected to voluntarily block the content from their services.

Mobile filtering works on an opt-out basis; internet filtering is opt-in. Mobile carriers are required to provide users under 18 with filtering services unless otherwise requested by parents. Parents who sign up for a mobile phone service are required to declare the age of the user when they enter into the contract. The opt-out system reflects the government's view that it is more difficult to monitor the use of mobile phones at all times, making children arguably more vulnerable to harmful content.

Poster and Slogan Competition: Case Study

Child protection strategies run alongside media literacy programs, through various government-supported campaigns. Information Security Day is held on 2 February every year, kicking off a month of events. The Check PC! campaign included a website informing users how to enhance PC security, and was widely publicised on public transportation and in the media.

69 This summary provided by Aizu, I. & Bayer, J. (2009), op. cit.
70 Ouchi, K. (Deputy Director, Ministry of Internal Affairs and Communications, Japan) & Isozumi, K. (Deputy Director, Ministry of Economy, Trade and Industry) (2009), Workshop on Initiatives in Promoting Safer Internet Environment for Children, APEC-OECD Joint Symposium on Initiatives among Member Economies Promoting Safer Internet Environment for Children, available http://www.oecd.org/document/17/0,3343,en_2649_34223_43301457_1_1_1_1,00.html, viewed 3/15/10.
71 See http://www.inhope.org/gns/news-and-events/news/10-11-22/Internet_Hotline_Center_Japan_news.aspx, accessed 23/03/11.


One example of a digital literacy program run in Japan for children is that of the Information-technology Promotion Agency (IPA), which operates under the Ministry of Economy, Trade and Industry. For children and students, the IPA runs the Annual Information Security Slogan and Poster Awards, which began in 2006 as an adaptation of the Korea Internet & Security Agency's program. Industry support for the program comes from entities including Microsoft and Symantec. Its goals are to help participants identify online threats and learn how to avoid them. The posting of personal information, defamation and cyber bullying are recognised as the major online threats for children and students.72

Figure 3. "You don't know that you are well-known. Be careful for personal data leaks." Winning poster entry 2008, by Wakana Kato, Grade 3, Toki Commercial High School, Gifu, Japan.

Response From Industry: NTT DoCoMo

One objective of the Japan Safer Internet Program is to promote media literacy through third-party organisations. Evidence points to the fact that industry has met and exceeded the legislated minimum requirements to provide filters and educational initiatives.73 One example is NTT DoCoMo's Mobile Phone Classroom. Since it began in 2004, nearly one and a half million people have attended 9,200 workshops.74 The company has also converted the session to DVD to send to homes. Topics include safe mobile use, parental controls and online abuse. The DoCoMo Family Safety Hotline responds to questions and concerns regarding mobile phone use by children, including questions about potential trouble, phone etiquette and appropriate billing plans. The hotline received some 60,000 inquiries in 2008.75

72 Yamada, A. (2009), Information-technology Promotion Agency, Japan Initiatives on Awareness Raising of Students/Children, at APEC-OECD Joint Symposium on Initiatives among Member Economies Promoting Safer Internet Environment for Children, April 15, 2009, available at http://www.oecd.org/dataoecd/0/12/43512000.pdf, viewed 3/15/10.
73 ACMA (2009), Online Risk and Safety in the Digital Economy, op. cit. p. 51.
74 Ibid. p. 52.
75 NTT DOCOMO (2010), Addressing the impact on children, http://www.nttdocomo.co.jp/english/corporate/csr/report/safe_secure/social/kids/index.html, accessed 3/15/10.


2.1.2. New Zealand's Digital Strategy


Background

In 2005, the New Zealand government released the draft of its first Digital Strategy, outlining the ways in which government ICT investment could be harnessed to realise New Zealand's economic, social and cultural goals.76 Just two years after the original Digital Strategy was released, Ernie Newman, then-chief of the Telecommunications Users Association of New Zealand, warned that New Zealand was running dangerously behind. "The international broadband train is accelerating at lightning speed with New Zealand swinging wildly from the back-door handle, and only by climbing aboard will we save ourselves," he said in 2007.77

New Zealand had considerable distance to cover. Broadband penetration and investment levels had put New Zealand's ICT infrastructure into the bottom third of the OECD, at 21st or 22nd out of 30, by 2005.78 The attempt to climb aboard came with the review of the strategy in 2007-08, designed in part to strengthen the governance and coordination mechanisms of the existing strategy;79 the country needed a step-change in broadband performance to meet the growing demand for bandwidth.80 The development of Digital Strategy 2.0 was informed by the Digital Future Summit 2.0, over 40 pre-Summit stakeholder workshops and engagements with stakeholders, and submissions to the Draft Digital Strategy 2.0.

The 2008 Digital Strategy 2.0 plan is an attempt to co-ordinate programs that promote the country's economic, technical and social performance online. Goals of the plan include several targets for 2012:
- 80 per cent of households will have 20 megabit per second broadband connections.
- Open access fibre networks will be operating in at least 15 cities.
- Fewer than 5 per cent of household computers should be infected with computer viruses or malware.
- Three-quarters of advertised ICT job vacancies should be filled, up from just over half in 2007.
- Teleworking will cut the number of commutes to work by car by 5 per cent from 2009.

At the same time, the New Zealand Government approved funding of the Digital Development Council, an overarching digital sector forum to create partnerships between industry, community and voluntary groups and users, and to provide a coordinated voice to government on digital issues.81 Additional funding was committed to an existing review of broadcasting regulation.

In 2011, under a new National-led government, the Digital Strategy is in abeyance. Users who attempt to access www.digitalstrategy.govt.nz are automatically redirected to the Archive section of the Ministry of Economic Development's website, where a notice informs them:

76 Cunliffe, D. (2004), Digital Strategy proposes new directions for ICT, Media Statement, 11 June 2004.
77 Newman, E. (2007), Address to Telecommunications Summit, Auckland, 25 June 2007, p. 5.
78 Keown, J. (2007), Digital strategy gets rethink, The Independent Financial Review, 27 June 2007, p. 6.
79 Cunliffe, D. (2007a), Digital Summit 2.0, keynote speech at the Digital Summit 2.0, 28 November 2007.
80 Cunliffe, D. (2007b), Address to 8th Annual Telecommunications and ICT Summit by the New Zealand Minister for Communications, Fast-forward to the Broadband future, Hyatt Regency, Auckland, 25 June 2007.
81 New Zealand Ministry of Economic Development (2008), The Digital Strategy 2.0, p. 8; New Zealand Government (2008), New digital super group announced, Media Statement, 25 March 2008.


The Minister for Communications and Information Technology has decided that Digital Strategy 2.0 will remain in place as the outcomes remain relevant in the current climate. The actions that support the outcomes in the strategy are under review and some will change to reflect the new government's priorities. Announcements regarding changes to the actions will be made by the responsible Ministers, or you can contact the relevant agencies directly for updates.82

As the above notice states, the Digital Strategy has not been retracted or replaced. It remains in place as a statement of desired outcomes. Yet many of the projects it highlights as a means of achieving those outcomes have been discontinued. For example, the Digital Development Council83 and the broadcasting regulatory review84 were both discontinued in 2009. While the future of the Digital Strategy remains unclear, it remains the best insight into New Zealand's ICT goals.

Confidence: An Integrated Response

The New Zealand government acknowledges that more online access increases children's vulnerability to online dangers, illegal content and bullying. New Zealand's strategy looks at the risk of harmful content to children alongside other online threats to adults, including malware, viruses, identity fraud and security attacks. The strategy aims for a universal understanding of online safety, security and privacy issues. The strategy extended funding to two national programs: NetSafe, through the Ministry of Education, and the Digital Child Exploitation Filtering System, through the Department of Internal Affairs. Consistent with the rollback of Digital Strategy initiatives, the additional funding to NetSafe was withdrawn after the 2008/2009 financial year.85 However, the Ministry of Education remains a strategic partner of NetSafe, and continues to provide some funding.

Case Study: Hector's World

NetSafe is an independent not-for-profit organisation that promotes confident, safe, and responsible use of cyberspace, with members drawn from across government, education, law, industry and the community.86

Figure 4. Screen grab of advice from NetSafe

82 http://www.med.govt.nz/templates/StandardSummary____43904.aspx
83 Saarinen, J. (2009), Govt kills Broadband Investment Fund and Digital Development Council, Computerworld, 5 February 2009.
84 Joyce, S. and Coleman, J. (2009), Government concludes broadcasting regulatory review, Media Statement, 7 April 2009.
85 Ministry of Education (2010), Annual Report 2010, p. 109.
86 http://www.netsafe.org.nz


One of NetSafe's most successful programs is the award-winning Hector's World,87 a series of online products that help children to become confident users of the internet. Hector's World is produced by Hector's World Limited (HWL), a charitable subsidiary of NetSafe. Hector, a dolphin, and his sea creature pals learn net basics in the ocean. The UK's Child Exploitation and Online Protection Centre has also launched it across primary schools in Britain.88 In 2009, ACMA purchased a licence to the program through its Cybersmart website (www.cybersmart.gov.au).

Figure 5. Screen grab of Hector's World

Using Flash animation, the user is guided through a series of episodes. Each episode is accompanied by detailed lesson plans, homework suggestions, print-outs and colouring-in exercises. The aim is for students to learn to give their personal details only to people they can trust. The episodes emphasise listening to and acting upon any uneasy feelings, and the importance of trusting adults. Some of the messages for students include:
- stop and think for yourself before acting;
- bad websites can look like legitimate websites, and they can deliberately make terms and conditions difficult to understand;
- if something looks too good to be true, it probably is;
- not every person you meet online is trustworthy;
- young people can help other people by keeping an eye out for others online; and
- listen to your uneasy feelings.89

Hector's World is an example of a program that has shifted away from simple net safety ideas to a digital citizenship model (see Figure 6). This shift away from programs that only address online safety reflects the perceived benefits of providing adults and children alike with the skills to participate in the digital economy and to take responsibility for their own behaviour.90

87 http://www.hectorsworld.com; a good discussion of Hector's World can be read in ACMA (2009), Developments in internet filtering technologies and other measures for promoting online safety, Second annual report to the Minister for Broadband, Communications and the Digital Economy, p. 4, available http://www.acma.gov.au/webwr/_assets/main/lib310554/developments_in_internet_filters_2ndreport.pdf, viewed 3/16/10.
88 Hectors World Ltd (2008), NZ's Hector Protector swims north to help UK children stay safe, Media release, 8 May 2008, http://www.netsafe.org.nz/keeping_safe.php?pageID=294&sectionID=corporate&menuID=275, viewed 3/11/10.
89 Hector's World Lesson Plans for Episodes 2-5; Information for teachers of Years 0-6, available http://hectorsworld.netsafe.org.nz/wp-content/uploads/yrs_0_6_teacher_info_sheet_episodes_2_5.pdf, viewed 3/16/10.
90 ACMA (2009), op. cit.


Figure 6. Hector's World's digital citizenship model.91

Digital citizenship combines three skills to create competency in participating safely and securely in the digital economy:92
- Digital etiquette: displaying appropriate and responsible behaviour while online;
- Digital literacy: proficiency to access, understand, participate in or create online content; and
- Digital security: securing one's own personal information.

We argue that the model of digital citizenship, covering the three key platforms of responsibility, literacy and security, could usefully be a part of Australian pedagogy, and would give a strong basis of confidence for children learning to engage online.

Content and Access Regulation: An Opt-in Filter

In New Zealand, possession and publication of obscene material is covered under the Films, Videos, and Publications Classification Act 1993. It is illegal to possess, make, trade, distribute or display a publication deemed objectionable by the Classification Office.93 Classification decisions are made by the Classification Office only; the Court is not responsible for determining whether a publication is objectionable. Responsibility for policing what happens on the internet falls to the Department of Internal Affairs (DIA), which is primarily concerned with very serious offences, such as material featuring the abuse of children.

The Digital Child Exploitation Filtering System is now operating in New Zealand.94 It is a narrowly defined internet filter using the Swedish software Netclean Whitebox. Website requests are filtered against a blacklist held on a central server in the government Censorship Compliance Unit. The list is maintained by the Independent Reference Group under the Department of Internal Affairs, which actively reviews banned URLs each month. The filter is not compulsory, but most ISPs have indicated that they will join it, although there is no public record of which ISPs are using the filter.

91 Hectors World Limited (2010), Evolution of Hector's World Learning Objectives 2009-2010, available http://hectorsworld.netsafe.org.nz/wp-content/uploads/hwl-learning-objectives-diagram.pdf, viewed 3/16/10.
92 ACMA (2009), op. cit. p. 99.
93 Films, Videos, and Publications Classification Act 1993, ss 131-132A.
94 O'Neill, R. (2010), New Zealand's internet filter goes live, Computerworld, Stuff NZ, http://www.stuff.co.nz/technology/digital-living/3434754/New-Zealands-internet-filter-goes-live, viewed 27/04/10.


According to the government, the list of sites that the system offers to block includes only child abuse materials; it is reviewed monthly, and has a clear appeals process outlined in a public Code of Practice.95 In the process of filtering, an internet user's IP address is made anonymous, and data logs are kept for only 30 days. As in Australia, the New Zealand Government characterises the filter as just one tool in making the net safer:

The filtering system is a response to community expectations that the government and ISPs should do more to provide a safe internet environment. It is not a silver bullet that will prevent everyone from accessing any sites that might contain images of child sexual abuse, but it is another important tool in the Department's operations to fight the sexual abuse of children. (Keith Manch, New Zealand Internal Affairs Deputy Secretary, 16 July 2009)96

By focusing solely on child abuse material, the filter is designed to block only material that would be objectionable (and therefore illegal to possess) under the Films, Videos, and Publications Classification Act anyway. Any person who circumvents the filter to possess or trade publications that promote or support the sexual abuse of children will continue to be liable under this Act. It should also be noted that the filter covers only a subsection of objectionable material. Publications that promote or support other practices, such as bestiality and acts of torture, are also objectionable but are not subject to the filter.97

The plan has attracted criticism from online rights groups like InternetNZ, which fear scope creep and question the overall effectiveness of the scheme, echoing criticisms of the proposed mandatory net filter in Australia:

It risks leaving parents feeling that the Government is providing a safe environment, but it cannot deliver on that promise. The filter would only help at the margin, and child abuse material would still be available on the internet. The filter would disrupt the end-to-end connectivity that has made the internet the useful tool it is today. It creates some confidentiality concerns, and is not subject to all the usual lawful checks and balances that apply to all other parts of New Zealand's censorship regime.98

The rollback of the Digital Strategy initiatives and the discontinuation of the Digital Development Council are, from the perspective of the principles which ground this report, regressive moves. While the current filter is opt-in and remains appropriately narrow in its scope, the focus of NZ government policy has moved away from a forward-looking commitment to enhancing and resourcing user agency and digital citizenship. The extent of this policy shift remains to be seen.

2.1.3. Digital Britain


In Britain there is an historic split between telecommunications and broadcasting law. Reform and liberalisation led to the 1983 Telecommunications Bill, which allowed for the sale of British Telecom and set up the Office of Telecommunications (OFTEL) in 1984. OFTEL gradually came to handle more aspects of the industry as technologies changed: quality of service for mobile networks, cable TV regulation, education and public access, data security, and convergence more generally between TVs, telephone networks and computer networks.

On the broadcasting front, with the BBC already in existence through royal charter, the Television Act of 1954 was the first legislative attempt at governing broadcasting. Subsequent broadcasting acts
95 New Zealand Department of Internal Affairs (2010), Digital Child Exploitation Filtering System Code of Practice, January 2010, available at http://www.dia.govt.nz/diawebsite.nsf/wpg_URL/Services-Censorship-Compliance-Digital-Child-Exploitation-Filtering-System?OpenDocument, viewed 3/11/10.
96 Manch, K. (Deputy Secretary, New Zealand Department of Internal Affairs) (2009), Web filter will focus solely on child sex abuse images, Media Statement, 16 July 2009.
97 For the full list of characteristics of objectionable publications, see Films, Videos, and Publications Act 1993, s 3.
98 InternetNZ (2010), Position Paper on Internet Filtering and the DIA's Digital Child Exploitation Filtering System, available http://internetnz.net.nz/media/media-releases-2010/internetnz-rejects-centrally-operated-filtering-for-new-zealand, viewed 15 March 2010.


were passed in 1973 and 1990, the latter of which created the Independent Television Commission. Two other agencies supervised compliance: the Broadcasting Complaints Commission and the Broadcasting Standards Council. Commercial radio was regulated separately by two different agencies: the Radio Authority and the Radio Advertising Clearing Centre. Cable television was regulated by the Cable Authority, created in 1984 as a result of the Cable and Broadcasting Act; the Cable Authority eventually merged with the Independent Television Commission. As a result, businesses with diverse broadcast and telecommunications interests had to report to several agencies whose duties overlapped significantly. The complexity and redundancy of over-regulation, its cost to industry, and obsolete laws forced the government to propose a unifying Communications Bill in 2003, which allowed for a converged regulator, Ofcom, to take responsibility for five agencies.99 Now, TV, radio, telecommunications and online content are all regulated under the one umbrella. The BBC, which remains subject to separate supervision from its governors, must now comply with Ofcom rules for the industry. The Communications Bill, however, does not supersede other relevant legislation in this area: the Wireless Telegraphy Act 2006, the Broadcasting Acts 1990 and 1996, and the Competition Act 1998.

As the UK communications regulator, Ofcom oversees the wholesale and retail markets for all data networks. It also has a statutory duty to promote media literacy and to manage online risks. Ofcom's annual communications market report100 shows Britain's need for continual engagement in this rapidly changing sector. Thanks in large part to the BBC's online iPlayer, which was receiving 70 million online requests per month in 2009, more than a quarter of households claimed to have watched TV programs online; that rises to a third among 15-24 year olds.101 Catch-up TV has been helped by the increased availability and take-up of broadband connections, wider access across computer platforms, heavy marketing campaigns, and distribution direct to television sets and gaming consoles. For example, Virgin Media, BT Vision and Tiscali TV now offer catch-up content directly through TV sets rather than through a computer.

Digital Britain & Digital Economy Bill 2009-2010

Released in June 2009, Digital Britain102 was an attempt to synthesise and co-ordinate a consistent approach to Britain's digital future, ensuring universal access to services and providing regulatory stability. The program outlines the liberalisation of spectrum for 3G, funding and investment for 3G networks, and plans to increase digital participation and improve digital skills, with a strong emphasis on self-regulation as a first practice. UK governance measures are also subject to and shaped by EU governance initiatives in this area, which are discussed in greater detail below. The former government sought to utilise Digital Britain as a means of promoting a range of industry efforts to increase trust and user confidence in the digital economy.

Digital Britain does not explicitly outline online safety or user confidence programs. Rather, it reinstates support for changes made after the Byron Review's Safer Children in a Digital World report, which examined the effects of harmful content on young internet users. The Byron Review noted that a number of organisations work independently to develop and deploy online safety initiatives, and recommended one single approach. The UK Council for Child Internet Safety (UKCCIS) came into being in order to fit this brief: a stakeholder organisation with a focus on voluntary codes of conduct. The Council addresses online child exploitation through law enforcement, and education and awareness activities. Alongside UKCCIS, and part of the UK Police, is the Child Exploitation and Online Protection
99 For a more nuanced discussion of this shift, see García-Murillo, M. (2005), Regulatory responses to convergence: experiences from four countries, Journal of Policy, Regulation and Strategy for Telecommunications, 7, 1, p. 20.
100 Ofcom (2010), Communications Market Report 2010, available at http://stakeholders.ofcom.org.uk/market-data-research/market-data/communications-market-reports/cmr10/uk/, accessed 4/04/11.
101 See the BBC's figures on iPlayer: http://blobfisk.com/wordpress/wp-content/uploads/2009/12/bbc_iplayer-01.PNG, viewed 2/4/2011.
102 Department for Business Innovation & Skills, Department for Culture, Media & Sport (UK) (2009), Digital Britain, available http://www.culture.gov.uk/what_we_do/broadcasting/5631.aspx/, viewed 3/18/2010.


(CEOP) Centre, dedicated to eradicating the sexual abuse of children. CEOP runs the ThinkUKnow website. Since 2006, ThinkUKnow has reached over 4 million children and young people; in 2008-2009 alone, over 3,500 local professionals and industry volunteers were trained. Increasingly, UKCCIS and CEOP team up to produce and support each other's campaigns, and even host each other's events and websites.

The controversial Digital Economy Act 2010103 implements aspects of Digital Britain. The Act:
- Extends the role of Ofcom to include reporting on communications infrastructure and media content;
- Imposes obligations on internet service providers to reduce online copyright infringement via a "three strikes and you're out" policy, cutting off or degrading persistent illegal file-sharers' internet connections;
- Allows the Secretary of State to amend copyright legislation to the same end without parliamentary consent (the hotly contested Clause 17 has since been defeated);
- Commits to giving courts the power to block websites that are infringing copyright; and
- Extends the range of video games that are subject to age-related classification.

With the passage into power of the new government in the UK, however, the Digital Britain scheme has taken a backseat, with portions of the agenda, including the web blocking clauses of the Act, under review.

UK Digital Content Self-Regulation

Internet Watch Foundation

Unlike Australia, Britain does not have primary legislation covering offensive content on the internet. Content deemed illegal under the Obscene Publications Act and the Public Order Act of 1986, the relevant offline legislation, is also illegal in the online context; however, there is no specific legislation relating to inappropriate content for adults in the online world. In its place, the UK encourages a self-regulatory approach, employing the services of the Internet Watch Foundation (IWF), established for the purpose of eliminating images of child pornography hosted anywhere in the world, as well as criminally obscene and criminally racist content hosted in the UK. The IWF works alongside law enforcement agencies worldwide and operates a notice and take-down procedure in relation to content on UK sites, and a list of international child abuse sites that ISPs can block at the network level. The Internet Services Providers Association (ISPA), the key UK industry association, indicates that all major ISPs, as well as the majority of smaller providers, have implemented the database. A critique of the IWF's strengths and weaknesses forms the basis for a discussion about future regulatory models in Section 3 of this report.

The work of the IWF is supported by other self-regulatory initiatives. The ISPA has developed its own Code of Practice to deal with the issue of inappropriate content, which all of its members are expected to adhere to. This Code of Practice mandates that members must comply with take-down notices issued by the IWF, and requires ISPs to provide relevant user details to the police. ISPA also urges its members to provide sufficient information about filtering tools to all of their customers.

103 http://www.parliament.uk/briefingpapers/commons/lib/research/briefings/snha-05616.pdf


Other Internet Content Regulation

The Good Practice Principles on Audiovisual Content Information104 were developed to ensure that consumers are able to make informed choices about the content they access in a fast-moving media environment. They were launched in February 2008. Participants include AOL, BBC, Bebo, BT, Channel 4, Five, Google, ITV, Microsoft, Virgin Media, and Yahoo! Five of the principles that providers sign up to are:
- Promoting and enabling media literacy through the provision of content information;
- Offering content information in order to empower users and allow them to make informed choices about the content that they and their families access, consume and watch;
- Offering information about content that may be harmful or offensive to the general public, and that may be unsuitable for children and young people. In particular, content information is designed to enable parents and carers to exercise supervision over the content viewed by those they are responsible for;
- Employing editorial policies that reflect the context in which their content is delivered; and
- Presenting content information in a way that is easy to use and understand, while the exact format may vary from provider to provider according to context.

Mobile Content Self-Regulation

Mobile companies have set up three self-regulatory bodies to deal with digital content that is deployed over their networks. The PhonepayPlus Code of Practice covers premium content (similar to Australia's Telephone Services (Mobile Premium Services) Determination 2005). The Independent Mobile Classification Body is responsible for setting a Classification Framework for certain new forms of mobile Commercial Content, against which content providers can self-classify and provide age-based access controls.105 The classifications include advice on violence, sex, nudity, language, drugs, horror, and imitable violent techniques.

With the increased internet functionality of mobile phones, and because the Independent Mobile Classification Body framework does not encompass the mobile internet, the Mobile Broadband Group has developed and updated its own self-regulatory code: the UK code of practice for the self-regulation of new forms of content on mobiles.106 The Code covers new types of content, including visual content, mobile gaming, chat rooms and internet access. The Code acknowledges that mobile operators have no control over the content that is offered on the internet and are therefore unable to insist that it is classified in accordance with the independent classification framework. Mobile operators therefore offer a filter on their internet access service so that the internet content thus accessible is restricted. The filter is set at a level that is intended to filter out content approximately equivalent to commercial content with a classification of 18+.

104 http://www.audiovisualcontent.org/BSG%20Good%20Practice%20Principles%20on%20Audiovisual%20Content%20Information%20One%20Year%20On%20March%202009.pdf, viewed 18/3/10.
105 Independent Mobile Classification Body (2005), IMCB Guide and Classification Framework for UK Mobile Operator Commercial Content Services, available at http://www.imcb.org.uk/assets/documents/ClassificationFramework.pdf, viewed 18/3/10.
106 Mobile Broadband Group (2009), UK code of practice for the self-regulation of new forms of content on mobiles, available at http://www.mobilebroadbandgroup.com/documents/mbg_content_code_v2_100609.pdf, viewed 3/18/2010.


Case Study: Click Clever Click Safe

As discussed above, the UK Council for Child Internet Safety is responsible for the implementation of the Byron Report. It brings together over 140 organisations and individuals to help children and young people stay safe on the internet. It was launched by the Prime Minister on 29 September 2008 and is composed of companies, government departments and law enforcement agencies, charities, parenting groups, academic experts and others. In 2009, the Council released Click Clever Click Safe: The first UK Child Internet Safety Strategy.107

Figure 7. Click Clever Click Safe Digital Code

2.2. Regional and Transnational Efforts


A key recommendation of this report is that the Australian government work with national and international industry groups to ensure cooperation and alignment of policy and law in key international fora. The scope and nature of internet use, production and distribution make it clear that no one nation can work productively in isolation when it comes to managing the risks and the opportunities of the convergent media environment. Increasingly, private sector organisations across the online, social and mobile media fields are transnational, and are grappling with managing differing policy and legal requirements in different countries while maintaining a global approach to content and service delivery. It is clear that the complex risks and opportunities that define the convergent media era require international collaboration between government and industry, and the sharing of knowledge and research in the field. In this section we examine some examples of current international fora that provide a potentially productive avenue for these exchanges from an Australian perspective. We also recommend, however, that the Convergence Review consider how effective and appropriate the current international fora are in relation to international and industry-based dialogue, and make recommendations accordingly.

2.2.1 The European Union


Practical efforts to regulate the internet supra-nationally in the EU began in the early 1990s. In 1994, the European Commission issued a paper calling for the creation of a regional regulatory framework: Europe's Way to the Information Society: An Action Plan. Two years later, the European Consultative Commission on Racism and Xenophobia recommended that laws concerning illegal material in European Union member states be transposed to the online world, stating: "We hope the EU take all needed measures to prevent internet from becoming a vehicle for the incitement of racist hatred."108 This call was echoed in April 1996 by the then Irish Minister of State for European Affairs, Gay Mitchell, who encouraged the EU to investigate controls on the transmission of child pornography on the internet at a joint meeting to discuss the commercial exploitation of
107 Statistics derive from a DCSF Staying Safe Survey, 2009, quoted in UKCCIS's Click Clever Click Safe, the First UK Child Internet Safety Strategy, available at http://www.education.gov.uk/publications/eOrderingDownload/Click-Clever_Click-Safe.pdf, accessed 4/04/11, p. 3. See also the Click Clever Click Safe website, available at http://clickcleverclicksafe.direct.gov.uk/index.html
108 (Reuters, 1996) in Yaman Akdeniz, The Regulation of Pornography and Child Pornography on the Internet, available online at http://www2.warwick.ac.uk/fac/soc/law/elj/jilt/1997_1/akdeniz1/#a5.3 (accessed 13/09/10).


children organised by UNICEF and the Council of Europe.109 The EU Safer Internet Action Plan came into being as a four-year action plan spanning 1999-2002, with a budget of 25 million Euros. The Safer Internet Action Plan was a three-pronged approach aimed at fostering "a favourable environment for the development of the internet industry by promoting safe use of the internet and combating illegal or harmful content".110 The three areas identified for action were:

1. The creation of a safer online environment through the establishment of a European network of hotlines and the encouragement of self-regulation and the use of codes of conduct.
2. The development of filtering tools.
3. Awareness-raising.111

The time period initially covered by the Plan was subsequently extended until 31 December 2004, with a corresponding increase in the budget of 13.3 million Euros. In 2005, the European Council established the Safer Internet Plus program, a follow-on initiative to the Action Plan that covered the years 2005-2008.112 The plan was then re-extended for another four-year period covering 2009-2013. An evaluation of the program issued by the European Commission on 3 November 2003 stressed that the Plan had had a positive impact in fostering networking and educating end-users about the safer use of the internet. In particular, the report concluded that:

The programme has done a good job in producing a number of filtering software products although take-up of rating needs to be increased. Moreover, not all stakeholders agree that filtering is the best approach to child protection. At the policy level, the programme has been successful in putting the issues of developing a safer internet firmly on the agenda of the EU and the Member States; at action-line level, the Commission has instigated the development of a network of hotlines in Europe with associated members in the USA and Australia, funded research into tackling awareness-raising with end users, stimulated the development of filtering and supported the development of an international rating system; the programme has been successful in linking up stakeholders to produce a community of actors, although the Commission is disappointed by the lack of industry involvement as well as self-regulation organisations and consumer groups.

In addition, the authors of the evaluation recommended extending the objectives of the programme to encompass new and emerging communication technologies (e.g. 3G mobile telephones) that will influence children's use of the internet.113 The majority of these recommendations were taken up in subsequent programs, and the latest reincarnation of the Safer Internet Program, with a budget of 55 million Euros, is aimed at regulating not only illegal content but also harmful conduct such as grooming and bullying online.114 The latest program also encompasses so-called Web 2.0 communication services, and is aimed at developing expert knowledge about existing and emerging uses, risks and consequences of online technologies for children's lives, including the technical, psychological

109 Ibid. See also Feeley, Matthew J., EU Internet Regulation Policy: The Rise of Self-Regulation, 22 B.C. Int'l & Comp. L. Rev. 164 (1999), available at http://heinonline.org/HOL/Page?handle=hein.journals/bcic22&div=10&g_sent=1&collection=journals#170
110 European Union Legislation Summaries, http://europa.eu/legislation_summaries/information_society/l24190_en.htm (accessed 22/09/10); see also Electronic Frontiers Australia, available at http://www.efa.org.au/Issues/Censor/cens3.html (accessed 13/09/10).
111 European Union Legislation Summaries, http://europa.eu/legislation_summaries/information_society/l24190_en.htm (accessed 22/09/10).
112 Decision No 854/2005/EC of the European Parliament and of the Council of 11 May 2005 establishing a multiannual Community Program promoting safer use of the Internet and new online technologies, available at http://eur-lex.europa.eu (accessed 14/09/10).
113 COM (2006) 663, available at http://europa.eu/legislation_summaries/information_society/l24190_en.htm (accessed 24/09/10).
114 See the European Union Safer Internet Program Factsheet, available at http://ec.europa.eu/information_society/doc/factsheets/018-safer-internet.pdf (accessed 24/09/10).


The program co-funds educative and self-regulatory initiatives, bringing together researchers at the European regional level, as well as providing end-users with national contact points to enable the reporting of illegal content or conduct online.116 Since 2004 the European network of awareness centres has worked with the International Telecommunication Union to organise Safer Internet Day,117 an awareness-raising day involving European and non-European countries that now takes place annually.

The European Union Audiovisual Media Services Directive (AVMSD)

The AVMS Directive sets out how every EU government should regulate online TV-like content such as YouTube and is currently the primary set of regulations pertaining to internet content in the region. First developed in the early 1980s in response to satellite broadcasting, the shared European Union audiovisual policy was revised in 1997, 2007 and again in 2010, when it was named the AVMSD.118 After the Directive entered into force on 19 December 2007, member states were given until 19 December 2009 to transpose the regulations into their own domestic legislation. The goals of the AVMSD, as set out on the European Union Audiovisual Services policy website,119 are: providing rules to shape technological developments; creating a level playing field for emerging audiovisual media; preserving cultural diversity; protecting children and consumers; safeguarding media pluralism; combating racial and religious hatred; and guaranteeing the independence of national media regulators. The regulations aim to be supportive and flexible, in order to strike a balance between the protection of users and the development of new business opportunities. The AVMSD differentiates between linear and on-demand services, subjecting the latter to less onerous regulatory standards, as well as seeking to promote member states' use of self- and/or co-regulatory measures while eschewing new licensing schemes.120 The AVMSD factsheet states that through these measures national legislators are thus able to "choose more flexible regulatory arrangements where these enjoy stakeholder support, align with their national legal systems and promise effective enforcement".121 The AVMSD is supplemented by the 1998 and 2006 Recommendations on the protection of minors and human dignity.122 Amongst other things, the Recommendations call on industry to develop positive measures, such as harmonisation through cooperation and the exchange of best practices between the regulatory, self-regulatory and co-regulatory bodies of the Member States, and to consider the possibility of creating filters for harmful content and instituting content labelling systems for material distributed online. The Recommendations also urge member states to encourage media literacy and responsible use of the internet amongst children, and to implement complaints-based or remedial systems in regards to the distribution of harmful or illegal content.
115 Europe's Information Society Website, available at http://ec.europa.eu/information_society/activities/sip/policy/programme/index_en.htm (accessed 24/09/10).
116 European Union Safer Internet Program Factsheet, available at http://ec.europa.eu/information_society/doc/factsheets/018-safer-internet.pdf (accessed 24/09/10).
117 See ITU and European Commission Joint Press Release, issued 10 February 2009, available at http://www.itu.int/newsroom/press_releases/2009/01.html (accessed 24/09/10).

118 See http://ec.europa.eu/avpolicy/reg/history/index_en.htm, accessed 23/03/11.
119 Available at http://ec.europa.eu/avpolicy/reg/tvwf/index_en.htm, accessed 23/03/11.
120 See Modern Rules for Audiovisual Europe Factsheet, available at http://ec.europa.eu/avpolicy/docs/reg/avmsd/fact_sheet_en.pdf, accessed 24/03/11.
121 Ibid.
122 Recommendation of the European Parliament and of the Council of 20 December 2006 on the protection of minors and human dignity, available at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32006H0952:EN:NOT, accessed 4/04/11.


2.2.2. The Internet Governance Forum (IGF)


The IGF is a forum for multi-stakeholder policy dialogue, created under the auspices of the UN Secretary-General in order to perform the mandate assigned to it at the World Summit on the Information Society (WSIS).123 The 2007 round of the IGF, organised by the EU, put child safety on the internet on its agenda for the first time under its theme "Promoting Cyber-Security and Trust". Experts representing the EU's Safer Internet Program attended in order to increase awareness of online safety. There is a regional body for the Asia-Pacific under the IGF mantle.124

2.2.3. The Organisation for Economic Co-operation and Development (OECD)


At the Seoul Ministerial Meeting on the Future of the Internet Economy in June 2008, officials acknowledged that governments, the private sector, civil society and the internet technical community need to work together to understand the impact of the internet on minors so as to enhance their protection and support when using the internet. They also called for increased cross-border co-operation between governments and enforcement authorities in the areas of improving cyber-security, combating spam, and protecting privacy, consumers and minors.125 Subsequent to the Seoul Declaration, the OECD Working Party on Information Security and Privacy (WPISP) implemented an initiative on the protection of children online aimed at:
Enhancing mutual understanding of existing and planned policy approaches for the protection of children online.126
Exploring how international co-operation might be used to better protect minors on the internet.
The initiative is divided into two phases: firstly, a joint symposium between APEC and the OECD that took place on 15 April 2009 in Singapore,127 and secondly, a report issued by the OECD that will be publicly available in 2011 and will include an overview of the nature of risks faced by children online, an overview of policy responses to address these risks, a policy analysis of commonalities and differences in approaches to protecting children online, and a discussion of possible avenues to reduce gaps and increase cooperation.128 The OECD does not address child pornography in general but rather focuses on protecting children who use the internet. It does, however, seek input from the Council of Europe on these matters.

Conclusion
In this section we have mapped other national approaches to managing convergent media content in comparable developed countries. Our analysis suggests that the most effective approaches to media content governance involve government working cooperatively with industry to ensure voluntary regulation and governance of content and to give users options and information. It is also clear that a comprehensive and effective national strategy for managing media content must include funding to promote user confidence and digital literacy.
123 See IGF Website, available at http://www.intgovforum.org/cms/aboutigf (accessed 23/09/10).
124 See http://rigf.asia/asia-pacific-regional-igf-aprigf-roundtable-agenda/#aprigf (accessed 24/09/10).
125 See OECD Website, available at http://www.oecd.org/document/59/0,3343,en_2649_34255_44096251_1_1_1_1,00.html (accessed 23/09/10).
126 Ibid.
127 APEC-OECD Joint Symposium on Initiatives among Member Economies Promoting Safer Internet Environment for Children, available at http://www.oecd.org/document/17/0,3746,en_2649_34255_43301457_1_1_1_1,00.html, accessed 4 April 2011.
128 Ibid.


In this section we also considered current transnational efforts to provide fora in which government and industry can work together in a global context. Given the global nature of the internet, it is clear that international cooperation will play an increasingly vital role in governance.


section 3: towArds A new Policy FrAmework


Introduction
The challenge of regulating media content in the convergent era provides a critical opportunity to examine the rationale behind regulation and to engage key stakeholders in a dialogue that centres on the purpose, scope and future of media content governance. Sections 1 and 2 of this report have shown that Australia is now facing a host of challenges with emergent online and mobile technologies. The purpose of this section is to identify flexible, adaptable and pragmatic principles that could underwrite a media governance system purpose-built for the 21st century.

Australia, like the rest of the world, has moved from old-generation phone networks to next generation networks. Convergence is driving this change, facilitated by the transition from analogue to digital, voice to data, narrowband to broadband, circuit-switched to packet-switched, one way to interactive, scarcity to abundance.129 Australia's content regulation policies currently fail to acknowledge all of these changes, and as outlined in Sections 1 and 2, the framework for dealing with this transition is complex. When ACMA was formed in 2005 some pressing questions were addressed; however, others remain unanswered. For instance, in a convergent media environment, who should regulate all platforms and content? Should platforms that have the capacity to effectively substitute one for the other be subject to the same modes of regulation and governance? Moreover, what about IPTV or mobile TV? Should mobile TV regulation be as strict as that of broadcasting, or as lenient as it is with regard to the internet? Can a deal be struck in order to accommodate the diverse nature of media platforms and content? ACMA acknowledges these questions, identifying its main future challenge as "regulating in a multi-provider, multi-choice environment where existing regulatory tools were developed in a single-provider/single-service model".130

The shift to Next Generation Networks in Australia, most notably to the National Broadband Network, forces Australia to identify and apply new approaches to governance sooner rather than later. Most critically, regulatory bodies must recognise the critical role that active online communities now play in producing and self-regulating online material, and must determine what role industry and government can play in facilitating and supporting this end-user agency. As we noted in our introduction, Australia is moving away from the legacies of a vertical media environment to a horizontal environment in which it is critical to pay attention to the different roles played by networks, platforms and content providers. At the network layer, following international best practice in this area, policy makers should focus on ensuring network openness, innovation and user choice. At the platform and content provider layers, government should work with industry and users, including in global fora, to encourage self-regulation.

The challenges of content regulation do not simply arise within our own national borders. In a connected world, where transnational ICT companies such as eBay, Amazon, Yahoo!, Microsoft, Google and Facebook offer services to billions of internet users within their proprietary platforms, the challenges of content regulation multiply. Websites such as the Russian-designed Chatroulette.com, hosted on servers beyond Australian jurisdiction, demand an active engagement with private companies, both in Australia and abroad. More than ever, Australian content regulation must be viewed in the context of international discussions about internet and media content governance. For this reason, throughout this report we have preferred the term "governance" to "regulation", as the former indicates the importance of policy makers considering the full range of policy instruments at their disposal.

129 ITU & infoDev (2010), New Technologies and Their Impact on Regulation, ICT Regulation Toolkit, available at http://www.ictregulationtoolkit.org/en/section.3118.html, viewed 27/04/2010.
130 Sutherland, Kevin (2009) International Training Program Next Generation Networks: communications convergence & regulatory challenges, ACMA Melbourne 30 November 2009, available at http://www.acma.gov.au/webwr/_assets/main/lib311250/next_generation_networks.pdf, viewed 21/04/10.


Our objective in this section is to explore the opportunity for a coherent content governance scheme that is flexible and responsive as well as pragmatic. Our model is grounded in an adaptive rather than a clean-slate approach, given that the latter represents an ideal rather than a pragmatic model. In preparing this section we hosted a roundtable discussion in order to hear expert commentary from Dr Peter Chen, from the University of Sydney, Peter Leonard, from the law firm Gilbert and Tobin, and David Simon, a former member of the Classification Board, and drew on a comprehensive international review of relevant literature. This section will cover the guiding principles that we argue should underpin media content governance, look at redefining classification and content to accommodate rapid change, and explore how to build Australia's capacity to engage in a constructive dialogue with internet companies such as Twitter, Facebook, eBay and Google. We conclude this section with our recommendations.

3.1. Clear Objectives And Guiding Principles


3.1.1 Who Is Responsible For Managing Convergent Media Content?
Questions as to who exactly should preside over internet governance are most often raised in relation to discussions of the internet as a global networked space.131 At present there is no one international body with jurisdictional reach sufficient to oversee policy in this global space. Governing convergent media is a complex task that must necessarily take the full spectrum of engagement with the internet into account: from children to parents, from the end-user to the wider community, to government and industry cooperation. The following section addresses the question of agency and governance by analysing the advantages and disadvantages associated with content governance through three archetypal lenses: those of government, industry, and the individual user. Ultimately, for reasons of practicality, enforceability, transparency and accountability, we argue that each of these stakeholders plays a concrete and indispensable role in the new convergent media environment, requiring the development of detailed public law and policy that can facilitate collaboration between all three. We begin this section by canvassing some of the conventional arguments for and against the role of government, industry and user groups in media content regulation and governance, as a prelude to discussing the principles that should guide regulation and governance in the future.

Government

The conventional argument for the role of national government in regulating media content is that government has the political mandate, the economic means and the infrastructure to cope with the procedural setup of regulatory frameworks, as well as the power to ensure the enforcement of laws via criminal sanctions, and that the state should therefore be solely responsible for overseeing policies and laws in the media and communications domain.132 Yet in the convergent media environment it is clear that no one nation state can take unilateral action on a global scale, and the creation of multiple national regulatory frameworks in the international sphere could lead to users being faced with conflicting and contradictory regulations. Nation state governments clearly have a remit to enforce the laws of their country and to protect public policy priorities when it comes to cultural and social parameters. Their ability to enforce this remit is restricted, however, by the sheer volume of media content as well as the decentralisation and vast number of media producers.

131 See Mayer-Schonberger, Viktor (2003) The Shape of Governance: Analyzing the World of Internet Regulation, 43 Virginia Journal of International Law: 605-673, p. 611. 132 Ibid, p. 612.


For most nation states, internet governance involves balancing the risks and opportunities of networked media. This balancing act involves weighing the perceived harms of material deemed damaging to the community or national interest against the economic, educational and socio-cultural opportunities provided by access to information and services offered through convergent media. When a given nation state increases regulation to balance these risks and opportunities, industry groups can often respond by moving to a state with different laws, thereby evading regulation and highlighting the difficulty the nation-state faces in acting as a unilateral regulator on the global stage. As Mayer-Schonberger puts it, in the internet economy the market incentives are tilted against the states and their enforcement efforts.133 If the nation-state is the ultimate authority for the regulation, arbitration and enforcement of internet activity within its own sovereign jurisdiction, the logical corollary is that it in fact has no jurisdiction to regulate activity outside of its own borders. International law does recognise that in some limited circumstances a state may exercise jurisdiction extraterritorially, for instance over its own nationals when they are outside the state's territory.134 Furthermore, in cases of extreme crimes, such as incitement to genocide or the distribution of child pornography, the principle of universal jurisdiction allows any state to exert jurisdiction over a perpetrator residing in its territory regardless of their nationality or the location of their crime. However, neither the principle of extraterritoriality nor that of universality provides a state with jurisdiction to regulate content hosted offshore simply because that state deems it offensive or culturally harmful. Legal theorist James Boyle argues that the technology of the medium, the geographical distribution of its users and the nature of its content all make the net especially resistant to state regulation: the state is too big, too slow, too geographically and technically limited to regulate a global citizenry's fleeting interactions over a mercurial medium.135

While the state is necessarily limited in its jurisdiction to regulate global flows of content, it must still formulate policies at the domestic level. For instance, its role in apprehending and punishing those responsible for clear-cut cases of criminal behaviour online, such as the distribution of child pornography, is essential and undeniable. Furthermore, the state is the only legitimate stakeholder that is capable of contributing to the development of policies between other states on a bilateral or multilateral basis, and is therefore indispensable to the future of the convergent media landscape. As we note below, global internet companies also have an important role in ensuring international cooperation and a vested business interest in doing so. Clearly, nation state governments must retain a robust role in convergent media governance. However, it is a role that must be strengthened through collaboration with other nation states, users and global industry groups.

Industry

Industry itself plays an increasingly active role in empowering and educating users through the adoption of new technologies that focus on safety and privacy concerns.
In the convergent environment it is important to note that the term industry collapses a diverse group of actors who face different issues and who need to be understood, in regulatory and policy terms, in distinct ways: telecommunications companies, internet service providers, platform providers and, in some cases, professional content providers. We are effectively moving from a vertical to a horizontal system of network layers in the convergent environment, and it is critical that we do not simply aggregate the existing legislation.

133 Ibid, p. 617.
134 The objective territorial principle was first introduced as a theory of extraterritorial jurisdiction by the Permanent Court of International Justice in The Case of the SS. Lotus. See Walter C. Dauterman Jr., Internet Regulation: Foreign Actors and Local Harms - at the Crossroads of Pornography, Hate Speech, and Freedom of Expression (2002) 28 North Carolina Journal of International Law & Commercial Regulation 177, p. 185.
135 James Boyle, Foucault In Cyberspace: Surveillance, Sovereignty, and Hard-Wired Censors (1997), available online at http://www.law.duke.edu/boylesite/foucault.htm (accessed 13/09/10).


There is a legitimate and key role for private companies to play alongside government, especially in areas where old-media models of content regulation do not hold. Platform providers, for example, have at their disposal the tools of exclusion, removal and referral to law enforcement authorities to encourage ethical online behaviour. Barring users who fail to abide by predetermined cultural norms from membership and participation is one popular tool, although while membership of popular online communities can be taken away, this does not necessarily prevent people from rejoining (sometimes using a different email address or name). The removal of harmful content impedes other users from inadvertently coming across it, and referral to law-enforcement agencies where conduct has clearly crossed into the criminal sphere strengthens industry's capacity to arbitrate and enforce.

Large private companies engaged in platform or search engine provision in the online space equally have to be aware of managing their brands in relation to user communities and perceptions of how flexibly and transparently they support the needs and views of those communities. We see a clear need for companies such as Facebook, Google and Twitter to be cognisant and respectful of the desire of users to participate in the governance of their community spaces. The furore around privacy settings on Facebook in 2009-2010, when user preferences were not heeded, led to the creation of Quit Facebook Day on May 31, 2010. While it did not create any substantial decrease in Facebook's user numbers, it did increase awareness of concerns about privacy settings. Concerns about privacy and security were further provoked when Google created its social networking and messaging tool, Google Buzz, which resulted in a complaint being filed with the Federal Trade Commission in the US. The FTC found Google in breach of its own privacy principles and has instructed that it undergo regular privacy audits for the next 20 years.136 How responsively and responsibly internet companies listen to user concerns and incorporate them into their own development and governance may underpin the success of business models in the convergent media environment in the future.

In a transnational context, the legal obligations of private companies remain unclear on a range of legal and policy issues. This includes censorship and freedom of speech issues, and the past decade has seen concern grow regarding the way in which companies are balancing the goal of making money alongside the need to be ethical corporate citizens. A notable case was when Yahoo!, under coercion from the Chinese government to comply with strict local laws, revealed data including login times, corresponding IP addresses, and relevant email content belonging to the Chinese journalist Shi Tao, leading to his conviction and imprisonment on a ten-year sentence.137 Brian Israel argues this is not simply a moral problem for large technology companies; it is also about business: "This business quandary is the result of conflicting standards to which ICTs are simultaneously subject: the local regulations of authoritarian states, and a global standard informed by international human rights norms and societal expectations in the companies' home markets."138

In some cases, social networking sites have responded to privacy concerns by making changes to the manner in which personal information can be used. This was demonstrated by Facebook's responses to the Office of the Privacy Commissioner of Canada.
In 2009, the Commissioner investigated a series of allegations about Facebook's default privacy settings and the ways in which it uses data, in the context of Canada's Personal Information Protection and Electronic Documents Act. The Commissioner found Facebook's privacy information confusing: Facebook's account settings, for example, described how to deactivate a Facebook account but did not explain how to delete an account so that personal data is removed from Facebook's servers. The Commissioner also found problems with over-sharing of users' personal information with third-party developers who create Facebook applications. Facebook complied with all of the Commissioner's requests and made significant changes to its platform. Facebook now has a low, medium or high privacy setting, giving more granular control to users. The company also introduced a per-object privacy tool, giving users control at the time of uploading or sharing.
136 FTC Charges Deceptive Privacy Practices in Google's Rollout of its Buzz Social Network (2011), http://ftc.gov/opa/2011/03/google.shtm (accessed April 2, 2011).
137 For an extensive survey of the ways in which Yahoo! and other companies cooperate with the Chinese Government, see Human Rights Watch (2006) Race to the Bottom: Corporate Complicity in Chinese Internet Censorship, Volume 18, No. 8(C), available at http://www.hrw.org/en/reports/2006/08/09/race-bottom, viewed 24/04/10.
138 Ibid, p 619.

There is also a new privacy tour for new registrants.139 The new settings affected all Facebook users, not just those in Canada. The investigation is a powerful example of how individual nations are reaching beyond their borders to regulate transnational ICT companies.

The End-User

There has been a shift away from the terminology of the consumer towards terms such as users and digital citizens. These terms are most commonly applied in a general sense to mean people who use networked technologies, and this partly reflects the forms of highly active engagement that can occur in these spaces. The suggestion of passivity that comes with consumers does not adequately reflect the way people engage in mobile and online environments. How can we better understand the roles of users, the kinds of actions they can take, and how they actively shape our media ecology? Rather than thinking about users as solitary individuals seeking out content, they can be better understood as active agents within a participatory culture.140 Users are driving the public culture of the internet, evidenced in the growth of blogs, social media sites, video and photo sharing services, and within the comments structures of all mainstream news and discussion sites. This kind of everyday participation makes or breaks online communities and internet businesses, and it is an essential part of the contemporary media industry. This represents a significant disruption of previous media models: the consumer, who once received a finished product at the end of an economic chain of production, has become an active player in a dynamic cycle of ever-changing content. Users determine where and whether a community will develop online, and how long it will last.

But the role of users goes far beyond simply joining up with services, accessing data and then commenting on whether it is suitable or offensive. Through their participation, they create normative language and behaviours, thus determining what will become the acceptable uses of an online space. Everything, from bonding and discussion, to fights, criticising and trolling, to creating content, downloading and simply listening to other users, creates a current of activity that eventually shapes online engagement for other participants.141 This process is one that needs to be taken seriously by media regulators. Indeed, users can be considered the most essential part of online governance. At a basic level, they can self-regulate within their own communities, using systems such as self-rating, reporting and inbuilt complaint mechanisms, such as those seen on Facebook or YouTube. These kinds of self-regulation mechanisms are also capable of crossing borders where state regulation cannot. Critical thinking by user-groups is arguably the most effective protection available against the proliferation of harmful content. But beyond these basic functions, and more importantly, users play a vital role in determining what our convergent environments look like and how they function. From the normative effects produced by user communities to the capacity for creating thriving spaces of human interaction, users are at the heart of why things work or become non-functional. We argue that the next stage of media governance must carve out a much larger role for users at a national and supra-national level, and that they should be included within all the key bodies that consider media content.

139 Stoddart, J. (Canadian Privacy Commissioner) (2009) Press Release: Facebook agrees to address Privacy Commissioner's concerns, available at http://www.priv.gc.ca/media/nr-c/2009/nr-c_090827_e.cfm, viewed 27/04/2010.
140 Jenkins, H (2006) Convergence Culture: Where Old and New Media Collide. New York: New York University Press.
141 Burgess, J and Green, J (2009) YouTube: Digital Media and Society Series. Cambridge: Polity Press.


Which Mode of Governance is Preferable?

Given the current complexity of the convergent media landscape, it is clear that reliance on any one stakeholder group alone is insufficient when it comes to governance of the online environment. Pitting each regulatory extreme against the other achieves very little. Contemporary media users exercise an unprecedented level of choice and control over the content they consume and, indeed, are frequently sources of content themselves. This digital literacy is shaping media users into active media citizens who expect industry and government to consult with and inform them about the risks and opportunities of media platforms and content. Media users are a stakeholder group who have been insufficiently recognised in the conventional regulatory framework for managing media content.

A critical issue that was raised earlier, when considering the efficacy of purely government-based forms of regulation, is the practical capacity of government agencies to regulate user-generated content. To give but one example: 24 hours of video are uploaded to the internationally available site YouTube every minute of every day, and the site is home to two billion views per day.142 The amount of material generated and viewed, some of it ephemeral, is clearly beyond the capacity of any national or international regulatory body to monitor and regulate in real time. In practical terms, there are simply not enough people, or enough hours in the day, to monitor and flag the sheer volume of content created by users on a daily basis. YouTube, like other platforms discussed in this report, has confronted this issue by adopting clear protocols concerning the type of content deemed appropriate for publication and by actively enlisting its users to flag inappropriate content, which is then placed in a queue, reviewed and taken down, or, where appropriate, referred to the relevant authorities. In this feedback loop, users accept moral agency for abiding by the guidelines of the particular site and flagging inappropriate content; industry equally assumes responsibility for providing users with a complaints and flagging mechanism and for outlining clear guidelines for interacting with the site. The dialogue initiated by industry and participated in by user-groups is then mediated by the third stakeholder, government, which assumes responsibility for acting when illegal content is referred to it. A mixture of all three forms of governance offsets the weaknesses of each mode of regulation with the strengths of the others. The procedural success of these feedback loops, as seen on platforms such as YouTube and Facebook, demonstrates the potential for similar mechanisms to be employed on a larger scale as a means to encourage and facilitate interaction, transparency and accountability between user-groups, industry and government. In their current form, these loops offer users minimal power to contribute; however, we argue strongly that this should change in the future, with users being given greater responsibility to act as responsible agents in the governance and mediation of the platforms in which they participate, as well as in the overarching state rules of media governance. Industry, government and end-users can strengthen this combination of efforts by engaging in a dialogue in which each is equally a participant and a beneficiary.
For instance, government and industry can work to increase the awareness and education of the user, and community and industry can in turn educate the government about community perceptions and ethical considerations. This quid pro quo relationship should be used to advance effective policy pursuits and to facilitate legitimacy and moral agency. In turn, transnational efforts to govern internet content should not subjugate this tripartite framework to their own whims, but rather should feed into the system as another tool in Australia's policy armoury to inform users, to engage the industry, and to regulate harmful activity online.

142 Chapman, Glenn (2010) YouTube serving up two billion videos daily, AP, May 16.


3.2. Content & Classification


3.2.1 Redefining Content
The internet acts as a conduit for multiple kinds of media content to circulate. Some is originally created, while other content has been uploaded from television, radio, games, film and other sources. Equally, content flows in the other direction, with TV shows, films and radio programs drawing content from the internet. Trying to think of media forms as discrete, contained silos to be treated differently becomes very difficult in this environment. Rather, it is a media ecology: a dynamic, complex system of media genres and spaces which are closely interlinked.143

If we consider the media ecology model, content is no longer defined by where it first appears. A radio program may be broadcast on the FM band, then made available as a podcast; someone may take a component of it and add it to an interview that appears on YouTube. Someone else may sample part of that podcast and include it on a track that appears on SoundCloud, which may then come full circle and be played on radio. This kind of platform shifting and reappropriation of content is very common. Content is no longer medium specific, and should not be defined as such.

Until recently the vast majority of circulating media content was comprised of material produced by media professionals for commercial or public interest purposes. While domestic media technologies have been accessible to amateurs for many decades, the means of distribution were limited, particularly in the case of audiovisual material. In the last ten years the equation has been inverted, and much more material is available online from non-professionals, created for a wide variety of purposes ranging from updating friends about a family holiday to political advocacy. Now it makes little sense to understand media content as though it were produced and distributed by an elite group of identified producers who are relatively easy to regulate.

In the 2009 report Untangling the Net: The Scope of Content Caught By Mandatory Internet Filtering,144 the authors argued that Australia should not transpose the current system of media content classification, which already treats different media inconsistently, to the online environment. The internet is not a singular medium; it is a whole new media environment that incorporates many media forms. This requires us to rethink how we regulate content, protect vulnerable groups and define the relationship between media consumers and media producers. Content regulation in Australia has developed over many decades in an ad hoc manner and has not been built on sufficient empirical evidence about actual media consumption and community attitudes to media use. As we've shown in Sections 1 and 2 of this report, media content has been narrowly defined under existing law, and the definitions do not take into account the new and emerging media ecology of dynamic content. The current Federal Government has inherited a complex and inconsistent system for the regulation of networked media content.

143 Fuller, Matthew (2005). Media Ecologies: Materialist Energies in Technoculture. Cambridge, MA: MIT Press.
144 Lumby, C., Green, L. and Hartley, John (2009), Untangling The Net: The Scope of Content Caught By Mandatory Internet Filtering, http://jmrc.arts.unsw.edu.au/jmrc-public-reports-and-submissions/-untangling-the-net/.


3.2.2 Rethinking Classification


In thinking through approaches to classification it is important to bear in mind that the classification laws and guidelines have been developed over many years with reference to traditional media content and its distribution. Changes have been made to the definitions and to the regulatory scheme as new media technologies have evolved. However, the internet must be understood as part of a complex media ecology in which conventional notions of media content and distribution are being radically challenged. It is for this reason that we welcome the Australian Law Reform Commission's review of the National Classification Scheme. We offer the following analysis as a contribution to this timely exercise.

The current Federal Government has renewed its commitment to using the Refused Classification (RC) category as a framework for filtering both locally and internationally hosted content. Australia's National Classification Code guidelines are extremely broad, and the RC category is one the courts are inclined to leave to the discretion of the Classification Board and the Review Board. This raises genuine public interest concerns when one considers an online environment in which the range, scope and purpose for which material is produced and consumed is far wider than that encountered in media produced by professional media organisations for information and entertainment purposes. Adding weight to this concern is that the Classification Act states that when classifying material the persons or class of persons to or amongst whom it is published or intended or likely to be published must be taken into account. This provision in the Classification Act reflects a set of assumptions about the way material is generated and consumed which maps onto the traditional media environment, but which has far less bearing in an era where material migrates rapidly across many contexts and where an enormous amount of content is generated by consumers themselves.

"Refused Classification material is a porous category," argues Peter Leonard, a partner at Gilbert and Tobin, "and it is subject to political whim and scope creep." RC is broadly defined by the National Classification Code (NCC) as covering items including those that "offend against the standards of morality, decency and propriety generally accepted by reasonable adults". In this scenario, it is only the opinion of the Classification Board that is taken into consideration, and what constitutes standards of morality, decency and propriety is not properly defined. Moreover, much of the material deemed RC in Australia would not be refused classification in other Western democratic liberal countries. The legislation does not offer an exhaustive list of RC material. The majority of material listed as falling within the RC category is not illegal to possess or access under Federal law, or under State or Territory laws, and some material which would be deemed RC if contained on the internet would not receive the same categorisation if contained in other media such as printed material. The RC category is a classification category developed for a media era in which media content was produced by the few for the many. It is a manifestly inadequate category to govern or manage the range of media content and the purposes for which it is produced, consumed and distributed in the convergent media era.
The RC category catches a wide range of material that may well have been produced and used in ways that meet community and public interest standards in the convergent media era, and we therefore welcome the current review of the National Classification Scheme.


3.2.3. Community Standards


So what content should be considered harmful and be filtered? What considerations need to be taken into account? There is no internationally agreed uniform definition of offensive content. In Australia, the parameters of social mores and community perceptions differ markedly across cities, rural and regional areas and across ethnic and religious groups. Whilst sexually explicit material is certainly not the only category of material capable of offending against community standards, it is the category that often attracts the most public media attention and the category that is highest on the list of material submitted for classification by bodies such as the Office of Film and Literature Classification (OFLC). Yet it is equally a category that exemplifies an apparent gap between the application of classification laws and systems and public opinion.

A 1992 empirical survey suggested that, having regard to community standards of morality, decency and propriety, the majority of Australian adults are not offended by films that primarily involve various forms of actual sexual activity, including close-ups, between consenting adults and in which there is no depiction of coercion or violence. The survey, conducted by the OFLC and titled Exploring Attitudes Towards Film, TV and Video Classification, found that 70% of a representative sample of adults supported the availability of sexually explicit material to consenting adults (aged over eighteen).145 This study strongly suggests that reasonable Australian adults, whether they choose to consume sexually explicit material or not, are not offended by it. A 2006 survey of a representative sample of 1499 Australians by A C Nielsen found that 76% of adults (weighted to the Australian population) or 75% (weighted to the online population) agreed with the statement that: "Films and videos primarily involving various forms of actual sex, including close-ups, involving consenting adults with no coercion or violence should be available (on a restricted basis) to people aged over 18 who wish to view or purchase it."146

The findings of recent and empirically sound Australian studies are also supported by similar research internationally, in particular research undertaken in the United Kingdom, a common law country with an approach to censorship similar to Australia's. A 2005 survey of public opinion commissioned and published by the British Board of Film Classification of more than 11,000 adults in the UK found that a clear majority thought the sex standards in the classification guidelines were about right.147 The current UK guidelines permit the sale of sexually explicit films to adults in licensed sex shops. The respondents were also asked whether they agreed or disagreed with the statement that: "People over the age of 18 have a right to see graphic portrayals of real sex in films and videos/DVDs." Nearly 30 per cent were neutral in their attitude (compared with 23 per cent in a 2000 survey by the same body), while exactly half agreed (compared with 46 per cent in 2000) and only 22 per cent disagreed (down from 31 per cent in 2000).

There is also strong evidence that most Australians are far more concerned by portrayals of violence (sexual and non-sexual) in films than they are by sexually explicit material. In a 1999 Roy Morgan survey, 46% of respondents cited depictions of violence as of concern in films and only 11% nominated sexually explicit material.
The 2006 Nielsen survey found that 71% of Australians supported adults being able to see films including violence, while 54% supported adults being able to see films that included sexual violence. In both cases, the figures are substantially lower than the percentage of people who support the availability of sexually explicit films (78%).

145 Potter, H. (1996), Pornography: Group Pressures and Individual Rights, Federation Press, Sydney, p. 85 146 A C Nielsen (2006), Film and Video Content, A C Nielsen, Sydney, p 5. 147 British Board of Film Classification (2005), Public Opinion and the BBFC Guidelines, London.


More research needs to be undertaken into community attitudes towards media content, particularly given the recent changes in its production, use and distribution. Our review of the history of regulation indicates that Australian media classification systems have not, to date, been built on sufficient empirical evidence about actual public attitudes or about actual media consumer behaviour. We recommend further research as a critical part of developing new policy in this area.

A key question that needs to be explored in rethinking media content governance is how criminal law intersects with other forms of governance. The issue needs to be understood both from the point of view of regulatory scope and from that of supporting and broadening stakeholder roles, given the key roles that industry and media user groups now play in potential notification. Child abuse material offers a useful, if disturbing, case study here. While it is imperative that the availability and distribution of child abuse material be strictly prohibited and prosecuted, it isn't necessary to differentiate between child abuse material distributed on the internet and child abuse material distributed in any other place or medium. It is clear that the production and dissemination of most child pornography begins with a primary crime: a sexual assault on a child. Producing, distributing and consuming images of the assault compound the crime and are, rightly, considered criminal activities, and should be regarded as such regardless of where the images are distributed. If we focus regulatory and public resources on over-regulating the presumed largest point of distribution, the internet, we risk pulling resources away from evidence-based strategies to prevent child abuse and identify and prosecute perpetrators and producers of this material. That is not, of course, to say that law enforcement resources should not be directed to identifying the online producers and consumers of such material. And clearly Australia needs a body, currently ACMA, that maintains and enforces a blacklist of child abuse related websites. We also need a whole-of-society approach to the problem of child abuse and material produced in its wake. If we build our media content governance policy around worst-case scenarios, we risk basing it on the lowest common denominator. We should, however, be strongly guided in policy by experienced law enforcers in the area and by social policy experts with evidence-based knowledge of how to prevent the abuse of children.

A key question for media content regulation online is to what extent we can and should rely on existing criminal laws to differentiate the content that is harmful and worthy of censorship from that which should be left to individual discretion. In exploring this question it is critical to acknowledge the role that active online communities, facilitated by industry, can play in notifying evidence of crime they encounter online. As a UK House of Lords Select Committee has observed: "It is important to distinguish between illegal material and material that is legal but which some would find offensive. Self-regulation is an appropriate tool to address the latter. Dealing with illegal material is a matter for the courts and the law enforcement agencies."148 Peter Leonard argues a better balance must be struck between the criminal codes and the classification codes: "Ultimately this has to be solved by finding the middle ground between what is currently RC and what is criminal."

148 House of Lords, Select Committee on Science and Technology (1996) Information Society: Agenda for Action in the UK, Session 1995-96, 5th Report, London: HMSO, 23 July 1996, available at http://www.parliament.the-stationery-office.co.uk/pa/ld199596/ldselect/inforsoc/inforsoc.htm, para. 4.163


3.3. A Research-Led Approach


We argue that an understanding of online risk must be balanced with an appreciation of the opportunities that our new convergent media environment provides. The UK Byron Review clearly states that public policy and regulation that is genuinely and empirically grounded in an ethic of care for children and young people will fail if it relies too heavily on a simplistic block-and-control strategy. It also strongly suggests that getting the balance right between regulation and the education of parents and young people about safe internet use is critical to the overall effectiveness of a protection strategy. In broader terms, focusing too much public attention and government policy on filtering material detracts from the need to promote and propagate the use of the internet much more widely, for purposes including education, science, journalism, imaginative work, health and community-building. The internet offers new opportunities for innovation, entrepreneurship and the growth of knowledge, and it extends these opportunities across the population.

The Australian Communications Consumer Action Network (ACCAN) argues the first step is to conduct well-resourced empirical research to determine what sorts of skills Australians need to become empowered consumers in the communications and media market.149 ACCAN suggests looking at the Ofcom pyramid model of digital skills, with the base representing access to broadband, built upon by digital life skills (the ability to acquire and develop the digital skills needed for employment and beyond), and media literacy representing the top tier (the ability to use, understand and create media and communications).150 Recognising that digital media literacy is a key concern when it comes to regulation, ACMA has worked since 2007 on conducting scholarly research. While a growing clearing house of research and information hosted by ACMA has come into existence,151 the regulator's own audit of services around the country shows a fragmented, state-by-state approach to the issues of digital participation.152

Australia is certainly well behind its EU counterparts in commissioning and acting on research into the social, cultural and economic impact of the convergent media environment. While there has been a great deal of expert attention focused on the network components of delivery, in Australia few resources have been made available to enable researchers to assist both government and industry to understand and act on the challenges of the convergent media era. We strongly recommend that government and industry work together in the Australian context to promote relevant and responsive research into community attitudes to media content and into emerging media use, distribution and production practices. As technologies and platforms evolve, new challenges will continue to emerge.

149 Australian Communications Consumer Action Network (ACCAN) (2009), Future Consumer: Emerging Consumer Issues in Telecommunications and Convergent Communications and Media, p 20.
150 Ibid, p 21.
151 http://www.acma.gov.au/WEB/STANDARD/pc=PC_311474
152 ACMA (2009), Adult digital media literacy needs (August 2009), available at http://www.acma.gov.au/webwr/_assets/main/lib310665/adult_digital_media_literacy_needs_research.pdf, viewed 23/04/10.


section 4: conclusion
Contemporary networked media are all now part of a complex ecology which draws together previously disparate platforms and participants, including governments, media industries and an international community of users. Online content is highly dynamic: it crosses borders, is constantly produced and consumed for an enormous variety of purposes, and the technologies and access points are in flux. In this environment, it is clear that media governance needs to be flexible, based on up-to-date research, grounded in international dialogue, and conducted through active dialogue and collaboration among the key players who constitute the networked environment.

There is no question that media governance has become more complex. However, this report has argued that there is a clear path forward, based on equitable and effective principles. The inconsistencies of the current media regulation system need to be remedied. We can no longer think of media forms vertically, existing in individual silos such as television networks, radio, newspapers, film and so forth. Rather, we need to think across the shared horizontal layers of convergent media: networks, platforms and content. Content can be accessed on a multitude of devices, from mobile phones to tablets to laptops and internet-enabled games consoles and televisions. Policies need to be technology-neutral in order to adapt and remain useful.

Forms of media policy that offer the most flexibility and effectiveness for the 21st century will maximise opportunities for users to participate in various spaces, while also enabling them to filter content at their own end point of the network. Users have more agency to shape convergent media environments, and should be welcomed into governance processes as full participants. Similarly, media and technology companies need to respect the needs of users as digital citizens, and to maximise their opportunities to have a say in the design of platforms, including privacy controls and transparency about how their data is used. New models of media governance will allow for both self-regulatory and co-regulatory frameworks, as we have seen in the various international examples in this report. Further, they need to emphasise the importance of media literacy, creativity and education. Encouraging users to develop their skills and knowledge will be a more effective basis for a thriving convergent environment than punitive top-down approaches, except in the case of clearly criminal behaviour. It is clear from international experience and research that network-level filters do not increase media literacy, nor do they create a perfectly safe internet. Rather, network filtering is an opaque system that is open to abuse, and it gives the mistaken impression that all offensive or illegal content can be removed entirely. Frameworks for acceptable content are more effective, encourage users to engage critically with the spaces they use, and avoid the chilling effects of total network filtering.

In summary, media content governance in the 21st century needs to move away from the top-down approach that has dominated content regulation in the past and embrace a system grounded in co- and self-regulatory approaches, emphasising user agency and literacy. Government has a clear role in ensuring that industry groups commit to robust codes of practice and in promoting active industry and user engagement in policy development. Industry needs to resource and demonstrate a commitment to working collaboratively to give users a voice in how their data is managed and how platforms develop, and to enable users to notify inappropriate content. In an era when it is media users themselves who are creating and exchanging much of our media content, it is essential that they are recognised as full digital citizens and given a clear role in media content governance and policy.


4.1. Recommendations
The authors of this report make the following recommendations:

1. The Creation Of A Convergent Media Board. The Board should be comprised of representatives from government, industry and user groups. The Board should have a broad remit: to consider social, cultural and regulatory issues in relation to convergent media content and to identify areas for further policy debate and research. It should not be charged with arbitrating individual complaints about convergent media content, as it will be separate from existing regulatory mechanisms and bodies, including ACMA. The Board's role is to engage with emerging technologies and to track the issues, innovative potential and community concerns that arise regarding media content production and distribution. It is the body that identifies any gaps in a broad-scale self- and co-regulatory system. The Board would provide the essential linking forum where government, industry and user groups can work together. Finally, it will act as Australia's centralised point of contact with international fora addressing media content governance. We also note the critical role that ACMA plays in media governance and the importance of ensuring it is adequately resourced to monitor complaints about media content and codes of industry practice.

2. A Full Review Of The Laws That Currently Regulate Media Content. Current media content regulation works in confusing ways across criminal codes, state laws and federal laws. These laws need to be reviewed to promote consistency across Australian states and to take account of the convergent media environment. We argue there is a clear need for a national R18+ category for games and that research into community attitudes supports this move. On the issue of the Refused Classification (RC) category, we argue that its current framing is too broad and uncertain in its scope, and that it should not be used as a mechanism for filtering online material. We note that the Classification Act does not offer sufficiently detailed criteria for determining whether content is RC, and it is time the category was given careful review.

3. Government Commitment To A Self-Regulatory Approach To Media Content Management, Including The Use Of Filters. The proposed mandatory internet filtering plan should be abandoned: the scope of filtered content is far too broad and opaque to the general community, and the scheme creates a false sense of security in the community rather than enhancing user agency and literacy. In the convergent media era government needs to commit to an approach to media content management that focuses on working collaboratively with industry to enhance the capacity of users to identify and notify inappropriate media content in a transparent system.

4. Industry Commitment To Codes Of Practice That Enhance User Agency. As the range of platforms for media content multiplies it is critical that industry groups commit to updating codes of practice and that compliance with these codes is monitored by the Convergent Media Board in concert with ACMA. Further, industry should demonstrate a clear commitment to enhancing user capacity to identify and notify inappropriate content, to protecting user privacy, and to giving user communities a say in how their data is used and how platforms are managed.


5. The Funding Of Ongoing And Excellent Research. The federal government should work with industry to adequately fund expert research conducted by existing government entities and academic researchers, to ensure Australia stays in touch with public attitudes, user behaviours and emerging technologies.

6. Support For Media Literacy Education. Government and industry should work together to fund substantial programs in schools and in the community that ensure Australians have the skills and understanding to engage with convergent media with responsibility, knowledge and security. These programs should be based on research and designed in concert with educators. Media literacy should form part of a standardised national curriculum and include education about online security, ethics and literacy.

7. The Building Of National And International Frameworks And Links to ensure that government, industry and user groups have input into Australian policy and law making. In the convergent media environment, government, industry and media users need to work together, and they require fora which ensure that their dialogue has concrete public policy and law outcomes. The Convergent Media Board should actively identify and promote links with relevant international fora and agencies, and work in concert with ACMA to ensure Australians are positioned to reap the opportunities of the convergent media era while minimising the risks.


Appendix One: The History of Australia's Regulatory Environment


The primary pieces of legislation dealing with the regulation of telecommunications and broadcasting in Australia are the Telecommunications Act 1997, the Telecommunications (Consumer Protection and Service Standards) Act 1999 (which regulates, among other things, telephone sex services) and the Broadcasting Services Act 1992 (BSA). Schedule 5 of the BSA, establishing a comprehensive co-regulatory regime for stored internet content, was inserted by the Broadcasting Services Amendment (Online Services) Act 1999 (OSA). Schedule 7 of the BSA was inserted by the Communications Legislation Amendment (Content Services) Act 2007 (CSA) to extend content regulation to live streamed internet content, content delivered to mobile devices, and services that provide links to content. Australian communications law is thus grounded in a technically complex legislative and regulatory landscape that developed piecemeal as new technologies and content services came into being. Two key developments in regulating online and mobile media are the Online Services Act and the Content Services Act, which are outlined below.

The Online Services Act 1999


In summary, the Online Services Act:

- Established a complaints-based system whereby complaints initiated by members of the public about prohibited content or potentially prohibited content (defined in the amended BSA, Schedule 5, ss 10 and 11) would be investigated (the broad decision logic of the scheme is sketched in the example following this list)
- Characterised prohibited content as RC or X-rated material, or R-rated material if it was hosted within Australia and not subject to a restricted access system (Schedule 5, s 10(1) BSA 1992, as amended by the Online Services Act 1999)
- Applied to Internet Content Hosts (ICHs) and ISPs alike (ss 3 and 8 respectively), although different standards applied to each
- Gave ACMA discretion to disregard vexatious or frivolous complaints
- Provided for the ABA (now ACMA), where prohibited content was being hosted within Australia, to issue a final take-down notice to the ICH requiring that it remove the offensive material within 24 hours (s 30(1))
- In relation to content hosted overseas, provided for the ABA to direct ISPs to carry out blocking measures in accordance with an industry-specified code or, if no code was in place, obliged ISPs to use all reasonable efforts to prevent access to content hosted offshore (a standard access prevention notice) (s 40(1))
- Directed ACMA to have regard to the technical and commercial feasibility of taking the steps in determining what constituted reasonable efforts (s 40(2)(a))
- Provided that an ISP did not need to block overseas-hosted content if an ABA-approved alternative access prevention arrangement was in place which provided a reasonably effective means of preventing access to prohibited content (s 40(4))


- Gave the ABA the power to issue special take-down notices or special access-prevention notices, prohibiting ICHs from hosting, and requiring ISPs to block, any content that was the same, or substantially the same, as prohibited content identified in a prior take-down notice (s 47)
- Required that industry codes be developed (s 60)
- Mandated that notices had to be complied with by no later than 6pm the next business day (ss 37(1)-(3))
- Exempted ICHs and ISPs from liability for breach of State or Territory laws in regard to carrying offensive material if they were unaware of its presence (s 91(a) and (c)), and with respect to any requirement by a State or Territory that an ISP or ICH monitor, make inquiries about or keep records of internet content carried or hosted by them (s 91(b) and (d))
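The interaction between take-down notices and access-prevention notices under the OSA can be hard to follow from the statutory summary alone. The following sketch, in Python, is purely illustrative: it restates the broad decision logic described in the list above, and the function and category names are our own shorthand rather than terms drawn from the Act.

from dataclasses import dataclass

@dataclass
class Complaint:
    classification: str       # e.g. "RC", "X", "R", "PG"
    hosted_in_australia: bool
    restricted_access: bool   # whether a restricted access system is in place
    vexatious: bool = False

def is_prohibited(c: Complaint) -> bool:
    # RC and X-rated content is prohibited outright; R-rated content is
    # prohibited only if hosted in Australia without a restricted access system.
    if c.classification in ("RC", "X"):
        return True
    return (c.classification == "R"
            and c.hosted_in_australia
            and not c.restricted_access)

def handle_complaint(c: Complaint) -> str:
    # The regulator could disregard vexatious or frivolous complaints.
    if c.vexatious:
        return "complaint disregarded"
    if not is_prohibited(c):
        return "no action required"
    if c.hosted_in_australia:
        # Australian-hosted prohibited content: take-down notice to the host.
        return "take-down notice issued to internet content host"
    # Overseas-hosted prohibited content: access prevention via ISPs, under an
    # industry code or 'all reasonable efforts' (standard access prevention notice).
    return "ISPs directed to prevent access (industry code or notice)"

print(handle_complaint(Complaint("X", hosted_in_australia=True, restricted_access=False)))
# -> take-down notice issued to internet content host
print(handle_complaint(Complaint("RC", hosted_in_australia=False, restricted_access=False)))
# -> ISPs directed to prevent access (industry code or notice)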

The Internet Industry Codes:


- Came into force on the 1st of January 2000
- Were developed by the Internet Industry Association (IIA) in consultation with internet industry groups and end users, and negotiated with the ABA
- Encouraged end-user responsibility for filtering
- Encouraged education of children and other end users about the dangers of the internet
- Encouraged ISPs to use appropriate labelling systems (Content Code 1, cl 5.2(a); Content Code 3, cl 7.2(a))
- Mandated that ISPs should provide appropriate filtering software to end users via a link to a download or via an installation disk (Content Code 2, cl 6.2)
- Mandated that ISPs should take all reasonable steps to ensure that end users are aged 18 years or over (Content Code 1, cl 5.1; Content Code 3, cl 7.1)
- Mandated that users should be provided with information on their rights and responsibilities under the Act (Content Code 1, cl 5.5; Content Code 3, cl 7.6)
- Mandated that procedures should be introduced to deal with pornographic spam (Content Code 1, cl 5.7; Content Code 3, cl 7.8)


Criticisms of the Act include:153


- Liability was attached to internet service providers
- The concept of hosting, or of what exactly constitutes a host, was not adequately defined in the legislation
- It was onerous to require ISPs to ensure that content that was the subject of a take-down notice would never be hosted again
- The Act defined Internet Service Provider very broadly and failed to restrict its meaning to persons providing services partially or wholly within Australia, which could mean that foreign ISPs could be held to be service providers for the purposes of the Act
- The provision for substituted notice within the Act (under which an ISP could be held liable for breach of a notice even where it was not aware of the take-down notice, if the ABA considered that it should have been) was particularly burdensome on ISPs
- By increasing regulation of Australian ISPs, the Act would potentially force providers and content producers offshore
- The Act did not address other legitimate community concerns such as the proliferation of hate speech, harassment and online gambling
- Regulatory decisions would fall to commercial filtering services based in states such as the US, whose beliefs and value systems are not necessarily aligned with Australia's
- A CSIRO report determined that packet-level blocking was indiscriminate, that it would inhibit e-commerce infrastructure, that it might affect other services, that routers could easily be circumvented, that it could complicate firewalls, that implementing packet-level blocking would be costly, and that sites could easily be renumbered in order to circumvent blocking154 (the sketch following this list illustrates the over-blocking problem)
- Ordinary email was excluded from the scope of the legislation
- Content standards deemed appropriate for television and film are not necessarily appropriate when applied to the internet
- The system is open to abuse and complaint flooding155
- A whole site might be blocked rather than simply the offensive material hosted on the site
- The Act lacked procedural fairness
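Two of the criticisms above (that packet-level blocking is indiscriminate, and that a whole site may be blocked rather than just the offending material) follow from the fact that an IP-level filter cannot see which page or site on a shared host a user is requesting. The toy Python sketch below is illustrative only; the host names, addresses and blocklist are invented for the example.

# Illustrative only: why IP-level (packet-level) blocking is indiscriminate.
# Blocking the single IP address that hosts one prohibited page also blocks
# every other site sharing that address. All names and addresses are invented.

blocked_ips = {"203.0.113.10"}  # address added to the blocklist after one complaint

# Many unrelated sites commonly resolve to the same shared hosting address.
dns = {
    "prohibited-example.example": "203.0.113.10",
    "community-forum.example":    "203.0.113.10",
    "small-business.example":     "203.0.113.10",
    "unrelated-news.example":     "198.51.100.7",
}

def is_reachable(hostname: str) -> bool:
    # A packet-level filter sees only the destination IP address,
    # not the URL or the individual page being requested.
    return dns[hostname] not in blocked_ips

for site in dns:
    print(f"{site}: {'reachable' if is_reachable(site) else 'BLOCKED'}")
# Only unrelated-news.example remains reachable: the two innocent sites on the
# shared address are blocked along with the prohibited one.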

153 For more information, see: Arasaratnam, Niranjan, 'Brave New (Online) World' (2000) 23(1) University of New South Wales Law Journal 205, pp. 10-13; Chen, Peter, 'Pornography, Protection, Prevarication: The Politics of Internet Censorship' (2000) 23(1) University of New South Wales Law Journal 221, pp. 18-20; Scott, Brendan, 'Silver Bullets and Golden Egged Geese: A Cold Look at Internet Censorship' (2000) 23(1) University of New South Wales Law Journal 215; Chen, Peter, 'Regulating the Internet Censorship? Australia's Internet Censorship Regime: History, Form and Future' (1999) 3 Macquarie Law Review, pp. 121-142; Coroneos, Peter, 'Chapter 4 - Internet content policy and regulation in Australia' [2008] SydUPLawBk 10, in Brian Fitzgerald, Fuping Gao, Damien O'Brien and Sampsung Xiaoxiang Shi (eds), Copyright Law, Digital Content and the Internet in the Asia-Pacific (2008) 49
154 CSIRO, Blocking Content on the Internet, June 1998, available at www.cmis.csiro.au/projects+sectors/blocking.pdf, accessed 25/03/10
155 Arasaratnam, Niranjan, 'Brave New (Online) World' (2000) 23(1) University of New South Wales Law Journal 205, p. 13


The Content Services Act 2007


Spurred on once more by growing technological trends, particularly live internet streaming and content delivery via mobile handset, a new regulatory regime was enacted in 2007. This co-regulatory regime, implemented by the Communications Legislation Amendment (Content Services) Act 2007 (the CSA), synthesised the previously divergent telecommunications and online services regimes in order to apply uniform standards regulating access to prohibited content across the two platforms. The CSA came into force on the 20th of January 2008, introducing a new Schedule 7 to the BSA. The scheme imposes 'a comprehensive, uniform regime on services that were previously regulated under the disparate regimes outlined above, and extends content regulation to live streamed internet content, devices, and to services that provide links to content'.156

The Communications Legislation Amendment (Content Services) Act 2007:


The CSA effectively covers all stored content not already regulated under another regime, on the proviso that it has a relevant Australian connection (that is, that the content is hosted in, or originates from, Australia). The CSA does not cover data storage and back-up services, SMS services, news and current affairs services, broadcasting services, online trading services, voice and video calls between end users, or search engines. The Act inserted a new Schedule 7 into the Broadcasting Services Act 1992 (BSA). This new regulatory framework includes the following elements (the content-category logic is restated in the sketch following this list):

- a prohibition on X18+ and RC content
- a prohibition on R18+ content, unless subject to appropriate access restrictions
- a new prohibition on commercial MA15+ content, unless subject to appropriate access restrictions
- access restrictions to be put in place by providers of hosting services, live content services, links services and commercial content services if providing R18+ or commercial MA15+ content
- Complaints about the availability of prohibited or potentially prohibited content are first directed to the service provider, and then to ACMA if the complainant is unsatisfied with the service provider's response; ACMA also has the power to initiate its own investigations
- The Act is grounded in a co-regulatory approach that provides for the development of industry codes to address issues including 'the classification of content, procedures for handling complaints about content and increasing awareness of potential safety issues associated with the use of content services'157
- Commercial content service providers are required to arrange for a trained assessor where they consider content to be either prohibited or potentially prohibited, requiring them to make a priori judgments about which material falls into this category
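The layered treatment of content categories under Schedule 7, as summarised in the list above, reduces to a small set of rules: X18+ and RC content is prohibited outright, while R18+ and commercial MA15+ content is prohibited only where appropriate access restrictions are absent. The Python sketch below is an illustrative restatement of that logic only, not a reproduction of the statutory tests, and the function name is our own.

# Illustrative sketch of the Schedule 7 (CSA) content rules described above:
# X18+ and RC are prohibited outright; R18+ and commercial MA15+ are
# prohibited only where no appropriate access restrictions are in place.
# This is a simplification, not the statutory test.

def schedule7_status(classification: str,
                     access_restricted: bool,
                     commercial: bool = False) -> str:
    if classification in ("X18+", "RC"):
        return "prohibited"
    if classification == "R18+":
        return "permitted" if access_restricted else "prohibited"
    if classification == "MA15+" and commercial:
        return "permitted" if access_restricted else "prohibited"
    return "permitted"

# A few checks against the rules listed above.
assert schedule7_status("RC", access_restricted=True) == "prohibited"
assert schedule7_status("R18+", access_restricted=True) == "permitted"
assert schedule7_status("MA15+", access_restricted=False, commercial=True) == "prohibited"
assert schedule7_status("PG", access_restricted=False) == "permitted"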

156 Lindsay, D, Rodrick, S and de Zwart, M, 'Regulating Internet and Convergent Mobile Content' (2008) 58 Telecommunications Journal of Australia 31.1-31.29, p. 31.5
157 See ACMA Media Release, 21 December 2007, available at http://www.acma.gov.au/WEB/STANDARD/pc=PC_310907 (accessed 3/09/10).


Criticisms of the legislation have included:


- Content developed by users could potentially place hosts in a position of liability where they have not determined in advance that content is prohibited or potentially prohibited, or provided age verification measures.158
- It is difficult to envisage how to subject MA15+ and R18+ material to age-verification procedures without the existence of a uniform age identifier, particularly in relation to user-generated content sites or sites where there are no procedures for such verification.159
- Age verification online in an Australian context is particularly difficult in the absence of a universal social security identifier, leaving content providers to grapple with the practical and economic burden of verification.
- Complexity exists in identifying which services fall within the regime, in addition to the difficulty of differentiating between the various categories of services included in the regime.160
- Content that is streamed from a country outside Australia does not fall within the legislation and therefore eludes regulation.161

158 See Coroneos, Peter, 'Chapter 4 - Internet content policy and regulation in Australia' [2008] SydUPLawBk 10, in Brian Fitzgerald, Fuping Gao, Damien O'Brien and Sampsung Xiaoxiang Shi (eds), Copyright Law, Digital Content and the Internet in the Asia-Pacific (2008) 49, p. 63
159 Ibid, p. 63
160 Lindsay, D, Rodrick, S and de Zwart, M, 'Regulating Internet and Convergent Mobile Content' (2008) 58 Telecommunications Journal of Australia 31.1-31.29, pp. 31.7-31.8
161 Ibid, p. 31.9


