Overview of the Bill
The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 (the Bill) amends the Broadcasting Services Act 1992 (the BSA), along with other Acts where consequential amendments and transitional provisions are required. It has three key objectives:
• to empower the Australian Communications and Media Authority (ACMA) to require digital communications platform providers to take steps to manage the risk that misinformation and disinformation on digital communications platforms poses in Australia
• to increase transparency regarding the way in which digital communications platform providers manage misinformation and disinformation
• to empower users of digital communications platforms to identify and respond to misinformation and disinformation on digital communications platforms.
The Bill adds a new Schedule 9 to the BSA. It imposes core obligations on digital communications platform providers to:
• assess risks relating to misinformation and disinformation on their platforms, and publish a report of the outcomes of that assessment
• publish their policy or policy approach in relation to managing misinformation and disinformation
• publish a media literacy plan setting out the measures the provider will take to enable end-users of the platform to better identify misinformation and disinformation.
New Schedule 9 also empowers the ACMA to:
• obtain information and documents relating to misinformation and disinformation from digital communications platform providers
• make rules requiring digital communications platform providers to make and retain records relating to misinformation and disinformation, and to prepare reports consisting of information contained in those records
• approve and register enforceable misinformation codes that have been developed by sections of the digital platforms industry, setting out the measures those sections of the industry will take to reduce the risk of misinformation and disinformation
• in certain circumstances (for example, if misinformation codes do not adequately protect the Australian community from misinformation and disinformation), determine misinformation standards for sections of the digital platforms industry
• make rules requiring digital communications platform providers to implement and maintain a process for handling complaints and resolving disputes about misinformation and disinformation
• publish information relating to misinformation and disinformation.
Schedule 9 to the BSA defines misinformation and disinformation as the dissemination of content on a digital communications platform that, among other criteria, contains information that is reasonably verifiable as false, misleading or deceptive, and is reasonably likely to cause or contribute to serious harm of a specified type. The harm must have significant and far-reaching consequences for the Australian community (or a segment thereof), or severe consequences for an individual in Australia.
The types of serious harm are:
• harm to the operation or integrity of an Australian electoral process
• harm to public health in Australia
• vilification of a group in Australian society distinguished by race, religion, sex, sexual orientation, gender identity, intersex status, disability, nationality or national or ethnic origin, or vilification of an individual because of a belief that the individual is a member of such a group
• intentionally inflicted physical injury to an individual in Australia
• imminent damage to critical infrastructure or disruption of emergency services in Australia
• imminent harm to the Australian economy.
The effect of misinformation and disinformation being defined in this way is that the entirety of Schedule 9 to the BSA (including the ACMA’s regulatory powers and the core transparency obligations imposed on digital communications platform providers) is aimed specifically at addressing the risk that verifiably false, misleading or deceptive content disseminated on digital communications platforms will cause or contribute to one of these types of harm. These harms align with the purposes for which international human rights law allows restrictions to be placed on the freedom of expression.
The measures provided for in Schedule 9 focus on systems and processes, rather than the regulation of individual pieces of content. In line with this intent, there is an explicit statement in Schedule 9 to the BSA that nothing therein – or in any rule, code or standard made, approved or determined pursuant to Schedule 9 to the BSA – can require digital communications platform providers to remove content or ban an account, except in the case of
disinformation that involves inauthentic behaviour.
Schedule 9 to the BSA empowers the ACMA to enforce compliance with digital platform rules, approved misinformation codes or misinformation standards, and core transparency obligations. Enforcement mechanisms available to the ACMA include formal warnings, remedial directions, infringement notices and civil penalties.
https://parlinfo.aph.gov.au/parlInfo/search/display/display.w3p;query=Id%3A%22le...
Vilification: abusively disparaging speech or writing.
Disparaging: expressing the opinion that something is of little worth.
So it IS about policing opinions. And what IS 'inauthentic behaviour'?