Arts Law’s AI Submission
In July 2023, Arts Law and NAVA (with the support of the Australian Society of Authors) conducted a survey to understand how artists and creators are using AI and the potential impacts on Australian creators. These results informed the submission made by Arts Law in August 2023 in response to the Department of Industry, Science and Resources (DISR) discussion paper ‘Safe and Responsible AI in Australia’. The discussion paper sought submissions on whether further governance mechanisms are required to mitigate the risks from AI and increase public trust and confidence in its use, with a focus on governance mechanisms to ensure AI is used safely and responsibly.
Arts Law’s submission focusses on the potential risks to artists and creators from AI that are not addressed in the DISR ‘Safe and Responsible AI in Australia’ Discussion Paper (Discussion Paper), and on the fact that existing regulatory approaches in Australia do not encourage or require responsible AI practices. Arts Law notes that the Discussion Paper does not consider the negative impact of AI on creators, creative organisations and communities, or the impact this technology will have on First Nations creators and Indigenous Cultural and Intellectual Property (ICIP).
While the Discussion Paper does not seek to consider all issues related to AI (for example, intellectual property including copyright), Arts Law did refer to our submission dated 7 March 2023 in response to the Attorney-General’s Department ‘Copyright Enforcement Review’ Issues Paper (December 2022) and to the ongoing obligation and burden placed on individual creators and copyright owners to enforce their copyright and moral rights. Similar enforcement issues arise with AI when copyright and moral rights infringement occurs: again, the individual creator and copyright owner must pursue the infringer, and there is no regulator to step in on behalf of these rights holders. Accordingly, any consideration of governance mechanisms to ensure AI is used safely and responsibly should also include consideration of the impact and risks AI poses to creators and creative communities and organisations in Australia, and in particular the impact on First Nations creators and the protection of ICIP.
The submission also focusses on the survey results, which demonstrate that while almost 40% of responding creators are using AI tools as part of their creative process, there is still overwhelming concern about the threat generative AI poses to creative output, ownership of copyright material and employment prospects for creators. In fact, 64% of respondents think that generative AI is a threat to the creative professions, 48% are concerned that generative AI will affect the amount of money a creator can make from creative work (34% are not sure), and 51% are concerned that generative AI will be used to replace human creators. Some respondents commented that ‘some artforms will become less viable and others will be enabled as a result’ and that ‘transparency of use is essential to this tool not threatening creatives.’
On the issue of whether creators are seeing their work used on a generative AI platform without their permission, we note that only 11% said yes, 28% said no, and 61% said they don’t know. Despite this, there is concern that AI will have a negative impact on Australian creators and that, where the copyright in their work is infringed, they will not have the resources to legally pursue the copyright infringer. We highlighted some of the feedback from respondents, including one creator who stated: ‘You should be asking is there any point in becoming an artist, writer, illustrator, filmmaker, actors? … Since AI popped up it’s ugly head I haven’t had one brief or new freelance job offered in 6 months when normally I would have 10-20!’
Any regulatory action should mitigate the risks of AI to Australian creators and creative communities and provide low-cost options for resolving copyright and moral rights infringement disputes, as discussed further in our submission to the Copyright Enforcement Review Issues Paper. Alongside this, additional funding should be provided to creative and arts peak bodies and organisations, such as Arts Law, the National Association for the Visual Arts (NAVA), the Australian Society of Authors (ASA) and others, to develop ethical principles for AI in the creative sectors and to undertake professional development, develop resources and advocate for compliance with these ethical principles. This will also support the continued growth of Australian creators and creative organisations and communities and ensure detailed consideration of the impact of AI and of what is best for the creative and cultural economy and society as a whole.

Any coordination of AI governance across government should be undertaken alongside Australia’s AI Ethics Principles. This could be supported by the introduction of laws, like those in the UK, which create a statutory duty for regulators to have regard to the UK framework and principles for the responsible development and use of AI. The same approach could be taken in Australia to create a statutory duty to comply with Australia’s AI Ethics Principles. Arts Law’s survey revealed that 93% of respondents support the introduction of either guidelines, regulations, a code of practice or legislative protections to regulate generative AI platforms. Arts Law noted that without the introduction of additional obligations such as the AI Ethics Principles, a burden is placed on individual creators to enforce their rights under the Copyright Act as copyright owners.
Arts Law also submits that, for creators and creative organisations and communities, transparency is most critical and valuable at any stage where copyright material is reproduced, communicated or published without the permission of the copyright owner and without attribution to the creator. This will improve public trust and confidence that AI is complying with the Copyright Act. There should also be transparency where creative material is created by AI rather than by an individual creator (i.e. where the creative work and role of a creator is replaced with AI).
Arts Law would also like to see a government initiative that funds creative and arts peak bodies and organisations, such as Arts Law, the National Association for the Visual Arts (NAVA), the Australian Society of Authors (ASA) and others, to engage the community and deliver education on the importance of an ethical approach to the use of AI and consideration of the impact on artists and creators. If a risk-based approach to AI is taken, it should take into account the risks to the cultural and creative sectors.