7. Court user guidelines to support safe use of AI
Overview
•Guidelines can support court users to use AI safely in Victoria’s courts and VCAT.
•There are differences in AI guidelines for court users across Australia and internationally. The experience of other jurisdictions is useful to inform the scope of guidelines for Victoria.
•AI guidelines should be consistent across Victoria’s courts and VCAT.
•The Supreme Court guidelines should be expanded and updated. The Commission recommends opportunities to reform the Supreme Court AI guidelines by:
–clarifying their scope
–updating principles and including educative information
–providing direction on how court users can use AI to prepare court documents.
•This chapter provides an example of what AI guidelines for Victoria’s court and tribunal users could look like.
How can guidelines support the safe use of AI in courts and tribunals?
7.1Our terms of reference ask us to recommend principles and guidelines for the safe use of AI in Victoria’s courts and tribunals. In doing so, the Commission considered:
•how AI is regulated in other jurisdictions and potential lessons for Victoria
•the rapid development of AI technologies and how this may influence the extent to which such technologies should be adopted and regulated.
7.2Jurisdictions within Australia and internationally have issued guidelines on the use of AI in courts and tribunals to manage the safe use of AI. Most jurisdictions recognise the need to monitor and update guidelines as AI technology evolves and new risks and opportunities emerge. As discussed in Chapter 4, non-legislative approaches such as principles and guidelines can flexibly respond to AI as technology and uses rapidly develop.
7.3This chapter examines current guidelines for court users and makes recommendations for expanding existing guidelines in Victoria. Chapter 8 considers guidelines for judicial officers.
AI guidelines for court users in Victoria
7.4In Victoria, some courts and professional bodies have developed guidelines relating to the safe use of AI by court users. These are discussed below.
Court-issued AI guidelines in Victoria
7.5In May 2024, the Supreme Court issued guidelines to assist lawyers and self-represented litigants when using AI in litigation.[1] The County Court adopted these guidelines in July 2024.[2]
7.6In summary, the Supreme Court guidelines:
•define and explain common AI terms
•highlight limitations and risks of AI tools
•explain that the use of AI is subject to, and must align with, existing legal professional obligations
•encourage court users to check AI-generated text to ensure it is accurate and applicable
•encourage parties to disclose their use of AI to each other and the court where appropriate
•encourage self-represented litigants and witnesses to include a statement about the AI tool used in any AI-generated materials.
7.7The guidelines have not been adopted by the Magistrates’ Court, Children’s Court, Coroners Court or VCAT at this stage. VCAT is developing its own set of guidelines for tribunal users; this work is ongoing.[3]
7.8The Victorian Legal Services Board and Commissioner (VLSB+C) census found that less than a third (29.1 per cent) of Victorian lawyers surveyed had read the existing court guidelines.[4] While the guidelines were generally viewed positively in our consultations, stakeholders identified opportunities for them to be updated and expanded. These opportunities are discussed in this chapter.
Professional body AI guidelines in Victoria
7.9Peak legal professional bodies in Victoria, Western Australia and New South Wales have released a joint Statement on the Use of Artificial Intelligence in Australian Legal Practice.[5]
7.10The statement emphasises that when using AI, lawyers must comply with their ethical standards and professional obligations, as contained in the Legal Profession Uniform Law and conduct rules for barristers and solicitors.[6] It states that lawyers must:
•maintain client confidentiality by not inputting confidential, sensitive or privileged client information into public AI tools
•provide independent advice by applying their own assessment and analysis to AI outputs
•be honest and deliver legal services competently and diligently by verifying AI outputs for accuracy and relevance
•charge costs that are fair, reasonable and proportionate by ensuring the use of AI does not unnecessarily increase costs to clients above traditional methods.[7]
7.11The statement indicates how the existing duties can be practically applied to guide the use of AI by lawyers. However, the VLSB+C census found that there was not broad awareness of this statement among Victorian lawyers, with less than a third (29.2 per cent) being familiar with the statement.[8]
7.12Nonetheless, there appears to be broad agreement and awareness among Victorian lawyers of their duty to ensure AI use complies with their professional obligations. The VLSB+C census found an overwhelming majority of lawyers (96 per cent) either agreed or strongly agreed that they have a duty to ensure their use of AI complies with their professional obligations.[9] However, 80.3 per cent of lawyers agreed or strongly agreed that insufficient guidance on responsible use of AI posed a risk to using AI in legal practice.[10]
7.13In August 2025, the Law Institute of Victoria released the Ethical and Responsible use of Artificial Intelligence ethics guidelines.[11] The guidelines provide examples of precautions lawyers should take when using AI to ensure use is consistent with their ethical obligations and duties. This includes duties of competence and diligence, honesty, confidentiality and privilege, and supervision of legal services.
7.14Some examples of ‘best practice’ AI use provided in the guidelines include to:
•Check the outputs of AI tools for accuracy and relevance.
•Never enter confidential client information or legally privileged information into open source or commercial AI systems without client consent.
•Disclose the use of AI to clients, and be prepared to inform opponents and the Court (if asked about AI use).
•Ensure clients are charged fairly for work completed with AI.[12]
7.15The guidelines provide instruction on privacy and data management and direct lawyers to read the privacy policy and terms of use before using an AI tool. Where available, lawyers should review information about the AI model to understand the training data and any potential for inaccuracy or bias by looking at the AI’s Model Card and/or Data Nutrition Label.[13] The Law Institute of Victoria has also provided educative information about AI on its AI Hub.[14]
7.16In August 2025, the Victorian Bar released Guidance on the Ethical Use of Generative AI. This guideline does not discourage the use of AI by barristers but encourages them to be cautious about potential risks to client confidentiality and integrity of court documents.[15]
7.17The Legal Practitioners’ Liability Committee provides professional indemnity insurance to Victorian lawyers and many of Australia’s national law firms.[16] It has released statements on the limitations and risks of using AI in legal practice[17] and how to manage those risks.[18]
7.18There has been a mixed response by legal workplaces to implementing internal guidelines on the use of AI. The VLSB+C census found that 42 per cent of lawyers surveyed indicated their workplace had established AI guidelines, but 45 per cent noted the absence of such policies. A further 13 per cent were uncertain about the existence of any AI-related guidance.[19]
7.19Guidance on AI use has also been issued by prosecutorial agencies. In 2024, Victoria Police introduced the Victoria Police Artificial Intelligence Ethics Framework.[20] This framework was discussed in our consultation paper. It aims to ensure use of AI by Victoria Police is ethical and consistent with its existing legal and ethical obligations. Additionally, the Australia New Zealand Policing Advisory Agency has released a Responsible and Ethical Artificial Intelligence Framework to support police and forensic services by detailing how AI ethics principles can be operationalised across the AI lifecycle.[21]
Other Australian AI guidelines for court users
7.20Some Australian court jurisdictions have developed or are considering AI guidelines for court users. However, there is currently no common approach. Legal professional bodies across Australia have also released guidelines for lawyers on the use of AI.
Interstate court-issued AI guidelines for court users
7.21In November 2024, the NSW Supreme Court released Practice Note SC Gen 23 – Use of Generative AI.[22] The Practice Note contains guidance for court users about the use of GenAI. It was amended on 28 January 2025, in response to feedback from the legal profession.
7.22The Practice Note contains directions to court users and limits the use of AI in some circumstances. In summary, it:
•defines GenAI and specifies what is excluded from its definition, such as search engines like Google.[23]
•lists shortcomings of GenAI that court users should understand.[24]
•prohibits certain information from being entered into GenAI, if it is subject to a non-publication or suppression order or the implied Harman undertaking. But there are exceptions if the information can be considered ‘preparatory’ such as generating chronologies or summarising documents,[25] and if the information will remain within a controlled environment.[26]
•prohibits the use of GenAI to generate affidavits, witness statements and character references and requires a disclosure statement that GenAI was not used. But there is an exception for ‘merely preparatory’ work and leave may also be granted to use AI in exceptional cases.[27]
•requires that if GenAI is used in written submissions, summaries or skeletons of argument, court users must verify in the document that citations, legal and academic authority, case law and legislation references exist, are accurate and relevant.[28]
•prohibits the use of GenAI to draft or prepare the content of an expert report without prior leave of the court. If leave is obtained the expert must disclose specified information about the use of AI in the report.[29]
7.23The Practice Note has been adopted by the:
•NSW Land and Environment Court[30]
•District Court of NSW[31]
•NSW Civil and Administrative Tribunal.[32]
7.24The NSW Civil and Administrative Tribunal procedural direction notes that modifications have been made to accommodate particular types of proceedings.[33]
7.25In May 2024, Queensland Courts issued The Use of Generative Artificial Intelligence (AI) Guidelines for Responsible Use by Non-Lawyers.[34] The guidelines outline the risks and limitations of AI use for non-lawyers. There is no requirement in these guidelines to disclose AI use. At the time of finalising this report, these guidelines had been updated by Queensland Courts.[35]
Guidelines under development
7.26In April 2025, the Federal Court released a Notice to the Profession on Artificial Intelligence use in the Federal Court of Australia.[36] In this statement Chief Justice Mortimer indicated that the court is considering ‘the development of either Guidelines or a Practice Note in relation to the use of Generative Artificial Intelligence by practitioners and courts users.’[37]
7.27The Court consulted with stakeholders on the development of this work. Submissions to this process closed on 13 June 2025. The notice reminds court users that in the meantime they:
•remain responsible for material tendered to the court
•should only use GenAI in a way that is consistent with their existing obligations
•should disclose their use of GenAI ‘if required to do so by a Judge or registrar of the Court.’[38]
7.28The Supreme Court of Western Australia sought stakeholder feedback on the development of a practice direction on the use of AI by the legal profession in court.[39] Submissions to this process were due by 31 March 2025. At the time of writing the Court had not released a response to the submissions received.
7.29On 30 May 2025, the Chief Justice of South Australia released a survey seeking information from lawyers about their use of GenAI in South Australian courts.[40] A committee of judicial officers and other stakeholders was established to consider how the South Australian courts might respond to the use of GenAI. Submissions to this process were due by 30 June 2025. At the time of writing the Court had not released a response to the submissions received.
Interstate professional AI guidelines
7.30As discussed at paragraph [7.9], the VLSB+C, the Legal Practice Board of WA and the Law Society of NSW released the joint Statement on the Use of Artificial Intelligence in Australian Legal Practice.[41]
7.31In our consultation paper we identified that several legal professional bodies across Australia have released guidance to lawyers on the use of AI. This includes:
•The Law Society of NSW released A Solicitor’s Guide to Responsible Use of Artificial Intelligence in November 2023 (updated in October 2024).[42]
•The NSW Bar Association issued guidelines to barristers about how their professional obligations align with the use of AI in July 2023.[43]
•The Queensland Law Society issued a guiding statement on the use of AI in legal practice, in 2023 (updated in October 2024).[44] The statement sets out principles rather than specific obligations arising from the use of AI. It also released an AI Companion Guide in September 2024 which aims to guide lawyers to understand AI and construct appropriate governance and risk frameworks to mitigate risks.[45]
7.32These guidelines share a common focus on raising awareness that lawyers must comply with their existing professional obligations when using AI.
7.33The Law Society of South Australia has not issued formal guidance to lawyers but in May 2025, it made a submission to the Federal Court’s proposal for guidelines or a practice note on the use of AI.[46] The submission was aligned with the approach taken in NSW and submitted that there should be mandatory disclosure for the use of GenAI and restrictions on the kinds of materials permitted to be AI generated such as affidavits, witness statements and expert reports.
7.34The Law Society of Western Australia has not issued formal guidance to lawyers but made submissions to the Supreme Court of Western Australia on the request for input on an AI practice direction or consultation note.[47] It also made a submission to the Federal Court’s proposal for guidelines or a practice note on the use of AI.[48] In these submissions, the Law Society of Western Australia supported principle-based guidance for all court users and argued against a prescriptive approach as adopted in NSW. It did not support a prohibition on the use of GenAI or mandatory disclosure of AI use.[49]
7.35The Law Council of Australia, which consists of 16 Australian law societies and bar associations, also made a submission to the Federal Court in response to its consultation on AI use in the Court.[50] The submission gave in-principle support to the development of a practice note on the use of GenAI but noted it did not have a settled position on the contents of a practice note because it had received mixed views from its constituent bodies.[51]
International guidelines for court users
7.36Courts and professional bodies around the world have released guidelines on how courts users may use AI in courts and tribunals.
7.37Examples include AI guidelines issued by:
•Canadian federal[52] and provincial courts,[53] and professional bodies[54]
•Caribbean courts[55]
•English and Welsh professional bodies[56]
•European Union professional bodies[57]
•Malaysian professional bodies[58]
•New Zealand courts[59] and professional bodies[60]
•Scottish professional bodies[61]
•Singaporean courts[62]
•United States (US), state and district courts and professional bodies.[63]
7.38Approaches vary across jurisdictions but there are common features which may inform development of guidelines in Victoria. For example, most guidelines connect existing legal professional obligations to the use of AI.
7.39Some common features of international AI guidelines for court and tribunal users are discussed below (for further details on international guidelines see Appendix C).
International disclosure requirements for court and tribunal users
7.40International guidelines take different approaches to whether court users should disclose their use of AI. Some courts do not require court users to pre-emptively disclose if they have used AI but remind people that they could be asked about their use of AI. Other courts mandate disclosure upon filing of documents and some require court users to disclose how they have used AI.
7.41Table 3 contrasts some different international approaches to disclosure.
Table 3: International approaches to disclosure of AI use by court users

| Approach | Jurisdiction |
|---|---|
| Pre-emptive disclosure is required | Canada: Courts with disclosure requirements include the Federal Court as well as the courts of Manitoba and Yukon.[64] The Federal Court of Canada requires court users to make a declaration about their use of AI to prepare materials filed with the Court.[65] United States: Some district courts in California,[66] Texas[67] and Pennsylvania[68] have issued standing orders requiring court users to sign and submit a form pre-emptively disclosing if they used any form of AI for legal research or drafting in connection with a case and certifying the accuracy of filed documents. |
| No pre-emptive disclosure is required | Canada: Courts that do not mandate pre-emptive disclosure include those of Alberta, Newfoundland and Labrador, Nova Scotia and Quebec.[69] Pre-emptive disclosure is not required but court users must exercise caution when referring to authorities or analysis derived from AI. Caribbean: Disclosure is not always required but the court ‘may require a user to disclose whether a GenAI tool was employed… [and] court users should be prepared to identify specific portions of their submissions influenced by GenAI and explain the steps taken to ensure accuracy’.[70] England and Wales: Provided AI is used responsibly, there is no reason why a legal representative ought to refer to its use, but this is dependent upon context.[71] New Zealand: Disclosure by default is not required, but court users may have to disclose if asked by the court or tribunal.[72] Singapore: ‘pre-emptive declaration of the use of Generative AI is not required’[73] but court users ‘must be prepared to identify the specific portions of the Court documents which used AI-generated content and explain to the Court how you have verified the output’.[74] United States: The Illinois Supreme Court’s policy states ‘Disclosure of AI use should not be required in a pleading.’[75] |
International accuracy requirements for court and tribunal users
7.42It is common for international AI guidelines to highlight risks around hallucinations and inaccuracies of AI. In response to this risk, some jurisdictions encourage or require courts users to verify the accuracy of AI materials submitted to courts or tribunals.
7.43International guidelines often reinforce the importance of accountability and emphasise that court users remain responsible for materials submitted to courts.
7.44Table 4 provides examples of the different international approaches to accuracy and accountability.
Table 4: International approaches to accuracy and accountability of AI use by court users

| Jurisdiction | Approach to accuracy and accountability |
|---|---|
| Canada (Alberta) | ‘any AI-generated submissions must be verified with meaningful human control. Verification can be achieved through cross-referencing with reliable legal databases, ensuring that the citations and their content hold up to scrutiny’.[76] |
| Caribbean | ‘Court users who utilise GenAI tools assume full responsibility for the accuracy, relevance, and appropriateness of the outputs’.[77] |
| England and Wales | ‘Review generative AI outputs for accuracy and factual correctness, including mitigation of biases and factchecking’.[78] |
| Malaysia | ‘All output generated by the systems must be independently verified against traditional legal databases and confirmed for relevance and accuracy’.[79] |
| New Zealand | ‘You are responsible for ensuring that all information you provide to the court/tribunal is accurate. You must check the accuracy of any information you get from a GenAI chatbot.’[80] |
| Scotland | ‘Solicitors should still exercise oversight and verify the accuracy and suitability of the information provided by generative AI systems.’[81] |
| Singapore | ‘you should ensure that any AI-generated output used in your court document … is accurate [and] is relevant.’[82] |
| US (Illinois) | ‘All users must thoroughly review AI-generated content before submitting it in any court proceeding to ensure accuracy and compliance with legal and ethical obligations’.[83] |
International privacy and confidentiality requirements for court and tribunal users
7.45It is common for international AI guidelines to highlight risks and obligations for court and tribunal users around data privacy and confidentiality. Some jurisdictions restrict what data court users can input into public AI tools as shown in Table 5.
Table 5: International approaches to privacy and confidentiality for court users

| Jurisdiction | Approach to data privacy and confidentiality |
|---|---|
| Canada | ‘Lawyers must reasonably ensure the security of an AI system before use, to the extent feasible.’[84] |
| Caribbean | ‘Court users must not input sensitive, confidential, or privileged information into open source GenAI.’[85] |
| England and Wales | ‘If you are using a free, online generative AI service where you have no operational relationship with the vendor other than use, do not put any confidential data into the tool. If you are procuring or working with a vendor to develop a personalised generative AI product for internal use contained solely within your firm’s legal environment, you may wish to consider if and how you want to put confidential data into the tool, subject to the terms of use.’[86] |
| Malaysia | Lawyers must review ‘the terms and conditions associated with the GenAI tools to be used and the provider’s privacy policy.’[87] |
| New Zealand | ‘you should not enter any information into an AI chatbot that is not already in the public domain.’[88] |
| Scotland | ‘confidential or client sensitive information should not be shared with public generative AI systems.’[89] |
| Singapore | ‘ensure that there is no unauthorised disclosure of confidential or sensitive information when you use Generative AI tools.’[90] |
| United States | ‘Before lawyers input information relating to the representation of a client into a GAI tool, they must evaluate the risks that the information will be disclosed to or accessed by others outside the firm.’[91] |
Additional guidance for Victoria’s court users
7.46There was strong support for court-issued guidelines on the use of AI by court users. The Victorian Bar Association stated, ‘informed Court guidelines are critical’.[92]
7.47The existing AI guidelines by the Victorian Supreme Court and County Court were generally viewed positively by stakeholders. The VLSB+C stated that the guidelines ‘are measured and helpful.’[93]
7.48There have been mixed approaches to the form that AI guidance to court users should take. In Victoria, the Supreme Court has issued guidelines, whereas the Supreme Court of NSW has issued a practice note.[94]
7.49Practice notes and practice directions provide guidance on how the court intends to apply the law and manage cases.[95] Practice notes can be considered by courts when exercising case management and cost powers. The Supreme Court characterises them as follows:
While they do not have the force of law, lawyers with the conduct of proceedings are expected to be familiar with their content and follow their requirements where applicable. The Court may take a failure to comply with a Practice Note into account in the exercise of its case management and costs powers.[96]
7.50Practice notes generally apply to all litigants (whether represented or not) but some practice notes contain separate requirements specifically for lawyers.
7.51Guidelines are also not enforceable and are intended to assist litigants by providing direction on the interpretation of rules and processes for improved consistency.
7.52There is an expectation that litigants are aware of and comply with the Supreme Court’s AI guidelines. Recently, in Director of Public Prosecutions v GR, Justice Elliott stated that ‘it is essential that all litigants and practitioners adhere to these [the Supreme Court’s] guidelines’.[97]
7.53However, as discussed above (paragraph [7.8]) only 29.1 per cent of Victorian lawyers have read the Supreme Court guidelines.[98] Moving AI guidelines into the form of a practice note may increase awareness.
7.54The Federal Court has consulted on what form AI guidance should take.[99] During this consultation process, the Law Council of Australia identified several reasons why a practice note may be preferable to guidelines.[100] However, the Law Society of Western Australia suggested that principle-based guidelines, as already adopted by the Supreme Court of Victoria, were preferred by most lawyers in Western Australia.[101]
7.55It is up to the courts to decide which format is most appropriate for AI guidelines for court users.
7.56However, there is an opportunity to build on the content of the Supreme Court guidelines by:
1)ensuring consistency in AI guidelines across Victoria’s courts and VCAT
2)clarifying the scope of guidelines by:
a)incorporating the OECD definition of AI and defining subcategories of AI
b)clarifying the guidelines apply to civil and criminal proceedings
c)clarifying the guidelines apply to lawyers, litigants (whether represented or not), witnesses and experts
3)developing additional guidance for court users by providing direction on:
a)the Commission’s proposed principles
b)submissions
c)affidavits and statements
d)expert reports.
AI guidelines across Victoria’s courts and VCAT should be consistent
7.57Many people told us that there should be consistency in AI guidelines.[102] This included consistency across Victoria’s courts and VCAT, as well as nationally.[103]
7.58As discussed above (paragraph [7.5]) the Supreme Court and County Court guidelines are consistent. However, other court jurisdictions in Victoria are yet to adopt guidelines and VCAT is developing its own guidelines.[104]
7.59There is inconsistency in the approaches adopted by courts across Australia. Like Victoria’s Supreme and County courts, Queensland courts do not require disclosure or prohibit the use of AI, noting their guideline is only focused on non-lawyers.[105] In contrast, the NSW Supreme Court has taken a more restrictive approach, which requires disclosure and prohibits use in some circumstances as discussed above from paragraph [7.21].[106]
7.60The Supreme Court told us:
Regulation of the use of AI is a matter that largely falls on individual jurisdictions. However, as AI technologies develop, it will be important for Victorian jurisdictions to work towards a consistent or uniform approach, to the extent feasible.
It will also be highly desirable to work towards a nationally consistent and coordinated approach to the use of AI. A nationally consistent and coordinated approach can be expected to be cost efficient, to enable a productive sharing of scarce resources across jurisdictions leading to the effective evaluation and adoption of the most appropriate AI tools for use in all courts and tribunals and uniform guidelines for the use of AI by legal practitioners and litigants. A coordinated approach will help to ensure appropriate identification and effective management of information security risks.[107]
7.61However, some jurisdictions flagged a need for guidelines to be adapted to suit the context of individual jurisdictions. Representatives of the Magistrates’ Court stated:
The Magistrates’ Court is a volume-intense jurisdiction which gives rise to additional considerations distinct from the Supreme and County Courts. Any operational impositions or protraction in proceedings is magnified in this jurisdiction.[108]
7.62While there are differences across courts, there is also significant interaction between courts. For instance, many criminal cases start and are managed in the Magistrates’ Court but are then further managed and heard in the County Court. Representatives of the Office of Public Prosecutions stated:
Consistency would be ideal. While the courts deal with different issues sitting under them, one set of guidelines would be welcome. Cases will move through County to Supreme courts, so we shouldn’t have to do different things for each court, especially if they impact expert reports which flow through the courts on appeal.[109]
7.63Regular interaction between different jurisdictions highlights the importance of consistency, so that court users are not navigating an overly complex and conflicting system.
7.64Legal professional bodies strongly supported consistency in the approach to regulating AI. The Law Institute of Victoria:
cautions against individual courts or tribunals establishing their own guidelines, which would likely lead to varying standards within and across jurisdictions. Instead, a uniform approach is preferable to ensure consistency and fairness.[110]
7.65The Centre for the Future of the Legal Profession and UNSW Law and Justice stated that inconsistent approaches may lead to an unfair outcome for court users:
This has real potential to increase the burden on lawyers, parties and witnesses who may find that some AI tools or products are acceptable in one court but not another. Ideally, courts and tribunals would work together to achieve unified national guidance where possible.[111]
7.66Having consistent AI guidelines across Victoria’s courts and VCAT will support:
•clarity for court users about their obligations
•awareness about the accepted uses and limitations, leading to safe use
•fairness in access to and use of tools
•sharing of information and learnings across the jurisdictions.
7.67The Supreme Court’s guidelines adopt a principle-based approach, which contrasts with the more prescriptive approach taken in NSW. As discussed in Chapter 6, stakeholders were supportive of a principle-based approach to AI regulation, as it provides flexibility to respond to rapidly developing technology. However, there are opportunities to build on the existing Supreme Court’s principle-based guidelines. In this chapter we discuss several ways in which the Supreme Court’s guidelines can be expanded.
7.68It is recommended that, once amended in line with the recommendations in this chapter, the Supreme Court’s guidelines should be adopted by the rest of Victoria’s courts and VCAT. However, we acknowledge that some modifications may need to be made to accommodate the types of proceedings determined by individual court jurisdictions.[112] Any such modifications should be coordinated through Courts Council so that there is consistency where possible.
Working towards consistency
7.69Uses of AI and responses by courts and tribunals are evolving. Courts will need to continually review and respond to AI over time. Guidance from courts is likely to shift as the technology, risks and uses evolve. It is unsurprising that there will be different responses to the risks and opportunities of AI, particularly in the early stages of providing guidance. This is evident from the diverse responses of other jurisdictions. However, we heard that consistency is desirable across Australian jurisdictions where possible in the longer term.
7.70The Supreme Court suggested that the Council of Chief Justices could help to facilitate consistency. Representatives of the Supreme Court told us that the desirability of national consistency will guide the court in its response to AI, noting that:
The profession is a national profession and if possible, practitioners should not have to take different approaches to their use of AI in courts dependent on the jurisdiction in which they appear.[113]
7.71The Council of Chief Justices of Australia and New Zealand will convene an Australian Legal Convention in November 2025.[114] AI is one of seven issues on the agenda for the convention.[115] Chief Justice Gageler has stated that AI is a ‘huge challenge’[116] for the courts to consider. The convention is an early opportunity to bring together representatives of organisations within the Australian legal system to discuss issues relating to AI, among other current and emerging issues.
7.72It will be important for Victoria’s courts to remain aware of the AI practices of courts in other states and territories and consider where there may be future opportunities for alignment nationally.
Recommendations

5. The Supreme Court’s Guidelines for Litigants: Responsible Use of Artificial Intelligence, as varied by recommendations 7 to 11, should be adopted by Victoria’s courts and VCAT.

6. The Chief Justice of Victoria should consider working with his counterparts through the Council of Chief Justices to seek to achieve consistency nationally on the response to AI throughout Australia’s courts and tribunals.
The scope of court user guidelines should be clarified
7.73There is an opportunity to clarify the scope of the existing Supreme Court guidelines by specifying:
•the scope of AI uses included in the guidelines
•what proceedings they apply to
•who is expected to comply with the guidelines.
Court user guidelines should use the OECD definition of AI
7.74As stated in Chapter 1, there is no universally agreed definition of AI. Many stakeholders highlighted that there are significant difficulties in defining AI because of how quickly the technology is evolving. This has resulted in definitions of AI changing over time.[117]
7.75However, we heard from some stakeholders that Victoria’s courts and VCAT should seek to adopt a consistent definition of AI.[118] The Law Institute of Victoria stated:
Adopting a clear definition of AI would greatly assist courts and tribunals to regulate permissible and impermissible uses of AI, to ensure the integrity of evidence and of the judicial process.[119]
7.76Guidance issued by courts and professional bodies adopts a variety of AI definitions. The Supreme Court’s guidelines describe AI as ‘a term describing a range of technologies and techniques used to computationally generate outputs that typically require human intelligence to produce.’[120]
7.77The OECD defines AI as:
a machine-based system that, for explicit or implicit objectives, infers from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments. Different AI systems vary in their levels of autonomy and adaptiveness after deployment.[121]
7.78The main benefits of adopting the OECD definition are explained in Chapter 1. In summary, this definition is recognised internationally,[122] and has been adopted by the Australian and Victorian governments.[123] Incorporating this definition into all guidelines for Victoria’s courts and VCAT will help ensure a consistent understanding of what AI is and what it is not.
7.79In addition, we heard from stakeholders that it would be helpful to define subcategories of AI. Representatives of Law Firms Australia told us that there:
Is a distinction between generative AI (GenAI) and AI tools more generally, as well as between open and closed AI systems … [We are] interested in a regulatory response that can treat different forms of AI differently.[124]
7.80There are various approaches to how AI guidelines define GenAI. The NSW Supreme Court Practice Note lists certain types of GenAI tools which are not included in the definition of GenAI and are not regulated under the Practice Note.[125] For example, transcription and translation tools are not subject to the restrictions and prohibitions contained in the Practice Note.[126]
7.81Some stakeholders stated AI guidelines may not need to apply to common, low-risk AI uses. Representatives of Law Firms Australia said that there are ‘innocuous uses of GenAI in commonly used tools, such as in relation to spelling, grammar and searching. These uses are sensibly excluded from the NSW Practice Note’.[127] Similarly, the Centre for the Future of the Legal Profession and UNSW Law and Justice stated AI can be ’built into technological tools that are in everyday use and where the AI component does not necessarily present risks that courts or tribunals should be concerned with (such as grammar checking)’.[128]
7.82However, in Chapter 3 we discuss the difficulties in creating a defined list of permissible or prohibited AI tools: the technology is developing so quickly that new uses and tools are emerging with increasing frequency. This is demonstrated by the recent amendment to the NSW Supreme Court Practice Note to update the list of excluded GenAI tools.
7.83Rather than inserting a static list of AI tools, the Commission recommends guidelines include the following subcategories of AI which we have defined in Chapter 3:
•GenAI: Software systems that generate content such as text, images, music, audio and videos, based on a user’s prompts.
•Public AI: AI tools that are openly accessible to the public, typically via the internet. Public AI tools are trained on broad, often public datasets, most commonly for general purpose use.
•Closed AI: The phrase ‘closed AI’ is defined in contrast to public AI. Closed AI tools are generally not openly accessible to the public and information used in closed AI tools remains within a controlled environment. When an AI tool is ‘closed’ there are controls to reduce risks related to privacy, or confidentiality settings that protect information from being made publicly available or used to train the AI tool.
7.84In Chapter 3, we discuss how these different types of AI carry different types and levels of risk. The Commission recommends that guidelines to court users explain these different risks.
7.85Using categories of AI directed to underlying risks such as privacy, rather than a list of AI tools, will allow guidelines to better adapt to evolving technology. We heard from one stakeholder that it could be useful to direct AI regulatory responses to the ‘particular issues enlivened by the use of particular AI products, instead of attempting to develop a definition which covers current and potentially future AI technologies’.[129]
7.86Guidelines to court users should highlight that using public AI tools can create greater privacy risks than using closed AI tools. For this reason, guidelines should prohibit court users from entering into public AI tools any information that is subject to client privilege, confidential or sensitive (including information subject to a non-publication or suppression order).
7.87Additionally, guidelines should direct court users to exercise caution and review the contractual terms of closed AI tools to ensure the information they input will be kept secure and not made public or used to train the AI program. Examples of privacy and data security considerations are also illustrated in Table 6.
7.88Guidelines to court users should cover both AI and GenAI. One reason is that we heard guidelines should play a role in setting expectations for experts in their use of AI to form opinions, not just their use of GenAI. This is discussed from paragraph [7.159].
Guidelines should apply to civil and criminal matters
7.89The Supreme Court guidelines do not state whether they apply to civil and criminal matters.[130] In contrast, Queensland’s guidelines state that they apply to both civil and criminal proceedings.[131]
7.90We asked stakeholders if there was a need to develop separate guidelines for civil and criminal matters. There was not strong support for separate guidelines at this stage.
7.91The County Court told us:
While it is foreseeable that different risks may emerge in criminal and civil matters, at this stage, the data security and confidentiality risks associated with AI is a significant concern. Both criminal and civil matters have their own sensitivities and there is currently no benefit in distinguishing them based on risk. In addition to the data security and confidentiality risks, there are a number of significant risks with the AI technologies themselves, and until those risks are mitigated, the Court is not able to determine what risks it would be prepared to accept in which matters.[132]
7.92To avoid any doubt, the existing guidelines should be updated to clearly state that they apply to both civil and criminal proceedings.
Who should court user guidelines apply to?
7.93Court-issued guidelines should apply to all court users. This should include lawyers, litigants (whether represented or not), witnesses and experts. The Supreme Court’s AI guidelines are directed to litigants and have been ‘designed to assist both legal practitioners and self-represented litigants’.[133]
7.94There was discussion amongst stakeholders about whether separate guidelines are necessary for lawyers and self-represented litigants. This arose on the basis that lawyers have existing professional obligations which can be applied to guide their use of AI, whereas self-represented litigants do not have the same obligations.[134]
7.95It was noted that self-represented litigants may not possess the necessary legal knowledge and skills to verify the accuracy and relevance of AI outputs.[135] The Law Institute of Victoria stated guidelines should distinguish between different users, including self-represented litigants.[136] The VLSB+C also supported the development of separate guidance for self-represented litigants.[137]
7.96Courts have commented that, while self-represented litigants do not have the same professional obligations as lawyers, they should still check the accuracy of outputs from AI tools. Judge Porter, when considering a document filed by a self-represented litigant which contained hallucinated cases, stated:
It is obvious that legal practitioners have a professional duty not blindly to rely on the output of any research tool. While litigants in person do not have the same professional duties, I do not think that gives them a free pass to uncritically adopt the output of AI models. As awareness builds in the community of the potential for hallucinations in output from large language models, a time will come where uncritically placing such output before a Court in circumstances where a litigant is reckless as to its accuracy could amount to contempt of court, in the form of interference in the administration of justice by the litigant in person.[138]
7.97The Federal Court of Canada provides a clear explanation for why court-issued guidance on AI should apply equally to lawyers and self-represented litigants:
The Court recognizes that counsel have duties as Officers of the Court. However, these duties do not extend to individuals representing themselves. It would be unfair to place elevated AI-related responsibilities only on these self-represented individuals, and allow counsel to rely on their duties. Therefore, the Court provides this Notice to ensure fair treatment of all represented and self-represented parties and interveners.[139]
7.98This idea of fairness was echoed by Victoria’s courts. We heard from most courts that self-represented litigants and lawyers should be treated the same in relation to use of AI. To do otherwise was seen to create an unfair barrier on self-represented litigants that would create inconsistent expectations on court users. A representative of the County Court explained:
It is hard to see why, or how, we would be able to say to parties that they have to comply with different standards. Whatever guidelines we have for submissions, it should be for all parties. Uniformity in how we deal with parties is important. Treating different parties differently might be seen as being unjustified.[140]
7.99The Supreme Court guidelines should be the same for self-represented litigants and lawyers. However, they should draw lawyers’ attention to their existing legal duties and raise awareness of how those duties apply to the use of AI.
Guidelines should include principles and educative information
7.100The Commission’s principles set out in Chapter 6 could be given effect by being embedded into the Supreme Court guidelines.
7.101While there has been broad support for the principles, we heard that ‘principles alone are helpful but insufficient’.[141] Stakeholders told us that if the principles are to be effective, they need to be enlivened with practical examples.[142]
7.102The Supreme Court’s guidelines contain a set of principles for the use of AI by litigants. This should be updated to list and describe the Commission’s principles. Guidelines should also explain practical ways to align the use of AI with the principles. This would serve an educative purpose, increasing court users’ understanding of the risks and limitations of AI as well as their existing obligations.
7.103Table 6 provides extracts from a range of international guidelines as examples of practical guidance relevant to the principles set out in Chapter 6. We provide an example of how this could be applied in Victoria from paragraph [7.183].
Table 6: Examples of principle-based guidance for court users
| Principle | Guidance for court users |
|---|---|
| Impartiality and fairness | ‘Have regard to ethical issues—particularly biases and the need to address them. GenAI chatbots generate responses based on the dataset they are trained on (which is generally information from the internet). Information generated by a Gen AI chatbot will reflect any biases or misinformation in its training data’.[143] ‘Unsubstantiated or deliberately misleading AI generated content that perpetuates bias, prejudices litigants, or obscures truth-finding and decision-making will not be tolerated’.[144] |
| Accountability and independence | GenAI can ‘generate answers that appear to be persuasive and authoritative but could be extremely inaccurate or even fabricated. Generative AI chatbots can invent cases and statutes. They can also include facts which you never provided to them or make arguments that you never asked them to make. This is also known as “hallucinating”’.[145] ‘any output generated should only be used on the basis that the Court User assumes full responsibility for the output’.[146] ‘Where the Court User is a lawyer, the lawyer’s duty to comply with the rules of professional conduct remains. Lawyers continue to have a professional obligation to ensure that materials they put before the Courts are independently verified, accurate, true, and appropriate’.[147] ‘Irrespective of the systems used, solicitors [lawyers] should still exercise oversight and verify the accuracy and suitability of the information provided by generative AI systems’.[148] ‘Where the Court User is a Self-Represented Person, he or she is also responsible for ensuring that all information provided to the Court is independently verified, accurate, true, and appropriate’.[149] |
| Transparency and open justice | Disclosure of AI use for submissions, affidavits, statements and expert reports is discussed in this chapter from paragraph [7.104]. |
| Contestability and procedural fairness | Court users ‘must be prepared to identify the specific portions of the court documents which used AI-generated content and explain to the Court how you have verified the output’.[150] |
| Privacy and data security | ‘Court users must not input sensitive, confidential, or privileged information into open source GenAI’.[151] ‘If a solicitor is considering sharing confidential information with private generative AI systems at a minimum they should undertake appropriate checks to satisfy themselves that (a) appropriate terms are in place with the vendor so that the information inputted will not be accessible by the vendor or used for any other purposes (b) the security arrangements meet appropriate information security standards’.[152] |
| Access to justice | ‘The Court does not prohibit the use of Generative AI tools to prepare Court Documents, provided that this Guide is complied with’.[153] The use of AI ‘may be expected, should not be discouraged, and is authorized provided it complies with legal and ethical standards’.[154] |
| Efficiency and effectiveness | Users may find that using GenAI ‘saves significant time and resources’ in several tasks including legal research, document review, contract drafting, due diligence, drafting letters, emails and documents, as well as performing other administrative tasks, but there are risks associated with use, and verification of outputs is required.[155] |
| Human oversight and monitoring | ‘In the interest of maintaining the highest standards of accuracy and authenticity, any AI-generated submissions must be verified with meaningful human control. Verification can be achieved through cross-referencing with reliable legal databases, ensuring that the citations and their content hold up to scrutiny’.[156] |
Recommendation

7. The Supreme Court guidelines should be updated to:
a. define AI, GenAI, public AI and closed AI
b. apply to civil and criminal matters
c. apply to all court users including lawyers, litigants (whether represented or not) and witnesses (lay and expert)
d. contain the Commission’s principles.
Guidelines to clarify obligations for using AI in court documents
7.104The Supreme Court guidelines should be updated to include direction to court users on the use of AI for preparing documents for courts and tribunals. The guidelines should provide direction on:
•submissions
•affidavits, character references and witness statements
•expert reports.
Using GenAI to prepare submissions
7.105Some court users are using GenAI to develop written submissions that are being filed with courts and tribunals.
7.106The use of GenAI by court users to prepare submissions raises risks such as:
•Submissions may contain inaccuracies (for instance, incorrectly summarising cases or legislation, incorrectly attributing legal principles to cases, or citing cases that are irrelevant to the proceedings) and ‘hallucinations’ (such as non-existent case citations, references to non-existent legislation, or invented quotes or extracts from cases and legislation). These outputs may appear accurate, leading to overreliance on them without verification.[157]
•There may be ‘added cost and complexity to the proceedings, and where unverified, add to the burden of other parties and the Court in responding to it’.[158]
•The use of GenAI tools may lead to confidential and/or sensitive information being input into systems that do not protect privacy or confidentiality.[159]
7.107Referencing non-existent cases is not in the interests of the court or parties.[160] Courts have issued statements about the risks of misleading information which may result from using AI to prepare submissions. Recently, in Director of Public Prosecutions v GR, a defence lawyer used AI to assist in the preparation of joint submissions which were agreed to by the prosecution, before being filed with the court.[161] The submissions contained references to non-existent cases and fictitious quotes. Although the court provided the lawyer with an opportunity to revise the submissions, the revised documents were also found to contain non-existent legislation. In response to these events Justice Elliott stated:
The ability of the court to rely upon the accuracy of submissions made by counsel [lawyers] is fundamental to the due administration of justice … any use of artificial intelligence without careful and attentive oversight of counsel would seriously undermine the court’s processes and its ability to deliver justice in a timely and cost-effective manner … counsel must take full and ultimate responsibility for any submissions made to the court. To this end, it is not acceptable for artificial intelligence to be used unless the product of that use is independently and thoroughly verified.[162]
7.108However, GenAI can also bring significant opportunities for court users in the preparation of submissions. It can:
•create efficiencies by reducing the time taken to draft submissions
•enhance access to justice, particularly for self-represented litigants. Representatives of the Magistrates’ Court highlighted that the ‘use of GenAI could improve and assist the quality of submissions, particularly those that might struggle with English comprehension’.[163]
7.109A representative of the Federal Circuit and Family Court provided an example of how AI may support access to justice for self-represented litigants to draft submissions:
Recently, a self-represented litigant with limited reading and writing skills could not put his statement of claim together. He was given numerous opportunities to redraft his statement but was unable to. He then used ChatGPT and was able to produce something that was readable and structured that could meet the initial threshold. He disclosed his use of AI and in the circumstances, I did not find it objectional. If he had not used ChatGPT, his statement of claim may have been struck out. It allowed him to articulate the claim and provide something to get him over the threshold. It was still his narrative but the end product was in a form which assisted the court and the opposing party. Think how many who are denied access could get access through the assistance of AI tools.[164]
7.110Currently, the Supreme Court’s guidelines encourage:
•court users to check AI-generated text to ensure it is not out of date, incomplete, inaccurate or incorrect, inapplicable to the jurisdiction or biased
•parties to disclose their use of AI to each other and the court where appropriate
•self-represented litigants to include a statement about the AI tool used in any AI generated materials to be filed with the court.[165]
7.111Other courts and tribunals have taken different approaches to the use of GenAI to produce submissions. Some approaches are that:
•court users are not required to pre-emptively disclose if and how they have used AI to prepare submissions[166]
•court users are required to disclose if they have used AI in the creation of a submission; this may take the form of a written declaration[167]
•where GenAI is used to prepare written submissions, court users are required to verify, in the body of the submission, the accuracy and relevance of all citations, legal and academic authority, and case law and legislative references.[168]
7.112We heard mixed views on whether court users should disclose if they use GenAI to create submissions. Table 7 illustrates the breadth of views.
Table 7: Stakeholder views on disclosure requirements of AI use in submissions
| Position on disclosure of AI use in submissions | Stakeholder views |
|---|---|
| Pre-emptive disclosure should not be mandatory | Supreme Court: ‘Restriction on the use of AI in submissions is undesirable. It is plain forms of AI are already used extensively by the profession. If practitioners are able to more efficiently produce submissions utilising AI this should not be the subject of restriction …The Court does not presently see a need for disclosure, certification or restriction regarding the use of AI in submissions’.[169] |
| | County Court: ‘Requiring disclosure of how a person has arrived at their submission is not necessarily useful because submissions can be tested in the usual way by discussion in Court.’[170] |
| | Coroners Court: ‘We would encourage disclosure in submissions like [in] the Victorian Supreme Court [Guidelines]; we wouldn’t want to discourage AI use or mandate disclosure’.[171] |
| | Magistrates’ Court: ‘in every case … the judicial officers of the Magistrates’ Court serve as the finder of fact and application of relevant law and will assess the submissions on its proper merits, regardless of whether AI has been used’.[172] |
| Pre-emptive disclosure should be mandatory | VCAT: ‘Our preliminary view is that, for now, all tribunal users must disclose if they use GenAI for any of the three types of documents: submissions, affidavits or evidence’.[173] |
| | Victorian Bar Association: ‘mandate disclosure where generative AI has been used in the preparation of written submissions, along with certification that the content of such submissions has been directly verified by the author’.[174] |
| | InTouch Multicultural Centre Against Family Violence: ‘Mandatory disclosure is preferrable. For instance, the requirement for disclosure where English is not their first language. This might trigger someone to work with them and help them input, and make sure it is accurate’.[175] |
| | Office of the Victorian Information Commissioner: ‘the importance of self-represented litigants and legal professionals disclosing to courts and tribunals when AI has been, or is being, used…’ should be included in guidelines to court users.[176] |
7.113The VLSB+C census found that there is broad acceptance of the use of AI across the Victorian legal profession.[177]
7.114GenAI is already being used broadly across society and is being embedded into common products such as Google, Adobe and Word. A prohibition or mandatory disclosure requirement would be practically difficult for Victoria’s courts and VCAT to enforce because people ‘may be unaware that a tool they are using includes AI and therefore inadvertently fail to disclose’.[178] Several stakeholders echoed this view that mandatory disclosure requirements would be challenging because ‘people are often not aware they are using it [AI]’.[179]
7.115While we received mixed views on disclosure from our consultations, the VLSB+C census indicates that there may be support among Victorian lawyers to disclose their use of AI. The census found that 68 per cent of respondents agreed or strongly agreed lawyers should be required to disclose their use of AI in litigation.[180]
7.116In considering the value and limitations of disclosure, representatives of the Judicial College of Victoria stated:
Disclosure is a risk management exercise for the person making the disclosure and for person receiving it. One of the principal purposes of requiring disclosure for the person receiving it is … to flag there may be fictitious statements or issues with submitted evidence. But … a culture of over-disclosure might risk obscuring the meaningful part of what needs disclosure.[181]
7.117It is likely that, due to the ubiquity of GenAI, a mandatory pre-emptive disclosure requirement would lead to over-disclosure, with court users performing a ‘box ticking’ exercise. We heard that a blanket disclosure requirement could ‘increase the administrative burden on courts and tribunals without a corresponding benefit’.[182] We heard from several judicial officers that disclosure is not necessarily helpful in determining the matter before them; what matters more is whether the submission is accurate. Representatives of the Supreme Court noted:
Disclosure is unnecessary. The critical concern is not how the submissions were drafted but that they are helpful to the Court and accurate. Disclosure of the use of AI does not address that critical concern.[183]
7.118Some stakeholders considered that existing obligations may be sufficient to address the risk of inaccuracies in submissions prepared with AI. Representatives of Law Firms Australia stated that if AI is used to draft submissions, lawyers are still required to sign off on that document and remain responsible for its accuracy.[184] As discussed above (from paragraph [7.10]), lawyers who fail to verify the accuracy and relevance of documents prepared with AI may be in breach of their professional obligations.[185] Participants in civil litigation, including self-represented litigants, have a paramount duty to further the administration of justice.[186] Inaccurate or misleading submissions ‘can impede the administration of justice by leading to a waste of court time and resources’.[187]
7.119In responding to instances where inaccurate or false material has been filed or relied on in courts, judges have emphasised the critical importance of checking material before relying on it. Chief Justice Bell has stated there is an:
absolute necessity for practitioners who do make use of Generative AI in the preparation of submissions … to verify that all references to legal and academic authority, case law and legislation are only to such material that exists, and that the references are accurate, and relevant to the proceedings.[188]
7.120In considering all the issues, the Commission does not recommend requiring court users to pre-emptively disclose or certify their use of AI in court documents. As representatives of the Supreme Court cautioned, while ‘it is critical that submissions are checked for accuracy, the Court does not consider that a certification requirement in relation to AI use in submissions is necessary’.[189]
7.121Instead, educative guidance on the risk of inaccuracy may help to raise awareness among court users about the risks and limitations of GenAI. This would highlight the importance of verifying outputs where GenAI has been used.
7.122To address concerns relating to accuracy, it is recommended that court-issued guidelines contain a list of considerations for court users to check the accuracy and applicability of submissions prepared with GenAI. This would help to address concerns that the use of GenAI will lead to reliance on inaccurate or irrelevant information in documents for courts and tribunals.
7.123This approach aligns with Singapore’s response.[190] The Supreme Court of Singapore does not require disclosure but encourages users to check outputs and reminds users that the court retains the ability to ask about the use of AI.
Singapore Courts Guide on the use of Generative Artificial Intelligence Tools by Court Users

Court-issued guidelines in Singapore contain the following accuracy considerations. To ensure accuracy in the Court Documents you submit, you should do the following:

a) Fact-check and proof-read any AI-generated content that you use.
b) Edit and adapt AI-generated content to suit your situation.
c) Verify that any references to case law, legislation, textbooks or articles provided as AI-generated content actually exist and stand for the legal positions that are attributed to them. If the AI-generated content includes extracts or quotes, you must verify that these are extracted/quoted accurately and attributed to the correct source.
d) When checking the materials referred to in (c) above, you should use a source that is known to have accurate content…
e) Please note that it is not sufficient verification for you to ask a Generative AI tool for confirmation that the materials exist or contain the content that the AI-generated content says it does. To be clear, you cannot use one Generative AI tool to confirm the content generated from another Generative AI tool.[191]
7.124GenAI systems are known to contain inaccuracies. Victorian lawyers have reported taking steps to ensure the accuracy of AI-generated content. The VLSB+C census found that it is common practice (76.1 per cent of respondents) to cross-check AI outputs against other sources to verify legal accuracy.[192] This census also found that nearly half of respondents specifically cross-check information to guard against hallucinations.[193]
7.125Incorporating accuracy considerations into guidelines will help court users to thoroughly check their work and to improve the quality of materials filed with the court.
7.126For these reasons, accuracy considerations contained in the Singapore guidelines should be replicated in the Victorian Supreme Court guidelines. This would help to enliven the principles of accountability and independence, and human oversight and monitoring.
7.127Several stakeholders drew attention to the fact that judicial officers will retain the ability to ask about the use of GenAI in submissions. We heard that courts are well placed to ask about submission content and that this was a core function of courts.[194] This was also supported by representatives of the Magistrates’ Court.[195] Additionally, the Chief Justice of Western Australia recently argued that: ‘Distinguishing between the artificial and the real, however, has always been the role of the legal system, and of the institution of the judiciary’.[196]
7.128Guidelines should remind court users that courts and tribunals have broad powers to make directions for the conduct of proceedings. This includes directions relating to documents and evidence.[197] Therefore courts may give directions for parties to provide further information about documents they have produced with the assistance of GenAI.[198]
7.129Guidelines should make court users aware of the risks of providing inaccurate material. If court users rely on unverified GenAI outputs in submissions, courts and tribunals may:
•refer lawyers to the regulator on the basis they have breached their professional obligations[199]
•order lawyers to personally pay costs to the other side incurred by using unverified AI content[200]
•issue warnings to self-represented litigants about the unsatisfactory consequences of relying on unverified GenAI materials to prepare submissions, or not give weight to or disregard parts of submissions that contain unverified GenAI content[201]
•refer litigants to AI guidelines or practice notes.[202]
7.130It is recommended that court-issued guidance remind court users that courts and VCAT retain the ability to ask about their use of GenAI in the preparation of materials filed with the courts. Guidance should remind court users that it is important they are able to explain how they have used GenAI to prepare materials for the court if asked.
Recommendation 8. The Supreme Court's guidelines should state that court users can use GenAI in the preparation of submissions but, to ensure accuracy, court users should:
a. fact-check and proofread GenAI content
b. edit and adapt GenAI content to suit the situation
c. verify references
d. use sources known to be accurate
e. not use GenAI to verify subsections a to d above.
Using GenAI to prepare affidavits, witness and other statements
7.131GenAI is being used to develop affidavits, character references, witness statements and other written statements.
7.132There are several different forms of written statements which can be used to reflect the evidence and/or opinion of the person in legal proceedings. This includes but is not limited to:
•Affidavits: formal written statements which set out facts known to the person. The statement must be signed under oath or affirmation, by which the person attests that the information is true.[203]
•Witness statements: evidence by witnesses is generally led orally but can in some instances be given in writing, for example when ordered in the Supreme Court's Commercial Court.[204] Where evidence is given orally at the hearing, the witness will be required to make an oath or affirmation before giving their evidence, confirming that they will provide 'the truth, the whole truth and nothing but the truth'.[205]
•Character references: a person who has pleaded guilty or been found guilty of an offence may wish to provide a character reference to the court. The person producing the character reference includes information about how they know the person and information about their character.[206]
•Victim impact statements: if someone has pleaded guilty or is found guilty of a crime, a victim of the crime can choose to tell the court about how the crime has affected them in a victim impact statement.[207] There are rules about what can be contained in the statement.[208] The person making the victim impact statement must make a statutory declaration or orally swear or affirm their evidence.[209]
7.133Some of the risks associated with using GenAI in the preparation of affidavits and witness statements include:
•The use of GenAI may lead to confidential and/or sensitive information being input into environments that do not maintain the privacy or confidentiality of that information.
•Statements may be inaccurate and distort the views of the statement maker. This risk is acute for self-represented litigants, whereas lawyers are bound by professional obligations to their client and the court and should not allow clients to sign documents that they have not fully accepted as their own position and language.
• ‘Witnesses may defer to the technology, just as they sometimes defer to their lawyer. Once words or language have been suggested they may infect recollection.’[210] This may alter or inadvertently shape the evidence.
•Statements could make people appear more competent than they are. This may prevent courts and lawyers from identifying that a person does not understand information or requires accommodations. This is a serious risk for people who are marginalised or who have a disability.[211]
•‘Normalising the use of AI tools to create witness statements might signal you need perfect witness statements, and then AI is determining what’s acceptable and what’s not. The significance of witness statements is not just what is said, but how it’s being said’.[212]
7.134We spoke to representatives of the Victorian Advocacy League for Individuals with Disability (VALID), an organisation that works with people with intellectual disabilities. They stated that the use of AI to produce statements may remove an important human element of the justice system. They commented that if AI is used to prepare witness statements for their clients it may:
dilute the experience of justice and processing trauma. In human interaction, there is a sense of processing and having an experience of social justice if you can go through what happened with somebody and they take your statement. Then that statement is read out in court and you might be able to feel you received justice. If you take that [human] element out, you will have less of a sense of justice. If you make the recorder be AI, something is taken out, and that might remove something from the experience of social justice. Part of therapy is having a say, and writing can be powerful.[213]
7.135Warnings about the use of GenAI for character references emerged in Director of Public Prosecutions v Khan.[214] Justice Mossop held that little weight could be placed on a character reference from the offender's brother because the reference appeared to be generated by an AI program such as ChatGPT.[215]
7.136One emerging issue is the use of avatars to present statements or the views of parties. Recently, a US court allowed an AI-generated victim impact statement, with the words of a victim's family read by an avatar of the deceased. Concern was raised that this was potentially manipulative and did not represent the victim's own words.[216] In another US case, a self-represented litigant was permitted to provide the court with an audiovisual presentation of his submission because he claimed to be suffering an ailment that prevented him from articulating himself.[217] Instead he created an AI-generated avatar. On realising this, the court stopped the presentation and instructed the litigant to present his argument personally.
7.137These issues have not yet emerged in Australia and are unlikely to be permitted under current rules in Victoria.[218] This is because the relevant legislation requires the involvement of a ‘person’. It is a requirement in Victoria that a victim impact statement is made by a victim—which is described as a person or body that has suffered injury, loss or damage or another person on behalf of a victim under certain circumstances.[219] A victim may request their statement is read aloud by another person approved by the court.[220] Additionally, the Evidence Act 2008 (Vic) defines competence and compellability to give evidence in relation to a ‘person’.[221] The court may rule as inadmissible the whole or any part of a victim impact statement.[222]
7.138However, provisions exist to support witnesses who need an interpreter, or who are deaf and/or mute.[223] The court may give directions about 'any appropriate' means by which a deaf or mute witness gives evidence.[224] The Judicial College of Victoria has stated that this provision 'provides a wide scope for a court to make directions. Such direction might include the use of augmentative and alternative communication to enhance or replace speech'.[225] This opens the possibility that in future AI may be used to help people who are deaf or mute to give evidence, but it will still need to be the statement of that 'person'. Importantly, under the current provisions, this use can only happen after an application to the court to proceed in this way.
7.139There are also opportunities associated with the use of GenAI in the preparation of affidavits and statements by improving:
•efficiency by reducing the time taken to draft affidavits and statements
•access to justice, because AI can be used to assist people to draft a cohesive statement.[226] AI can support people to express themselves more clearly and assist the court to understand their story. This is somewhat analogous to the assistance lawyers provide when helping clients draft statements.
7.140In relation to affidavits and witness statements, the Supreme Court’s guidelines include the following warning:
Particular caution needs to be exercised if generative AI tools are used to assist in the preparation of affidavit materials, witness statements or other documents created to represent the evidence or opinion of a witness. The relevant witness should ensure that documents are sworn/affirmed or finalised in a manner that reflects that person’s own knowledge and words.[227]
7.141Other court jurisdictions have taken a variety of approaches to GenAI use in affidavits and statements.[228] Approaches include:
•cautioning court users about potential risks[229]
•prohibiting the use of GenAI in the generation of the content of affidavits, witness statements, or other similar materials tendered for cross-examination.[230]
7.142NSW has taken a restrictive approach by requiring a declaration that GenAI was not used in the preparation of these materials.[231] To give effect to this requirement, the Uniform Civil Procedure Rules 2005 (NSW) were amended.[232] NSW has also updated the forms for affidavits and witness statements, which now require the person on oath or affirmation to declare that they have not used GenAI to generate an affidavit[233] or witness statement.[234]
7.143We heard mixed views about the use of AI for the preparation of affidavits and witness statements. The breadth of views is captured in Table 8.
Table 8: Stakeholder views on GenAI use in affidavits and statements
| Position on use of AI in affidavits, statements etc. | Stakeholder views |
|---|---|
| No prohibition, disclosure or certification | Supreme Court: 'Certification will not add anything additional given the existing jurat requirements for affidavits. Further, while witness statements are not sworn, the person is required to adopt them when they give evidence … The preferred approach is to maintain the status quo and not require disclosure/certification for affidavits and statements.'[235] |
| Court users are required to disclose the use of GenAI | VCAT: 'Our preliminary view is that, for now, all tribunal users must disclose if they use GenAI for any of the three types of documents: submissions, affidavits or evidence.'[236] |
| Court users are required to certify the accuracy of the document and that it reflects their own voice | County Court: 'Certification could be seen as a better compromise. If the document is certified, [the Court] can see which bits are created with AI and ask how [a party] adopted this as [their] evidence and ask them to tell [the Court] more about that. It provides an invitation to the court to scrutinise that use. Cross-examination is where that use can also be scrutinised.'[237] |
| Prohibit or limit use of GenAI | Victorian Bar Association: 'safeguards are required to … prohibit the use of Generative AI in the preparation of affidavits, witness statements, witness outlines, answers to interrogatories and character references. Such documents must reflect only the evidence of the witness or author and should be expressed in the witness' or author's own words, not the words of a large language model or system. The Bar urges for particular caution to be shown in respect of criminal proceedings. For example, the use of AI in the preparation of evidential material such as witness statements … would pose a serious risk to the integrity and reliability of evidence in criminal proceedings and is opposed by the Bar. Further, such documents should contain a statement certifying that Generative AI has not been used'.[238] |
7.144Some stakeholders raised concerns about the approach taken to affidavits and statements in NSW.[239] A possible rationale for NSW's approach is to address the concern that the voice of the person will not be accurately reflected if AI is used in drafting the document. However, we heard that the risk of an affidavit not reflecting the person's voice already exists and is not a risk solely created by the use of GenAI. Representatives from Eastern Community Legal Centre explained:
The same risk is already present in someone else drafting an affidavit. Lawyers currently draft them and then send to the client to confirm. Whether it is a human or AI drafting the document is not the issue. Accuracy is the issue. There is a need for human oversight. Once someone swears an affidavit that is them confirming the accuracy of that document. I do not totally understand NSW’s approach, the risk is already present with human drafting.[240]
7.145This view was echoed by representatives of the Supreme Court:
The concern that an affidavit or witness statement may not be a genuine reflection of the author's voice is an existing issue that predates AI. A lawyer may draft a statement, and it is written in a way that does not reflect the person's voice. For example, where English is a second language for the person, but the statement is written with overly complicated words they do not understand.[241]
7.146Similar statements were made by representatives of the Federal Circuit and Family Court.[242]
7.147Judges have previously raised concerns about the authenticity of written materials and that lawyers may be deliberately or inadvertently altering evidence through the drafting of affidavits.[243]
7.148However, Professor Michael Legg has argued that there are important differences between lawyers drafting affidavits compared to GenAI. This is because lawyers are subject to professional obligations of candour and honesty.[244]
7.149To address concerns about the authenticity of written evidence, some jurisdictions have started to return to oral evidence. For example, in the Supreme Court of Western Australia:
a procedure whereby written statements constituted the evidence of witnesses in civil cases that had been in place for several years has been replaced with a return to oral evidence. Significantly, that change was a consequence of judicial disquiet about the authenticity of written evidence in which the reliability of written material could no longer be assumed to reflect the words and recollections of witnesses themselves.[245]
7.150The Chief Justice of Western Australia has commented that:
an ounce of oral dialogue is worth a pound of written advocacy. The greater intelligibility and immediacy of oral advocacy, again serves to emphasise the interpersonal nature of the judicial process.[246]
7.151Several stakeholders viewed existing court processes of swearing or affirming an affidavit as sufficient to address the risk that AI may alter the accuracy or voice of the person. If a person makes a false statement in breach of their oath or affirmation they can be charged with the offence of perjury.[247]
7.152Additionally, it was noted that cross-examination is where the use of AI in statements or affidavits could be examined.[248] It was stated that 'evidentiary flaws are usually explored in cross-examination.'[249] Representatives of the Supreme Court stated if the document 'does not reflect the author's view, this will likely be exposed through existing court processes of cross-examination'.[250]
7.153However, we were told there may be a gap that could be addressed because character references and some other written statements do not require an oath or affirmation about the contents of the document.[251]
7.154To address this gap, a verification requirement could be introduced for written statements and character references. This could take the form of the person declaring that they have verified the accuracy of the document and confirming that it represents their views. This approach was supported by Professor Ian Freckelton AO KC, who stated:
It is not a problem to get help generating a document from AI so long as the person in whose name it is written swears to its being their document and takes responsibility for every single word within it … The person whose name the document is in should always certify the document reflects their view if they have used AI. It would be useful to have a required section, such as a jurat, that might say: ‘All the content, memories, statements, reasoning, opinions etc, represent my genuine views and has been written by…’[252]
7.155The benefit of this approach is that it would reduce the risk of inaccuracy or the potential distortion of the person’s voice.
7.156We consistently heard that disclosure is less meaningful than checking accuracy. For this reason, the verification should not require court users to disclose whether they have used GenAI, but rather to confirm that they have checked the accuracy of the document and adopt what is written in it.
7.157Where a written statement (such as a character reference) does not require a jurat, statutory declaration, oath or affirmation, the document should include a verification from the author as to the accuracy of the statement and that it represents their view.
7.158This direction could be contained in AI guidelines to court users. There may also be future opportunities to explore legislative change to detail the requirements of what is to be contained in character references. For example, the Sentencing Act 1991 (Vic) details how a victim impact statement is to be prepared, and that it is to be a statutory declaration and/or given orally by evidence which is sworn or affirmed.[253] A similar requirement could be inserted into the Sentencing Act 1991 (Vic) to provide direction on how to prepare a character reference which could include completion of the above verification statement.
Recommendation 9. A person making a witness statement, character reference or similar statement must verify its accuracy and that it represents their view.
Using AI in expert reports
7.159GenAI is being used by some experts to draft or prepare the content of their expert reports. Some experts may also use other forms of AI, such as predictive algorithms, in preparing the content of their reports.
7.160There is a risk that the use of AI to produce expert reports may:
•distort the opinion of the expert
•not be fully explicable (the 'black box' problem), given it is necessary for the courts to understand how an expert came to their opinion
•lead to unfair or discriminatory outcomes (because of underlying bias in the data that an AI system is trained on or because of the use of unrepresentative data)
•lead to confidential and/or sensitive information being placed into systems that are not compliant with privacy requirements.
7.161But the use of AI in the preparation of expert reports also brings opportunities to:
•improve efficiency, by reducing the time taken to draft expert reports
•enhance processing, summarisation and analysis of large amounts of data to inform content
•identify patterns and connections in large amounts of data.
7.162Representatives from the Federal Circuit and Family Court noted the potential benefits of experts using AI stating:
The family law section of the court relies on expert evidence … Anecdotally, experts in family law are under pressure to do their work quickly and efficiently, similarly to the judiciary. The potential for AI to relieve some of that pressure should not be ignored – but it has to be appropriate use, and it needs to be transparent.[254]
7.163Currently, the Supreme Court’s guidelines warn:
Particular caution needs to be exercised if generative AI tools are used to assist in the preparation of affidavit materials, witness statements or other documents created to represent the evidence or opinion of a witness. The relevant witness should ensure that documents are sworn/affirmed or finalised in a manner that reflects that person’s own knowledge and words. Similar considerations arise in the use and identification of such tools in compiling data in the preparation of any expert reports or opinions, and compliance with the Expert Witness Code of Conduct.[255]
7.164Courts and tribunals have taken different approaches where AI is used to prepare expert reports. The Federal Court of Canada requires expert witnesses to disclose AI use in their reports.[256] In Victoria, this approach has been taken in the recent update to the Practice Note Expert Evidence in Criminal Trials.[257]
7.165In contrast, the approach in NSW prohibits the use of AI in expert reports without prior leave of the court. Where leave is granted, there are requirements about what information is necessary to be included in the report about any AI tool used.[258]
7.166Some stakeholders noted concerns with the approach taken in NSW for the use of AI in expert reports. Representatives of Law Firms Australia stated:
The NSW requirement to seek leave to use AI in certain circumstances … might add a time and cost burden that is not proportionate to the risk of use, provided use is disclosed. Seeking leave in respect of the use of AI for expert reports raises similar considerations, provided the Court is able to assess the appropriate weight to be given to a report.[259]
7.167In Victoria, the recently amended Practice Note Expert Evidence in Criminal Trials includes new validity requirements for scientific, medical or technical reports (as discussed in Chapter 5).[260] Relevantly, it now contains specific disclosure requirements relating to the use of AI. The new section 7.4 states:
Where an expert used artificial intelligence to assist in the generation or expression of the opinion contained in his or her report:
a)those matters are to be disclosed in the report; and
b) the report should identify any possible biases that may affect the content generated by the artificial intelligence tool that was used.[261]
7.168We heard mixed views about the use of AI in expert reports. The breadth of views is captured in Table 9.
Table 9: Stakeholder views on the use of AI in expert reports
| Position on use of AI in expert reports | Stakeholder views |
|---|---|
| Mandate certification where AI is used in expert reports via the Practice Note Expert Evidence in Criminal Trials and update the Expert Witness Code of Conduct | Supreme Court: 'The Court should not prohibit the use of AI by experts. Experts are already using AI across several fields. There is no appetite for experts to have to seek leave to request to use AI … However, there is value in certification of AI use for expert reports. Experts would be required to state in their evidence where and how AI has been used. This increases transparency. It is necessary to avoid a 'black box' problem where AI is used. A revised expert evidence practice note for civil trials will help to avoid this problem.'[262] |
| | County Court: 'It would be useful for them to certify it is accurate and reflects their views etc. If part of the knowledge expressed in the report is based on the expert exercising supervision of an AI tool, that does add something we may want to know.'[263] |
| | Coroners Court: 'The expert has to tell us they used AI however we define it, but if the expert is relying on it, they should tell us how they use it. If they rely on it to come to their opinion or interpret a CT scan, then I want them to tell me that … We would probably adopt the practice note [Expert Evidence in Criminal Trials]. My starting position is that we are content to rely on what has been developed by the [Forensic Evidence Working Group] updating the practice note, but otherwise the traditional processes of examining the contents of an expert report shouldn't need to vary simply as a result of the introduction of generative AI tools.'[264] |
| | Magistrates' Court: This might be 'better addressed through additions to the Expert Code of Conduct … then experts can show they have complied with it. Where experts have created parts of the report by using AI, or in forming their opinion, they could identify that use in explaining their opinion. For example, an acknowledgement that they have read all the information to make a determination and to satisfy themselves that they still maintain their opinion.'[265] |
| | VCAT: 'If an expert does use AI in the preparation of an expert report, they still must verify that it is correct and true. The reference to the use of AI must include a statement that it is accurate and reflects their knowledge.'[266] |
| Prohibit the use of AI in expert reports unless leave is granted | Victorian Bar Association: 'safeguards are required to prohibit the use of Generative AI in the preparation of expert reports without leave of the Court or Tribunal. Expert reports must reflect the enquiries, reasoning and opinions of the expert, expressed in the expert's own words. Where the use of Generative AI may be justified in the work of an expert, leave of the Court or Tribunal should be able to be sought. Expert witness codes of conduct should be amended to reflect a prohibition on the use of Generative AI by experts.'[267] |
7.169Many stakeholders rejected prohibiting the use of AI and GenAI by experts. Representatives of the Supreme Court stated that ‘the Court should not prohibit the use of AI by experts.’[268] Representatives from VCAT stated that ‘it is arguable that it goes too far to say they cannot use GenAI at all’.[269]
7.170As discussed in Chapter 2, some experts are already using AI in their practice and use will continue to increase. This includes use in writing and drafting reports but also in the analysis of information relevant to their opinion.
7.171We heard that disclosure is not always as helpful as knowing that the content of an expert report accurately reflects the expert’s view. Professor Ian Freckelton AO KC stated:
The key thing is that judges receive good products in terms of expert evidence, that they can rely on. It does not really matter if an expert report was written with a quill, a proofreader or with AI, so long as it represents the considered view of the expert. The key is whether the case findings are reliable … it is still a judge’s responsibility to weigh up everything they have. Knowing whether AI has been used or not doesn’t take judicial officers far, save that it may prompt scrutiny during a hearing as to whether it is genuinely the view of the author and whether its bases are sound and its reasoning is sound.[270]
7.172Representatives of the County Court stated that not every use of AI by an expert would need to be disclosed. They did not see value in disclosing the use of AI to draft an expert report where that use was limited to predictive text or grammar tools.[271] But they were interested in experts disclosing where they had relied on AI to form the basis of their opinion:
Perhaps disclosure comes in when AI is used to identify, describe, develop or analyse data or form the basis for the opinion that needs to be disclosed … [I am] … interested in when AI may be used for formulating or analysing information that formed part of the basis of the opinion.[272]
7.173The Supreme Court’s amended Practice Note Expert Evidence in Criminal Trials has created a disclosure requirement where AI is used to ‘assist in the generation or expression of the opinion.’[273] The Practice Note does not define AI or distinguish between AI and GenAI. This phrasing may capture use of AI to form an opinion and GenAI used more generally to draft an expert report.
7.174It is not clear if the Practice Note was intended to capture GenAI tools like spell check or formatting assistants which could be used in ‘expressing’ an opinion. There is also a requirement for experts to identify ‘any possible biases that may affect the content’ generated by AI.[274]
7.175The Practice Note prescribes that the expert report must describe if and how the method used in forming the opinion has been validated.[275] The benefits of the new validity requirements are discussed in Chapter 5. In summary the validity requirements seem to address the courts’ concerns for experts to describe the method they have relied on to form the opinion and the process by which that method has been validated.[276]
7.176The amended Practice Note will likely help parties and courts understand how AI has been used. It is likely to provide benefits in addressing transparency and bias. It will be necessary to monitor how it is applied over time. It may need refinement in future to avoid over-disclosure of GenAI grammar and spelling tools.
7.177The Supreme Court’s amended Practice Note has been approved by the County Court.[277] Representatives from the Coroners Court indicated they were considering adopting the Supreme Court’s amended Practice Note.[278]
7.178As discussed above, VCAT is developing its own guidelines for tribunal users, which are likely to include a requirement for experts to disclose if AI has been used in expert reports.[279]
7.179It may be valuable for Victoria’s other courts and VCAT to consider adopting the direction on AI for expert reports contained in the Supreme Court’s amended Practice Note.
7.180We also heard from representatives of the Supreme Court that it may be valuable to consider adopting a similar approach in the civil law context, for example by updating the Expert Witness Code of Conduct.[280] We note that, in relation to civil law matters, experts are already bound to comply with overarching obligations in the Civil Procedure Act 2010 (Vic).[281] The Commission recommends that the Expert Witness Code of Conduct applicable in civil trials should be updated to align with the Practice Note Expert Evidence in Criminal Trials in relation to the use of AI by experts.
7.181Requiring disclosure and introducing validity requirements relating to the use of AI for expert opinions will support transparency, accountability and contestability of expert evidence.
7.182The Supreme Court guidelines to court users on AI should refer to the obligations relating to the disclosure of AI in expert reports in the Practice Note Expert Evidence in Criminal Trials.[282]
Recommendations
10. The Expert Witness Code of Conduct applicable in civil trials should be updated to align with the Practice Note Expert Evidence in Criminal Trials in relation to the use of AI by experts.
11. The Supreme Court guidelines should refer to obligations for the use of AI by expert witnesses contained in the:
a. Practice Note Expert Evidence in Criminal Trials and
b. Expert Witness Code of Conduct (once updated as per recommendation 10).
Example of updated Supreme Court guidelines
7.183The section below provides an example of how the different elements of guidelines discussed in this chapter could fit together in an updated version of the Supreme Court guidelines.
Example guidelines for court/tribunal users on the safe use of artificial intelligence
Introduction
1These guidelines for the use of AI in court/tribunal proceedings have been developed to assist court/tribunal users, including lawyers, litigants (whether represented or not) and witnesses (lay and expert).
2These guidelines apply to civil and criminal proceedings.
Definitions
3Artificial intelligence (AI): A machine-based system that, for explicit or implicit objectives, infers from the input it receives how to generate outputs such as predictions, content, recommendations or decisions that can influence physical or virtual environments. Different AI systems vary in their levels of autonomy and adaptiveness after deployment. AI is an umbrella term that captures the following:
a)Generative AI (GenAI): Software systems that generate content such as text, images, music, audio and video, based on a user's prompts.
b)Public AI: AI tools that are openly accessible to the public, typically via the internet. Public AI tools are trained on broad, often public datasets, most commonly for general purpose use.
c)Closed AI: The phrase 'closed AI' is defined in contrast to public AI. Closed AI tools are generally not openly accessible to the public, and information used in closed AI tools remains within a controlled environment. When an AI tool is 'closed', there are controls, such as privacy or confidentiality settings, that protect information from being made publicly available or used to train the AI tool.
Principles
4The following principles are intended to guide the use of AI in documents submitted to the court/tribunal.
Principle 1: Access to justice
5The court/tribunal recognises that AI use can enhance access to justice and court users' participation in court processes. However, there are limitations and risks, set out in these guidelines, that court users need to be aware of before using AI.
6The use of AI tools by court/tribunal users is permitted provided these guidelines are complied with.
Principle 2: Impartiality and Fairness
7AI can produce outputs that are biased and misleading. This is because it:
a)produces responses based on the dataset it was trained on. This means the responses it generates will reflect any biases (cultural, ethical or geographical) or misinformation in the training data.
b)produces responses based on a statistical prediction of what the most likely combination of words is. It cannot critically examine the patterns it identifies in data. This can result in it drawing inaccurate or biased conclusions.
8Users of AI should seek to understand the limitations of AI tools and take reasonable steps to minimise and avoid reinforcing or perpetuating discriminatory or biased applications and outcomes.
Principle 3: Accountability and independence
9Users of AI remain fully responsible for the content in all documents submitted to the court/tribunal.
10Court users should be aware that the outputs of AI can be inaccurate or incorrect. Court users must personally verify any AI output and confirm it is not:
a)Out of date: the model used may only have been trained on data to a certain point in time and therefore will be unaware of any more recent jurisprudence or other developments in the law that may be relevant to a case.
b)Incomplete: the tool may not generate material addressing all arguments that a party is required to make or all issues that would be in a party’s interests to cover, and summaries generated by such tools may not contain all relevant points.
c)Inaccurate or incorrect: the tool may not produce factually or legally correct output (for example in some situations, users have been adversely affected by placing reliance on made-up cases, legislation or incorrect legal propositions).
d)Inapplicable to the jurisdiction: as the data used to train the underlying model might be drawn from other jurisdictions with different substantive laws and procedural requirements.
11Lawyers should be aware that they have existing duties to act with competence and diligence, and to provide independent advice. This means that irrespective of the AI tools used, lawyers must exercise oversight and verify the accuracy and suitability of the information provided by any AI system.
12Lawyers should also be aware that if they rely on unverified AI outputs in material submitted to courts and tribunals they may be referred to the Victorian Legal Services Board and Commissioner and/or ordered to pay costs to the other side.
Principle 4: Privacy and data security
13Users should understand that there are privacy and data risks associated with using AI.
14Users should be aware that some AI tools retain the information entered into them and can use it to train the AI system and to respond to queries from other users.
15Users should be aware that data contained in AI training datasets may have been obtained in breach of copyright.
16For public AI tools:
a)Users should be aware that any information entered into a public AI tool could become publicly available.
b)Users should not enter any information which is confidential or sensitive (including information subject to a non-publication or suppression order) into public AI tools. If using public AI tools, users should make sure inputs are appropriately anonymised and generalised.
c)Lawyers should not enter confidential, sensitive or privileged client information into public AI tools. Lawyers must comply with their obligations to maintain client confidentiality.
17For closed AI tools:
a)If users input private, confidential or sensitive information into closed AI tools, they must exercise caution and satisfy themselves, by reviewing the contractual terms or privacy and confidentiality settings, that the information they input will be kept within a secure environment and not made public or used to train the AI program.
Principle 5: Transparency and open justice
18The use of AI by court users must not indirectly mislead another participant in the litigation process or the court/tribunal about the nature of any work undertaken or the content produced by that program.
19The use of AI by lawyers to assist in the completion of legal tasks must be subject to the obligations of lawyers in the conduct of litigation. This includes the obligation of candour to the court/tribunal and, where applicable, obligations imposed by the Civil Procedure Act 2010, by which lawyers and litigants represent that documents and submissions have a proper basis.
Principle 6: Contestability and procedural fairness
20Courts/tribunals may give directions to court users to provide further information about documents they have produced with the assistance of AI.
21Court users must be prepared to identify the specific portions of the documents which were produced with AI and be able to explain how the output was verified.
Principle 7: Efficiency and effectiveness
22Users may find that using AI can save significant time and resources across a range of tasks. However, AI outputs can be inaccurate, and users should factor in appropriate time to verify them.
Principle 8: Human oversight and monitoring
23Any content produced using AI must be verified with meaningful human control.
Submissions
24To ensure the accuracy of submissions made to the court/tribunal, court users should, for any GenAI content:
a)Fact-check and proofread.
b)Edit and adapt the content to suit the situation.
c)Check that any references to case law, legislation, textbooks or articles in the AI content exist and stand for the legal positions attributed to them. If the AI content includes extracts or quotes, these must be verified as accurate and attributed to the correct source.
d)When checking the materials referred to in (c), use a source that is known to have accurate content. For self-represented litigants this includes AustLII (https://www.austlii.edu.au/) for case law and the Victorian legislation website (https://www.legislation.vic.gov.au/) for legislation.
e)Note that it is not sufficient verification to ask an AI tool for confirmation that the materials exist or contain the content attributed to them. To be clear, one AI tool cannot be used to confirm the content generated by another AI tool.
Affidavits and written statements
25Particular caution needs to be exercised if GenAI tools are used to assist in the preparation of affidavit materials, witness statements or other documents created to represent the evidence or opinion of a witness.
26The relevant person should ensure that documents are sworn/affirmed or finalised in a manner that reflects that person’s own knowledge and words.
27To ensure written statements are accurate and reflect a person's own knowledge, where a jurat is not already required, the person whose statement it is must verify at the end of the document that they have read the contents of the statement and the documents referred to in it, and that it is an accurate and true representation.
Expert reports
28Particular caution needs to be exercised if AI is used to assist in the generation or expression of an opinion contained in an expert report.
29Expert reports must be prepared in compliance with the Expert Witness Code of Conduct and the Practice Note Expert Evidence in Criminal Trials, which contains directions at section 7.4 to disclose the use of AI.
-
Supreme Court of Victoria, Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation (Guidelines, 6 May 2024) <http://www.supremecourt.vic.gov.au/forms-fees-and-services/forms-templates-and-guidelines/guideline-responsible-use-of-ai-in-litigation>.
-
County Court of Victoria, Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation (Guidelines, 3 July 2024); Supreme Court of Victoria, Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation (Guidelines, 6 May 2024) <http://www.supremecourt.vic.gov.au/forms-fees-and-services/forms-templates-and-guidelines/guideline-responsible-use-of-ai-in-litigation>.
-
Consultation 9 (Victorian Civil and Administrative Tribunal).
-
F Abedi and NJ Balmer, AI Use in the Legal Profession: Findings from the 2025 Victorian Lawyer Census (Report, Victorian Legal Services Board and Commissioner, forthcoming 2025) 10.
-
The Law Society of NSW, Legal Practice Board of Western Australia, and Victorian Legal Services Board and Commissioner, Statement on the Use of Artificial Intelligence in Australian Legal Practice (Statement, 26 March 2025) <https://lsbc.vic.gov.au/news-updates/news/statement-use-artificial-intelligence-australian-legal-practice>.
-
Legal Profession Uniform Conduct (Barristers) Rules 2015; Legal Profession Uniform Legal Practice (Solicitors) Rules 2015.
-
The Law Society of NSW, Legal Practice Board of Western Australia, and Victorian Legal Services Board and Commissioner, Statement on the Use of Artificial Intelligence in Australian Legal Practice (Statement, 26 March 2025) <https://lsbc.vic.gov.au/news-updates/news/statement-use-artificial-intelligence-australian-legal-practice>.
-
F Abedi and NJ Balmer, AI Use in the Legal Profession: Findings from the 2025 Victorian Lawyer Census (Report, Victorian Legal Services Board and Commissioner, forthcoming 2025) 10.
-
Ibid 9.
-
Ibid 6.
-
Law Institute Victoria, Ethical and Responsible Use of Artificial Intelligence Guideline (Ethical Guideline, 13 August 2025) <https://www.liv.asn.au/download.aspx?DocumentVersionKey=69158983-87f3-4c1d-be99-8c300b5c7afd>. The Law Institute of Victoria consulted with the VLSB+C and the Legal Practitioners' Liability Committee in drafting the guideline.
-
Ibid 2.
-
Ibid 3. The dataset nutrition label provides information about a dataset including its intended use and other known uses, the process of cleaning, managing, and curating that data, ethical and or technical reviews, the inclusion of subpopulations in the dataset, and a series of potential risks or limitations in the dataset. See ‘The Dataset Nutrition Label’, The Data Nutrition Project (Web Page, 2021) <https://labelmaker.datanutrition.org/> for further information.
-
‘Artificial Intelligence Hub’, Law Institute of Victoria (Web Page) <https://www.liv.asn.au/aihub>.
-
James Barber KC, ‘Guidance on the Ethical Use of Generative AI’, Victorian Bar (Web Page, 25 August 2025) <https://www.vicbar.com.au/Web/Web/Contents/Resources/Ethics-Hub/Common-Issues/Use-of-Generative-AI.aspx>.
-
‘Who We Are’, Legal Practitioners’ Liability Committee (Web Page) <https://lplc.com.au/about/who-we-are>.
-
‘Limitations and Risks of Using AI in Legal Practice’, Legal Practitioners’ Liability Committee (Web Page, 17 August 2023) <https://lplc.com.au/resources/lplc-article/limitations-risks-ai-in-legal-practice>.
-
‘Managing the Risks of AI in Law Practices’, Legal Practitioners’ Liability Committee (Web Page, 13 February 2025) <https://lplc.com.au/resources/lij-article/managing-the-risks-of-ai-in-law-practices>.
-
F Abedi and NJ Balmer, AI Use in the Legal Profession: Findings from the 2025 Victorian Lawyer Census (Report, Victorian Legal Services Board and Commissioner, forthcoming 2025) 7.
-
Victoria Police, Victoria Police Artificial Intelligence Ethics Framework (Policy, 20 March 2024) <https://www.police.vic.gov.au/victoria-police-artificial-intelligence-ethics-framework>.
-
Australia New Zealand Policing Advisory Agency (ANZPAA), Australia New Zealand Responsible and Ethical Artificial Intelligence Framework (Report, 22 July 2025) <https://www.anzpaa.org.au/products/products/australia-new-zealand-responsible-and-ethical-artificial-intelligence-framework>.
-
Supreme Court of New South Wales, Supreme Court Practice Note SC Gen 23 Use of Generative Artificial Intelligence (Gen AI) (Practice Note, 28 January 2025) <https://supremecourt.nsw.gov.au/documents/Practice-and-Procedure/Practice-Notes/general/current/PN_SC_Gen_23.pdf>.
-
Ibid paras 2, 6.
-
Ibid para 7.
-
Ibid para 9B clarifies that the prohibition does not apply to: ‘(a) the generation of chronologies, indexes and witness lists; (b) the preparation of briefs or draft Crown Case Statements; (c) the summarising or review of documents and transcripts; (d) the preparation of written submissions or summaries of argument’.
-
Ibid para 9A.
-
Ibid paras 10-15.
-
Ibid para 16.
-
Ibid para 20.
-
Land and Environment Court of New South Wales, Practice Note: Use of Generative Artificial Intelligence (Gen AI) (Practice Note, 11 February 2025) <https://lec.nsw.gov.au/lec/news-and-announcements/practice-note—use-of-generative-ai.html>.
-
District Court New South Wales, District Court General Practice Note 2 Generative AI Practice Note (Practice Note, 18 December 2024) <https://districtcourt.nsw.gov.au/documents/practice-notes/district-court-pn—general/Gen_AI_Practice_Note.pdf>.
-
NSW Civil & Administrative Tribunal, NCAT Procedural Direction 7 – Use of Generative Artificial Intelligence (Gen AI) (Procedural Direction, 7 March 2025).
-
Ibid 1.
-
Queensland Courts, The Use of Generative Artificial Intelligence (AI) Guidelines for Responsible Use by Non-Lawyers (Guidelines, 14 May 2024) <https://www.courts.qld.gov.au/about/news/news233/2024/the-use-of-generative-artificial-intelligence-ai>. Note: these guidelines were revised on 15 September 2025.
-
Queensland Courts, The Use of Generative Artificial Intelligence (AI) Guidelines for Responsible Use by Non-Lawyers (Guidelines, 15 September 2025) <https://www.courts.qld.gov.au/__data/assets/pdf_file/0012/798375/Artificial-Intelligence_Guidelines-for-Non-Lawyers.pdf>.
-
Federal Court of Australia, Notice to the Profession: Artificial Intelligence Use in the Federal Court of Australia (Notice, 29 April 2025).
-
Ibid.
-
Ibid.
-
Supreme Court of Western Australia, Artificial Intelligence Practice Direction: Consultation Note (Report, 27 February 2025) <https://www.supremecourt.wa.gov.au/_files/AI_practice_direction.pdf>.
-
‘A Statement from The Honourable Chris Kourakis, Chief Justice of South Australia Launching a Survey about Use of Generative AI in the South Australian Courts’, Courts Administration Authority of South Australia (Web Page, 30 May 2025) <https://www.courts.sa.gov.au/2025/05/30/a-statement-from-the-honourable-chris-kourakis-chief-justice-of-south-australia-launching-a-survey-about-use-of-generative-ai-in-the-south-australian-courts/>.
-
The Law Society of NSW, Legal Practice Board of Western Australia, and Victorian Legal Services Board and Commissioner, Statement on the Use of Artificial Intelligence in Australian Legal Practice (Statement, 26 March 2025) <https://lsbc.vic.gov.au/news-updates/news/statement-use-artificial-intelligence-australian-legal-practice>.
-
Law Society of NSW, ‘A Solicitor’s Guide to Responsible Use of Artificial Intelligence’ (14 November 2023) LSJ Online <https://lsj.com.au/articles/a-solicitors-guide-to-responsible-use-of-artificial-intelligence/>; Law Society of New South Wales, A Solicitor’s Guide to Responsible Use of Artificial Intelligence (Guideline, October 2024) <https://www.lawsociety.com.au/sites/default/files/2024-12/LS4527_MKG_ResponsibleAIGuide_2024-10-25%5B43%5D.pdf>.
-
NSW Bar Association, Issues Arising from the Use of AI Language Models (Including ChatGPT) in Legal Practice (Guidelines, 12 July 2023) <https://inbrief.nswbar.asn.au/posts/9e292ee2fc90581f795ff1df0105692d/attachment/NSW%20Bar%20Association%20GPT%20AI%20Language%20Models%20Guidelines.pdf>.
-
Queensland Law Society and QLS Ethics and Practice Centre, Guidance Statement No.38 Artificial Intelligence in Legal Practice (Report, 24 October 2024) <https://www.qls.com.au/content-collections/statements/guidance-statement-no-38-artificial-intelligence-in-legal-practice>.
-
Queensland Law Society, AI Companion Guide: The QLS overview guide to AI for solicitors (Guide, Version 1.0, Queensland Law Society, September 2024) <https://www.qls.com.au/content-collections/guides/ai-companion-guide>.
-
Law Society of South Australia, Use of Artificial Intelligence in the Federal Court of Australia (Submission to Federal Court Consultation on Artificial Intelligence Use in the Federal Court of Australia, 30 May 2025) <https://lssa.informz.net/lssa/data/images/Website/submissions/L300525_toLCA_UseofAIintheFederalCourt.pdf>.
-
Law Society of Western Australia, Submission: Supreme Court of Western Australia – Artificial Intelligence (Submission to Supreme Court of Western Australia’s Consultation on the Court’s Artificial Intelligence Practice Direction Consultation Note, 11 April 2025).
-
Law Society of Western Australia, Submission: Federal Court of Australia Artificial Intelligence Project Group (Submission to Federal Court Consultation on Artificial Intelligence Use in the Federal Court of Australia, 11 June 2025) <https://lawsocietywa.asn.au/wp-content/uploads/2025/06/2025JUN11-Submission-to-Federal-Court-AI-Project-Group.pdf>.
-
Ibid 9.
-
Law Council of Australia, Artificial Intelligence Use in the Federal Court of Australia (Submission to Federal Court Consultation on Artificial Intelligence Use in the Federal Court of Australia, 16 June 2025) <https://lawcouncil.au/publicassets/e4c1fdf4-334b-f011-94b6-005056be13b5/4686%20-%20S%20-%20AI%20use%20in%20the%20Federal%20Court%20of%20Australia.pdf>.
-
Ibid 6–7.
-
For example, Federal Court of Canada, Notice to Parties and the Profession – The Use of Artificial Intelligence in Court Proceedings (Notice, 7 May 2024).
-
Some examples include Court of King's Bench of Manitoba, Practice Direction: Re: Use of Artificial Intelligence in Court Submissions (Practice Direction, 23 June 2023); Court of Quebec, Notice to the Legal Community and the Public – Maintaining the Integrity of Submissions before the Court When Using Large Language Models (Notice, 26 January 2024) <https://courduquebec.ca/en/article/notice-to-the-legal-community-and-the-public-maintaining-the-integrity-of-submissions-before-the-court-when-using-large-language-models>. See Appendix C for further examples.
-
Some examples include Canadian Bar Association, Ethics of Artificial Intelligence for the Legal Practitioner – Toolkit (Guide, 2025) '3. Guidelines Relating to Use' <https://cba.org/resources/practice-tools/ethics-of-artificial-intelligence-for-the-legal-practitioner/3-guidelines-relating-to-use/>; The Law Society of Alberta, The Generative AI Playbook (Guide, January 2024) <https://www.lawsociety.ab.ca/resource-centre/key-resources/professional-conduct/the-generative-ai-playbook/>; The Law Society of Manitoba, Generative Artificial Intelligence: Guidelines for Use in the Practice of Law (Guidelines, April 2024) <https://educationcentre.lawsociety.mb.ca/wp-content/uploads/sites/2/2024/04/Generative-Artificial-Intelligence-Guidelines-for-Use-in-the-Practice-of-Law.pdf>. Additional examples are provided in Appendix C.
-
Caribbean Court of Justice, Practice Direction No. 1 of 2025: The Use of Generative Artificial Intelligence Tools in Court Proceedings (Practice Direction, 14 February 2025) <https://ccj.org/wp-content/uploads/2025/02/PRACTICE-DIRECTION-NO.-1-OF-2025-THE-USE-OF-GENERATIVE-ARTIFICIAL-INTELLIGENCE-TOOLS.pdf>.
-
The Law Society of England and Wales, Generative AI – the Essentials (Guide, August 2024) <https://www.lawsociety.org.uk//en/Topics/AI-and-lawtech/Guides/Generative-AI-the-essentials>.
-
Peter Homoki, Guide on the Use of Artificial Intelligence-Based Tools by Lawyers and Law Firms in the EU (Report, Council of Bars and Law Societies of Europe (CCBE) and European Lawyers Foundation (ELF), 2022) <https://www.ccbe.eu/fileadmin/speciality_distribution/public/documents/IT_LAW/ITL_Reports_studies/EN_ITL_20220331_Guide-AI4L.pdf>.
-
Bar Council Malaysia, The Risks and Precautions in Using Generative Artificial Intelligence in the Legal Profession, Specifically ChatGPT (Circular No 342/2023, 24 November 2023) <https://www.malaysianbar.org.my/document/members/circulars/2020—2024/2023&rid=46578>.
-
Courts of New Zealand, Guidelines for Use of Generative Artificial Intelligence in Courts and Tribunals: Lawyers (Guidelines, 7 December 2023); Courts of New Zealand, Guidelines for Use of Generative Artificial Intelligence in Courts and Tribunals: Non-Lawyers (Guidelines, 7 December 2023).
-
Law Society of New Zealand, Lawyers and Generative AI (Guidance, March 2024) <https://www.lawsociety.org.nz/assets/Professional-practice-docs/Rules-and-Guidelines/Lawyers-and-AI-Guidance-Mar-2024.pdf>.
-
Law Society of Scotland, Guide to Generative AI (Guide, October 2024) <https://www.lawscot.org.uk/members/business-support/lawscottech/resources/guide-to-generative-ai/>.
-
Supreme Court of Singapore, Guide on the Use of Generative Artificial Intelligence Tools by Court Users (Registrar’s Circular No 1 of 2024, 1 October 2024) <https://www.judiciary.gov.sg/docs/default-source/news-and-resources-docs/guide-on-the-use-of-generative-ai-tools-by-court-users.pdf>.
-
Some examples include Supreme Court of Illinois, Illinois Supreme Court Policy on Artificial Intelligence (Policy, 1 January 2025); District Courts & County Courts at Law, Wichita County, Texas, Standing Order Regarding Use of Artificial Intelligence (Standing Order, 26 March 2024) <https://wichitacountytx.com/download/standing-order-regarding-use-of-artificial-intelligence/>; American Bar Association, Standing Committee on Ethics and Professional Responsibility, Formal Opinion 512: Generative Artificial Intelligence Tools (Report, 29 July 2024) <https://www.americanbar.org/content/dam/aba/administrative/professional_responsibility/ethics-opinions/aba-formal-opinion-512.pdf>. For additional examples see Appendix C.
-
Federal Court of Canada, Notice to Parties and the Profession – The Use of Artificial Intelligence in Court Proceedings (Notice, 7 May 2024); Supreme Court of Yukon, Use of Artificial Intelligence Tools (Practice Direction General No 29, 26 June 2023); The Law Society of Manitoba, Generative Artificial Intelligence: Guidelines for Use in the Practice of Law (Guidelines, April 2024) <https://educationcentre.lawsociety.mb.ca/wp-content/uploads/sites/2/2024/04/Generative-Artificial-Intelligence-Guidelines-for-Use-in-the-Practice-of-Law.pdf>.
-
Federal Court of Canada, Notice to Parties and the Profession – The Use of Artificial Intelligence in Court Proceedings (Notice, 7 May 2024) 1–2.
-
Orange County Superior Court, Orange County Superior Court Department C31: Pre-Trial Order and Standing Order Re: Artificial Intelligence (Guidance, 25 January 2024) <https://www.occourts.org/system/files?file=civil/knillprocedures.pdf>.
-
394th Judicial District Court (Texas), Standing Order Regarding Use of Artificial Intelligence (Standing Order, 1 August 2024) <https://img1.wsimg.com/blobby/go/2f8cb9d7-adb6-4232-a36b-27b72fdfcd38/downloads/Standing%20order%20Regarding%20Use%20of%20Artificial%20Int.pdf?ver=1720638374301>; County Courts at Law Judges in and for Collin County, Texas, Standing Order No. 3: Use of Artificial Intelligence for Any Court Filing (Standing Order, 3 July 2024) <https://topics.txcourts.gov/LocalRulesPublic/PreviewAttachment/2030>; District Courts & County Courts at Law, Wichita County, Texas, Standing Order Regarding Use of Artificial Intelligence (Standing Order, 26 March 2024) <https://wichitacountytx.com/download/standing-order-regarding-use-of-artificial-intelligence/>; Gray County Court, Texas, Standing Order Regarding Use of Artificial Intelligence (Standing Order, 17 July 2024) <https://topics.txcourts.gov/LocalRulesPublic/PreviewAttachment/2014>; Hudspeth County Court, Texas, Standing Order Regarding Use of Artificial Intelligence (Standing Order, 12 July 2024) <https://topics.txcourts.gov/LocalRulesPublic/PreviewAttachment/1997>.
-
United States District Court for the Eastern District of Pennsylvania, Standing Order Re Artificial Intelligence (‘AI’) in Cases Assigned to Judge Baylson (Standing Order, 6 June 2023) <https://www.paed.uscourts.gov/sites/paed/files/documents/locrules/standord/Standing%20Order%20Re%20Artificial%20Intelligence%206.6.pdf>.
-
Court of Appeal of Alberta, Court of King’s Bench of Alberta and Alberta Court of Justice, Notice to the Profession & Public – Ensuring the Integrity of Court Submissions When Using Large Language Models (Notice, 6 October 2023) <https://albertacourts.ca/kb/resources/announcements/notice-to-the-profession-public—use-of-ai-in-citations-submissions>; Court of Quebec, Notice to the Legal Community and the Public – Maintaining the Integrity of Submissions before the Court When Using Large Language Models (Notice, 26 January 2024) <https://courduquebec.ca/en/article/notice-to-the-legal-community-and-the-public-maintaining-the-integrity-of-submissions-before-the-court-when-using-large-language-models>; Nova Scotia Courts, Use of Artificial Intelligence (AI) in Proceedings before the Nova Scotia Court of Appeal (Report, 14 March 2025) <https://www.courts.ns.ca/resources/notices/use-of-artificial-intelligence-ai-proceedings-nova-scotia-court-of-appeal>; Supreme Court of Newfoundland and Labrador, Notice to the Profession and the General Public Ensuring the Integrity of Court Submissions When Using Large Language Models (Notice, 12 October 2023).
-
Caribbean Court of Justice, Practice Direction No. 1 of 2025: The Use of Generative Artificial Intelligence Tools in Court Proceedings (Practice Direction, 14 February 2025) 3 <https://ccj.org/wp-content/uploads/2025/02/PRACTICE-DIRECTION-NO.-1-OF-2025-THE-USE-OF-GENERATIVE-ARTIFICIAL-INTELLIGENCE-TOOLS.pdf>.
-
Courts and Tribunals Judiciary (UK), Artificial Intelligence (AI) Guidance for Judicial Office Holders (Guidance, 14 April 2025) 7 <https://www.judiciary.uk/wp-content/uploads/2025/04/Refreshed-AI-Guidance-published-version.pdf>.
-
Courts of New Zealand, Guidelines for Use of Generative Artificial Intelligence in Courts and Tribunals: Lawyers (Guidelines, 7 December 2023) 4; Courts of New Zealand, Guidelines for Use of Generative Artificial Intelligence in Courts and Tribunals: Non-Lawyers (Guidelines, 7 December 2023) 4.
-
Supreme Court of Singapore, Guide on the Use of Generative Artificial Intelligence Tools by Court Users (Registrar’s Circular No 1 of 2024, 1 October 2024) 1 <https://www.judiciary.gov.sg/docs/default-source/news-and-resources-docs/guide-on-the-use-of-generative-ai-tools-by-court-users.pdf>.
-
Ibid 3–4.
-
Supreme Court of Illinois, Illinois Supreme Court Policy on Artificial Intelligence (Policy, 1 January 2025) 2.
-
Court of Appeal of Alberta, Court of King’s Bench of Alberta and Alberta Court of Justice, Notice to the Profession & Public – Ensuring the Integrity of Court Submissions When Using Large Language Models (Notice, 6 October 2023) <https://albertacourts.ca/kb/resources/announcements/notice-to-the-profession-public—use-of-ai-in-citations-submissions>.
-
Caribbean Court of Justice, Practice Direction No. 1 of 2025: The Use of Generative Artificial Intelligence Tools in Court Proceedings (Practice Direction, 14 February 2025) 2 <https://ccj.org/wp-content/uploads/2025/02/PRACTICE-DIRECTION-NO.-1-OF-2025-THE-USE-OF-GENERATIVE-ARTIFICIAL-INTELLIGENCE-TOOLS.pdf>.
-
The Law Society of England and Wales, Generative AI – the Essentials (Guide, August 2024) 6 <https://www.lawsociety.org.uk//en/Topics/AI-and-lawtech/Guides/Generative-AI-the-essentials>.
-
Bar Council Malaysia, The Risks and Precautions in Using Generative Artificial Intelligence in the Legal Profession, Specifically ChatGPT (Circular No 342/2023, 24 November 2023) 6 <https://www.malaysianbar.org.my/document/members/circulars/2020—2024/2023&rid=46578>.
-
Courts of New Zealand, Guidelines for Use of Generative Artificial Intelligence in Courts and Tribunals: Lawyers (Guidelines, 7 December 2023) 4; Courts of New Zealand, Guidelines for Use of Generative Artificial Intelligence in Courts and Tribunals: Non-Lawyers (Guidelines, 7 December 2023) 4.
-
Law Society of Scotland, Guide to Generative AI (Guide, October 2024) 9 <https://www.lawscot.org.uk/members/business-support/lawscottech/resources/guide-to-generative-ai/>.
-
Supreme Court of Singapore, Guide on the Use of Generative Artificial Intelligence Tools by Court Users (Registrar’s Circular No 1 of 2024, 1 October 2024) 3 <https://www.judiciary.gov.sg/docs/default-source/news-and-resources-docs/guide-on-the-use-of-generative-ai-tools-by-court-users.pdf>.
-
Supreme Court of Illinois, Illinois Supreme Court Policy on Artificial Intelligence (Policy, 1 January 2025) 2.
-
Canadian Bar Association, Ethics of Artificial Intelligence for the Legal Practitioner – Toolkit (Guide, 2025) ‘3. Guidelines Relating to Use’, [3.2] <https://cba.org/resources/practice-tools/ethics-of-artificial-intelligence-for-the-legal-practitioner/3-guidelines-relating-to-use/>.
-
Caribbean Court of Justice, Practice Direction No. 1 of 2025: The Use of Generative Artificial Intelligence Tools in Court Proceedings (Practice Direction, 14 February 2025) 3 <https://ccj.org/wp-content/uploads/2025/02/PRACTICE-DIRECTION-NO.-1-OF-2025-THE-USE-OF-GENERATIVE-ARTIFICIAL-INTELLIGENCE-TOOLS.pdf>.
-
The Law Society of England and Wales, Generative AI – the Essentials (Guide, August 2024) 12 <https://www.lawsociety.org.uk//en/Topics/AI-and-lawtech/Guides/Generative-AI-the-essentials>.
-
Bar Council Malaysia, The Risks and Precautions in Using Generative Artificial Intelligence in the Legal Profession, Specifically ChatGPT (Circular No 342/2023, 24 November 2023) 6 <https://www.malaysianbar.org.my/document/members/circulars/2020—2024/2023&rid=46578>.
-
Courts of New Zealand, Guidelines for Use of Generative Artificial Intelligence in Courts and Tribunals: Lawyers (Guidelines, 7 December 2023) 3; Courts of New Zealand, Guidelines for Use of Generative Artificial Intelligence in Courts and Tribunals: Non-Lawyers (Guidelines, 7 December 2023) 3.
-
Law Society of Scotland, Guide to Generative AI (Guide, October 2024) 10 <https://www.lawscot.org.uk/members/business-support/lawscottech/resources/guide-to-generative-ai/>.
-
Supreme Court of Singapore, Guide on the Use of Generative Artificial Intelligence Tools by Court Users (Registrar’s Circular No 1 of 2024, 1 October 2024) 4 <https://www.judiciary.gov.sg/docs/default-source/news-and-resources-docs/guide-on-the-use-of-generative-ai-tools-by-court-users.pdf>.
-
American Bar Association, Standing Committee on Ethics and Professional Responsibility, Formal Opinion 512: Generative Artificial Intelligence Tools (Report, 29 July 2024) 6 <https://www.americanbar.org/content/dam/aba/administrative/professional_responsibility/ethics-opinions/aba-formal-opinion-512.pdf>.
-
Submission 23 (Victorian Bar Association).
-
Submission 6 (Victorian Legal Services Board and Commissioner).
-
Supreme Court of New South Wales, Supreme Court Practice Note SC Gen 23 Use of Generative Artificial Intelligence (Gen AI) (Practice Note, 28 January 2025) <https://supremecourt.nsw.gov.au/documents/Practice-and-Procedure/Practice-Notes/general/current/PN_SC_Gen_23.pdf>; Supreme Court of Victoria, Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation (Guidelines, 6 May 2024) <http://www.supremecourt.vic.gov.au/forms-fees-and-services/forms-templates-and-guidelines/guideline-responsible-use-of-ai-in-litigation>.
-
Supreme Court of Victoria, Practice Note SC Gen 1 Practice Notes and Notice to the Profession (Practice Note, 30 January 2017) para 4.3 <https://www.supremecourt.vic.gov.au/sites/default/files/assets/2017/09/25/18c025b3f/gen1practicenotesandnoticetotheprofession.pdf>.
-
Ibid.
-
Director of Public Prosecutions v GR [2025] VSC 490, [78].
-
F Abedi and NJ Balmer, AI Use in the Legal Profession: Findings from the 2025 Victorian Lawyer Census (Report, Victorian Legal Services Board and Commissioner, forthcoming 2025) 10.
-
Federal Court of Australia, Notice to the Profession: Artificial Intelligence Use in the Federal Court of Australia (Notice, 29 April 2025).
-
Law Council of Australia, Artificial Intelligence Use in the Federal Court of Australia (Submission to Federal Court Consultation on Artificial Intelligence Use in the Federal Court of Australia, 16 June 2025) 20 <https://lawcouncil.au/publicassets/e4c1fdf4-334b-f011-94b6-005056be13b5/4686%20-%20S%20-%20AI%20use%20in%20the%20Federal%20Court%20of%20Australia.pdf>.
-
Law Society of Western Australia, Submission: Federal Court of Australia Artificial Intelligence Project Group (Submission to Federal Court Consultation on Artificial Intelligence Use in the Federal Court of Australia, 11 June 2025) 8 <https://lawsocietywa.asn.au/wp-content/uploads/2025/06/2025JUN11-Submission-to-Federal-Court-AI-Project-Group.pdf>.
-
Submissions 2 (Assoc Prof Marcus Smith), 16 (Law Institute of Victoria), 22 (Centre for the Future of the Legal Profession and UNSW Law and Justice). Consultation 6 (Office of Public Prosecutions).
-
Submissions 2 (Assoc Prof Marcus Smith), 26 (Supreme Court of Victoria).
-
Consultation 9 (Victorian Civil and Administrative Tribunal).
-
Queensland Courts, The Use of Generative Artificial Intelligence (AI) Guidelines for Responsible Use by Non-Lawyers (Guidelines, 13 May 2025) <https://www.courts.qld.gov.au/about/news/news233/2024/the-use-of-generative-artificial-intelligence-ai>; Supreme Court of Victoria, Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation (Guidelines, 6 May 2024) <http://www.supremecourt.vic.gov.au/forms-fees-and-services/forms-templates-and-guidelines/guideline-responsible-use-of-ai-in-litigation>.
-
Supreme Court of New South Wales, Supreme Court Practice Note SC Gen 23 Use of Generative Artificial Intelligence (Gen AI) (Practice Note, 28 January 2025) <https://supremecourt.nsw.gov.au/documents/Practice-and-Procedure/Practice-Notes/general/current/PN_SC_Gen_23.pdf>.
-
Submission 26 (Supreme Court of Victoria).
-
Consultation 15 (Magistrates’ Court of Victoria).
-
Consultation 6 (Office of Public Prosecutions).
-
Submission 16 (Law Institute of Victoria).
-
Submission 22 (Centre for the Future of the Legal Profession and UNSW Law and Justice).
-
For example, the NSW Civil and Administrative Tribunal modified the NSW Supreme Court’s Gen AI Practice Note to accommodate the particular types of proceedings in their jurisdiction: NSW Civil & Administrative Tribunal, NCAT Procedural Direction 7 – Use of Generative Artificial Intelligence (Gen AI) (Procedural Direction, 7 March 2025).
-
Consultation 32 (Supreme Court of Victoria).
-
Michael Pelly, ‘An Interview with Chief Justice Gageler’, Westlaw Updates & Alerts (Web Page, 8 July 2025) <https://support.thomsonreuters.com.au/product/westlaw-precision-australia/updates-alerts/interview-chief-justice-gageler>.
-
Ibid.
-
Ibid.
-
Submission 22 (Centre for the Future of the Legal Profession and UNSW Law and Justice).
-
Submission 16 (Law Institute of Victoria).
-
Ibid.
-
Supreme Court of Victoria, Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation (Guidelines, 6 May 2024) 5 <http://www.supremecourt.vic.gov.au/forms-fees-and-services/forms-templates-and-guidelines/guideline-responsible-use-of-ai-in-litigation>.
-
Organisation for Economic Co-operation and Development (OECD), Recommendation of the Council on Artificial Intelligence, OECD/LEGAL/0449, 3 May 2024, 7 <https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0449>.
-
The OECD is an authoritative international body that plays a leading role in setting international standards. It has 38 member states including Australia, Canada, New Zealand, Germany, Japan, UK, USA, France and Denmark. ‘Members and Partners’, OECD (Web Page) <https://www.oecd.org/en/about/members-partners.html>.
-
Department of Industry, Science and Resources (Cth), Safe and Responsible AI in Australia: Proposals Paper for Introducing Mandatory Guardrails for AI in High-Risk Settings (Proposals Paper, September 2024) 53; Department of Premier and Cabinet (Vic), Administrative Guideline – The Safe and Responsible Use of Generative AI in the Victorian Public Sector (No 2024/07, Issue 1.0, November 2024) <https://www.vic.gov.au/sites/default/files/2024-11/Generative-AI-Guideline-%281%29.pdf>.
-
Consultation 33 (Law Firms Australia).
-
Supreme Court of New South Wales, Supreme Court Practice Note SC Gen 23 Use of Generative Artificial Intelligence (Gen AI) (Practice Note, 28 January 2025) para 6 <https://supremecourt.nsw.gov.au/documents/Practice-and-Procedure/Practice-Notes/general/current/PN_SC_Gen_23.pdf>.
-
Ibid.
-
Consultation 33 (Law Firms Australia).
-
Submission 22 (Centre for the Future of the Legal Profession and UNSW Law and Justice).
-
Submission 6 (Victorian Legal Services Board and Commissioner).
-
Supreme Court of Victoria, Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation (Guidelines, 6 May 2024) <http://www.supremecourt.vic.gov.au/forms-fees-and-services/forms-templates-and-guidelines/guideline-responsible-use-of-ai-in-litigation>; County Court of Victoria, Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation (Guidelines, 3 July 2024).
-
Queensland Courts, The Use of Generative Artificial Intelligence (AI) Guidelines for Responsible Use by Non-Lawyers (Guidelines, 13 May 2025) 1 <https://www.courts.qld.gov.au/about/news/news233/2024/the-use-of-generative-artificial-intelligence-ai>.
-
Submission 24 (County Court of Victoria).
-
Supreme Court of Victoria, Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation (Guidelines, 6 May 2024) 1 <http://www.supremecourt.vic.gov.au/forms-fees-and-services/forms-templates-and-guidelines/guideline-responsible-use-of-ai-in-litigation>.
-
Submission 6 (Victorian Legal Services Board and Commissioner). Consultation 4 (Victorian Legal Services Board and Commissioner).
-
Consultation 11 (Law Institute of Victoria).
-
Submission 16 (Law Institute of Victoria).
-
Submission 6 (Victorian Legal Services Board and Commissioner).
-
Weedbrook v Partlin [2024] QDC 194, [41].
-
Federal Court of Canada, Notice to Parties and the Profession – The Use of Artificial Intelligence in Court Proceedings (Notice, 7 May 2024) 3.
-
Consultation 12 (County Court of Victoria).
-
Submission 22 (Centre for the Future of the Legal Profession and UNSW Law and Justice).
-
Consultations 22 (Court Services Victoria), 23 (Dr Fabian Horton).
-
Courts of New Zealand, Guidelines for Use of Generative Artificial Intelligence in Courts and Tribunals: Lawyers (Guidelines, 7 December 2023) 4.
-
Supreme Court of Illinois, Illinois Supreme Court Policy on Artificial Intelligence (Policy, 1 January 2025) 2.
-
Supreme Court of Singapore, Guide on the Use of Generative Artificial Intelligence Tools by Court Users (Registrar’s Circular No 1 of 2024, 1 October 2024) [4.3] <https://www.judiciary.gov.sg/docs/default-source/news-and-resources-docs/guide-on-the-use-of-generative-ai-tools-by-court-users.pdf>.
-
Ibid [3.3].
-
Ibid [3.2(a)].
-
Law Society of Scotland, Guide to Generative AI (Guide, October 2024) 9 <https://www.lawscot.org.uk/members/business-support/lawscottech/resources/guide-to-generative-ai/>.
-
Supreme Court of Singapore, Guide on the Use of Generative Artificial Intelligence Tools by Court Users (Registrar’s Circular No 1 of 2024, 1 October 2024) [3.2(b)] <https://www.judiciary.gov.sg/docs/default-source/news-and-resources-docs/guide-on-the-use-of-generative-ai-tools-by-court-users.pdf>.
-
Ibid [4].
-
Caribbean Court of Justice, Practice Direction No. 1 of 2025: The Use of Generative Artificial Intelligence Tools in Court Proceedings (Practice Direction, 14 February 2025) 3 <https://ccj.org/wp-content/uploads/2025/02/PRACTICE-DIRECTION-NO.-1-OF-2025-THE-USE-OF-GENERATIVE-ARTIFICIAL-INTELLIGENCE-TOOLS.pdf>.
-
Law Society of Scotland, Guide to Generative AI (Guide, October 2024) 10 <https://www.lawscot.org.uk/members/business-support/lawscottech/resources/guide-to-generative-ai/>.
-
Supreme Court of Singapore, Guide on the Use of Generative Artificial Intelligence Tools by Court Users (Registrar’s Circular No 1 of 2024, 1 October 2024) [3(1)] <https://www.judiciary.gov.sg/docs/default-source/news-and-resources-docs/guide-on-the-use-of-generative-ai-tools-by-court-users.pdf>.
-
Supreme Court of Illinois, Illinois Supreme Court Policy on Artificial Intelligence (Policy, 1 January 2025) 2.
-
Bar Council Malaysia, The Risks and Precautions in Using Generative Artificial Intelligence in the Legal Profession, Specifically ChatGPT (Circular No 342/2023, 24 November 2023) 2–3 <https://www.malaysianbar.org.my/document/members/circulars/2020—2024/2023&rid=46578>.
-
Supreme Court of Newfoundland and Labrador, Notice to the Profession and the General Public Ensuring the Integrity of Court Submissions When Using Large Language Models (Notice, 12 October 2023).
-
See, for example, Director of Public Prosecutions v GR [2025] VSC 490, [70].
-
May v Costaras [2025] NSWCA 178, [16].
-
Consultation 33 (Law Firms Australia).
-
Claire Roberts, ‘Generative AI in Australian Courts: Early Cases, Emerging Risks’ (2025) 27(5–6) Internet Law Bulletin 79, 79.
-
Director of Public Prosecutions v GR [2025] VSC 490, [61]–[80].
-
Ibid [79]–[80]. See also Consultation 33 (Law Firms Australia).
-
Consultation 15 (Magistrates’ Court of Victoria).
-
Consultation 13 (Federal Circuit and Family Court of Australia).
-
Supreme Court of Victoria, Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation (Guidelines, 6 May 2024) <http://www.supremecourt.vic.gov.au/forms-fees-and-services/forms-templates-and-guidelines/guideline-responsible-use-of-ai-in-litigation>.
-
Queensland Courts, The Use of Generative Artificial Intelligence (AI) Guidelines for Responsible Use by Non-Lawyers (Guidelines, 13 May 2025) <https://www.courts.qld.gov.au/about/news/news233/2024/the-use-of-generative-artificial-intelligence-ai>.
-
For example, the Federal Court of Canada requires a written declaration if content in the material provided to the court was directly provided by AI: Federal Court of Canada, Notice to Parties and the Profession – The Use of Artificial Intelligence in Court Proceedings (Notice, 7 May 2024) 1.
-
Supreme Court of New South Wales, Supreme Court Practice Note SC Gen 23 Use of Generative Artificial Intelligence (Gen AI) (Practice Note, 28 January 2025) para 16 <https://supremecourt.nsw.gov.au/documents/Practice-and-Procedure/Practice-Notes/general/current/PN_SC_Gen_23.pdf>.
-
Consultation 32 (Supreme Court of Victoria).
-
Consultation 12 (County Court of Victoria).
-
Consultation 2 (Coroners Court of Victoria).
-
Consultation 15 (Magistrates’ Court of Victoria).
-
Consultation 9 (Victorian Civil and Administrative Tribunal).
-
Submission 23 (Victorian Bar Association).
-
Consultation 26 (inTouch Multicultural Centre Against Family Violence).
-
Submission 5 (Office of the Victorian Information Commissioner).
-
F Abedi and NJ Balmer, AI Use in the Legal Profession: Findings from the 2025 Victorian Lawyer Census (Report, Victorian Legal Services Board and Commissioner, forthcoming 2025) 9. Only 18.4% of respondents agreed or strongly agreed that lawyers should not use AI.
-
Submission 22 (Centre for the Future of the Legal Profession and UNSW Law and Justice).
-
Consultation 8 (Federation of Community Legal Centres Workshop). See also Consultations 27 (UNSW’s Centre for the Future of the Legal Profession and Professor Lyria Bennett Moses), 28 (Monash University Digital Law Group).
-
F Abedi and NJ Balmer, AI Use in the Legal Profession: Findings from the 2025 Victorian Lawyer Census (Report, Victorian Legal Services Board and Commissioner, forthcoming 2025) 9.
-
Consultation 7 (Judicial College of Victoria).
-
Submission 22 (Centre for the Future of the Legal Profession and UNSW Law and Justice).
-
Consultation 32 (Supreme Court of Victoria).
-
Consultation 33 (Law Firms Australia).
-
The Law Society of NSW, Legal Practice Board of Western Australia, and Victorian Legal Services Board and Commissioner, Statement on the Use of Artificial Intelligence in Australian Legal Practice (Statement, 26 March 2025) 1 <https://lsbc.vic.gov.au/news-updates/news/statement-use-artificial-intelligence-australian-legal-practice>.
-
Civil Procedure Act 2010 (Vic) ss 10, 16.
-
Judicial College of Victoria, Civil Procedure Bench Book (Online Manual, 2024) ‘1.2.2.1 The paramount duty’ <https://resources.judicialcollege.vic.edu.au/article/1041737> n 42 citing Re Manlio (No 2) [2016] VSC 130, [25], [27].
-
May v Costaras [2025] NSWCA 178, [17]. See also comments by Justice Elliott in Director of Public Prosecutions v GR [2025] VSC 490, [80].
-
Consultation 32 (Supreme Court of Victoria).
-
Supreme Court of Singapore, Guide on the Use of Generative Artificial Intelligence Tools by Court Users (Registrar’s Circular No 1 of 2024, 1 October 2024) <https://www.judiciary.gov.sg/docs/default-source/news-and-resources-docs/guide-on-the-use-of-generative-ai-tools-by-court-users.pdf>.
-
Ibid [5.3].
-
F Abedi and NJ Balmer, AI Use in the Legal Profession: Findings from the 2025 Victorian Lawyer Census (Report, Victorian Legal Services Board and Commissioner, forthcoming 2025) 5.
-
Ibid.
-
Consultation 12 (County Court of Victoria).
-
Consultation 15 (Magistrates’ Court of Victoria). A representative stated, ‘the judicial officers of the Magistrates’ Court serve as the finder of fact and law and will assess the submissions on its proper merits regardless of whether AI has been used’.
-
Justice Peter Quinlan, ‘The Impact of Social Media and AI on Public Trust in the Judiciary’ (Speech, Global Summit of Hellenic Lawyers, Athens, Hellas, 9 July 2025) 16 <https://www.supremecourt.wa.gov.au/_files/Speeches/2025/The%20Impact%20of%20Social%20Media%20and%20AI%20on%20Public%20Trust%20in%20the%20Judiciary.pdf>.
-
For example, Supreme Court (General Civil Procedure) Rules 2025 (Vic) r 34.01–02.
-
For example, in Nikolic & Anor v Nationwide News Pty Ltd & Anor [2025] VSCA 112 the Supreme Court was unable to find copies of the two decisions relied upon by the plaintiffs, and the Registry asked the plaintiffs for copies of them. This led the court to determine that the cases ‘do not exist in the real world. They are most probably “hallucinations”’: at [39].
-
See, for example, Dayal [2024] FedCFamC2F 1166, [21]–[22]; Valu v Minister for Immigration and Multicultural Affairs (No 2) [2025] FedCFamC2G 95, [37]–[38]; Handa & Mallick [2024] FedCFamC2F 957, [10].
-
Murray on behalf of the Wamba Wemba Native Title Claim Group v State of Victoria [2025] FCA 731, [16].
-
For example, in Wang v Moutidis [2025] VCC 1156, Judge Kirton determined the matter based on the defendant’s oral submissions because the written submissions had been prepared with AI and contained errors and irrelevancies: at [14]–[15]. See also examples in Luck v Secretary, Services Australia [2025] FCAFC 26, [14]; Nikolic & Anor v Nationwide News Pty Ltd & Anor [2025] VSCA 112, [41]; Ivins v KMA Consulting Engineers Pty Ltd & Ors [2025] QIRC 141, [75]–[79]; May v Costaras [2025] NSWCA 178, [49]; Kaur v RMIT [2024] VSCA 264, [26] n 19; Goodchild v State of Queensland (Queensland Health) [2025] QIRC 46, [39].
-
LJY v Occupational Therapy Board of Australia [2025] QCAT 96, [22]–[26]; Bottrill v Graham & Anor (No 2) [2025] NSWDC 221, [69]–[77]; Director of Public Prosecutions v GR [2025] VSC 490, [78]; Page v Long [2025] VCC 868, [19]–[21].
-
Oaths and Affirmations Act 2018 (Vic) s 27; Supreme Court (General Civil Procedure) Rules 2025 (Vic) ord 43.01.
-
Supreme Court of Victoria, Practice Note SC CC1 Commercial Court (Second Revision) (Practice Note, 26 February 2024) para 6.4 <http://www.supremecourt.vic.gov.au/areas/legal-resources/practice-notes/sc-cc-1-commercial-court-second-revision>.
-
Evidence Act 2008 (Vic) s 21, sch 1.
-
‘Character References’, Magistrates’ Court of Victoria (Web Page, 14 August 2024) <https://www.mcv.vic.gov.au/criminal-matters/sentencing-and-penalties/character-references>.
-
‘Victim Impact Statements’, Victims of Crime (Web Page, 21 November 2023) <https://www.victimsofcrime.vic.gov.au/victim-impact-statements>.
-
Sentencing Act 1991 (Vic) s 8L.
-
Ibid s 8K.
-
Michael Legg, ‘Generative AI and Affidavits’ (2025) 99(5) Australian Law Journal 360, 362.
-
Consultation 24 (Victorian Advocacy League for Individuals with Disability).
-
Ibid.
-
Ibid.
-
Director of Public Prosecutions (ACT) v Khan [2024] ACTSC 19.
-
Ibid [39]–[44].
-
James D Metzger and Tyrone Kirchengast, ‘Why a US Court Allowed a Dead Man to Deliver His Own Victim Impact Statement – via an AI Avatar’, The Conversation (online, 18 June 2025) <https://theconversation.com/why-a-us-court-allowed-a-dead-man-to-deliver-his-own-victim-impact-statement-via-an-ai-avatar-259045>.
-
Dewald v Massachusetts Mutual Insurance Company 237 AD3d 562 (2025). A video of the court responding to the AI avatar is available at New York Supreme Court, Appellate Division, First Department, ‘March 26, 2025, Appellate Division, First Department Live Stream’ (YouTube, 26 March 2025) <https://www.youtube.com/watch?v=Ctv4ZQRZgbA> 00:19:23–00:26:40; Thomas Claburn, ‘Judge Slams AI Entrepreneur for Having Avatar Testify’, The Register (online, 9 April 2025) <https://www.theregister.com/2025/04/09/court_scolds_ai_entrepreneur_avatar_testify/>.
-
James D Metzger and Tyrone Kirchengast, ‘Why a US Court Allowed a Dead Man to Deliver His Own Victim Impact Statement – via an AI Avatar’, The Conversation (online, 18 June 2025) <https://theconversation.com/why-a-us-court-allowed-a-dead-man-to-deliver-his-own-victim-impact-statement-via-an-ai-avatar-259045>. See, for example, Sentencing Act 1991 (Vic) ss 3 (definition of victim), 8K(2).
-
Sentencing Act 1991 (Vic) s 8K.
-
Sentencing Act 1991 (Vic) s 8Q.
-
Evidence Act 2008 (Vic) s 12.
-
Sentencing Act 1991 (Vic) s 8L(3).
-
Evidence Act 2008 (Vic) ss 30–31.
-
Ibid s 31(3)(b).
-
Judicial College of Victoria, Uniform Evidence Manual (Online Manual, 6 May 2025) ‘s 31 – Deaf and mute witnesses’ [6] <https://resources.judicialcollege.vic.edu.au/article/1053064> (15 December 2023).
-
Consultation 9 (Victorian Civil and Administrative Tribunal). For example, representatives of VALID stated ‘a person who is not able to verbalise could give an AI tool like ChatGPT input to create something’: Consultation 24 (Victorian Advocacy League for Individuals with Disability).
-
Supreme Court of Victoria, Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation (Guidelines, 6 May 2024) para 10 <http://www.supremecourt.vic.gov.au/forms-fees-and-services/forms-templates-and-guidelines/guideline-responsible-use-of-ai-in-litigation>.
-
At the time of finalising this report, Queensland Courts released updated guidelines for non-lawyers which align with the approach taken by the Supreme Court of Victoria to affidavits and witness statements: Queensland Courts, The Use of Generative Artificial Intelligence (AI) Guidelines for Responsible Use by Non-Lawyers (Guidelines, 15 September 2025) 2 <https://www.courts.qld.gov.au/__data/assets/pdf_file/0012/798375/Artificial-Intelligence_Guidelines-for-Non-Lawyers.pdf>.
-
Courts and Tribunals Judiciary (UK), Artificial Intelligence (AI) Guidance for Judicial Office Holders (Guidance, 14 April 2025) <https://www.judiciary.uk/wp-content/uploads/2025/04/Refreshed-AI-Guidance-published-version.pdf>; Courts of New Zealand, Guidelines for Use of Generative Artificial Intelligence in Courts and Tribunals: Lawyers (Guidelines, 7 December 2023); Courts of New Zealand, Guidelines for Use of Generative Artificial Intelligence in Courts and Tribunals: Non-Lawyers (Guidelines, 7 December 2023); Supreme Court of Singapore, Guide on the Use of Generative Artificial Intelligence Tools by Court Users (Registrar’s Circular No 1 of 2024, 1 October 2024) <https://www.judiciary.gov.sg/docs/default-source/news-and-resources-docs/guide-on-the-use-of-generative-ai-tools-by-court-users.pdf>; Supreme Court of Victoria, Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation (Guidelines, 6 May 2024) <http://www.supremecourt.vic.gov.au/forms-fees-and-services/forms-templates-and-guidelines/guideline-responsible-use-of-ai-in-litigation>; Oaths and Affirmations Act 2018 (Vic) s 27.
-
Caribbean Court of Justice, Practice Direction No. 1 of 2025: The Use of Generative Artificial Intelligence Tools in Court Proceedings (Practice Direction, 14 February 2025) 1 <https://ccj.org/wp-content/uploads/2025/02/PRACTICE-DIRECTION-NO.-1-OF-2025-THE-USE-OF-GENERATIVE-ARTIFICIAL-INTELLIGENCE-TOOLS.pdf>.
-
Supreme Court of New South Wales, Supreme Court Practice Note SC Gen 23 Use of Generative Artificial Intelligence (Gen AI) (Practice Note, 28 January 2025) para 13 <https://supremecourt.nsw.gov.au/documents/Practice-and-Procedure/Practice-Notes/general/current/PN_SC_Gen_23.pdf>.
-
Uniform Civil Procedure Rules 2005 (NSW) rr 31.4(3A)–(3C) relating to witness statements and r 35.3B relating to affidavits.
-
Department of Communities and Justice (NSW), Uniform Civil Procedure Rules: Form 40 – Affidavit (Version 8) [2] <https://ucprforms.nsw.gov.au/documents/pdf/ucpr_form_40_v8.pdf>.
-
Department of Communities and Justice (NSW), Uniform Civil Procedure Rules: Form 163 – Witness Statement (Version 3) [2] <https://ucprforms.nsw.gov.au/documents/pdf/ucpr_form_163_v3.pdf>.
-
Consultation 32 (Supreme Court of Victoria).
-
Consultation 9 (Victorian Civil and Administrative Tribunal).
-
Consultation 12 (County Court of Victoria).
-
Submission 23 (Victorian Bar Association).
-
Submission 8 (Damian Curran). Consultations 11 (Law Institute of Victoria), 33 (Law Firms Australia).
-
Consultation 30 (Eastern Community Legal Centre).
-
Consultation 32 (Supreme Court of Victoria).
-
Consultation 13 (Federal Circuit and Family Court of Australia).
-
Michael Legg, ‘Generative AI and Affidavits’ (2025) 99(5) Australian Law Journal 360, 361 n 14 citing Queensland v Masson (2020) 94 ALJR 785, [112] (Nettle and Gordon JJ); [2020] HCA 28; Concrete Pty Ltd v Parramatta Design and Developments Pty Ltd (2006) 229 CLR 577, [175] (Callinan J).
-
Ibid 362.
-
Justice Peter Quinlan, ‘The Impact of Social Media and AI on Public Trust in the Judiciary’ (Speech, Global Summit of Hellenic Lawyers, Athens, Hellas, 9 July 2025) 14 <https://www.supremecourt.wa.gov.au/_files/Speeches/2025/The%20Impact%20of%20Social%20Media%20and%20AI%20on%20Public%20Trust%20in%20the%20Judiciary.pdf>.
-
Ibid 15 (emphasis in original).
-
Crimes Act 1958 (Vic) s 314.
-
Consultation 12 (County Court of Victoria).
-
Consultation 19 (Professor Ian Freckelton AO KC).
-
Consultation 32 (Supreme Court of Victoria).
-
Confidential consultation.
-
Consultation 19 (Professor Ian Freckelton AO KC).
-
Sentencing Act 1991 (Vic) s 8K.
-
Consultation 13 (Federal Circuit and Family Court of Australia).
-
Supreme Court of Victoria, Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation (Guidelines, 6 May 2024) para 10 <http://www.supremecourt.vic.gov.au/forms-fees-and-services/forms-templates-and-guidelines/guideline-responsible-use-of-ai-in-litigation>.
-
Federal Court of Canada, Notice to Parties and the Profession – The Use of Artificial Intelligence in Court Proceedings (Notice, 7 May 2024) para 1.
-
Supreme Court of Victoria, SC CR 3 – Expert Evidence in Criminal Trials (Practice Note, 1 June 2025) para 7.4 <https://www.supremecourt.vic.gov.au/areas/legal-resources/practice-notes/sc-cr-3-expert-evidence-in-criminal-trials>.
-
Supreme Court of New South Wales, Supreme Court Practice Note SC Gen 23 Use of Generative Artificial Intelligence (Gen AI) (Practice Note, 28 January 2025) para 22 <https://supremecourt.nsw.gov.au/documents/Practice-and-Procedure/Practice-Notes/general/current/PN_SC_Gen_23.pdf>.
-
Consultation 33 (Law Firms Australia).
-
Supreme Court of Victoria, SC CR 3 – Expert Evidence in Criminal Trials (Practice Note, 1 June 2025) para 6.3 <https://www.supremecourt.vic.gov.au/areas/legal-resources/practice-notes/sc-cr-3-expert-evidence-in-criminal-trials>.
-
Ibid para 7.4.
-
Consultation 32 (Supreme Court of Victoria).
-
Consultation 12 (County Court of Victoria).
-
Consultation 2 (Coroners Court of Victoria).
-
Consultation 15 (Magistrates’ Court of Victoria).
-
Consultation 9 (Victorian Civil and Administrative Tribunal).
-
Submission 23 (Victorian Bar Association).
-
Consultation 32 (Supreme Court of Victoria).
-
Consultation 9 (Victorian Civil and Administrative Tribunal).
-
Consultation 19 (Professor Ian Freckelton AO KC).
-
Consultation 12 (County Court of Victoria).
-
Ibid.
-
Supreme Court of Victoria, SC CR 3 – Expert Evidence in Criminal Trials (Practice Note, 1 June 2025) para 7.4 <https://www.supremecourt.vic.gov.au/areas/legal-resources/practice-notes/sc-cr-3-expert-evidence-in-criminal-trials>.
-
Ibid para 7.4(b).
-
Ibid para 6.3(a).
-
Ibid.
-
Ibid para 1.3.
-
Consultation 2 (Coroners Court of Victoria).
-
Consultation 9 (Victorian Civil and Administrative Tribunal).
-
Consultation 32 (Supreme Court of Victoria); Supreme Court (General Civil Procedure) Rules 2025 (Vic) Form 44A.
-
Civil Procedure Act 2010 (Vic) s 10(3).
-
Supreme Court of Victoria, SC CR 3 – Expert Evidence in Criminal Trials (Practice Note, 1 June 2025) para 7.4 <https://www.supremecourt.vic.gov.au/areas/legal-resources/practice-notes/sc-cr-3-expert-evidence-in-criminal-trials>.
