1. Introduction: Artificial Intelligence in Victoria’s Courts and Tribunals

Our terms of reference

1.1 The Victorian Law Reform Commission (the Commission) was asked to make recommendations on legislative reform opportunities and principles to guide the safe use of artificial intelligence (AI) in Victoria’s courts and tribunals.

1.2 We were asked to provide principles or guidelines that can be used to assess the suitability of future uses of AI in Victoria’s courts and tribunals.

1.3 We were guided by the terms of reference (see page viii) given to us by the former Attorney-General, the Hon. Jaclyn Symes MP, on 8 May 2024.

The Commission’s aims

1.4 The Commission’s aim is to maintain and further develop a fair, just, inclusive and accessible legal system for all Victorians.

1.5 Our main task is to examine existing laws, prepare reports and make recommendations to serve the needs of the Victorian community.

Our approach

Our leadership

1.6 The Hon. Jennifer Coate AO was the Commission’s Acting Chair from the beginning of this inquiry in May 2024 until the Hon. Anthony North KC resumed the role of Chair in April 2025.

1.7 The Acting Chair established a Division to guide and make decisions about the inquiry. All Commissioners were Division members. Their names are listed on the inside front cover.

What we published

1.8 On 16 October 2024, we published a consultation paper to seek views on the use of AI in Victoria’s courts and tribunals and on how to ensure its safe and effective use. We invited submissions by 12 December 2024.

Submissions and consultations

1.9 We received 29 submissions (see Appendix A) and published the public submissions on our website. We held 49 consultations with 52 individuals and organisations (see Appendix B).

1.10 We engaged with a wide range of stakeholders across the courts and the legal profession. These included the Victorian courts, the Victorian Civil and Administrative Tribunal (VCAT), Court Services Victoria (CSV),[1] the Federal Circuit and Family Court, the Coronial Council, prosecutorial bodies, lawyers, academics and peak organisations that provide education and support to lawyers and judicial officers. We also engaged with government organisations, such as the Office of the Victorian Information Commissioner and the Public Record Office Victoria.

1.11 We consulted with human rights organisations and specialist access to justice services, such as the Victorian Equal Opportunity and Human Rights Commission. We held a workshop with 15 community legal centres, facilitated by the Federation of Community Legal Centres. We also consulted technology-focused organisations such as Cenitex and Microsoft.

1.12 The Victorian Legal Services Board and Commissioner made available findings from its 2025 Lawyer Census, which included a focus on AI usage within legal practice in Victoria.[2] The survey sample comprised 1,887 Victorian practising certificate holders.

Scope of this report

1.13 Our terms of reference refer to the use of AI in Victoria’s courts and tribunals. While the definition of ‘tribunal’ is potentially broad, we confined our scope to Victoria’s courts and VCAT. This is consistent with the composition of the Courts Council, which is the governing body of CSV. The Courts Council consists of the heads of the Supreme Court, County Court, Magistrates’ Court, Children’s Court, Coroners Court and VCAT.[3] We note that the principles and guidelines recommended in this report may also be relevant to other tribunal bodies in Victoria.

Our report to the Attorney-General

1.14 Our report is due to the Attorney-General by 31 October 2025. Within 14 sitting days of receiving our report, the Attorney-General must table it before the Victorian Parliament. It will then be published on our website and in print.

A note on language

1.15 Unless otherwise noted, references in this report to the Supreme Court and the Court of Appeal, the County Court, the Magistrates’ Court, the Children’s Court and the Coroners Court refer to the Victorian courts, and references to the High Court refer to the High Court of Australia. References to the Federal Court refer to the Federal Court of Australia. References to the Federal Circuit and Family Court refer to the Federal Circuit and Family Court of Australia.

1.16 The term ‘judicial officer’ is used to refer to a judge, an associate judge or a judicial registrar of the Supreme or County Court; a magistrate or judicial registrar of the Magistrates’ or Children’s Court; or a coroner or a judicial registrar of the Coroners Court. Discussion relating to ‘judicial officers’ is also relevant to VCAT members. In the recommendations we refer to VCAT members separately.

Common terms used in this report

1.17 This report uses technical terms, which are defined in the Glossary on page ix. Below are some common AI terms and concepts that we refer to throughout this report.

Artificial intelligence

1.18 While there is no universally agreed definition of AI, this report adopts the Organisation for Economic Co-operation and Development’s (OECD) definition:

a machine-based system that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments. Different AI systems vary in their levels of autonomy and adaptiveness after deployment.[4]

1.19 The OECD definition was updated in 2023 and may be revised from time to time. We adopt it for the following reasons:

It allows policy makers to consider both individual and societal harms caused by a wide range of computational systems. This includes AI capabilities embedded in technology and software ranging from YouTube to ChatGPT.

It includes knowledge-based and machine learning approaches.[5] This means the definition includes older AI systems that require human intervention to learn, based on knowledge provided by experts.[6]

It is adopted by the Australian and Victorian governments.[7]

1.20 As we discuss in Chapter 3, feedback from consultations and submissions noted some limitations of the OECD definition. Representatives of the Victorian Bar Association described the OECD definition as ‘too complex and manifold for general use’.[8] For this reason, we nominate additional terms that outline how different types of AI require different regulation by Victoria’s courts and VCAT.

1.21 In this report, ‘AI’ is often used in a general way, but it is important to note that AI is an umbrella term covering many types of technologies with different capabilities, uses and risks. Examples of different AI tools are discussed in Chapter 2.

AI lifecycle

1.22 The Australian Government describes the AI lifecycle as a structured process that occurs in three stages:

a) Discover: design, data, train and evaluate

b) Operate: integrate, deploy and monitor

c) Retire: decommission.[9]

1.23 The term ‘AI lifecycle’ is important because regulatory responses to AI often highlight that governance and accountability mechanisms are needed across the entire AI lifecycle.[10] Different risks and opportunities can arise at different points of the lifecycle.[11]

1.24 The development of AI is distinct from other software development because of the importance of data and of models that rely on data for training and evaluation.[12]

Machine learning

1.25 Machine learning is defined as a ‘set of techniques for creating algorithms so that computational systems can learn from data’.[13] It underpins all Generative AI (GenAI) systems. As stated in our consultation paper, these types of AI can improve their performance by ‘learning’ over time, based on experience and feedback on previous responses.[14] Machine learning systems can be contrasted with expert AI systems, which use pre-defined rules (‘if-then’ rules) and a knowledge base to infer conclusions.[15] Google Translate is an example of a software product that uses machine learning but is not a GenAI tool. Another example is the algorithms embedded in YouTube and other social media platforms that recommend content based on previous viewing preferences.[16]

Generative AI

1.26 Generative AI or GenAI is a subset of machine learning referring to ‘software systems that create content as text, images, music, audio and videos based on a user’s prompts’.[17] GenAI systems ‘infer’ a response or output to a user prompt, using statistical predictions based on large amounts of data. Popular AI tools such as ChatGPT, Claude and Gemini are commonly known as GenAI tools. GenAI is also a characteristic of software that allows people to edit, compose and rewrite text, such as Grammarly.[18]

1.27 Agentic AI systems are often characterised as an advancement of GenAI systems.[19] They pursue high-level goals by perceiving their environment and acting or executing tasks autonomously or semi-autonomously, often over extended periods of time.[20] AI companies are incorporating AI agents or agentic capabilities into their existing products. Common applications include AI agents that code software or provide customer service.

1.28 Many existing court guidelines focus on the risks of GenAI. In our report we consider how risk varies across different types of AI, including by distinguishing between public and closed AI tools (see Chapter 3).

General Purpose AI

1.29 A General Purpose AI or GPAI system can be defined as:

an advanced AI system capable of effectively performing a range of distinct tasks. Its degree of autonomy and ability is determined by several key characteristics, including the capacity to adapt or perform well on new tasks that arise at a future time, the demonstration of competence in domains for which it was not intentionally and specifically trained, the ability to learn from limited data, and the proactive acknowledgment of its own limitations in order to enhance its performance.[21]

1.30 A key characteristic of General Purpose AI systems is the ability to perform a wide range of tasks across many domains of knowledge. This includes tasks for which a system has not been specifically trained. For instance, ChatGPT has been described as a General Purpose AI. Some regulatory responses classify General Purpose AI as high risk and place specific obligations on developers to ensure the technology is safe.[22]

Expert systems

1.31 Expert systems are computer programs that use pre-defined rules (‘if-then’ rules) and a knowledge base to infer conclusions. Expert systems were an early example of AI and do not learn or adapt without human intervention.[23] Expert systems are knowledge-based and fall within the OECD definition of AI.[24]
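The ‘if-then’ inference used by expert systems can be sketched in a few lines of code. The following is a purely illustrative example of forward-chaining inference; the rules, facts and function names are hypothetical and are not drawn from any system discussed in this report:

```python
# Illustrative sketch of an expert system: pre-defined 'if-then' rules
# are applied to a knowledge base of facts until no new conclusions
# can be inferred. All rule and fact names are hypothetical.

def infer(facts, rules):
    """Forward-chaining inference: repeatedly apply any rule whose
    conditions are all satisfied, adding its conclusion as a new fact."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)  # the rule 'fires'
                changed = True
    return facts

# Hypothetical knowledge base and rules for a simple eligibility check.
rules = [
    ({"filed_on_time", "fee_paid"}, "application_valid"),
    ({"application_valid", "within_jurisdiction"}, "matter_listed"),
]
facts = {"filed_on_time", "fee_paid", "within_jurisdiction"}

print(sorted(infer(facts, rules)))
```

Each rule fires only when all of its conditions are already present in the knowledge base; the loop repeats until no further conclusions can be drawn. Because the rules are fixed in advance by human experts, the system does not learn or adapt on its own, which is the distinction drawn above between expert systems and machine learning.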


  1. CSV was established in 2014 as an independent statutory body corporate to provide administrative services and facilities to support the performance of judicial, quasi-judicial and administrative functions of Victoria’s courts and tribunals, the Judicial College and the Judicial Commission. CSV operates under a governance model directed by the judiciary, reinforcing the principle of judicial independence. Court Services Victoria, Delivering Excellence in Court and Tribunal Administration, Annual Report 2023–24 (Report, October 2024) 6, 90.

  2. F Abedi and NJ Balmer, AI Use in the Legal Profession: Findings from the 2025 Victorian Lawyer Census (Report, Victorian Legal Services Board and Commissioner, forthcoming 2025).

  3. Court Services Victoria Act 2014 (Vic) s 12. The Courts Council also includes a non-judicial member.

  4. Organisation for Economic Co-operation and Development (OECD), Recommendation of the Council on Artificial Intelligence OECD/LEGAL/0449, 3 May 2024, <https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0449>.

  5. Organisation for Economic Co-operation and Development (OECD), Explanatory Memorandum on the Updated OECD Definition of an AI System (OECD Artificial Intelligence Papers No 8, 5 March 2024) 6, 8 <https://doi.org/10.1787/623da898-en>.

  6. Ibid 7.

  7. Department of Industry, Science and Resources (Cth), Safe and Responsible AI in Australia: Proposals Paper for Introducing Mandatory Guardrails for AI in High-Risk Settings (Proposals Paper, September 2024) 8, 53; Department of Premier and Cabinet (Vic), Administrative Guideline – The Safe and Responsible Use of Generative AI in the Victorian Public Sector (No 2024/07, Issue 1.0, November 2024) 7 <https://www.vic.gov.au/sites/default/files/2024-11/Generative-AI-Guideline-%281%29.pdf>.

  8. Consultation 5 (Victorian Bar Association).

  9. Digital Transformation Agency (Cth), Australian Government’s AI Technical Standard (Version 1, July 2025) 13 <https://www.digital.gov.au/policy/ai/AI-technical-standard>.

  10. See Department of Industry, Science and Resources (Cth), National Artificial Intelligence Centre, and CSIRO, Voluntary AI Safety Standard (Report, August 2024) 13, 29 <https://www.industry.gov.au/sites/default/files/2024-09/voluntary-ai-safety-standard.pdf>; Digital Transformation Agency (Cth), Australian Government’s AI Technical Standard (Version 1, July 2025) 13, 14–16, 28–30 <https://www.digital.gov.au/policy/ai/AI-technical-standard>; National Security Agency’s Artificial Intelligence Security Centre (AISC) (US) et al, AI Data Security: Best Practices for Securing Data Used to Train & Operate AI Systems – Joint Cybersecurity Information (Cybersecurity Information Sheet (CSI) No U/OO/157249-25, PP-25-2301, 23 May 2025) 3–5.

  11. In relation to data security risks across the AI lifecycle, see National Security Agency’s Artificial Intelligence Security Centre (AISC) (US) et al, AI Data Security: Best Practices for Securing Data Used to Train & Operate AI Systems – Joint Cybersecurity Information (Cybersecurity Information Sheet (CSI) No U/OO/157249-25, PP-25-2301, 23 May 2025) 3–5.

  12. Organisation for Economic Co-operation and Development (OECD), Scoping the OECD AI Principles: Deliberations of the Expert Group on Artificial Intelligence at the OECD (AIGO) (OECD Digital Economy Papers No 291, November 2019) 13 <https://doi.org/10.1787/d62f618a-en>.

  13. Fan Yang, Jake Goldenfein and Kathy Nickels, GenAI Concepts: Technical, Operational and Regulatory Terms and Concepts for Generative Artificial Intelligence (GenAI) (Report, ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S), and the Office of the Victorian Information Commissioner (OVIC), 2024) 4 <https://doi.org/10.60836/PSMC-RV23>.

  14. Victorian Law Reform Commission, Artificial Intelligence in Victoria’s Courts and Tribunals: Consultation Paper (Report, Victorian Law Reform Commission, October 2024) 9.

  15. Organisation for Economic Co-operation and Development (OECD), Explanatory Memorandum on the Updated OECD Definition of an AI System (OECD Artificial Intelligence Papers No 8, 5 March 2024) 6, 8 <https://doi.org/10.1787/623da898-en>.

  16. Paul Covington, Jay Adams and Emre Sargin, ‘Deep Neural Networks for YouTube Recommendations’ in Proceedings of the 10th ACM Conference on Recommender Systems (Conference Paper, 15 September 2016) 191, 191–2 <https://research.google/pubs/deep-neural-networks-for-youtube-recommendations/>.

  17. Fan Yang, Jake Goldenfein and Kathy Nickels, ‘GenAI Concepts’, ADM+S Centre and OVIC (Web Page, 2024) <https://www.admscentre.org.au/genai-concepts/>.

  18. ‘Introducing Generative AI Assistance’, Grammarly Support (Web Page, 2025) <https://support.grammarly.com/hc/en-us/articles/14528857014285-Introducing-generative-AI-assistance>.

  19. For example, OWASP Gen AI Security Project, Agentic AI – Threats and Mitigations: OWASP Top 10 for LLM Apps & Gen AI Agentic Security Initiative (White Paper, Version 1.0, February 2025) 3 <https://genai.owasp.org/resource/agentic-ai-threats-and-mitigations/>.

  20. Deven R Desai and Mark Riedl, Responsible AI Agents (Georgia Tech Scheller College of Business Research Paper No 5147666 (preprint), 20 February 2025) 7–9 <https://papers.ssrn.com/abstract=5147666>; Stuart Russell and Peter Norvig, Artificial Intelligence: A Modern Approach (Pearson, 4th ed, 2022) 54–6.

  21. Isaac Triguero et al, ‘General Purpose Artificial Intelligence Systems (GPAIS): Properties, Definition, Taxonomy, Societal Implications and Responsible Governance’ (2024) 103 Information Fusion 102135, 7 <https://www.sciencedirect.com/science/article/abs/pii/S1566253523004517?via%3Dihub>.

  22. For example, Regulation (EU) 2024/1689 (Artificial Intelligence Act) [2024] OJ L 2024/1689, arts 53, 54, 55.

  23. Richard E Susskind, ‘Expert Systems in Law: A Jurisprudential Approach to Artificial Intelligence and Legal Reasoning’ (1986) 49(2) The Modern Law Review 168 <https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1468-2230.1986.tb01683.x>.

  24. Organisation for Economic Co-operation and Development (OECD), Recommendation of the Council on Artificial Intelligence OECD/LEGAL/0449, 3 May 2024, 7 <https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0449>.
