5. Is legislative reform necessary for safe use of AI in courts and tribunals?
Overview
•Our terms of reference ask us to consider opportunities to build on existing legislation, regulation and common law to support the safe use of AI in Victoria’s courts and tribunals.
•We heard from some stakeholders that legislative reform might be needed in privacy law. There are also emerging issues in evidence law and administrative law that might require legislative reform over time to respond to risks of AI.
•The feedback we received did not identify clear gaps in existing legislation. Given that AI technology and its use in courts are still evolving, and the regulatory environment is still developing at the national level, we are not proposing legislative reform at this stage.
•We were also told that court rules and procedures will need to adapt but can currently be managed within existing rule-making powers.
•This chapter provides an overview of the current legislative framework and considers potential areas where future law reform may be required.
Laws and regulations relevant to AI
5.1While there is no AI-specific legislation in Victoria, elements of Victoria’s broader legislative and regulatory framework are relevant in considering how AI can be safely used in Victoria’s courts and VCAT. These include:
•court rules and procedures
•laws and legal principles concerning human rights
•privacy law
•evidence law
•administrative law
•legal professional obligations.
5.2In our consultation paper we asked for feedback about whether the existing legislative framework required reform. The overwhelming response was that it is too early to consider legislative reform, given that AI technology and its use in courts and tribunals are still developing. It was also recognised that the Australian Government is considering broad regulatory responses that could influence approaches by Victoria’s courts and VCAT. As a result, the feedback we received did not identify many examples of gaps or areas where law reform may be required.
Court rules and procedures
5.3In our consultation paper, we highlighted that changes to court rules and procedures may be needed to support the safe use of AI in Victoria’s courts and VCAT. Through consultations and submissions, we explored potential rules changes that may be required and whether such changes could be made under existing rule-making powers.
5.4Current rule-making powers appear adequate for courts to address immediate issues, such as developing a practice note for court users about their use of AI.
5.5Some courts anticipated that legislative changes may be necessary in the future. The extent of legislative and rule changes will depend on the type of AI and how it is used. We were told that, without clear examples of AI use, it is difficult to anticipate the changes needed across potentially relevant legislation.
5.6Many types of legislation support court processes and operations:
•Acts relating to court structure establish the various courts in Victoria and define core functions and powers. Each Act empowers judicial members to make rules, practice directions and practice notes.
•Statutory rules on court procedure provide practical detail about how cases are conducted within each court. These rules provide operational details of the primary Acts, specifying procedures for court users, lawyers and court staff.
5.7Other legislation also shapes court processes. For instance, the Criminal Procedure Act 2009 (Vic) guides the criminal trial process, the Evidence Act 2008 (Vic) sets out the rules of evidence, and the Sentencing Act 1991 (Vic) establishes a framework for sentencing.
5.8Various aspects of this legislative framework may become relevant when contemplating future uses of AI across courts and VCAT (see Table 2).
Table 2: Laws impacting Victoria’s court processes
| Types of legislation | Examples of relevant Acts and rules |
|---|---|
| Acts relating to court structure | Supreme Court Act 1986 (Vic); County Court Act 1958 (Vic); Magistrates’ Court Act 1989 (Vic); Victorian Civil and Administrative Tribunal Act 1998 (Vic); Coroners Act 2008 (Vic); Children, Youth and Families Act 2005 (Vic) |
| Statutory rules on court procedure | Supreme Court (General Civil Procedure) Rules 2025 (Vic) / (Criminal Procedure) Rules 2017 (Vic); County Court Civil Procedure Rules 2018 (Vic) / Criminal Procedure Rules 2019 (Vic) / County Court Miscellaneous Rules 2019 (Vic); Magistrates’ Court General Civil Procedure Rules 2020 (Vic) / Criminal Procedure Rules 2019 (Vic); Victorian Civil and Administrative Tribunal Rules 2018 (Vic); Coroners Court Rules 2019 (Vic) / Coroners Regulations 2019 (Vic); Children’s Court Criminal Procedure Rules 2019 (Vic) |
| Other Acts shaping court practice and procedure | Bail Act 1977 (Vic); Charter of Human Rights and Responsibilities Act 2006 (Vic); Civil Procedure Act 2010 (Vic); Criminal Procedure Act 2009 (Vic); Evidence Act 2008 (Vic); Evidence (Miscellaneous Provisions) Act 1958 (Vic); Family Violence Protection Act 2008 (Vic); Juries Act 2000 (Vic); Open Courts Act 2013 (Vic); Sentencing Act 1991 (Vic); Vexatious Proceedings Act 2014 (Vic) |
| Administrative and supporting legislation | Court Security Act 1980 (Vic); Health Records Act 2001 (Vic); Public Records Act 1973 (Vic); Privacy and Data Protection Act 2014 (Vic); Spent Convictions Act 2021 (Vic); Judicial Proceedings Reports Act 1958 (Vic) |
How are court rules and procedures developed?
5.9Legislation that establishes courts also provides for judicial officers of that court to make statutory rules.[1] Court rules set out procedures and conduct to be followed in court proceedings.[2] This enables courts to adapt procedures as required. The Office of the Chief Parliamentary Counsel may assist in the development of rules but certification by the Chief Parliamentary Counsel is not required.[3]
5.10The Council of Judges makes rules in the Supreme Court and is supported by a Rules Committee.[4] In the Magistrates’ Court, the Civil Practice Committee supports the Chief Magistrate and Deputy Chief Magistrates in developing statutory rules. The Committee consults on rules, processes and procedures and comprises the court’s judiciary, administration and representatives of the legal profession.[5]
5.11Courts also publish practice notes or practice directions through the Chief Justice, Chief Judge or Chief Magistrate. These provide guidance on how the court intends to apply the law and manage cases.[6] Practice notes are not strictly enforceable like legislation. The Supreme Court characterises them as providing ‘information about the Court’s practice and procedure. They also set out the Court’s expectations of parties coming before the Court’.[7]
Are reforms to court rules and procedures needed?
5.12The use of AI may require changes to rules and procedures. Specific changes will depend on the type of AI and how it is used.[8]
5.13There are recent examples of changes to court rules and procedures to enable new technology. The Children’s Court made new rules to support the introduction of an electronic case management system.[9] VCAT described how changes were made to statutory rules to accommodate the introduction of a digital portal.[10]
5.14While rule changes will be required to implement AI systems, it is difficult to specify these without examples. The County Court stated:
Whether there is any need for change to legislation, rules or processes will depend on the AI system and its intended application relevant to the Court.[11]
5.15When implementing AI systems, courts should consider changes to statutory rules, where possible. Rule changes are more flexible than legislative changes and can respond to new use cases as they are identified and as emerging technologies develop.
Are reforms to court legislative powers needed?
5.16The power to make rules is confined to matters set out in legislation. The specific categories differ across court legislation but are relatively broad. Some courts told us that existing rule-making powers are sufficient to manage wide-ranging matters relating to AI. For instance, the Supreme Court stated:
The Court can deal with AI through existing rule making powers. Practice notes and guidelines are a better method to manage AI related issues because they are more flexible and can be implemented quickly.[12]
5.17The County Court suggested legislative amendment could clarify that court rule-making powers include the power to govern the use of emerging technologies, for instance by referring to emerging technologies in section 78 of the County Court Act 1958 (Vic), which empowers the court to make rules of practice.[13]
5.18In preparing a practice note for court users on the use of GenAI, representatives of VCAT said that no change was required to its statutory rules.[14] Other courts said that there is scope for including AI practice management procedures in practice notes.[15]
5.19At this stage, courts appear to have sufficient powers to establish rules, practice directions and practice notes governing use of AI. While rule-making powers do not refer to emerging technology, AI-related rules can likely be made under the existing broad powers.
5.20Other legislative changes may also be required but these are difficult to predict, and none were identified at this early stage.
5.21The COVID-19 pandemic provides an example of emerging technology requiring legislative change: legislation was introduced to enable the use of technology in courts.[16]
5.22Legislation may require amendment if certain court functions become automated, such as tasks currently requiring a registrar or deputy registrar to be present.[17]
5.23We heard from the Magistrates’ Court that amendments to some legislation shaping court procedure and practice might be needed in the future. This includes possible amendments to the:
•Criminal Procedure Act 2009 (Vic)
•Civil Procedure Act 2010 (Vic)
•Sentencing Act 1991 (Vic).[18]
5.24VCAT suggested minor legislative amendments or changes to statutory rules may be needed in future if AI tools are used to assist with tasks currently performed by a human. For example, legislation currently requires a member of the Tribunal, the principal registrar or a nominated ‘person’ to convene compulsory conferences.[19] If in future this function is assisted by AI, legislative or rule change may be necessary.[20]
5.25Stakeholder insights about proposed legislative reforms were no more specific than those listed. In consultations, it was noted that it is difficult to predict legislative or rule changes without knowing how AI could or will be used in courts.
Human rights and AI in Victoria’s courts and VCAT
5.26Victoria’s courts and VCAT are at an early stage of considering AI.[21] However, the future uses of AI, from case management systems to neurotechnology,[22] may raise human rights issues. Considering the human rights implications of AI is fundamental to public trust in courts and supports a fair and equitable justice system.
5.27It is important for Victoria’s courts and VCAT to take a human rights approach to AI issues. A human rights approach ‘places the human experience at the core’ of analysis and decision making and treats people as rights holders.[23] It also requires collecting data to evaluate whether court and tribunal AI use is in line with human rights norms and standards.[24] This can create a focal point for shaping, implementing, monitoring and reviewing court and tribunal AI use.
Victorian legislative framework on human rights
5.28Victoria’s courts and VCAT are obliged to give effect to human rights and apply and interpret human rights under the Charter of Human Rights and Responsibilities Act 2006 (Vic) (the Charter).[25]
5.29Charter rights particularly relevant to Victoria’s courts and VCAT include:
•recognition and equality before the law[26]
•right to a fair hearing[27]
•right to liberty and security of person[28]
•rights in criminal proceedings[29]
•right to privacy[30] (discussed later in this chapter).
5.30Victoria’s courts and VCAT are bound by sections 4 and 38 to act compatibly with the Charter, and to give proper consideration to relevant Charter rights when acting in an administrative capacity.[31] Courts also have a responsibility under section 32 to interpret Victorian legislation in a manner compatible with the human rights set out in the Charter, as far as it is possible to do so.[32] The Charter provides examples of a court acting in an administrative capacity, including committal proceedings, issuing warrants, listing cases, and adopting practices and procedures.[33] If there is ambiguity as to whether a judicial or administrative function is being exercised, Victoria’s courts and VCAT will consider what legal character the function in question holds.[34]
5.31There is some uncertainty about the scope of the obligation under section 6(2)(b) for courts and tribunals to act compatibly with Charter rights when acting in a judicial capacity. It has been observed that there are three possible constructions of this obligation: a broad, intermediate or narrow one.[35] Generally, courts have adopted the intermediate approach, which considers that the courts’ judicial function is to apply or enforce only the Charter rights that relate to the court proceedings.[36] However, there remains a lack of clarity about an authoritative approach to this aspect of the Charter’s application.
5.32The difficulty of delineating the scope of administrative and judicial functions of the courts is discussed in the privacy section of this chapter.
5.33Courts have obligations under other human rights legislation, particularly in administrative capacities, including the Equal Opportunity Act 2010 (Vic) and the Privacy and Data Protection Act 2014 (Vic).
How can Victoria’s courts and VCAT protect human rights when using AI?
5.34To protect human rights and avoid risks like algorithmic bias and automation bias, Victoria’s courts and VCAT should ensure they have appropriate procurement, oversight and review processes in place. This includes building human rights principles into governance frameworks.[37] In Chapter 9, we consider that a human rights impact assessment could form part of an AI assurance framework for Victoria’s courts and VCAT when making decisions about AI.
5.35Where possible, courts should adopt human rights impact assessments as part of governance frameworks. Considering human rights in each governance decision involving AI can help courts to maintain public confidence in their ability to use AI and protect human rights.
5.36Overall, several stakeholders supported human rights impact assessments as an important part of preventing human rights breaches.[38] The Australian Human Rights Commission has expressed the same views.[39] The Office of the Victorian Information Commissioner suggested that human rights impact assessments should be conducted to determine compliance with the Charter.[40] This was supported by the Human Rights Law Centre, which stated that assessments should be undertaken at the ‘pre-design and deployment stages to identify and mitigate risks to Charter-protected rights’.[41]
5.37Human rights impact assessments vary in complexity and sophistication. While there is no human rights impact assessment specifically considering AI use in Victoria’s courts, a range of approaches could be applied, from sets of simple, non-AI-specific questions[42] to more complex AI-specific evaluations.[43] The Australian Human Rights Commission has developed a human rights impact assessment tool to consider the use of AI in the banking sector.[44]
5.38The Victorian Equal Opportunity and Human Rights Commission has developed a guide for Victorian public sector workers to consider and apply human rights when making decisions. The guide is not AI-specific but provides a list of considerations that could be applied to the use of AI tools in Victoria’s courts and VCAT.[45] The guide suggests:
•identifying relevant rights
•identifying any interference or limitations
•considering what impact the decision will have on the rights of people affected by the decision
•justifying the decision, after balancing all competing rights and interests, and taking into account whether any limitation on human rights is reasonable, justifiable and proportionate.[46]
5.39In the context of AI, this could include:
•identifying if any human rights are relevant to the procurement, development, management or review of an AI tool
•identifying any interference or limitation to those rights in those processes
•identifying possible impacts of a decision on a person’s rights, particularly in terms of the right to a fair trial, non-discrimination and privacy
•considering whether any limitation on a human right is reasonable, justifiable and proportionate to the intended result of the AI tool
•considering whether there is any less restrictive means available to achieve the intended result
•conducting a balancing exercise, considering how an AI tool promotes and limits different human rights, and competing private and public interests.
5.40Internationally, the Law Commission of Ontario, Canada has developed a detailed human rights impact assessment specifically for AI, involving 33 questions to assess and mitigate risks.[47] Notably, it indicates that AI systems deployed in the justice system are likely to be operating in an area covered by human rights law, which would make a human rights assessment relevant.[48] The UNESCO Global Toolkit on AI and the Rule of Law for the Judiciary provides other examples of impact assessment tools, noting they ‘can assist in the identification of risks that judicial operators might not otherwise foresee in AI development and deployment’.[49]
5.41Developing a human rights impact assessment framework tailored to Victoria’s Charter obligations would assist court and tribunal use of AI. This was highlighted by the Victorian Equal Opportunity and Human Rights Commission and the Federation of Community Legal Centres and Justice Connect.[50] If the Victorian Equal Opportunity and Human Rights Commission were to develop such a framework, the benefits would extend beyond the Victorian justice system, but it would require appropriate resourcing.
5.42Education is also crucial to ensure that biases are avoided and human rights are considered when courts and tribunals are considering procurement of AI tools. Approaches to education are discussed in Chapter 10.
Privacy risks and AI in Victoria’s courts and VCAT
5.43The use of AI in Victoria’s courts and VCAT raises significant privacy and information security risks. These are not new issues for courts, but they are exacerbated by the speed and scale with which AI technology can collect, use, disclose or disseminate personal or sensitive information.[51]
5.44AI models are complex, and AI technology can learn and adapt, creating new challenges for privacy. AI can infer personal information from existing data, or use information provided for different purposes.[52] As the Castan Centre noted:
While court users consent to a certain level of sharing of personal data in order to participate in the court, the open-ended risks of uploading personal data into AI platforms poses a distinct challenge to existing expectations of privacy and information sharing.[53]
5.45The Law Institute of Victoria strongly advises lawyers not to enter confidential client information, or information subject to legal professional privilege (LPP), into open AI systems. This is to avoid a breach of confidence or the implied waiver of LPP.[54] The Law Institute of Victoria’s guidance is that even closed AI systems should be used cautiously when entering confidential information:
Any assertion that an AI [system] is a closed system should be carefully examined given the importance of protecting confidential information. Closed AI systems may still be vulnerable to data breaches, cyber-attacks and exfiltration of training data.[55]
5.46We heard from a representative of Digital Rights Watch that AI systems may breach privacy in the supply chain. For that reason, it may not be appropriate for courts to use such systems, even if they are accurate.[56] For example, in 2021 Clearview AI was found to have breached Australians’ privacy by collecting facial images and biometric data from social media sites, in order to build its facial recognition technology.[57]
Examples of privacy breaches in courts and tribunals
5.47While not specific to the use of AI, two recent examples of data breaches demonstrate privacy and data security risks in courts.
5.48In 2024, a Court Services Victoria (CSV) managed audiovisual network was hacked, resulting in the possibility that recordings of some court hearings had been accessed.[58] As CSV itself acknowledged in that case, data hacking incidents involve a third party gaining access to highly sensitive information. To date, publication of the recordings on the internet has not been detected.[59] Following the incident, CSV published information about remedial action, and some people who were affected were contacted.[60]
5.49In 2025, the NSW Online Registry website, administered by the NSW Department of Communities and Justice, was involved in a data breach. In this incident, 9,000 sensitive court files from criminal and civil cases, including domestic violence orders and affidavits, were downloaded from the Court’s portal.[61] Community legal centre representatives pointed out that unauthorised access to private information could endanger victims of domestic violence and undermine confidence in the court process.[62] No data from the breach had been published as of 10 April 2025.[63] The breach was reported to Cyber Security NSW and the NSW Police Cybercrime Squad.[64]
5.50These examples highlight how individuals may suffer real world impacts because of infringements of privacy within courts. The response of courts, tribunals and court administrators to such incidents is a critical factor in shaping and maintaining public trust in judicial institutions.
Victoria’s privacy regulation regime
5.51In Victoria, privacy is governed by the Privacy and Data Protection Act 2014 (Vic) (PDP Act), section 13 of the Charter, and the Privacy Act 1988 (Cth). These Acts are technology-neutral, meaning they apply irrespective of the kind of digital technology involved. The advent of AI was not anticipated when the legislation was drafted.
Privacy obligations under the Charter
5.52The right to privacy under the Charter prohibits unlawful or arbitrary interference with a person’s privacy, family, home or correspondence.[65] The Charter is intended to be interpreted consistently with existing information privacy frameworks in Victoria. This includes the PDP Act and the Information Privacy Principles discussed below.[66] The purpose of the right to privacy is to protect people from unjustified interference with their personal and social identity, and protect their right to physical and psychological integrity, including personal security.[67]
5.53Particularly relevant to AI, a recent report to the UN Human Rights Council found that hacking, restrictions on encryption and surveillance of public places can infringe on the right to privacy.[68] The UN Special Rapporteur on the right to privacy noted that risks related to AI include data processing and the decision made as a result of processing.[69] The importance of having a sound ethical and legal basis for data processing was highlighted, particularly if processing leads to decisions affecting a person’s rights.[70]
5.54The right to privacy can be limited where the limitation has a legitimate purpose, and the nature and extent of the limitation is proportionate, justifiable and reasonable.[71] It can also be limited if interference with a person’s privacy is lawful and not arbitrary.[72] In practice, interference is lawful when it aligns with a regulatory framework, and will not be arbitrary where it is not capricious, unpredictable, unjust or unreasonable, and is proportionate to the legitimate aim sought.[73] When organisations such as courts and tribunals seek to limit the right to privacy, such limitations should be articulated in these terms.[74]
Privacy and Data Protection Act
5.55The Privacy and Data Protection Act 2014 (Vic) requires public sector organisations to comply with Information Privacy Principles (IPPs) regarding the collection, use and disclosure of personal information. The principles set out:
•when and how organisations can collect personal information
•when organisations can use and disclose personal information for a secondary purpose
•how personal information should be handled
•general transparency requirements to prepare and publish privacy policies and to be open with individuals about the personal information an organisation holds
•when individuals can seek access to and correct their personal information
•when an organisation can use unique identifiers
•when an organisation must allow for anonymisation when an individual transacts with the organisation
•when personal information can be transferred outside of Victoria
•requirements for when sensitive information can be collected.[75]
5.56The PDP Act also sets out a framework for monitoring and assuring the security of public sector data.[76] Public sector data is information obtained, received or held by an agency that comes under the framework.[77]
5.57The Office of the Victorian Information Commissioner has also published guidance on AI use for public sector organisations. This includes guidance for Victorian Public Sector (VPS) organisations on use of:
•enterprise GenAI tools—tools that are purchased by an organisation and operate within a public sector agency’s secure information environment.[78] The guidance covers the minimum standards for VPS organisations to identify and mitigate risks in the existing environment before and during procurement, and following the roll out of any tools.[79]
•publicly available General Purpose AI tools, such as ChatGPT.[80] The Office of the Victorian Information Commissioner considers that entering personal information into publicly available GenAI tools is likely to be a contravention of the PDP Act’s IPPs and may cause harm to individuals. To align with the IPPs, it recommends that organisations limit entering public sector information into such tools.[81]
5.58At the time of writing, the Office of the Victorian Information Commissioner was updating its general guidance on AI and privacy obligations.[82]
5.59The Office of the Victorian Information Commissioner regulates the Victorian Protective Data Security Standards under the PDP Act. The standards set out mandatory requirements, including reporting obligations, that relevant organisations must follow.[83]
How does privacy legislation apply to Victoria’s courts and VCAT?
5.60As discussed in the human rights section, the Charter obliges Victoria’s courts and VCAT to apply human rights, and therefore the right to privacy, when interpreting laws.
5.61The Charter also obliges judicial officers to consider the right to privacy when acting in an administrative capacity.[84] As discussed in paragraph [5.31], the application of the Charter to judicial functions is less clear. It has been accepted in relation to certain Charter rights relevant to court and tribunal proceedings, such as the right to equality in section 8(3) and the right to a fair hearing in section 24(1).[85] While uncertain, it is likely that the right to privacy sits outside this scope.
5.62Victoria’s courts and tribunals are exempt from the PDP Act, including from compliance with the IPPs, in respect of the collection, holding, management, use, disclosure or transfer of information:
(a) in relation to its or the holder’s judicial or quasi-judicial functions, by—
(i) a court or tribunal; or
(ii) the holder of a judicial or quasi-judicial office or other office pertaining to a court or tribunal in their capacity as the holder of that office; or
(b) in relation to those matters which relate to the judicial or quasi-judicial functions of the court or tribunal, by—
(i) a registry or other office of a court or tribunal; or
(ii) the staff of such a registry or other office in their capacity as members of that staff.[86]
5.63In practice, this exemption is interpreted broadly, and administrative functions executing or supporting the judicial function are treated as exempt. The Office of the Victorian Information Commissioner is limited in its ability to conciliate privacy complaints in Victoria’s courts and VCAT because of the PDP Act exemption.[87] This includes complaints relating to the misuse of AI. It also includes exemption from the Victorian Protective Data Security Standards.[88]
5.64Even so, Victoria’s courts’ and VCAT’s stated aims are to follow state and national privacy regimes, and to consider privacy and data security issues where they are not incompatible with other court obligations.[89]
Stakeholder views on Victoria’s privacy legislation and AI
5.65Some stakeholders saw benefit in increased oversight or accountability mechanisms for AI use in Victoria’s courts and VCAT in the context of privacy rights. The Castan Centre noted a ‘difficult tension between the carve-out under the [PDP] Act and the obligations of courts and tribunals under the Victorian Charter’.[90]
5.66Other stakeholders held the view that Victoria’s courts and VCAT should take measures to uphold the right to privacy in line with Office of the Victorian Information Commissioner guidance, the PDP Act and the Charter when using AI.[91] Victoria’s courts and VCAT could limit the use of public AI tools to safeguard privacy,[92] and ensure AI systems adhere to privacy standards, including principles of data minimisation, anonymisation, and informed consent.[93]
5.67Another view was that privacy reform should occur at the national level, with reform to the Privacy Act 1988 (Cth).[94]
Proposal to narrow exemption of courts and tribunals from the PDP Act
5.68The Office of the Victorian Information Commissioner proposed legislative amendments to narrow the existing judicial and quasi-judicial exemption. This would bring a broader range of court and tribunal functions under its conciliation powers for breaches of the IPPs.[95]
5.69It proposed:
•repealing the legislative carve-out for Victoria’s courts and VCAT under the PDP Act (section 10)
•inserting a new section into the PDP Act (section 15) stating Victoria’s courts and VCAT are exempt from the IPPs only when they are acting in a judicial capacity. Based on existing section 15, it is likely that it would also require ‘belief on reasonable grounds that noncompliance is necessary’.[96]
•amending section 84 of the Act to include courts and Court Services Victoria (CSV) as bodies to which Part Four applies, enlivening the application of the Victorian Protective Data Security Standards to CSV, courts and VCAT.[97]
5.70The Office of the Victorian Information Commissioner’s view is that the phrase ‘relates to’ judicial and quasi-judicial decision-making in section 10 of the PDP Act allows for an overly broad interpretation that extends to administrative activity of Victoria’s courts and VCAT.[98] The Office of the Victorian Information Commissioner considers the PDP Act exemption to be more broadly worded than comparable privacy legislation in NSW, and notes that there is no equivalent exemption in the Privacy Act 1988 (Cth).[99]
5.71The Office of the Victorian Information Commissioner seeks a functional approach to a new PDP Act exemption that would exempt only a court or tribunal engaged in a judicial function. It considers that:
•a judicial function is where personal information is processed for judicial purposes, such as verdicts or decisions and civil and criminal proceedings
•an administrative function is all other functions, such as registry officers dealing with personal information.[100]
5.72The Office of the Victorian Information Commissioner’s proposal to amend section 84 is intended to cover all functions of Victoria’s courts, not just administrative functions. It would require courts, tribunals and CSV to report to the Office of the Victorian Information Commissioner under the Victorian Protective Data Security Standards regulatory framework.[101]
5.73In response to the Office of the Victorian Information Commissioner’s proposal, CSV stated that the current exemption is appropriate and strikes ‘a balance between protecting privacy and information security, judicial independence and enabling the efficient administration of justice in the context of open justice principles’.[102]
5.74CSV also noted that no sharp distinction can be made between judicial and administrative functions. It pointed to the uncertainty that the proposed reform to the exemption would create in the day-to-day practice of courts.[103] Registry staff would need to decide on a case-by-case basis whether the personal information was handled in an administrative or judicial capacity, creating further administrative burden for a significant volume of their work.[104] It would also be difficult for courts and registry staff to understand at what stage in the proceedings the narrowed carve-out would apply, whether the carve-out would depend on the identity of the person handling the information, and whether the carve-out applied sensibly to all IPPs.[105]
5.75CSV also expressed concern that the additional burden would have a flow-on effect to the principles of open justice, and that court media officers might no longer be able to provide timely and accurate information to media outlets for reporting of proceedings.[106]
5.76The Supreme Court and CSV considered the Office of the Victorian Information Commissioner's concern with the scope of section 10 was a general concern about privacy and the PDP Act, rather than an AI-specific one.[107] Any proposed legislative amendment should therefore be considered in a broader legislative review of privacy provisions.[108] CSV also noted that comparison with other jurisdictions 'is complex and requires greater analysis and consideration' but that a number of other jurisdictions similarly use the language 'relates to' judicial functions.[109]
Broad scope of proposed reform
5.77Privacy risks in relation to AI are significant. But the Office of the Victorian Information Commissioner's proposal is broad and not possible to implement only in relation to AI. It is therefore outside our terms of reference, which ask us to look for opportunities to build on existing legislation and regulations supporting the use of AI within Victoria's courts and tribunals. The proposal's impact would also extend beyond Victoria's courts and VCAT to other organisations.[110]
5.78A recent report by the Victorian Parliamentary Integrity and Oversight Committee supported consideration of a similar recommendation by the Office of the Victorian Information Commissioner to extend the Victorian Protective Data Security Standards to additional organisations, including Victoria's courts and tribunals. However, the Committee said that it would 'be prudent to undertake further consultation and information gathering'.[111]
5.79The proposal requires substantive analysis of the points raised by the Office of the Victorian Information Commissioner, CSV and the Supreme Court. This includes:
•the impact on the administration of justice, open justice and judicial independence
•practical implications of distinguishing between judicial and administrative functions
•resourcing implications for Victoria’s courts and VCAT
•broader impacts, for instance the impact it would have on the health records regime, which contains a carve-out for Victoria’s courts and VCAT, similar to that in the PDP Act.[112]
Balancing privacy and other principles of justice
5.80The Office of the Victorian Information Commissioner’s position reflects the social and legal importance of protecting the right to privacy. As AI tools are rapidly evolving, and their use is increasing, it is critical for courts to protect privacy rights. It is also important to safeguard the principles of open justice and judicial independence. The regulation of privacy in courts requires careful balancing of these principles. Finding the right balance can promote public confidence in the judiciary.
5.81Open justice contributes to the maintenance of public trust in judicial institutions and the integrity and impartiality of courts.[113] It involves a set of related principles and activities, including transparency about court processes and operations, accurate and truthful reporting of proceedings and public access to hearings.[114] Open justice is not absolute, and courts’ other duties and obligations might take precedence in some circumstances.[115] The Federal Circuit and Family Court’s privacy policy balances open justice with a family’s right to privacy.[116]
5.82Judicial independence is the judiciary's ability to make decisions without undue influence by the executive and legislature.[117] More broadly, it requires that a judge be, and be seen to be, independent of all sources of power or influence in society.[118] One reason for the judiciary to remain independent from government departments and agencies is that the judiciary plays an important role in 'reviewing the decisions and actions of government officials, agencies, and other public bodies'.[119] The independence of courts allows judges to undertake the important role of judicial review, which enables individuals to challenge decisions of government bodies that affect their rights or interests.[120]
5.83Separation of powers requires that the judiciary and the court system operate independently from Parliament and the executive arm of government.[121] If the boundary of judicial function is defined too narrowly, judicial independence could be compromised. A narrow definition could result in the Office of the Victorian Information Commissioner, as part of the executive arm of government, overseeing some judicial functions.
5.84Amending section 84, as proposed by the Office of the Victorian Information Commissioner, would apply to all courts and tribunals, including judicial functions. While the Office of the Victorian Information Commissioner ‘does not believe that extending coverage of Part 4 to courts and tribunals would challenge the independence of the judiciary in making decisions in its judicial capacity’,[122] this requires further analysis.
Challenges in defining ‘administrative functions’
5.85There are practical difficulties in defining administrative and judicial functions. Some processes such as mediation may not be defined as judicial work. Personal information may be handled by registries, officers and other court staff for a range of administrative purposes necessary for executing the court or tribunal’s judicial or quasi-judicial functions.[123]
5.86In Kline v Official Secretary to the Governor-General, the majority of the High Court held that documents of an administrative matter concern the management and administration of office resources,[124] or the 'apparatus' of administration that supports substantive judicial functions.[125] Documents used in processes and activities concerned with the exercise of substantive powers and judicial powers are not of an administrative nature.[126] While this case applied the law in the context of a Freedom of Information legislative regime, it usefully illustrates the complexities in defining administrative and judicial functions.
5.87Exploring how legislative carve-outs apply in other jurisdictions could inform whether the amendment is appropriate in the Victorian context. While no jurisdiction uses a directly comparable legislative carve-out to Victoria, several jurisdictions use similar phrasing of ‘relates to’ judicial and quasi-judicial information handling practices.[127]
5.88The NSW Administrative Decisions Tribunal considered a complaint about the conduct of registry officers granting access to documents relating to a court file. It found that the Privacy and Personal Information Protection Act 1998 (NSW) did not apply because the conduct 'relates to' the exercise by the Court of its judicial functions:
The efficient performance of judicial functions depends greatly on there being a system for the receipt and organisation of intended evidence in advance of the formal hearing of a matter. This system is commonly provided by a Registry under the direction of a Registrar. Decisions will frequently have to be taken by Registry officers as to the extent to which access is given to this material, ahead of hearing; or after the material has been dealt with at hearing, and has, possibly, become part of the evidence. The function of giving access to documents of that kind, and to the personal information they may contain, is one, I consider, that ‘relates to’ the exercise by the Court of its judicial functions.[128]
Implementation implications
5.89The number of organisations impacted by the proposal is undefined but likely to be relatively high. Care should be taken not to overburden the capacity of the regulator, and the capacity of Victoria’s courts and tribunals to discharge obligations. The Office of the Victorian Information Commissioner recognised that there are many matters to attend to with the current legislative scheme and that ‘if such reform were to occur then we would seek a staged implementation’.[129]
Policy approaches could strengthen public trust about privacy risks
5.90Courts and VCAT could manage privacy risks through policy approaches, in lieu of legislative reform. Such approaches could uphold rights to privacy and public trust in the use of AI.
5.91Some court privacy policies refer to privacy principles and how the court collects, stores, uses and destroys information.[130] The Federal Court also publishes a data breach response plan, which states that the Court is responsible for ensuring that all reasonable steps are taken to handle personal information in accordance with Australian Privacy Principles.[131]
5.92Some court websites provide contact details of privacy officers and information on processes for handling complaints. For example, the Federal Circuit and Family Court's privacy policy lists a point of contact for privacy complaints and sets out timeframes and processes for acknowledging and responding to complaints. The court will advise whether an investigation is undertaken, and the outcome of that investigation, as soon as practicable.[132]
5.93The Office of the Victorian Information Commissioner has developed a 'Privacy Officer Toolkit' that provides guidance on the role of a privacy officer and how they operate within the PDP Act regime.[133] It notes that privacy officers can handle privacy complaints directly, assist in the completion of privacy impact assessments, liaise with the Office of the Victorian Information Commissioner, and coordinate an organisation's response to data breaches, among other functions.
5.94In England and Wales, a Judicial Data Protection Panel considers complaints regarding all courts and tribunals. It has the jurisdiction to deal with complaints concerning processing of personal data by courts and tribunals acting judicially,[134] or individual judicial officers in their judicial capacity.[135] A Judicial Data Processing Complaints Handling Policy sets out the process of investigation and response by the panel.[136]
5.95Victoria's courts and VCAT could ensure that people affected by data breaches are able to submit complaints free of charge, publicly outline processes and timeframes for responding to complaints, and possibly create referrals to alternative dispute resolution mechanisms.[137] CSV has a privacy coordinator as a primary point of contact for privacy complaints, but details about the role and processes are not published.
5.96The Office of the Victorian Information Commissioner also undertakes consultations that empower agencies to implement information handling practices based on the requirements of the PDP Act and best practice.[138] Consultations involve providing advice and feedback on proposed projects, policies, procedures and guidelines, and in the 2023-2024 reporting period, specifically on AI.[139] Bodies that have consulted with the Office of the Victorian Information Commissioner include statutory authorities, local councils and private sector organisations.[140]
5.97The Commission recommends that privacy and AI policies for Victoria’s courts and VCAT should refer to the Victorian Information Privacy Principles (IPPs) and demonstrate how they seek to act consistently with them. We note that the County Court privacy policy already outlines relevant IPPs. Development of the CSV AI Policy (discussed in Chapter 9) should incorporate consideration of the IPPs.
5.98Courts should also consider data security standards and the potential for specified roles for ongoing management of data quality, data structuring, and organisational data infrastructures. They should also consider whether a privacy by design approach to court data would be beneficial to support implementation of AI tools (discussed in Chapter 3).[141]
5.99Victoria’s courts and VCAT could work with the Office of the Victorian Information Commissioner to implement information handling practices, policies and procedures.
5.100Victoria’s courts and VCAT should consider:
•publishing AI-related privacy and information handling policies and practices, including how they seek to act consistently with the Victorian Information Privacy Principles
•publishing details of privacy officers and processes for handling AI-related complaints and queries
•proactively consulting with the Office of the Victorian Information Commissioner on privacy and information handling practices.
Recommendation 2. Victoria's courts and VCAT should publicly state how their privacy and AI policies seek to be consistent with the Victorian Information Privacy Principles.
Evidence law and AI
5.101As discussed in Chapter 2, AI evidence is likely to increasingly feature in court and tribunal hearings in different ways. AI evidence encompasses a wide range of media generated by AI, including text, audio or video evidence. AI can be used to:
•process evidence (for example by law firms for large-scale document discovery)
•inform expert opinion and undertake analysis (for example, when preparing and presenting forensic evidence)
•generate materials such as affidavits and character references.
5.102The use of AI in evidence raises issues of accuracy, reliability and transparency. AI evidence is often the outcome of complex processes that are difficult to trace and understand.[142] Concerns include how to detect whether evidence has been manipulated (for example, using deepfakes)[143] and whether AI evidence is based on biased data or a flawed algorithm.[144]
5.103For the adversarial court system to function as intended, evidence created, processed or analysed by AI must be able to be properly evaluated. The opaque nature of AI may prevent a court or tribunal from understanding the issues. This might be because of the complexity of the AI model, or the chain of models and data sources through which an output has been produced. AI developers may also be unwilling to explain how an AI system functions in order to protect the intellectual property of their technology.
5.104Some of these challenges are being considered by the Artificial Intelligence for Law Enforcement and Community Safety Lab, based at Monash University, which is examining how the explainability of AI can best be pursued from an evidentiary perspective.[145]
Evidence laws and procedures in Victoria
5.105The Evidence Act 2008 (Vic) (Evidence Act)[146] introduced the Uniform Evidence Law in Victoria.[147] The Evidence Act is largely aligned with the Commonwealth, New South Wales, Tasmania, Australian Capital Territory and Northern Territory evidence laws. Western Australia has recently introduced the Evidence Act 2025 (WA) to align with the Uniform Evidence Law.[148] The Evidence Act applies to all proceedings commenced in a Victorian court and sets out rules relating to evidence.[149]
5.106The Supreme, County and Magistrates' Courts have also adopted statutory rules that set out an expert witness code of conduct.[150] This code is not identical to, but substantially uniform with, expert witness requirements in other Uniform Evidence Law jurisdictions.
5.107The Code of Conduct outlines an expert’s duties to the court, such as conferring with other expert witnesses, providing joint reports when requested, and responding to court directions in a timely way. It also sets out the content required in reports, such as any qualifications to an opinion, and a declaration that the expert has made all the inquiries which the expert believes are desirable and appropriate. The court can also direct experts to have a conference.[151]
5.108The Supreme and County Courts’ Practice Note Expert Evidence in Criminal Trials was updated on 1 June 2025.[152] We discuss the updates in paragraph [5.162]. The Practice Note applies to experts who give or prepare expert evidence in criminal trials in Victoria. It provides direction on:
•how consecutive or concurrent evidence should be given[153]
•directions on disclosure if an expert used AI to generate or support their opinion[154]
•additional information that experts should disclose on the validity and reliability of their methods[155]
•actions that parties should take to inform themselves of the scientific validity of the expert’s evidence if they propose to lead expert evidence of a scientific, medical or technical nature.[156]
5.109VCAT Practice Note – PNVCAT2 Expert Evidence outlines what must be included in an expert report and mechanisms for regulating how expert evidence is given.[157] Similarly, the Coroners Court has a Code of Conduct for expert witnesses.[158]
Are reforms to evidence laws and procedures needed?
5.110We heard from some stakeholders that the Evidence Act was generally flexible enough to address current risks of AI use in Victoria's courts.[159] Much of the Act's flexibility lies in the exclusionary provisions in sections 135 and 137, which allow courts to exclude evidence where its probative value is outweighed by the danger of unfair prejudice.[160]
5.111We were advised that it is too early to identify whether legislative reform is needed. However, with increased use of AI in evidence, issues requiring reform may become more clearly defined.[161]
5.112Stakeholders considered the Practice Note Expert Evidence in Criminal Trials and the Expert Witness Code of Conduct could be easily revised to keep up to date with changes in AI technology. This flexible approach would suit AI’s rapid rate of evolution. Over time, reforms to the Evidence Act could also be considered.
5.113Stakeholders acknowledged the significant complexity involved in reforming the Evidence Act, given it is part of the Uniform Evidence Law.[162]
Deepfakes
5.114GenAI can be used to create deepfakes that are difficult to authenticate, or to distinguish from non-manipulated material. Deepfakes are digital photos, videos or sound files that have been edited to create a convincingly realistic but false depiction. A person may be presented as doing or saying something they did not do or say.[163] The risk to courts and tribunals is exacerbated by the ease and speed with which manipulated content can be created, and by the increasing difficulty of detecting it.[164] Creating deepfakes has become increasingly easy:
anyone with an internet connection can create a deepfake, as there are numerous websites and mobile applications that provide deepfake capabilities, in image video and audio format.[165]
5.115The use of this technology to create sexually explicit, manipulated images of people, particularly in school environments, has ‘become an extremely serious social problem’,[166] and led to reforms in criminal law. In 2022, Victoria introduced the Justice Legislation Amendment (Sexual Offences and Other Matters) Act 2022 (Vic). Amongst other things, it clarifies that the definition of ‘produce’ in relation to an intimate image includes digital creation, and therefore deepfakes.[167] NSW has recently announced the introduction of three bills to criminalise the creation and distribution of sexually explicit deepfakes and intimate images.[168] Australia has created new criminal offences relating to making and distributing sexually explicit deepfake images and videos.[169] Similar offences have been introduced in the United Kingdom.[170]
5.116There are many circumstances where deepfakes could be submitted as evidence. For example, we heard there is a risk AI could be used by perpetrators of family violence to manipulate media such as audio recordings.[171] Technology-facilitated abuse is 'becoming increasingly common worldwide, with the proliferation of smart phones, smart appliances within the home, and the availability of hidden cameras'.[172] One stakeholder specialising in family and gender-based violence told us that they had heard about the use of AI voice tools by perpetrators: 'It looks like AI is now enabling assisted technology abuse by imitating the voice of the victim survivor. It is being used to do all kinds of things, even locking fridges.'[173]
Deepfake material as evidence
5.117As noted in our consultation paper, there is a risk that deepfakes submitted as evidence, and claims of deepfake evidence in proceedings, will proliferate. This might lead to suspicion of authentic evidence and may undermine public trust in the judicial process.[174] It might also prolong proceedings, creating increased pressure on court resources. At this stage in AI’s evolution, the extent of the problems that deepfake technology may cause in courts is uncertain.
5.118The question raised is whether the existing provisions of the Evidence Act are adequate to deal with the challenges of deepfake material. For example, courts can use existing sections 58 and 183 of the Evidence Act to ‘draw inferences from documents themselves as to their authenticity or identity.’[175] The view of some stakeholders was that deepfake issues are similar to common issues that courts grapple with, such as forgeries.[176] There are many occasions where technology is ahead of the legal system, and evidence law has already demonstrated its flexibility and adaptability to address technological developments.[177] It may be that there is no legislative solution to deepfake issues and they will be dealt with through the existing means of calling expert evidence and cross examination.[178]
5.119Deepfakes might present a greater challenge in proceedings with self-represented litigants, who can face difficulties with technical, procedural and legal aspects of the court system.[179] It has been recognised that a judge is obliged under section 24(1) of the Charter to provide to self-represented litigants ‘certain advice and assistance to ensure that they effectively participated in the hearing’.[180] But to what extent should a judicial officer assist a self-represented litigant when, for example, an allegation of ‘deepfake’ evidence is raised, particularly where a party lacks resources to prove or disprove its authenticity? This is an issue that courts should continue to monitor and assess, noting that the judge must maintain neutrality and not become an advocate of the self-represented litigant.[181]
Authenticating alleged deepfake evidence
5.120Stakeholders raised concerns that it is becoming increasingly difficult to detect deepfakes.[182] Some stakeholders considered that in the race between deepfake generation technology and deepfake detection technology, detection technology would ultimately fail.[183] Rather, authenticating a document through, for example, digital forensics would be the way forward.[184] Other measures might include the use of watermarks by AI model developers.[185] However, the increased costs and complexity of authentication could have broader impacts on genuine evidence. Robert Chesney and Danielle Citron have warned of ‘the liar’s dividend,’ noting that as ‘the public becomes more aware of the idea that video and audio can be convincingly faked, some will try to escape the accountability for their actions by denouncing authentic video and audio as deep fakes’.[186] It has been observed that:
Deepfakes’ authentication difficulties are twofold. One problem is how to show a video is fake. The other is how to show it isn’t. As deepfakes become increasingly common and realistic, their very existence will undermine the reliability of genuine evidence, creating headaches for the proponents of authentic videos.[187]
5.121Such impacts are already apparent in courts. In a recent NSW Civil and Administrative Tribunal case, an applicant sought the release by the NSW Police of bodycam footage of interactions between himself and a police officer. The NSW Police unsuccessfully argued against the release of the footage on public interest grounds, arguing that, if released, the footage or metadata could be manipulated given the rise 'of artificial intelligence and "deep fake" technology'.[188]
5.122In an aggravated assault case before the Ontario Superior Court of Justice in Canada, a defendant attempted to challenge the authenticity of a surveillance video of the incident, claiming it was faked. The judge found the video was authentic, despite there being some acknowledged issues with the evidence, including gaps in the chain of custody. He noted that:
The fact that gaps in the evidence leave open the possibility of a faked or altered video being produced does not necessarily render that possibility a reasonable one absent some evidence tending to make that inference more than a reasonable or far-fetched one.[189]
5.123If video and other media are increasingly subject to authentication challenges, this would have implications for access to justice and the time and resources involved in challenging authentication.
Are further reforms needed to deal with the challenges of deepfakes?
5.124Some stakeholders told us that courts are currently capable of working with the issues raised by deepfakes, and that reforms to the Evidence Act were not necessary.[190] There are also concerns that amending rules of evidence to provide additional requirements in response to deepfakes could lead to increased litigation costs and potentially impact access to justice.[191]
5.125Given the challenges raised by deepfakes, Victoria’s courts should monitor how issues relating to deepfakes are managed within existing evidence laws. This requires consideration of issues emerging from other jurisdictions.
5.126Most stakeholders that commented on AI in relation to evidence said that authenticity could likely be established through existing rules of evidence.[192] However, care must be taken to ensure AI is distinguished from other technologies such as email and social media. Courts have tended to readily accept evidence from these technologies,[193] and applying the same approach to AI-generated material could exacerbate the risks presented by deepfakes.
5.127As with privacy law, identifying, authenticating and addressing deepfakes in Victoria’s courts and VCAT intersects with national and international regulatory and legislative regimes. This includes the consideration of mandatory guardrails by the Australian Government (discussed in Chapter 4).
Approaches to deepfakes in the United States
5.128In the US, the issue of deepfakes and evidence has been discussed in academic debate and judicial circles.[194] Since 2023, the Federal Advisory Committee on Evidence Rules (the Advisory Committee) has considered adopting a rule to address deepfakes.[195] The Advisory Committee acknowledges deepfakes are similar to forgeries but are distinguished by the sophistication of the AI tool’s output and the difficulty faced in detecting them.[196]
5.129The core issue of debate for the Advisory Committee is whether existing federal rules of evidence are a strong enough safeguard against courts finding fake evidence to be authentic.[197]
5.130The Advisory Committee has taken an incrementalist approach to the question of reform of rules. It states:
You should not upset the apple cart with bold changes absent a showing that the existing rules really cannot address the problem adequately. Moreover, you only have to look at the various iterations of how you can authenticate … to see that we always have had special rules for special kinds of evidence.[198]
5.131In June 2025, the Advisory Committee agreed that deepfakes are an important issue but is ‘not sure that it requires a rule amendment at this time’.[199] However, the Committee states that ‘it should take steps to develop an amendment it could consider in the event that courts are suddenly confronted with significant deepfake problems that the existing tools cannot adequately address’.[200]
5.132The Advisory Committee has developed a working draft of a proposed rule:
(1) A party challenging the authenticity of an item of evidence on the ground that it has been fabricated, in whole or in part, by generative artificial intelligence must present evidence sufficient to support a finding of such fabrication to warrant an inquiry by the court …
(2) If the opponent meets the requirement of (1), the item of evidence will be admissible only if the proponent demonstrates to the court that it is more likely than not authentic.[201]
5.133The proposed approach would first place the burden on the opponent of evidence to show that a reasonable person could find that the evidence is fabricated. The burden would then shift to the proponent to show the evidence is more likely than not authentic.
5.134The Advisory Committee intends to ‘continue to monitor the case law and commentary to determine whether a new rule is necessary to treat the deepfake problem and refine and discuss a potential rule’.[202]
Areas of the Evidence Act that may require clarification
5.135Some stakeholders told us that there were areas of the Evidence Act dealing with technology that may require clarification.
Proof of voluminous or complex documents
5.136One issue considered by stakeholders was the potential application of section 50 of the Evidence Act to summaries created by AI.[203] Section 50 allows a judge to order a summary of the contents of voluminous documents and requires the name and address of the person who prepared the summary to be produced.[204] GenAI outputs can be understood as being created by a model, not a person, which might lead to a lack of clarity if a strict or literal legislative interpretation is applied.
5.137From a practice perspective this is unlikely to create significant hurdles to the continued application of the section. Real persons will generate summaries using AI tools and their details can be used to fulfil the requirements of section 50.[205] It is also unlikely an AI tool will independently generate summaries of voluminous content in the immediate future. However, this may change if the use of agentic AI becomes widespread, and may require future legislative reform.[206]
Evidence produced by processes, machines and other devices
5.138It was suggested that sections 146 and 147 of the Evidence Act might require legislative reform.[207] These sections apply to a document or thing tendered that has been produced by a process or device. They provide that, if the device or process is one that, if properly used, ordinarily produces a particular outcome, it is presumed (unless evidence sufficient to raise doubt about the presumption is adduced) that, in producing the document or thing, the device or process produced that outcome.
5.139Section 147 applies the presumption to documents produced in the course of business, while section 146 applies to documents produced generally. Sections 146 and 147 were drafted with the intention to simplify the admission of faithful reproductions or mechanically produced outputs, such as photocopies, to lighten the burden of proof.[208] The party tendering evidence is not required to call evidence that the process or device is working to prove the inference.
5.140Whether these sections should be amended to impose a more rigorous requirement for the presumption of reliability and accuracy of computer-produced evidence was considered by the Australian Law Reform Commission, NSW Law Reform Commission and this Commission in 2005.[209] At the time, the conclusion was that reform was not warranted. However, what constitutes ‘processes, machines and other devices’ has changed substantially since 2005, particularly with the development of AI.
5.141England and Wales are undertaking calls for evidence to explore possible reform of the common law presumption that ‘the computer is operating correctly’.[210] While this is different from the presumption in sections 146 and 147, both might result in a false reliance on the outputs of technology. This was at issue in the Post Office Horizon scandal, where evidence based on a faulty software system led to wrongful convictions.[211]
5.142The primary concern is that AI output may be ambiguous, biased or hallucinated. Sections 146 and 147 do not require the accuracy or functionality of the device, machine or process to be proved.
5.143However, there is a question whether the section 146 and 147 presumptions would apply to AI-generated material in the first place. GenAI tools are built on predictive models, which do not produce replicable results, so an AI tool may not be able to demonstrate that it ‘ordinarily produces the outcome’.[212]
5.144For some uses, it may be accepted that AI tools can produce consistent outcomes and therefore attract the presumption under sections 146 and 147. More recently, the presumption has been used to facilitate the admission into evidence of timestamps created by social media platforms.[213]
5.145The question as to whether sections 146 and 147 might be applied to an AI tool was considered in Gujic v Arterbury.[214] In this case, the appellant sought to tender text message evidence translated using Google Translate. Google Translate uses deep learning and neural networks to translate text.[215]
5.146The Federal Circuit and Family Court decided the section 146 presumption could not apply to the appellant’s Google Translate documentation because the appellant did not lead evidence as to what Google Translate ordinarily produces.[216] The Court therefore could not know what Google Translate did with any level of specificity. For example, did Google Translate produce literal or figurative translations? Or was it a form of approximate translation? The court therefore could not evaluate the probability of the existence of the relevant facts.[217]
5.147Commentary on this case noted that court staff, legal representatives and court users use Google Translate widely in the family law context to translate affidavits, draft applications to court and prepare cases for trial because of its ease, low cost and convenience.[218] There may come a time when adducing evidence about AI translation tools becomes unnecessary because their output and purposes have become common and reliable knowledge.[219] However, it is worth noting that in another case involving presumed AI translation, the judge stated that it was ‘undesirable’ for a character reference to be written using an AI translation tool because:
the subtleties of the use of language, which will be significant in assessing the content of the reference, will not necessarily be accurately reflected in the automated translation.[220]
5.148Sections 146 and 147 of the Evidence Act should be monitored to assess how case law develops in applying the presumption to AI evidence.
5.149The presumption of reliability for sections 146 and 147 should not be conflated with discussion about a reliability standard for expert evidence, discussed below.
Courts must be able to evaluate AI evidence
5.150As discussed in Chapter 3, AI presents several risks and challenges for courts and tribunals, including lack of explainability, bias in AI data or systems, and inaccuracy (such as hallucinations). An issue facing Victoria’s courts and VCAT is how AI evidence (meaning evidence that is generated, processed or analysed by AI) can be assessed and excluded if it is biased, incorrect or misleading.
5.151The quality of evidence is a matter of probative weight or value.[221] Flaws in evidence reduce the weight given to it by the factfinder, whether judge or jury. Evidentiary flaws are usually explored in cross-examination.[222] The Uniform Evidence Law relies heavily on sections 135 and 137 to exclude evidence that is prejudicial or misleading. We heard that these sections could be used to respond to issues relating to AI.[223]
5.152Assessing AI evidence is complicated by factors that can hinder a court or tribunal’s ability to evaluate its quality, and parties’ ability to challenge it. One problem is a lack of transparency. An AI model may be very complex to explain to a court.[224] Information about how the AI system works and the data it is based on might not be available or might not be disclosed.[225]
5.153The United Nations Special Rapporteur on the independence of judges and lawyers discussed these challenges in the context of criminal justice, recommending that:
Prosecutors should avoid relying on AI evidence as the foundation for their pursuit of convictions, unless confident that the evidence is rigorously tested, not discriminatory and can be meaningfully challenged by the defence after mandatory prosecutorial disclosure …
Rules of evidence should provide no carve-out from admissibility requirements for privately owned AI tools and … the standard for admitting AI-produced evidence should be the same as for traditional forensic methods.[226]
5.154Another problem is lack of reliability. It may be difficult to determine the probative value of AI evidence when two experts or parties use different AI tools that produce conflicting results. Risks discussed in Chapter 3, such as inaccuracy and bias, will also impact reliability of evidence.
5.155The relationship between transparency and reliability in AI evidence has been discussed in academic literature. It has been observed that ‘when an AI system is not transparent or explainable, then ensuring its validity and reliability increase in importance’.[227]
Does AI evidence require the development of a reliability test for expert evidence?
5.156In Australian law, the reliability of expert evidence generally goes to weight rather than admissibility. Professor Ian Freckelton AO KC told us that:
If there is something flawed in the evidence, the unreliability or flaw will go to the probative value of the evidence or the weight given to it by the judicial officer. This is the general approach in the Evidence Act and it is the same at common law.
If black box evidence emerges and cannot be contested or discerned in terms of key aspects of its reasoning, forensic consequences should follow. If the probative value of expert evidence cannot be tested, courts would then question whether the evidence is relevant at all, as well as whether it is more prejudicial than probative.[228]
5.157Unlike jurisdictions such as the United States, England and Wales, Canada, India and New Zealand, Australian law does not make reliability a requirement for the admission of expert evidence.[229]
5.158Some stakeholders saw merit in Australia adopting a reliability standard for expert evidence.[230] These issues were raised in a broader context than AI, but they are also relevant as AI is introduced and the complexity of such evidence continues to increase. A reliability standard would enable judicial officers to assess the reliability of expert evidence when determining admissibility.
5.159The United Nations Special Rapporteur on the independence of judges and lawyers recommended that in criminal justice systems: ‘Judges and prosecutors should act as gatekeepers and only submit and admit reliable AI-generated evidence in criminal trials’.[231]
5.160The United States Supreme Court adopted a test with five indicia of reliability in Daubert v Merrell Dow Pharmaceuticals Inc, commonly referred to as the ‘Daubert test’.[232] Under the Daubert test, the judge, not the jury, determines whether the expert’s evidence is based on scientific knowledge. In doing so, judges must consider the following factors:
a)whether the technique or theory in question can be, and has been, tested
b)whether it has been subjected to publication and peer review
c)the technique’s known or potential error rate
d)the existence and maintenance of standards controlling its operation
e)whether it has attracted widespread acceptance within a relevant scientific community.[233]
5.161England and Wales have adopted a Practice Direction for expert evidence in criminal proceedings.[234] Considerations for judges include:
a)the extent and quality of the data on which the expert opinion is based
b)the validity of the methodology employed by the expert
c)if the expert’s opinion relies on an inference from any findings, whether the opinion properly explains how safe or unsafe the inference is (whether by reference to statistical significance or in other appropriate terms)
d)if the expert’s opinion relies on the results of the use of any method (for instance, a test, measurement or survey), whether the opinion takes proper account of matters, such as the degree of precision or margin of uncertainty, affecting the accuracy or reliability of those results
e)the extent to which any material upon which the expert’s opinion is based has been reviewed by others with relevant expertise, and the views of those others on that material
f)the extent to which the expert’s opinion is based on material falling outside the expert’s own field of expertise
g)the completeness of the information, which was available to the expert and whether the expert took account of all relevant information
h)if there is a range of expert opinion on the matter in question, where in the range the expert’s own opinion lies and whether the expert’s preference has been properly explained
i)whether the expert’s methods followed established practice in the field and, if they did not, whether the reason for the divergence has been properly explained.[235]
5.162In Victoria, the Supreme Court’s Practice Note Expert Evidence in Criminal Trials has been updated to provide standards for the validity of scientific methods used by experts and new requirements for the submission and consideration of expert evidence.[236] This provides some options to enable courts to explore issues of reliability and question the intricacies of expert evidence on AI tools.
5.163The Practice Note requires disclosure from experts who have used AI to generate or express their expert opinions, and to identify possible biases.[237]
5.164For parties and experts wishing to tender evidence that is scientific, medical or technical, it also requires experts to explain the validity of their method by disclosing:
•the method used by the expert
•whether the method has been reviewed by others
•whether the method followed established practice, and the reasons for any divergence
•whether the expert has proficiency in the use of the method
•details of any test or survey that was used to formulate the expert opinion, and the margin of uncertainty and reproducibility of results
•the steps taken to limit the impact of extraneous information
•whether the opinion was peer-reviewed.[238]
5.165Where a party proposes to lead expert evidence of a scientific, medical or technical nature, the Practice Note also provides standards for the scientific validity of that evidence:
•results must be repeatable, reproducible and accurate
•the expert must have demonstrated proficiency in the application of the method.[239]
5.166These requirements incorporate aspects of both the Practice Direction used in England and Wales and the US Daubert test.
5.167A reliability standard for evaluating expert evidence could be adopted for Australian courts, including in Victoria. Reforms to enable judicial assessment of reliability of expert evidence have been raised previously regarding forensic science.[240] However, this would require thorough consideration of how experts use AI tools and interpret their outputs.[241] The potential scope would also require detailed consideration, as any changes to section 79 of the Evidence Act would impact expert evidence broadly.
5.168Any consideration of a reliability standard should be dealt with across Uniform Evidence Law jurisdictions, rather than in Victoria’s courts and VCAT alone. The cross-jurisdictional nature of the issue, coupled with the fact that a reliability standard would cover more than issues of AI evidence, puts this issue beyond the scope of this reference.
5.169Most stakeholders were of the view that the reliability of AI evidence could be managed with existing rules. Introducing a reliability standard would have broader implications than AI evidence alone. However, the Commission recommends this issue should continue to be monitored by a court-led expert working group, as outlined in Recommendation 3.
AI-generated opinion evidence
5.170Under the Uniform Evidence Law there is a general rule that ‘evidence of an opinion is not admissible to prove the existence of a fact’.[242] However there are exceptions to the opinion rule. Section 79 of the Evidence Act creates an exception that if a person has ‘specialised knowledge’ based on their ‘training, study or experience’ they can provide an opinion that is ‘wholly or substantially based on that knowledge’.[243] Litigants may seek to rely on experts with specialised knowledge to give opinion evidence in a proceeding.
5.171As AI develops, litigants may seek to rely on opinions in proceedings which have been generated by AI. This issue arose in the case of Cosette v Pennisi,[244] which involved a dispute about the purchase of a luxury bag.
5.172In this case, the buyer of a luxury bag became concerned that it was fake. She had the bag checked by submitting photographs to a third-party authentication service, which concluded the bag was not authentic. The seller refused to refund the purchase price of the bag.
5.173In the hearing at first instance, the buyer had the onus of demonstrating the bag’s lack of authenticity. She relied on two reports from authentication services, at least one of which appeared to have been generated by AI.[245]
5.174The Tribunal, quoted by the Appeal Panel, stated that:
Opinions in legal proceedings may only be given by those having specialised knowledge based on the person’s training, study or experience and their opinion must be wholly or substantially based on that knowledge.
The Tribunal is not aware of the training, study or experience of whoever it is that is giving the two opinions. Indeed, it appears to be the case that, in the case of Entrupy, the opinion is the result of artificial intelligence based on Entrupy’s algorithms…
I accept that there is some reasoning for the opinions expressed in the two certificates. However, there is no expert or other person giving those opinions.[246]
5.175While this case did not directly consider section 79 of the Evidence Act, it demonstrates challenges in seeking to rely on AI to provide opinion evidence. Without an expert to give the opinion, there is no opportunity to evaluate the basis for the opinion through cross-examination.
5.176As AI develops, it will be necessary to further consider whether courts and tribunals should allow AI-generated opinions to be submitted as evidence and whether legislative reform is required.
Proposed US reforms regarding reliability of AI evidence
5.177How AI-generated materials, including opinions, may be submitted as evidence is being considered internationally. In the US, the Advisory Committee on Evidence Rules (Advisory Committee) has developed a formal proposal for establishing standards for machine-generated evidence tendered by litigants, in recognition of a gap in the Federal Rules of Evidence.[247]
5.178Under Rule 702 of the US Federal Rules of Evidence, expert witnesses must establish that their evidence is based on sufficient facts or data, is the product of reliable principles and methods, and their methods are reliably applied to the facts of the case.[248] This rule was amended following Daubert (discussed in paragraph [5.160]).
5.179However, Rule 702 does not apply where AI-generated material is presented by litigants without an expert. Considering that allowing courts and litigants to develop ad hoc approaches to ensuring the reliability of trial evidence would be an undesirable outcome, the Advisory Committee developed a new Rule 707 on machine-generated evidence:
When machine-generated evidence is offered without an expert witness and would be subject to Rule 702 if testified to by a witness, the court may admit the evidence only if it satisfies the requirements of Rule 702 (a)-(d). This rule does not apply to the output of basic scientific instruments.[249]
5.180The adoption of this rule might require parties to demonstrate that the training data used in the model avoided bias and that its output is accurate.[250] Through this process, parties would be able to show that the AI program used reliable principles and methods, and that the material tendered reflects a reliable application of those principles and methods. The Committee on Rules of Practice and Procedure approved publication of the draft rule for public comment.[251]
5.181Other jurisdictions are also grappling with issues of reliability in AI evidence. A Bill currently before the New York State Assembly[252] seeks to amend the state’s civil and criminal procedure laws to address the reliability of AI evidence and the ‘black box’ problem.[253]
5.182According to the memo accompanying the New York Bill:
The crux of the black box problem is that no one, not even the AI’s programmers, can precisely understand how the AI reaches its conclusions from the data. This ambiguity introduces an issue with evidence created or processed entirely or partially by AI; we cannot discern how the AI arrived at a specific conclusion. However, through rigorous testing, we can verify whether those conclusions are accurate and reliable.[254]
5.183This approach appears to reflect the important relationship between transparency and reliability in AI evidence, as discussed above in paragraph [5.155].[255]
5.184The Bill differentiates between ‘created’ and ‘processed’ AI evidence. Evidence ‘created’ by AI is defined as when AI generates ‘new information’.[256] The Bill contains strict requirements on when this evidence may be admitted:
Evidence created, in whole or in part, by artificial intelligence shall not be received into evidence in a criminal proceeding unless the evidence is substantially supported by independent and admissible evidence and the proponent of the evidence establishes the reliability and accuracy of the specific use of the artificial intelligence in creating the evidence.[257]
5.185Evidence ‘processed’ by AI is defined as where a ‘conclusion’ or ‘interpretation’ was generated by AI, based on information that already exists.[258] The admissibility requirements for this kind of AI evidence are not as strict as AI ‘created’ evidence, but require ‘the proponent of the evidence [to establish] the reliability and accuracy of the specific use of the artificial intelligence in processing the evidence’.[259]
5.186The memo explains that the level of accuracy required should be considered ‘on a case-by-case basis,’ taking the weight of the evidence into account.[260]
5.187The Bill also enables the court to limit the disclosure of trade secrets or similar sensitive information introduced by expert evidence.[261] This seems to be directed at preventing the outcome in Loomis,[262] or similar cases, where AI providers have refused to disclose information about data and algorithms to protect the commercial value of their product.[263]
5.188The laws of evidence in the US are different to the Uniform Evidence Law which operates in Victoria. However, these examples show how international jurisdictions are beginning to respond to the challenges AI presents to existing rules and laws of evidence.
Processes to assist assessment of AI evidence
5.189In assessing AI evidence, judicial officers have powers under the Civil Procedure Act 2010 (Vic) to enable court-appointed experts,[264] a single joint expert (an expert witness engaged jointly by both parties),[265] or concurrent evidence (where experts can be questioned jointly and the judge and counsel for either party can ask questions).[266]
5.190As discussed in our consultation paper, the Trivago NV v Australian Competition and Consumer Commission case used some of the above mechanisms to come to a decision on an algorithm’s output.[267] The parties called expert witnesses, and the experts gave evidence to identify the relative importance of algorithmic factors. Each of the parties’ experts submitted reports, with the experts subsequently conferring to provide a joint report structured around the ACCC’s questions.[268] The experts then delivered their evidence concurrently in a closed court session.[269]
5.191This case indicates how existing mechanisms can be used to assist the assessment of AI-generated evidence.[270] The Supreme Court noted that in addition to court appointed experts, single joint experts, and concurrent evidence, courts have a range of express powers to:
•direct expert witnesses to hold a conference of experts or prepare a joint experts report[271]
•refer a question to a special referee[272]
•call in the assistance of one or more specifically-qualified assessors.[273]
5.192We note that courts and tribunals with high volume caseloads, such as VCAT and the Magistrates’ Court, may be constrained in using these powers due to time and resources.
Education for courts and lawyers
5.193The Trivago case suggests judges and lawyers do not need to be AI experts to effectively consider matters relating to AI. However, they must be supported to ask the right questions about AI models so the adversarial system can function effectively.[274]
5.194Judges and lawyers need to understand relevant aspects of AI systems. This may include how the model has been fine-tuned or trained, whether the AI program used reliable principles and methods, whether the resulting output is valid,[275] and how the system has been prompted. Enabling judicial officers to question experts who provide evidence informed by AI will become increasingly important.
5.195A tool for judicial officers, similar to the Judicial College of Victoria’s A Practical Guide to Weighing Evidence, could be one option.[276] This resource could detail the kinds of assistance judges can ask for in relation to AI and include a set of general questions judicial officers could use to ask about the algorithm and its training data.
5.196Courts and tribunals could also make use of other mechanisms to assess technical or specific questions about AI technology and evaluate probative weight. Stakeholders suggested that the following supports might be useful in determining the probative weight of AI evidence:
•an expert panel or a register of court experts with expertise regarding AI use cases[277]
•authoritative summaries on the limitations and complexities of AI and algorithms, similar to primers published in relation to scientific evidence for judges in England and Wales.[278] In the US, the National Center for State Courts developed bench cards outlining considerations for judges in evaluating AI-generated evidence. This includes separate considerations for acknowledged[279] and unacknowledged use of AI.[280]
•a set of considerations to determine the reliability of AI-generated materials.[281] This could include some of the considerations that the UK has established in its criminal practice directions.
Continued monitoring of the suitability of the Evidence Act
5.197The law of evidence is of central importance to the functioning of Victoria’s courts and tribunals.
5.198We heard from stakeholders that the rapid pace of AI’s evolution makes legislative reform challenging. While sections 50, 146 and 147 of the Evidence Act were raised as possible areas for reform, no specific issues or problems were identified during consultations. At this stage there are no compelling reasons to undertake legislative reform of the Evidence Act in relation to AI. Addressing issues such as reliability would also have broader implications for expert evidence. It is possible that the rules of evidence are flexible enough to adapt to the increasing use of AI evidence in courts.
5.199However, AI evidence presents challenges, such as the rise of deepfakes, and issues of reliability, explainability and transparency. Reliable and robust evidence is integral to the fairness of the justice system. The use of AI evidence in courts and tribunals raises difficulties in understanding what technology is used, how it works and whether it is reliable. Constant monitoring of legislation and court rules in relation to the use of AI evidence in courts and tribunals is therefore essential. Courts are best placed to undertake this monitoring and assessment.
5.200The court-led Forensic Evidence Working Group convened judges, scientists, prosecutors and defence lawyers to develop the Practice Note Expert Evidence in Criminal Trials.[282] It is recommended that a similar court-led expert working group monitor issues arising in relation to AI and identify any areas of the Evidence Act requiring reform. The impact of AI evidence and any issues with existing evidence laws is still emerging. A working group with representatives across courts and the broader justice system will help to identify and inform issues that arise in practice and enable collaborative consideration of potential law reform. The group should report annually to the Chief Justice of Victoria.
5.201Given the cross-jurisdictional nature of the Uniform Evidence Law, any consideration of future reforms would likely require a national approach and could be considered by the Standing Council of Attorneys-General.
|
Recommendation 3.A court-led expert working group should monitor the suitability of the Evidence Act 2008 (Vic) to address emerging AI-related evidence issues, and report annually to the Chief Justice of Victoria. |
Administrative law
5.202The use of AI systems for government automated decision-making is likely to increase in the future.[283] The number of people contesting such decisions will also increase, requiring Victoria’s courts and VCAT to determine whether they are able to conduct judicial review.[284] It is therefore important that Victoria’s courts and VCAT have clarity on this question.
Can courts review automated decisions?
5.203In the Commonwealth system, ‘decision’ is defined in legislation such as the Administrative Decisions (Judicial Review) Act 1977 (Cth) (ADJR Act),[285] the Acts Interpretation Act 1901 (Cth) (Acts Interpretation Act),[286] and the Administrative Appeals Tribunal Act 1975 (Cth) (AAT Act).[287] The ADJR Act and AAT Act enable decisions to be reviewed by federal courts and tribunals.
5.204In Victoria, a ‘decision’ for which review can be sought under the Administrative Law Act 1978 (Vic) (AL Act) is defined in that Act.[288] The Victorian Civil and Administrative Tribunal Act 1998 (Vic) (VCAT Act) defines a decision for the purposes of that Act in sections 4 and 5.[289]
5.205As discussed in our consultation paper, legal uncertainty remains as to whether automated decisions are ‘decisions’ for the purposes of judicial review. This stems from the decision in Pintarich v Federal Commissioner of Taxation (Pintarich),[290] which suggests fully automated decisions made under the Commonwealth ADJR Act may not be reviewable.
5.206In Pintarich, the majority held that an automated letter from the Australian Tax Office to a taxpayer was not a reviewable decision for the purposes of ADJR Act, because the machine producing the output did not engage in a ‘mental process’.[291]
5.207In his dissent in Pintarich, Justice Kerr considered that whether a decision has been made depends on the context, and that the question should be evaluated flexibly in line with advances in technology, stating that:
The legal conception of what constitutes a decision cannot be static; it must comprehend that technology has altered how decisions are in fact made and that aspects of, or the entirety of, decision making, can occur independently of human mental input.[292]
5.208Justice Kerr also noted that several government departments already rely on automated systems for bulk decision-making and that some legislation delegates decision-making authority to computer programs.[293]
5.209The majority’s view of what constitutes a ‘decision’ in Pintarich has been criticised by academics and organisations with an interest in AI and automated decision-making. The Human Technology Institute stated:
a rigid articulation of what constitutes a ‘decision’ is inappropriate, because it does not account for the ‘variability and complexity of [automated decision making]’, nor the reality of the rapid uptake of [automated decision making] systems to automate decision-making processes with consequential outcomes for individuals, in virtually every government context.[294]
Do decisions made by AI require reasons?
5.210There is no common law right to reasons for administrative decisions. However, the Acts Interpretation Act, ADJR Act and AAT Act contain provisions that require reasons for decisions to be provided to a person in certain circumstances.[295] In Victoria, the VCAT Act provides that a person may request a statement of reasons for a decision, which the decision maker must provide within 28 days.[296] Additionally, under the AL Act, parties are entitled to reasons for decisions of a tribunal.[297]
5.211Explainability and transparency of decision making by AI systems are of great importance in this context. As the Australian Human Rights Commission explains, a ‘failure to provide reasons for an administrative decision can make it difficult, or even impossible, to know whether the decision was fair, reasonable, accurate or even lawful’.[298]
5.212Yet, as discussed in Chapter 3, AI models are often too complex to easily explain. As Samuel White describes:
The current system of human decision-making can often be perplexing, but you are able to find the individual and unpick their logic. It is much harder to unpick the logic of hundreds of lines of computer code … who do you hold to account for a decision that was created by an algorithm?[299]
5.213Additionally, as discussed in Chapter 2, AI systems can perpetuate bias either because of the data used, or problems with the system itself.[300] This can lead to decisions made by AI that are unfair or discriminatory.[301]
5.214Despite developments in ‘Explainable AI’,[302] such tools may have limited value in an administrative law context. Even where AI systems can provide apparent reasons, these might not genuinely reflect the process that led to the decision. Professor Lyria Bennett Moses explains that:
the problem is not that ChatGPT “lies”; the problem is that output reasons are not a genuine reflection of the nature of its own operations. While it is likely that [large language model] explanations will, over time, appear more genuine to readers, the complexity of the system makes genuine explainability unlikely.[303]
5.215Representatives of the Human Technology Institute considered that such issues may be addressed through good design of AI systems, and by enabling affected parties to be provided with technical reasons for an AI-generated automated decision.[304] We heard that:
The problem is not only the inability to provide reasons a layperson can understand, but also the inability to provide technical reasons in the event that someone forms a view that the reasons given for a decision don’t match with true rationale for the decision.[305]
5.216However, the inscrutability of AI systems is not the only issue. A separate, yet related, concern is that AI processes are ‘nonintuitive.’[306] Even a technical explanation, capable of being understood by someone with relevant expertise, might not be sufficient to provide ‘reasons’ as understood in an administrative law context.
5.217The UNSW Centre for the Future of the Legal Profession told us that Explainable AI’s purpose is:
distinct from the concept of explanation or reason-giving as used in law, which is justificatory … Building in an ‘explanation’ to [machine learning] systems may not enable justification as the explanation may depend on correlations in data rather than the patterns of causality.[307]
5.218When considering AI systems that provide ‘reasons’, it is important to consider that: ‘As with other areas where law and computer science intersect, the word might be the same, but the way the disciplines understand the word and its implications varies in important ways’.[308]
Is reform needed to address uncertainty introduced by Pintarich?
5.219The majority’s decision in Pintarich has been criticised as ‘flawed,’[309] or as taking an overly ‘narrow’ approach to the meaning of decision in the context of modern technology.[310] Some expect the case will be distinguished, or departed from, by later judgments.[311]
5.220Some stakeholders considered Pintarich would continue to be influential in the Australian legal system until it is overturned or overcome by legislative reform.[312] The definition of decision in Pintarich sits at odds with the increased use of deep learning and AI systems that cannot provide reasons.
5.221The Pintarich decision has created uncertainty and may have the ‘perverse outcome’ that governments are incentivised to adopt automated decision-making processes as a means of avoiding judicial scrutiny.[313]
5.222More broadly, it has the potential to affect public trust and confidence in the courts and government decision-making.[314] Individuals must remain able to challenge government decision-making, so that there is no zone of AI decision-making shielded from scrutiny by courts and tribunals.
5.223Anna Huggins considers that the decision leaves a regulatory gap, as Pintarich suggests ‘administrative decision-making in Australia is “still regarded as an inherently human process”, yet there are currently no legislative safeguards to ensure that human decision-makers do in fact oversee and review automated outputs’.[315]
5.224The Australian Human Rights Commission considered that reform is needed to ‘ensure similar rights of review are available for people affected by AI-informed administrative decisions’.[316] It recommended amending section 25D of the Acts Interpretation Act to make it clear that, ‘where a person has a legal entitlement to reasons for a decision, this entitlement exists regardless of how the decision is made’.[317] It also recommended that the content of the reasons should include a ‘technical explanation of the decision, in a form that could be assessed and validated by a person with relevant technical expertise’.[318]
5.225The Human Technology Institute similarly recommended that relevant legislation, including section 25D of the Acts Interpretation Act, be amended to provide that, for the avoidance of doubt, the use of automation is not a factor weighing against an action being considered a ‘decision’ within the meaning of administrative law.[319]
Is reform needed in Victorian law?
5.226It is not yet clear what effect Pintarich will have on automated decisions outside the ADJR, or under Victorian administrative law. However, consideration should be given to whether legislative reform is required in Victorian law to clarify:
•that automated decisions made by AI are capable of review
•that reasons should be made available for people seeking to challenge decisions made by AI.
5.227There should not be unequal access to judicial review based on whether a decision was made by a person or a machine. Nor should there be an incentive to use AI to make administrative decisions more opaque,[320] or to avoid judicial review altogether.
5.228The possible legislative approaches require further consideration. One approach would be to insert deeming provisions into section 3 of the AL Act and section 4 of the VCAT Act. This would clarify that automated decisions generated by AI tools are considered ‘decisions’ for the purpose of judicial review. One issue with section 4 of the VCAT Act is that it contemplates a ‘person’ will be making a decision.[321]
5.229As discussed above, another possible reform could ensure the right to reasons for a decision, including technical explanations. However, this approach may be limited by the inherent lack of explainability of more complex AI systems.
5.230There has been some criticism of the effectiveness of deeming provisions to address the concerns raised by Pintarich.[322] Certain Australian laws have provisions which deem decisions made by a computer to have been made by the relevant minister or executive.[323] However, such an approach entails risks, including a ‘lack of clear accountability for automated decisions’.[324] Also, as Ng and O’Sullivan raise, ‘deeper consideration should be given to the lawful and ethical remit of AI: are there certain areas of decision-making which should not be subject to AI?’[325]
5.231Some have suggested that rather than deeming provisions, an alternative approach would be to legislate that a human is responsible for independently justifying a decision made by AI.[326] This would provide a ‘check and balance’ against AI systems, as well as an opportunity for transparency.[327]
5.232While we do not recommend administrative law reform in Victoria at this stage, our view is that the complex issues raised above require detailed consideration and ongoing monitoring.
Legal Profession Uniform Law
5.233The Legal Profession Uniform Law (Uniform Law)[328] and its associated Uniform Rules outline ethical and professional standards required of lawyers in Victoria, New South Wales and Western Australia.[329] Breaches of the Uniform Law may constitute professional misconduct and lawyers could face disciplinary action. The Victorian Legal Services Board and Commissioner (VLSB+C) is responsible for ensuring that the Uniform Law is implemented effectively in Victoria.[330]
5.234The Uniform Law and associated rules adopt a technology-neutral framework, meaning that the Uniform Law avoids prescribing specific technologies or methods for legal practice.[331] As a result, the Uniform Law’s rules and obligations remain consistent regardless of whether traditional methods or AI are used by lawyers in their legal practice.[332] A statement by the VLSB+C and other state-based regulatory bodies emphasises the ethical standards required of lawyers while using AI.[333]
5.235The following professional obligations are relevant for lawyers using AI, as outlined in the Solicitors and Barristers Rules:
•duties to the court and to the administration of justice[334]
•duties to deliver legal services competently and diligently[335]
•duties to not engage in conduct which is either prejudicial or likely to diminish public confidence in the administration of justice[336]
•duties not to engage in conduct that will bring the legal profession into disrepute or diminish public confidence in the legal profession[337]
•duties to not knowingly or recklessly mislead the court[338]
•duties to promote and protect the client’s best interests to the best of a lawyer’s skill and diligence.[339]
5.236The ethical duties of solicitors in relation to AI use were considered in Re Dayal. In this case a Victorian lawyer was found to have tendered a false list of cases without disclosing its source, and without verifying its accuracy. The lawyer claimed he did not intentionally mislead the Federal Circuit and Family Court after providing incorrect written submissions created by an AI tool.[340] Judge Humphries found it appropriate to refer the lawyer to the VLSB+C for further investigation.[341]
5.237In August 2025 the VLSB+C varied the practising certificate of the lawyer.[342] The variation meant that, among other restrictions, he would be unable to act as a principal lawyer and must undertake supervised legal practice for two years.[343] The VLSB+C stated that this response demonstrates their ‘commitment to ensuring legal practitioners who choose to use AI in their legal practice do so in a responsible way that is consistent with their obligations.’[344]
5.238Internationally, courts in Canada, England and Wales and the United States have taken a mix of approaches when hallucinated or incorrect materials have been filed in court documents as a result of AI use, ranging from warnings to parties to the ordering of costs.[345]
Are professional obligations sufficient for AI risks?
5.239Existing professional obligations were generally considered broad enough to manage ethical issues arising from the use of AI by lawyers. Representatives of the Victorian Bar Association suggested that existing rules cover the improper use of AI, including obligations in relation to client data and informed consent.[346] The Law Institute of Victoria (LIV) stated:
The LIV agrees that ‘lawyers must continue to maintain high ethical standards and comply with their professional obligations’. Nonetheless, in the LIV’s view it is possible to envisage a broader role for the use of AI in legal practice than that envisaged by the [VLSB+C] Statement, whilst simultaneously ensuring that lawyers continue to meet high ethical standards and comply with their professional obligations.[347]
5.240In our consultation paper, we asked whether lawyers should have a specific obligation to be competent with the use of technology. The American Bar Association Model Rules were amended in 2012 to require competency with relevant technology. The commentary on rule 1.1 states that ‘to maintain the requisite knowledge and skill, a lawyer should keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology’.[348] The American Bar Association released an opinion in 2024 stating that lawyers and law firms using AI tools must ‘fully consider their applicable ethical obligations, including their duty to provide competent legal representation, to protect client information, to communicate with clients … and to charge reasonable fees’.[349]
5.241Representatives of Victoria Legal Aid noted that there is scope for the Uniform Law rules and commentary to address AI.[350]
5.242The VLSB+C stated that changes to the Solicitors or Barristers Rules were unnecessary:
In our view, the current conduct rules for both solicitors and barristers already accommodate AI-related risks and misconduct, as a result of their broad and technology-neutral application. While each set of rules could certainly be improved, we do not believe any changes are needed to specifically address lawyers’ use of AI. However, further work can and must occur to ensure that lawyers understand how the conduct rules apply to the use of AI (and other technology) in the course of legal practice.[351]
5.243Some stakeholders emphasised that while the rules may not need amendment, they should be supplemented with comprehensive training and that links between professional obligations and AI risks need to be better understood and communicated.[352] Representatives of the Victorian Bar Association stated:
Practitioners are already under an obligation to understand the tools. Just because some people do not understand them is not a reason for more rules as opposed to better enforcement and education.[353]
5.244Representatives from Law Firms Australia shared this view:
From a broad perspective the existing professional conduct rules cover a lot of the issues. The professional conduct rules offer flexibility because they are principles based, which is useful. But work could be done to flesh out the commentary on how the conduct rules for solicitors and barristers can be applied in the context of using AI.[354]
5.245The VLSB+C and the Victorian Bar Association insisted that, if changes are introduced, the Uniform Law should remain principles-based rather than refer to particular technologies.[355] A representative of the VLSB+C stated:
In terms of legal professional obligations, the rules of professional conduct could be amended so that they are more principles-based but, as they are now, they are technology-neutral and capable of dealing with AI. There could be better guidance on their applicability to AI use. I do not think amending the Rules or the Uniform Law is necessary to address AI.[356]
5.246Representatives of the VLSB+C also suggested that the Uniform Law should remain focused on the existing professional responsibilities of lawyers:
Ultimately, the professional rules remain intact, and you remain responsible for the quality of your output. It’s a reminder for people to think about whether they are competent and capable of delivering that output. Using AI without checking the output is not much different from having a human do the work for you, not checking it before sending it to a court or the other side and having something be wrong in the document. The VLSB+C would tie the responsibility for the error back to the person with the practising certificate in both scenarios.[357]
5.247The Uniform Law ensures that lawyers remain responsible for their output, and in doing so treats AI tools as it would any other tool. As stated by the Victorian Bar Association:
Every aspect of the court or tribunal process is structured around the production, filing or publication of documents by individuals who have responsibility to attest to the accuracy, relevance or authenticity of the contents of those documents. The Bar considers that these existing obligations, duties and responsibilities apply regardless of the tools used to produce the document and whether the document has been produced with the assistance of another person or with the assistance of a technological tool, platform or system (AI or otherwise).[358]
5.248The professional obligations outlined in the current Uniform Law remain appropriate for the regulation of AI use by lawyers. The use of AI does not fundamentally alter the responsibility of lawyers to act ethically, and in accordance with existing principles of professional conduct. However, the continuing effectiveness of these rules depends upon supplementary guidance and ongoing professional development. The role of legal professional guidance is explored in Chapter 7. The role of education and awareness-raising among lawyers is considered in Chapter 10.
Reforms to the regulation of legal practice
5.249The increasing use of AI to provide legal services raises questions about the regulation of unqualified legal practice. The Uniform Law prevents unqualified entities from practising law, and makes it an offence to engage in unqualified legal practice in Victoria.[359] The VLSB+C is responsible for investigating instances of unqualified legal practice in Victoria.[360]
5.250Unauthorised practice of law rules have ‘restricted the ability of courts and other organisations, whether commercial or in the third sector, from providing tailored legal advice to consumers other than through the services or under the supervision of a qualified lawyer’.[361]
5.251This issue was raised in a submission by Dr Natalia Antolak-Saper, who stated that the prospect of AI tools providing legal advice raises a foundational question of:
what constitutes the provision of legal advice and how this definition extends to AI systems. Traditional legal advice is predicated on the expertise of qualified professionals who possess the training, ethical obligations and accountability mechanisms to ensure their advice is contextually appropriate and legally sound. In contrast, AI tools operate within parameters defined by algorithms and datasets, often developed by individuals without legal qualification.[362]
5.252The use of AI to provide legal services raises complicated questions such as who will be responsible for legal outputs provided by AI systems, especially if those outputs are inaccurate and are relied on to the detriment of litigants.[363] Antolak-Saper considered that one solution may be to bring AI tools which provide legal advice within the scope of legal professional regulations.[364]
5.253Different international approaches are emerging in relation to permitting AI tools to provide legal services and coverage by legal service regulators. In Chapter 2, we discussed that the Solicitors Regulation Authority in England and Wales recently approved an AI-based firm, Garfield Law, to provide legal advice to court users.[365] However, this would not be possible in Victoria without future law reform.
5.254While beyond the scope of this report, this issue is addressed in a recent paper published by the AI Policy Consortium for Law and Courts in the United States. It argues for reforms to the ‘patchwork’ of unauthorised practice of law regulations across various states,[366] which it sees as a response to the crisis of access to justice.[367] It calls for regulatory reform to:
•revise unauthorised practice of law rules to allow for use of vetted AI tools, with appropriate safeguards around disclosure, data security and transparency
•establish regulatory sandboxes for testing of AI tools performing legal services
•revise definitions of unauthorised practice of law in US states.[368]
Relevant national legislation under review
5.255Other areas of law intersecting with AI that Victoria’s courts and VCAT may encounter with increasing regularity are:
•copyright and patents
•competition and consumer law.
5.256These issues were not covered extensively in consultations with stakeholders. They involve large bodies of law that could conceivably form discrete references in their own right.
5.257These areas of law are largely administered under Commonwealth laws. For example, disputes about copyright are governed by the Copyright Act 1968 (Cth).
5.258We understand that the Australian Government is currently reviewing several of these frameworks to consider challenges posed by AI. It released a discussion paper seeking input on whether amendments are needed to the Copyright Act 1968 (Cth) to respond to AI transparency issues.[369] A discussion paper seeking feedback on whether the Australian Consumer Law needs reform to address AI-related harms across a broad range of economic settings was also released in October 2024.[370]
5.259Given the Commonwealth controls these areas of law, they fall outside the scope of our reference and we have not canvassed them in this report.
5.260The Commission suggests that the Attorney-General maintain awareness of these topics in the context of AI.
Future reform considerations
5.261In this chapter we discuss potential areas for legislative reform in relation to foreseeable uses of AI in courts and tribunals. However, AI and its use across society continue to expand and evolve rapidly. This will have future implications for courts and tribunals that may require broader consideration of legislative reform.
5.262As the use of AI increases, new offences will emerge. In Australia, new offences have been proposed and introduced in relation to deepfakes.[371] The use of AI will also have implications for causes of action such as:
•breach of privacy
•discrimination
•copyright infringement
•malicious use, such as defamation
•cyber breaches
•employment-related issues.[372]
5.263The Law Commission of England and Wales discusses how increasing autonomy of AI systems could lead to ‘liability gaps’ where ‘no natural or legal person is responsible for the harm caused by, or the other conduct of, an AI system’.[373] The Law Commission also discusses how the disconnect between autonomous systems and the person using the system might interact with established legal concepts such as causation.[374] Because it can be hard to know how an AI tool produced an output, it may be difficult to demonstrate that the harm caused was reasonably foreseeable. Similarly, AI may raise challenges in establishing the mental element of an offence or legal cause of action, as it may not be possible to demonstrate that AI held the required state of mind.[375] In considering these potential gaps, the Law Commission raises the ‘potentially radical option for AI law reform: granting some form of legal personality to AI systems’.[376]
5.264At this stage it is difficult to predict the broad impacts of AI and its implications for courts and tribunals. Some issues could be managed within the existing system, while others may require substantive reform. While the use of AI gives rise to a range of possible issues, it is challenging to assess where, or if, such legal gaps will arise.[377] The issues raised by AI are large and complex, requiring ongoing attention and further consideration of possible reforms across a range of areas.
-
For example, Supreme Court Act 1986 (Vic) s 25.
-
Ibid s 25(1)(ac).
-
Subordinate Legislation Act 1994 Guidelines (Guidelines, Parliament of Victoria, September 2023) 6; Subordinate Legislation Act 1994 (Vic) s 13.
-
Supreme Court of Victoria, Annual Report 2020-21 (Report, October 2021) 59; ‘The Honourable Justice Cavanough Retires’, Supreme Court of Victoria (Web Page, 12 December 2024) <http://www.supremecourt.vic.gov.au/news/the-honourable-justice-cavanough-retires>.
-
Magistrates’ Court of Victoria, Annual Report 2023-2024 (Report, 28 October 2024) 21 <https://www.mcv.vic.gov.au/news-and-resources/publications/annual-report-2023-2024>; Magistrates Court of Victoria, Annual Report 2020-21 (Report, 18 November 2021) 19.
-
For example, Supreme Court Act 1986 (Vic) s 28AAA; County Court Act 1958 (Vic) s 8E; Magistrates’ Court Act 1989 (Vic) s 16A.
-
Supreme Court of Victoria, Practice Note SC Gen 1 Practice Notes and Notice to the Profession (Practice Note, 30 January 2017) para 4.3 <https://www.supremecourt.vic.gov.au/sites/default/files/assets/2017/09/25/18c025b3f/gen1practicenotesandnoticetotheprofession.pdf>.
-
Submission 24 (County Court of Victoria).
-
Children’s Court Authentication and Electronic Transmission Rules 2020 (Vic).
-
Consultation 9 (Victorian Civil and Administrative Tribunal).
-
Submission 24 (County Court of Victoria).
-
Consultation 32 (Supreme Court of Victoria); Also, Juries Victoria stated it did not require legislative amendment to enable its adoption of AI: Consultation 1 (Juries Victoria).
-
Submission 24 (County Court of Victoria).
-
Consultation 9 (Victorian Civil and Administrative Tribunal).
-
Submission 24 (County Court of Victoria).
-
COVID-19 Omnibus (Emergency Measures) Act 2020 (Vic).
-
For instance, s9 of the Justice Legislation Amendment (Miscellaneous) Act 2025 (Vic) amends s118 of the Criminal Procedure Act 2009 (Vic) to allow case direction notices to be filed by electronic means rather than with a registrar at a venue of the court.
-
Consultation 15 (Magistrates’ Court of Victoria).
-
Victorian Civil and Administrative Tribunal Act 1998 (Vic) s 83(1).
-
Consultation 9 (Victorian Civil and Administrative Tribunal).
-
Although bespoke AI tools are being piloted. See Chapter 2 for further detail.
-
Neurotechnology, neuroscience and AI together have created opportunities for collecting, maintaining and utilising brain data to understand and shape the human mind. Neurotechnologies raise profound human rights problems regarding the right to privacy, and freedom of thought, conscience, religion or belief. These risks will not be discussed in detail here as they go beyond the scope of the reference – there is no discourse that suggests Victorian courts and tribunals will adopt neurotechnology in the near term. For further discussion of human rights and neurotechnology, see, for example, Australian Human Rights Commission, The Need for Human Rights-Centred Artificial Intelligence (Submission No 212 to the Department of Industry, Science and Resources Supporting Responsible AI Discussion Paper, 26 July 2023) 11–15 <https://consult.industry.gov.au/supporting-responsible-ai/submission/view/212>.
-
Submission 10 (Castan Centre for Human Rights Law, Monash University); See also Sophie Farthing et al, Human Rights and Technology (Final Report, Australian Human Rights Commission, 2021) 10 <https://humanrights.gov.au/our-work/technology-and-human-rights/projects/final-report-human-rights-and-technology>.
-
Submission 10 (Castan Centre for Human Rights Law, Monash University).
-
Charter of Human Rights and Responsibilities Act 2006 (Vic) s 6(2)(b).
-
Ibid s 8.
-
Ibid s 24.
-
Ibid s 21.
-
Ibid s 25.
-
Ibid s 13.
-
Ibid ss 4(1)(j), 38.
-
Ibid s 32.
-
See Note in ibid s 4(1)(j).
-
PJB v Melbourne Health (Patrick’s case) [2011] VSC 327; (2011) 39 VR 373, [124] cited in; Judicial College of Victoria, Charter of Human Rights Bench Book (Online Manual, 1 September 2017) ’2.4 Courts and tribunals as public authorities’ [8] n 189 <https://resources.judicialcollege.vic.edu.au/article/1049904> (3 August 2023). Summarised here are some characteristics of judicial functions which include having to create new rules of law, making binding determinations of existing legal rights, determining criminal guilt and actions for the enforcement of existing legal rights, and making binding and authoritative determinations of legal rights and duties.
-
Victoria Police Toll Enforcement v Taha [2013] VSCA 37; (2013) 49 VR 1, [246].
-
See for example ibid; Secretary to the Department of Human Services v Sanding [2011] VSC 42; (2011) 36 VR 221, [165]; Kracke v Mental Health Review Board [2009] VCAT 646; (2009) 29 VAR 1, [241]; Matsoukatidou v Yarra Ranges Council [2017] VSC 61; (2017) 51 VR 624, [32] cited in Judicial College of Victoria, Charter of Human Rights Bench Book (Online Manual, 1 September 2017) ‘2.5.6 Direct application of Charter rights to courts’ [5] n 205 <https://resources.judicialcollege.vic.edu.au/article/1049904> (10 December 2024).
-
Submissions 10 (Castan Centre for Human Rights Law, Monash University), 15 (Human Rights Law Centre).
-
Submissions 5 (Office of the Victorian Information Commissioner), 15 (Human Rights Law Centre), 27 (Federation of Community Legal Centres and Justice Connect).
-
Australian Human Rights Commission, Centring Human Rights in the Governance of Artificial Intelligence (Submission to the United Nations Office of the Secretary-General’s Envoy on Technology, 30 September 2023) 8 <https://humanrights.gov.au/sites/default/files/centring_human_rights_in_the_governance_of_artificial_intelligence_australian_human_rights_commission_0.pdf>; Sophie Farthing et al, Human Rights and Technology (Final Report, Australian Human Rights Commission, 2021) 98–99 <https://humanrights.gov.au/our-work/technology-and-human-rights/projects/final-report-human-rights-and-technology>.
-
Submission 5 (Office of the Victorian Information Commissioner).
-
Submission 15 (Human Rights Law Centre).
-
Victorian Equal Opportunity & Human Rights Commission, The Charter of Human Rights and Responsibilities: A Guide for Victorian Public Sector Workers (3rd Edn) (Report, Victorian Equal Opportunity & Human Rights Commission, January 2024) 13–16 <https://www.humanrights.vic.gov.au/resources/https-resources-charter-guide-for-vps-2024/>.
-
For instance, the Law Commission of Ontario’s Human Rights AI Impact Assessment tool: Law Commission of Ontario and Ontario Human Rights Commission, Human Rights AI Impact Assessment (Report, November 2024) <https://www.lco-cdo.org/wp-content/uploads/2024/11/LCO-Human-Rights-AI-Impact-Assessment-EN.pdf>.
-
Australian Human Rights Commission and National Australia Bank (NAB), Human Rights Impact Assessment Tool: AI-Informed Decision-Making System in Banking (Report, September 2023) <https://humanrights.gov.au/our-work/technology-and-human-rights/publications/hria-tool-ai-banking>.
-
Victorian Equal Opportunity & Human Rights Commission, The Charter of Human Rights and Responsibilities: A Guide for Victorian Public Sector Workers (3rd Edn) (Report, Victorian Equal Opportunity & Human Rights Commission, January 2024) 14 <https://www.humanrights.vic.gov.au/resources/https-resources-charter-guide-for-vps-2024/>.
-
Ibid.
-
Law Commission of Ontario and Ontario Human Rights Commission, Human Rights AI Impact Assessment (Report, November 2024) <https://www.lco-cdo.org/wp-content/uploads/2024/11/LCO-Human-Rights-AI-Impact-Assessment-EN.pdf>.
-
Ibid 12. Justice system includes court services and administrative tribunals, for adjudication, policing, law enforcement, sentencing, corrections, probation and parole.
-
Miriam Stankovich et al, Global Toolkit on AI and the Rule of Law for the Judiciary (CI/DIT/2023/AIRoL/01, UNESCO, 2023) 186 <https://unesdoc.unesco.org/ark:/48223/pf0000387331>.
-
Submission 27 (Federation of Community Legal Centres and Justice Connect). Consultation 31 (Victorian Equal Opportunity & Human Rights Commission).
-
The Privacy and Data Protection Act 2014 (Vic) defines personal information as ‘information or an opinion… that is recorded… about an individual whose identity is apparent, or can be reasonably be ascertained, from the information or opinion…’: at s3. Sensitive information is defined under the Act as information or opinions including about an individual’s origins, political opinions, group memberships, religious beliefs, philosophical beliefs, sexual preferences or criminal record: at sch 1.
-
‘Artificial Intelligence and Privacy – Issues and Challenges’, Office of the Victorian Information Commissioner (OVIC) (Web Page) <https://ovic.vic.gov.au/privacy/resources-for-organisations/artificial-intelligence-and-privacy-issues-and-challenges/>.
-
Submission 10 (Castan Centre for Human Rights Law, Monash University).
-
Law Institute Victoria, Ethical and Responsible Use of Artificial Intelligence Guideline (Ethical Guideline, 13 August 2025) 3 <https://www.liv.asn.au/download.aspx?DocumentVersionKey=69158983-87f3-4c1d-be99-8c300b5c7afd>.
-
Ibid.
-
Consultation 17 (Digital Rights Watch).
-
Australian Government, ‘Clearview AI Breached Australians’ Privacy’, Office of the Australian Information Commissioner (OAIC) (Web Page, 3 November 2021) <https://www.oaic.gov.au/news/media-centre/clearview-ai-breached-australians-privacy>.
-
‘Cyber Incident Information’, Court Services Victoria (Web Page, 18 January 2024) <https://courts.vic.gov.au/news/court-services-victoria-cyber-incident>.
-
Ibid.
-
Ibid.
-
Victoria Pengilley, ‘NSW Court Website Hit by Major Data Breach, 9,000 Documents Downloaded’, ABC News (online, 26 March 2025) <https://www.abc.net.au/news/2025-03-26/nsw-court-website-major-data-breach-documents-leaked/105100678>.
-
Ibid.
-
‘Statement: NSW Online Registry Website Data Breach’, Department of Communities and Justice (Web Page, 10 April 2025) <https://dcj.nsw.gov.au/dcj/news-and-media/media-releases/2025/nsw-online-registry-website-data-breach.html>.
-
Ibid.
-
Charter of Human Rights and Responsibilities Act 2006 (Vic) s 13. The Victorian right to privacy was modelled on the right to privacy in the International Covenant on Civil and Political Rights (ICCPR). Victorian courts have held that there is no difference between the scope of Victoria’s right to privacy and the ICCPR right to privacy. This means that international interpretations of privacy can be applied to the Victorian context. See eg, Kracke v Mental Health Review Board [2009] VCAT 646; (2009) 29 VAR 1, [591].
-
Explanatory Memorandum, Charter of Human Rights and Responsibilities Bill 2006 (Vic) cl 13.
-
Judicial College of Victoria, Charter of Human Rights Bench Book (Online Manual, 1 September 2017) ‘6.7.2 Scope of the right’ [4] n 651 <https://resources.judicialcollege.vic.edu.au/article/1049904> (27 June 2025) referencing Kracke v Mental Health Review Board (General) [2009] VCAT 646; (2009) 29 VAR 1, [619]–[620].
-
Office of the United Nations High Commissioner for Human Rights, The Right to Privacy in the Digital Age: Report of the Office of the United Nations High Commissioner for Human Rights, UN Doc A/HRC/51/17 (4 August 2022) <https://documents.un.org/doc/undoc/gen/g22/442/29/pdf/g2244229.pdf>.
-
Joseph A. Cannataci, Special Rapporteur, Artificial Intelligence and Privacy, and Children’s Privacy: Report of the Special Rapporteur on the Right to Privacy, Joseph A. Cannataci, UN Doc A/HRC/46/37 (25 January 2021) 3 [18] <https://documents.un.org/doc/undoc/gen/g21/015/65/pdf/g2101565.pdf>.
-
Ibid.
-
Charter of Human Rights and Responsibilities Act 2006 (Vic) s 7(2).
-
Ibid s 13. The UN Human Rights Committee has also accepted that where an interference with privacy is provided for by law, it is not unlawful. Eg, Human Rights Committee, Views 488/1992, 50th sess, UN Doc CCPR/C/50/D/488/1992 (31 March 1994) <https://juris.ohchr.org/casedetails/702/en-US> (‘Toonen v Australia’).
-
Judicial College of Victoria, Charter of Human Rights Bench Book (Online Manual, 1 September 2017) ‘6.7.2 Scope of the right’ [35], [38] <https://resources.judicialcollege.vic.edu.au/article/1049904> (27 June 2025).
-
Victorian Equal Opportunity & Human Rights Commission, The Charter of Human Rights and Responsibilities: A Guide for Victorian Public Sector Workers (3rd Edn) (Report, Victorian Equal Opportunity & Human Rights Commission, January 2024) 16 <https://www.humanrights.vic.gov.au/resources/https-resources-charter-guide-for-vps-2024/>.
-
Office of the Victorian Information Commissioner (OVIC), Guidelines to the Information Privacy Principles (IPP Guidelines) (Report, 14 November 2019) <https://ovic.vic.gov.au/privacy/resources-for-organisations/guidelines-to-the-information-privacy-principles/>.
-
Privacy and Data Protection Act 2014 (Vic) pt 4.
-
Ibid s 3.
-
Office of the Victorian Information Commissioner (OVIC), Use of Enterprise Generative AI Tools in the Victorian Public Sector (Report, March 2025) <https://ovic.vic.gov.au/privacy/resources-for-organisations/use-of-enterprise-generative-ai-tools-in-the-victorian-public-sector/>.
-
Ibid 2–3.
-
Office of the Victorian Information Commissioner (OVIC), Use of Personal Information with Publicly Available Generative AI Tools in the Victorian Public Sector (Report, March 2025) <https://ovic.vic.gov.au/privacy/resources-for-organisations/use-of-personal-information-with-publicly-available-generative-ai-tools-in-the-victorian-public-sector/>.
-
Ibid 2.
-
‘Public Consultation on Artificial Intelligence Privacy Guidance’, Office of the Victorian Information Commissioner (OVIC) (Web Page, August 2024) <https://ovic.vic.gov.au/privacy/resources-for-organisations/public-consultation-on-artificial-intelligence-privacy-guidance/> as at June 2025. The Office of the Victorian Information Commissioner conducted consultations to update its online privacy guidance resource; Office of the Victorian Information Commissioner (OVIC), Artificial Intelligence – Understanding Privacy Obligations (Report, April 2021) <https://ovic.vic.gov.au/privacy/resources-for-organisations/artificial-intelligence-understanding-privacy-obligations/>.
-
Privacy and Data Protection Act 2014 (Vic) pt 4.
-
Charter of Human Rights and Responsibilities Act 2006 (Vic) ss 4, 38.
-
Matsoukatidou v Yarra Ranges Council [2017] VSC 61; (2017) 51 VR 624, [32]–[46].
-
Privacy and Data Protection Act 2014 (Vic) s 10.
-
Submission 5 (Office of the Victorian Information Commissioner).
-
Ibid; Privacy and Data Protection Act 2014 (Vic) pt 4.
-
See, for example, ‘PRIVACY: Privacy Statement for the Supreme Court of Victoria Website’, Supreme Court of Victoria (Web Page) <http://www.supremecourt.vic.gov.au/privacy>; ‘County Court of Victoria Privacy Statement’, County Court of Victoria (Web Page, 16 August 2019) <https://www.countycourt.vic.gov.au/privacy-statement>; Submission 26 (Supreme Court of Victoria).
-
Submission 10 (Castan Centre for Human Rights Law, Monash University).
-
Submissions 5 (Office of the Victorian Information Commissioner), 15 (Human Rights Law Centre).
-
Submissions 7 (Dr Natalia Antolak-Saper), 27 (Federation of Community Legal Centres and Justice Connect).
-
Submissions 15 (Human Rights Law Centre), 27 (Federation of Community Legal Centres and Justice Connect).
-
Consultation 17 (Digital Rights Watch).
-
Submission 5 (Office of the Victorian Information Commissioner); Consultation 14 (Office of the Victorian Information Commissioner).
-
Privacy and Data Protection Act 2014 (Vic) s 15.
-
Submission 5 (Office of the Victorian Information Commissioner).
-
Ibid; Privacy and Data Protection Act 2014 (Vic) s 10.
-
Submission 5 (Office of the Victorian Information Commissioner); Privacy Act 1988 (Cth); Privacy and Personal Information Protection Act 1998 (NSW) s 6.
-
Submission 5 (Office of the Victorian Information Commissioner).
-
Note, VCAT already provides reporting to the Office of the Victorian Information Commissioner under Part 4 of the Victorian Protective Data Security Standards regime, as a ‘special body’ under the Public Administration Act 2004 (Vic): Privacy and Data Protection Act 2014 (Vic) s 84(1)(b); Public Administration Act 2004 (Vic) s 6(1)(h).
-
Supplementary Submission 29 (Court Services Victoria), with which the Supreme Court agreed.
-
Ibid.
-
Ibid.
-
Ibid.
-
Ibid.
-
Consultation 32 (Supreme Court of Victoria); Supplementary Submission 29 (Court Services Victoria).
-
Supplementary Submission 29 (Court Services Victoria); Consultation 32 (Supreme Court of Victoria).
-
Supplementary Submission 29 (Court Services Victoria); Information Privacy Act 2009 (Qld) sch 2 pt 2; Privacy and Personal Information Protection Act 1998 (NSW) s 6(1); Information Act 2002 (NT) ss 5(5)(a) and (b).
-
The Office of the Victorian Information Commissioner interprets ‘tribunals with quasi-judicial functions’ to mean bodies that have an inquiry function concerning facts and law, apply the law and make determinations affecting the obligations and rights of involved parties: Office of the Victorian Information Commissioner (OVIC), Overview: Information Privacy Principles (Guidelines, 3 December 2020) 10 [O.40] n 9 <https://ovic.vic.gov.au/book/overview-2/> citing Harrison v Victoria Building Authority [2015] VCAT 1791, [23].
-
Integrity and Oversight Committee, Parliament of Victoria, Performance of the Victorian Integrity Agencies 2022/23 (Report, May 2025) 57.
-
Supplementary Submission 29 (Court Services Victoria); Health Records Act 2001 (Vic) s 14. CSV states that section 14 contains a carve-out in relation to courts and tribunals in functionally identical terms to section 10 of the PDP Act.
-
Open Courts Act 2013 (Vic) s 1(aa).
-
Emma Cunliffe, ‘Open Justice: Concepts and Judicial Approaches’ (2012) 40 Federal Law Review 385, 388–389.
-
Ibid 389.
-
‘The Courts and Your Privacy’, Federal Circuit and Family Court of Australia (Web Page) <https://www.fcfcoa.gov.au/pubs/court-privacy>.
-
Australian Institute of Judicial Administration (AIJA), Guide to Judicial Conduct, Third Edition (Revised) (Guide, December 2023) 6 <https://aija.org.au/wp-content/uploads/2024/04/Judicial-Conduct-guide_revised-Dec-2023-formatting-edits-applied.pdf>.
-
Ibid 7.
-
Parliament of Victoria, Parliament and the Courts: Separation of Powers Summary Notes (Report, 2 March 2023) 12 <https://www.parliament.vic.gov.au/49a853/globalassets/sections-shared/teach-and-learn/resource-pages/separation-of-powers-parliament-and-the-courts/summary-notes—separation-of-powers.pdf>.
-
Ibid.
-
In Victoria, the separation of powers is reflected in the Constitution Act 1975 (Vic).
-
Consultation 14 (Office of the Victorian Information Commissioner).
-
Supplementary Submission 29 (Court Services Victoria).
-
Kline v Official Secretary to the Governor General [2013] HCA 52; (2013) 304 CLR 116, [13], [41], [47].
-
Ibid [41], [71]-[77].
-
Ibid [34], [51].
-
Information Privacy Act 2009 (Qld) sch 2 pt 2; Personal Information Protection Act 2004 (Tas) ss 7(a)-(b), (g); Information Act 2002 (NT) s 5(5); Privacy and Personal Information Protection Act 1998 (NSW) s 6.
-
NZ v Attorney General’s Department [2005] NSWADT 103, [18]. On appeal, Justice Bell said ‘There is no question of the PPIP Act applying to a court or the holder of an office relating to a court exercising the court’s judicial functions. Once the actions of the registry staff were found to relate to the judicial functions of the court within the meaning of the PPIP Act, that was an end to the matter’: Budd v Director, Attorney Generals Department [2006] NSWSC 1267, [20].
-
Consultation 14 (Office of the Victorian Information Commissioner).
-
See, for example, the High Court’s Australian Privacy Principles Privacy Policy in relation to administrative matters: High Court of Australia, High Court of Australia Australian Privacy Principles (‘APP’) Privacy Policy (Report, 24 July 2025) <https://www.hcourt.gov.au/sites/default/files/resources/2025-07/High%20Court%20of%20Australia%20-%20Privacy%20Policy%20July%202025.pdf>. Noting that the Privacy Act 1988 (Cth) only applies to acts done and practices engaged in by federal courts in respect of a matter of an administrative nature: at s 7(1)(b).
-
‘Data Breach Response Plan’, Federal Court of Australia (Web Page, 31 July 2020) <https://www.fedcourt.gov.au/privacy/data-breach-response-plan>.
-
‘Privacy Policy’, Federal Circuit and Family Court of Australia (Web Page, May 2024) <https://www.fcfcoa.gov.au/policies-and-procedures/privacy>.
-
‘The Role of the Privacy Officer’, Office of the Victorian Information Commissioner (OVIC) (Web Page) <https://ovic.vic.gov.au/privacy/resources-for-organisations/privacy-officer-toolkit/the-role-of-the-privacy-officer/>.
-
Judicial Data Protection Panel, Judicial Data Processing Complaints Handling Policy (Report, Courts and Tribunals Judiciary (UK), 4 June 2023) 2.
-
Ibid 3.
-
Ibid 6.
-
These measures draw on a report by the Special Rapporteur. See Joseph A. Cannataci, Special Rapporteur, Artificial Intelligence and Privacy, and Children’s Privacy: Report of the Special Rapporteur on the Right to Privacy, Joseph A. Cannataci, UN Doc A/HRC/46/37 (25 January 2021) <https://documents.un.org/doc/undoc/gen/g21/015/65/pdf/g2101565.pdf>.
-
Integrity and Oversight Committee, Parliament of Victoria, Performance of the Victorian Integrity Agencies 2022/23 (Report, May 2025) 63.
-
Ibid 61–64.
-
Office of the Victorian Information Commissioner (OVIC), 2023-24 Annual Report: Changes (Report) 33.
-
For information on privacy by design see Office of the Victorian Information Commissioner (OVIC), Privacy by Design (Guidance No D21/24515, January 2022) <https://ovic.vic.gov.au/privacy/resources-for-organisations/privacy-by-design/>.
-
Paul W Grimm, Maura R Grossman and Gordon V Cormack, ‘Artificial Intelligence as Evidence’ (2021) 19(1) Northwestern Journal of Technology and Intellectual Property 9, 60–65.
-
Riana Pfefferkorn, ‘“Deepfakes” in the Courtroom’ (2020) 29 Public Interest Law Journal 245, 250–251.
-
Paul W Grimm, Maura R Grossman and Gordon V Cormack, ‘Artificial Intelligence as Evidence’ (2021) 19(1) Northwestern Journal of Technology and Intellectual Property 9, 42–48.
-
Consultation 20 (AI for Law Enforcement and Community Safety Lab).
-
Evidence Act 2008 (Vic).
-
In 1995, the Commonwealth and NSW each enacted an Evidence Act: Evidence Act 1995 (Cth); Evidence Act 1995 (NSW). Since then, Tasmania, Victoria, the ACT, and the Northern Territory have also joined the scheme, with varying degrees of uniformity with the original Commonwealth and NSW legislation: Evidence Act 2001 (Tas); Evidence Act 2008 (Vic); Evidence Act 2011 (ACT); Evidence (National Uniform Legislation) Act 2011 (NT). The Uniform Evidence Law (UEL) aims to create a harmonised framework for the admissibility of evidence in legal proceedings across Australia, based on the Commonwealth Act. South Australia and Queensland continue to use the common law and their respective jurisdiction’s Evidence Acts.
-
Evidence Act 2025 (WA) s 3(4) note.
-
Evidence Act 2008 (Vic) s 4. The Act applies to VCAT only to the extent that VCAT adopts the rules of evidence: Victorian Civil and Administrative Tribunal Act 1998 (Vic) s 98(1)(b).
-
Supreme Court (Chapter 1 Expert Witness Code Amendment) Rules 2016 (Vic); County Court Civil Procedure Rules 2018 (Vic) Form 44A; Magistrates’ Court General Civil Procedure Rules 2020 (Vic) Form 44A.
-
Supreme Court (Chapter 1 Expert Witness Code Amendment) Rules 2016 (Vic).
-
Supreme Court of Victoria, SC CR 3 – Expert Evidence in Criminal Trials (Practice Note, 1 June 2025) <https://www.supremecourt.vic.gov.au/areas/legal-resources/practice-notes/sc-cr-3-expert-evidence-in-criminal-trials>; County Court of Victoria, Practice Note: Expert Evidence in Criminal Trials (Practice Note No PNCR 1-2025, June 2025).
-
Supreme Court of Victoria, SC CR 3 – Expert Evidence in Criminal Trials (Practice Note, 1 June 2025) para 13 <https://www.supremecourt.vic.gov.au/areas/legal-resources/practice-notes/sc-cr-3-expert-evidence-in-criminal-trials>.
-
Ibid para 7.4.
-
Ibid para 6.3.
-
Ibid para 7A.
-
Victorian Civil and Administrative Tribunal (VCAT), Practice Note – PNVCAT2 – Expert Evidence (Practice Note, 8 December 2022) <https://www.vcat.vic.gov.au/documents/practice-notes/practice-note-pnvcat2-expert-evidence>.
-
Coroners Court of Victoria, Coroners Court of Victoria Code of Conduct for Expert Witnesses (Report).
-
Consultations 19 (Professor Ian Freckelton AO KC), 32 (Supreme Court of Victoria).
-
Consultation 19 (Professor Ian Freckelton AO KC).
-
Consultation 6 (Office of Public Prosecutions).
-
Ibid 4.
-
‘Deepfake Trends and Challenges – Position Statement’, eSafety Commissioner (Web Page, 1 September 2024) <https://www.esafety.gov.au/industry/tech-trends-and-challenges/deepfakes>.
-
Australian Human Rights Commission, Adopting AI in Australia (Submission No. 71 to Senate Select Committee on Adopting Artificial Intelligence (AI), 15 May 2024) 8 <https://humanrights.gov.au/our-work/legal/submission/adopting-ai-australia>; World Economic Forum, The Global Risks Report 2024: Insight Report (World Economic Forum, 19th ed, 2024) 18.
-
Venessa Ninovic, ‘Deepfake Crime: Trends, Threats and Implications’ (2024) 1(2) International Journal of Contemporary Intelligence Issues 41, 42.
-
Chief Justice Bell, ‘Change at the Bar and the Great Challenge of Gen AI’ (Speech, Address to the Australian Bar Association, Sydney, 29 August 2025) 10 <https://inbrief.nswbar.asn.au/posts/13dbc1d59f076b32283b003eb800f0de/attachment/BellCJ-ABA-20250829.pdf>.
-
Explanatory Memorandum, Justice Legislation Amendment (Sexual Offences and Other Matters) Bill 2022 (Vic) cl 22.
-
Crimes Amendment (Deepfake Sexual Material) Bill 2025 (NSW); Crimes Amendment (Intimate Image and Audio Material) Bill 2025 (NSW); Victims Rights and Victims of Crime Commissioner Bill 2025 (NSW).
-
Criminal Code Amendment (Deepfake Sexual Material) Act 2024 (Cth).
-
Sexual Offences Act 2003 (UK) ss 66A, 66B introduced offences for the sending and sharing of intimate images.
-
Consultation 16 (Maria Dimopoulos AM and Eva Hussain). Family violence is defined in the Family Violence Protection Act 2008 (Vic) s 5 to include a wide range of abusive, threatening and coercive behaviours by someone towards a family member, or acts that cause a child to be exposed to such behaviours.
-
Venessa Ninovic, ‘Deepfake Crime: Trends, Threats and Implications’ (2024) 1(2) International Journal of Contemporary Intelligence Issues 41, 48.
-
Consultation 16 (Maria Dimopoulos AM and Eva Hussain).
-
Venessa Ninovic, ‘Deepfake Crime: Trends, Threats and Implications’ (2024) 1(2) International Journal of Contemporary Intelligence Issues 41, 43.
-
Allie Umoff and Stephanie Lo, ‘Artificial Intelligence, Real Problems: Evidence in the Age of AI’ [2025] The Bulletin <https://bulletin.lawsocietysa.asn.au/Bulletin/Bulletin/Content/Articles/2025/June/
-
Consultation 6 (Office of Public Prosecutions).
-
Consultations 6 (Office of Public Prosecutions), 19 (Professor Ian Freckelton AO KC).
-
Ibid.
-
Submission 10 (Castan Centre for Human Rights Law, Monash University).
-
Matsoukatidou v Yarra Ranges Council [2017] VSC 61; (2017) 51 VR 624, [186].
-
Judicial College of Victoria, Charter of Human Rights Bench Book (Online Manual, 1 September 2017) ‘6.18.2 Scope of the right’ [70] <https://resources.judicialcollege.vic.edu.au/article/1049904> (4 January 2023).
-
Consultation 20 (AI for Law Enforcement and Community Safety Lab).
-
Ibid.
-
Ibid.
-
‘AI watermarking’ is defined in the Australian Government’s AI Technical Standard as ‘Information embedded into digital content, either perceptibly or imperceptibly by humans, that can serve a variety of purposes, such as establishing digital content provenance or informing stakeholders that the contents are AI-generated or significantly modified.’ Digital Transformation Agency (Cth), Australian Government’s AI Technical Standard (Version 1, July 2025) 3 <https://www.digital.gov.au/policy/ai/AI-technical-standard>; See also, Department of Industry, Science and Resources (Cth), Safe and Responsible AI in Australia Consultation: Australian Government’s Interim Response (Report, 2024) 20.
-
Robert Chesney and Danielle K Citron, ‘Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security’ (2019) 107 California Law Review 1753, 1785 <https://www.californialawreview.org/print/deep-fakes-a-looming-challenge-for-privacy-democracy-and-national-security>.
-
Riana Pfefferkorn, ‘“Deepfakes” in the Courtroom’ (2020) 29 Public Interest Law Journal 245, 267.
-
Higgins v Commissioner of Police, NSW Police Force [2024] NSWCATAD 175, [23].
-
R v Gallerno [2025] ONSC 236, [13].
-
Consultation 20 (AI for Law Enforcement and Community Safety Lab).
-
Yvonne Apolo and Katina Michael, ‘Beyond A Reasonable Doubt? Audiovisual Evidence, AI Manipulation, Deepfakes, and the Law’ (2024) 5(2) IEEE Transactions on Technology and Society 156, 163 <https://ieeexplore.ieee.org/document/10632877/?arnumber=10632877>; Riana Pfefferkorn, ‘“Deepfakes” in the Courtroom’ (2020) 29 Public Interest Law Journal 245, 271–3.
-
Submission 22 (Centre for the Future of the Legal Profession and UNSW Law and Justice).
-
Ibid.
-
See, for example, Riana Pfefferkorn, ‘“Deepfakes” in the Courtroom’ (2020) 29 Public Interest Law Journal 245; Paul W Grimm, Maura R Grossman and Gordon V Cormack, ‘Artificial Intelligence as Evidence’ (2021) 19(1) Northwestern Journal of Technology and Intellectual Property 9; Advisory Committee on Evidence Rules, Advisory Committee on Evidence Rules Agenda Book (Report, Judicial Conference of the United States, 27 October 2023) <https://www.uscourts.gov/sites/default/files/2023-10_evidence_rules_agenda_book_final_10-5.pdf>.
-
This is a sub-committee of the Standing Committee on Rules of Practice and Procedure of the Judicial Conference of the United States.
-
Advisory Committee on Evidence Rules, Advisory Committee on Evidence Rules Agenda Book (Report, Judicial Conference of the United States, 27 October 2023) 85 <https://www.uscourts.gov/sites/default/files/2023-10_evidence_rules_agenda_book_final_10-5.pdf>.
-
Ibid.
-
Advisory Committee on Evidence Rules, Advisory Committee on Evidence Rules Agenda Book (Report, Judicial Conference of the United States, 19 April 2024) 23 <https://www.uscourts.gov/sites/default/files/2024-04_agenda_book_for_evidence_rules_meeting_final_updated_5-8-2024.pdf>.
-
Judicial Conference of the United States, Committee on Rules of Practice and Procedure – Agenda Papers June 2025 (Report, 10 June 2025) <https://www.uscourts.gov/sites/default/files/document/2025-06-standing-agenda-book.pdf.pdf>.
-
Ibid.
-
Ibid 60.
-
Ibid 62.
-
Submission 23 (Victorian Bar Association); Consultations 5 (Victorian Bar Association), 6 (Office of Public Prosecutions).
-
Evidence Act 2008 (Vic) s 50.
-
Consultation 19 (Professor Ian Freckelton AO KC).
-
For further discussion of agentic AI, see Chapter 3 on risks and opportunities of use.
-
Consultations 19 (Professor Ian Freckelton AO KC), 27 (UNSW’s Centre for the Future of the Legal Profession and Professor Lyria Bennett Moses); Yvonne Apolo and Katina Michael, ‘Beyond A Reasonable Doubt? Audiovisual Evidence, AI Manipulation, Deepfakes, and the Law’ (2024) 5(2) IEEE Transactions on Technology and Society 156, 162 <https://ieeexplore.ieee.org/document/10632877/?arnumber=10632877>.
-
Australian Law Reform Commission, NSW Law Reform Commission and Victorian Law Reform Commission, Uniform Evidence Law (Final Report No 102, December 2005) [6.15]-[6.16].
-
Ibid [6.15]-[6.42].
-
Ministry of Justice (UK), ‘Use of Evidence Generated by Software in Criminal Proceedings: Call for Evidence’, GOV.UK (Web Page, 21 January 2025) <https://www.gov.uk/government/calls-for-evidence/use-of-evidence-generated-by-software-in-criminal-proceedings/use-of-evidence-generated-by-software-in-criminal-proceedings-call-for-evidence>.
-
Ibid.
-
Evidence Act 2008 (Vic) ss 146–7.
-
Consultation 19 (Professor Ian Freckelton AO KC); Colin Stevenson (a pseudonym) v The Queen [2020] VSCA 27; (2020) 61 VR 624.
-
Gujic v Arterbury [2024] FedCFamC1A 48.
-
Quoc V. Le and Mike Schuster, ‘A Neural Network for Machine Translation, at Production Scale’, Google Research (Web Page, 27 September 2016) <https://research.google/blog/a-neural-network-for-machine-translation-at-production-scale/>.
-
Gujic v Arterbury [2024] FedCFamC1A 48, [52].
-
Ibid [51]-[56].
-
Prue McDonald and Hawa Q Mohammad, ‘Google Translate and the Evidence Act 1995 (Cth): Gujic v Arterbury [2024] FedCFamC1A 48’ (2024) 98(11) Australian Law Journal 919, 920.
-
Prue McDonald and Hawa Q Mohammad, ‘Google Translate and the Evidence Act 1995 (Cth): Gujic v Arterbury [2024] FedCFamC1A 48’ (2024) 98(11) Australian Law Journal 919. This could then come under the judicial notice provision of s 144 of the Evidence Act 2008 (Vic).
-
Director of Public Prosecutions (ACT) v Khan [2024] ACTSC 19, [43].
-
Evidence Act 2008 (Vic) Dictionary pt 1 (definition of ‘probative value’). Probative value of evidence means the extent to which the evidence could rationally affect the assessment of the probability of the existence of a fact in issue.
-
Consultations 19 (Professor Ian Freckelton AO KC), 32 (Supreme Court of Victoria).
-
Consultation 6 (Office of Public Prosecutions). In IMM v The Queen [2016] HCA 14; (2016) 257 CLR 300, [52] it was held that issues of reliability were not relevant to the assessment of probative value for the purposes of s 137.
-
For example, in Trivago NV v Australian Competition and Consumer Commission there was extensive expert evidence from computer science experts regarding the algorithm and underlying content and computation of data: Trivago NV v Australian Competition and Consumer Commission [2020] FCAFC 185; (2020) 384 ALR 496, [65].
-
State of Wisconsin v Loomis 371 Wis.2d 235 (2016). Notably, in this case the Wisconsin Supreme Court held that the trial judge’s use of COMPAS – a risk assessment algorithm used in sentencing – did not violate the defendant’s right to due process, even though the methodology was not able to be disclosed to the court, nor the defendant.
-
Margaret Satterthwaite, Special Rapporteur, AI in Judicial Systems: Promises and Pitfalls: Report of the Special Rapporteur on the Independence of Judges and Lawyers, Margaret Satterthwaite, UN Doc A/80/169 (16 July 2025) 20 <https://docs.un.org/en/A/80/169>.
-
Paul W Grimm, Maura R Grossman and Gordon V Cormack, ‘Artificial Intelligence as Evidence’ (2021) 19(1) Northwestern Journal of Technology and Intellectual Property 9, 63.
-
Consultation 19 (Professor Ian Freckelton AO KC).
-
Honeysett v The Queen [2014] HCA 29; IMM v The Queen [2016] HCA 14; (2016) 257 CLR 300; Lang v The Queen [2023] HCA 29; (2023) 278 CLR 323; R v Tang [2006] NSWCCA 357; (2006) 65 NSWLR 681; Lundy v The Queen [2014] UKPC 28; (2014) 2 NZLR 273; Lundy v The Queen [2018] NZCA 410; Thomson Reuters, Expert Evidence (online at 28 September 2025) ‘Reliability of evidence’ [12.05.30].
-
Consultation 19 (Professor Ian Freckelton AO KC); Confidential consultation.
-
Margaret Satterthwaite, Special Rapporteur, AI in Judicial Systems: Promises and Pitfalls: Report of the Special Rapporteur on the Independence of Judges and Lawyers, Margaret Satterthwaite, UN Doc A/80/169 (16 July 2025) 20 <https://docs.un.org/en/A/80/169>.
-
Daubert v Merrell Dow Pharmaceuticals Inc, 509 US 579 (1993).
-
Ibid 593–595.
-
Courts and Tribunals Judiciary (UK), Criminal Practice Directions 2023 (Report, July 2024) CrimPRC(23)90(b), PD 7 <https://www.judiciary.uk/wp-content/uploads/2025/03/Criminal-Practice-Directions-2023-as-amended-July-2024-260325.pdf>.
-
Ibid PD 7.1.2.
-
Supreme Court of Victoria, SC CR 3 – Expert Evidence in Criminal Trials (Practice Note, 1 June 2025) <https://www.supremecourt.vic.gov.au/areas/legal-resources/practice-notes/sc-cr-3-expert-evidence-in-criminal-trials>.
-
Ibid para 7.4.
-
Ibid para 6.3.
-
Ibid paras 7A.3-7A.4.
-
Gary Edmond, ‘Regulating Forensic Science and Medicine Evidence at Trial: It’s Time for a Wall, a Gate and Some Gatekeeping’ (2020) 94 Australian Law Journal 427; Chris Maxwell, ‘Preventing Miscarriages of Justice: The Reliability of Forensic Evidence and the Role of the Trial Judge as Gatekeeper’ (2019) 93(8) Australian Law Journal 642; Alastair Ross, ‘The Reliability and Validity of Expert Evidence: Law, Science and Medicine in Summit. The Rapporteur’s View.’ (2020) 52(3) Australian Journal of Forensic Sciences 246, 248 <https://doi.org/10.1080/00450618.2019.1711183>.
-
Consultation 19 (Professor Ian Freckelton AO KC).
-
Evidence Act 2008 (Vic) s 76.
-
Ibid s 79.
-
My Fashion Republic Pty Ltd t/as Cosette v Pennisi [2024] NSWCATAP 187.
-
Ibid [81].
-
Ibid.
-
Judicial Conference of the United States, Committee on Rules of Practice and Procedure – Agenda Papers June 2025 (Report, 10 June 2025) 75 <https://www.uscourts.gov/sites/default/files/document/2025-06-standing-agenda-book.pdf.pdf>.
-
Fed. R. Evid. 702(a)-(d).
-
Judicial Conference of the United States, Committee on Rules of Practice and Procedure – Agenda Papers June 2025 (Report, 10 June 2025) 75 <https://www.uscourts.gov/sites/default/files/document/2025-06-standing-agenda-book.pdf.pdf>.
-
Advisory Committee on Evidence Rules, Advisory Committee on Evidence Rule Agenda Book (Report, Judicial Conference of the United States, 2 May 2025) 17 <https://www.uscourts.gov/sites/default/files/document/2025-05_evidence_rules_committee_agenda_book_final.pdf>.
-
Judicial Conference of the United States, Committee on Rules of Practice and Procedure – Agenda Papers June 2025 (Report, 10 June 2025) 97 <https://www.uscourts.gov/sites/default/files/document/2025-06-standing-agenda-book.pdf.pdf>.
-
Unlike federal US law and other US states, New York State has not adopted the Daubert test and continues to rely on the Frye standard when determining the reliability of expert witness evidence. The Frye test relies on general acceptance within the scientific community to determine admissibility. The Frye test has been applied by some Australian courts. See Frye v United States 293 Fed. 1013 (1923); LexisNexis, Bender’s New York Evidence (online at 25 July 2025) ‘Daubert Standard’ § 140.01.
-
An Act to Amend the Criminal Procedure Law and the Civil Practice Law and Rules, in Relation to the Admissibility of Evidence Created or Processed by Artificial Intelligence, N.Y. Legis. Assemb. A01338. Reg. Sess. 2025-26 (2025).
-
Memorandum in Support of Legislation, N.Y. Legis. Assemb. A01338. Reg. Sess. 2025-26 (2025), ‘Justification’.
-
Paul W Grimm, Maura R Grossman and Gordon V Cormack, ‘Artificial Intelligence as Evidence’ (2021) 19(1) Northwestern Journal of Technology and Intellectual Property 9, 63.
-
An Act to Amend the Criminal Procedure Law and the Civil Practice Law and Rules, in Relation to the Admissibility of Evidence Created or Processed by Artificial Intelligence, N.Y. Legis. Assemb. A01338. Reg. Sess. 2025-26 (2025), s 1.3.
-
Ibid s 1.1. The Bill also contained the same proposed amendment to the civil practice law.
-
Ibid s 1.4.
-
Ibid s 1.2.
-
Memorandum in Support of Legislation, N.Y. Legis. Assemb. A01338. Reg. Sess. 2025-26 (2025), ‘Summary of specific provisions’ s 60.80(7)(a).
-
An Act to Amend the Criminal Procedure Law and the Civil Practice Law and Rules, in Relation to the Admissibility of Evidence Created or Processed by Artificial Intelligence, N.Y. Legis. Assemb. A01338. Reg. Sess. 2025-26 (2025), s 1.8.
-
State of Wisconsin v Loomis 371 Wis.2d 235 (2016).
-
Paul W Grimm, Maura R Grossman and Gordon V Cormack, ‘Artificial Intelligence as Evidence’ (2021) 19(1) Northwestern Journal of Technology and Intellectual Property 9, 63–65.
-
Civil Procedure Act 2010 (Vic) s 65M.
-
Ibid s 65L.
-
Ibid s 65K; See also Criminal Procedure Act 2009 (Vic) s 232A.
-
Trivago NV v Australian Competition and Consumer Commission [2020] FCAFC 185; (2020) 384 ALR 496.
-
Ibid [65]-[69].
-
Ibid [69].
-
Henry Fraser, Rhyle Simcock and Aaron J Snoswell, ‘AI Opacity and Explainability in Tort Litigation’ (Conference Paper, FAccT ’22: 2022 ACM Conference on Fairness, Accountability, and Transparency, 21-24 June 2022) 187-188 [3.4] <https://dl.acm.org/doi/10.1145/3531146.3533084>.
-
Submission 26 (Supreme Court of Victoria); Civil Procedure Act 2010 (Vic) s 65I; Supreme Court (General Civil Procedure) Rules 2025 (Vic) ord 44.06; Victorian Civil and Administrative Tribunal Act 1998 (Vic) sch 3.
-
Submission 26 (Supreme Court of Victoria); Supreme Court (General Civil Procedure) Rules 2025 (Vic) ord 50.01; Victorian Civil and Administrative Tribunal Act 1998 (Vic) s 95.
-
Submission 26 (Supreme Court of Victoria); Supreme Court Act 1986 (Vic) s 77.
-
Consultation 20 (AI for Law Enforcement and Community Safety Lab).
-
Advisory Committee on Evidence Rules, Advisory Committee on Evidence Rule Agenda Book (Report, Judicial Conference of the United States, 2 May 2025) 17 <https://www.uscourts.gov/sites/default/files/document/2025-05_evidence_rules_committee_agenda_book_final.pdf>.
-
Judicial College of Victoria, A Practical Guide to Weighing Evidence (Guide, 2024) <https://judicialcollege.vic.edu.au/resources/evidence-essentials-practical-guide-weighing-evidence>.
-
Confidential consultation.
-
‘Science and the Law’, The Royal Society (Web Page, 2025) <https://royalsociety.org/about-us/what-we-do/science-and-law/>.
-
National Center for State Courts (NCSC) and Thomson Reuters Institute, Bench Card: Evaluating Acknowledged AI-Generated Evidence (Guidance, 17 April 2025) <https://nationalcenterforstatecourts.app.box.com/s/s5n5x046hfiv7n16habigg553mpman5>. Acknowledged AI evidence is disclosed and often used to enhance understanding, such as creating 3D models or enhancing audio.
-
National Center for State Courts (NCSC) and Thomson Reuters Institute, Bench Card: Evaluating Unacknowledged AI-Generated Evidence (Guidance, 17 April 2025) <https://nationalcenterforstatecourts.app.box.com/s/bz0sb4x8wjnnjp34gworvdzpw0qnlpr>. Unacknowledged AI evidence involves potential manipulation or fabrication without disclosure.
-
Consultation 19 (Professor Ian Freckelton AO KC).
-
Supreme Court of Victoria, SC CR 3 – Expert Evidence in Criminal Trials (Practice Note, 1 June 2025) para 1.2 <https://www.supremecourt.vic.gov.au/areas/legal-resources/practice-notes/sc-cr-3-expert-evidence-in-criminal-trials>.
-
Yee-Fui Ng and Maria O’Sullivan, ‘Deliberation and Automation – When Is a Decision a “Decision”?’ (2019) 26 Australian Journal of Administrative Law 21, 22; Kimberlee Weatherall et al, Automated Decision-Making in New South Wales: Mapping and Analysis of the Use of ADM Systems by State and Local Governments (Research Report, ARC Centre of Excellence on Automated Decision-Making and Society (ADM+S), 8 March 2024) 111 <https://apo.org.au/node/325901>.
-
Judicial review is the process by which a court decides whether a decision made by a government department is lawful. The court considers whether the decision was legal, whether the government had the power to make the decision, and whether the decision was made without bias and took all relevant information into account. If a court determines a decision was unlawful, the decision may be set aside and the government must re-make it according to law.
-
Administrative Decisions (Judicial Review) Act 1977 (Cth) s 3.
-
Acts Interpretation Act 1901 (Cth) s 25D.
-
Administrative Appeals Tribunal Act 1975 (Cth) s 3.
-
Administrative Law Act 1978 (Vic) ss 2, 3.
-
Victorian Civil and Administrative Tribunal Act 1998 (Vic) ss 4, 48.
-
Pintarich v Federal Commissioner of Taxation [2018] FCAFC 79; (2018) 262 FCR 41.
-
Ibid [140].
-
Ibid [49].
-
Ibid [47].
-
Sarah Sacher and Edward Santow, Use of Automated Decision-Making by Government (Submission No 1071994347 to the Attorney-General’s Department Consultation on Automated Decision Making Reform, Human Technology Institute, 28 January 2025) 15 <https://www.uts.edu.au/globalassets/sites/default/files/2025-01/HTI-submission-Use-of-ADM-by-Government.pdf>.
-
Administrative Decisions (Judicial Review) Act 1977 (Cth) s 13; Administrative Appeals Tribunal Act 1975 (Cth) s 28; Acts Interpretation Act 1901 (Cth) s 25D.
-
Victorian Civil and Administrative Tribunal Act 1998 (Vic) ss 45–46.
-
Administrative Law Act 1978 (Vic) s 8.
-
Sophie Farthing et al, Human Rights and Technology (Final Report, Australian Human Rights Commission, 2021) 66 <https://humanrights.gov.au/our-work/technology-and-human-rights/projects/final-report-human-rights-and-technology>.
-
Samuel White, ‘Authorisation and Accountability of Automated Government Decisions under Australian Administrative Law’ (2021) 102 Australian Institute of Administrative Law Forum 84, 90 <https://www.austlii.edu.au/au/journals/AIAdminLawF/2021/12.pdf>.
-
Sophie Farthing et al, Human Rights and Technology (Final Report, Australian Human Rights Commission, 2021) 106–107 <https://humanrights.gov.au/our-work/technology-and-human-rights/projects/final-report-human-rights-and-technology>.
-
Ibid 105–108.
-
‘What Is Explainable AI (XAI)?’, IBM Think (Web Page, 29 March 2023) <https://www.ibm.com/think/topics/explainable-ai>.
-
Lyria Bennett Moses, ‘Stochastic Judges: The Limits of Large Language Models’ (2024) 98(9) Australian Law Journal 640, 652.
-
Consultation 34 (Human Technology Institute).
-
Ibid.
-
Andrew D Selbst and Solon Barocas, ‘The Intuitive Appeal of Explainable Machines’ (2018) 87 Fordham Law Review 1085, 1096–1099 <https://ir.lawnet.fordham.edu/flr/vol87/iss3/11/>.
-
Submission 22 (Centre for the Future of the Legal Profession and UNSW Law and Justice).
-
Deven R Desai and Mark Riedl, Responsible AI Agents (Georgia Tech Scheller College of Business Research Paper No 5147666 (preprint), 20 February 2025) 7 <https://papers.ssrn.com/abstract=5147666>.
-
Margaret Allars, ‘Automated Decision Making and Review of Administrative Decisions’ (2024) 58(3) Georgia Law Review 1145, 1167 <https://digitalcommons.law.uga.edu/glr/vol58/iss3/8/>.
-
Yee-Fui Ng and Maria O’Sullivan, ‘Deliberation and Automation – When Is a Decision a “Decision”?’ (2019) 26 Australian Journal of Administrative Law 21, 32.
-
Margaret Allars, ‘Automated Decision Making and Review of Administrative Decisions’ (2024) 58(3) Georgia Law Review 1145, 1167 <https://digitalcommons.law.uga.edu/glr/vol58/iss3/8/>; Samuel White, ‘Authorisation and Accountability of Automated Government Decisions under Australian Administrative Law’ (2021) 102 Australian Institute of Administrative Law Forum 84, 91 <https://www.austlii.edu.au/au/journals/AIAdminLawF/2021/12.pdf>.
-
Consultations 28 (Monash University Digital Law Group), 34 (Human Technology Institute).
-
Sarah Sacher and Edward Santow, Use of Automated Decision-Making by Government (Submission No 1071994347 to the Attorney-General’s Department Consultation on Automated Decision Making Reform, Human Technology Institute, 28 January 2025) 15 <https://www.uts.edu.au/globalassets/sites/default/files/2025-01/HTI-submission-Use-of-ADM-by-Government.pdf>.
-
Consultation 34 (Human Technology Institute); Anna Huggins, ‘Automated Processes and Administrative Law: The Case of Pintarich’, AusPubLaw: Australian Public Law (Web Page, 14 November 2018) <https://www.auspublaw.org/blog/2018/11/the-case-of-pintarich>.
-
Anna Huggins, ‘Addressing Disconnection: Automated Decision-Making, Administrative Law and Regulatory Reform’ (2021) 44(3) University of New South Wales Law Journal 1048, 1074 <https://corrigan.austlii.edu.au/au/journals/UNSWLawJl/2021/37.html>.
-
Sophie Farthing et al, Human Rights and Technology (Final Report, Australian Human Rights Commission, 2021) 71 <https://humanrights.gov.au/our-work/technology-and-human-rights/projects/final-report-human-rights-and-technology>.
-
Ibid 62 Rec 6.
-
Ibid.
-
Ibid.
-
Ibid 66.
-
Victorian Civil and Administrative Tribunal Act 1998 (Vic) s 4.
-
Yee-Fui Ng and Maria O’Sullivan, ‘Deliberation and Automation – When Is a Decision a “Decision”?’ (2019) 26 Australian Journal of Administrative Law 21, 31.
-
Samuel White, ‘Authorisation and Accountability of Automated Government Decisions under Australian Administrative Law’ (2021) 102 Australian Institute of Administrative Law Forum 84, 92–93 <https://www.austlii.edu.au/au/journals/AIAdminLawF/2021/12.pdf>. For example, the Social Security (Administration) Act 1999 (Cth) s 6A deems certain computer-generated decisions to be made by the Secretary.
-
Ibid 95.
-
Yee-Fui Ng and Maria O’Sullivan, ‘Deliberation and Automation – When Is a Decision a “Decision”?’ (2019) 26 Australian Journal of Administrative Law 21, 34.
-
Samuel White, ‘Authorisation and Accountability of Automated Government Decisions under Australian Administrative Law’ (2021) 102 Australian Institute of Administrative Law Forum 84, 96 <https://www.austlii.edu.au/au/journals/AIAdminLawF/2021/12.pdf>; Monika Zalnieriute, Lyria Bennett Moses and George Williams, ‘The Rule of Law and Automation of Government Decision-Making’ (2019) 82(3) Modern Law Review 425, 445 <https://onlinelibrary.wiley.com/doi/abs/10.1111/1468-2230.12412>.
-
Samuel White, ‘Authorisation and Accountability of Automated Government Decisions under Australian Administrative Law’ (2021) 102 Australian Institute of Administrative Law Forum 84, 96 <https://www.austlii.edu.au/au/journals/AIAdminLawF/2021/12.pdf>.
-
Applied as a law of Victoria by the Legal Profession Uniform Law Application Act 2014 s 4.
-
Legal Profession Uniform Law Application Act 2014. These rules are supported by the Legal Profession Uniform Law Australian Solicitors’ Conduct Rules 2015 and Legal Profession Uniform Conduct (Barristers) Rules 2015.
-
‘About the Board and Commissioner’, Victorian Legal Services Board and Commissioner (Web Page, 31 July 2025) <https://lsbc.vic.gov.au/about-us/board-and-commissioner/about-board-and-commissioner>.
-
Submission 6 (Victorian Legal Services Board and Commissioner). Consultation 5 (Victorian Bar Association).
-
For an analysis of the concept of technology neutrality in law see Lyria Bennett Moses, ‘Recurring Dilemmas: The Law’s Race to Keep Up With Technological Change’ [2007] University of New South Wales Faculty of Law Research Series 21 <https://www5.austlii.edu.au/au/journals/UNSWLRS/2007/21.html>.
-
The Law Society of NSW, Legal Practice Board of Western Australia, and Victorian Legal Services Board and Commissioner, Statement on the Use of Artificial Intelligence in Australian Legal Practice (Statement, 26 March 2025) <https://lsbc.vic.gov.au/news-updates/news/statement-use-artificial-intelligence-australian-legal-practice>.
-
Legal Profession Uniform Conduct (Barristers) Rules 2015 r 4(a); Legal Profession Uniform Law Australian Solicitors’ Conduct Rules 2015 r 3.
-
Legal Profession Uniform Conduct (Barristers) Rules 2015 r 4(c); Legal Profession Uniform Law Australian Solicitors’ Conduct Rules 2015 r 4.1.3.
-
Legal Profession Uniform Conduct (Barristers) Rules 2015 rr 8(b) and 8(c); Legal Profession Uniform Law Australian Solicitors’ Conduct Rules 2015 r 5.1.2.
-
Legal Profession Uniform Conduct (Barristers) Rules 2015 r 8(c); Legal Profession Uniform Law Australian Solicitors’ Conduct Rules 2015 r 5.1.2.
-
Legal Profession Uniform Conduct (Barristers) Rules 2015 r 24; Legal Profession Uniform Law Australian Solicitors’ Conduct Rules 2015 r 19.1.
-
Legal Profession Uniform Conduct (Barristers) Rules 2015 r 35; Legal Profession Uniform Law Australian Solicitors’ Conduct Rules 2015 r 4.1.
-
Dayal [2024] FedCFamC2F 1166, [8].
-
Ibid [21]; see also Valu v Minister for Immigration and Multicultural Affairs (No 2) [2025] FedCFamC2G 95, [38], in which the lawyer was referred to the Office of the NSW Legal Services Commissioner after providing the court with non-existent AI-generated citations and authorities.
-
‘Statement on the “Mr Dayal” Matter’, Victorian Legal Services Board and Commissioner (Web Page, 2 September 2025) <https://www.lsbc.vic.gov.au/news-updates/news/statement-mr-dayal-matter>.
-
Ibid.
-
Ibid.
-
Some examples include Hussein v Canada (Immigration, Refugees and Citizenship) [2025] FC 1060 (CanLII), [34]–[43]; Ko v Li [2025] ONSC 2766 (CanLII), [2]–[32]; Ayinde v London Borough of Haringey [2025] EWHC 1040, [58]–[72]; Mata v Avianca, Inc, 678 F Supp 3d 443 (2023), [43]; Loyer v Wayne County, Michigan (ED Mich, 21-12589, 21 March 2025) slip op 3, n 2.
-
Consultation 5 (Victorian Bar Association).
-
Submission 16 (Law Institute Victoria) citing The Law Society of NSW, Legal Practice Board of Western Australia, and Victorian Legal Services Board and Commissioner, Statement on the Use of Artificial Intelligence in Australian Legal Practice (Statement, 26 March 2025) <https://lsbc.vic.gov.au/news-updates/news/statement-use-artificial-intelligence-australian-legal-practice>.
-
‘Rule 1.1 Competence – Comment’, American Bar Association (Web Page, 2025) <https://www.americanbar.org/groups/professional_responsibility/publications/model_rules_of_professional_conduct/rule_1_1_competence/comment_on_rule_1_1/>.
-
American Bar Association, Standing Committee on Ethics and Professional Responsibility, Formal Opinion 512: Generative Artificial Intelligence Tools (Report, 29 July 2024) 1 <https://www.americanbar.org/content/dam/aba/administrative/professional_responsibility/ethics-opinions/aba-formal-opinion-512.pdf>.
-
Consultation 35 (Victoria Legal Aid).
-
Submission 6 (Victorian Legal Services Board and Commissioner).
-
Submission 22 (Centre for the Future of the Legal Profession and UNSW Law and Justice). Consultations 12 (County Court of Victoria), 23 (Dr Fabian Horton).
-
Consultation 5 (Victorian Bar Association).
-
Consultation 33 (Law Firms Australia).
-
Submissions 6 (Victorian Legal Services Board and Commissioner), 23 (Victorian Bar Association). Consultation 4 (Victorian Legal Services Board and Commissioner).
-
Consultation 4 (Victorian Legal Services Board and Commissioner).
-
Ibid; this position was also put forward by representatives of Microsoft: Consultation 25 (Microsoft).
-
Submission 23 (Victorian Bar Association).
-
Legal Profession Uniform Law Application Act 2014 ss 9–10.
-
‘Unqualified Legal Practice’, Victorian Legal Services Board and Commissioner (Web Page, 2 August 2022) <https://lsbc.vic.gov.au/consumers/registers-lawyers/unqualified-legal-practice>.
-
Vivi Tan, Jeannie Paterson and Julian Webb, ‘Generative AI in Small Value Consumer Disputes: Reviving Not Resolving Challenges of Design and Governance in Online Dispute Resolution’ (2025) 48(4) University of New South Wales Law Journal (forthcoming) 14 <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5313052>. See generally Gino Dal Pont, ‘Unauthorised Practice of Law’ (2018) 45 Australian Bar Review 224 <https://figshare.utas.edu.au/articles/journal_contribution/Unauthorised_practice_of_law/22973501/1>. See also Mia Bonardi and L Karl Branting, ‘Certifying Legal Assistants for Unrepresented Litigants: A Global Survey of Access to Civil Justice, Unauthorised Practice of Law’ (2025) 26(1) The Columbia Science & Technology Law Review 34 <https://doi.org/10.52214/stlr.v26i1.13336>.
-
Submission 7 (Dr Natalia Antolak-Saper).
-
Ibid.
-
Ibid.
-
Solicitors Regulation Authority (UK), SRA Approves First AI-Driven Law Firm (News Release, 6 May 2025) <https://www.sra.org.uk/sra/news/press/garfield-ai-authorised/>.
-
AI Policy Consortium for Law & Courts, Modernizing Unauthorized Practice of Law Regulations to Embrace AI-Driven Solutions and Improve Access to Justice (White Paper, National Center for State Courts and Thomson Reuters Institute, August 2025) 2 <https://www.ncsc.org/sites/default/files/media/document/AI_UPL_WhitePaper.pdf>.
-
Ibid 5–6.
-
Ibid 5.
-
Attorney-General’s Department (Cth), Copyright and AI Reference Group – Transparency (Discussion Paper, September 2024) <https://www.ag.gov.au/rights-and-protections/publications/copyright-and-ai-transparency-discussion-paper>.
-
The Treasury (Cth), Review of AI and the Australian Consumer Law: Discussion Paper (Discussion Paper, Australian Government, October 2024) <https://treasury.gov.au/sites/default/files/2024-10/c2024-584560-dp.pdf>.
-
Crimes Amendment (Deepfake Sexual Material) Bill 2025 (NSW); Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 (Cth).
-
New York State Bar Association Task Force on Artificial Intelligence, Report and Recommendations of the New York State Bar Association Task Force on Artificial Intelligence (Report, April 2024) 49.
-
Law Commission, Artificial Intelligence and the Law: A Discussion Paper (Report, 31 July 2025) 9 <https://lawcom.gov.uk/publication/artificial-intelligence-and-the-law-a-discussion-paper/>.
-
Ibid 10.
-
Ibid 11–12.
-
Ibid 4.
-
Ibid 9. The Law Commission observes: ‘While the autonomy and adaptiveness of AI raises the possibility of liability gaps, it does not guarantee they will crystallise. With the many varied potential uses for AI in future, it is difficult to assess where such liability gaps may in fact arise, though they are more likely to occur in connection with highly autonomous and adaptive systems, given that other systems are likely to be more predictable and easier to control’.