Artificial Intelligence – Litigants Beware

As we all know, Artificial Intelligence, and more particularly generative AI chatbots (such as ChatGPT, Microsoft Copilot or Google Gemini), is now in quite common use. But is it being used correctly?
Many self-represented litigants (and indeed some lawyers) are turning to generative AI to assist in preparing submissions and other Court documents.
A recent case illustrates the consequences of using generative AI without checking that the references it produces are accurate and do in fact say what it claims they do. In Chief Executive, Department of Justice v Wise & Wise Real Estate Pty Ltd & Anor[1], the Respondents, who faced proceedings seeking their disqualification from holding any form of Licence or Registration Certificate as a real estate agent, applied to the Tribunal for a Tribunal member to be recused (stand down from the case) on various grounds. This was in fact the fourth Application, following three earlier Applications on essentially the same grounds for a stay of proceedings. The particular facts of the case need not be gone over, as its significance lies in the Tribunal's decision concerning the Respondents' use of generative AI to produce submissions.
The Respondents' submissions referred to various Tribunal decisions in support of the Application. However, the cases referred to did not exist at all, and the citations given for the alleged cases were in fact citations for other QCAT cases which had no relevance to the Application. Somewhat surprisingly, the Respondents still relied on these fake legal cases despite having been told, in respect of their previous Applications, that the cases did not exist or that the citations referred to irrelevant cases.
The case highlights that the Queensland Courts have issued The Use of Generative Artificial Intelligence (AI) – Guidelines for Responsible Use by Non-Lawyers. It is useful to summarise the Guidelines to explain the dangers of using generative AI in the creation of Court documents, submissions and the like.
It is important to note that generative AI chatbots are not "intelligent" in the ordinary human sense. What they do, in essence, is predict which words to use to form sentences, or parts of sentences, in the context of what is being asked. Generative AI has no understanding of what any word it uses means and, in particular, no understanding of whether what it has said is true. In short, its output is simply a probability-based string of words that looks like a human response, drawn from sources anywhere on the internet rather than from reliable legal sources. As set out in the Guidelines, generative AI chatbots cannot:
- understand the unique fact situation in a case;
- understand your cultural and emotional needs;
- understand the broader Australian social and legal context of the question;
- predict the chance of success or the outcome of a case;
- be trusted to always provide legal or other information that is relevant, accurate, complete, up-to-date and unbiased[2].
Accordingly, the output of generative AI cannot simply be trusted and must be thoroughly checked for accuracy.
More importantly, however, the case highlights the consequences of using generative AI to produce submissions which are false or inaccurate. At paragraph 56 of its decision, the Tribunal said:
“The Respondents should take heed of the warnings as raised by the Deputy President in LJY v Occupational Board of Australia (citation given) that:
- ‘Including non-existent information in Submissions or other material filed in the Tribunal weakens their arguments. It raises issues about whether their Submissions can be considered as accurate and reliable. It may cause the Tribunal to be less trusting of other Submissions which they make. …’”.
In other words, the Tribunal will be less likely to accept any arguments put forward by persons whose use of generative AI has produced false or inaccurate submissions, even where their other submissions are valid.
The Guidelines strongly suggest that you consult a lawyer, and we likewise strongly recommend that, if you are self-represented, you consult RMO Law in relation to any submissions intended to be filed in the Tribunal or, indeed, in any of the Queensland Courts, especially if you have used generative AI to prepare them.
At RMO Law, drawing on over 50 years of experience in practice and the research skills of our Commercial Litigation Lawyers, we have the resources to check whether the matters raised in your submissions are in fact legally sound.
Our Lawyers are able to assist litigants in the Queensland Civil and Administrative Tribunal and other Queensland Courts (including the Supreme Court, District Court, Magistrates Courts, Family Court and the other Federal Courts) with out-of-Tribunal work, such as submissions and Applications. Leave or permission from the Court for legal representation is usually given if there are complex legal issues and if the particular proceedings, should they succeed, carry serious consequences for you. We at RMO Law can assist in making such an application to the Court or Tribunal and help you with all your legal needs.
RMO Law – your trusted partner in Law for over 50 years.
[1] [2025] QCAT 222
[2] Page 3 of the Guidelines
This article is for your information and interest only. It is not intended to be comprehensive, and it does not constitute and must not be relied on as legal advice. You must seek specific advice tailored to your circumstances.