Generative AI
Referencing generative AI
Can I use generative AI for referencing?
Generative AI tools can be asked to provide references or citations – however, these are often generated simply to look like valid references rather than drawn from real sources. If you ask an AI tool to provide a list of references or readings, the items it lists may not exist.
How do I reference something created by generative AI?
There are no specific guidelines for citing ChatGPT or other generative AI tools in AGLC4.
The editors of the Melbourne University Law Review and the Melbourne Journal of International Law, who produce the AGLC, have provided the following advice as “interim guidance”, as it is not yet officially part of the AGLC. Their advice is to broadly follow rule 7.2, which deals with written correspondence.
Generative AI tools like ChatGPT cannot accurately cite their own sources. Any references they provide may be false or non-existent – you should always check the original source for any references that are generated.
References should provide clear and accurate information for each source and should identify where they have been used in your work.
More information about generative artificial intelligence:
- Artificial Intelligence (AI): a guide to understanding AI and how to work with it responsibly.
Footnotes
Format: 1 Output from [program], [creator] to [recipient], [full date].
Example: 1 Output from ChatGPT, OpenAI to John Smith, 23 February 2023.
Discursive text may be used in the footnote to provide information about the prompts used to generate the output, in accordance with rule 1.1.5, e.g.:
2 Output from ChatGPT, OpenAI to John Smith, 10 March 2023. The output was generated in response to the prompt, 'Provide an overview of the creation of the Australian Guide to Legal Citation': see below Appendix A.

Bibliography
Output from ChatGPT, OpenAI to John Smith, 23 February 2023
Output from ChatGPT, OpenAI to John Smith, 10 March 2023