Blog competition joint winner – Molly Swanson on ‘Putting the ADM in ADMinistrative law’

17 Jun 2025

The 2024 AIAL Blog Competition winners are Molly Swanson for her blog entitled ‘Putting the ADM in ADMinistrative law’, which is published below, and John Molloy for his blog entitled ‘Emoji, invalidity and administrative review’, which was published earlier in the year. Congratulations to Molly and John.

Thanks to the many who submitted blogs to this competition. The entries were of a high standard and demonstrated the diversity of issues in current administrative law.

The process for selecting the winners was undertaken by AIAL national executive members Janet Hope of the University of Canberra, Justice Rachel Pepper of the NSW Land and Environment Court, Margaret Tregurtha of IP Australia, and Robert Orr of AGS and ACAT, together with Mark Robinson SC of the NSW Bar.

Molly is an associate in the Supreme Court of Queensland and recently completed a Bachelor of Laws/Bachelor of Policy, Philosophy and Economics at Bond University. Her honours thesis focused on the transparency of automated decision-making by the executive and its implications for Australia’s administrative law framework. She is passionate about ensuring legal research is accessible, and the blog format drew her to enter the competition. Molly firmly believes that multi-media formats, such as blogs, offer the opportunity to generate legal dialogue that is accessible to the broader public – this is particularly significant in areas like administrative law that have far-reaching implications.

Molly’s blog is here:


Putting the ADM in ADMinistrative law

96% of Australians want conditions placed on organisations using artificial intelligence (AI) to make decisions that might affect them (OAIC Australian Community Attitudes to Privacy Survey 2023, page 77). The overwhelming public support for regulating automated decision-making (ADM) is consistent with recent commission reports (see here and here). This blog aims to generate discussion about where ADM fits into the Australian administrative law framework. It does so by exploring whether a right to explanation should be included in the Freedom of Information Act 1982 (Cth) (FOI Act).

  1. What is the right to explanation?

The right to explanation is the right to request a meaningful explanation of a decision generated by ADM. This includes information about the algorithmic processes and the inputs used. It aims to facilitate algorithmic transparency and ensure that those affected by ADM are not prevented from accessing reasons for those decisions. This supports procedural fairness by increasing accountability and oversight of AI technology used in ADM.

The right to explanation was recently included in the final text of the European Union’s Artificial Intelligence Act. Under Article 86(1) an individual has the right to obtain ‘clear and meaningful explanations’ of AI’s role in the decision-making process where:

  • a high-risk AI System was used; and
  • the relevant decision produced legal effects or similarly significantly affected the individual.

The condition that the explanation be ‘clear and meaningful’ is important because simply providing access to complex ADM processes will not reduce opacity. This was demonstrated by the responses provided to FOI requests regarding the Robodebt Scheme. As highlighted by the Royal Commission into the Robodebt Scheme (page 329), the reams of Excel spreadsheets provided under the FOI framework were number-heavy and hard to decipher. Thus, by focusing on providing a ‘clear and meaningful’ explanation, the right to explanation promotes pragmatic transparency.

Implementing a right to explanation, or a similar right to meaningful information, in Australia has been considered by the Australian Human Rights Commission (page 37) and the Privacy Act Review (page 192). This blog takes a slightly different approach by considering the application of a right to explanation in the FOI framework.

  2. Could the right to explanation fit into Australia’s FOI framework?

Yes. The right to explanation speaks to principles at the core of Australia’s administrative law framework – open and accountable government. Because the right to explanation aims to improve transparency and accountability in ADM, it would directly advance the objects of the FOI Act by:

  • ‘increasing public participation in Government processes’; and
  • ‘increasing scrutiny, discussion, comment and review of the Government’s activities’ (FOI Act, section 3(2)).

Another key objective of Australia’s FOI framework is to increase proactive disclosure of Government information. The Government’s release of the COVIDSafe app’s draft source code demonstrated the benefits of releasing details of AI processes. This process facilitated feedback and improved public confidence in the app. Therefore, including a proactive obligation to publish explanations of agencies’ use of ADM in the FOI framework should also be considered.

  3. Further challenges and considerations

Application of the right to explanation under the FOI framework would be limited by:

  • The FOI Act’s focus on documents – The FOI Act provides access to documents, not information, held by the Government (section 11). This poses unique challenges to accessing information around ADM because algorithmic processes are not typically comprehensively documented. To support the application of a right to explanation, reform should be considered to:
    • increase the scope of the FOI Act to apply to information (as seen in the United Kingdom); or
    • impose an obligation on agencies to document clear explanations of ADM models that would be accessible under the FOI regime.
  • The cabinet exemption – Section 34 provides a broad exemption to FOI access for cabinet documents. The Robodebt Royal Commission observed that this was a significant barrier to accessing ADM information and should be repealed (further discussion on this point can be found here).

The black box issue is a broader challenge associated with the right to explanation. The issue arises when algorithms create AI models directly from data. In these circumstances, humans – even the algorithm’s designers – cannot understand how the model functions. This undermines the concept that a ‘clear and meaningful’ explanation of ADM can be provided, and it is a significant practical barrier to achieving effective regulatory reform. However, there are options for overcoming the challenges posed by the black box issue – see, for example, interpretable models.

Conclusion

The FOI Act is the key mechanism for ensuring transparency in Government decision-making, and it is vital that it is reformed to reflect the increased complexity introduced by ADM. Including the right to explanation is a promising place to start because it would provide meaningful explanations of complex AI processes and generate a cultural shift towards transparency in the Government’s use of ADM.

Molly Swanson
