Authored by


Deakin University School of Humanities and Social Sciences; Australian Privacy Foundation

Social (in)security and social (in)justice: Automation in the Australian welfare system


This report examines automation in social security settings in Australia using the case study of the “Better Management of the Social Welfare System” initiative,[2] also known as the “Online Compliance Intervention” or colloquially as “RoboDebt”. In 2016, the Australian Department of Human Services (DHS), via Centrelink, the agency responsible for the administration of social security benefits, launched an automated debt identification programme to detect income reporting discrepancies and the “overpayment” of social security benefits. In practice, the programme aims to achieve savings on social security by raising government debts for suspected “overpayment” of welfare benefits, while simultaneously introducing obstacles to contesting them.

This programme clearly demonstrates the potential for abuse when big data and automation are deployed against vulnerable individuals. In a wider context of government austerity and cost cutting in both social security payments and social services, it reveals the social justice impacts of automated systems that are explicitly designed to target vulnerable individuals. Yet this case study also offers a glimmer of hope regarding the central role that grassroots activism and community-led campaigns can play in countering unjust automated technological systems with human voices. In particular, it shows how conversations about automated technologies can be made community focused and inclusive.


In July 2016, the DHS launched an automated debt-raising programme in which an algorithm identified the suspected “overpayment” of government welfare benefits. Request-for-information letters were sent to welfare recipients requiring them to prove that they did not have a debt, with unconfirmed or unpaid debts subject to an automated recovery process, which included withholding social security payments or referral to private debt collectors.[3] A 10% debt recovery fee was added to the alleged debt.[4]

Between November 2016 and March 2017, 230,000 letters were sent to welfare recipients directing them to an online portal where they were required to prove they did not have a debt. The debt notices were sent out at a rate of approximately 20,000 each week. An estimated 20-40% of the debt letters were false positives, due to errors in the data used in the system and to the process of averaging annual income and matching it to fortnightly reporting periods.[5]

Two subsequent inquiries were held: one referred by the Senate Community Affairs References Committee in response to a grassroots community campaign (#NotMyDebt, discussed below), and one initiated by the Commonwealth Ombudsman[6] in response to an increasing number of complaints. The Senate inquiry recommended suspension of the system until issues of procedural fairness were addressed.[7] The Commonwealth Ombudsman recommended assistance and support be provided to vulnerable people, and consultation with stakeholders about the difficulties that vulnerable groups face in interacting with the system.[8]

The fiscal context in which the programme was implemented is of central relevance. In 2015-2016, the Australian government forecast it would save AUD 1.7 billion over five years via the identification of welfare overpayments. In 2016-2017, the government then indicated it would achieve AUD 3.7 billion in savings over four years. Within the first six months of the 2016-2017 financial year, the DHS had attempted to recover AUD 300 million of social security “overpayments”, and had secured AUD 24 million in repayments.[9] Given the small amounts that the DHS had recovered, it has been questioned whether the government will achieve the forecasted savings via this programme.[10] Casting further doubt on whether the system will achieve any savings, in June 2018 it was reported that the DHS had spent AUD 375 million on the automated debt recovery programme.[11] Despite this, at the time of writing, the system continues to send out “RoboDebts” to meet the financial performance targets set by government.[12]

The “RoboDebt” disaster

This case study has clear implications for inclusivity and social justice, involving as it does the explicit targeting of automated technology at vulnerable populations.[13] There are lessons for the design of automated systems to ensure that they produce accurate findings, with clear avenues for review of automated decisions, especially where vulnerable populations are concerned. Importantly, the case study highlights the role that grassroots activism and community-led initiatives can play in foregrounding the human impacts of automated systems. There is a need to engage with the community to evaluate the social justice and human rights impacts of automated technology deployed in public settings for public service provision prior to its implementation. This should involve consultation about whether these types of programmes should be implemented at all, and if so, whether appropriate checks and balances, such as risk, impact, appeal and accountability processes, have been introduced.

Targeting automated technology at vulnerable populations

This is an example of welfare surveillance that further marginalises people who receive social security benefits. It demonstrates that the use of automated technology can serve to perpetuate structural and administrative violence against those who are socially excluded and financially disenfranchised.[14] There are significant social justice issues associated with the explicit design of a programme of automated technologies that targets vulnerable individuals facing financial hardship. The automated debt-raising system began sending out debt notices seven weeks before Christmas, a time when financial pressure is already high, especially for those receiving welfare benefits.[15]

The design of the system created issues of administrative justice, procedural fairness and the rule of law.[16] Individuals who attempted to contest debts faced numerous obstacles, with the onus placed on them to prove they did not have a debt. This reverses the onus of proof onto vulnerable people (and thus overturns the presumption of innocence), requiring them to navigate complex bureaucratic and technical systems to contest alleged debts. At law, the onus to prove the debt existed technically remained with the DHS: “absent sufficient evidence of an actual debt based on the proper fortnightly data, there can be no legally sustainable decision to raise and recover the debt as speculated from averaging.”[17] Despite this, the “scheme targets and raises debts in every case where the person cannot disprove the possible overpayment.”[18] Further, because the system is computerised and presumed “objective”, it is harder to argue that it is wrong.

Debt notices sent out by a government agency extend six years into the past, even though individuals are told by that same agency that they are only required to maintain records for six months. As a result, those who are “the least literate, least powerful, and most vulnerable” may accept that the debt is true and seek to pay it off, even when there is a high probability it is erroneous.[19] This places additional financial burdens on those in receipt of welfare benefits, and “[i]n light of the likely vulnerability of so many Centrelink clients, this is a heavy burden indeed.”[20]

Design of the debt-raising system

The debt-raising programme suffers from flaws in its design and technical implementation, including data matching on the basis of inaccurate or mismatched data, and the averaging of annual income data that is then matched to fortnightly reporting periods to determine whether overpayment occurred during a given period (on the assumption that the welfare recipient under-reported their fortnightly income). A comprehensive risk assessment and consultation with experts and community groups at the initial design stages should have identified such issues. However, no consultation occurred.

The averaging of annual income and matching it to fortnightly income periods involved the extrapolation and creation of an assumed fortnightly income average, when actual fortnightly income may fluctuate.[21] The Senate inquiry recommended that “the department resume full responsibility for calculating verifiable debts (including manual checking) relating to income support overpayments, which are based on actual fortnightly earnings and not an assumed average.”[22] This is significant, as 20-40% of the automated debt notices raised by the system were estimated to be erroneous.[23] It also raises the question of whether the DHS took reasonable steps in ensuring the accuracy of the information used for the purposes of debt recovery, whether individuals were able to correct their personal information, or even whether they knew they had a right to do so under Australian privacy law.[24]
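The mechanics of this flaw can be sketched with a short, purely illustrative calculation. The figures, the AUD 1,000 income-free threshold, the 26-fortnight split and the function names below are all hypothetical assumptions for the sake of the example, not the DHS's actual parameters or code:

```python
# Illustrative sketch only (not the actual DHS system): how spreading an
# annual income total evenly across fortnights can flag false "debts"
# when a person's real income fluctuated. All figures are hypothetical.

FORTNIGHTS_PER_YEAR = 26

def averaged_fortnightly_income(annual_income: float) -> float:
    """Spread a yearly income total evenly across 26 fortnights,
    as the data-matching process was reported to do."""
    return annual_income / FORTNIGHTS_PER_YEAR

def flag_discrepancies(annual_income: float,
                       reported_fortnights: list[float],
                       threshold: float) -> list[int]:
    """Flag fortnights where the *averaged* figure exceeds a hypothetical
    income-free threshold even though the person's actual reported income
    for that fortnight did not."""
    avg = averaged_fortnightly_income(annual_income)
    return [i for i, actual in enumerate(reported_fortnights)
            if actual <= threshold < avg]

# A person who worked only half the year: 13 fortnights earning $2,400,
# then 13 fortnights earning $0 (when they correctly received benefits).
reported = [2400.0] * 13 + [0.0] * 13
annual = sum(reported)  # $31,200 for the year

# Averaging yields $1,200 per fortnight, above the hypothetical $1,000
# threshold, so every nil-income fortnight is wrongly flagged as a
# suspected overpayment despite fully accurate fortnightly reporting.
false_flags = flag_discrepancies(annual, reported, threshold=1000.0)
print(len(false_flags))  # prints 13
```

The point of the sketch is that averaging destroys the very information the legislation turns on: entitlement depends on actual fortnightly income, and any income pattern that is uneven across the year can make the averaged figure diverge from every real fortnight.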

Individuals experienced numerous obstacles in speaking to a human in order to correct their personal information or contest the accuracy of the automatically identified debts. In the two years following the introduction of the programme, millions of calls to Centrelink went unanswered. In 2016-2017, the year immediately following the introduction of RoboDebt, there were 55 million unanswered calls to Centrelink. In 2017-2018, 48 million calls went unanswered.[25] Many individuals waited for hours on hold.[26] The inability to speak with a human undoubtedly created challenges for having false debts resolved. The DHS had a clear conflict of interest, as “the harder it is for people to navigate this system and prove their correct income data, the more money the department recoups.”[27] As a result, some individuals paid debts they did not believe they owed because it was “too difficult or too stressful to challenge the purported debt raised against them. Others simply paid the purported debt because they thought the government wouldn’t make a mistake.”[28]

Fairness and transparency

The use of big data and automated processes allowed the government to implement, at national scale, a system with many inherent flaws. There was an absence of fairness in the entire operation of the system, including: a lack of consultation with stakeholders; no testing or risk assessment processes; the process of automated averaging and matching of data; millions of unanswered calls to the DHS; no information being provided to individuals when they sought to challenge a debt; the imposition of a fee for debt recovery; and the fact that the programme extended six years into the past, when individuals are advised they only need to keep records for six months.[29] Accordingly, one of the recommendations arising from the Senate inquiry was that clear and comprehensive advice on reassessment, review rights and processes should be made available to affected individuals.[30]

There are ongoing issues of explainability and transparency. The debt-raising letters contained no information about how debts were calculated, nor were individuals informed that the debt could be false, or how to contest the calculations.[31] The DHS has refused to release documents that relate to the operation of the system, including risk assessment processes, issues papers, and ICT system reports. In June 2019, the Australian Information Commissioner ruled the DHS must release documents following freedom of information requests first lodged in 2017, yet the DHS later appealed this decision to the Administrative Appeals Tribunal. The DHS challenged the release of these documents claiming that publication may pose security risks and external threats, or as one media article stated: “Human Services claimed people wouldn’t pay debts if informed about its IT systems.”[32]

Doxing dissenters, chilling critics

In reaction to an individual complaining about the system online, the DHS publicly released their personal information, including their welfare history, to the media.[33] This was an outrageous and reprehensible breach of individual privacy in an attempt to suppress public criticism. In response, a new privacy code is being developed for Australian public servants,[34] and the Senate inquiry acknowledged that the system “disempowered people, causing emotional trauma, stress and shame. This was intensified when the Government subsequently publicly released personal information about people who spoke out about the process.”[35]

#NotMyDebt: Human voices and community initiatives

Notwithstanding the department’s attempts to chill critics, a grassroots campaign known as #NotMyDebt was launched by volunteers led by Lyndsey Jackson, chair of Electronic Frontiers Australia, and united by their “deep concern about the injustice of Centrelink's robo-debt fiasco and the impact it's having on the lives of ordinary citizens.”[36] The #NotMyDebt campaign demonstrates avenues for making conversations about artificial intelligence more inclusive and human-centred. The campaign collects, houses and disseminates individual stories of false debts, and provides information and advice to individuals who have received a RoboDebt notice. To date, the #NotMyDebt campaign has collected almost 900 anonymous stories that foreground individual human voices and show the human and social impact of the automated system.


This case study casts light on the failings of big data and automated technology when deployed in public settings for cost-cutting purposes. The RoboDebt programme encapsulates “an error-riddled, unaccountable and politically-driven process.”[37] It clearly demonstrates the potential for abuse when big data and automated technology are deployed against vulnerable individuals. Rather than advocating for technical fixes to perfect and optimise the operation of these types of automated systems and technologies, there is a greater need to question whether they should be developed and deployed for certain objectives at all.[38] This involves assessment of the desired aims of the system, and consideration of specific social contexts for their use. In the present case, the aim of the system was to explicitly target vulnerable individuals with debts in an attempt to achieve savings on social security. Given this, it is perhaps unsurprising that the system was designed and operated as intended: it raised large numbers of (erroneous) debts, simultaneously introduced insurmountable obstacles to understand and contest them, and publicly targeted those who spoke out against it.

This case study also demonstrates that automated technology is not merely a technical fix deployed to increase administrative efficiency, but is socially and politically embedded.[39] Automated verdicts have human victims. Recognition that automated technology and systems are socially contingent[40] necessitates proper a priori evaluation of the possible social justice and human rights consequences. It requires direct consultation with the community, and involves questioning the type of society that we wish to create, and the role of automated technology within it. As the #NotMyDebt campaign launched in response to RoboDebt has shown, putting human voices back into focus can be an effective strategy of demonstrating not only individual but also wider societal impacts of automated technology.

Action steps

The following lessons are suggested by the debt-raising programme:

  • Consult the community about the type of society that we wish to create, and the role of automated technology within it.
  • Conduct and release risk and impact assessments, including assessment of the impacts for vulnerable individuals and groups, prior to the deployment of automated technology, and periodically during deployment.
  • Ensure there are appropriate constraints, checks and balances, and mechanisms for review. This may include approaches such as Article 22 of the EU General Data Protection Regulation[41] that grants a right not to be subject to automated decisions, or transparency provisions such as a right to understand the basis of decisions, and availability of non-automated remedies, for example, the ability to speak to humans during reviews and appeals.
  • Support community initiatives and grassroots campaigns that foreground human voices and the human impacts of automated systems.


[1] I wish to acknowledge Ian Warren, Angela Daly, and also Lyndsey Jackson of the #NotMyDebt campaign, for providing feedback on previous versions of this report.


[3] Knaus, C. (2017, 11 April). Almost half of all Centrelink robo-debt cases sent to private debt collectors. The Guardian.

[4] Commonwealth Ombudsman. (2017). Centrelink’s automated debt raising and recovery system: A report about the Department of Human Services’ online compliance intervention system for debt raising and recovery.

[5] Community Affairs References Committee. (2017). Senate inquiry into the design, scope, cost-benefit analysis, contracts awarded and implementation associated with the Better Management of the Social Welfare System initiative. Canberra: Commonwealth of Australia.

[6] Commonwealth Ombudsman. (2017). Op. cit.; see also: Commonwealth Ombudsman. (2019). Centrelink’s Automated Debt Raising and Recovery System: Implementation Report.

[7] Community Affairs References Committee. (2017). Op. cit.

[8] Commonwealth Ombudsman. (2017). Op. cit.

[9] Ibid.

[10] Ibid.

[11] Barbaschow, A. (2019, 13 February). Human Services has spent AU$375m on ‘robo-debt’. ZDNet.

[12] Henriques-Gomes, L. (2019, 29 May). Centrelink still issuing incorrect robodebts to meet targets, staff claim. The Guardian.

[13] Mann, M., & Daly, A. (2019). (Big) Data and the North-in-South: Australia’s Informational Imperialism and Digital Colonialism. Television and New Media, 20(4), 379-395.

[14] Ibid.

[15] Community Affairs References Committee. (2017). Op. cit.

[16] See for example: Carney, T. (2018). The New Digital Future for Welfare: Debts Without Legal Proofs or Moral Authority? UNSW Law Journal Forum, March, 1-16; Galloway, K. (2017). Big data: A case study of disruption and government power. Alternative Law Journal, 42(2), 89-95; Hogan-Doran, D. (2017). Computer says “no”: Automation, algorithms and artificial intelligence in Government decision-making. The Judicial Review, 13, 1-39; Zalnieriute, M., Bennet-Moses, L., & Williams, G. (2019). The rule of law and automation of government decision-making. Modern Law Review, 82(3), 425-455.

[17] Carney, T. (2018). The New Digital Future for Welfare: Debts Without Legal Proofs or Moral Authority? UNSW Law Journal Forum, March. (Emphasis in original.)

[18] Ibid. (Emphasis in original.)

[19] Ibid.

[20] Galloway, K. (2017). Big data: A case study of disruption and government power. Alternative Law Journal, 42(2), 89-95.

[21] Carney, T. (2018). Op. cit.

[22] Community Affairs References Committee. (2017). Op. cit.

[23] Ibid.

[24] Hutchens, G. (2017, 18 May). New privacy code for public servants after Centrelink ‘robo-debt’ debacle. The Guardian.

[25] Dingwall, D. (2018, 30 October). ‘No party poppers’ for Centrelink’s 48 million unanswered calls. The Sydney Morning Herald.

[26] Community Affairs References Committee. (2017). Op. cit.

[27] Ibid.

[28] Ibid.

[29] Ibid.

[30] Ibid.

[31] Henman, P. (2017, 4 September). The computer says ‘DEBT’: Towards a critical sociology of algorithms and algorithmic governance. Zenodo.

[32] Stilgherrian. (2019, 7 June). Human Services claimed people wouldn’t pay debts if informed about its IT systems. ZDNet.

[33] Sadler, D. (2018, 30 May). A ‘chilling effect’ on free speech.

[34] Hutchens, G. (2017, 18 May). Op. cit.

[35] Community Affairs References Committee. (2017). Op. cit.


[37] Henman, P. (2017, 4 September). Op. cit.

[38] See for example: Powles, J., & Nissenbaum, H. (2018, 7 December). The Seductive Diversion of ‘Solving’ Bias in Artificial Intelligence. Medium.

[39] Henman, P. (2017, 4 September). Op. cit.

[40] Ibid.


This report was originally published as part of a larger compilation: “Global Information Society Watch 2019: Artificial intelligence: Human rights, social justice and development"
Creative Commons Attribution 4.0 International (CC BY 4.0) - Some rights reserved.
ISBN 978-92-95113-12-1
APC Serial: APC-201910-CIPP-R-EN-P-301