In the news: Responses to article "Don’t trust algorithms to predict child-abuse risk"

The letter below was published in the Guardian on 19 September 2018. It is available online among other responses on the Guardian website.

"Your editorial (18 September) raises some fundamental questions about the nature of public services and the professionals who work in them. The potential of computer-aided decision-making cannot be ignored. Evidence, judgment, interventions and services need to be driven by the best of what we know, focused on the needs and circumstances of the people who find themselves in difficulty, and what is offered needs to be trusted and respected.

"When it comes to services for children and families, the role of the state is fundamentally defined by article 8 of the European convention on human rights – “the right to respect for one’s private and family life, home and correspondence”. Only when this conflicts with other basic rights does the state have a duty to intervene against the wishes of the individual or family.

"Child protection is one such area, and there is a tension between the need to make services available to families – a home, income, food, health and education services – and the option to remove the child against the parents' wishes if that is the only way to protect them. Social workers barely need algorithms to determine which families need homes, food and clothing. They may need decision-making support in identifying the best long-term plan for the child where this is not sufficient.

"We have a duty to support every child in every family. There can be no doubt that public policy should be closely aligned to that fundamental objective. My algorithm tells me that at the moment it does not."

John Simmonds
Director of policy, research and development, CoramBAAF