Predictive risk modelling (PRM) is a hot privacy topic. The neglect and abuse of children is a social issue that has understandably galvanised public interest, the news media and government agencies. One of the ways the government is considering tackling this high-priority issue is by using computer programs that predict the level of risk to a child.
The Ministry of Social Development (MSD) describes PRM as the “use of automated tools to help identify people at risk early enough to allow for effective intervention” and it has been exploring PRM as a way to identify children who might be at risk of abuse or neglect.
PRM was recently the subject of this Radio New Zealand Insight documentary. The technique uses large amounts of electronic administrative data collected by welfare and health agencies about the people who interact with them. This data can then be linked within and across systems.
A feasibility study on the MSD website lists the main predictors of risk as:
Other variables include indicators of mental health, location, sentencing history, family violence, single-parent status and caregiver age.
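To make the mechanics concrete, here is a minimal, purely illustrative sketch of how a model of this kind can work: records from separate agency systems are linked by a shared identifier, combined into predictor variables, and converted into a score that is then thresholded into a ‘high-risk’ flag. Every field name, weight and threshold below is invented for the purpose of the example; none of it reflects MSD’s actual model.

```python
# Illustrative sketch only: a toy risk-scoring pipeline over linked
# administrative records. Fields, weights and the threshold are invented
# and bear no relation to any real agency's model.
import math

# Hypothetical records from two separate agency systems, linked by a
# shared person identifier.
welfare_records = {
    "P001": {"months_on_benefit": 18, "caregiver_age": 19},
    "P002": {"months_on_benefit": 0,  "caregiver_age": 34},
}
health_records = {
    "P001": {"mental_health_contact": True},
    "P002": {"mental_health_contact": False},
}

def risk_score(person_id: str) -> float:
    """Combine linked predictors into a probability-like score (logistic form)."""
    w = welfare_records[person_id]
    h = health_records[person_id]
    # Invented weights; a real model would estimate these from historical data.
    z = (-3.0
         + 0.05 * w["months_on_benefit"]
         + 0.08 * max(0, 25 - w["caregiver_age"])
         + 1.2 * h["mental_health_contact"])
    return 1 / (1 + math.exp(-z))

for pid in welfare_records:
    score = risk_score(pid)
    flag = "high risk" if score > 0.5 else "lower risk"
    print(f"{pid}: score={score:.2f} -> {flag}")
```

The point of the sketch is simply that the flag is an automated by-product of whatever data happens to sit in the linked records and of the chosen threshold, which is why the accuracy and completeness of that data matter so much.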
The Ministry says the use of PRM data can help social service agencies take steps to prevent the maltreatment of vulnerable children. Such actions can include assessing what interventions for high-risk groups can best reduce occurrences of maltreatment. PRM could also inform proactive targeting of social services to higher-risk populations and individuals.
Inaccuracies in the administrative data may produce inaccurate risk predictions. This in turn might lead to people being unnecessarily targeted by welfare and enforcement agencies or, alternatively, to people being overlooked and not assessed as at-risk when they really are. Given the volume of information and the number of people involved in social service provision, the likelihood of inaccuracies seems high.
PRM also has the potential to unjustifiably affect the privacy of children and their families. If PRM data is disaggregated to an individually identifiable level, people may be flagged as ‘high-risk’ in public service systems forevermore.
The potential for an ‘at-risk’ label becoming a self-fulfilling prophecy is also worth considering. When people expect children’s behaviour to match their ‘at-risk’ label, their expectations can, in fact, increase the likelihood of poorer outcomes. It removes individuals’ ability to control their identity from birth because they and their family have already been judged likely to be deficient.
In our submission to Parliament’s Social Services Committee on the Vulnerable Children Bill in 2013, we agreed with the broad consensus that we must do better by vulnerable children. Too many children are being harmed in situations where we have the opportunity to protect them. We argued that proposed actions based on PRM could challenge legal obligations to ensure information is ‘accurate, complete, relevant and up-to-date’ before it is disclosed to third parties.
With limited social service resources, the Ministry does need a way to prioritise resource allocation, and MSD is doing further work to reduce the rates of both false negatives and false positives produced by PRM. The more accurate PRM is, the more useful it is as a way to increase the timeliness of preventative responses that will help keep more vulnerable children safe.
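A rough worked example shows why both error rates matter. All of the numbers below (population size, prevalence, sensitivity, specificity) are assumptions chosen to illustrate the arithmetic; they are not MSD figures. Because the great majority of families are not at risk, even a model that correctly clears 90 per cent of them can wrongly flag more families than it correctly identifies, while still missing a quarter of the children who are at risk.

```python
# Illustrative arithmetic only: every figure below is an assumption.
population = 60_000        # hypothetical cohort of children
prevalence = 0.05          # assumed share of children actually at risk
sensitivity = 0.75         # assumed share of at-risk children the model flags
specificity = 0.90         # assumed share of not-at-risk children it clears

at_risk = population * prevalence
not_at_risk = population - at_risk

true_positives = at_risk * sensitivity
false_negatives = at_risk - true_positives          # at-risk children missed
false_positives = not_at_risk * (1 - specificity)   # families wrongly flagged

print(f"Flagged in total:         {true_positives + false_positives:,.0f}")
print(f"Correctly flagged:        {true_positives:,.0f}")
print(f"Wrongly flagged:          {false_positives:,.0f}")
print(f"Missed (false negatives): {false_negatives:,.0f}")
```

Under these assumptions the model flags about 7,950 families, of whom roughly 5,700 are flagged wrongly, and it still misses about 750 at-risk children. Lowering the flagging threshold would catch more of the missed children but would wrongly flag even more families, which is the trade-off that work on accuracy has to manage.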
In its current state, PRM needs further research and development to make sure that its predictions do not create more harm than good.
It is for these reasons we urge caution in embracing predictive risk modelling as a universal remedy for helping vulnerable children. Modelling has potential at a macro level to inform policy makers about the bigger picture of where we find distress and privation. But there are obvious dangers in applying a risk prediction model to target individuals. Testing such theories also carries risks, as noted by the Minister of Social Development in this media report.
We have a responsibility to make sure children are safe but we also have a responsibility to preserve individual freedom and give people the ability to confound our expectations of them.
Image credit - Tatiana Bulyonkova, Creative Commons licence