The charter might not seem like a big deal. Yet overseas experience suggests it could save lives.
Shaw’s press release says the charter will “give New Zealanders confidence that data is being used safely and effectively across government.”
Make that: “parts of government”. The charter is not compulsory. A total of 21 government departments have signed. The biggest data users are among them: Inland Revenue and the Ministry of Social Development. The New Zealand Defence Force has signed; the Police have not.
New Zealanders would be more confident they would not end up on the wrong end of a rogue algorithm if the charter were compulsory across government.
Ethical data use
The charter draws on work by Liz MacPherson, the head of Statistics NZ, who also holds the title of chief data steward. MacPherson has been working on ethical data use in government.
Last year the government reviewed how it used algorithms and decided they needed more transparency. In July it set up a Data Ethics Advisory Group.
The thinking behind the charter is sound enough. Government departments use vast amounts of data. The software used to sift that data is sometimes complex, sometimes straightforward.
This can work fine, but algorithms are written by humans. They can be biased or built on false premises. They can be broken. And the people using them can make bad decisions.
There are plenty of stories of algorithms serving up inaccuracies and discriminatory decisions. The process is opaque, and government employees have been known to hide behind bad decisions. The logic used to feed algorithms is often kept secret from the public.
When this happens, the consequences can be dire. At times the most vulnerable members of society can be at risk.
One of the worst examples of how bad this gets is Australia’s so-called Robodebt saga. Australians who had received welfare payments were automatically sent debt notices when data matching between departments showed inconsistencies, often without explanation.
Many Robodebt demands were wrong. Fighting or even questioning the demands saw people descend into a Kafkaesque digital dystopia. There were suicides as a result.
Agencies signing the charter commit to explaining their algorithms to the people on the receiving end. The rules used are supposed to be transparent and published in plain English. Good luck with that one.
Fit for purpose
Elsewhere the New Zealand charter asks algorithm users to “make sure data is fit for purpose” by “understanding its limitations” and “identifying and managing bias”. It sounds good, but there is a danger public servants might push the meaning of those words to the limit.
Any agency signing the charter has to give the public a point of contact for enquiries about algorithms. The charter expects agencies to offer a way of appealing against algorithm decisions.
There’s a specific New Zealand twist. The charter asks agencies to take Māori views on data collection into account. This is important. Algorithms tend to be written by people from other cultures and Māori are disproportionately on the wrong end of bad decisions.
One area not covered in the documents published at the launch is how agencies should deal with data handled by external organisations. Given that government outsources data work, this could be a problem. There may even be cases where external organisations use proprietary algorithms the agencies themselves cannot inspect.