'Like swimming in crocodile waters' - Immigration officials' data analytics use

1:44 pm on 17 March 2020

Immigration officials are being accused of using data analytics and algorithms in visa processing - and leaving applicants in the dark about why they are being rejected.


One immigration adviser likened applicants unaware of risk profiling to unwitting swimmers in crocodile-infested waters.

The automatic 'triage' system places tourists, overseas students and immigrants into high-, medium- or low-risk categories.

The factors that raise a red flag on high-risk applications are not made public; responses to Official Information Act requests are redacted on international relations grounds.

But an immigration manager has told RNZ that staff identify patterns, such as overstaying and asylum claim rates of certain nationalities or visa types, and feed that data into the triage system.

On a recent visit to a visa processing centre in Auckland, Immigration New Zealand assistant general manager Jeannie Melville acknowledged that the agency now ran an automated system to triage applications, but said humans made the final decisions.

"There is an automatic triage that's done - but to be honest, the most important thing is the work that our immigration officers do in actually determining how the application should be processed," she said.

"And we do have immigration officers that have the skills and the experience to be able to determine whether there are further risk factors or no risk factors in a particular application.

"The triage system is something that we work on all the time because as you would expect, things change all the time. And we try and make sure that it's a dynamic system that takes into account a whole range of factors, whether that be things that have happened in the past or things that are going on at the present time."

When asked what 'things that have happened in the past' might mean in the context of deciding what risk category an applicant would be assigned to, another manager filled the silence.

"Immigration outcomes, application outcomes, things that we measure - overstaying rates or asylum claim rates from certain sources," she said. "Nationality or visa type patterns that may have trended, so we do some data analytics that feed into some of those business rules."

Humans defer to machines - professor

Professor Colin Gavaghan, of Otago University, said studies on human interactions with technology suggested people found it hard to ignore computerised judgments.

"What they've found is if you're not very, very careful, you get a kind of situation where the human tends just to defer to whatever the machine recommends," said Prof Gavaghan, director of the New Zealand Law Foundation Centre for Law and Policy in Emerging Technologies.

"It's very hard to stay in a position where you're actually critiquing and making your own independent decision - humans who are going to get to see these cases, they'll be told that the machine, the system has already flagged them up as being high risk.

"It's hard not to think that that will influence their decision. The idea they're going to make a completely fresh call on those cases, I think, if we're not careful, could be a bit unrealistic."

Oversight and transparency were needed to check the accuracy of calls made by the algorithmic system and to ensure people could challenge decisions, he added.

Best-practice guidelines, he said, tended to be high-level and vague.

"There's also questions and concerns about bias," he said. "It can be biased because the training data that's been used to prepare it is itself the product of user bias decisions - if you have a body of data that's been used to train the system that's informed by let's say, for the sake of argument, racist assumptions about particular groups, then that's going to come through in the system's recommendations as well.

"We haven't had what we would like to see, which is one body with responsibility to look across all of government and all of these uses."

The concerns follow questions in 2018 about another Immigration New Zealand programme used to prioritise deportations.

A compliance manager told RNZ at the time that it was using data about former immigrants, including their nationality, to determine which overstayers to target.

The agency subsequently denied that nationality was one of the factors, but axed the programme.

Don't make assumptions from raw data - immigration adviser

Immigration adviser Katy Armstrong said Immigration New Zealand had to fight its own 'jaundice', based on profiling and presumptions.

"Just because you're a 23-year-old, let's say, Brazilian coming in, wanting to have a holiday experience in New Zealand, doesn't make you an enemy of the state.

"And you're being lumped in maybe with a whole bunch of statistics that might say that young male Brazilians have a particular pattern of behaviour.

"So you then have to prove a negative against you, but you're not being told transparently what that negative is."

It would be unacceptable for police to arrest people based on the previous offending rates of a particular nationality, she said, and immigration rules were likewise founded on fairness and natural justice.

"That means not discriminating, not being presumptuous about the way people may behave just purely based on assumptions from raw data," she said.

"And that's the area of real concern. If you have profiling and an unsophisticated workforce, with an organisation that is constantly in churn, with people coming on board to make decisions about people's lives with very little training, then what do you end up with?

"Well, I can tell you - you end up with decisions that are basically unfair, and often biased.

"I think people go in very trusting of the system and not realising that there is this almighty wall between them and a visa over issues that they would have no inkling about.

"And then they get turned down, they don't even give you a chance very often to respond to any doubts that immigration might have around you.

"People come and say: 'I got declined' and you look at it and you think 'oh my God, it was like they literally went swimming in the crocodile waters without any protection'."
