Deportation modelling 'bringing back the dawn raids'

5:25 pm on 5 April 2018

An Immigration New Zealand pilot programme that profiles overstayers could unfairly target Pasifika people and is being called racist.

Immigration to New Zealand. Photo: RNZ

For the past 18 months, Immigration New Zealand has been running a pilot to help it prioritise which people it should take action against.

To do that, it has been modelling overstayers' data, including age, gender and ethnicity, to identify which groups are more likely to run up hospital costs or commit crime.

The Immigration Minister, Iain Lees-Galloway, said that with limited resources, officials had to make calls about which of the country's 11,000 overstayers to deport.

"All of these people are liable for deportation. Immigration New Zealand has limited resources, they have to prioritise who they will take enforcement action against," he said.

"They are using a range of data to prioritise the people they think pose the greatest threat to New Zealand."

Immigration New Zealand has, however, been criticised for using ethnicity as part of its risk modelling.

Melino Maka from the Tongan Advisory Council said it was racist and he feared Pasifika people could be unfairly targeted.

"This is bringing back the dawn raids. You can see the language they use, 'using up our health system' - that's the same language they use in America and in Europe to justify using some of those racist policies that we don't need here in New Zealand," he said.

Most overstayers are from Tonga, Samoa and China.

Kamil Lakshman, a lawyer specialising in immigration and refugee issues, said it was concerning that Immigration New Zealand was using ethnicity as part of its profiling.

That had the potential to further marginalise certain groups, she said.

"It is seriously concerning, simply because it will be based on race, so if somebody's from a particular country, of a particular race, they will be deemed to be in this class and then a certain approach will apply to them, a predetermination in how they are looked at."

Mr Lees-Galloway denied the programme was racial profiling.

"If I got any suggestion that they were solely profiling people based on race, that would be unacceptable to me and we would deal with that."

Data analyst Harkanwal Singh said that while predictive risk modelling was useful, it needed to be done in an open and transparent way.

He said that as more government departments adopted such modelling, there needed to be greater checks and balances.

"There needs to be some independent checking of these models because it's different if Facebook is using a model because it's a private company.

"But it's very different if a ministry or government is using models because there needs to be more accountability over that."

Mr Singh said ethical issues needed to be a part of that as well.

"If you can use race to predict certain things, the question to ask is should you be using it in the first place, considering human rights concerns and everything."

The Human Rights Commission and the Privacy Commissioner were both seeking more details from Immigration New Zealand about the pilot programme.

Mr Lees-Galloway will get a full briefing from his officials about it tomorrow.