The NZ government has explored options for a ban similar to the one that has just come into effect across the Tasman. Photo: RNZ
A social media ban for children is worth exploring further, according to an interim report into social media harm, but the party that requested the inquiry has taken the unusual step of submitting a differing view.
It comes as Australia's ban on social media for children under 16 takes effect.
The New Zealand government has explored options for a similar ban, with National keen to progress one before the end of this term.
National MP Catherine Wedd has submitted a member's bill to legislate a ban, while the education minister is doing separate work around regulation.
ACT opposes a ban, and instead requested an inquiry by Parliament's Education and Workforce Committee into social media harm.
That inquiry has released its interim report, with a majority saying a ban was worth exploring further.
That has led ACT to submit a differing view, over concerns the inquiry was steered towards a predetermined outcome.
The committee intends to release a final report into other regulatory measures next year.
What the report found
The inquiry heard from a number of social media platforms, including Meta and TikTok, as well as concerned parents, privacy experts and free speech groups.
The committee received 430 submissions from 400 individuals and groups. Eighty-seven submitters presented in person to the committee.
Committee acting chair National MP Carl Bates said the report reflected the "seriousness" of what the committee heard from families, experts, and young people.
"Some of the stories we heard were deeply concerning. No parent wants to think about their child being exposed to harm online," he said.
"This interim report highlights the key areas the committee believes deserve further work, and it provides useful insight for the government's broader efforts to keep young people safe."
The report found that while New Zealand had multiple pieces of legislation related to online content regulation - such as the Harmful Digital Communications Act - there was no specific legislation regulating online platforms for user safety.
"Submitters expressed concern that no clear, single authority is tasked with co-ordinating the response to online harm. Several commented that it could be beneficial to introduce a national regulator responsible for online safety. They said this could help make it clear where people should go when they encounter harm," the report said.
Submitters were also concerned that legislation did not effectively protect young people from sexual violence online, and there were no mechanisms to regulate the design of online platforms or algorithms.
The committee also found a "tension" between what online platforms told it and the overall sentiment from submitters.
"Online platforms assured us that they had extensive mechanisms in place to keep young people safe online. In contrast, submitters considered that platforms could do more to prevent and respond to harm," the report said.
"These differences may arise due to low awareness of platforms' safety features, different expectations of the level of action required, or limited effectiveness of online platforms' efforts."
Other submitters acknowledged the steps platforms said they were taking to address online harm, but argued that platforms should prioritise user safety from the outset rather than introducing safety mechanisms in retrospect.
"The extent to which online platforms choose to self-regulate, and the effectiveness of these measures, is limited by platforms' business choices and commercial incentives," the report said.
Overall, the committee found limitations in existing laws and regulations, and did not believe online platforms were doing enough to address the "gravity" of the harm experienced.
"We consider that there is a level of conflict between the business models of social media platforms and the need to most effectively shield young people from harm. We consider that social media platforms have a commercial incentive to design their platforms to be addicting and stimulating for young people. However, these features may contribute to psychological and behavioural harm."
The report called for further research into online harm in a New Zealand-specific context, and said the committee would further review current legislative settings and whether the government could play a stronger role in regulating or mandating online safety.
It said 12 things warranted further consideration:
1. Restricting access to social media platforms for under-16-year-olds
2. Regulating deepfake tools or "nudify" apps in New Zealand
3. Whether New Zealand legislation, including the Films, Videos, and Publications Classification Act and Harmful Digital Communications Act, is fit for purpose
4. Introducing a national regulator for online safety in New Zealand
5. Ways in which regulatory approaches introduced in New Zealand could be made sufficiently agile to respond to new developments in technology as they occur
6. What role the New Zealand government should play in regulating the design of online platforms
7. Whether there is a need to restrict online advertising of harmful products, such as alcohol, tobacco, and gambling, to under-18-year-olds
8. The level of liability online platforms and internet service providers should be held to for harmful and illegal content hosted through their services
9. The level of responsibility that parents should have in protecting their children from online harm, and the tools they would need to provide this support effectively
10. Advantages and limitations of approaches to increase algorithm transparency
11. Ways to learn from international experiences, including the implementation of the social media ban for under-16-year-olds in Australia
12. Ways to encourage further research on online experiences in New Zealand.
Why ACT disagreed
Despite initiating the inquiry in the first place, the ACT party published a differing view, concerned that the report had "drifted noticeably from its intended purpose".
ACT said the interim report was supposed to summarise the information and advice to lay the groundwork for understanding the issue ahead of the final report, but that it had instead leaned heavily into recommendations and policy options.
"This is premature and risks compromising the quality and integrity of the final report," ACT said.
ACT MP Parmjeet Parmar said it was concerning that some parties appeared to already be moving towards a "predetermined solution" of a ban on social media for under-16-year-olds before the inquiry had run its course.
"ACT agrees that online harm demands attention. Parents remain the first and most effective safeguard, and any government response must be proportionate, workable, and based on evidence," Parmar said.
"New Zealand now has the chance to learn from Australia in real time. Parliament should take that opportunity before deciding what to do next."
The committee expects to release its final report in early 2026.
Where do parties stand on a ban?
National is "highly supportive" of restricting access to social media for under-16-year-olds, with the prime minister saying he was "personally invested" in delivering a ban.
"I think we need to make sure we've got guidelines and guard-rails in place for our kids in the virtual world as we do in the physical world."
ACT leader David Seymour said parents' concerns should be met with a "thoughtful" response, and cautioned against a "knee-jerk" move.
"The whole world's watching Australia, because no-one really knows what's going to happen. There's good arguments it could actually make things worse if it drives children into worse parts of the internet and stops them wanting to talk to their parents about it," he said.
Seymour said he hoped Australia's move was a "raging success", as New Zealand would then know what to do; equally, if Australia ran into problems, New Zealand would know what to avoid.
Green Party co-leader Marama Davidson acknowledged that parents were worried, but said the core issue was tech giants "running unregulated, exploiting our people for their misinformation and disinformation," and that social media harm affected people of all ages.
"They need to be held to account instead of just taking something off young people," she said.
Labour leader Chris Hipkins said Labour would support Wedd's member's bill to first reading, but the party would keep an open mind.
"We're also happy to work with the government to come up with a law that will actually work. We want to get this right, we think future generations of New Zealanders deserve for us all to take this seriously and to get it right."
Hipkins said there were some good reasons for young people to stay connected, and Labour would be following what Australia did closely.
Labour MP Reuben Davidson has his own member's bill in the ballot, which would require providers to build safety-by-design into their products (for example, making the best interests of a child a primary consideration when designing a service or introducing a new function), require them to publish transparency reports, and create offences for failing to comply with the duty to protect children from harmful material.
What work is the government doing?
The education minister is pursuing her own work in government, separate from the ban National is pushing for.
Erica Stanford described her work as "the real teeth", designed around getting social media companies to fully report on how they were protecting children, as well as defining what social media actually is.
"A ban just means that kids will be able to get around it, that's why that won't work on its own."
Stanford said New Zealand was a "fast follower," and would be watching what other countries were doing.
"We have nothing. We don't have a child online protection act, we don't have a regulator, we have nothing. So it's a good place to be and a bad place to be."
Stanford said it looked like what the committee would be proposing was similar to what she was working on.
"When you look overseas, they have child protection acts, they have strong regulators, because they're about changing the behaviour of the social media companies, not just an outright ban."
She said she would be taking a paper to Cabinet shortly.
"Personally? I think that social media companies don't give a damn about our kids."
Asked whether Wedd should have held off on her member's bill before that work was completed, Stanford said the bill was a "useful vehicle" as it created a public discussion.