Netsafe says people shouldn't stop pressuring Facebook to improve, despite its promise to crack down on hate speech and the live-streaming of violent videos.
The social media giant has been heavily criticised for allowing the live-streaming and sharing of the Christchurch terror attack on its platform. It took the company 29 minutes to detect the livestreamed video of the massacre - eight minutes longer than it took police to arrest the gunman.
About 1.3 million copies of the video were blocked from Facebook, but 300,000 copies were published and shared.
In a statement released on Facebook today, chief operating officer Sheryl Sandberg says the company must do more to stamp out hate on its platforms and support the New Zealand community.
Facebook says it's considering restrictions on who can use its livestream technology. It's also researching how it can more quickly identify violent videos and stop them from being re-posted.
It says it's identifying a range of hate groups in New Zealand and Australia, including the National Front, which will be banned from the website.
Netsafe's chief executive, Martin Cocker, is giving the company's latest promise a cautious welcome.
"I think in general people will be pleased to know this is the direction they're travelling. You don't disagree with anything they've said they want to do, but I think people will be rightly sceptical, wanting to wait and see the mechanisms they're planning to put in place actually in place and working."
Mr Cocker says other changes, such as clarifying the definition of hate groups, will help get unacceptable material off the platform.
Facebook also says it will provide support to four New Zealand wellbeing and mental health organisations, and will work with New Zealand's Royal Commission on its review of the role online services can play in terror attacks.
Australian government plans crackdown on social media
The Australian government announced earlier this week that it plans to introduce new laws under which social media executives could face up to three years in prison, and their firms massive fines, if they fail to quickly remove violent material from their platforms.
Prime Minister Scott Morrison says social media companies have a responsibility to ensure their platforms are not exploited by murderous terrorists.
The bill was drafted in response to community outrage at the live streaming of the Christchurch mosque attacks.