Teachers are struggling to get the national standards in reading, writing and maths right, and in some cases they're getting them wrong.
The report, National Standards: School Sample Monitoring and Evaluation Project, is based on a five-year study of how schools are using the standards.
The report said there was strong evidence the results teachers gave children using the standards were not dependable.
The government-funded study said some children's national standards results might be incorrect.
The study tracked use of the standards at 95 schools covering nearly 16,000 children since 2010, and its final report, based on data gathered at the end of 2014, was published in mid-July.
The report said there were several indications that teachers' judgements of how children performed against the standards were not dependable.
They included variation in children's results from one year to the next; big differences between how intermediate and primary schools rated the performance of Year 7 and Year 8 students; and previous comparisons of teachers' judgements to judgements reached with the help of an official Education Ministry guide.
"Considered together, this body of evidence strongly suggests that OTJs (overall teacher judgements) lack dependability, which is problematic as OTJs are a central element of the National Standards system. It should be noted that there is no suggestion that all OTJs are inaccurate, but evidence indicates that a reasonable proportion may be."
The report said consistency problems were not surprising as the standards were introduced only recently and tools to help teachers make their judgements were still being developed.
The problems with teachers' judgements meant apparently improving national standards results could not be used as evidence that student achievement was improving, the report said.
It was possible that teachers in low-decile schools were rating students more highly than their peers in high-decile schools.
The report said primary school children in Year 7 and 8 had better results in all of the national standards compared to intermediate school children in the same year group. For example, 79 percent of Year 8 students at full primary schools were rated 'at' or 'above' the mathematics standards in 2014 compared with 67 percent of Year 8 students at intermediate schools.
A likely explanation for the difference was that primary school teachers judged the standards differently from intermediate school teachers.
In each of the standards, about 36 percent of children in the study received a different national standards result in 2014 than in 2013. That was down from the 38 to 40 percent recorded in 2010.
The study said the change was likely to be because of inconsistencies in teachers' judgements.
Only half of schools were moderating their national standards judgements effectively by concentrating on work at the boundaries, such as between "at" and "below" the standards.
And only 50 percent of school reports to parents were clear, a figure the report said was "concerningly low".
More than 80 percent of principals said the standards were useful for setting targets and reporting to their boards of trustees.
Nearly 90 percent of principals were confident that their teachers were making consistent judgements, but fewer than 20 percent were confident judgements were consistent between schools.
Principals made the following comments:
"We have eight major contributing schools with huge variation of judgement upon entry to our school for students we perceive at similar levels."
"When we get new students from other schools we always feel that their OTJs are much higher than we would give and wonder how they have been derived."
Schools had developed ways of working with the standards that would not change unless there was outside intervention, the report stated.
Standards information robust and improving - ministry
The Ministry of Education's Karl Le Quesne said the accuracy of national standards results had improved since data for the report was collected in 2014.
"We believe that both the assessments that teachers are making, the variability, and the information they're providing to parents has improved," said Mr Le Quesne, acting head of early learning and student achievement.
Mr Le Quesne said more than 400 schools were using the Progress and Consistency Tool - an online guide introduced last year that helps teachers make accurate national standards decisions - and a further 400 wanted to start using it.
He said some of the variation in teachers' judgements was due to the nature of the standards.
"They were set up right from the start to focus on learning that's relevant and important and ironically that's why there's some variability in the way teachers are judging this. If it was just a tick box assessment we would have less variability but we wouldn't be measuring or assessing the learning that really matters."
Educational Institute president Louise Green said the standards were unreliable.
"The standards are shonky, they're unreliable, they've never been tested. Every New Zealand primary and intermediate child is part of a trial with these things."
Ms Green said any two teachers could rate a child's work differently against the standards.
"We know that, in my own school from personal experience, when we've taken writing samples for example to experts, and experts will look at examples and actually come up with very different judgements."
Ms Green said people were mistakenly trying to use national standards results as if they were a test result, whereas they were based on a rich judgement of children's work.
Green Party education spokesperson Catherine Delahunty said the report showed there was no evidence that national standards were working well.
"It just shows you national standards are pretty irrelevant to equipping our kids to be 21st century learners."
Ms Delahunty said children's learning was not an exact science, so there was no point in measuring it against national standards. The Green Party's policy at the next election would likely be to scrap them.