At the start of each month, the research provider, Gravitas Limited, sends out 432 survey invitations.6 The annual information included in the report draws on the experiences of 1,935 New Zealanders who completed Kiwis Count between January 2017 and December 2017.


In 2007, for the first time, the State Services Commission asked a sample of New Zealanders about their experiences and views of public services. Known as the Kiwis Count survey, this provided rich information on how New Zealand’s public services were performing in the eyes of the people who use them. The survey ran for a second time in 2009. These first two surveys were point-in-time surveys.

In 2012 the survey moved to a continuous collection methodology. This is the sixth report of annual results since then.

Based on the methodology of a Canadian government survey called Citizens First, Kiwis Count measures satisfaction with public services. Public services means all services provided by government, including central and local government services, tertiary institutions, schools and hospitals.

Kiwis Count is an integral part of the New Zealanders’ Experience Research Programme (NZE), an SSC research initiative begun in 2007 and designed to find out how New Zealanders experience public services and to develop tools through which services can improve. The other two parts of NZE, designed to work together and to complement and enhance each other, are the work on the drivers of satisfaction and the Common Measurements Tool.

New Zealanders’ Experience Research Programme (NZE)


The Drivers of Satisfaction

The Drivers Survey 7, published in July 2007, identified the key factors (or drivers) that have the greatest influence on New Zealanders’ satisfaction with, and trust in, public services.  The most effective way to improve satisfaction with public services is for agencies to focus on these key drivers.

Kiwis Count has measured the drivers of satisfaction since it began in 2007 and, since 2012, it has also measured the drivers by channel: face to face or by correspondence, by telephone, looking for information online, and transacting online.

The Common Measurements Tool

Kiwis Count measures service satisfaction and trust in government at the macro level.

Agencies are also encouraged to measure satisfaction with their services at a detailed level to help them understand how they are doing in improving areas which really matter to New Zealanders, and where to focus resources so they can have the greatest impact.

The Common Measurements Tool (CMT) is a databank of customer satisfaction questions which agencies can use as they develop their surveys. This saves them from developing their own question banks and allows them to benchmark their results against similar agencies. Approximately half of our agencies use CMT.

Survey Approach

The Kiwis Count team have published a survey methodology report on the SSC website.

New Survey Provider

Gravitas began collecting the survey for SSC in July 2016. 8 To provide comparability with the earlier results, Gravitas collected the survey using the same survey materials and overall method as previously.

There are some differences in the demographics of those who have completed the survey since July 2016. In the unweighted sample, younger respondents made up a larger proportion of completions, while older respondents (particularly those aged 65+) made up a smaller proportion. However, as expected, once weighted to the population, the proportions of each demographic group are very similar to previous years.

Analysis by Gravitas shows no evidence of this difference in the unweighted sample having an impact on the headline results in this report. In general, the difference has improved the robustness of the survey results as the unweighted sample is now more representative of the population for younger and older respondents.

Questionnaire Content

The Kiwis Count survey is modular. At the heart of the survey are questions about the public services that New Zealanders use most frequently. These core questions have been fixed since 2012, with new questions added only as required to reflect actual changes in services.

The modular part of the questionnaire is designed to change as required to focus on service delivery priorities:

  • In the 2012 calendar year the survey included a module of questions on channel use and preferences. This repeated a module of questions which was included in the 2009 survey.
  • Starting in 2013 a module of questions about the ease of transacting with government in the digital environment replaced the previous module. The new module, developed with the team responsible for Result 10 9 of the Better Public Services programme, will be one of a suite of measures used to report on the progress of Result 10.
  • For the first half of 2014, a new module of questions was included about parents’/primary caregivers’ satisfaction with education services.
  • In the second half of 2014, at the request of the Result 10 team, new questions were added:
    • To the Government and the digital environment module, and
    • To the main body of the survey, at A10, about experiencing public services.
  • In 2015, new questions were added, at A13, about the privacy of personal information for the Office of the Chief Privacy Officer.

In 2016 a comprehensive review of the questionnaire was undertaken to ensure its continued usefulness. This new questionnaire is being used from the beginning of 2017.

The new questionnaire can be downloaded or viewed at

Encouraging Online Participation

Sixty-five percent of respondents chose to complete the survey online in the year to December 2017. Online completion rates have been growing since 2015; this compares with online rates of 17% in 2009 and 8% in 2007.

Response Rate

The response rate between January 2017 and December 2017 was 43%. This is at the low end of response rates achieved since the continuous approach was adopted (response rates have ranged between 43% and 53%). This reflects greater emphasis on sampling younger people and Māori – both groups tend to have lower response rates.

Key drivers of response rates include sampling, questionnaire length and incentives provided to complete surveys.

Analysis undertaken by the Kiwis Count team shows a correlation between response rates and questionnaire length.

Sample Size

Collection Year

Numbers of New Zealanders who answered the survey
A total of 24,190 New Zealanders have completed the survey since it began.

Service Quality Scores

The Kiwis Count survey asks New Zealanders to rate services or express opinions using a scale from 1 to 5. To enable comparisons between Kiwis Count and Citizens First to be made, we have adopted the Canadian approach of converting five point rating scales to service quality scores ranging from 0 to 100.

The overall Service Quality Score is calculated by rescaling each respondent’s five point rating (1, 2, 3, 4, 5) onto the 0 to 100 range (0, 25, 50, 75, 100), then averaging these scores across all the services used.

The overall average uses all service experiences, so a respondent who has used ten services contributes ten observations to the overall score and a respondent who has used one service contributes one observation to the overall score.
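The calculation described above can be sketched in a few lines. The function and variable names are assumptions for illustration, not part of the survey's own tooling; the arithmetic follows the rescaling and pooling rules stated in the text.

```python
# Illustrative sketch of the Service Quality Score calculation described
# above; names are assumptions, the arithmetic follows the text.

def to_quality_score(rating: int) -> int:
    """Rescale a 1-5 rating onto the 0-100 range: 1->0, 2->25, ..., 5->100."""
    if not 1 <= rating <= 5:
        raise ValueError("rating must be between 1 and 5")
    return (rating - 1) * 25

def overall_score(ratings_by_respondent: list[list[int]]) -> float:
    """Pool every service experience: a respondent who used ten services
    contributes ten observations, one who used one service contributes one."""
    scores = [to_quality_score(r)
              for ratings in ratings_by_respondent
              for r in ratings]
    return sum(scores) / len(scores)

# One respondent rated three services, another rated one:
# (75 + 100 + 50 + 25) / 4 = 62.5
print(overall_score([[4, 5, 3], [2]]))
```

Because observations are pooled rather than averaged per respondent first, heavy service users have proportionally more influence on the overall score.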

Example: the service quality question




6 More survey invitations were sent out in the first half of 2014 to ensure a sufficient sample of parents/primary caregivers answered the survey for a one-off module for the Ministry of Education.

7 The full report on the Drivers Survey can be found at and the summary report can be found at

8 From January 2012 to June 2016, the survey collection provider was Nielsen.

9 People have easy access to Public Services, which are designed around them, when they need them.

10 This was the number of respondents to the main 2007 survey, where 6,000 people were invited to complete the survey. Prior to the main survey, a pilot survey with 500 invitations was also undertaken.
