
Executive Summary

Kiwis Count results show New Zealanders have high trust in, and satisfaction with, their public services and that their trust and satisfaction continue to grow. This is evidence of a public service that is delivering well and working to improve its services.

A consistent theme to increased Kiwis Count satisfaction scores is agencies designing their services around the needs of the customer. In an environment where customer expectations are increasing, agencies must continue to deliver and evolve, with the customer in mind, to ensure satisfaction scores are maintained and increased. This report includes two case studies (from NZ Police and Inland Revenue) which illustrate this.

The results show there is much to be proud of, but also identify areas for improvement.

Trust

New Zealanders' trust in public services has increased over time as the number of New Zealanders who distrust public services has decreased.

Levels of trust based on experience have increased two percentage points over the year to 79%, twelve percentage points higher than in 2007. However, the perception of trust has declined two percentage points over the year to 43%, following a four percentage point increase in 2014. Perception of trust results have fluctuated since 2012 but are trending upwards, with the 2015 result 14 percentage points higher than in 2007.

Since 2013, the perception of trust in the public sector has been higher than that in the private sector.

Demographic analysis shows:

  • Females report higher levels of trust than males by experience but not by perception.
  • Trust by experience and perception is higher among those of NZ European, Asian and Other ethnicities than Māori and Pacific ethnicities. The perception of trust among those of Asian ethnicities is significantly higher than for all other ethnicities.
  • Trust within the 65+ age group is highest of any age group, based on both experience and perception.
  • Those with a disability have lower levels of trust, based on both experience and perception.

Satisfaction

The overall Service Quality Score (SQS) for June 2015 is 74, one point higher than June 2014 and two points higher than June 2012 and June 2013. Satisfaction has increased from 68 in 2007.

Over the 2015 year:

  • 23 services improved their annual scores on their June 2014 score. Eight of these improvements were statistically significant.
  • Nine services recorded the same service quality score as in June 2014.
  • Ten services recorded decreases in service quality over the year. None of these decreases were statistically significant.
Demographic analysis at a total SQS level shows:
  • Overall, females are more satisfied with public services than males.
  • Those of Asian ethnicity have the highest satisfaction of all ethnic groups with Māori the lowest.
  • Satisfaction among those aged 55+ is higher than for younger people.
  • Satisfaction is lower among those people who reported having a disability.

Since 2007 satisfaction with public sector services has been higher than for the private sector.

Introduction

Kiwis Count asks New Zealanders about their experience of using public services. It is part of the New Zealanders' Experience (NZE) research programme, established in 2007 by the State Services Commission (SSC). Kiwis Count assists SSC to monitor satisfaction with, and trust in, public services over time. The research programme addresses questions that underpin two of the State Services Commissioner's key statutory functions:

  • promote and reinforce standards of integrity and conduct in the State Services
  • work with leaders across the State Services to improve the way agencies think, operate and perform. In regard to this role, SSC is shifting its approach, from setting policy and the framework to change, to assisting agencies to work at pace across agency boundaries to deliver improved services and results to New Zealanders.

Initiatives, such as Kiwis Count and the Continuous Improvement (CI) programme, are examples of how SSC provides practical assistance to agencies to deliver improved results to New Zealanders. Case studies from New Zealand Police and Inland Revenue (pages 16-21) show how agencies have focused on understanding their customers' needs in order to improve service design and delivery. Kiwis Count lets these agencies measure their service improvements over time and benchmark them against other public services. In addition, between December 2014 and August 2015, New Zealand Police worked with SSC's “better every day” CI team to understand and improve customer experience on first contact with Police. An initial project in South Auckland involved capturing data on high frequency requests for service and identifying and trialling ways to make it easier for customers to connect with Police services and receive a satisfactory response. The work has now been assigned to internal Police project groups for implementation.

Kiwis Count is based on the methodology of the Canadian government survey, Citizens First. Appendix 1 describes the Kiwis Count survey methodology.

The 2015 Kiwis Count annual report draws on the experience of 2,365 New Zealanders who completed the survey in the year to 30 June 2015. It presents the annual trust and overall satisfaction figures, the annual Service Quality Score (SQS) results for individual services and includes demographic analysis of the trust and service quality results.

More information about the NZE research programme is on page 14. That section describes the key factors (or drivers) that have the greatest influence on New Zealanders' satisfaction with, and trust in, public services and shows performance information since 2012 on the key drivers of customer satisfaction.

2015 Annual Results: Trust

Trust in Public Services

Kiwis Count measures trust in public services in two ways: by perception and by experience.

Consistently, New Zealanders' trust in public services by experience (“Thinking about your most recent service contact, can you trust them [public servants] to do what is right?”) has measured much higher than the perception of trust (“Thinking about your overall impressions and from what you know or have heard from family, friends or the media, to what extent do you trust the public service?”).

By both measures, trust has increased markedly since 2007.

Trust in public services by experience increased to 79% in 2015. This is a two point increase over the last year and a 12 point increase since 2007.

Perception of trust increased 13 points from 2007 to 2012 and has levelled out since then, oscillating between 41% and 45%. The 2015 result is 43%, two points lower than the 2014 result and 14 points higher than the 2007 result.

Figure 1: Experience and Perception of Trust in Public Services


The percentage of people who do not trust public services declined over 2015 after being static between 2012 and 2014. Distrust levels are lower than they were in 2007 and 2009: perception of distrust is now ten percentage points lower, and distrust based on experience four percentage points lower, in 2015 than in 2007.

Figure 2: Experience and Perception of Distrust in Public Services


The OECD states “trust in government is essential for social cohesion and well-being.”

In addition to conducting the Kiwis Count survey, the State Services Commission regularly surveys State servants’ perceptions on the integrity and conduct of their colleagues and managers. The Integrity and Conduct survey has consistently found that State servants rate the integrity and conduct of their colleagues and immediate managers highly. The 2013 Integrity and Conduct survey found that 89% of State servants agreed that the people they work with on a day to day basis demonstrate standards of integrity and conduct and 81% of respondents reported that they “go the extra mile” in working for their agency.

At the beginning of 2015, extra questions were added to the Kiwis Count survey to measure whether customers considered that the public service staff they dealt with in their last interaction had "gone the extra mile" for them. This was done to see whether public servants' views about how they treat customers match how customers themselves feel they are treated.

The results showed:

  • Of all respondents whose last contact with the public service was by visiting an office or location, receiving a visit, or sending or receiving a letter, fax or email, 71% considered staff went the extra mile to help them get what they needed; 16% were neutral and 12% answered in the negative (a 1 or 2 on a 5-point scale).
  • Of all respondents whose last contact with the public service was by telephone, 67% considered staff went the extra mile to help them get what they needed; 16% were neutral and 17% answered in the negative.

The Kiwis Count trust results support the view that “the New Zealand State services are rated highly for their standards of integrity and conduct at the international level, and are considered to be one of the most transparent public services in the world”. International comparator measures of trust continue to rate New Zealand either at, or near, the top countries of the populations measured.

Trust in Non-government Services

Since 2012, Kiwis Count has also measured New Zealanders' perception of trust in the private sector (“to what extent do you trust the private sector?”), to help benchmark the results for public services.

Figure 3 shows that, since 2014, the perception of trust in the private sector has been lower than the perception of trust in public services. The four point gap which opened up in 2014 narrowed to a three point gap in 2015, as the perception of trust results for both public services and the private sector have oscillated since 2012. Private sector trust was very similar to that for public services in 2012 and 2013.

Figure 3: Perception of Trust in Public Service and the Private Sector


Trust by Demographic Breakdown

In this section we highlight demographic differences that were either statistically significant in 2015 or that reflected demographic patterns.

Based on most recent experience of public sector services:

  • Trust among females is higher than males.
  • Those of NZ European, Asian and Other ethnicities have higher levels of trust than those of Māori and Pacific ethnicities.
  • Levels of trust within the 65+ age group are the highest of any age group.
  • Those with a disability report lower levels of trust.

Based on perception of trust in public sector services:

  • Trust among females is lower than males.
  • Those of Asian ethnicity have higher levels of trust than for any other ethnic group, with those of Māori and Pacific ethnicities reporting the lowest levels.
  • Trust within the 65+ age group is higher than other age groups.
  • Those who reported having a disability had lower levels of trust.
  • Those with a personal income over $70k report higher levels of trust than those who earn less.

Demographic results for perception of trust in private sector services were similar to results for public sector services.

2015 Annual Results: Service Quality

Overall Service Quality Trend

Kiwis Count measures New Zealanders' satisfaction with 42 commonly used public services. Individual services have Service Quality Scores (SQS) calculated and an overall SQS across all services is then derived from them.

The results present an overall positive picture. Satisfaction with public services continues to steadily improve.

In the June 2015 year, overall satisfaction increased one point to 74. This is a six point increase from 68 in 2007. Compared with Canada over a similar period, New Zealand's rate of improvement is higher: Canada's overall satisfaction increased two points, from 72 in 2008 to 74 in 2014.

Figure 4: Kiwis Count Annual Service Quality Score


Private Sector Service Quality

Kiwis Count also asks New Zealanders about their use of, and satisfaction with, seven types of private sector companies (banks or finance companies, insurance companies, internet service providers, postal or courier companies, telephone companies, credit card companies and electricity or gas companies). These scores are also aggregated to give an overall private sector SQS.

Figure 5 shows that, since 2007, satisfaction with private sector services has been lower than satisfaction with public services. Satisfaction with private sector services has increased over the whole period, but the rate of increase has been slower than that of satisfaction with public services.

Figure 5: Service Quality Score (Public Services and the Private Sector)


Service Quality by Demographic Breakdown

In this section we highlight demographic differences that were either statistically significant in 2015 or that reflected demographic patterns:

  • Satisfaction with both public and private sector services is higher for females than for males in 2015; by two points for public sector services and three points for private sector services.
  • By ethnic group, satisfaction with public sector services for those of Asian ethnicity is highest (76), followed by NZ Europeans (74), Other (73), Pasifika (71), with Māori satisfaction lowest at 70.
  • By age group, satisfaction of 55+ year olds with public sector services is higher than for younger people.
  • In contrast to all other age groups, satisfaction with private sector services is higher than satisfaction with public sector services for 18-24 year olds.
  • There are no differences in satisfaction with public services by personal income, although those with a personal income over $70k had the lowest satisfaction with private sector services.
  • Satisfaction with public sector services is two points higher for those without a disability than for those with a disability.

Individual Services Quality Trend

As described in Appendix 1, Kiwis Count measures New Zealanders' satisfaction with 42 commonly used services. In this section, we discuss changes in SQS for individual services.

We present the individual services' levels of improvement over the previous year and also highlight those services which have had statistically significant increases or decreases since they were first measured by Kiwis Count.

Individual service Kiwis Count data can be viewed here

The level of improvement over time is important as it highlights services that are improving as well as those that may be stagnating or falling behind.

Whether an increase is statistically significant is a function of how much a service is used as well as the size of the change. For example, visiting a public library has experienced a statistically significant increase despite only a one point SQS increase since the service was first measured, because it is the service with the second highest usage (55% of respondents had visited a public library within the past year).
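
To illustrate why usage (and therefore sample size) matters, the sketch below applies a simple two-sample z-test to a one point change in mean SQS. It is illustrative only: the means, standard deviations and sample sizes are hypothetical, and Kiwis Count's actual significance testing may use a different method.

```python
# Illustrative sketch only: Kiwis Count's exact significance test is not described
# here, so this assumes a simple two-sample z-test on mean SQS. All numbers are
# hypothetical.
import math

def z_statistic(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
    """z statistic for the difference between two independent mean SQS values."""
    standard_error = math.sqrt(sd_a ** 2 / n_a + sd_b ** 2 / n_b)
    return (mean_b - mean_a) / standard_error

# The same one point rise in SQS (83 -> 84, sd ~20) ...
print(z_statistic(83, 20, 3500, 84, 20, 3500))  # ~2.1: significant at the 5% level (|z| > 1.96)
print(z_statistic(83, 20, 150, 84, 20, 150))    # ~0.4: not significant
```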


Since 2007 (or first measured)

In June 2015, 34 services had annual scores higher than their first measured score. Twenty-one of these increases have been statistically significant. They are:

  • A childcare subsidy.
  • Stayed in a public hospital.
  • Contact with Statistics NZ for information or about taking part in a survey.
  • A hunting or fishing licence.
  • A passport.
  • Enquired about tax, receiving tax credits (such as Working for Families), Student loan repayments or KiwiSaver.
  • National environmental issues or the Resource Management Act.
  • Received outpatient services from a public hospital (includes A & E).
  • Registering a birth, death, marriage or civil union.
  • Registering a new company or filing an annual return for a registered company.
  • The arrival process after landing at a New Zealand international airport from Australia.
  • The arrival process after landing at a New Zealand international airport from anywhere except Australia.
  • The Police (for a non-emergency situation).
  • Used an 0800 number for health information.
  • Your local council about a building permit.
  • A kindergarten, day-care, crèche, preschool, home-based service, playcentre, Kōhanga Reo, Aoga Amata, Puna Reo or playgroup etc that your child attends or may attend in the future.
  • Accident compensation for injuries.
  • Emergency services i.e. 111.
  • Paying fines or getting information about fines.
  • Registered a business entity for tax purposes or filed a tax return.
  • Visited a public library.

Three services recorded the same SQS as first measured.

Five services had decreases in SQS since they were first measured. None of these decreases are statistically significant.

The median level of improvement since first measured is +4 points, with the level of change ranging from +11 to -3 points.


Over the Year:

In June 2015, 23 services improved their annual scores on their June 2014 score. Eight of these improvements were statistically significant:

  • Applying for or receiving a student loan or student allowance.
  • Visited sorted.org.nz for information to help manage your personal finances or retirement income.
  • A hunting or fishing licence.
  • A passport.
  • Stayed in a public hospital.
  • Enquired about tax, receiving tax credits (such as Working for Families), Student loan repayments or KiwiSaver.
  • Received outpatient services from a public hospital (includes A&E).
  • The arrival process after landing at a New Zealand international airport from Australia.

Nine services recorded the same SQS as in June 2014.

Ten services recorded decreases in SQS over the year. None of these decreases are statistically significant.

The median level of change over the year is +1 point, with the level of change ranging from +8 to -7 points.


Over the Quarter:

In June 2015, 31 services improved on their March 2015 scores. Four of these increases were statistically significant:

  • Living in a Housing New Zealand home.
  • ERO school or early childhood reports.
  • Visited sorted.org.nz for information to help manage your personal finances or retirement income.
  • A state or state integrated (public) school that your child attends or may attend in the future.

Four services recorded the same SQS as March 2015.

Seven services recorded decreases in service quality over the quarter. None of these decreases are statistically significant.

The median level of change over the quarter was +1 point, with the level of change ranging from +15 to -4 points.

Table 1 Annual Service Quality Scores for Individual Services (ordered by level of improvement since 2014)

| Service | 2007 | 2009 | 2012 | 2013 | 2014 | 2015 | Level of improvement over year | Significant change from 2014 | Significant change since first measured |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Applying for or receiving a student loan or student allowance | 59 | 59 | 52 | 59 | 54 | 62 | 8 | YES | |
| A childcare subsidy | 56 | 65 | 60 | 67 | 60 | 67 | 7 | | YES |
| Stayed in a public hospital | 68 | 71 | 74 | 74 | 72 | 77 | 5 | YES | YES |
| Contact with Statistics New Zealand for information or about taking part in a survey | 65 | 67 | 67 | 68 | 67 | 71 | 4 | | YES |
| Visited sorted.org.nz for information to help manage your personal finances or retirement income | - | - | 81 | 75 | 76 | 80 | 4 | YES | |
| A hunting or fishing licence | 77 | 72 | 79 | 80 | 79 | 82 | 3 | YES | YES |
| A passport | 76 | 77 | 79 | 79 | 80 | 83 | 3 | YES | YES |
| Enquired about tax, receiving tax credits (such as Working for Families), Student loan repayments or KiwiSaver | - | 59 | 61 | 62 | 60 | 63 | 3 | YES | YES |
| Registering a birth, death, marriage or civil union | 72 | 75 | 77 | 80 | 76 | 78 | 2 | | YES |
| Received outpatient services from a public hospital (includes A & E) | 69 | 68 | 72 | 74 | 73 | 75 | 2 | YES | YES |
| Used an 0800 number for health information | 67 | 70 | 70 | 75 | 74 | 76 | 2 | | YES |
| The arrival process after landing at a New Zealand international airport from Australia | - | 73 | 79 | 80 | 79 | 81 | 2 | YES | YES |
| Your local council about a building permit | 44 | 51 | 55 | 49 | 50 | 52 | 2 | | YES |
| Importing goods into New Zealand or customs duties | 62 | 57 | 66 | 63 | 64 | 66 | 2 | | |
| The police (for a non-emergency situation) | 62 | 64 | 66 | 66 | 68 | 70 | 2 | | YES |
| Registering a new company or filing an annual return for a registered company | 70 | 71 | 74 | 74 | 72 | 74 | 2 | | YES |
| Your local council about road maintenance | 42 | 45 | 48 | 48 | 46 | 48 | 2 | | |
| A housing subsidy or accommodation supplement | 56 | 62 | 64 | 61 | 59 | 61 | 2 | | |
| A state or state integrated (public) school that your child attends or may attend in the future | 77 | 72 | 74 | 75 | 76 | 78 | 2 | | |
| Your local council about property rates | 59 | 57 | 55 | 59 | 60 | 61 | 1 | | |
| Employment or retraining opportunities | 64 | 61 | 59 | 61 | 63 | 64 | 1 | | |
| Licensed or registered a vehicle | - | - | 80 | 80 | 80 | 81 | 1 | | |
| National environmental issues or the Resource Management Act | 41 | 48 | 42 | 45 | 48 | 49 | 1 | | YES |
| Emergency services i.e. 111 | 73 | 77 | 84 | 81 | 79 | 79 | 0 | | YES |
| The arrival process after landing at a New Zealand international airport from anywhere except Australia | - | 72 | 77 | 79 | 79 | 79 | 0 | | YES |
| The Community Services card | 73 | 74 | 75 | 75 | 75 | 75 | 0 | | |
| Visited a public library | 83 | 82 | 85 | 84 | 84 | 84 | 0 | | YES |
| Visited a national park | 79 | 76 | 78 | 78 | 79 | 79 | 0 | | |
| A kindergarten, day-care, crèche, preschool, home-based service, playcentre, Kōhanga Reo, Aoga Amata, Puna Reo or playgroup etc that your child attends or may attend in the future | 73 | 76 | 77 | 79 | 81 | 81 | 0 | | YES |
| Accident compensation for injuries | 65 | 64 | 70 | 67 | 70 | 70 | 0 | | YES |
| Registered a business entity for tax purposes or filed a tax return | - | 64 | 71 | 71 | 70 | 70 | 0 | | YES |
| Paying fines or getting information about fines | 54 | 57 | 63 | 63 | 63 | 63 | 0 | | YES |
| New Zealand Superannuation | 79 | 75 | 84 | 83 | 81 | 80 | -1 | | |
| Obtain, renewed, change or replace a driver licence | - | - | 75 | 73 | 74 | 73 | -1 | | |
| Your local council about rubbish or recycling (excluding the actual collection of rubbish and recycling from your household each week) | 63 | 65 | 68 | 66 | 67 | 66 | -1 | | |
| A rental property bond lodgement, refund or transfer | - | - | 71 | 71 | 71 | 70 | -1 | | |
| A university, polytechnic or wānanga about a course you are attending or may attend in the future | 70 | 70 | 75 | 74 | 73 | 72 | -1 | | |
| A court, about a case you were involved with | - | 52 | 49 | 50 | 56 | 54 | -2 | | |
| ERO (Education Review Office) school or early childhood reports | - | - | 68 | 66 | 71 | 68 | -3 | | |
| Obtaining family services or counselling | 68 | 65 | 65 | 69 | 70 | 66 | -4 | | |
| Receiving a benefit such as Jobseeker Support, Sole Parent Support or a Supported Living Payment | 59 | 59 | 60 | 61 | 60 | 56 | -4 | | |
| Living in a Housing New Zealand home | - | - | 58 | 64 | 66 | 59 | -7 | | |

The Customer Voice

Using data to improve service design and delivery leads to improved customer satisfaction. Data can show whether initiatives are working and where modifications need to be made.

This section of the report will:

  • discuss the State Services Commission's (SSC) New Zealanders' Experience Research Programme (NZE), of which Kiwis Count is a part
  • present a case study from NZ Police about how it has used key aspects of NZE to drive performance improvement
  • present a case study from Inland Revenue about how its focus on improving customer experience led its peak-season results to turn an important corner this year.

New Zealanders' Experience Research Programme

This programme is a research initiative designed to find out how New Zealanders experience public services and to develop tools through which services can improve.

There are three integrated aspects of NZE that were designed to complement and strengthen each other: Kiwis Count, the Common Measurements Tool and the Drivers of Satisfaction (see figure 6).

Kiwis Count measures service satisfaction and trust in government at the macro level.

Agencies are also encouraged to measure satisfaction with their services at a detailed level to help them understand how they are doing in improving areas which matter most to New Zealanders, and where to focus resources for the greatest impact.

The Common Measurements Tool (CMT) is a databank of customer satisfaction questions which agencies can use as they develop their surveys. This saves them from developing their own question banks and allows them to benchmark their results against similar agencies. Approximately a third of New Zealand's service delivery agencies use CMT.

Figure 6: New Zealanders’ Experience Research Programme (NZE)


The Drivers of Satisfaction

The Drivers Survey[12], published in July 2007, identified the key factors (or drivers) that have the greatest influence on New Zealanders' satisfaction with, and trust in, public services. One way of improving satisfaction with public services is for agencies to focus on these key drivers.

Kiwis Count has measured the drivers of satisfaction since it began in 2007.

Performance on the drivers in 2012 was higher than in 2007 and 2009.

Not all drivers are equal: ‘the service experience met your expectations' is the most important driver as it accounts for nearly one third of satisfaction with public services. Results for this driver are shown in figure 7.

Figure 7: Performance by channel on the driver 'The service experience met your expectations'


Overall results for the driver "the service experience met your expectations" are higher at June 2015 than at June 2012, with the spread of results across channels 6 percentage points narrower in 2015 than in 2012:

  • At June 2015, the channel with the highest result was the ‘general: face to face channel' (81%).
  • Results for the ‘general: face to face' and ‘phone' channels steadily increased through the period. The ‘phone' channel has come from the lowest base, and is still the channel with the lowest satisfaction, but it has increased the most of any channel (9 percentage points) over the period.
  • Despite a large (5 percentage point) increase for ‘transacting online' in 2013, which was maintained in 2014, the 2015 result returned to the 2012 level of 78%.
  • ‘Looking for information online' increased over the period by 4 percentage points. However, the large (8 percentage point) increase it experienced from 2012 to 2013 was not maintained and the channel experienced a decrease of 4 percentage points in 2014 which was then maintained in 2015.

The fluctuation in online channel satisfaction with this driver may indicate that expectations for online channels are increasing, rather than that poorer services are being provided per se. This period covers a time when mobile uptake has been dramatic. As digital has become so integral in people's lives, behaviour has changed, but changes to online service design may not be occurring at the same rate. It is a signal for agencies to keep up with delivering services in the way their customers want to operate.

Appendix 2 reports performance on the drivers for all channels. Similar results to those reported above can be seen in these graphs. Since 2012, performance on:

  • the General: Face to Face channel has been generally flat
  • online channels increased at first and then fell back somewhat
  • the telephone channel has increased.

The Key Drivers of Customer Satisfaction

  • The service experience met your expectations.
  • Staff were competent.
  • Staff kept their promises, they did what they said they would do.
  • You were treated fairly.
  • You feel your individual circumstances were taken into account.
  • It's an example of good value for tax dollars spent.


Case Studies

New Zealand Police

NZ Police has focused on bringing the customer into the heart of its interactions with New Zealanders. As a result, the SQS for the service "The Police (for a non-emergency situation)" has increased 8 points since first measured. This is 2 points more than the overall SQS has increased over that time. The service's rate of improvement over the past two years is double the overall improvement.

Figure 8: Service Quality Score (The Police and the Public Service)


In the first case study, on pages 17-20, NZ Police describes how it has used customer experiential data to improve services to the public.

Inland Revenue

Since the Kiwis Count survey became continuous, a clear pattern of satisfaction decreasing in Inland Revenue's peak period (Jan - Mar) has been evident.

This year, for the first time, this decline did not happen.

Figure 9: Quarterly Service Quality Score (Enquired about tax, receiving tax credits (such as Working for Families), Student loan repayments or KiwiSaver)


Inland Revenue has been working to improve the service experience of its customers, with a particular focus on meeting peak demand more quickly.

The reversal of the March decline in the ‘Enquired about tax, receiving tax credits (such as Working for Families), Student loan repayments or KiwiSaver' service has contributed to the service's statistically significant annual increase to June 2015.

In the second case study, on pages 21–22, Inland Revenue describes its multi-faceted approach to improving customer satisfaction.

"Making Every Contact Count" - Case study from New Zealand Police

Putting the people we serve at the centre of everything we do is critical to the way Police works. Over recent years we have focussed on people's service experiences with Police, particularly victims of crime. Our vision is to have the trust and confidence of all. Central to this is that we make every contact count.

Service quality is a critical driver for policing services. Police has used the New Zealanders' Experience (NZE) research programme in a very practical way to tie the expectations of New Zealanders to the way we develop, manage and monitor the public's service experiences with Police.

In the Beginning

In 2007, we were looking at new ways to be more citizen centred. At the same time the SSC embarked on the New Zealanders' Experience (NZE) research programme. The resulting Drivers of Satisfaction that the NZE programme identified resonated with our own work in understanding what was important to the people we served.

The Police Executive approved Service Excellence - a programme of work focussing on the public's service experiences with Police that aimed to make every contact with a member of the public count. One of the first actions was to set up the Citizens' Satisfaction Survey, using the Common Measurements Tool (CMT), so that we could understand people's experiences and track progress.

Using the Drivers of Satisfaction

We wired the Drivers of Satisfaction into every aspect of our services: from our commitment of service to the public, to service delivery standards for every point of contact and to measuring and monitoring people's service experiences.

We have made a commitment to the public to meet their expectations. Our commitment is based on the drivers of satisfaction:

  • We will treat you fairly.
  • Our staff will be competent.
  • We will do what we say we'll do.
  • We aim to meet your service expectations.
  • We will take your individual circumstances into account.
  • Our service will be good value for your tax dollars.

Building the Voice of Citizens into Police Services

We focused on people's service experiences: at the roadside, on the telephone, at the station and for all other operational interactions. We asked people to tell us what behaviours were important for each of the drivers at each point of contact. We asked the same thing of our front line staff.

The information gave us a clear understanding of what behaviours our staff and leaders need to demonstrate to meet the expectations of the people we serve. We used it to create service delivery standards. We have standards for all points of contact including ones for our Information and Communications Technology Service Centre who provide a service to internal customers.


Making it Memorable

We developed the PEOPLE framework to help people remember the drivers of satisfaction and our Commitment of Service. We linked each driver to the corresponding letter of PEOPLE - Positive, Expert, Ownership, Professionalism, Listening, and Excellence. The PEOPLE framework is used in training and visual materials.

Over the last six years most staff in front line service delivery roles have received some form of Service Excellence training - leadership workshops, face to face and online training. This training is now part of business as usual. All new Police recruits learn about the drivers through the PEOPLE framework.

Embedding the Drivers of Satisfaction – An Example

The practical application of Service Excellence can be seen in the way the Communications Centre (Police call centre) has embedded the drivers of satisfaction into the way they do their business. They are a core part of training and the quality assurance process.

The National Communications Centre management team prepares quarterly reports on their Citizens' Satisfaction Survey results including verbatim comments and opportunities for improvement for staff. They have seen sustained improvement over time in the ratings people who have received services from the Communications Centre give them.


Measuring and Monitoring

We have used the Common Measurements Tool questions in our Citizens' Satisfaction Survey since 2008 to measure our progress and to better understand people's service experiences. Gravitas Research and Strategy Limited interview a random sample of around 10,000 people per year on our behalf. They also call a targeted sample of people who have called the Communications Centre.

Around 41% of people in the random sample will have had contact with Police in the last six months. We are interested in their service experience. Results are used in a variety of ways. Key measures - overall satisfaction and being treated fairly – are included in monthly performance reports for the Executive. Districts receive tailored reports.

Results are used to track progress and identify opportunities for improvement particularly for communities that are important to us such as victims of crime, Māori and young people. We now have a significant database of over 80,000 interviews to assist our understanding of people's service experiences.

How have we Done?

We can show a range of improvements in people's service experiences with Police at national, district, demographic and point of contact levels over time (2008/09 - 2014/15):

4 percentage points - increase in people who had a face-to-face interaction (not at the roadside or station) agreeing/strongly agreeing that they were treated fairly (88% - 92%).

5 percentage points - increase in overall satisfaction with service experiences at the roadside (79% - 84%).

6 percentage points - increase in people who called their local station agreeing/strongly agreeing that the service they received was good value for tax dollars (67% - 73%).

8 percentage points - increase in callers to the Communications Centre who agreed or strongly agreed their individual circumstances were taken into account (78% - 86%).

10 percentage points - increase in people very satisfied with their overall service experience (37% - 47%).

11 percentage points - increase in people whose service experience at the public counter was better or much better than they expected (28% - 39%).

12 percentage points - increase in people who telephoned their local station strongly agreeing that staff were competent (34% - 46%).

14 percentage points - increase in callers to the Communications Centre who strongly agreed that we did what we said we would do (34% - 48%).

We also measure level of trust and confidence in Police in the Citizens' Satisfaction Survey. The most common reason people give us for their trust and confidence in Police having improved over the last year is a positive service experience. Trust and confidence in Police has increased from 72% to 77%.

Next Steps

We have made a commitment to the public about the quality of service that they can expect from us. We will continue to focus on improving the service experiences of the people we serve whenever and however they have contact with us. The feedback gathered from the public using the Common Measurement Tool assists in deepening our understanding of people's service experiences and ensuring that we are keeping our services aligned with the expectations of the communities we serve. Providing excellent service is part of the NZ Police overall strategy.


"Making Tax Easier for Kiwis" - Case study from Inland Revenue

Inland Revenue is making tax simpler, more open and more certain for customers and businesses. We've made great strides towards making it easier to file a tax return and claim a refund, receive the right entitlements, pay GST and find information online or talk with us in person.

Customers Moving to Digital

We are helping our customers to do more online and save time by improving our online services, making it easier to register, and promoting our online services in new ways.

Customers are embracing new digital opportunities with the number of people moving to online services increasing dramatically - last year 67% of returns and almost 83% of payments were done online.

This year we improved our website and 'myIR' online services, and made it easier to register and find information. People can now activate their new 'myIR' account instantly via text. We introduced more online documents to provide access to statements, notices and letters electronically.

Completing tax and social policy obligations online also helps customers to get it right the first time. This year 80% of tax refunds from Personal Tax Summaries were received by customers within one day.

The number of self-help service contacts has increased by nearly two million since last year; this means that when customers do need to call us they don't have to wait long and we are available to help them with their more complex queries.

Customers expect to be able to interact with us using all their digital devices. In January 2015, we launched our new free mobile application ‘myIR' on Apple products. This enables small to medium enterprises to manage their GST through their smartphone.

We are continually improving our website and YouTube information to make sure customers are able to find the information they need when they need it. One of the improvements this year was to add transcripts of our introductory videos written in Korean and traditional and simplified Chinese to reflect the changing needs of our customers. These are in addition to the existing information in English, Māori and New Zealand Sign Language.

Our work to make tax easier to understand on our website was recognised in December 2014 when ird.govt.nz won the ESET NetGuide best Government website award for the eighth time in nine years.

Better Call Handling

Going digital gives customers increased certainty. But if they need to call us we're answering more calls and answering them faster: 75% of our 3.24 million calls received last year were answered in less than two minutes. Reducing the time it takes for us to answer customers' telephone calls improves their experience and satisfaction. This is why we aim to answer calls as quickly as possible.

We introduced Voice ID in November 2011 as a way to recognise people and remove the need for us to ask questions to confirm that we are talking to the right people. This saves the customer time and enables us to answer calls faster. People register with Voice ID once then use the system each time they call. At the end of June 2015, 1.5 million people were registered for Voice ID. Inland Revenue has one of the world's highest rates of enrolment for voice recognition services.

Building on our experience from previous years, we have planned better to manage our peak times, taking a customer-centric approach rather than a service-level centric approach. For example, this year we better understood the key reasons customers contact us and focused on capacity to anticipate and minimise repeat contacts for these services.

We managed our calls more effectively by directing calls to staff with the best skills to help and by offering our customers call back options, including the opportunity to book a time for us to call them back when they are free. We have also reduced the number of phone calls that are related to follow-up contact by improving how quickly we get back to people.

Our technology and service delivery teams worked closely together to ensure there was increased support for our systems so they were available during our busiest times.

Targeted Marketing and Communications

Proactive, targeted marketing and communication to customers helped to direct customers to the right channel to file and pay on time and manage peak demand times more effectively.

One example was this year's peak season "The tax refund workout" marketing campaign which aimed to get people who think they may be due a tax refund to check and confirm through 'myIR'. The campaign used a combination of online advertising, direct marketing activity and communications to reach customers before they needed to call us.

All of these activities contribute to Better Public Services, making it simpler and more seamless for New Zealanders to deal with Government. There is more to come with planned improvements to many aspects of New Zealand's tax system in the coming years that will make it simpler and reduce compliance cost for Kiwis.

Appendix 1: The Kiwis Count Survey

This report is the third report of annual results, and the thirteenth in a series of quarterly updates from the Kiwis Count survey.

The annual information included in the report draws on the experiences of 2,365 New Zealanders who completed Kiwis Count between July 2014 and June 2015.

Background

In 2007, for the first time, the State Services Commission asked a sample of New Zealanders about their experiences and views of public services. Known as the Kiwis Count survey, this provided rich information on how New Zealand's public services were performing in the eyes of the people who use them. The survey ran for a second time in 2009.

In late 2011, the State Services Commission contracted Nielsen to manage the collection and reporting of Kiwis Count. Nielsen and Commission staff worked together to turn Kiwis Count from a point-in-time survey into a continuous survey with ongoing data collection and regular reporting. This enables trends over time to be examined and the earlier identification of issues.

Based on the methodology of a Canadian government survey called Citizens First, Kiwis Count measures satisfaction in public services. Public services means all services provided by government and includes central and local government services, tertiary institutions, schools and hospitals.

Kiwis Count is part of a wider research initiative called the New Zealanders' Experience Research Programme (NZE) designed to find out how New Zealanders experience public services, and to develop tools through which services can improve (see figure 6: New Zealanders' Experience Research Programme (NZE) on page 14).

Kiwis Count Updates

Each quarter, the State Services Commission publishes an update from Kiwis Count to highlight areas of strength and areas for improvement in the quality of service delivered to New Zealanders. The focus of releases has been on the core part of Kiwis Count - the service quality scores for 42 commonly used services. With each quarterly update, a clearer picture of the trends in the quality of service delivery is emerging.

Each quarterly release reports on the past two quarters of data. For example, a September quarterly release will report on data collected from April to September and compare these results to those collected between January and June. This six-monthly rolling average approach boosts the sample size to over 1,000 per quarter, reducing the potential for volatility from quarter to quarter.

In addition to quarterly reporting, each June report also reports on an annual data series. For example, the June 2015 release reports on data collected between July 2014 and June 2015, and compares this with data collected between July 2013 and June 2014, and July 2012 and June 2013.
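
As a rough illustration of these reporting windows, the sketch below computes a six-month rolling figure and a July-June annual figure from hypothetical monthly mean SQS values. It is a sketch only: the real calculation aggregates respondent-level data, and the monthly values here are invented.

```python
# Illustrative sketch of the reporting windows described above. The monthly mean
# SQS values are hypothetical; Kiwis Count aggregates respondent-level data.
from statistics import mean

monthly_sqs = {(2014, m): 73.0 for m in range(7, 13)}       # Jul-Dec 2014 (hypothetical)
monthly_sqs.update({(2015, m): 74.0 for m in range(1, 7)})  # Jan-Jun 2015 (hypothetical)

def window_mean(start, end):
    """Mean SQS over the months from start to end inclusive ((year, month) tuples)."""
    return mean(v for k, v in monthly_sqs.items() if start <= k <= end)

# June 2015 quarterly release: the latest six months, compared with the six months
# ending one quarter earlier.
latest = window_mean((2015, 1), (2015, 6))
previous = window_mean((2014, 10), (2015, 3))

# June 2015 annual series: July 2014 to June 2015.
annual = window_mean((2014, 7), (2015, 6))
print(latest, previous, annual)
```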

Potential Reporting Changes

It was appropriate to use the first data point in June 2012 as the beginning of the annual series when there was limited data. As we approach December 2015 and the point where there are four years of continuous data, we are considering moving to a December year end for annual reporting.

We are also considering semi-annual or annual public reporting only. If you have a view about this, you can email newzealanders.experience@ssc.govt.nz.

Survey Approach

A survey methodology report is on the SSC website (http://www.ssc.govt.nz/kiwis-count-technical-report-july-2015).

Questionnaire Content

The Kiwis Count survey is modular. The core of the survey is questions about 42 public services that New Zealanders use frequently. These core questions have been and will remain fixed for the next few years, with new questions added only as required to reflect actual changes in services.

The modular part of the questionnaire is designed to change as required to focus on service delivery priorities:

  • In the 2012 calendar year the survey included a module of questions on channel use and preferences. This repeated a module of questions which was included in the 2009 survey.
  • Starting in 2013 a module of questions about the ease of transacting with government in the digital environment replaced the previous module. The new module, developed with the team responsible for Result 10 of the Better Public Services programme, will be one of a suite of measures used to report on the progress of Result 10.
  • In further consultation with the Result 10 team, the 2013 questions were amended slightly at the beginning of 2014 to capture information about relative satisfaction depending on whether services were accessed via single or multi channel.
  • For the first half of 2014, a new module of questions was included. It was about parents'/primary caregivers' satisfaction with education services.
  • In the second half of 2014, at the request of the Result 10 team, new questions were added:
    • To the Government and the digital environment module, and
    • To the main body of the survey, at A10, about experiencing public services.
  • In 2015, new questions were added, at A13, about the privacy of personal information for the Office of the Chief Privacy Officer.

Continuous Surveying

Unlike the 2007 and 2009 Kiwis Count surveys that were point-in-time collections, Kiwis Count is now a continuous survey. At the start of each month, Nielsen sends out 432 survey invitations[14]. The change in approach increases the frequency of reporting from biennially to quarterly and provides a regular stream of performance information for Ministers, agencies and the public.

Encouraging Online Participation

In late 2011, SSC worked with Nielsen to redesign the survey processes to encourage online participation and reduce survey costs. There has been a significant shift towards online participation. Fifty six percent of respondents chose to complete the survey online in the six months to June 2015. Online completion rates have been consistent with this since 2012. This compares to online rates of 17% in 2009 and 8% in 2007.

Sample Size and Response Rate

The response rate between January 2015 and June 2015 was 47%. This is at the low end of response rates achieved since the continuous approach was adopted (response rates have ranged between 46% and 53%).

As Nielsen outlined in the June 2014 report, key drivers of response rates include sampling, questionnaire length and incentives provided to complete surveys.

The Kiwis Count team has found that response rates correlate closely with questionnaire length.

In the year to June 2015, 2,365 New Zealanders completed the survey, compared to 6,099 in the year to June 2014, 2,371 in the year to June 2013, and 1,121 in the six months to June 2012. This sums to a total of 11,956 New Zealanders completing the survey since the continuous approach began.

Service Quality Scores

The Kiwis Count survey asks New Zealanders to rate services or express opinions using a scale from 1 to 5. To enable comparisons between Kiwis Count and Citizens First to be made, we have adopted the Canadian approach of converting five point rating scales to service quality scores ranging from 0 to 100.

The overall Service Quality Score is calculated by rescaling the result from each respondent's five point scale (1,2,3,4,5) to a 101 point scale (0,25,50,75,100) then calculating an average of these scores from all the services used.

Example: the service quality question


The overall average uses all service experiences, so a respondent who has used ten services contributes ten observations to the overall score and a respondent who has used one service contributes one observation to the overall score.
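
As a minimal sketch of the calculation described above, the example below rescales hypothetical 1 to 5 ratings onto the 0-100 scale and averages every service experience, so respondents who used more services contribute more observations. The ratings are invented for illustration.

```python
# Minimal illustrative sketch of the Service Quality Score calculation described
# above. The respondent ratings below are hypothetical.

def rescale(rating):
    """Map a 1-5 rating onto the 0-100 service quality scale (0, 25, 50, 75, 100)."""
    return (rating - 1) * 25

# Each hypothetical respondent lists a 1-5 rating for every service they used.
respondents = [
    [4, 5, 3],        # used three services: contributes three observations
    [2],              # used one service: contributes one observation
    [5, 4, 4, 3, 5],  # used five services: contributes five observations
]

scores = [rescale(r) for person in respondents for r in person]
overall_sqs = sum(scores) / len(scores)
print(round(overall_sqs, 1))  # 72.2 for the hypothetical ratings above
```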

Appendix 2: Performance on the Drivers by Channel

Figure 10: General: Face to Face channel by driver (percentage of people who answered 4 or 5)


Figure 11: Telephone channel by driver (percentage of people who answered 4 or 5)


Figure 12: Looking for Information Online channel by driver (percentage of people who answered 4 or 5)


Figure 13: Transacting Online channel by driver (percentage of people who answered 4 or 5)



Comment from Research Partner

The Kiwis Count results show how hard government agencies are working to improve citizens' experiences, perceptions and trust in government services. Citizens around the world expect more transparent, accessible and responsive services from the public sector and those expectations are rising[4].

As this report shows, one size does not fit all; for example, what works for our youth is not always the best solution for our older New Zealanders.

Recently, Nielsen has been looking at key trends among New Zealanders and many of these will impact the way public services are accessed. Some of these key trends include:

  • More connected - In 2005 Nielsen's Consumer and Media Insights (CMI) Research found that 51% of all people aged over 10 and 65% of those aged over 50 had not used the internet in the last week for anything other than email. The picture today is quite different, with seven out of ten New Zealanders over the age of 10 actively using the internet and half using smart phones and tablets.[5]
  • Aging population - It is well known that we have an aging population. In the next 25 years, Statistics NZ projects the number of people aged 65+ will more than double. Also interesting is that the percentage of women increases as age increases; for example at the time of the 2013 Census up to 64% of those aged 85+ were women.[6]
  • Fewer landlines - Based on data from Nielsen's CMI research, one in three New Zealanders do not have a landline at home (with this increasing to two in three of those aged 20 and under).[7]
  • Ethnic diversity - In the 2013 Census, Statistics NZ found that, between 2001 and 2013, the size of Asian ethnic groups almost doubled, with over a quarter of people living in New Zealand identifying with an ethnicity other than European. In addition, a quarter of the usually resident population was born overseas, with Asia being the most common region of origin. Currently English is our most commonly spoken language, followed by Māori, Samoan and Hindi.[8]

The implications of these trends affect all public services. With less use of landlines and increased use of the internet (particularly on tablets), it is no longer possible to allocate channels to customer types (e.g. landlines for older people, online for youth). With the impact of the aging population considered alongside these trends, it is clear that the way New Zealanders access public services is constantly changing.

The design of online access to public services will need to be tailored to meet the needs of its audience. Some services may need to tailor their apps and website design to meet their older customers' needs. This will mean reviewing aspects such as navigation and font size to ensure they are intuitive and easy for older users. But one size won't fit all; other services with a younger customer base, such as StudyLink, will face quite different expectations from their users.

The Kiwis Count results show that satisfaction, trust and perceptions can differ based on the ethnicity of the respondent. Public services will need to take into account the changing ethnic mix of New Zealand when designing, redesigning or improving their service delivery, including the mix of languages that websites, forms and other sources of information are available in.

The challenge is similar for many governments around the world. As stated in McKinsey & Company's report: "Part of the problem is that, despite their best intentions, many governments continue to design and deliver services based on their own requirements and processes instead of the needs of the people they service".[9]

