Impact of differing methodologies

This chapter describes how the differing methodologies across the three data sources may affect findings between the census and the post-censal surveys, and between the 2001 HMLS and Te Kupenga.

Census and the post-censal surveys

Question wording

The wording of a question can have a great impact on the responses to that question (Groves et al., 2009). The census question differs markedly from that used in the post-censal surveys: it is more general in nature and more open to different interpretations. The response options also differ considerably, with the post-censal surveys’ options giving respondents much more information on which to base their answers.

The census language question likely does not capture those who said they could speak te reo ‘not very well’. This assumption is supported by the fact that 85 percent of those who said they could speak te reo Māori ‘not very well’ in Te Kupenga had said in the 2013 Census that they could not hold a conversation in te reo Māori. See Appendix 1 for more information on the consistency in answers between the census and the post-censal surveys. These differences in the questions mean that comparisons between rates for te reo Māori speakers from census and the post-censal surveys are not exact.

All these data sources are dependent on respondents’ interpretations of their own level of language proficiency. The survey delivery method, context, and the interviewers themselves may influence answers.

Survey delivery method

Te Kupenga and the 2001 HMLS involved face-to-face interviews, in either te reo Māori or English. The census involves people completing a questionnaire on their own, and contains a single question about language knowledge. The delivery method is likely to have some impact on te reo Māori statistics, although it would not solely explain differences. The impact is also not easy to quantify. Examples of the impact are:

  • social desirability bias in face-to-face interviews, where respondents may want to give what they think is an acceptable answer (Australian Bureau of Statistics [ABS], 2002; Groves et al., 2009).
  • less control over the response process in self-administered surveys; for example, no control over who is present while the survey form is being completed, and no opportunity to clarify a question (ABS, 2002).

Survey objectives

The 2001 HMLS and Te Kupenga had much more specific objectives than the census, which meant they were able to delve much deeper into te reo Māori. The census has just one question on language. The post-censal surveys have a range of questions about te reo Māori specifically, including proficiency, use of the language inside and outside the home, language acquisition, and other language skills.

In addition, Te Kupenga was a general Māori well-being survey, which means that information on te reo Māori sits alongside wider information on Māori well-being. This makes Te Kupenga a valuable data source for exploring how te reo Māori ability and use are associated with wider Māori culture.

2001 Survey on the Health of the Māori Language and Te Kupenga

Context – objectives and question order

The context in which a question is asked can affect a respondent’s answers (ABS, 2002). Two differences between the post-censal surveys that may change the context are the survey objectives and question order.

We introduced the 2001 HMLS as a Māori-language survey, and reminded respondents of that purpose throughout the interview to give them context for answering questions. In contrast, we introduced Te Kupenga as a Māori well-being survey, and questions about the Māori language were only introduced near the end.

Context effects may also occur when the preceding questions influence responses to subsequent questions. The questions that preceded the language questions in the post-censal surveys were quite different from each other. This difference may have affected how respondents answered the questions.

As with the effects of the delivery method, context effects are difficult to quantify.

Proficiency of interviewers

The level of te reo Māori proficiency among interviewers, and how we assigned those interviewers, differed greatly between the 2001 HMLS and Te Kupenga. In face-to-face interviews the interviewer and the respondent interact with each other, potentially allowing the interviewer to influence the respondent. A proficient speaker of te reo Māori may influence a respondent’s answers to questions on te reo Māori, more so than if a non-proficient (and non-Māori) speaker were conducting the interview.

The differences in interviewers’ proficiency would likely have had an impact on te reo Māori statistics, although this impact cannot be quantified.


Survey population

The survey population for Te Kupenga was people who identify with the Māori ethnic group or are of Māori descent. For the 2001 HMLS, the survey population was only those who identify with the Māori ethnic group. We changed the survey population for Te Kupenga to capture the widest possible group of Māori. All comparisons between Te Kupenga and the 2001 HMLS published so far have used the Māori ethnic group only.

This methodology difference is unlikely to have any major impact on differences in figures.

Possible impact on 2001 results

These differing methodologies appear to contribute to the misalignment between the 2001 HMLS and 2001 Census figures (see Appendix 1 for more information). As a result, data users should exercise caution when comparing the 2001 HMLS with Te Kupenga.
