Research project

Food and You 2: Wave 6 Technical report

This technical report is for Wave 6 and gives details about each wave of fieldwork.

Last updated: 24 July 2023

Survey background

The Food and You 2 Survey was commissioned by the Food Standards Agency (FSA) in September 2019. Fieldwork has been conducted on a biannual basis since July 2020.
Dates of previous waves of fieldwork are as follows:

  • Wave 1: July and October 2020
  • Wave 2: November 2020 and January 2021
  • Wave 3: April and June 2021
  • Wave 4: October 2021 and January 2022
  • Wave 5: April and July 2022

Further details about each wave of fieldwork can be found in the corresponding technical report.  

This technical report is for Wave 6, which was conducted between 12 October 2022 and 10 January 2023 among a cross-section of 5,991 adults (aged 16 years or over) living in households in England, Wales and Northern Ireland. Adults invited to take part were selected from the Royal Mail’s Postcode Address File (PAF) using a random probability sampling methodology. The survey was conducted using a push-to-web methodology: a quantitative data collection method in which participants are contacted using an offline means of contact and asked to complete an online survey. In this survey, participants were contacted by letter; those who had not completed the online survey after the initial reminder were subsequently sent a postal version. The survey explored participants’ food-related knowledge, behaviours and attitudes.

About the Food Standards Agency

The Food Standards Agency (FSA) is an independent Government department working to protect public health and consumers’ wider interests in relation to food in England, Wales, and Northern Ireland. The FSA’s overarching mission is “food you can trust”, which means the Agency strives towards a food system in which food is safe, food is what it says it is and food is healthier and more sustainable. As such, understanding consumers’ attitudes, knowledge and behaviour in relation to food is of vital importance to the FSA. 

Food and You 2 is the FSA’s principal source of methodologically robust and representative evidence regarding consumers’ attitudes, knowledge and behaviour in relation to food. This survey has an important role in measuring the FSA’s progress towards its strategic objectives, providing evidence to support its communication campaigns and other activities, and identifying topics for further research or action.

History of Food and You

Since its inception in 2000, the FSA has commissioned surveys to collect quantitative data on the public’s reported behaviour, attitudes and knowledge relating to food. Between 2000 and 2007 the FSA conducted an annual Consumer Attitudes Survey (CAS). In 2010, this was replaced by the more rigorous ‘Food and You’, a biennial survey conducted face-to-face. Food and You became the FSA’s flagship social survey. In addition, the FSA conducted regular tracking surveys including the biannual Public Attitudes Tracker and annual Food Hygiene Rating Scheme (FHRS) Consumer Attitudes Tracker. The FHRS is a scheme that helps consumers choose where to eat out or shop for food by giving clear information about the businesses’ hygiene standards. The scheme is run in partnership with local authorities in England, Wales and Northern Ireland. 

In 2018, the FSA’s Advisory Committee for Social Science (ACSS) recommended that Food and You and the Public Attitudes Tracker be replaced with a new ‘push-to-web’ survey. Food and You 2 was commissioned in 2019 with data collection for Wave 1 commencing in July 2020. 

Due to differences in the survey methodologies, comparisons cannot be made between Food and You 2 and either Food and You or the Public Attitudes Tracker. Wave 1 of Food and You 2 in 2020 therefore represented the start of a new data time series. Data are collected through Food and You 2 on a biannual basis.

Summary of the survey

Design

The research was conducted using a push-to-web methodology: households selected to take part received a letter inviting them to complete the Food and You 2 survey online. Up to two adults in each household could take part. Fieldwork was conducted from 12 October 2022 to 10 January 2023. All restrictions put in place due to the Covid-19 pandemic had been removed in England, Wales and Northern Ireland before fieldwork commenced. However, in the 12 months prior to fieldwork (the time period referenced in some questions) restrictions would have been in place in all countries to some extent, more so in Northern Ireland and Wales than in England, as described in the Technical Report for Wave 4. These restrictions may have affected some participants’ food-related behaviours and, in turn, how participants answered certain questions.

In this study, the fieldwork was structured around four mailings: 

  • Mailing 1: Initial invitation letter inviting up to two individuals per household to complete the Food and You 2 survey online
  • Mailing 2: Reminder letter
  • Mailing 3: Second reminder, which included up to two versions of a postal questionnaire
  • Mailing 4: Final reminder letter 

Mailings 2, 3 and 4 were sent only to households that had not completed the survey since the previous mailing, or where a known second eligible participant had not yet completed the questionnaire. The questionnaire asked for the number of adults in the household. If the sole respondent in a household stated that they were the only adult, no reminder letter was sent. If they stated that more than one adult was present, the household was sent a reminder unless both adults had completed the survey.
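The reminder rule described above can be expressed as a small decision function. The sketch below is illustrative only; the function and argument names are hypothetical, not taken from the survey’s actual systems.

```python
def needs_reminder(completions, stated_adults=None):
    """Illustrative reminder rule for a sampled household.

    completions   -- number of adults in the household who have completed
                     the survey so far
    stated_adults -- household size reported by the first respondent
                     (None if nobody has responded yet)
    """
    if completions == 0:
        return True  # no response at all: always send the next mailing
    if stated_adults == 1:
        return False  # the sole adult has already completed: no reminder
    # More than one eligible adult, but fewer than the maximum of two
    # completions allowed per household.
    return completions < 2

assert needs_reminder(0) is True        # non-responding household
assert needs_reminder(1, 1) is False    # single-adult household, complete
assert needs_reminder(1, 3) is True     # known second eligible adult outstanding
assert needs_reminder(2, 4) is False    # both invitations used
```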

Questionnaire

The survey included an online version of the questionnaire and two postal versions. In all versions there were slight differences between the questionnaires for England, Wales and Northern Ireland, reflecting the different regional government bodies and their roles and responsibilities. For participants in Wales, both the online and postal surveys were offered in Welsh and English.

The online questionnaire was structured as a series of modules covering key areas of interest to the FSA. Most questions were behavioural, asking participants to state their usual activities or to recall recent actions. A smaller number of questions were attitudinal, asking participants to state their opinions on various subjects, or knowledge-based, for example asking participants what they think the temperature inside their fridge should be. The questionnaire included demographic questions to allow the FSA to conduct subgroup analysis on the data. When analysing data from Food and You 2, it is important to note that behaviours are self-reported and therefore may not reflect actual observable behaviour. Measures were taken to minimise the impact of social desirability (for instance, stating that results are reported anonymously) and to increase accuracy (including time frames), but there is likely to be some difference in self-reported and actual observable measures.

Due to the length and complexity of the online questionnaire it was not possible to include all questions in the postal version of the questionnaire. The postal version of the questionnaire needed to be shorter and less complex to encourage a high response rate, so two versions were produced. Key modules (for example, About You, Hypersensitivities) were asked in both versions of the postal surveys, with the remaining content divided across the two versions depending on available space. Details of which modules were included in each postal version are outlined below (unless stated otherwise a module was included in both versions):

  • Introductory Questions
  • Food Hypersensitivities
  • Eating at Home (EH version only)
  • Eating Out (EO version only)
  • Food Concerns 
  • Food You Can Trust 
  • Household Food Security
  • About You and Your Household

Whilst steps were taken to make the online and postal questionnaires as comparable as possible, there were minor differences in the order questions were asked, question wording and the way routing was applied. The online and postal versions of the survey can be found in appendices linked to this report.

Further information on the questions asked in each module and questionnaire development can be found in the ‘Questionnaire development and cognitive testing’ section.

Sampling

A random sample of addresses was drawn from the Royal Mail’s Postcode Address File (PAF), a database of all known addresses in the UK. The sample was drawn from the address list for England, Wales and Northern Ireland. The size of the sample from each region aimed to provide an estimated minimum of 1,000 responses in each of Wales and Northern Ireland, and 2,000 from England. Wales and Northern Ireland were therefore over-represented in the sample. The samples were drawn in this way to enable effective subgroup analysis on the data.

The sample was further stratified by local authority to ensure even geographical spread across the three countries. The local authority classifications used were those that existed when the sample was first selected in July 2020 for Wave 1 and have not been updated to show current boundaries. Within each local authority the sample was stratified by degree of deprivation to ensure a broadly representative sample in terms of income level. As in prior waves, a reserve sample was also drawn at the same time as the main sample. More details on this can be found in the ‘Sampling’ section. 

In each selected household, up to two adults (aged 16 years or over) were invited to participate in the survey. In the interests of maximising the response rate, no selection criteria (other than being aged 16 years or over) were imposed regarding the selection of individuals within each household. 

The sampling strategy for this survey is described in greater detail in the ‘Sampling’ section. 

Weighting

Weighting is a process by which survey estimates are adjusted both to compensate for unequal selection probabilities and to reduce demographic discrepancies between the sample who completed a survey and the desired survey population - in this instance, the populations of England, Wales and Northern Ireland. 

Following data collection, two kinds of weight were applied to the data. First, selection weights were calculated to equalise selection probabilities for individuals across all sampled households. Second, these weights were adjusted to ensure achieved sample estimates aligned with ONS country population totals for selected variables. Following this, additional weights were created for use in combined-country analyses by scaling the country sample sizes to be proportional to their corresponding country population values.
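As an illustration of the selection-weighting step, a design weight can be computed as the inverse of an individual’s selection probability, given that up to two adults per household could respond. The sketch below is a minimal illustration; the function name and the address probability are hypothetical, and the actual survey additionally calibrated these weights to ONS population totals.

```python
def selection_weight(n_adults, address_prob):
    """Design weight: inverse of an individual's selection probability.

    An adult's chance of selection is the address's selection probability
    multiplied by their chance of being one of the (at most two) adults in
    the household who may respond: min(2, n_adults) / n_adults.
    """
    person_prob = address_prob * min(2, n_adults) / n_adults
    return 1.0 / person_prob

# With an equal address probability (0.25 is purely illustrative), an adult
# in a 4-adult household is half as likely to be selected as one in a
# 1- or 2-adult household, so receives double the weight.
assert selection_weight(1, 0.25) == 4.0
assert selection_weight(2, 0.25) == 4.0
assert selection_weight(4, 0.25) == 8.0
```

In practice these design weights were then adjusted so that the weighted sample matched ONS country population totals, and rescaled for combined-country analyses.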

Finally, a ‘Wales & Welsh-England’ weight was calculated to permit comparisons to be made between England (excluding London) and Wales after controlling for differences in age, gender, ethnic group, household size, and urban-rural mix.

The weighting process is described in greater detail in the ‘Weighting’ section.

Questionnaire design

Food and You 2 uses a sequential mixed-mode approach involving an initial online stage, with non-respondents then followed up using a postal questionnaire. The questionnaire therefore had to be designed so that it could be presented both online and on paper. As with many other push-to-web surveys, the online version of the questionnaire was too long and complex to translate into an equivalent self-completion questionnaire suitable for postal administration. This meant there were some differences between the online and postal questionnaires. To help address this limitation, two versions of the postal questionnaire were developed, enabling more questions to be asked across the sample as a whole. However, even with two versions of the postal questionnaire, there was insufficient space to include some of the online questions.

Given the wide range of topic areas that the FSA and external stakeholders were interested in investigating, questionnaire length was considered throughout the development period. Ipsos recommended that, in the interest of reducing drop-out rates, the online questionnaire should take no longer than 30 minutes for the average participant to complete and the postal questionnaires should be no more than 20 pages in length. These limits were recommended to minimise the risk of participants not completing the survey, and to reduce the risk of straight-lining (that is, repeatedly selecting the same answer) when going through the survey.

A modular approach was required for Food and You 2 to keep the length of the survey to a maximum of 30 minutes, and to minimise the likelihood of participants starting but not completing the survey. It also maximised coverage of topics and allowed new modules or questions to be added on emerging topic areas. When developing the Food and You 2 Wave 1 questionnaire, the topic areas the FSA was interested in were grouped into broad ‘modules’ (such as food shopping, food concerns or food you can trust). These modules were then assessed for frequency of fielding (every 6, 12 or 24 months). For instance, attitudinal questions that are used to measure the FSA’s performance (for example, trust in the FSA) or where fluctuations over time are more likely (for example, concerns about food) were considered ‘core’ and therefore collected every 6 months, whereas behavioural questions (for example, on food practices in the home) that had been relatively stable over time in previous studies were deemed appropriate for less frequent fielding. The modules selected for inclusion in the Wave 6 questionnaire reflected this approach.

Questionnaire development draws upon the work done for previous waves. The development for Wave 1 involved questionnaire development workshops, cognitive testing, usability testing and a pilot (covered in more detail in the Wave 1 Technical Report). The questionnaire development for Wave 2 onwards was shorter as core questions and materials had been developed in Wave 1. A phase of cognitive testing was held to test newly developed questions. Cognitive testing for Wave 6 covered the hypersensitivity questions, additional codes to existing questions and new questions around reactions to foods and use of food clubs.

Design of questions

The content and nature of the questions was informed by previous research conducted by the FSA, the FSA and stakeholders’ research priorities, and by Ipsos’ prior experience in survey research.

Much of the content for the questionnaires had already been completed during Waves 1-5 questionnaire development periods. To determine content for the Wave 6 questionnaire, meetings were held between Ipsos, the FSA and key stakeholders to discuss research priorities and to decide which questions from the online questionnaire should be included in the postal questionnaires. No new modules were developed for Wave 6 though additional questions were added to existing modules.   

To enable comparability of the data between waves, questions carried over from earlier waves were largely kept consistent in wording and format. The main exception was the question collecting respondent age which was changed in Wave 5. This was still asked as an open question but for those preferring not to give an answer they were asked to choose their age from pre-defined age band categories instead. More details are given after Table 6 ‘Demographic profile of survey responders – Age’. 

Cognitive testing

In social and market research, cognitive testing refers to a form of qualitative data collection in which participants are asked by an interviewer to examine a set of materials and explain their understanding of them. In questionnaire development, cognitive testing interviews are used to evaluate how participants approach a questionnaire so that any issues regarding participant comprehension may be highlighted. 

Following the completion of the first questionnaire draft, a series of cognitive testing interviews was arranged to test a sub-set of questions from the questionnaire, specifically those new or modified for Wave 6. Nineteen interviews were conducted with members of the public, including five in the Welsh language. During recruitment, participants were screened on age, gender, ethnicity, geographical region, employment status, income, and whether or not they had a food allergy, intolerance or Coeliac disease. This ensured that people with relevant food behaviours and habits were interviewed, which was important for assessing the questions.

Key aims of the cognitive testing included: 

  • to gauge the simplicity of questions and participant comprehension of key terms;
  • to note any ambiguity in the interpretation of the questions; and
  • to identify any questions that may not produce meaningful data.

The Welsh language interviews also aimed to evaluate the accuracy and clarity of the translations.

Each cognitive interview was undertaken with a single participant, lasted approximately one hour, and was conducted by a moderator using online video conferencing software. During each interview, the moderator recorded the participant’s answers and noted further observations regarding how the participant interpreted the questionnaire, with attention paid to any problems encountered. The English language interviews were conducted by moderators from Ipsos, while the Welsh language interviews were conducted by a trusted external qualitative researcher. Some of the interviews were conducted in the (virtual) presence of an observer from the FSA.

Following completion of the interviews, Ipsos submitted a written report to the FSA detailing the findings. An extended meeting was subsequently held to discuss the findings and agree on further edits to the questionnaire.

Survey mailings

The survey was conducted in England, Wales and Northern Ireland using a push-to-web methodology. As noted, push-to-web is a quantitative data collection method in which offline contact modes are used to encourage sample members to go online and complete an online questionnaire. 

The push-to-web methodology used in this survey mirrored a tried-and-tested approach used by Ipsos in previous studies: a sequential mixed-mode design in which participants are first asked to complete an online survey, with non-respondents then followed up with a postal questionnaire at the third mailing. The rationale behind this methodology is that it retains the benefits of online survey completion while avoiding the exclusion of those who do not have access to the internet and/or have low levels of digital literacy.

In this study, the methodology consisted of a series of four mailings sent to selected households. The second, third and fourth mailings were only sent to households who had not responded to the survey since the previous mailing. Further details regarding the sampling approach are provided in the ‘Sampling’ section. The schedule of mailings is outlined below:

  • mailing 1: Initial invitation letter
  • mailing 2: First reminder letter
  • mailing 3: Postal questionnaire and second reminder letter
  • mailing 4: Final reminder

The first mailing invited recipients to complete the survey online. The letter invited two adults from each household to participate. Each participant was provided with a unique login code allowing them to complete the questionnaire on the survey website. Those who did not complete the survey following receipt of the initial invitation letter were sent a reminder letter a few weeks following the mailout of the invitation.

The second mailing took the form of a reminder letter, again inviting participants to complete the online survey. In the third mailing, copies of the postal version of the questionnaire were sent alongside a letter instructing recipients how to complete and send back the postal questionnaire. Lastly, a final reminder letter was sent. Each mailing was separated by an interval of a few weeks. 

Postal questionnaire design and modular approach

The postal questionnaires consisted of a selection of questions from the online survey. The full questionnaire was not included in the postal versions due to concerns regarding questionnaire length. 

Questions were selected for inclusion in the postal questionnaires based on a number of factors. For instance, questions that were a key strategic measure for the FSA (for example, trust in the FSA) were included to provide the FSA with robust data. Questions were also included to maximise the base sizes for specific groups of interest (for example, participants with food allergies). Finally, questions were included where the mode of delivery and sample profile may have affected the data collected (for example, questions on food security). It was also important to include the majority of the demographic questions in the postal survey to enable subgroup analysis.

As with the online questionnaire, there were minor differences between England, Wales and Northern Ireland in the wording of a small number of questions. Participants in Wales were sent copies of the questionnaires in English and in Welsh. 

As noted, the survey was conducted using a modular approach. Certain ‘core’ modules were included in each biannual survey wave, while others were rotated every 12 or 24 months. The content of the survey for this wave is detailed in the section below. 

Overview of survey content

Introductory questions (core module)

In the online survey, this module began with a question asking for confirmation of age (as those under 16 years were not eligible to participate). This was followed by a small number of questions asking participants for some basic information about themselves and their household, such as their gender identity, and the number and age of any other household members. The module also asked participants whether they had a food allergy, food intolerance or Coeliac disease so that the questionnaire could be tailored to individuals. This module was included in Wave 1 and is kept unchanged between waves to enable comparability of subgroup trend data. 

Food hypersensitivities (rotating module repeated from Wave 4)

This module began with a question asking participants whether there were any foods which caused them unpleasant physical reactions or which they avoided because of unpleasant physical reactions which the foods might cause. Those participants who answered ‘yes’ to this question were then asked a series of questions regarding the nature of the hypersensitivity and how they found out that they had the hypersensitivity. They were then asked details regarding any recent experiences of consuming the foods in question. Certain questions in subsequent modules were routed to those who stated in the Food Hypersensitivities module that they had a hypersensitivity.

Eating at Home (core questions, repeated from Wave 4)

This module included a sub-set of questions asked in the ‘full’ Eating at Home module included in Wave 1 and Wave 5. It was intended to gauge participant knowledge of and adherence to the key FSA food safety and hygiene guidelines. Participants were asked about the ways in which they store, prepare, and consume food in the home. 

Eating out 

In this module, participants were asked how often they eat out or buy food to take away and the factors they consider when choosing where to eat. Participants were also asked about their awareness and use of the Food Hygiene Rating Scheme (FHRS).

Food Concerns (Core module)

In this core module, participants were asked whether they had any concerns about the food they eat, followed by an open (unprompted) question asking them to give details of these concerns. This was followed by questions listing specific food concerns, prompting participants on the concerns they may have.

Food You Can Trust (Core module)

This core module gauged participant confidence in the food supply chain (including in farmers, food manufacturers, and shops) and asked participants questions relating to the FSA, and trust in its ability to fulfil its key responsibilities. 

Household Food Security (Core module)

This module incorporated the USDA 10-item US Adult Food Security Module, a standardised measure that uses indicator questions to assess the different levels of food security experienced by participants and their households. It asked a series of questions regarding participants’ ability to afford food over the previous 12 months. It also asked about changes participants had made to their eating habits in the last 12 months, and the reasons for these changes (for example, financial reasons, health reasons). The USDA has published the most up-to-date guidance, including how to calculate food security scores; for more detailed information please visit the guidebook. A new question was added at Wave 6 about the use of food clubs and social supermarkets.

Due to the sensitive nature of the topic area, all questions in this section were optional and included a ‘Prefer not to say’ option, in addition to ‘Don’t know’ or ‘Not stated’ options. Any questions that had any of these three responses, or that were left blank, were treated as ‘missing’, with no data imputed.  In total 230 respondents had missing responses to the first three questions and so their overall food security status was set to missing (at Waves 1, 2, 3, 4 and 5 there were 313, 187, 191, 212 and 272 such respondents respectively).
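For illustration, the classification of the 10-item adult raw score (the count of affirmative responses) and the missing-status rule described above might be sketched as follows. The thresholds (0 = high, 1–2 = marginal, 3–5 = low, 6–10 = very low food security) follow the published USDA guidance for the adult scale; the function itself, and its exact handling of missing answers, are illustrative rather than the survey’s actual scoring code.

```python
def food_security_status(responses):
    """Classify the USDA 10-item adult food security scale.

    responses -- list of 10 answers: True (affirmative), False (negative),
                 or None ('Don't know', 'Prefer not to say', 'Not stated'
                 or blank, all treated as missing).
    """
    # Rule described above: overall status is set to missing when the
    # responses to the first three questions are missing (illustrative
    # reading of that rule).
    if all(r is None for r in responses[:3]):
        return None
    raw = sum(1 for r in responses if r)  # raw score = count of affirmatives
    # Thresholds from the USDA guidance for the 10-item adult scale.
    if raw == 0:
        return "high"
    elif raw <= 2:
        return "marginal"
    elif raw <= 5:
        return "low"
    return "very low"

assert food_security_status([False] * 10) == "high"
assert food_security_status([True, True] + [False] * 8) == "marginal"
assert food_security_status([True] * 4 + [False] * 6) == "low"
assert food_security_status([True] * 7 + [False] * 3) == "very low"
assert food_security_status([None] * 3 + [False] * 7) is None
```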

About You and Your Household (Core module) 

This final module asked participants various questions about their personal circumstances and those of their household, including age, marital status and working status. The inclusion of these questions was primarily intended to enable demographic subgroup analysis of the data.  
 

Sample design

The sample for Food and You 2 was selected from the Postcode Address File (PAF) in England, Wales and Northern Ireland. The sample of addresses was unclustered within each country. Households were sampled to achieve responses from 1,000 households in each of Wales and Northern Ireland and 2,000 households in England (4,000 overall; Table 1). In other words, a greater proportion of households was sampled in Wales and Northern Ireland than in England. This was done to improve the precision of estimates for Wales and Northern Ireland.

The size of the issued sample in each country was calculated by dividing the target sample by estimated address yield (proportion of addresses with at least one productive response). Yield estimates were based on actual yields obtained in previous waves. An additional reserve sample was drawn to be issued (in whole or in part) if response rates were lower than anticipated. 

Table 1: Sample sizes and assumptions for each country

Country            Main sample    Assumed address completion rate    Total sampled
England            6,849          29%                                10,273
Wales              3,542          28%                                5,313
Northern Ireland   4,109          24%                                6,164
Total              14,500         28%                                21,750
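As a rough consistency check, the figures in Table 1 line up with this calculation: the main sample multiplied by the assumed completion rate approximates the target number of responses, and the total sampled is the main sample plus a reserve of roughly half that size (two of every three selections went to the main sample, as described later in this section). A quick sketch, with figures copied from Table 1:

```python
# Consistency check of Table 1: country -> (main sample, assumed address
# completion rate, total sampled including reserve).
table = {
    "England":          (6_849, 0.29, 10_273),
    "Wales":            (3_542, 0.28, 5_313),
    "Northern Ireland": (4_109, 0.24, 6_164),
}

for country, (main, rate, total) in table.items():
    # Expected responses: roughly 1,986 for England, 992 for Wales and 986
    # for Northern Ireland, matching the ~2,000 / 1,000 / 1,000 targets.
    expected_responses = main * rate
    # Reserve is half the main sample, so total sampled is about 1.5 x main.
    assert abs(total - 1.5 * main) < 1
    print(f"{country}: ~{expected_responses:.0f} expected responses")
```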

The samples of main and reserve addresses were stratified proportionately by region (with Wales and Northern Ireland being treated as separate regions), and within region (or country) by local authority (district in Northern Ireland) to ensure that the issued sample was spread proportionately across the local authorities. National deprivation scores were used as the final level of stratification within the local authorities (in England the Index of Multiple Deprivation (IMD), in Wales the Welsh Index of Multiple Deprivation (WIMD) and in Northern Ireland the Northern Ireland Multiple Deprivation Measure (NIMDM)).

In practice, stratification was achieved by ordering the population of PAF addresses by (i) region (country), (ii) local authority (district) within region, and (iii) national deprivation score of Lower Layer Super Output Areas (LSOA) (SOA in Northern Ireland) within local authority (district), and then selecting addresses by the method of random start and fixed interval. The sampling steps were:

  1. From the PAF, exclude all business addresses and any private addresses selected in previous waves of the Food and You 2 survey
  2. Order the address list by region (for England only)
  3. Within each English region, Wales and Northern Ireland, order addresses by local authority (district in Northern Ireland)
  4. Within local authority / district, order addresses by IMD of LSOA in England, WIMD of LSOA in Wales, and NIMDM of Super Output Areas (SOA) in Northern Ireland
  5. Select numbers of addresses shown in Table 1 by method of random start and fixed interval from these ordered lists
  6. Divide stratum-ordered selections into successive groups of 3 selections
  7. Within each group of three, randomly allocate two cases to the main sample, and one case to the reserve sample.
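Steps 5–7 amount to systematic sampling from a stratum-ordered list, followed by a two-to-one split into main and reserve samples. The sketch below is a minimal illustration using synthetic addresses, not the production sampling code:

```python
import random

def systematic_sample(ordered_addresses, n_select, seed=1):
    """Random-start, fixed-interval selection from a stratum-ordered list."""
    rng = random.Random(seed)
    interval = len(ordered_addresses) / n_select
    start = rng.uniform(0, interval)
    return [ordered_addresses[int(start + i * interval)] for i in range(n_select)]

def split_main_reserve(selections, seed=2):
    """Within each successive group of three selections, randomly allocate
    two cases to the main sample and one to the reserve sample."""
    rng = random.Random(seed)
    main, reserve = [], []
    for i in range(0, len(selections), 3):
        group = selections[i:i + 3]
        rng.shuffle(group)
        main.extend(group[:2])
        reserve.extend(group[2:])
    return main, reserve

# Illustrative run on a synthetic, already stratum-ordered address list.
addresses = [f"address_{i}" for i in range(9_000)]
selected = systematic_sample(addresses, 900)     # 1-in-10 systematic sample
main, reserve = split_main_reserve(selected)
assert len(main) == 600 and len(reserve) == 300  # two-thirds / one-third split
```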

Household sample design

As stated above, addresses were selected from the Postcode Address File (PAF) systematically using the random start and fixed interval method. At each address, up to two adults were invited to take part in the survey. Two unique login codes for the online survey were provided in the initial invitation letter and up to two were provided in each reminder mailing. Up to two postal questionnaires were provided in the postal questionnaire mailing (Mailing 3). In the reminders, two logins / questionnaires were sent to completely non-responding addresses. At any address where one adult had already completed the questionnaire only one login code and one postal questionnaire were sent. Each adult who completed the questionnaire received a £10 online or paper voucher.

Process for selecting adults within a household

There are many approaches that could have been used to select adults within households. For instance, the two adults with the most recent birthdays, or the two adults with the next birthdays, could be selected. These are commonly referred to as quasi-random approaches, as they are roughly equivalent to a fully random approach. While this would have randomised the selection process to a degree in households with more than two adults, in self-administered surveys it adds another barrier to completing the survey and has been shown to be carried out incorrectly in about 20% to 25% of cases. Further details are available in TNS BMRB’s 2013 report of web experiments prepared for the Cabinet Office on the Community Life Survey, and in a 2014 Field Methods article by Kristen Olson and Jolene D. Smyth on the accuracy of within-household selection in general population web and mail surveys (26(1), pp. 56–69).

With this in mind, it was decided to allow any two eligible adults (aged 16 years or over) to participate in the survey. Given the household size distribution in the UK, it was estimated that 93% of the sample selected in this way would also have been selected had we managed to successfully implement a random selection method.

This approach was consistent with that taken for the previous waves of the Food and You 2 survey.

Letters and reminders

Letters and reminder strategy

The mailing approach followed Ipsos’ standard push-to-web methodology:

  1. An initial invitation letter was issued to sampled addresses inviting up to two adults to go online and complete the online questionnaire. This letter was mailed on the 10th October 2022.
  2. The first reminder letter was issued on 21st October 2022 and began to arrive at sampled addresses on 1st – 4th November 2022. Reminder letters were only sent to non-responding addresses and addresses where one adult had completed the online questionnaire but not a second adult (the presence of an eligible second adult was determined in the first questionnaire). 
  3. The second reminder letter was issued on 11th November 2022 and began to arrive at sampled addresses on 16th November, with the first postal returns logged on 27th November. The second reminder was only sent to non-responding addresses and addresses where one adult had completed the online questionnaire but not a second adult. These letters were accompanied by one or two postal questionnaires, to allow those who could not access the internet, and those who may have been less comfortable completing online questionnaires, to take part. Those in Wales received one questionnaire in English and one in Welsh. Further detail is provided in the section on the postal questionnaire. 
  4. A final reminder letter was issued on 7th December 2022, arriving at sampled addresses on 10th December 2022. The survey remained open until 10th January 2023.

There were numerous Royal Mail postal strikes during Wave 6 fieldwork, which had an impact on how quickly letters arrived.

Letter design

The principles for designing the invitation and reminder letters, which were kept substantially the same as those used for previous waves, were primarily based on the Tailored Design Method, described in Dillman, D.A., Smyth, J.D. and Christian, L.M. (2014) Internet, Phone, Mail and Mixed-Mode Surveys: The Tailored Design Method. A range of other literature and best practice from previous studies (mainly the Active Lives Survey and Labour Force Survey) also informed the design. The main aim of the letters was to provide all the relevant information a participant requires to complete the survey, and to answer immediate questions which they may have. 

Our guiding principles for designing the letters were:

  • use simple and easy-to-understand language, with no unnecessarily complicated text
  • cover the key messages that needed to be conveyed in the letters, including:

(a)    Importance
(b)    Motivators for taking part
(c)    How to take part
(d)    Your personal data are safe

a)    Importance was conveyed in all four letters in the following ways:

  • FSA and Defra logos were prominent
  • visual clutter which could distract from the logos and the importance of the survey was avoided
  • professional letter format with address of recipient and full date
  • signed by someone with authority (signified by their job title and organisation details) 
  • highlighted key messages in the text; using these to break up the text made it easier to read

b)    The main motivational statements varied across the four letters, with the aim of increasing the likelihood of converting non-respondents:

  • 1st letter: It’s easy to take part and why take part
  • 2nd letter: Taking part will benefit you and your community
  • 3rd letter: We want to hear from as many people as possible
  • 4th letter: This is the last chance to have your say
In addition, all letters placed a degree of emphasis on the financial motivator for taking part: receiving a £10 gift voucher. 

In addition to this the letters also provided key information about Ipsos and the Food Standards Agency and contact details for Ipsos should the participant have any queries about the survey. 

Online questionnaire

The Food and You 2 Wave 6 survey was hosted using Ipsos’ global Dimensions platform within Rackspace, a managed hosting facility and Europe’s most successful managed hosting company. The security features offered by Rackspace and Ipsos are listed below:

At Rackspace:

  • Rackspace has SAS 70 Type II and Safe Harbor certifications and operates management systems compliant with international standards (ISO 9001, ISO 27001);
  • the servers and network infrastructure are physically located in England;
  • the servers and network components are fully redundant;
  • Rackspace guarantees recovery of hardware failures within one hour.

At Ipsos:

All applications and data for Dimensions were managed by Ipsos. Access to Dimensions’ questionnaires and data was password protected, and only a small number of online survey experts had access. Survey data and any participants’ personal information were stored in separate databases at Ipsos. 

Survey URL

As in all previous waves, we used a dedicated URL for the Food and You 2 Wave 6 survey that specifically included ‘food and you’. When deciding on the URL, we wanted an address that was short enough for participants to remember, would not easily be mis-typed, and gave some indication of the survey content.

Online questionnaire accessibility

The online questionnaire was made to be as accessible as possible to participants. Key to this was offering those in Wales the opportunity to complete the survey in Welsh (in line with the Welsh Language Act 1993). Participants could request to complete the survey in another language by calling the Food and You 2 survey helpline, or by asking someone to complete it on their behalf.

The Food and You 2 survey was designed to be accessed using a range of devices, including desktop computers, laptops, tablets and smart phones. The survey was designed with a ‘mobile first’ approach to minimise drop offs and improve response rates. A ‘mobile first’ approach means that the online questionnaire was designed with smart phone users in mind initially, as this is increasingly how participants choose to access online questionnaires. Additionally, the online questionnaire was designed in a way that made it easy for people to adjust colour contrasts and increase font size. 

Topline data checks

Once the online survey was in field and a sufficient number of responses had been received, topline data were reviewed to ensure no errors were present in the filtering. Whilst extensive testing was carried out prior to fieldwork launch, unusual combinations of answers, particularly those involving non-valid answers, occasionally reveal minor errors. Reviewing the data early in fieldwork allows such errors to be identified and corrected. 

For Wave 6, topline checks were carried out after eight days of fieldwork when 864 responses had been received to the online survey. 

Results of the topline checks and refinements to the data checking syntax revealed an inconsistency for HSVOUCH (receipt of Healthy Start vouchers) in Waves 1-4 and Wave 6 (HSVOUCH was not asked in Wave 5). HSVOUCH should be asked of participants who were pregnant or had a child aged 0-4 in the household. Checks showed the online script was treating those who answered ‘Prefer not to say’ at CHILDAGE as having at least one child aged 0 and routing them to HSVOUCH. This meant more participants were being asked HSVOUCH than required.

Numbers affected:

  • W1 – 55
  • W2 – 33
  • W3 – 51
  • W4 – 42
  • W5 – n/a
  • W6 – 17

However, waves 1 and 2 had been corrected in the data and tables before publication, and therefore, corrections were only needed for waves 3 and 4.

Changing the routing during fieldwork would introduce inconsistencies within the Wave 6 data. Therefore, this was addressed by:

(a)    Applying a back edit to HSVOUCH at the data processing stage of Wave 6 data
(b)    Changing the routing of this question if it is used in future waves
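
The back edit described in (a) can be sketched in pandas as follows. This is a minimal illustration only: the column names, flag layout and the -1 code are our own assumptions, not the survey’s actual variable scheme.

```python
import pandas as pd

# Hypothetical extract: CHILDAGE_PNTS flags 'Prefer not to say' at CHILDAGE;
# PREGNANT flags pregnancy in the household; HSVOUCH holds the (over-asked)
# Healthy Start vouchers answer.
df = pd.DataFrame({
    "CHILDAGE_PNTS": [0, 1, 0, 1],
    "PREGNANT":      [0, 0, 1, 0],
    "HSVOUCH":       [1, 2, 1, 2],
})

NOT_APPLICABLE = -1  # hypothetical 'question not applicable' code

# Respondents routed to HSVOUCH only because 'Prefer not to say' at CHILDAGE
# was treated as a child aged 0 should not have been asked the question.
wrongly_routed = (df["CHILDAGE_PNTS"] == 1) & (df["PREGNANT"] == 0)
df.loc[wrongly_routed, "HSVOUCH"] = NOT_APPLICABLE
```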

Break-offs and questionnaire length

A total of 324 participants (7.8% of those who started the online survey) did not complete it, and are therefore not in the final dataset. A low number of break-offs was observed across all questions asked in the online survey for Wave 6; that is, no question in particular caused people to leave the survey. 

The median completion time of those who completed the online survey was 25 minutes and 46 seconds.

Postal questionnaire

At the second reminder (Mailing 3) non-responding households were sent postal questionnaires. Households in England and Northern Ireland where one adult had completed the questionnaire and in which a second adult had been identified were sent one postal questionnaire, otherwise non-responding households were sent two postal questionnaires in these countries. All non-responding households in Wales were sent two postal questionnaires – one in English and one in Welsh. 

Each household that received two postal questionnaires received one Eating at Home and one Eating Out postal questionnaire. Households that were sent one postal questionnaire received only one of these versions.

In total 2,171 participants completed the postal questionnaire with 1,073 completing the Eating at Home postal questionnaire and 1,098 completing the Eating Out questionnaire. There were 20 participants in Wales who completed the Welsh language postal questionnaire in total, 14 completed the ‘Eating at Home’ version and six completed the ‘Eating Out’ version. The number of returns of the postal questionnaire for each country is detailed in Table 2. The highest number of postal returns were received from participants in England (1,177 returns), followed by 607 in Northern Ireland and 387 in Wales. 

Table 2: The number of postal questionnaires returned, by questionnaire version, for each country
 

Postal questionnaire Number returned
Eating at Home (England) 612
Eating Out (England) 565
Eating at Home (Northern Ireland) 290
Eating Out (Northern Ireland) 317
Eating at Home Wales (English) 157
Eating at Home Wales (Welsh) 14
Eating Out Wales (English) 210
Eating Out Wales (Welsh) 6
Total 2,171

Storage of scanned images and survey results

All scanned images and survey data were stored on a secure server, which is isolated from the Ipsos network and has restricted access controls. Our secure file servers are housed in server rooms/data centres with appropriate physical access controls and monitoring procedures. The network is protected by appropriate use of firewalls, DMZ and intrusion detection systems. Public facing servers are also appropriately protected and are based on a secure (minimum) two tier or, our general standard, three-tier architecture. We used AES256 as a minimum standard for encryption.

Vouchers for participants

Participants were offered a £10 gift voucher as a thank you for taking part in the survey. 
Participants who completed the survey online who wished to receive a voucher entered their email address at the end of the survey. They were then emailed a Love2shop e-voucher of the nominal amount which they could redeem online at the Love2Shop website. 

Those who completed the postal questionnaire were given the choice of receiving a Love2shop e-voucher or a paper Love2shop voucher by post, either of which could be redeemed at a wide range of high street stores. Participants were asked to give their name so that the voucher could be addressed to the correct person, but a voucher would be sent to the address even without a name.

Handling queries

The survey website provided information about the survey and included a list of frequently asked questions (FAQs) which had been developed based on similar studies.

Additionally, a dedicated freephone telephone helpline and email address were set up, allowing participants to contact Ipsos with any queries about the survey. Telephone queries were first recorded by an answering machine, and a member of the research team returned the call once an appropriate solution had been identified. Emails sent to the Food and You 2 survey inbox first received an automatic response covering the most commonly asked questions and answers. Each query was then followed up individually within five working days. 

There were 288 queries, the majority of which were regarding when participants would receive their voucher or to opt out of the survey. Other queries included participants requesting a postal questionnaire or experiencing difficulties accessing the online survey. 

Response rates

The overall response rate for Food and You 2 Wave 6 was 28.8% with 1.43 adults participating per household on average. Of the surveys completed, 63.8% were online and 36.2% were postal questionnaires. Response rates varied by region. Table 3 shows the variation in response rate by region and country. 

Table 3: Overall individual level response rates by region and country

Region/Country Issued addresses Number of returns overall Proportion of returns that were online (%)
East Midlands 597 272 58.8%
East of England 768 362 58.6%
London 971 336 64.9%
North East 351 157 59.2%
North West 933 376 64.9%
South East 1,109 519 63.0%
South West 721 373 55.5%
West Midlands 710 305 62.3%
Yorkshire and the Humber 689 332 61.4%
England 6,849 3,032 61.2%
Wales 3,542 1,315 70.6%
Northern Ireland 4,109 1,644 63.1%
Total 14,500 5,991 63.8%

Table 4: Household level response rates by region and country

Region/Country Number of responding households Household response rate (%) Mean number of adults participating per household
East Midlands 182 30.5% 1.49
East of England 252 32.8% 1.44
London 239 24.6% 1.41
North East 103 29.3% 1.52
North West 255 27.3% 1.47
South East 358 32.3% 1.45
South West 251 34.8% 1.49
West Midlands 205 28.9% 1.49
Yorkshire and the Humber 227 32.9% 1.46
England 2,072 30.3% 1.46
Wales 1,015 28.7% 1.30
Northern Ireland 1,088 26.5% 1.51
Total 4,217 28.8% 1.43

Profile of achieved sample

The tables below show the profile of those who completed the survey online and those who completed the postal questionnaires.

Table 5: Demographic profile of survey respondents - Gender

Demographic Percentage of online participants Percentage of postal participants Percentage of total participants
Male 42.3% 41.0% 41.8%
Female 56.3% 58.0% 56.9%
In another way 0.5% *% 0.3%
Prefer not to say 1.0% 0.9% 1.0%

*indicates a value higher than 0 but lower than 0.05.

Note: not stated cases are excluded from the table above. 

Table 6: Demographic profile of survey respondents - Age

This table shows that those aged 54 or younger were more likely to complete the online questionnaire than the postal questionnaire, with the opposite true for those aged 65 and over.

Demographic Percentage of online participants Percentage of postal participants Percentage of total participants
16 to 24 5.1% 2.2% 4.1%
25 to 34 14.9% 5.7% 11.6%
35 to 44 17.9% 10.3% 15.2%
45 to 54 18.9% 11.6% 16.2%
55 to 64 19.5% 20.7% 19.9%
65 to 79 20.7% 36.2% 26.3%
80+ 2.8% 11.7% 6.0%
Prefer not to say 0.1% 1.7% 0.7%

The 65-74 and 75+ age groups were changed to 65-79 and 80+ in the questionnaire for Wave 5, when the age question was amended to collect age or age band only rather than a specific date of birth; this was continued for Wave 6. These age bands were chosen because they aligned with the weighting approach. In future questionnaires the same approach to collecting age and age band will be taken, but the 65-74 and 75+ bands will be added so that the age bands align with how age has been reported in the data tables in Waves 1-4. 

Table 7: Demographic profile of survey respondents - Ethnicity

Demographic Percentage of online participants Percentage of postal participants Percentage of total participants
White 90.3% 94.6% 91.9%
Mixed 1.1% 0.7% 1.0%
Asian or Asian British 3.8% 2.2% 3.2%
Black or black British 1.0% 1.1% 1.0%
Other ethnic group 0.5% 0.3% 0.4%
Prefer not to say 3.2% 1.1% 2.5%

Table 8: Demographic profile of survey respondents - Household size

Demographic Percentage of online participants Percentage of postal participants Percentage of total participants
1 12.8% 22.6% 16.3%
2 45.7% 49.5% 47.1%
3 16.4% 12.5% 15.0%
4 16.4% 9.5% 13.9%
5+ 8.7% 5.8% 7.7%

 

Overview of weighting

The same weighting approach was taken in Wave 6 as in previous waves. Weights were initially calculated separately for each country in two stages:

1.    Calculation of selection weights (described in the section on selection weights)
2.    Calibration of selection weights to country population totals (described in section on population weights)

Next, weights were created for use in analyses of combined-country data by scaling the weighted country sample sizes to be proportional to the corresponding country population values (for adults aged 16 and over). 

Because it was not possible to include all questions in the postal questionnaires (see the section called ‘Questionnaire development and cognitive testing’), four separate question-type weights were calculated in each country, and in the combined all-country sample. These four question-type weights were designed to be used as follows:

  1. All-questionnaire weights to be used for questions asked of all sample members in all online and postal questionnaires
  2. Online questionnaire weights to be used for questions asked only of online participants (that is, questions not asked in the postal questionnaires)
  3. Online questionnaire plus Eating at Home (EH) postal questionnaire weights to be used for questions asked only of online participants and of postal respondents receiving the EH version (that is, questions not asked in the EO version of the postal questionnaire)
  4. Online questionnaire plus Eating Out (EO) postal questionnaire weights to be used for questions asked only of online participants and of postal respondents receiving the EO version (that is, questions not asked in the EH version of the postal questionnaire)

Four additional weights (one for each of these question types) were calculated for the combined all-country sample. Once the main weights were calculated as described above, supplementary ‘Wales & Welsh-England’ weights were calculated. These were designed to allow comparisons to be made between Wales and England (excluding London) after controlling for country profile differences in age within gender, ethnic group, number of adults per household, and urban-rural mix.

Calculation of selection weights

Selection weights were created to compensate for (i) variations in within-household individual selection probabilities and response propensities and (ii) the fact that, by design, some questions were not included in all questionnaires. As a maximum of two eligible adults were surveyed per household, adults in larger households were less likely to be included in the survey. Without this weight, individuals living in households in which some eligible adults were not interviewed would be under-represented relative to individuals living in households in which all eligible adults were interviewed. The weights were calculated in the following stages: 

  1. The all-questionnaire selection weight was calculated as: (number of eligible people aged 16 years or over in the household)/(number of participants in the household). 
  2. The online questionnaire selection weight was calculated as: (number of eligible people aged 16 years or over in the household)/(number of online participants in the household). 
  3. Next the Online questionnaire plus version EH postal questionnaire weight and the online questionnaire plus version EO postal questionnaire weight were calculated by doubling the value of the all-questionnaire selection weight for postal respondents relative to the corresponding value for online respondents (because the relevant questions were only asked in half the postal questionnaires).    

Values were capped to the range 1-3 for the all-questionnaire and online selection weights, and to the range 1-6 for the online questionnaire plus version EH postal questionnaire and online questionnaire plus version EO postal questionnaire weights to restrict variance inflation.
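
As a rough sketch, stages 1 and 2 together with the 1-3 caps might look like this in pandas. The household data and column names are hypothetical; this is an illustration of the stated formulas, not the survey’s production code.

```python
import pandas as pd

# Hypothetical household-level inputs: eligible adults (16+), participants,
# and online participants per household.
hh = pd.DataFrame({
    "eligible_adults":     [1, 3, 4, 2],
    "participants":        [1, 2, 2, 1],
    "online_participants": [1, 1, 2, 0],
})

# Stage 1: all-questionnaire selection weight, capped to the range 1-3.
hh["w_all"] = (hh["eligible_adults"] / hh["participants"]).clip(1, 3)

# Stage 2: online-only selection weight (households with no online
# participant get no online weight), also capped to the range 1-3.
online = hh["online_participants"].replace(0, float("nan"))
hh["w_online"] = (hh["eligible_adults"] / online).clip(1, 3)
```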

Calibration to population values

Next, selection weights were applied to the three individual country samples and each was calibrated to the corresponding country population values for the number of adults aged 16 or over by: 

(i)    age band within gender 
(ii)    geographic area (defined separately for each country) 
(iii)    deprivation quintile (calculated using each country’s multiple deprivation index). 

These weighting variables are often used as standard in social surveys because they correlate reliably with both response propensity and a wide range of survey variables. We note that in some previous rounds of the face-to-face Food and You survey, working status was used as a weighting variable instead of deprivation quintile. In previous waves of Food and You 2 it was decided not to use this variable for weighting the sample because survey fieldwork took place during the Covid-19 pandemic, during which rates of employment were likely to be unstable.  Deprivation quintile was used as a substitute indicator of general economic prosperity. This approach was taken again for Wave 6, and is expected to continue for the immediate future for comparability.

Weighting targets are shown in the next section, taken from ONS Mid 2020 Population Estimates and NISRA Mid 2020 Population Estimates.

Table 9: Population totals for age within gender in England

Age band Males Females
16 to 24 3,066,029 2,884,608
25 to 29 1,924,416 1,847,077
30 to 34 1,916,412 1,908,240
35 to 39 1,852,969 1,885,240
40 to 44 1,730,268 1,746,035
45 to 49 1,803,208 1,835,431
50 to 54 1,911,318 1,964,033
55 to 59 1,852,593 1,909,189
60 to 64 1,568,489 1,628,324
65 to 69 1,347,714 1,436,586
70+ 2,278,001 2,546,119
All 1,1651,748 1,689,851

Table 10: Population totals for age within gender in Wales

Age band Males Females
16 to 24 180,657 164,947
25 to 29 106,877 101,237
30 to 34 98,579 98,093
35 to 39 91,729 94,144
40 to 44 85,290 87,640
45 to 49 93,969 98,496
50 to 54 105,438 111,532
55 to 59 108,081 114,141
60 to 64 95,915 101,501
65 to 69 87,314 92,641
70+ 149,473 163,436
All 71,981 103,745

Table 11: Population totals for age within gender in Northern Ireland

Age band Males Females
16 to 24 104,333 96,676
25 to 29 60,377 59,442
30 to 34 62,883 63,699
35 to 39 60,758 63,594
40 to 44 56,927 61,017
45 to 49 59,844 63,095
50 to 54 63,786 66,797
55 to 59 62,595 64,908
60 to 64 53,421 55,599
65 to 69 44,862 45,831
70+ 68,762 77,834
All 32,527 50,133

Table 12: Population totals for regions in England

England Region code England Region name Population total
E12000001 North East 2,203,353
E12000002 North West 5,957,266
E12000003 Yorkshire and The Humber 4,474,428
E12000004 East Midlands 3,963,265
E12000005 West Midlands 4,791,343
E12000006 East of England 5,051,203
E12000007 London 7,149,281
E12000008 South East 7,442,850
E12000009 South West 4,664,909
Total - 45,697,898

Table 13: Population totals for regions in Wales

Wales Region Population total
North 579,711
Mid 174,082
South West 586,562
South East 1,266,501
Total 2,606,856

Table 14: Population totals for regions in Northern Ireland

Northern Ireland (Local Government District code) Northern Ireland (Local Government District name) Population total
N09000001 Antrim and Newtownabbey 113,924
N09000011 Ards and North Down 132,057
N09000002 Armagh City, Banbridge and Craigavon 168,360
N09000003 Belfast 274,369
N09000004 Causeway Coast and Glens 116,337
N09000005 Derry City and Strabane 118,371
N09000006 Fermanagh and Omagh 91,929
N09000007 Lisburn and Castlereagh 116,887
N09000008 Mid and East Antrim 112,616
N09000009 Mid Ulster 114,153
N09000010 Newry, Mourne and Down 140,697
Total - 1,499,700

Table 15: Population totals for deprivation quintiles in England

England deprivation quintile Population aged 16 and over (mid-2020)
1 9,138,329
2 9,140,152
3 9,139,700
4 9,139,337
5 9,140,380
Total 45,697,898

Table 16: Population totals for deprivation quintiles in Wales

Wales deprivation quintile Population aged 16 and over (mid-2020)
1 521,330
2 520,803
3 521,290
4 522,008
5 521,425
Total 2,606,856

Table 17: Population totals for deprivation quintiles in Northern Ireland

Northern Ireland deprivation quintile Population aged 16 and over (mid-2020)
1 299,268
2 300,459
3 299,450
4 300,395
5 300,128
Total 1,499,700


Initial calibration was carried out separately in each country for each of the four questionnaire-type weights described above. For each questionnaire-type weight, calibration adjustment factors were calculated by dividing the calibrated individual country weights by the selection weights. These adjustment factors were then capped at the 99th percentile value to limit variance inflation and applied to the selection weights to produce the final individual country weights.  

After calibration and adjustment factor capping, the individual country level weights were scaled to equalise unweighted and weighted sample sizes in each country.

The aim of these within-country calibration procedures was to match the profile of the weighted sample to that of the population aged 16 or over on gender, age band, geographic region, and deprivation quintile.  In practice, there will be slight discrepancies between weighted sample totals and population figures as a result of the adjustment factor caps.
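
The within-country calibration step can be illustrated with a toy raking (iterative proportional fitting) sketch. Everything below is invented for illustration: the real survey calibrated to age band within gender, geographic area and deprivation quintile using dedicated weighting software, not this two-margin example.

```python
import pandas as pd

# Toy sample with two weighting dimensions and starting selection weights.
df = pd.DataFrame({
    "region":   ["N", "N", "S", "S", "S", "N"],
    "quintile": [1, 2, 1, 2, 2, 1],
    "w":        [1.0, 1.5, 1.0, 2.0, 1.0, 1.0],
})

# Hypothetical population totals for each margin (consistent overall total).
targets = {
    "region":   {"N": 60.0, "S": 40.0},
    "quintile": {1: 55.0, 2: 45.0},
}

# Raking: repeatedly rescale the weights so each margin matches its target.
for _ in range(50):
    for dim, tgt in targets.items():
        totals = df.groupby(dim)["w"].sum()
        df["w"] = df["w"] * df[dim].map(lambda k: tgt[k] / totals[k])

# In the real survey, adjustment factors (calibrated weight / selection
# weight) were additionally capped at the 99th percentile to limit variance
# inflation, which is why weighted totals can differ slightly from targets.
```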

Creation of the all-country weights

An all-country version of each questionnaire type weight was then constructed by combining the individual country samples and rescaling final individual country weights so that weighted sample country proportions matched the respective country population (aged 16 years or over) proportions.
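
A minimal sketch of this rescaling, using the Wave 6 return counts from Table 3 as stand-ins for the weighted country sample sizes and the 16+ population totals from Tables 12-14 (the variable names are ours):

```python
# Weighted country sample sizes (here approximated by return counts) and
# 16+ population totals for each country, from the tables above.
sample = {"England": 3032.0, "Wales": 1315.0, "NI": 1644.0}
population = {"England": 45_697_898, "Wales": 2_606_856, "NI": 1_499_700}

total_n = sum(sample.values())        # 5,991 participants in total
total_pop = sum(population.values())

# Per-country scaling factor: population share divided by sample share.
# Multiplying each country's weights by this factor makes weighted country
# proportions match population proportions while preserving the overall
# weighted sample size.
scale = {
    c: (population[c] / total_pop) / (sample[c] / total_n)
    for c in sample
}
```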

'Wales and Welsh England' standardisation weight

This weight was designed to calibrate English sample estimates to Welsh population characteristics for comparative purposes. It was calculated from the England sample as follows:

  1. London cases were dropped (London being in many ways unique in the UK)
  2. The non-London England sample proportions were calibrated to the weighted Wales sample proportions for four variables: number of adults in the household, ethnic group, urban-rural and age by gender. These four variables were selected when the ‘Wales & Welsh-England’ weights were first constructed in Wave 1. Weighted estimates for Wales and non-London England were compared across a range of candidate variables and statistically significant differences were found for urban-rural, ethnic group, household size and age within gender.

The final weighting variables were defined as follows:

Table 18: Age within gender (male and female)

Males Females
16 to 24 16 to 24
25 to 29 25 to 29
30 to 34 30 to 34
35 to 39 35 to 39
40 to 44 40 to 44
45 to 49 45 to 49
50 to 54 50 to 54
55 to 59 55 to 59
60 to 64 60 to 64
65 to 69 65 to 69
70+ 70+

Number of adults in household:

  • 1 adult
  • 2 adults
  • 3+ adults
  • Question not answered

Ethnic group:

  • White 
  • Asian
  • Black
  • Mixed
  • Other/not answered

Urban rural:

Urban: the Output Area (OA) falls within a built-up area with a population of 10,000 or more

Rural: All other OAs

‘Wales & Welsh-England’ weights were calculated only for respondents in England outside London and in Wales (where they were the same as the individual country weight for Wales). 

Table 19: Summary list of weights and when to use each one

Weight When to be used
wt1 Estimates for all-countries: questions asked of all sample members completing the online and postal questionnaires
wt2 Estimates for all-countries: questions asked only of online participants (not asked in postal questionnaire)
wt3 Estimates for all-countries: questions asked of all sample members completing the online questionnaires and those completing EH version of the postal questionnaire
wt4 Estimates for all-countries: questions asked of all sample members completing the online questionnaires and those completing EO version of the postal questionnaire
wt5 Individual country estimates for England, Wales and Northern Ireland: questions asked of all sample members completing the online and postal questionnaire
wt6 Individual country estimates for England, Wales and Northern Ireland: questions asked only of online participants (not asked in postal questionnaire)
wt7 Individual country estimates for England, Wales and Northern Ireland: questions asked of all sample members completing the online questionnaires and those completing EH version of the postal questionnaire
wt8 Individual country estimates for England, Wales and Northern Ireland: questions asked of all sample members completing the online questionnaires and those completing version EO of the postal questionnaire
wt9 ‘Wales and Welsh-England’ estimates: questions asked of all sample members in the online and postal questionnaire
wt10 ‘Wales and Welsh-England’ estimates: questions asked only of online participants (not asked in postal questionnaire)
wt11 ‘Wales and Welsh-England’ estimates: questions asked of all sample members completing the online questionnaires and those completing EH version of the postal questionnaire
wt12 ‘Wales and Welsh-England’ estimates: questions asked of all sample members completing the online questionnaires and those completing version EO of the postal questionnaire

 

 

Overview

Questionnaire versions

As described in earlier sections, the data were collected from two sources: an online questionnaire and two postal questionnaires. The online questionnaire includes built-in routing and checks, whereas the postal questionnaires relied on correct navigation by participants, with no constraint on the answers they could give. 

In addition, the online data were available immediately in their raw form, whereas the postal questionnaire data had to be scanned and keyed as a separate process. Tick-box answers were captured by scanning, and numbers and other verbatim answers were captured by keying, with the data then encoded in an ASCII text string.

In line with standard procedures on a mixed-mode survey such as this, the online questionnaire was taken as the basis for data processing. Once that was processed then a data map/dictionary was used to match the data from the postal questionnaires with the online data.

A wide range of edits were carried out on the data followed by numerous checks. These have been detailed throughout this section.

Data editing 

Postal data - forced edits

The postal data were subject to errors introduced by participants, and edits were therefore required for these data. Five key principles for editing postal data were drawn upon:

  1. Forward editing was applied to all filtered questions. If a participant was eligible to answer a question but had not, they were assigned a code of -99 “Not stated”. 
  2. A small number of back edits were applied to a handful of variables. If a participant had answered a question but had not answered “yes” at the previous filter question a back edit was applied. This was only done on variables specified by the FSA as the forward editing approach handles the majority of the cleaning required. 
  3. A specification was created by the FSA that set out a number of variables which needed to be edited to directly match the online routing. This was applied as a post field edit to the postal data only. 
  4. If a question was incorrectly answered as a multi-code question then the responses were set to -99 “Not stated”.  
  5. On a handful of multi-code questions, participants were asked to limit their answers to a maximum of three. Where a participant gave more than three answers, three were randomly retained by running a random selection in SPSS; the process ensured no duplicate answer could be selected. 

In addition to this, where there was a multi-code variable that also had an exclusive code (such as “don’t know”), answers were edited so that valid multi-code options took priority, and conflicting exclusive codes were deleted. Where there were several exclusive codes, a hierarchy was applied.
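
The forward-edit, exclusive-code and maximum-three rules above can be sketched as follows. This is a simplified illustration: the answer codes, the -99 and 98 values, and the data are hypothetical (the real edits were run in the production data-processing pipeline, with the random selection carried out in SPSS).

```python
import random

NOT_STATED = -99
EXCLUSIVE = 98  # hypothetical exclusive 'don't know' code

# Hypothetical postal answers to one multi-code question (one list per case).
answers = [[1, 3], [2, EXCLUSIVE], [EXCLUSIVE], []]
eligible = [True, True, True, True]

cleaned = []
for codes, is_eligible in zip(answers, eligible):
    codes = list(codes)
    if is_eligible and not codes:
        codes = [NOT_STATED]          # forward edit: eligible but unanswered
    if EXCLUSIVE in codes and len(codes) > 1:
        # Valid substantive codes take priority; the conflicting exclusive
        # code is deleted.
        codes = [c for c in codes if c != EXCLUSIVE]
    if len(codes) > 3:
        codes = sorted(random.sample(codes, 3))  # max-three rule
    cleaned.append(codes)
```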

Edits to numeric answers

In Wave 6, edits were made to only one question where the answer was deemed improbable. For ‘Number of adults’ (nadult), if a participant from a multiple-response household answered that only one adult lived in that household, a post-field edit was applied to set the answer to two. This edit has a subsequent impact on any variables that use nadult as part of the filter, so some questions will show a group that looks eligible to answer but did not.
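
A minimal sketch of this post-field edit in pandas (the column names and data are hypothetical):

```python
import pandas as pd

# Hypothetical extract: households flagged as multiple-response but where a
# respondent reported only one adult living in the household.
df = pd.DataFrame({
    "respondents_in_hh": [2, 2, 1],
    "nadult":            [1, 3, 1],
})

# Post-field edit: a multiple-response household cannot contain only one
# adult, so 'nadult' is set to two in these cases.
df.loc[(df["respondents_in_hh"] > 1) & (df["nadult"] == 1), "nadult"] = 2
```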

In Waves 1-4, it was also necessary to edit ‘age’ data. However, because the postal questionnaires in Waves 5 and 6 included both an open question recording age in years and a question capturing the same information via pre-defined categories, the answers to the age-band question could be used to verify the answers to the open question in the postal data. 

Duplicate responses

Some cases were removed from the data where the participant completed both the online and the postal survey. In these instances, the online questionnaires were prioritised as these represent a more complete set of data. A total of 95 duplicates were removed from the data.
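Deduplication that prioritises online completes can be sketched in a few lines of Python. The field names are hypothetical and this is not the production process, only an illustration of the rule described above.

```python
# One record per questionnaire return; field names are hypothetical.
returns = [
    {"participant_id": 101, "mode": "postal"},
    {"participant_id": 101, "mode": "online"},
    {"participant_id": 102, "mode": "online"},
    {"participant_id": 103, "mode": "postal"},
]

def dedupe(records):
    """Keep one return per participant, preferring the online
    questionnaire where both modes are present."""
    best = {}
    for rec in records:
        pid = rec["participant_id"]
        # An online return always overwrites a postal one for the same person.
        if pid not in best or rec["mode"] == "online":
            best[pid] = rec
    return list(best.values())
```

Participant 101 completed both modes, so only their online return survives; participant 103 only returned a postal questionnaire, which is kept.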

Coding

Coding was done by Ipsos on one open ended question (FOODISSA2). Coding is the process of analysing the content of each response based on a system where unique summary ‘codes’ are applied to specific words or phrases contained in the text of the response. The application of these summary codes and sub-codes to the content of the responses allows systematic analysis of the data.

Translation of verbatims in Welsh

Participants were able to complete the survey in English and in Welsh. There were a small number of participants who chose to complete the survey in Welsh and provided verbatim text. These verbatims were translated by the FSA’s Welsh Language Unit before being coded, alongside the English responses, by Ipsos.

Ipsos coding

The codeframe for FOODISSA2 “What are your concerns about the food you eat?” was established in Wave 1, using Q.1a. “What food issues, if any, are you concerned about?” from Wave 17 of the FSA’s Public Attitudes Tracker as a basis. This coding framework was then updated throughout the analysis of Waves 2-6 to ensure that any newly emerging themes were captured; developing it in this way ensured that it provided an accurate representation of what participants said. This process continued in Wave 6, with the codeframe developed further to capture newly emerged themes. After any new codes were added to the codeframe, it was reviewed by the FSA and Ipsos research teams, with queries subsequently addressed by the coding team, before being appended to the datasets. 

Codes were grouped together into broad themes (for example, ‘Environmental and Ethical Concerns’), shown in bold text in the data tables. Some of the broad themes also had sub-themes (for example, ‘Fair Trade / Ethical’). For consistency between waves, all codes developed for the Waves 1-5 codeframes were included in the Wave 6 codeframe, including codes for which no responses were assigned at Wave 6. These codes are also present in the Wave 6 tables (and are marked as having received no responses).

Ipsos used a web-based system called Ascribe to manage the coding of all the text in the responses. Ascribe is a system which has been used on numerous large-scale consultation projects. Responses were uploaded into the Ascribe system, where members of the Ipsos coding team then worked systematically through the comments and applied a code to each relevant piece of text. 

The Ascribe system allowed for detailed monitoring of coding progress and the organic development of the coding framework (for example, the addition of new codes to capture newly appearing comments). A team of coders worked to review all the responses after they were uploaded to Ascribe, with checks carried out on 5% of responses.
 

Data checks

Checks on data

Ipsos checked the data in two ways. Firstly, the data was checked against the questionnaire, applying a check for each filter to ascertain whether a participant had correctly followed the routing. This checked 100% of the questionnaire and was run separately on the raw postal data and the raw online data. Once the data was checked, a list was produced identifying which variables required an edit; these largely related to the postal data. Any edits applied are set out in the section on Data editing.  
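A routing check of this kind can be sketched as follows; the rule tuples and variable names are hypothetical, and the real checks were applied across the full questionnaire specification.

```python
def check_routing(record, rules):
    """rules: list of (filter_q, expected_answer, dependent_q) tuples.
    A dependent question should only be answered when the filter
    question has the expected answer. Returns a list of flagged
    (variable, problem) pairs for the record."""
    flags = []
    for filter_q, expected, dependent_q in rules:
        answered_dependent = record.get(dependent_q) is not None
        passed_filter = record.get(filter_q) == expected
        if answered_dependent and not passed_filter:
            # Candidate for a back edit or deletion.
            flags.append((dependent_q, "answered but filter not satisfied"))
        if passed_filter and not answered_dependent:
            # Candidate for a forward edit to -99 "Not stated".
            flags.append((dependent_q, "eligible but missing"))
    return flags

# Hypothetical routing rule: frequency is only asked of those who eat out.
rules = [("eats_out", "yes", "eats_out_frequency")]
```

Running this over every record in the raw postal and online files would produce the kind of edit list described above.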

Once the data edits were applied, a combined dataset was created, duplicate participants were removed (as outlined in the section on duplicate responses) and the derived variables were created. 

Checks on derived variables

Derived variables were created in syntax, based on the table specification. All derived variables were checked against previous waves to ensure the values were roughly in line with expectations. Cross-checks were carried out on the syntax used to create the derivations to ensure the logic was valid. 

Once the derivations were set up, the dataset was checked by other members of the team. Some derived variables were based on a single question (for instance, age); these were checked by running tabulations in SPSS from the question from which they were derived, to check that the codes fed into the groups on the cross-breaks. Where derived variables were more complex and based on more than one question (for example, NS-SEC), more thorough checks were carried out. The NS-SEC variable, for instance, was created independently by another data manager: the questions are in line with other surveys, so an independent check was carried out to ensure that the syntax was correctly created. The checker also ran the syntax themselves to check that they could replicate the results in the data.

Checks on tables

Once the data was signed off, the tables were produced using Quantum and subsequent checks were run against the table specification. These checks ensured that all questions were included, that down-breaks included all categories from the question, that base sizes were correct (for example, for filtered questions), that base text was correct, that cross-breaks added up and used the right categories, that nets were summed using the correct codes, and that summary and recoded tables were included. Weighting of the tables was also checked by applying the correct weight to the SPSS file, then running descriptives and cross-break tabulations to check that these matched the values in the tables.
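The weighting check described above amounts to recomputing weighted counts from the respondent-level file and comparing them with the published tables. A minimal pure-Python sketch, with hypothetical respondent data:

```python
# Hypothetical respondent-level data: country and calibration weight.
rows = [
    ("England", 1.2), ("England", 0.8),
    ("Wales", 1.0), ("Wales", 1.1),
    ("Northern Ireland", 0.9),
]

# Weighted count per country (equivalent to a weighted tabulation).
totals = {}
for country, weight in rows:
    totals[country] = totals.get(country, 0.0) + weight

# Weighted percentages, to be compared against the Quantum tables.
grand_total = sum(totals.values())
weighted_pct = {c: round(100 * t / grand_total, 1) for c, t in totals.items()}
```

Any mismatch between these recomputed figures and the tables would indicate a weighting or tabulation error to raise in a change request.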

If any errors were spotted in the tables, these were then specified to the data processing team in a change request form. The data processing team then amended the tables based on this and the tables were rechecked after the changes were made. The data and table checks were carried out by a team of five people at Ipsos. 
 

List of appendices

The online questionnaire has been included as an appendix to this technical report (available as a separate PDF), and a table of methodological differences between waves is presented below. 

The rest of the documentation (listed below) will be uploaded onto the UK Data Archive:

Food and You 2 Wave 6 online questionnaire
Food and You 2 Wave 6 postal questionnaires

  • Eating at Home (England)
  • Eating at Home (Northern Ireland)
  • Eating at Home Wales (English)
  • Eating at Home Wales (Welsh)
  • Eating Out (England)
  • Eating Out (Northern Ireland)
  • Eating Out Wales (English)
  • Eating Out Wales (Welsh)

Food and You 2 Wave 6 invitation and reminder letters

  • Invitation letters (sent to main sample addresses)
  • First reminder letters (sent to main sample addresses)
  • Second reminder letter (sent to main sample addresses only)
  • Final reminder letter (sent to main sample addresses only)

Food and You 2 Wave 6 full SPSS data
Food and You 2 Wave 6 SPSS user guide
Food and You 2 Wave 6 full data tables (and user guide) for England, Wales and Northern Ireland combined 
Food and You 2 Wave 6 individual country data tables (and user guide) for England, Wales and Northern Ireland

Methodological differences between Waves 1 to 6

Table 19 compares the differences between Waves 1-6 in sample size and household participation rates. The largest sample size was in Wave 1 (21,053), dropping to 13,922 in Wave 2 and 14,115 in Waves 3 and 4. For Wave 5, the inclusion of the reserve addresses meant that the sample size was larger than in Waves 2-4, at 16,115 addresses. Reserve addresses were not used for Wave 6, but the main sample size for Wave 6 was increased slightly to 14,500, compared with 14,115 issued in Waves 3 and 4. As up to two adults per household can participate, the number of individual returns overall is always higher than the number of households participating. The number of individual returns overall was 9,319 (Wave 1), 5,900 (Wave 2), 6,271 (Wave 3), 5,796 (Wave 4), 6,770 (Wave 5), and 5,991 (Wave 6). The number of household returns overall is shown in Table 19. The highest overall response rate was achieved in Wave 3: 30.7%.

Table 19: Comparing differences between Wave 1 to 6 in sample size and participation rates

Wave Sample size (issued addresses) Number of household returns overall Overall response rate
1 21,053 6,408 30.4%
2 13,922 3,955 28.4%
3 14,115 4,338 30.7%
4 14,115 4,026 28.5%
5 16,115 4,727 29.3%
6 14,500 4,217 29.1%
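The overall response rate column in Table 19 is household returns divided by issued addresses. The figures can be reproduced with a few lines of Python:

```python
# Issued addresses and household returns per wave, from Table 19.
issued  = {1: 21053, 2: 13922, 3: 14115, 4: 14115, 5: 16115, 6: 14500}
returns = {1: 6408, 2: 3955, 3: 4338, 4: 4026, 5: 4727, 6: 4217}

# Overall response rate (%) per wave, rounded to one decimal place.
rates = {wave: round(100 * returns[wave] / issued[wave], 1) for wave in issued}
# rates == {1: 30.4, 2: 28.4, 3: 30.7, 4: 28.5, 5: 29.3, 6: 29.1}
```
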

The average length of the online questionnaire was different in each wave:

  • Wave 1 was 29 minutes and 58 seconds
  • Wave 2 was 36 minutes and 27 seconds
  • Wave 3 was 30 minutes and 13 seconds
  • Wave 4 was 37 minutes and 14 seconds
  • Wave 5 was 27 minutes and 08 seconds
  • Wave 6 was 25 minutes and 46 seconds 

To maximise response and reduce participant burden the recommended survey length for the online questionnaire is no more than 30 minutes. Waves 1, 5, and 6 were at or within the target 30-minute average length.

Table 20 shows the number of versions of the postal questionnaire used, the fieldwork period and the proportion of the outstanding sample who were sent the final reminder. In all waves other than Wave 3, two versions of the postal questionnaire were used. In Wave 3, the response rate was high enough after the second reminder for the final reminder to be sent to just two-thirds of the non-responding sample; in all other waves the final reminder was sent to all outstanding non-responding households. 

Table 20: Comparing differences between Wave 1 to 6, for postal questionnaire versions, fieldwork period and mailing sample

Wave Number of different versions of postal questionnaires for each country Fieldwork period Proportion of available sample sent final mailing
1 Two (Version A and Version B) 29 July to 6 October 2020 (about 10 weeks) 100% of non-responding households in Wales and Northern Ireland; 50% of non-responding households in England
2 Two ('Eating Out' and 'Eating at Home') 20 November 2020 to 21 January 2021 (about 9 weeks) 100% of non-responding households across the sample
3 One (with different questions in Northern Ireland than for England and Wales) 28 April to 25 June 2021 (about 8 weeks) 66.6% of non-responding households across the sample
4 Two ('Eating Out' and 'Eating at Home') 18 October 2021 to 10 January 2022 (about 12 weeks) 100% of non-responding households across the sample
5 Two (Version A and Version B) 26 April 2022 to 24 July 2022 (about 13 weeks) 100% of non-responding households across the main sample, 0% of households from the reserve sample
6 Two ('Eating Out' and 'Eating at Home') 12 October 2022 to 10 January 2023 (about 13 weeks) 100% of non-responding households across the sample

Questionnaire development

In Wave 1 a prolonged period of questionnaire development took place which involved an extensive review of questions from previous FSA surveys (Food and You and Public Attitudes Tracker). After all relevant questions were compiled a workshop with the Food and You 2 advisory group was held to discuss key priorities for the questionnaire. This was followed by a second workshop with key internal stakeholders to discuss their priorities for the questionnaire and provide Ipsos with direction regarding questionnaire content. 

Following this, draft questionnaire modules were compiled based on questions from previous FSA surveys. Numerous alterations to the wording, ordering, format and content of the questions were made in the process based on survey design best practice, with additional questions designed based on stakeholder needs. The questionnaire development stages for Waves 2-6 were much shorter as core questions and materials had been developed in Wave 1.  

Cognitive testing 

Ahead of Waves 1-4 and 6, cognitive testing was conducted to examine participant comprehension of new or potentially challenging questions. Participants for cognitive testing were recruited from Ipsos MORI’s iOmnibus recontact database and, from Wave 2 onwards, via an external Ipsos-approved supplier. In Wave 1, 26 cognitive interviews were completed. In Wave 2, 14 interviews were completed, and 20 interviews were conducted when developing the questionnaire for Waves 3-4. Cognitive testing was not used for Wave 5 as only one straightforward question was added. In Wave 6, 19 interviews were completed.

Usability testing

Prior to Wave 1 fieldwork, usability testing was also undertaken to identify areas where improvements could be made to the form and format of the questions in the online survey across different commonly used devices (for example, mobile phone, tablet, computer). Interviews were conducted over online video conferencing software, with interviewers observing participants’ journeys through the online questionnaire (using screen-share technology) and asking questions where relevant. Eleven interviews were undertaken at this stage. This helped identify formatting and layout issues with the online questionnaire, which were amended ahead of the pilot survey. Usability testing was not conducted again ahead of Waves 2-6 as the online questionnaire took the same format as the Wave 1 questionnaire.

Pilot

Prior to the main stage fieldwork for Wave 1, a pilot was conducted on the full questionnaire to understand the time it took participants to complete the questionnaire and each individual module within it. The questionnaire was tested over four days with 390 members of Ipsos MORI’s online access panel. The questionnaire took participants on average 26 minutes and 48 seconds to complete, and it was concluded that no alterations to the questionnaire’s length were needed for it to fall within the desired 30 minutes. Pilots were not conducted in Waves 2-6 as the expected completion time could be estimated from earlier waves. 

Differences in the questionnaire

Due to the modular design of Food and You 2, some questions (core modules) are asked in every wave, whereas other questions are only present in certain waves. For some questions, the base will vary between waves. This is due to changes in the questions available for filtering, and/or their inclusion in the postal questionnaire. Please see the Wave 6 Tables User Guide for details.

The table below notes which modules were present in each wave of the survey, though the content of each module varied somewhat between waves, as outlined above. 

Table 21: Questionnaire module content of each survey wave

Full list of modules from Wave 1 to 6 Waves included
About you and your household 1, 2, 3, 4, 5, 6
Food Concerns (core) 1, 2, 3, 4, 5, 6
Food you can trust (core) 1, 2, 3, 4, 5, 6
Household Food Security (Core) 1, 2, 3, 4, 5, 6
Eating at Home (core questions) 2, 4, 6
Eating at Home (full module) 1, 5
Food Shopping 1, 3, 5
Defra Questions 1, 3, 5
Eating Out 2, 4, 6
Online Food Platforms 3, 5
Food Hypersensitivities (core questions) 1, 3, 4, 5
Food Hypersensitivities (full module) 2, 6
Healthy Eating (Northern Ireland only) 3
Emerging Issues 4

Differences in fieldwork

Fieldwork dates

The Food and You 2 survey is intended to take place every six months. However, the length of the initial questionnaire development led to a later start in its first year. The fieldwork dates of each wave are as follows:

  • Wave 1: 29th July 2020 to 6th October 2020 (about 10 weeks)
  • Wave 2: 20th November 2020 to 21st January 2021 (about nine weeks)
  • Wave 3: 28th April to 25th June 2021 (about eight weeks)
  • Wave 4: 18th October 2021 to 10th January 2022 (about 12 weeks)
  • Wave 5: 26th April 2022 to 24th July 2022 (about 13 weeks)
  • Wave 6: 12th October 2022 to 10th January 2023 (about 13 weeks)

Sample sizes 

There were just over 21,000 addresses issued in Wave 1, leading to 9,319 individual returns. Since this was much higher than the target of 5,600 individual returns, only 13,922 addresses were issued in Wave 2, and 14,115 addresses were issued in Waves 3 and 4. For Wave 5, 16,115 addresses were issued (14,115 from the main sample and 2,000 from the reserve sample). 14,500 addresses were issued for Wave 6. 

Vouchers 

As an experiment, each adult who completed the questionnaire in Wave 1 received either a £15 online voucher, a £10 online or paper voucher, or a £5 online or paper voucher. Based on the results, respondents in Waves 2-6 received only the £10 voucher. The experiment process and results were summarised in an article published on the Social Research Association (SRA) website (Volume 11, Summer 2021).

Postal questionnaires

When postal questionnaires were sent out in Waves 1, 2 and 4-6, the version was assigned to person one and person two in the household on a quasi-random basis. This meant half contained questions from one module and the rest contained questions from another module. However, in Wave 3, one of the modules was only relevant to residents of Northern Ireland. Therefore, the content of the postal questions varied on a country basis rather than randomly.

Reminders

In Wave 1, the final reminder was sent to all outstanding non-responding households in Wales and Northern Ireland, and to 50% of those in England. In Wave 3, the response rate was high enough after Mailing 3 for the final reminder to be sent to just two-thirds of the non-responding sample. In Waves 2, 4, 5, and 6, all non-responding sample received a final reminder. In Wave 5, main sample cases received up to three reminders whereas reserve sample cases received up to two – the final reminder was not issued to reserve sample addresses due to an increase in response.
 

Differences in weighting

Overall, the same weighting approach was taken in Waves 1-6. However, in each wave, some additional weights are needed for those questions that are not asked to all postal respondents. These additional weights will vary between waves depending on which questions are included.

Waves 4-6 have “Welsh and Welsh-England” weights to easily compare Welsh respondents against an English population calibrated to have similar demographic characteristics. In Waves 1 to 3, the weights for the calibrated English population were called “Welsh England” weights, and the corresponding weights for Welsh respondents were formally part of the individual country-level weights.

Differences in data validation and management

In Waves 1 and 2, the tables were created from the underlying data independently of the SPSS dataset. From Wave 3 onwards, syntax produced the derived variables in SPSS, and this was used to produce the tables in Quantum. As part of this change, the data validation procedures were reviewed and the following improvements made: 

  • in all waves, back editing and forward editing were applied to inconsistencies in the postal data, with a smaller amount of back editing applied to the data for Waves 3-6 than in other waves. Back editing meant that if a filtered question was answered but the filter origin question contradicted that answer (blank or different), the origin question was changed to be consistent with the answer to the filtered question. Forward editing meant that if a participant answered a question but did not follow the routing to answer the next filtered question, they were assigned a code of -99 “Not stated”.
  • in Waves 1 and 2, if a question was incorrectly answered as a multi-code question when only one answer should have been selected, then a digit from the participant ID was used to randomly select an answer. In Waves 3-6, the responses were set to -99 “Not stated”.
  • from Wave 3 onwards, an edit was introduced to correct the number of adults when participants from a multiple response household answered that only one adult lived in that household.
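The back- and forward-editing rules above can be sketched as two small functions. This is an illustrative Python sketch, not the production SPSS syntax, and the question names used are hypothetical.

```python
NOT_STATED = -99

def forward_edit(record, filter_q, expected, dependent_q):
    """If the filter question was answered in a way that routes to a
    follow-up, but the follow-up was skipped, set the follow-up to
    -99 'Not stated'."""
    if record.get(filter_q) == expected and record.get(dependent_q) is None:
        record[dependent_q] = NOT_STATED
    return record

def back_edit(record, filter_q, expected, dependent_q):
    """If a routed follow-up was answered but the filter origin
    question is blank or contradictory, change the origin question
    to the answer implied by the follow-up."""
    if record.get(dependent_q) not in (None, NOT_STATED) and record.get(filter_q) != expected:
        record[filter_q] = expected
    return record
```

Forward editing resolves "eligible but missing" cases, while back editing resolves "answered but filter not satisfied" cases; as noted above, back edits were only applied to variables specified by the FSA.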

Ipsos standards and accreditations

Ipsos’s standards and accreditations provide our clients with the peace of mind that they can always depend on us to deliver reliable, sustainable findings. Moreover, our focus on quality and continuous improvement means we have embedded a ‘right first time’ approach throughout our organisation.

Market Research ISO 20252 Certificate

This is the international market research specific standard that supersedes BS 7911/MRQSA and incorporates IQCS (Interviewer Quality Control Scheme). It covers the five stages of a Market Research project. Ipsos UK was the first company in the world to gain this accreditation. 

Information Security ISO 27001 Certificate

This is the international standard for information security designed to ensure the selection of adequate and proportionate security controls. Ipsos UK was the first research company in the UK to be awarded this in August 2008.

Company Quality ISO 9001 Certificate

This is the international general company standard with a focus on continual improvement through quality management systems. In 1994, we became one of the early adopters of the ISO 9001 business standard.

Market Research Society (MRS) Company Partners Certificate

By being an MRS Company Partner, Ipsos endorses and supports the core MRS brand values of professionalism, research excellence and business effectiveness, and commits to comply with the MRS Code of Conduct throughout the organisation. Ipsos was the first company to sign up to the requirements and self-regulation of the MRS Code; more than 350 companies have followed our lead.

The UK General Data Protection Regulation (UK GDPR) and the UK Data Protection Act 2018 (DPA)

Ipsos UK is required to comply with the UK General Data Protection Regulation and the UK Data Protection Act 2018, which cover the processing of personal data and the protection of privacy.

Cyber Essentials Certificate

Cyber Essentials is a government-backed scheme and a key deliverable of the UK’s National Cyber Security Programme. Ipsos UK was first assessment-validated for certification in 2016, and this is renewed annually. Cyber Essentials defines a set of controls which, when properly implemented, provide organisations with basic protection from the most prevalent forms of threat coming from the internet. 

Fair Data 

Ipsos UK is signed up as a ‘Fair Data’ company, agreeing to adhere to ten core principles. The principles support and complement other standards, such as ISO standards and the requirements of data protection legislation.