
Food and You 2: Technical Report

This technical report is for the Food and You 2 survey. This report outlines the questionnaire development, sampling, fieldwork and weighting approach.

Last updated: 26 September 2024

Survey Background

The Food and You 2 Survey was commissioned by the Food Standards Agency (FSA) in September 2019. Fieldwork has been conducted on a biannual basis since July 2020.

Food and You 2 is conducted among a cross-section of adults (aged 16 years or over) living in households in England, Wales and Northern Ireland. Adults invited to take part in the survey are selected from a sample of the Royal Mail’s Postcode Address File (PAF) using a random probability sampling methodology. The survey is conducted using a push-to-web methodology. This is a quantitative data collection method in which participants are contacted using an offline means of contact and asked to complete an online survey. In this survey, participants are contacted by letter, with those who choose not to complete the online survey, after the initial reminder, subsequently sent a postal version. The survey explores participants’ food-related knowledge, behaviours and attitudes.

Details about each wave of fieldwork including fieldwork dates and response rates can be found in the accompanying technical report spreadsheet published alongside each wave.  

In Wave 8 Scotland was included for the first time, funded by Food Standards Scotland (FSS). Unless mentioned separately, all content in this report is applicable to England, Wales, Northern Ireland and Scotland.

About the Food Standards Agency

The Food Standards Agency (FSA) is an independent Government department working to protect public health and consumers' wider interests in relation to food in England, Wales, and Northern Ireland. The FSA's overarching mission is "food you can trust", which means the Agency strives towards a food system in which food is safe, food is what it says it is, and food is healthier and more sustainable. As such, understanding consumers' attitudes, knowledge and behaviour in relation to food is of vital importance to the FSA.

Food and You 2 is the FSA’s principal source of methodologically robust and representative evidence regarding consumers’ attitudes, knowledge and behaviour in relation to food. This survey has an important role in measuring the FSA’s progress towards its strategic objectives, providing evidence to support its communication campaigns and other activities, and identifying topics for further research or action.

The FSA also monitors consumer behaviour and attitudes in its online, monthly Consumer Insights tracker.

About Food Standards Scotland

Food Standards Scotland (FSS) is the independent government agency responsible for food and feed safety and standards, and healthy eating, in Scotland. FSS's vision is for a safe, healthy and sustainable food environment that benefits and protects the health and well-being of everyone in Scotland. FSS aims to use the data and evidence from Scotland to provide assurance and advice that inspires consumer confidence and improves public health.

FSS also monitors consumer behaviour and attitudes in its online Food in Scotland Consumer tracker survey.

History of Food and You

Since its inception in 2000, the FSA has commissioned surveys to collect quantitative data on the public’s reported behaviour, attitudes and knowledge relating to food. Between 2000 and 2007 the FSA conducted an annual Consumer Attitudes Survey (CAS). In 2010, this was replaced by the more rigorous ‘Food and You’, a biennial survey conducted face-to-face. Food and You became the FSA’s flagship social survey. In addition, the FSA conducted regular tracking surveys including the biannual Public Attitudes Tracker and annual Food Hygiene Rating Scheme (FHRS) Consumer Attitudes Tracker. The FHRS is a scheme that helps consumers choose where to eat out or shop for food by giving clear information about the businesses’ hygiene standards. The scheme is run in partnership with local authorities in England, Wales and Northern Ireland.

In 2018, the FSA’s Advisory Committee for Social Science (ACSS) recommended that Food and You and the Public Attitudes Tracker be replaced with a new ‘push-to-web’ survey. Food and You 2 was commissioned in 2019 with data collection for Wave 1 commencing in July 2020.

Due to differences in the survey methodologies, comparisons cannot be made between Food and You or the Public Attitudes Tracker and Food and You 2; Wave 1 of Food and You 2 in 2020 therefore represented the start of a new data time series. Data are collected through Food and You 2 on a biannual basis.

Summary of the survey

Design

The research is conducted using a push-to-web methodology, whereby households selected to take part receive a letter inviting them to complete the Food and You 2 survey online. Up to two adults in each household can take part. Fieldwork for Waves 1 to 4 was conducted while restrictions were in place due to the COVID-19 pandemic. In Waves 5 and 6 all restrictions had been removed; however, some questions asked participants about their behaviour in the 12 months prior to the survey fieldwork, when restrictions would have been in place in all countries to some extent. Restrictions may have impacted some participants’ behaviours relating to food, and in turn may have impacted how participants answered certain questions.

In this study, the fieldwork is structured around four mailings: 

  • Mailing 1: Initial invitation letter inviting up to two individuals per household to complete the Food and You 2 survey online
  • Mailing 2: Reminder letter
  • Mailing 3: Second reminder, which includes up to two versions of a postal questionnaire
  • Mailing 4: Final reminder letter 

Mailings 2, 3 and 4 are sent only to households that have not completed the survey since the previous mailing, and to households with a known second eligible adult who has not yet completed the questionnaire. The questionnaire asks how many adults live in the household. If one person responds and states that they are the only adult in the household, no reminder letter is sent; if they state that more than one adult is present, the household is sent a reminder unless both adults have completed the survey.
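As an illustration of this reminder rule, the sketch below expresses it as a simple function. It is a minimal sketch under stated assumptions, not the actual survey scripts; the field names (`adults_in_household`, `completed_surveys`) are hypothetical.

```python
def needs_reminder(adults_in_household: int, completed_surveys: int) -> bool:
    """Return True if a household should receive the next mailing.

    A household is reminded if nobody has responded yet, or if one adult has
    responded but reported that a second eligible adult (aged 16 or over)
    lives there. Field names are illustrative, not the survey variables.
    """
    if completed_surveys == 0:
        return True                          # no response yet: always remind
    eligible = min(adults_in_household, 2)   # at most two adults can take part
    return completed_surveys < eligible      # remind only if a second adult is outstanding
```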

Questionnaire 

The survey includes an online version of the questionnaire and two postal versions. In both modes there are slight differences between the questionnaires by country, reflecting the different regional government bodies and their roles and responsibilities. For participants in Wales, both the online and postal surveys are offered in Welsh and English. Participants can take part in Food and You 2 via the online survey or using a postal survey.

The online questionnaire is structured as a series of modules covering key areas of interest to the FSA/FSS. Most questions are behavioural, asking participants to state their usual activities or to recall recent actions. A smaller number of questions are attitudinal, asking participants to state their opinions on various subjects, or knowledge-based, for example asking participants what they think the temperature inside their fridge should be. The questionnaire includes demographic questions to allow the FSA to conduct subgroup analysis on the data. When analysing data from Food and You 2, it is important to note that behaviours are self-reported and therefore may not reflect actual behaviour. Measures are taken to minimise the impact of social desirability (for instance, stating that results are reported anonymously) and to increase accuracy (including time frames), but there is likely to be some difference in self-reported and actual observable measures.

Due to the length and complexity of the online questionnaire it is not possible to include all questions in the postal version of the questionnaire. The postal version of the questionnaire needs to be shorter and less complex to encourage a high response rate, so in most waves, two versions are produced for each country. Key modules (e.g. About You, food concerns, food security) are asked in all versions of the postal surveys, with the remaining content divided across the country-specific versions depending on available space. Details of the modules included within Food and You 2 can be found below:

  • About You and Your Household (Core)
  • Food Concerns (Core)
  • Food You Can Trust (Core)
  • Household Food Security (Core)
  • Eating at Home (Core questions)
  • Eating at Home (Full module)
  • Food Shopping
  • Defra Questions
  • Eating Out
  • Online Food Platforms
  • Food Hypersensitivities (Core questions)
  • Food Hypersensitivities (Full module)
  • Healthy Eating (Northern Ireland only)
  • Emerging issues

Whilst steps are taken to make the online and postal questionnaires as comparable as possible, there are minor differences in the order questions are asked, question wording and the way routing is applied. The online version of each survey can be found in appendices linked to the key findings report.

Further information on the questions asked in each module and questionnaire development can be found in the ‘Questionnaire development and cognitive testing’ section.

Sampling

At the start of each wave a random sample of addresses is drawn from the Royal Mail’s Postcode Address File (PAF), a database of all known addresses in the UK. The sample is drawn from the address list for England, Wales and Northern Ireland (and Scotland when applicable). The sample size for each country is set to provide an estimated minimum of 1,000 household responses in each of Wales and Northern Ireland, and 2,000 in England. Wales and Northern Ireland are therefore over-represented in the sample. The samples are drawn in this way to enable effective subgroup analysis of the data. When Scotland is included, the number of addresses sampled is set to provide 1,000 household responses.

The sample is further stratified by local authority to ensure even geographical spread across the three countries. The local authority classifications used are those that existed when the sample was first selected in July 2020 for Wave 1 and have not been updated to show current boundaries. Within each local authority the sample is stratified by degree of deprivation to ensure a broadly representative sample in terms of income level. In each wave, a reserve sample is also drawn at the same time as the main sample. More details on this can be found in the ‘Sampling’ section.

In each selected household, up to two adults (aged 16 years or over) are invited to participate in the survey. In the interests of maximising the response rate, no selection criteria (other than being aged 16 years or over) are imposed regarding the selection of individuals within each household.

The sampling strategy for this survey is described in greater detail in the ‘Sampling’ section.  

Weighting

Weighting is a process by which survey estimates are adjusted both to compensate for unequal selection probabilities and to reduce demographic discrepancies between the sample who completed a survey and the desired survey population - in this instance, the populations by country.

Following data collection for each wave, two kinds of weight are applied to the data. First, selection weights are calculated to equalise selection probabilities for individuals across all sampled households. Second, these weights are adjusted to ensure achieved sample estimates align with ONS country population totals for selected variables. Following this, additional weights are created for use in combined-country analyses by scaling the country sample sizes to be proportional to their corresponding country population values.

Finally, a ‘Wales & Welsh-England’ weight is calculated to permit comparisons to be made between England (excluding London) and Wales after controlling for differences in age, gender, ethnic group, household size, and urban-rural mix.

In some waves, additional weights are produced when the mode for a question is different across the countries.

The weighting process is described in greater detail in the ‘Weighting’ section.

Questionnaire design

Food and You 2 uses a sequential mixed-mode approach involving an initial online stage, with non-respondents then followed up using a postal questionnaire. Therefore, the questionnaire is designed in such a way that it can be presented online and on paper. As with many other push-to-web surveys, the online version of the questionnaire is too long and complex to translate into an equivalent self-completion questionnaire suitable for postal administration. This means there are some differences between the online and postal questionnaires. To help address this limitation, in most waves up to two versions of the postal questionnaire are developed for each country, thereby enabling more questions to be asked across the sample as a whole. However, even with two versions of the postal questionnaire, there is insufficient space to include some of the online questions.

Given the wide range of topic areas that the FSA and external stakeholders are interested in investigating, the issue of questionnaire length is considered throughout the questionnaire development period. Ipsos recommends that, in the interest of reducing drop-out rates, the online questionnaire should not take longer than 30 minutes for the average participant to complete and the postal questionnaires should not be more than 20 pages in length. This time limit for the online survey and page limit for the postal survey were recommended to minimise the risk of participants not completing the survey, and to minimise the risk of straight-lining (i.e. selecting the same answer consistently) when going through the survey.

A modular approach is required for Food and You 2 to keep the length of the survey to a maximum of 30 minutes, and to minimise the likelihood of participants starting but not completing the survey. It also maximises coverage of topics and allows for new modules or questions to be added on emerging topic areas. When developing the Food and You 2 Wave 1 questionnaire, the topic areas the FSA were interested in were grouped into broad ‘modules’ (such as food shopping, food concerns or food we can trust). These modules were then assessed for frequency of fielding (6 months, 12 months or 24 months). For instance, attitudinal questions that are used to measure the FSA’s performance (e.g. trust in the FSA) or where fluctuations over time are more likely (e.g. concerns with food) were considered to be ‘core’ and therefore collected every 6 months, whereas behavioural questions (e.g. on food practices in the home) that were relatively stable over time in previous studies were deemed appropriate for fielding less frequently.

Questionnaire development draws upon the work done for previous waves. The development for Wave 1 involved questionnaire development workshops, cognitive testing, usability testing and a pilot (covered in more detail in the Wave 1 Technical Report). The questionnaire development for Wave 2 onwards was shorter as core questions and materials had been developed in Wave 1. When newly developed questions are added to the survey, a phase of cognitive testing is held to test consumer understanding. For waves with few new questions, cognitive testing is not conducted.

Design of questions

The content and nature of the questions is informed by previous research conducted by the FSA, the FSA and stakeholders’ research priorities, and by Ipsos’ prior experience in survey research.

In Wave 1 a prolonged period of questionnaire development took place which involved an extensive review of questions from previous FSA surveys (Food and You and Public Attitudes Tracker). After all relevant questions were compiled, a workshop with the Food and You 2 advisory group was held to discuss key priorities for the questionnaire. This was followed by a second workshop with key internal stakeholders to discuss their priorities for the questionnaire and provide Ipsos with direction regarding questionnaire content.

Following this, draft questionnaire modules were compiled based on questions from previous FSA surveys. Numerous alterations to the wording, ordering, format and content of the questions were made in the process based on survey design best practice, with additional questions designed based on stakeholder needs.

To determine content for the questionnaire for each wave, meetings were held between Ipsos, the FSA and key stakeholders to discuss research priorities and to decide which questions from the online questionnaire should be included in the postal questionnaires.  

To enable comparability of the data between waves, questions carried over from earlier waves are largely kept consistent in wording and format. For the exceptions, see the ‘Differences between Waves’ sheets in the Tables User Guide for each wave.

Cognitive testing

In social and market research, cognitive testing refers to a form of qualitative data collection in which participants are asked by an interviewer to examine a set of materials and explain their understanding of them. In questionnaire development, cognitive testing interviews are used to evaluate how participants approach a questionnaire so that any issues regarding participant comprehension may be highlighted.

Following the completion of the first questionnaire draft, a series of cognitive testing interviews are arranged to test a sub-set of questions from the questionnaire, specifically those new or modified for that wave. Cognitive interviews are conducted with members of the public, including some conducted in the Welsh language. During recruitment participants are screened on age, gender, ethnicity, geographical region, employment status, income and, depending on the focus of the topics, whether or not they have a food allergy, intolerance or Coeliac disease. This ensures people with relevant food behaviours and habits are spoken to, which is important for assessing the questions.

Key aims of the cognitive testing are:

  • to gauge the simplicity of questions and participant comprehension of key terms;
  • to note any ambiguity in the interpretation of the questions; and
  • to identify any questions that may not produce meaningful data.

The Welsh language interviews also help to evaluate the accuracy and clarity of the translations. 

Each cognitive interview is undertaken with a single participant, lasts approximately one hour, and is conducted by a moderator using online video conferencing software. During each interview, the moderator records the participant’s answers and notes further observations regarding how the participant interprets the questionnaire, with attention paid to any problems encountered. The English language interviews are conducted by moderators from Ipsos, while the Welsh language interviews are conducted by a trusted external qualitative researcher. Some of the interviews are conducted in the (virtual) presence of an observer from the FSA.

Following completion of the interviews, Ipsos submits a written report to the FSA detailing the findings with recommendations. An extended meeting is subsequently held to discuss the findings and agree on further edits to the questionnaire.

The specific number of cognitive interviews conducted in each wave is shown in the accompanying technical report spreadsheet.

Postal questionnaire design and modular approach

The postal questionnaires consist of a selection of questions from the online survey. The full questionnaire is not included in the postal versions due to concerns regarding questionnaire length.

Questions are selected for inclusion in the postal questionnaires based on a number of factors. For instance, questions that are a key strategic measure for the FSA (e.g. trust in the FSA) are included to provide the FSA with robust data. Questions are also included to maximise the base sizes for specific groups of interest (e.g. participants with food allergies). Finally, questions are included where the mode of delivery and sample profile may impact the data collected, for example questions on food security. It is important to include the majority of the demographic questions in the postal survey to enable subgroup analysis.

As with the online questionnaire, there are country-based differences in the wording of a small number of questions. Participants in Wales are sent copies of the questionnaires in English and in Welsh.

As noted, the survey is conducted using a modular approach. Certain ‘core’ modules are included in each biannual survey wave, while others are rotated every 12 or 24 months. The content of the survey for each wave is noted in the Appendices.

Sample design

The sample for each wave of Food and You 2 is selected from the Postcode Address File (PAF) in England, Wales and Northern Ireland (and Scotland, when applicable). The sample of addresses is unclustered within each country. Households are sampled to achieve responses from 1,000 households in each of Wales and Northern Ireland, and 2,000 households in England (4,000 overall). In other words, a greater proportion of households are sampled in Wales and Northern Ireland compared to England. This is done to improve the precision of estimates for Wales and Northern Ireland. When Scotland is included, the sample of addresses is designed to achieve 1,000 household responses.

The size of the issued sample in each country is calculated by dividing the target sample by estimated address yield (proportion of addresses with at least one productive response). Yield estimates are based on actual yields obtained in previous waves. An additional reserve sample is drawn to be issued (in whole or in part) if response rates are lower than anticipated.
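As a worked illustration of this calculation (the figures below are hypothetical, not the actual targets or yields):

```python
import math

def issued_sample_size(target_households: int, estimated_yield: float) -> int:
    """Addresses to issue so that, at the estimated address yield (the
    proportion of addresses returning at least one completed survey), the
    target number of responding households is expected to be met."""
    return math.ceil(target_households / estimated_yield)

# Hypothetical example: a 20% address yield implies issuing 5,000 addresses
# to expect 1,000 responding households.
print(issued_sample_size(1_000, 0.20))  # -> 5000
```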

The sample sizes and assumptions for each country can be found in the accompanying technical report spreadsheet.

The samples of main and reserve addresses are stratified proportionately by region in England (with the other countries treated as separate regions), and within region (or country) by local authority (district in Northern Ireland) to ensure that the issued sample is spread proportionately across the local authorities. National deprivation scores are used as the final level of stratification within the local authorities (in England the Index of Multiple Deprivation (IMD), in Wales the Welsh Index of Multiple Deprivation (WIMD), in Northern Ireland the Northern Ireland Multiple Deprivation Measure (NIMDM), and in Scotland the Scottish Index of Multiple Deprivation (SIMD)). In practice, stratification is achieved by ordering the population of PAF addresses by (i) region (country), (ii) local authority (district) within region, and (iii) national deprivation score of Lower Layer Super Output Areas (LSOA) (SOA in Northern Ireland) within local authority (district), and then selecting addresses by the method of random start and fixed interval. The sampling steps, illustrated in the sketch after the list, are:

  1. From the PAF file, exclude all business addresses and private addresses that were selected in previous waves of the Food & You 2 survey
  2. Order the address list by region (for England only)
  3. Within each English region, and country, order addresses by local authority (district in Northern Ireland)
  4. Within local authority / district, order addresses by IMD of LSOA in England, WIMD of LSOA in Wales, NIMDM of Super Output Areas (SOA) in Northern Ireland, and Scottish IMD (SIMD) data zones in Scotland
  5. Select numbers of addresses agreed with the FSA by method of random start and fixed interval from these ordered lists
  6. Divide stratum-ordered selections into successive groups of 3 selections
  7. Within each group of three, randomly allocate two cases to the main sample, and one case to the reserve sample.
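A minimal sketch of steps 5 to 7, assuming the address list has already been ordered as in steps 1 to 4; the function name and data structures are illustrative, not taken from the actual sampling programs.

```python
import random

def select_addresses(ordered_addresses, n_to_select):
    """Pick addresses by random start and fixed interval from a
    stratum-ordered list, then split each successive group of three
    selections into two main-sample cases and one reserve-sample case."""
    interval = len(ordered_addresses) / n_to_select
    start = random.uniform(0, interval)
    selected = [ordered_addresses[int(start + i * interval)]
                for i in range(n_to_select)]      # step 5: systematic selection

    main, reserve = [], []
    for i in range(0, len(selected), 3):          # step 6: groups of three
        group = selected[i:i + 3]
        random.shuffle(group)                     # step 7: random allocation within the group
        main.extend(group[:2])
        reserve.extend(group[2:3])
    return main, reserve
```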

Household sample design

As stated above, addresses are selected from the Postcode Address File (PAF) systematically using the random start and fixed interval method. At each address, up to two adults are invited to take part in the survey. Two unique login codes for the online survey are provided in the initial invitation. Up to two postal questionnaires are provided in the postal questionnaire mailing (Mailing 3). In the reminders, two logins / questionnaires are sent to completely non-responding addresses. At any address where one adult has already completed the questionnaire only one login code and one postal questionnaire are sent. Each adult who completes the questionnaire receives a £10 online or paper voucher.

Process for selecting adults within a household

There are many approaches that could have been used for selecting adults within households. For instance, the two adults with the most recent birthdays or the adults with the two next birthdays could be selected. These are commonly referred to as quasi-random approaches, as they are roughly equivalent to a fully random approach. While this would have randomised the selection process to a degree in households where there were more than two adults, in self-administered surveys it adds another barrier to completing the survey and has been shown to be incorrect in about 20% to 25% of cases. Further details are available from TNS BMRB’s 2013 report of web experiments prepared for the Cabinet Office on the Community Life Survey or a journal article from 2014 by Kristen Olson and Jolene D. Smyth focusing on the accuracy of within-household selection in general population web and mail surveys published in Field Methods (volume 26, issue 1, pages 56–69).

With this in mind, it was decided to allow any two eligible adults (aged 16 years or over) to participate in the survey. Given the household size distribution in the UK, it was estimated that 93% of the sample selected in this way would also have been selected had we managed to successfully implement a random selection method.

This approach is taken for all waves of the Food and You 2 survey.

Using the reserve sample

After four weeks of fieldwork, the number of completed online surveys is reviewed to determine whether it is likely that the target of 2,000 household returns for England and 1,000 household returns for each of the other countries will be met. If the target is unlikely to be met, the reserve sample may be used to meet the household returns target.

When a reserve sample is used, the same push-to-web survey design that is used for the main sample is replicated for the reserve sample, meaning that participants are able to complete the survey online or on paper.

For example, in Wave 5, to help ensure the target was met, invitation letters were sent to 2,000 additional addresses selected randomly from the reserve sample list. The reserve sample was split by country in the same proportions as the main sample (944 in England, 489 in Wales and 567 in Northern Ireland). The interval between mailings and the total number of mailings was reduced slightly compared to the main sample as described in the Fieldwork and response rates section.

Letters and reminders

Letters and reminder strategy

The mailing approach follows Ipsos’ standard push-to-web methodology:

  1. An initial invitation letter is issued to sampled addresses inviting up to two adults to go online and complete the online questionnaire.  
  2. The first reminder letter is issued around two weeks after the initial invitation. Reminder letters are only sent to non-responding addresses and addresses where one adult has completed the online questionnaire but not a second adult (the presence of an eligible second adult is determined in the first questionnaire). 
  3. The second reminder letter is only sent to non-responding addresses and addresses where one adult has completed the online questionnaire but not a second adult. All of these letters are accompanied by one or two postal questionnaires, to allow those who cannot access the internet, and those who may be less comfortable completing online questionnaires, to take part. Those in Wales receive one questionnaire in English and one in Welsh. Further detail is provided in the section on the postal questionnaire. 
  4. A final reminder letter is issued to non-responding addresses. 

When the reserve sample is used, invitation letters are sent around a month after the initial mailing, with the first reminder sent to non-responding households about a fortnight later. The second reminder, which includes the postal questionnaire, is sent about another two weeks later. Whether a final reminder letter is issued to reserve sample addresses depends on the response rate. However, as a result of the reserve sample being issued, the survey must remain open for both main and reserve sample addresses for an extended period. 

Letter design 

The principles for designing the invitation and reminder letters, which have been kept substantially the same across waves, are primarily based on the Tailored Design Method, initially developed by Don A. Dillman and described in ‘Internet, Phone, Mail and Mixed-Mode Surveys: The Tailored Design Method’ (2014) (footnote 1). A host of other literature and best practice based on previous studies (mainly the Active Lives survey and Labour Force Survey) was also used to inform the design. The main aim of the letters is to provide all the relevant information a participant requires to complete the survey, and to answer immediate questions which they may have. 

Our guiding principles for designing the letters are:

  • Use simple and easy-to-understand language, with no unnecessarily complicated text
  • Cover key messages that need to be conveyed in the letters including:

(a) Importance
(b) Motivators for taking part
(c) How to take part
(d) Your personal data are safe

Importance is conveyed in all four letters in the following ways:

  • FSA, Defra / FSS logos are prominent
  • Visual clutter which could distract from the logos and the importance of the survey is avoided
  • Professional letter format with address of recipient and full date
  • Signed by someone with authority (signified by their job title and organisation details) 
  • Highlighted key messages in the text; using these to break up the text makes it easier to read

The main motivational statements vary across the four letters, with the aim of increasing the likelihood of converting non-respondents:

  • 1st letter: It’s easy to take part and why take part
  • 2nd letter: Taking part will benefit you and your community
  • 3rd letter: We want to hear from as many people as possible
  • 4th letter: This is the last chance to have your say
  • In addition, all letters place a degree of emphasis on the financial motivator for taking part – receiving a £10 gift voucher 

In addition to this the letters also provide key information about Ipsos and the Food Standards Agency / Food Standards Scotland and contact details for Ipsos should the participant have any queries about the survey.

Online questionnaire

The Food and You 2 survey is hosted using Ipsos’ global Dimensions platform within Rackspace, a managed hosting facility and Europe’s most successful managed hosting company. The security features offered by Rackspace and Ipsos are listed below:

At Rackspace:

  • Rackspace has SAS 70 Type II and Safe Harbor certifications and operates management systems which are compliant with international standards (ISO 9001, ISO 27001);
  • The servers and network infrastructure are physically located in England;
  • The servers and network components are fully redundant;
  • Rackspace guarantees recovery of hardware failures within one hour.

At Ipsos:

All applications and data for Dimensions are managed by Ipsos. Access to Dimensions’ questionnaires and data is password protected. Only a small number of online survey experts have access. Survey data and any participants’ personal information are stored in separate databases at Ipsos.

Survey URL

A dedicated URL that includes ‘food and you’ is used for the Food and You 2 survey. When deciding on the URL we wanted to choose an address that was short enough for participants to remember and one which would not easily be mis-typed. It also needed to give some indication of the survey content.

Online questionnaire accessibility

The online questionnaire is made to be as accessible as possible to participants. Key to this is offering those in Wales the opportunity to complete the survey in Welsh (in line with the Welsh Language Act 1993). Participants can request to complete the survey in another language by calling the Food and You 2 survey helpline, or by asking someone to complete it on their behalf.

The Food and You 2 survey is designed to be accessed using a range of devices, including desktop computers, laptops, tablets and smartphones. The survey is designed with a ‘mobile first’ approach to minimise drop-offs and improve response rates. A ‘mobile first’ approach means that the online questionnaire is designed with smartphone users in mind initially, as this is increasingly how participants choose to access online questionnaires. Additionally, the online questionnaire is designed in a way that makes it easy for people to adjust colour contrasts and increase font size. 

Topline data checks

Once the online survey is in field and a sufficient number of responses have been received, topline data are reviewed to ensure no errors are detected with the filtering. Whilst extensive testing is carried out prior to fieldwork launch, on occasion unusual combinations of answers, particularly those involving non-valid answers, mean that minor errors are identified. By reviewing the data early on in fieldwork such errors can be identified and corrected.  

Postal questionnaire

At the second reminder (Mailing 3) non-responding households, and non-responding individuals in multi-adult households, are sent postal questionnaires, according to the country they live in.

In most waves there are two postal versions for each country. For these waves, person 1 and person 2 at each address are assigned either version A or B on a quasi-random basis. Non-responding households are sent versions A and B to complete (in Wales, an English and a Welsh version are sent, with version A or B randomly assigned to either language). Where only one completed questionnaire from a multi-adult household is received, the version which has not yet been completed is sent to the person who has not responded (in Wales, the version that has not yet been completed is sent in both English and Welsh).  

In waves where there is only one version of the postal questionnaire for each country, non-responding households are sent two of the same postal questionnaire to complete (in Wales, this is sent in both English and Welsh). Where we have received only one completed questionnaire from a multi-adult household, one postal questionnaire is sent to the person who has not responded (in Wales, this is sent in both English and Welsh).

Storage of scanned images and survey results

All scanned images and survey data are stored securely. Only relevant staff have access to these for data processing. Images are securely deleted after fieldwork and electronic coding of postal responses has been completed.

Vouchers for participants 

Participants are offered a £10 gift voucher as a thank you for taking part in the survey.

Participants who complete the survey online and who wish to receive a voucher enter their email address at the end of the survey. Participants are then emailed a Love2shop e-voucher for this amount, which they can redeem online on the Love2shop website.

Those who complete the postal questionnaire are given the choice of receiving a Love2shop e-voucher or paper Love2shop voucher via post, either of which can be redeemed at a wide range of high street stores. Participants are asked to give their name in order to address the voucher to the correct person, but even without a name a voucher is sent to that address.

Handling queries

The survey website provides information about the survey and includes a list of frequently asked questions (FAQs) which have been developed based on similar studies.

Additionally, a dedicated freephone telephone helpline and email address are set up allowing participants to contact Ipsos if they have any queries about the survey. Telephone queries are first recorded by an answering machine and a member of the research team returns the call when they have identified an appropriate solution. Emails sent to the Food and You 2 survey inbox are first answered with automatic responses, which include the commonly asked questions and answers. Each query is then followed up individually within five working days.

Response rates

Response rates can be found in the accompanying technical report tables.

 

Overview of weighting

The same weighting approach has been taken in all waves of Food and You 2. Weights are initially calculated separately for each country in two stages:

  1. Calculation of selection weights (described in the section on selection weights)
  2. Calibration of selection weights to country population totals (described in section on population weights)

Next, weights are created for use in analyses of combined-country data by scaling the weighted country sample sizes to be proportional to the corresponding country population values (for adults aged 16 and over).

Because it is not possible to include all questions in the postal questionnaires (see the section called ‘Questionnaire development and cognitive testing’), four separate question-type weights are calculated in each country, and in any combined country sample. For most waves, four question-type weights are designed to be used as follows:

  1. All-questionnaire weights to be used for questions asked of all sample members in all online and postal questionnaires
  2. Online questionnaire weights to be used for questions asked only of online participants (i.e., questions not asked in the postal questionnaires)
  3. Online questionnaire plus version 1 postal questionnaire weights to be used for questions asked only of online participants and postal questionnaire respondents receiving only one of the postal questionnaires
  4. Online questionnaire plus version 2 postal questionnaire weights to be used for questions asked only of online participants and postal questionnaire respondents receiving only the other postal questionnaire

Four additional weights (one for each of these question types) are calculated for any combined country sample.  

However, in waves where the healthy eating module is included, one of the two postal questionnaires contains these Northern Ireland-only questions, with the other postal questionnaire only used in the other countries. This means:

  1. All-questionnaire weights are used for questions asked to all sample members in all online and postal questionnaires 
  2. Online questionnaire weights are used for questions asked only to online participants
  3. All-countries Defra questions weight is calculated for a selection of questions asked of all participants in England and Wales, but only online participants in Northern Ireland.
  4. Combined England & Wales online questionnaire weight is used for questions which are asked only in England and Wales and only in the online questionnaires offered in those countries.

Online questionnaire plus Version 1 postal questionnaire weights and online questionnaire plus Version 2 postal questionnaire weights are not relevant in waves when there is a Northern Ireland specific postal questionnaire.

Once the main weights are calculated as described above, supplementary ‘Wales & Welsh-England’ weights are calculated. These are designed to allow comparisons to be made between Wales and England (excluding London) after controlling for country profile differences in age within gender, ethnic group, number of adults per household, and urban-rural mix.

Calculation of selection weights

Selection weights are created to compensate for (i) variations in within-household individual selection probabilities and response propensities and (ii) the fact that, by design, some questions are not included in all questionnaires. As a maximum of two eligible adults are surveyed per household, adults in households with more than two adults are less likely to be included in the survey. So without this weight, individuals living in households in which some eligible adults are not interviewed would be underrepresented relative to individuals living in households in which all eligible adults are interviewed. They are calculated in the following stages:

  1. The all-questionnaire selection weight is calculated as: (number of eligible people aged 16 years or over in the household)/(number of participants in the household). 
  2. The online questionnaire selection weight is calculated as: (number of eligible people aged 16 years or over in the household)/(number of online participants in the household). 
  3. Next the Online questionnaire plus version 1 postal questionnaire weight and the online questionnaire plus version 2 postal questionnaire weight are calculated by doubling the value of the all-questionnaire selection weight for postal respondents relative to the corresponding value for online respondents (because the relevant questions are only asked in half the postal questionnaires).   

Values are capped to the range 1-3 for the all-questionnaire and online selection weights, and to the range 1-6 for the online questionnaire plus version 1 postal questionnaire and online questionnaire plus version 2 postal questionnaire weights to restrict variance inflation.
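A minimal sketch of the three weights described above for a single household, assuming illustrative argument names (`n_eligible_adults`, `n_participants`, `n_online`, `mode`) rather than the actual dataset variables:

```python
def selection_weights(n_eligible_adults, n_participants, n_online, mode):
    """Simplified per-participant selection weights.

    'mode' is 'online' or 'postal' for the participant in question. The
    version-specific postal weights double the all-questionnaire weight for
    postal respondents, because each postal version reaches only half of them.
    """
    all_q = n_eligible_adults / n_participants                        # stage 1
    online_q = n_eligible_adults / n_online if n_online else None     # stage 2 (online respondents only)
    version_q = all_q * (2 if mode == "postal" else 1)                # stage 3

    def cap(w, lo, hi):
        return None if w is None else min(max(w, lo), hi)

    return {
        "all_questionnaire": cap(all_q, 1, 3),
        "online_only": cap(online_q, 1, 3),
        "online_plus_one_postal_version": cap(version_q, 1, 6),
    }
```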

Calibration to population values

Next, selection weights are applied to the three individual country samples and each is calibrated to the corresponding country population values for the number of adults aged 16 or over by:
(i) age band within gender
(ii) geographic area (defined separately for each country)
(iii) deprivation quintile (calculated using each country’s multiple deprivation index)

These weighting variables are often used as standard in social surveys because they correlate reliably with both response propensity and a wide range of survey variables. We note that in some previous rounds of the face-to-face Food and You survey, working status was used as a weighting variable instead of deprivation quintile. In Food and You 2 it was decided not to use this variable for weighting the sample because early waves of fieldwork took place during the Covid-19 pandemic, during which rates of employment were likely to be unstable.  Deprivation quintile was used as a substitute indicator of general economic prosperity. This approach is expected to continue for the immediate future for comparability.

Weighting targets are taken from recent ONS Mid Population Estimates and NISRA Mid Population Estimates. The estimates used are provided in the accompanying technical report tables.

Initial calibration is carried out separately in each country for each of the four questionnaire type weights described above. For each questionnaire type weight, calibration adjustment factors are calculated by dividing the individual country weights by the selection weights. These adjustment factors are then capped at the 99th percentile value to limit variance inflation and applied to the selection weight to produce final individual country weights. 

After calibration and adjustment factor capping, the individual country level weights are scaled to equalise unweighted and weighted sample sizes in each country.

The aim of these within-country calibration procedures is to match the profile of the weighted sample to that of the population aged 16 or over on gender, age band, geographic region, and deprivation quintile.  In practice, there are slight discrepancies between weighted sample totals and population figures as a result of the adjustment factor caps.
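The adjustment-factor capping and rescaling can be sketched as follows. This is a minimal illustration, assuming the calibrated weights have already been produced by a separate calibration (raking) step against the population totals; it is not the production weighting syntax.

```python
import numpy as np

def cap_and_rescale(selection_w, calibrated_w):
    """Cap calibration adjustment factors at the 99th percentile, re-apply
    them to the selection weights, then scale so the weighted sample size
    equals the unweighted sample size."""
    selection_w = np.asarray(selection_w, dtype=float)
    calibrated_w = np.asarray(calibrated_w, dtype=float)

    factor = calibrated_w / selection_w                      # calibration adjustment factor
    factor = np.minimum(factor, np.percentile(factor, 99))   # cap to limit variance inflation
    final_w = selection_w * factor                           # capped factor applied to selection weight

    final_w *= len(final_w) / final_w.sum()                  # equalise weighted and unweighted sample sizes
    return final_w
```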

Creation of all-country weight

An all-country version of each questionnaire type weight is then constructed by combining the individual country samples and rescaling final individual country weights so that weighted sample country proportions match the respective country population (age 16 years or over) proportions.

The all-countries Defra weight is constructed using the same rescaling process described above but using the all-questionnaire individual country weights for England and Wales (and in Scotland where applicable) and the online questionnaire weight for Northern Ireland.

Where a combined online questionnaire weight is needed that excludes a country because a question is not asked of those respondents (so far, this has only been Northern Ireland), it is constructed from the online-only individual country weights for the included countries, as described above. Whereas the all-countries weights are rescaled to match the respective (age 16+) population proportions for all countries, this weight is rescaled to match the (age 16+) population proportions of the included countries only.
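A minimal sketch of this rescaling step, using placeholder population figures rather than the official ONS/NISRA estimates; the field names are illustrative:

```python
def rescale_to_country_shares(rows, pop_16plus):
    """Rescale final within-country weights so that the combined sample's
    weighted country shares match the adult (16+) population shares of the
    countries present in 'rows'. Each row holds 'country' and 'final_weight'."""
    weighted_totals = {}
    for r in rows:
        weighted_totals[r["country"]] = weighted_totals.get(r["country"], 0.0) + r["final_weight"]
    total_weighted = sum(weighted_totals.values())
    total_pop = sum(pop_16plus[c] for c in weighted_totals)   # only the countries in the sample

    for r in rows:
        target_share = pop_16plus[r["country"]] / total_pop
        current_share = weighted_totals[r["country"]] / total_weighted
        r["all_country_weight"] = r["final_weight"] * target_share / current_share
    return rows

# Placeholder figures for illustration only, not official population estimates.
pop_16plus = {"England": 45_000_000, "Wales": 2_600_000, "Northern Ireland": 1_500_000}
```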

‘Wales & Welsh-England’ standardisation weight

This weight is designed to calibrate English sample estimates to Welsh population characteristics for comparative purposes. It is calculated from the England sample as follows:

  1. London cases are dropped (to make the sample characteristics more comparable with Wales, as London is in many ways unique in the UK)
  2. The non-London England sample proportions are calibrated to the weighted Wales sample proportions for four variables: number of adults in the household, ethnic group, urban-rural and age by gender. These four variables were selected when the ‘Wales & Welsh-England’ weights were first constructed in Wave 1. Weighted estimates for Wales and non-London England were compared across a range of candidate variables and statistically significant differences were found for urban-rural, ethnic group, household size and age within gender

‘Wales & Welsh-England’ weights are calculated only for respondents in England outside London. For later waves, the individual country weights for Wales are also copied into this weight, to assist with any country-level comparison analysis of England and Wales.

The final weighting variables for all weights are defined as follows.

Table 1: Age within gender (male and female)

The same age bands are used for males and for females:

  • 16 to 24
  • 25 to 29
  • 30 to 34
  • 35 to 39
  • 40 to 44
  • 45 to 49
  • 50 to 54
  • 55 to 59
  • 60 to 64
  • 65 to 69
  • 70+

Number of adults in household:

  • 1 adult
  • 2 adults
  • 3+ adults
  • Question not answered

Ethnic group:

  • White 
  • Asian
  • Black
  • Mixed
  • Other/not answered

Urban rural:

Urban: Output Area (OA) falls into a built-up area with a population of 10,000 or more

Rural: All other OAs

Overview

Questionnaire versions

As described in earlier sections, the data are collected from two sources: an online questionnaire and either one or two postal questionnaires. The online questionnaire includes some built-in routing and checks within it, whereas the postal questionnaires rely on correct navigation by participants and there is no constraint on the answers they can give.

In addition, the online data are available immediately in their raw form, whereas the postal questionnaire data must be scanned and keyed (typed in) as part of a separate process. Tick box answers are captured by scanning, and numbers and other verbatim answers are captured by keying, with the data then encoded in an ASCII text string.

In line with standard procedures on a mixed-mode survey such as this, the online questionnaire is taken as the basis for data processing. Once that is processed, a data map/dictionary is used to match the data from the postal questionnaires with the online data.

A wide range of edits are carried out on the data followed by numerous checks. These have been detailed throughout this section.

Data editing

Postal data – forced edits

The postal data are subject to errors introduced by participants. Edits are required for these data, and in some cases also to match the postal routing to the online questionnaire routing. Five key principles are drawn upon when editing the postal data:

  1. Forward editing is applied to all filtered questions. If a participant is eligible to answer a question but has not, they are assigned a code of -99 “Not stated”.
  2. A small number of back edits are applied to a handful of variables. If a participant has answered a question but has not answered “yes” at the previous filter question a back edit is applied (i.e. the original question is edited so that the data matches the routing). This is only done on variables specified by the FSA as the forward editing approach handles the majority of the cleaning required.
  3. A specification is agreed with the FSA that sets out a number of variables which need to be edited to directly match the online routing. This is applied as a post field edit to the postal data only.
  4. If a question is incorrectly answered as a multi-code question then the responses are set to -99 “Not stated”. 
  5. On a handful of questions that offer a multi-code answer we ask participants to limit their answers to a maximum of three, so edits are made where additional answers are given. A random selection is made of the given answers in SPSS and the process ensures no duplicate answer can be selected.

In addition to this, where there is a multi-code variable that also has an exclusive code (such as “don’t know”), answers are edited so that valid multi-code options take priority, and conflicting exclusive codes are deleted. Where there are several exclusive codes, a hierarchy is applied.
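The sketch below illustrates two of these principles: the precedence of substantive answers over exclusive codes, and the random trimming of over-long multi-code answers, together with the -99 “Not stated” convention. It is a simplified illustration with hypothetical function and argument names, and stands in for the SPSS edits rather than reproducing them.

```python
import random

NOT_STATED = -99

def edit_multicode(responses, exclusive_codes, max_answers=None):
    """Apply simplified postal edits to one multi-code answer set.

    - An empty answer from an eligible participant becomes -99 'Not stated'.
    - Valid substantive answers take priority over conflicting exclusive
      codes (e.g. 'don't know'); keeping the first exclusive code stands in
      for the hierarchy described in the report.
    - Where more answers are given than permitted, a duplicate-free random
      selection is kept.
    """
    if not responses:
        return [NOT_STATED]

    substantive = [r for r in responses if r not in exclusive_codes]
    responses = substantive if substantive else responses[:1]

    if max_answers is not None and len(responses) > max_answers:
        responses = random.sample(responses, max_answers)
    return responses
```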

Edits to numeric answers

In Wave 6, edits were only made to one question where the answer was deemed to be improbable or unlikely. For ‘Number of adults’, if a participant from a multiple-response household answered that only one adult lived in that household, a post-field edit was applied to set the answer to two. This edit has a knock-on effect on any variables that use nadult as part of the filter, and therefore some questions will show a group that looks eligible to answer but did not.

In Waves 1-4 it was also necessary to edit ‘age’ data. However, because the postal questionnaires in Waves 5-7 included both the open question recording age in years and a question capturing the same information via pre-defined categories, it was possible to use the answers to the age bands question to verify the answers to the open question in the postal data.
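A minimal sketch of this verification, using the age bands from Table 1; the band encoding (a zero-based index) and function name are assumptions rather than the actual dataset structure.

```python
# Age bands from Table 1; 200 is an arbitrary upper bound for the open-ended 70+ band.
AGE_BANDS = [(16, 24), (25, 29), (30, 34), (35, 39), (40, 44), (45, 49),
             (50, 54), (55, 59), (60, 64), (65, 69), (70, 200)]

def age_consistent(age_years: int, band_index: int) -> bool:
    """Check that the open 'age in years' answer falls inside the banded answer."""
    lo, hi = AGE_BANDS[band_index]
    return lo <= age_years <= hi
```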

Duplicate responses

Duplicate responses are received each wave where participants complete the postal version of the questionnaire as well as the online version. The number of duplicate responses for each wave can be found in the accompanying technical report tables. The online version takes precedence, and the postal version is deleted.
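A minimal sketch of this de-duplication rule, assuming each record carries an illustrative ‘login’ identifier and a ‘mode’ field; these names are assumptions, not the actual dataset variables.

```python
def drop_postal_duplicates(responses):
    """Keep the online record and drop the postal one wherever a participant
    has returned both; all other records pass through unchanged."""
    online_logins = {r["login"] for r in responses if r["mode"] == "online"}
    return [r for r in responses
            if r["mode"] == "online" or r["login"] not in online_logins]
```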

Break off rates

The number of break offs (i.e. where a participant has started the online version but abandoned it) and the variable they occurred for in each wave can be found in the accompanying technical report tables.

Coding

Coding is done by Ipsos on one open ended question (FOODISSA2). Coding is the process of analysing the content of each response based on a system where unique summary ‘codes’ are applied to specific words or phrases contained in the text of the response. The application of these summary codes and sub-codes to the content of the responses allows systematic analysis of the data.

Translation of verbatims in Welsh 

Participants are able to complete the survey in English and in Welsh. There are a small number of participants who choose to complete the survey in Welsh and provide verbatim text. These verbatims are translated by the FSA’s Welsh Language Unit before being coded, alongside the English responses, by Ipsos.

Ipsos coding

The codeframe for FOODISSA2, “What are your concerns about the food you eat?”, was established in Wave 1 (using Q.1a, “What food issues, if any, are you concerned about?”, from Wave 17 of the FSA’s Public Attitudes Tracker as a basis). The coding framework is then updated throughout the analysis process of every wave to ensure that any newly emerging themes are captured. Developing the coding framework in this way ensures that it provides an accurate representation of what participants say. After any new codes are added to the codeframe, it is reviewed by the FSA and Ipsos research teams, with queries subsequently addressed by the coding team. After this it is appended to the datasets.

Codes are grouped together into broad themes (e.g. ‘Environmental and Ethical Concerns’), shown in bold text in the data tables. Some of the broad themes also have sub-themes (e.g. ‘Fair Trade / Ethical’). For consistency between waves, all codes developed for previous codeframes are included in the codeframe for the latest wave, including codes for which no responses were assigned. These codes are also present in the data tables (and are marked as having received no responses).

Ipsos uses a web-based system called Ascribe to manage the coding of all the text in the responses. Ascribe is a system which has been used on numerous large-scale projects. Responses are uploaded into the Ascribe system, where members of the Ipsos coding team then work systematically through the comments and apply a code to each relevant piece of text.

The Ascribe system allows for detailed monitoring of coding progress, and the organic development of the coding framework (i.e. the addition of new codes to new comments). A team of coders work to review all the responses after they are uploaded on Ascribe, with checks carried out on 5% of responses.

Data checks

Checks on data

Ipsos check the data in two ways. Firstly, the data are checked using the questionnaire, applying a check for each filter to ascertain whether a participant correctly followed the routing. This checks 100% of the questionnaire and is run separately on the raw postal data and the raw online data. Once the data are checked, a list is produced identifying which variables require an edit; this largely relates to the postal data. Any edits applied are set out in the section ‘Data editing’.

Once the data edits are applied a combined dataset is created, duplicate participants are removed (as outlined in the section on duplicate responses) and then the derived variables are created.

Checks on derived variables

Derived variables are created in syntax and are based on the table specification. All derived variables are checked against previous waves to ensure the values are roughly in line with what we would expect to see. Cross checks are carried out on the syntax used to create the derivations to ensure the logic is valid.

Once the derivations are set up the dataset is checked by other members of the team. Some derived variables are based on one question (for instance age), and these are checked by running tabulations on SPSS from the question they are derived from, to check that the codes feed into the groups on the cross-breaks. If the derived variables are more complex and based on more than one question, e.g. NS-SEC, more thorough checks are carried out. For example, the NS-SEC variable is created independently by another data manager – the questions are in line with other surveys, so an independent check is carried out to ensure that the syntax was correctly created. The checker also runs the syntax themselves to check that they can replicate the results in the data.

Checks on tables - Waves 1 to 6

The data tables for Food and You 2 were produced by Ipsos for Waves 1 to 6 of the survey.

In Waves 1 and 2, Ipsos produced the tables using Quantum and subsequent checks were run against the table specification, ensuring all questions were included, that down-breaks included all categories from the question, that base sizes were correct (e.g. for filtered questions), base text was right, cross-breaks added up and were using the right categories, nets were summed using the correct codes, and that summary and recoded tables were included. Once the tables were signed off, the SPSS dataset was exported from Quantum. Weighting of the tables was also checked by applying the correct weight on the SPSS file then running descriptives and cross-break tabulations to check that this matched up with the values on the tables. If any errors were spotted in the tables, these were then specified to the data processing team in a change request form. The data processing team then amended the tables based on this and the tables were rechecked after the changes were made. The data and table checks were carried out by a team at Ipsos.

In Waves 3 to 6, a new process was introduced which reversed the order of data production: the SPSS dataset was created and signed off first, with the tables (using Quantum) produced subsequently. All checks of the tables remained the same.

Checks on tables - Wave 7

From Wave 7, the data tables are produced by the FSA Statistics team.

Before the tables were produced, the table specification was checked for inconsistencies, including duplicate rows or tables, and to confirm that the bases and SPSS variables were appropriate to each table definition.

Once the tables were produced, automated testing checked that:

  • all required tables were present, and contained the correct response columns, net columns and demographic breakdowns
  • where the same base was used for different tables, results in the Total Base column were consistent between them
  • for questions where respondents pick a single option, the percentages in a row totalled 100%
  • for questions where respondents pick a single option and where the table had a net column, the relevant percentages summed to the values in the net column
  • the Country breakdown in the all-country tables produced the same percentages as the Total rows for the individual country tables
  • for each available breakdown, the unweighted totals in the three individual country tables summed to the unweighted total in the all-country table

The results for selected tables were reproduced separately and checked against those in the published tables. These tables were chosen to represent a variety of table types.
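Two of the automated checks listed above can be sketched as follows (Python/pandas). The first confirms that, for single-response questions, row percentages total 100% within a small rounding tolerance; the second confirms that a net column equals the sum of the response columns it covers. The column names and tolerance are illustrative assumptions rather than the FSA Statistics team's actual code.

```python
import pandas as pd

def check_row_percentages(table: pd.DataFrame, response_cols, tol: float = 0.5) -> pd.Series:
    """Return rows of a single-response table whose percentages do not total 100%."""
    totals = table[response_cols].sum(axis=1)
    return totals[(totals - 100).abs() > tol]

def check_net_column(table: pd.DataFrame, net_col: str, component_cols, tol: float = 0.5) -> pd.Series:
    """Return rows where the net column does not equal the sum of its component columns."""
    diff = (table[component_cols].sum(axis=1) - table[net_col]).abs()
    return diff[diff > tol]

# Toy table with hypothetical response and net columns (both checks pass,
# so each function returns an empty Series).
table = pd.DataFrame({
    "Agree strongly": [40.0], "Agree": [30.0],
    "Disagree": [20.0], "Disagree strongly": [10.0],
    "Net: Agree": [70.0],
})
print(check_row_percentages(table, ["Agree strongly", "Agree", "Disagree", "Disagree strongly"]))
print(check_net_column(table, "Net: Agree", ["Agree strongly", "Agree"]))
```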

List of appendices

The online questionnaire for each wave is provided as a separate PDF alongside each key findings report and a table of methodological differences between waves has been presented below.

For each wave, the documentation listed below will be uploaded to the UK Data Service:

  • Food and You 2 online questionnaire
  • Food and You 2 postal questionnaires
  • Food and You 2 invitation and reminder letters
    • Invitation letters (sent to main sample addresses)
    • First reminder letters (sent to main sample addresses)
    • Second reminder letter (sent to main sample addresses only)
    • Final reminder letter (sent to main sample addresses only)
  • Food and You 2 abridged and safeguarded SPSS data for all countries combined
  • Food and You 2 SPSS user guide

In addition to the CSV data and user guide, the FSA publish the following data tables on the FSA data catalogue:

  • Food and You 2 data tables for England, Wales and Northern Ireland combined
  • Food and You 2 individual country data tables for England, Wales and Northern Ireland
  • Food and You 2 trends data tables for England, Wales and Northern Ireland combined (and these countries individually) including a user guide

Methodological differences

The largest issued sample size was in Wave 1 (21,053 addresses), dropping to 13,922 in Wave 2 and then 14,115 for Waves 3 and 4. For Wave 5, the inclusion of the reserve addresses meant that the sample size was larger than in Waves 2 to 4, at 16,115 addresses. The issued sample size for Waves 6 and 7 was increased to 14,500 to minimise the need to issue the reserve sample. The total issued sample size for Wave 8 was 18,850, comprising 15,000 addresses for England, Wales and Northern Ireland, and 3,850 for Scotland.

As up to two adults per household can participate, the number of individual returns overall is always higher than the number of households participating. The number of individual returns, household returns and issued addresses for each wave can be found in the accompanying technical report tables.

Table 2: Comparing differences between Waves 1 to 8, for postal questionnaire versions, fieldwork period and mailing sample
Wave Number of different versions of postal questionnaires for each country Fieldwork period Proportion of available sample sent final mailing
1 Two (Version A and Version B) 29 July to 6 October 2020 (about ten weeks) 100% of non-responding households in Wales and Northern Ireland; 50% of non-responding households in England

2 Two ('Eating Out' and 'Eating at Home') 20 November 2020 to 21 January 2021 (about nine weeks) 100% of non-responding households across the sample
3 Two (one in Northern Ireland and one in England and Wales) 28 April to 25 June 2021 (about eight weeks) 66.6% of non-responding households across the sample
4 Two ('Eating Out' and 'Eating at Home') 18 October 2021 to 10 January 2022 (about 12 weeks) 100% of non-responding households across the sample
5 Two (Version A and Version B) 26 April 2022 to 24 July 2022 (about 13 weeks) 100% of non-responding households across the main sample, 0% of households from the reserve sample
6 Two ('Eating Out' and 'Eating at Home') 12 October 2022 to 10 January 2023 (about 13 weeks) 100% of non-responding households across the sample
7 Two (one in Northern Ireland and one in England and Wales) 28 April to 10 July 2023 (about ten weeks) 100% of non-responding households across the sample
8 Two (‘Eating Out’ and ‘Eating at Home’) 12 October 2023 to 8 January 2024 (about 13 weeks) 100% of non-responding households across the sample

Questionnaire development

In Wave 1 a prolonged period of questionnaire development took place which involved an extensive review of questions from previous FSA surveys (Food and You and Public Attitudes Tracker). After all relevant questions were compiled a workshop with the Food and You 2 advisory group was held to discuss key priorities for the questionnaire. This was followed by a second workshop with key internal stakeholders to discuss their priorities for the questionnaire and provide Ipsos with direction regarding questionnaire content.

Following this, draft questionnaire modules were compiled based on questions from previous FSA surveys. Numerous alterations to the wording, ordering, format and content of the questions were made in the process based on survey design best practice, with additional questions designed based on stakeholder needs. The questionnaire development stages for subsequent waves were much shorter as core questions and materials had been developed in Wave 1.

Cognitive testing

Ahead of Waves 1-4, 6 and 8, cognitive testing was conducted to examine participant comprehension of new or potentially challenging questions. Participants for cognitive testing were recruited from Ipsos MORI's iOmnibus recontact database and, from Wave 2 onwards, via an external Ipsos-approved supplier. No cognitive interviews are conducted in waves with minimal new content.

The number of cognitive interviews conducted each wave can be found in the accompanying technical report tables.

Usability testing

Prior to Wave 1 fieldwork, usability testing was also undertaken to identify areas where improvements could be made to the form and format of the questions in the online survey across different commonly used devices (e.g. mobile phone, tablet, computer). Interviews were conducted over online video conferencing software, with interviewers observing participants' journeys through the online questionnaire (using screen share technology) and asking questions where relevant. Eleven interviews were undertaken at this stage. This helped identify formatting and layout issues with the online questionnaire, which were amended ahead of the pilot survey. Usability testing was not conducted again for subsequent waves as the online questionnaire took the same format as the Wave 1 questionnaire.

Pilot

Prior to the main stage fieldwork for Wave 1, a pilot was conducted on the full questionnaire to understand the time it took participants to complete the questionnaire and each individual module within it. The questionnaire was tested over four days with 390 members of Ipsos MORI's online access panel. The questionnaire took participants, on average, 26 minutes and 48 seconds to complete; as this fell within the desired 30 minutes, it was concluded that no alterations were needed to the length of the questionnaire. Pilots were not conducted in subsequent waves as the expected completion time was estimated from earlier waves.

Differences in the questionnaire

Due to the modular design of Food and You 2, some questions (core modules) are asked in every wave, whereas other questions are only present in certain waves. For some questions, the base will vary between waves. This is due to changes in the questions available for filtering, and/or their inclusion in the postal questionnaire. Please see the Tables User Guide for details.

The table below notes which modules were present in each wave of the survey, though note the content of each module varied somewhat between waves, as outlined above.

Table 3: Questionnaire module content of each survey wave
Full list of modules from Waves 1-8 Topics covered Waves included

About You and Your Household (Core)

  • Dietary preferences and food hypersensitivities (prevalence and diagnosis)
  • Shopping and cooking responsibilities
  • Demographic and household information

1-8

Food Concerns (Core)

  • Concerns about food

1-8

Food You Can Trust (Core)

  • Awareness and trust in FSA
  • Confidence in food safety and authenticity 
  • Confidence in the food supply chain 

1-8

Household Food Security

  • Food insecurity prevalence (USDA food security module)
  • Changes people have made for financial reasons

1-8

Eating at Home (Core questions)

  • Food safety knowledge and behaviour in the home (core questions)

2, 4, 6, 8

Eating at Home (Full module)

  • Food safety knowledge and behaviour in the home

1, 5

Food Shopping

  • Food purchasing behaviour
  • Confidence in allergen information
  • Awareness and actions taken in response to food and allergy alerts 
  • Attitudes towards animal welfare, provenance and the environmental impact of food

1, 3, 5, 7  

Defra Questions

  • Food-buying activities.
  • Environmental concerns, provenance and what influences purchasing choices.

1, 3, 5, 7

Eating Out and the Food Hygiene Rating Scheme (FHRS)

  • Attitudes and behaviour relating to eating out and ordering takeaways
  • Awareness and use of FHRS 

2, 4, 6, 8

Online Food Platforms

  • Frequency of ordering food online
  • Type of products ordered
  • Platforms used
  • Problems encountered when ordering food online.

3, 5, 7

Food Hypersensitivities (Core questions)

  • Prevalence and sources of a food allergy, food intolerance or Coeliac disease

1, 3-7

Food Hypersensitivities (Full module)

  • Detailed questions on food hypersensitivities

2, 6, 8

Healthy Eating (Northern Ireland only)

  • Healthy eating, including knowledge (for example, government guidance), attitudes and behaviour

3, 7

Emerging issues

  • Sustainable diets and purchasing behaviour
  • Consumption and perceptions of meat alternatives

4, 8

Differences in fieldwork

Fieldwork dates

The Food and You 2 survey is intended to take place every six months. However, the length of the initial questionnaire development led to a later start in its first year. Fieldwork length across waves has varied between 9 and 13 weeks. Fieldwork dates can be found in the accompanying technical report tables.

Sample sizes

There were just over 21,000 addresses issued in Wave 1, leading to 6,408 household returns. Since this was much higher than the target of 4,000 household returns, only 13,922 addresses were issued in Wave 2, and 14,115 addresses were issued in Waves 3 and 4. For Wave 5, 16,115 addresses were issued (14,115 from the main sample and 2,000 from the reserve sample). The issued sample size for Waves 6 and 7 was increased to 14,500 to minimise the need to issue the reserve sample.

The total issued sample size for Wave 8 was 18,850, comprising 15,000 addresses for England, Wales and Northern Ireland, and 3,850 for Scotland.

Vouchers

As an experiment, each adult who completed the questionnaire in Wave 1 received either a £15 online voucher, a £10 online or paper voucher, or a £5 online or paper voucher. Based on the results, respondents in subsequent waves received only the £10 voucher. The experiment process and results were summarised in an article published on the Social Research Association (SRA) website, Volume 11, Summer 2021.

Postal questionnaires

For the majority of waves, when postal questionnaires were sent out the version was assigned to person one and person two in the household on a quasi-random basis. This meant half contained questions from one module and the rest contained questions from another module. However, in Waves 3 and 7, one of the modules was only relevant to residents of Northern Ireland. Therefore, the content of the postal questions varied on a country basis rather than randomly.

Reminders

In Wave 1, the final reminder was sent to all outstanding non-responding households in Wales and Northern Ireland, and to 50% of those in England. In Wave 3, the response rate was high enough after Mailing 3 for the final reminder to be sent to just two-thirds of the non-responding sample. In Wave 5, main sample cases received up to three reminders whereas reserve sample cases received up to two – the final reminder was not issued to reserve sample addresses due to an increase in response. In all other waves, all non-responding sample received a final reminder.

Topline checks

For all waves, topline checks were carried out within the first two weeks of fieldwork, once sufficient responses to the online survey had been received in each country.

Results of the topline checks and refinements to the data checking syntax revealed an inconsistency for HSVOUCH (receipt of Healthy Start vouchers) in Waves 1-4 and Waves 6 and 7 (HSVOUCH was not asked in Waves 5 or 8). HSVOUCH should be asked of participants who were pregnant or had a child aged 0-4 in the household. Checks showed the online script was treating those who answered 'Prefer Not To Say' at CHILDAGE as having at least one child aged 0 and routing them to HSVOUCH. This meant more participants were asked HSVOUCH than required.

Numbers affected:

  • W1 – 55
  • W2 – 33
  • W3 – 51
  • W4 – 42
  • W5 – n/a
  • W6 – 17

However, Waves 1 and 2 had been corrected in the data and tables before publication, and therefore, corrections were only needed for tables in Waves 3 and 4.

Changing the routing during fieldwork would have introduced inconsistencies within the Wave 6 data. Therefore, this was addressed by the following steps (illustrated in the sketch after this list):

  1. Applying a back edit to HSVOUCH at the data processing stage of Wave 6 data
  2. Changing the routing of this question if it is used in future waves.
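The corrected routing rule can be sketched as follows. The question names HSVOUCH and CHILDAGE are taken from the report; the value codes and the function itself are illustrative assumptions rather than the survey script.

```python
PREFER_NOT_TO_SAY = -1  # hypothetical code for 'Prefer not to say' at CHILDAGE

def route_to_hsvouch(pregnant: bool, child_ages: list) -> bool:
    """Ask HSVOUCH only if the participant is pregnant or the household contains
    a child aged 0-4. 'Prefer not to say' (or a missing age) must not be treated
    as a child aged 0."""
    has_young_child = any(
        age is not None and age != PREFER_NOT_TO_SAY and 0 <= age <= 4
        for age in child_ages
    )
    return pregnant or has_young_child

# 'Prefer not to say' no longer routes to HSVOUCH, but a child aged 0-4 does.
assert route_to_hsvouch(pregnant=False, child_ages=[PREFER_NOT_TO_SAY]) is False
assert route_to_hsvouch(pregnant=False, child_ages=[3]) is True
```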

No errors were found in the topline checks for Waves 7 and 8.

Differences in weighting

Overall, the same weighting approach was taken in all waves of the survey. However, in each wave, some additional weights are needed for those questions that are not asked of all postal respondents. These additional weights will vary between waves depending on which questions are included.

From Wave 4 onwards, there are “Welsh and Welsh-England” weights, which allow Welsh respondents to be easily compared against an English population calibrated to have similar demographic characteristics. In Waves 1 to 3, the weights for the calibrated English population were called “Welsh England” weights, and the corresponding weights for Welsh respondents formed part of the individual country level weights.

The decision was also taken to rename the weights to better reflect the mode for which they should be used.

Differences in data validation and management

In Waves 1 and 2, the tables were created from the underlying data independently of the SPSS dataset. From Wave 3 onwards, syntax produced the derived variables in SPSS, and this was used to produce the tables in Quantum. As part of this change, the data validation procedures were reviewed, and the following improvements made:

  • In all waves, back editing and forward editing were applied to inconsistencies in the postal data, with a smaller amount of back editing applied to the data from Wave 3 onwards. Back editing meant that if a filtered question was answered but the filter (origin) question contradicted that answer (by being blank or different), the origin question was changed so that it was consistent with the answer given at the filtered question. Forward editing meant that if a participant answered a question but did not follow the routing to answer the next filtered question, they were assigned a code of -99 “Not stated”. A sketch illustrating both edits follows this list.
  • In Waves 1 and 2, if a question was incorrectly answered as a multi-code question when only one answer should have been selected, then a digit from the participant ID was used to randomly select one of the answers given. From Waves 3 onwards, the responses were set to -99 “Not stated”.
  • From Wave 3 onwards, an edit was introduced to correct the number of adults when participants from a multiple response household answered that only one adult lived in that household.
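The back edit and forward edit described in the first bullet above can be sketched as follows for a single hypothetical filter pair (Python/pandas; the production edits are applied in the survey's own data processing systems). The question names and routing value are illustrative assumptions; the -99 “Not stated” code matches the report.

```python
import pandas as pd

NOT_STATED = -99  # code used for 'Not stated'

def apply_edits(df: pd.DataFrame, filter_q: str, followup_q: str, route_value) -> pd.DataFrame:
    """Apply a back edit and a forward edit to one hypothetical filter pair,
    where filter_q routes to followup_q when it equals route_value."""
    edited = df.copy()

    # Back edit: the follow-up question was answered but the filter question
    # does not route to it, so the filter question is edited to be consistent.
    answered_followup = edited[followup_q].notna() & (edited[followup_q] != NOT_STATED)
    edited.loc[answered_followup & (edited[filter_q] != route_value), filter_q] = route_value

    # Forward edit: the filter question routes to the follow-up but it was left
    # blank, so the follow-up is set to -99 'Not stated'.
    routed = edited[filter_q] == route_value
    edited.loc[routed & edited[followup_q].isna(), followup_q] = NOT_STATED
    return edited

# Toy example: row 0 needs a back edit, row 1 a forward edit.
postal = pd.DataFrame({"FILTER_Q": [2, 1], "FOLLOWUP_Q": [3.0, None]})
print(apply_edits(postal, "FILTER_Q", "FOLLOWUP_Q", route_value=1))
```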

Ipsos standards and accreditations

Ipsos’s standards and accreditations provide our clients with the peace of mind that they can always depend on us to deliver reliable, sustainable findings. Moreover, our focus on quality and continuous improvement means we have embedded a ‘right first time’ approach throughout our organisation.

Market Research ISO 20252 Certificate

This is the international market research specific standard that supersedes BS 7911/MRQSA and incorporates IQCS (Interviewer Quality Control Scheme). It covers the five stages of a Market Research project. Ipsos UK was the first company in the world to gain this accreditation.

Information Security ISO 27001 Certificate

This is the international standard for information security designed to ensure the selection of adequate and proportionate security controls. Ipsos UK was the first research company in the UK to be awarded this in August 2008.

Company Quality ISO 9001 Certificate

This is the international general company standard with a focus on continual improvement through quality management systems. In 1994, we became one of the early adopters of the ISO 9001 business standard.

Market Research Society (MRS) Company Partners Certificate

By being an MRS Company Partner, Ipsos endorses and supports the core MRS brand values of professionalism, research excellence and business effectiveness, and commits to complying with the MRS Code of Conduct throughout the organisation. Ipsos was the first company to sign up to the requirements and self-regulation of the MRS Code; more than 350 companies have followed our lead.

The UK General Data Protection Regulation (UK GDPR) and the UK Data Protection Act 2018 (DPA)

Ipsos UK is required to comply with the UK General Data Protection Regulation and the UK Data Protection Act; it covers the processing of personal data and the protection of privacy.

Cyber Essentials Certificate

Cyber Essentials is a government-backed scheme and a key deliverable of the UK's National Cyber Security Programme. Ipsos UK was first assessed and validated for certification in 2016, and this is renewed annually. Cyber Essentials defines a set of controls which, when properly implemented, provide organisations with basic protection from the most prevalent forms of threat coming from the internet.

Fair Data

Ipsos UK is signed up as a ‘Fair Data’ Company by agreeing to adhere to ten core principles. The principles support and complement other standards such as ISOs, and the requirements of Data Protection legislation.