Research project

Review of FSA Social Science

Last updated: 4 August 2023

Summary Conclusions 

This review has attempted to assess the contribution that the FSA social science team makes to the FSA and its mission, and to identify what it does well, where there may be need for improvement, and what might be the direction of future learning and professional development.

What does the social science team do well?

  1. The FSA’s social science team is a confident group that is well-regarded by most internal and external stakeholders and provides a robust social science evidence base for the FSA.

  2. The social science team was described by stakeholders as having “a huge amount of in-house expertise” and providing a “variety of really interesting high-quality research” for the FSA.
  3. The team’s qualitative analysis and its evaluation of the equality issues surrounding food policy and food insecurity were highly valued, as was its exploration of consumers’ attitudes and perceptions of food products. The ‘lived experience’ research of the team was also highly valued.
  4. The team’s ability to identify and clarify the problems to be researched or evaluated, and to articulate the business needs for research or evaluation, is also highly valued.
  5. The research outputs reviewed for this report were found to be generally of a high standard in terms of the methodologies used, their design, execution and reporting. 
  6. Project management by the FSA social science team is generally good and is appreciated by policy colleagues and contractors alike. 
  7. The social science team meets most of the standards of the GSR Code for Products and People well. 

Where is there room for improvement?

The data and analysis presented in this review have identified areas where there is room for improvement.

  1. Engaging with policy colleagues earlier would help to clarify research objectives, questions and methodological approaches, thereby enhancing the quality and appropriateness of social research.
  2. Research procurement processes should be reviewed with the aim of making them fit for purpose for commissioning social research and other types of analysis.
  3.  Identifying the impacts of social research outputs is something the social science team would like to improve. This requires not only monitoring the uptake of the FSA social research team’s outputs, but also identifying and assessing their effects on dietary and hygiene-promoting behaviour.
  4. Providing more information consistently about the technical details of research projects would provide more transparency and enhance confidence in the quality of the FSA’s social research.
  5. External peer reviewing and quality assurance of research outputs should be common practice in order to ensure the scientific validity and quality of social research.
  6. Some members of the social science team need professional development, including in methodology, leadership and coaching. 
  7. The balance of in-house versus contracted-out social research should be reviewed to ensure that all members of the team have the opportunity to maintain and improve their social research skills.
  8. The GSR Self-Assessment tool, based on the GSR Code for Products and People, was found to operate at a high level of generality and fails to assess methodology or the quality of analysis at a sufficiently granular level.
  9. The FSA’s Quality Assessment Toolkit should be used as the main means of assessing the quality of social science research at the FSA and in other government departments.

FSA Response

FSA GSR Review: FSA Response

    Executive Summary

    Objectives of the Review

    1. To assess the contribution that the FSA social science team makes to the FSA and its mission, to identify what it does well and areas for improvement, and to make recommendations for CPD. 
    2. To assess the seven principles of the GSR Code for People and Products and the use of the GSR Self-Assessment tool to appraise social science outputs.

    Approach 

    A thorough review was undertaken using the following procedures:

    1. Stakeholders’ perspectives of the FSA’s social science team.
    2. Appraisal of the FSA’s social science research outputs.
    3. Appraisal of the GSR code self-assessment and an external peer review.
    4. An online survey of the technical skills of FSA’s social researchers.
    5. A group interview with FSA’s social researchers.

    FSA’s Stakeholders’ Perspectives (Chapter 1)

    Stakeholders’ views on the work of the FSA’s social science team were explored by undertaking ten online in-depth interviews, five with internal stakeholders and five with external stakeholders. The topics covered in these interviews are presented in Annex 1 and mirror those used in the group interviews with the social science team.

    The social science team is highly valued by most internal and external stakeholders. The team’s qualitative analysis and its evaluation of the equality issues surrounding food policy and food insecurity were highly valued, as was its exploration of consumers’ attitudes and perceptions of food products. The ‘lived experience’ research of the team was also highly valued. The team was applauded for having “really stepped up in response to the cost of living crisis”, giving policy teams evidence that helped them understand the lived experience of food insecurity.

    Some concerns were expressed about the technical skills of some members of the team, especially in quantitative methods and evidence synthesis approaches, though this was a minority view. Other issues that could be improved upon are the consistency in the effectiveness of commissioning and managing social research, the management of external contractors and identifying the “big messages” of research findings. 

    An Appraisal of FSA Social Research Outputs (Chapter 2)

    Six recent research outputs were reviewed and appraised by the author of this review. These were:

    • Food and You 2: Wave 4 Survey.
    • FSA Small and Micro FBO Tracking Survey Wave 3. 
    • The Value of the FHRS: Business Strand, Consumer Strand, Local Authority Strand.
    • The FSA Online Supermarket Trial.
    • Kitchen Life 2 Literature Review.
    • Psychologies of Food Choice - Public views and experiences around meat and dairy consumption.

    Each research output was appraised using the GSR Code for Products and the FSA Quality Assurance Toolkit (QAT).

    Each report was found to be of a high standard in terms of the methodology used, the design, execution and reporting of the study, and the project management role of the FSA social science team.

    Some issues for consideration about the documentation of research outputs were identified. Not all published research outputs have a separate technical report providing in-depth details of how research projects are designed, samples and research instruments are selected, or how analysis will be undertaken. Such details were readily available from the social science team on request. 

    External peer reviewing and quality assurance of research outputs is “undertaken on a case-by-case basis, subject to the pace, complexity and purpose of the work” (footnote 1). Whilst a degree of proportionality is necessary when working to tight timelines, external peer review and quality assurance is one of the hallmarks of good science and should be common practice.

    Self-Assessment using the GSR Code (Chapter 3)

    The FSA social science team undertook a self-assessment of how well it meets the seven key principles of the GSR Code, using the GSR Self-Assessment tool. These self-assessments were peer reviewed by the author of this review.

    A traffic light rating scale is provided for assessing performance against the GSR Code’s criteria, with ‘green’ representing ‘strong’ performance, ‘amber/green’ (‘development area/strong’), ‘amber’ indicating ‘development area’ and ‘red’ identifying ‘serious concerns’.  

    There was agreement between the peer reviewer and the self-assessment that the ‘outward facing’ professional standard should be rated as ‘green’. 

    The peer reviewer concurred with the team’s self-assessed ‘amber/green’ ratings against four of the standards of the GSR Code (rigour and impartiality, relevance, legal and ethical, performing role with integrity). 

    Two of the professional standards (accessibility, appropriately skilled) were rated as ‘green’ by the team and ‘amber/green’ by the reviewer. 

    Assessing the GSR Assessment Code 

    The GSR Assessment Code aims to ensure that its professional standards are met across all GSR products and people. However, it does this at a high level of generality, and its indicators do not capture the quality of research outputs at a sufficiently granular level.

    The social science team found the GSR Assessment Code unwieldy and the scoring categories insufficiently nuanced. The tool was also seen as conflating standards, such as ‘rigour’ and ‘impartiality’, when assessing the methodological quality of research. 

    The peer reviewer also found the Self-Assessment Code difficult to use. The assessment criteria and their associated indicators are not always well-aligned, nor sufficiently detailed to assess the work and outputs of government social research, especially in terms of assessing the rigour of social science methodology. 

    The Self-Assessment Tool also requires all research products to be assessed in terms of their implications and solutions for policy and delivery. This is often beyond the scope of external contractors’ work, so their products score ‘low’ or ‘medium’ on these criteria. 

    FSA Quality Assurance Toolkit (QAT)

    This recently published toolkit, developed by the FSA’s Advisory Committee on Social Science (ACSS) and University College London (UCL), provides a comprehensive means of ensuring that social science outputs are procured, managed, assessed and validated in a structured and rigorous way. The toolkit covers quantitative and qualitative social research methods and provides checklists for each approach. It also provides links to the EQUATOR Network (footnote 2) for additional resources.

    The FSA Quality Assurance Toolkit (QAT) was used in addition to the GSR Self-Assessment Code to appraise the research outputs of the FSA social science team (Chapter 2). On this basis, it is strongly recommended that the Quality Assurance Toolkit be used as the main means of assessing the quality of social science research at the FSA and across government.

    The Technical Skills of the Social Science Team (Chapter 4)

    These were assessed by self-reports of ability to meet the requirements of the GSR Technical Skills Framework (2022). The framework outlines the technical expertise expected of members of the GSR profession, and how these requirements apply at each grade of researcher.

    Overall, the team has considerable self-confidence in its ability to meet most of the requirements of the Framework. Most of the technical skills required by the Framework were self-rated as ‘good ability’ or ‘very good ability’ by most of the members of the team. It was not possible for this review to independently verify these self-assessments of technical knowledge and skills.

    There were a few methodological skills (among Research Officers and Senior Research Officers) and some leadership and coaching skills (among Principal Research Officers) for which professional development would be appropriate. 

    The Social Researchers’ Perspectives (Chapter 5)

    The group interview with the entire social science team identified the following:

    Commissioning Social Research

    The team thinks it does well in terms of the following aspects of commissioning social research:

    • Identifying the policy issues requiring social research and clarifying the objectives and anticipated outcomes of a policy or programme. 
    • Maintaining a productive relationship with the FSA’s procurement team.
    • Keeping abreast of innovative research methodologies from academic and commercial research communities via seminars, workshops and training courses, and discussions with suppliers about new and upcoming research methods.
    • Liaising with the FSA’s Areas of Research Interest (ARI) steering groups to link with project technical leads and policy colleagues. 

    The team thinks there are the following challenges in terms of commissioning social research:

    • Getting early involvement from policy colleagues in developing research objectives and questions.
    • Managing expectations of policy colleagues regarding how long it takes to procure, deliver and quality assure research to high standards.
    • Resolving different stakeholders’ opposing demands or conflicting needs for social research.

    Managing Social Research

    The team thinks it does well in terms of the following aspects of managing social research:

    • establishing how research is going to be used and adapting the research specification and procurement approach accordingly.
    • keeping contracts on track, developing good relationships with contractors and working collaboratively with a range of stakeholders.
    • managing the wide range of stakeholders with which the FSA engages, including policy makers across government, contractors, academics, food suppliers, consumers, third sector organisations and local authorities across England, Wales and Northern Ireland.
    • quality control of research outputs.

    The team thinks it could improve upon the following aspects of managing social research:

    • having more time to understand policy issues in greater detail, design the research, and deliver higher quality outputs. Being involved early would allow the team to respond more effectively. 
    • improving research dissemination and establishing the impact of research outputs. Impact can only be partially assessed by how often research outputs are read, referred to and used for decision making. Assessing it fully also requires monitoring and evaluating the uptake of the FSA social research team’s outputs and identifying their effects on dietary and hygiene-promoting behaviour.

    GSR Code Standards: People 

    The social science team thinks it does well in terms of performing with integrity and being outward facing. The team gave examples of how it meets these two professional standards of the GSR Self-Assessment Code.

    The team also provided evidence that its recruitment is aligned with the GSR recruitment protocol (GSR, 2023) and that most of its members are ‘badged’. It has also recruited staff from outside the civil service which brings considerable experience and different perspectives to the team. 

    The FSA is seen by the social science team as being very supportive of staff undertaking professional development and training. CPD is actively encouraged and undertaken across the team. 

    Background

    The Food Standards Agency (FSA) is an independent Government department working across England, Wales and Northern Ireland to protect public health and consumers’ wider interests in food. Its mission is to ensure that food is safe, is what it says it is, and is healthy and sustainable.

    The social science team is part of the Science, Evidence and Research Directorate of the FSA. It provides insight, analysis and evidence in relation to food policy for stakeholders inside and outside of the FSA. The team supports all stages of policy development including signposting existing evidence, leading or advising on research projects, and supporting the evaluation of policies for all parts of the FSA across England, Wales and Northern Ireland. The team works in collaboration with other analytical professions that deliver research, evidence and advice.

    The FSA social science team consists of six Principal Research Officers (three of whom are part-time), six Senior Research Officers, and three Research Officers (one of whom is a GSR placement student). Members of the team are established civil servants with many years of experience contributing social research to policy development and implementation. Some members of the team have worked in the academic sector as well as in commercial and third-sector research organisations.

    The team undertakes a wide range of research including citizen science, tracking surveys, deliberative research, ethnography, interview studies, experimental evaluation and rapid evidence reviews. It works with a wide range of internal and external stakeholders to identify research needs and deliver appropriate research evidence to inform food policy.  

    The default position of the team is to commission research from external agencies rather than undertake research in-house. This involves the procurement, contracting, management, quality assurance and publication of social research within tight timelines and the constraints of budgetary cycles. The social science team works closely with other Government analytical professions.

    As part of its ongoing quality assurance procedures, the FSA social science team commissioned this independent assessment of its people (researchers) and outputs (products) against the GSR Code of professional standards, in order to identify its strengths and areas for improvement.

    Objectives of the Review

    1. To assess the contribution that the FSA social science team makes to the FSA and its mission, to identify what it does well and areas for improvement, and to make recommendations for CPD.
    2. To assess the seven principles of the GSR Code for People and Products and the use of the GSR Self-Assessment tool to appraise social science outputs.

    Approach 

    This review was undertaken using the following procedures:

    1. Stakeholders’ perspectives of the FSA’s social science team.
    2. Appraisal of the FSA’s social science research outputs.
    3. Appraisal of the GSR code self-assessment and an external peer review.
    4. An online survey of the technical skills of FSA’s social researchers.
    5. A group interview with FSA’s social researchers.

    Three publications that represent the professional standards and skills required of government social researchers were used for this assessment:

    1. The Government Social Research Code - People and Products (GSR, 2018)
    2. Government Social Research Technical Framework (GSR 2022) 
    3. FSA Quality Assurance Toolkit published by the FSA Advisory Committee on Social Science (ACSS, 2023).

    Stakeholders’ Perspectives of the FSA social science team

    Stakeholders’ views on the work of the FSA’s social science team were explored by undertaking ten online in-depth interviews. Five of these interviews were with internal stakeholders and five with external stakeholders. The topics covered in these interviews are presented in Annex 1.

    What does the FSA social research team do well, and less well?

    Commissioning and Managing Research

    Both internal and external stakeholders were generally positive about the contribution of the social research team to commissioning and managing research. An often-mentioned strength was the team’s ability to identify and clarify the problem(s) to be researched or evaluated, and to articulate the business needs for research or evaluation. 

    Internal and external stakeholders commended the team for being “really good at constructive challenge and facilitation” and helping policy colleagues to think through what it really is that they are trying to evaluate. The ability of the team to draw upon a “great back catalogue of work” was also noted, and that consequently policy, strategy and delivery colleagues “don't need to guess or hypothesise.”

    An internal policy stakeholder commended the team for helping to successfully navigate government procurement contracts which were described as “a bit of a headache”. This policy colleague cited a recent instance in which a procurement issue was presenting a considerable risk to the programme in question. The social science team offered “some really useful insights and flexible solutions” which impressed this policy maker.  

    Many government research projects are procured by call-off contracts with external commercial contractors and academic researchers. This is the case with FSA social research. One such external commercial contractor suggested that this call-off arrangement often enhances the commissioning process by bringing in an independent ‘critical friend’ to ask challenging questions such as “have you thought about doing this instead?” and “do you think this is feasible?”. 

    Problems with the commissioning process were seen by external stakeholders to emanate from the Government procurement process, rather than the team’s abilities. This process was described as “bulky” and “inappropriate for procuring research and evaluation”. The weightings given to different aspects of research procurement were seen by one external researcher as a particular weakness. The procurement process overall was characterised as “not fit for purpose” by internal and external stakeholders.

    One internal stakeholder suggested that the team “are not commissioning well” and that they were “naïve in their commissioning”. The concern here was more to do with the haste of commissioning and the speed of initiating research or evaluation once a contract was issued. The proposed solution from this stakeholder was to get greater movement between academics and government so that the latter can “commission intelligently”.  

    The team was also commended for accepting these external challenges and either changing their approach or providing good reasons for continuing to proceed in their proposed ways. This was seen by external contractors as an illustration of the good two-way relationship of FSA’s social research commissioning. 

    Managing projects

    The team was also considered by stakeholders to be generally good at managing research projects and, for policy makers, often “lending an air of authority” to the management of research. 

    Other positive features of the team’s research management included being good at managing internal relationships, acting as intermediaries with policy colleagues and other stakeholders, and filtering technical advice “through the lens of what the organisation needs”. One internal stakeholder applauded the work of the team for “working the contractors pretty hard and not giving them an easy ride”. 

    The social science team’s recent narrative review of the existing literature for the FSA’s Kitchen Life 2 project was also highly commended by internal stakeholders. This identified some of the key behaviours, actors, triggers and barriers that occur in both domestic and business kitchens. This project won the 2023 Analysis in Government (AiG) Innovative Methods Award, which was described by Professor Sir Ian Diamond, National Statistician and Head of the Government Analysis Function, as a “fantastic piece of work” (footnote 1).

    One interviewee, however, suggested that the social research team would benefit from developing its skills in using existing systematic reviews and meta-analysis, and commissioning such products where possible. These approaches to evidence synthesis are more systematic, rigorous and detailed than most literature reviews and rapid evidence assessments. 

    A more cautious observation about the team’s research management was noted by some internal and external stakeholders, particularly concerning the consistency of contractors’ delivery of research. Delays in contractors delivering research of the expected content and quality in a timely manner had left one policy colleague “having the difficult conversation with contractors that social researchers should have had and pushing for more timely delivery”. This, however, was a minority perspective.

    Another suggestion for improvement was around the need for policy colleagues to know “the two or three big things” to take from a research report, as well as having some indication of how a particular set of findings fits in with the accumulated evidence from other studies. 

    Research Skills

    Stakeholders were asked about the team’s technical research skills. There was a greater response to this question from research interviewees than from policy colleagues. On the positive side some research stakeholders thought the social research team to be “extremely competent” and “not having anything that is outside their sphere of ability”. This included having good quantitative and qualitative research skills and the ability to commission and analyse some high-quality internal evidence reviews and controlled evaluations. One research interviewee suggested that “on the technical skills I haven't seen any weaknesses”.

    One external interviewee noted that the greatest value of the team was in co-designing research, bringing the policy people in and checking that the output would meet policy needs. 

    A minority of research stakeholders were more critical, saying: “I don't think they would stand out as being particularly strong methodologically” and “I don’t rate them very highly on social science ability at all, they have no quantitative skills”. These two comments were not representative of the research stakeholders who were interviewed, nor do they align with the professional and academic backgrounds of the team. It was acknowledged by internal and external research stakeholders that research skills varied across the team by grade and experience, and that there are researchers whose methodological skills are excellent. 

    Internal stakeholders valued the team’s qualitative analysis and its evaluation of the equality issues surrounding food policy and food insecurity. Exploration of consumers’ attitudes and perceptions of food products was also highly valued, as was the ‘lived experience’ work of the social research team. This was described by one policy colleague as “really good, really powerful stuff” that provided “fantastic” evidence for FSA’s work on the cost of living crisis and household food insecurity. 

    This was reiterated by another senior internal stakeholder who thought the team had “really stepped up in response to the cost of living crisis”, giving policy teams evidence that helped them understand the lived experience of food insecurity, enabling them to “nuance our day-to-day communications through a very consumer-centric, very behavioural, very social science perspective”. 

    More broadly, the team was described as having “a huge amount of in-house expertise” and providing a “variety of really interesting high quality research that contributes towards the evidence base for our policies”. This included the “ability to provide quick and detailed comments on things where input was required”, and “some really strong areas of consumer insight that allows us to make very powerful statements as an organisation”.

    The team was also considered “really pioneering” and “creative” in terms of changing the policy narrative around affordable and sustainable food, climate change and other policy issues. Their work was valued by policy stakeholders for being “creative and pushing boundaries”, “understanding consumers” and “having really important things to say about a range of topics”. One external stakeholder thought that “everything that I've seen has been good and they've been able to advocate very strongly for all their work and what they've done with it”.

    How might the contribution of the FSA social research team be improved?

    One suggestion from an internal policy colleague was to foster greater subject matter expertise around the different areas of the FSA’s work. This might include “shadowing other parts of the Agency to understand their needs or to understand subject matter areas more deeply”. 

    Another suggestion was for the social science team to undertake more horizon scanning analysis to establish upcoming challenges for the FSA, balancing the team’s “extensive amount of operational research and analysis with these longer-term strategic approaches to evidence gathering”. This might be done in collaboration with the strategic insights team in the FSA’s Analytics Unit. It should be noted that the horizon scanning work by the social science team to date was applauded by other internal stakeholders. 

    Summary

    The social science team is highly valued by most internal and external stakeholders. Some concerns were expressed about the technical skills of some members of the team, especially in quantitative methods and evidence synthesis approaches, though this view was not shared by all stakeholders. 

    Other issues that could be improved upon are the consistency in the effectiveness of commissioning and managing social research, the management of external contractors and identifying the “big messages” of research findings. 

    Introduction

    The social science team produces an extensive library of research outputs (footnote 1) that includes one-off studies for specific programmes or projects, as well as reports of regular surveys and evaluations of the FSA’s broad programme of activities. 

    The research outputs reviewed below were published between January and September 2022. They include a range of social science approaches and methods: quantitative surveys, qualitative studies, a controlled evaluation, a literature review, and an ethnographic and iterative qualitative study:

    • Food and You 2: Wave 4 Survey.
    • FSA Small and Micro FBO Tracking Survey Wave 3. 
    • The Value of the FHRS: Business Strand, Consumer Strand, Local Authority Strand.
    • The FSA Online Supermarket Trial
    • Kitchen Life 2 Literature Review.
    • Psychologies of Food Choice - Public views and experiences around meat and dairy consumption

    Approach

    Each of the above research outputs was reviewed and assessed using the GSR Self-Assessment template of the Government Social Research Code - People and Products (2018) and the recently published FSA Quality Assurance Toolkit (ACSS, 2023). 

    Research Output 1: Food and You 2 Wave 4 (FSA, 2022a)

    Food and You 2 is a biannual ‘Official Statistic’ survey commissioned by the FSA. The survey measures consumers’ self-reported knowledge, attitudes and behaviours related to food safety and other food issues amongst adults in England, Wales, and Northern Ireland. 

    Fieldwork for Food and You 2: Wave 4 was conducted between 18th October 2021 and 10th January 2022. A total of 5,796 adults from 4,026 households across England, Wales, and Northern Ireland completed the ‘push-to-web’ survey.
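
    As an illustrative calculation only (not a figure taken from the report itself, and assuming simple random sampling with no design effect from weighting the push-to-web sample), an achieved sample of 5,796 adults corresponds to a maximum margin of error of roughly ±1.3 percentage points at the 95 per cent confidence level: 1.96 × √(0.5 × 0.5 / 5,796) ≈ 0.013.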

    Table 2.1 indicates that the combined assessment scored ‘High’ on all of the dimensions of the GSR Self-Assessment Tool except: ‘contributes to all stages of the policy and delivery process’ (‘Medium’ score) and ‘delivers solutions that are viable, actionable and represent value for money’ (‘Low’ score) (footnote 2). Annex 2A presents the peer reviewer’s comments for all of these ratings.

    The Food and You 2, Wave 4 report fully met most of the quality criteria of the FSA’s Quality Assessment Toolkit (Table 2.2). There were two criteria that were partly met. Annex 2A presents the peer reviewer’s comments for all of these ratings.

    Table 2.1 Food and You 2 Wave 4 Technical report and Key findings (combined assessment) Assessment using the GSR Self-Assessment tool:

    Rigorous and impartial

    Based on sound methodology and established scientific principles High
    Quality assured High
    Based on best design given constraints High
    Conclusions are clearly and adequately supported by data High

    Relevant

    Anticipates future policy issues as well as addressing current ones High
    Answers clear and researchable questions High
    Contributes to all stages of the policy and delivery process Medium
    Delivers solutions that are viable, actionable and represent value for money Low

    Table 2.2 Food and You 2 Wave 4 Technical report and Key Findings Assessment using the QAT checklist:

    Question Answer
    Q1 Title, lead author and year Food and You 2 Wave 4 Key Findings 2022 -
    Q2 Has a clear research need been outlined? Yes fully
    Q3 Has a precise research question/aim been specified? Yes fully
    Q4 Is the research design experimental, cross-sectional, longitudinal, case study? Longitudinal
    Q5 Is the research method qualitative, quantitative, both? Quantitative
    Q6 Is there a good match between the research question/aim, research design and research method? Yes fully
    Q7 Is the study population and setting specified? Yes fully
    Q8a Is the sampling method purposive, convenience, theoretical, other, not specified?  Not applicable
    Q8b Is the sampling method simple random sampling, stratified sampling, quota sampling, convenience sampling, other, not specified? Stratified
    Q9 Is the sampling method appropriate for addressing the research question? Yes fully
    Q9a Is the target sample size justified? Yes fully
    Q9b Has a sample size calculation been conducted? Yes fully
    Q10 Are the research instruments valid and reliable? Yes fully
    Q11a Is the analytical approach thematic, grounded theory, framework analysis, other, not specified? Not applicable
    Q11b Is the analytical approach chi-square test, correlation test, t-test or analysis of variance, linear regression, logistic regression, survival analysis, time series analysis, meta-analysis, other, not specified? Time series analysis
    Q12 Is there a good match between the analytical approach, the research method and the research question? Yes fully
    Q13 Has a relevant checklist from the EQUATOR Network been used in the reporting of the results? Yes fully
    Q14 Have descriptive data on the characteristics of participants been presented? Not applicable
    Q15 Have two or more researchers been involved in the analysis process (for example, through double coding)? Not applicable
    Q16 Is there consistency between the data presented and the themes? Not applicable
    Q17 Have similarities and differences between participants been explored (for example, negative cases)? Not applicable
    Q18 Did participants provide feedback on the findings (for example, member checking)? Not applicable
    Q19 Have descriptive data on exposures/interventions and potential confounders been presented? Partly met
    Q20 Have unadjusted and adjusted point estimates and confidence intervals been presented alongside statistical significance? Partly met
    Q21 Has generalisability been considered in the interpretation of the results? Yes fully
    Q22 Has causality been considered in the interpretation of the results? Not applicable
    Q23 Has uncertainty been considered in the interpretation of the results? Yes fully
    Q24 Has a clear study conclusion been presented? Yes fully

    Food and You 2: Wave 4 Strengths of the research output

    • the conclusions of the ‘Key Findings’ Report have been clearly presented in the Executive Summary and they are well supported by the data presented in the main body of the report. The report has also been presented in a clear and well-structured way. 

    • the presentation of the ‘Technical terms and definitions’ section (page 94) is helpful and explains the technical terms and issues very well.
    • the Key Findings report indicates many areas where the public’s awareness of food content, the food supply chain and policy measures, such as the Food Hygiene Rating Scheme, is considerable. This may be seen as a success for the work of the FSA over the past two decades. 
    • the technical report is an excellent record of how Wave 4 of the Food and You survey was developed conceptually and methodologically. Its attention to technical detail is excellent. 
    • the methodology (sampling strategy, sample size and power, weighting and sub-group analysis, attention to deprivation analysis and local authority contextual sensitivity) is good throughout.
    • the contractor and the FSA clearly had a good and collaborative relationship in developing this survey. The FSA’s social researchers and the FSA Advisory Group were active participants in this collaborative relationship and in influencing the survey. This is commendable.
    • the push-to-web methodology has been used well, as has the sequential mixed-method approach of this survey.

    Issues for consideration

    The range of questions and issues to be addressed by the survey was ambitious, albeit relevant for understanding food choice and food behaviour. The FSA and its external stakeholders might have been a little more discriminating about the number of themes and questions to be asked in one survey. 

    Concluding assessment

    Overall, this is a high quality technical report that enhances the strength and value of the Food and You 2 survey (Wave 4) and the evidence base for decision making by the FSA and other health promotion agencies.

    Research Output 2: FSA Small and Micro FBO Tracking Survey Wave 3 (FSA, 2021)

    The FSA Small and Micro Food Business Operator (FBO) Tracking Survey, Wave 3, is a survey of small and micro FBO sites in England, Wales and Northern Ireland. This research output represents the third wave of an annual tracking survey with the following aims:

    • to gain insight into, and understand the implications of, the UK’s exit from the European Union (EU) for small and micro enterprises
    • to ‘unpack’ attitudes towards regulation and deepen insights and knowledge of small and micro enterprises
    • to measure trust in the FSA and the extent to which the FSA is considered a modern, accountable regulator

    Fieldwork was conducted in November and December 2021 and comprised 700 interviews with small (10-49 employees) and micro (fewer than 10 employees) FBOs.
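
    For context (again an illustrative calculation rather than a figure from the report, assuming simple random sampling and no design effect), a sample of 700 interviews implies a maximum margin of error of roughly ±3.7 percentage points at the 95 per cent confidence level (1.96 × √(0.5 × 0.5 / 700) ≈ 0.037), so estimates for sub-groups of FBOs carry correspondingly wider uncertainty.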

    Table 2.3 indicates that the FSA Small and Micro FBO Tracking Survey Wave 3 scored ‘High’ on ten of the dimensions of the GSR Self-Assessment Tool. The remaining four dimensions scored ‘Medium’ (N=3) or ‘Low’ (N=1). Annex 2B presents the peer reviewer’s comments for all of these ratings.

    Table 2.3 FSA Small and Micro FBO Tracking Survey, Wave 3, 2021 - Assessment using the GSR Self Assessment Tool (footnote 3)

    Rigorous and impartial

    Based on sound methodology and established scientific principles High
    Quality assured Medium
    Based on best design given constraints High
    Conclusions are clearly and adequately supported by data High

    Relevant

    Anticipates future policy issues as well as addressing current ones Medium
    Answers clear and researchable questions High
    Contributes to all stages of the policy and delivery process Medium
    Delivers solutions that are viable, actionable and represent value for money Low

    Accessible

    Published High
    Data made available where possible High
    Clear and concise High
    Related to existing work in field High

    Legal and Ethical

    Complies with relevant legislation High
    Complies with GS ethical guidelines High

    The FSA Small and Micro FBO Tracking Survey Wave 3, 2021 fully met most of the quality criteria of the FSA’s Quality Assessment Toolkit (Table 2.4). There were three criteria that were partly met and one technical detail that was not met (see below). Annex 2B presents the peer reviewer’s comments for all of these ratings.

    Table 2.4 FSA Small and Micro FBO Tracking Survey, Wave 3, 2021 Assessment using the FSA's Quality Assessment Toolkit (QAT)

    Question Answer
    Q1 Title and year. FSA Small and Micro FBO Tracking Survey Wave 3 2021 -
    Q2 Has a clear research need been outlined? Yes fully
    Q3 Has a precise research question/aim been specified? Yes fully
    Q4 Is the research design experimental, cross-sectional, longitudinal, case study? Longitudinal
    Q5 Is the research method, qualitative, quantitative, both? Quantitative
    Q6 Is there a good match between the research question/aim, research design and research method? Yes fully
    Q7 Is the study population and setting specified? Yes fully
    Q8a Is the sampling method; purposive, convenience, theoretical, other, not specified? -
    Q8b Is the sampling method; simple random sampling, stratified sampling, quota sampling, convenience sampling, other, not specified? Stratified
    Q9 Is the sampling method appropriate for addressing the research question? Yes fully
    Q9a Is the target sample size justified? Yes fully
    Q9b Has a sample size calculation been conducted? Yes fully
    Q10 Are the research instruments valid and reliable? Yes fully
    Q11a Is the analytical approach thematic, grounded theory, framework analysis, other, not specified? Not applicable
    Q11b Is the analytical approach chi-square test, correlation test, t-test or analysis of variance, linear regression, survival analysis, time series analysis, meta-analysis, other, not specified? Time series analysis
    Q12 Is there a good match between the analytical approach, the research method and the research question? Yes fully
    Q13 Has a relevant checklist from the EQUATOR Network been used in the reporting of the results? Partly met
    Q14 Have descriptive data on the characteristics of participants been presented? Yes fully
    Q15 Have two or more researchers been involved in the analysis process (for example through double coding)? Not applicable
    Q16 Is there consistency between the data presented and the themes? Not applicable
    Q17 Have similarities and differences between participants been explored (for example negative cases)? Not applicable
    Q18 Did participants provide feedback on the findings (for example, member checking)? Not applicable
    Q19 Have descriptive data on exposures/interventions and potential confounders been presented? Partly met
    Q20 Have unadjusted and adjusted point estimates and confidence intervals been presented alongside statistical significance? No
    Q21 Has generalisability been considered in the interpretation of the results? Yes fully
    Q22 Has causality been considered in the interpretation of the results? Not applicable
    Q23 Has uncertainty been considered in the interpretation of the results? Partly met
    Q24 Has a clear study conclusion been presented? Yes fully

    FSA Small and Micro FBO Tracking Survey, Wave 3, 2021, Strengths of the Research Output

    • this survey was undertaken after relevant and careful cognitive testing and sample design/planning. Considerable effort went into sample preparation. 
    • weighting of the sample was also undertaken carefully and appropriately. Details of how the sample and sub-samples were drawn, weighted and coded are available in the accompanying Technical Report. 
    • the conclusions are laid out clearly at the end of the report, and in the Executive Summary. In both cases the conclusions are adequately supported by the data. 
    • although this report overall is not concise (approximately 100 pages) it is clearly presented. Individual chapters are reported concisely and well.

    Issues for consideration

    This report was quality assured by FSA’s internal researchers and external consultants using comments in track changes only. A separate peer reviewers’ report, preferably by external reviewers, would be of greater scientific value and transparency.

    It seems inappropriate for external contractors’ reports to be assessed against the criteria of ‘anticipates future policy issues as well as addressing current ones’ and ‘contributes to all stages of the policy and delivery process’, given that they have little or no opportunity to do this. The same applies to the criterion ‘delivers solutions that are viable, actionable and represent value for money’. These dimensions of the GSR Code should normally be provided by FSA’s social researchers and/or policy colleagues.

    Concluding assessment

    Overall, the FSA Small and Micro FBO Tracking Survey Wave 3 report can be considered high quality in terms of design, conduct and reporting, and it provides robust data on the key questions being addressed by the survey.

    Research output 3: Value of the Food Hygiene Rating Scheme (FHRS): Businesses, Consumers and Local Authorities (FSA 2023a, 2023b, 2023c)

    The FSA is responsible for food safety across England, Wales, and Northern Ireland. As part of its work on the Achieving Business Compliance (ABC) programme, the FSA wanted to understand in more detail how Local Authorities (LAs), businesses and consumers feel about the current Food Hygiene Rating Scheme (FHRS). In addition, the FSA wanted to capture consumer views on potential changes to the regulatory approach. 

    This research report provides findings from four reconvened online discussion groups conducted with consumers in England, Wales, and Northern Ireland.

    Table 2.5 indicates that the Value of the Food Hygiene Rating Scheme outputs scored ‘High’ in all but four of the dimensions of the GSR Self-Assessment Tool. Three of these four dimensions received a ‘Medium’ score and one was rated ‘Low’. Annex 2C presents the peer reviewer’s comments for all of these ratings.

    Table 2.5 Value of the Food Hygiene Rating Scheme (FHRS): Assessment using the GSR Self Assessment Tool (footnote 4)

    Rigorous and Impartial

    Based on sound methodology and established scientific principles High
    Quality assured Medium
    Based on best design given constraints High
    Conclusions are clearly and adequately supported by data High

    Relevant

    Anticipates future policy issues as well as addressing current ones High
    Answers clear and researchable questions High
    Contributes to all stages of the policy and delivery process High
    Delivers solutions that are viable, actionable and represent value for money Medium

    Accessible

    Published High
    Data made available where possible Medium
    Clear and concise High
    Related to existing work in field Low

    Legal and Ethical

    Complies with relevant legislation High
    Complies with GS ethical guidelines High

    The Value of the Food Hygiene Rating Scheme (FHRS) fully met most of the quality criteria of the FSA’s Quality Assessment Toolkit (QAT) for qualitative studies (Table 2.6). Three of the QAT quality criteria were partly met (see below) and one was not met (participants providing feedback on the findings, for example member checking). Annex 2C presents the peer reviewer’s comments for all of these ratings.

    Table 2.6 Value of the Food Hygiene Rating Scheme (FHRS): Assessment using the FSA's Quality Assessment Toolkit (QAT)

    Question  Answer
    Q1 Title and year. Value of the FHRS: business, consumer and local authorities (LAs) 2022 -
    Q2 Has a clear research need been outlined? Yes fully
    Q3 Has a precise research question/aim been specified? Yes fully
    Q4 Is the research design experimental, cross-sectional, longitudinal, case study? Cross-sectional and comparative
    Q5 Is the research method, qualitative, quantitative, both? Qualitative
    Q6 Is there a good match between the research question/aim, research design and research method? Yes fully
    Q7 Is the study population and setting specified? Yes fully
    Q8a Is the sampling method; purposive, convenience, theoretical, other, not specified? Purposive and thematic.
    Q8b Is the sampling method: simple random sampling, stratified sampling, quota sampling, convenience sampling, other, not specified? Not applicable
    Q9 Is the sampling method appropriate for addressing the research question? Yes fully
    Q9a Is the target sample size justified? Yes fully
    Q9b Has a sample size calculation been conducted? Not applicable
    Q10 Are the research instruments valid and reliable? Yes fully
    Q11a Is the analytical approach thematic, grounded theory, framework analysis, other, not specified? Thematic
    Q11b Is the analytical approach: chi-square test, correlation test, t-test or analysis of variance, linear regression, survival analysis, time series analysis, meta-analysis, other, not specified? Not applicable
    Q12 Is there a good match between the analytical approach, the research method and the research question? Yes fully
    Q13 Has a relevant checklist from the EQUATOR Network been used in the reporting of the results? Partly met
    Q14 Have descriptive data on the characteristics of participants been presented? Partly met
    Q15 Have two or more researchers been involved in the analysis process (for example through double coding)? Information unavailable
    Q16 Is there consistency between the data presented and the themes? Yes fully
    Q17 Have similarities and differences between participants been explored (for example negative cases)? Yes fully
    Q18 Did participants provide feedback on the findings (for example, member checking)? No
    Q19 Have descriptive data on exposures/interventions and potential confounders been presented? Not applicable
    Q20 Have unadjusted and adjusted point estimates and confidence intervals been presented alongside statistical significance? Not applicable
    Q21 Has generalisability been considered in the interpretation of the results? Not applicable
    Q22 Has causality been considered in the interpretation of the results?  Not applicable
    Q23 Has uncertainty been considered in the interpretation of the results?  Partly met
    Q24 Has a clear study conclusion been presented? Yes fully

    Value of the Food Hygiene Rating Scheme (FHRS) - Strengths of the Research output

    This research was undertaken using a combination of qualitative research methods including workshops, deliberative engagement, in-depth interviews and scenario setting. These are appropriate methods for addressing the central questions of the study and they were undertaken well. 

    Each interview study addresses clear and researchable questions about the value of the FHRS to businesses, consumers, the local authorities, and possible areas of change for the regulatory approach of the FSA.

    Sample sizes were appropriate for qualitative research with over-sampling where necessary. A summary of the sampling methods used for the consumers’ study is provided (in Appendix 1 of the report), but not for the businesses’ or local authorities’ surveys. No reason was given for this exclusion.

    The FSA’s social research team was actively engaged in seeking more information and detail on methodology from the contractor during the commissioning process. The additional information that was subsequently provided improved the design of these studies considerably. This is good practice.

    The conclusions are presented clearly and succinctly and are adequately supported by the data. Subgroup analyses by type and size of businesses, demographic variation and different countries (England, Wales and Northern Ireland) are provided and are generally well presented.

    The three interview studies provide a robust evidence base for current and future discussion of policy issues surrounding the FHRS. The reports identify areas where there is consensus about the effective and beneficial working of the FHRS, and where there is disagreement.

    Issues for consideration

    The amount of detail on the methodology used for these three studies was very limited in the publicly available reports. They were readily provided for this review on request. Given the high profile of the FHRS it would have been beneficial had these details been made available in the form of a separate Technical Report, as is the case with most other research outputs of the FSA social research team. 

    Concluding assessment

    Overall, these three studies represent high-quality research that provides a valuable evidence base on how the FHRS is working, and how it might be developed in future.

    Research Output 4: The FSA Online Supermarket Trial (FSA, 2023d)

    This research output reports on a clustered randomised trial, with a matched pairs design, to test whether Food Business Organisation (FBO) staff would make customers feel more confident that they could identify ingredients that they want to avoid consuming, feel more comfortable asking about ingredients on a future visit, and increase consumers’ perceptions of food safety regarding food and drink sold at the given chain. The clusters were branches of a national FBO. Participants were customers who entered the FBOs between 28th March 2022 and 30th June 2022, who placed a food order at the till, and who chose to complete a voluntary survey about their experience.

    Table 2.7 indicates that the FSA Online Supermarket Trial research scored ‘High’ on nine of the fourteen quality criteria of the GSR Self-Assessment Tool. Two of the dimensions of the GSR Self-Assessment Tool were rated as ‘medium’ (see above) and three were rated as low. As noted in footnote 6, external contractors are not always in a position to meet these three quality criteria given that they may have little or no role in these policy making activities. Annex 2D presents the peer reviewer’s comments for all of these ratings.

    Table 2.7 The FSA Online Supermarket Trial: Assessment using the GSR Self Assessment Tool (footnote 5)

    Rigorous and Impartial

    Based on sound methodology and established scientific principles High
    Quality assured Medium
    Based on best design given constraints High
    Conclusions are clearly and adequately supported by data High

    Relevant

    Anticipates future policy issues as well as addressing current ones Low
    Answers clear and researchable questions High
    Contributes to all stages of the policy and delivery process Low
    Delivers solutions that are viable, actionable and represent value for money Low

    The FSA Online Supermarket Trial fully met all but one of the quality criteria of the FSA’s Quality Assessment Toolkit for experimental trials (Table 2.8). The only criterion that was not ‘fully met’ was the use of the EQUATOR Network in the reporting of the results. Although a relevant checklist from the EQUATOR Network was not used, the reporting of the results was well structured in accordance with these guidelines. Annex 2D presents the peer reviewer’s comments for all of these ratings.

    Table 2.8 The FSA Online Supermarket Trial: Assessment Using the FSA’s Quality Assessment Toolkit (QAT)

    Question  Answer
    Q1 Title and year. Testing the impact of overt and covert ordering interventions on sustainable consumption choices: a randomised controlled trial on an online supermarket 2022 -
    Q2 Has a clear research need been outlined? Yes fully
    Q3 Has a precise research question/aim been specified? Yes fully
    Q4 Is the research design experimental, cross-sectional, longitudinal, case study? Experimental
    Q5 Is the research method, qualitative, quantitative, both? Quantitative
    Q6 Is there a good match between the research question/aim, research design and research method? Yes fully
    Q7 Is the study population and setting specified? Yes fully
    Q8a Is the sampling method; purposive, convenience, theoretical, other, not specified? Not applicable
    Q8b Is the sampling method: simple random sampling, stratified sampling, quota sampling, convenience sampling, other, not specified? Quota sampling
    Q9 Is the sampling method appropriate for addressing the research question? Yes fully
    Q9a Is the target sample size justified? Yes fully
    Q9b Has a sample size calculation been conducted? Yes fully
    Q10 Are the research instruments valid and reliable? Yes fully
    Q11a Is the analytical approach thematic, grounded theory, framework analysis, other, not specified? Not applicable
    Q11b Is the analytical approach: chi-square test, correlation test, t-test or analysis of variance, linear regression, survival analysis, time series analysis, meta-analysis, other, not specified? Linear regression
    Q12 Is there a good match between the analytical approach, the research method and the research question? Yes fully
    Q13 Has a relevant checklist from the EQUATOR Network been used in the reporting of the results? Partly met
    Q14 Have descriptive data on the characteristics of participants been presented? Yes fully
    Q15 Have two or more researchers been involved in the analysis process (for example through double coding)? Yes fully
    Q16 Is there consistency between the data presented and the themes? Not applicable
    Q17 Have similarities and differences between participants been explored (for example negative cases)? Not applicable
    Q18 Did participants provide feedback on the findings (for example, member checking)? Not applicable
    Q19 Have descriptive data on exposures/interventions and potential confounders been presented? Yes fully
    Q20 Have unadjusted and adjusted point estimates and confidence intervals been presented alongside statistical significance? Yes fully
    Q21 Has generalisability been considered in the interpretation of the results? Yes fully
    Q22 Has causality been considered in the interpretation of the results? Yes fully
    Q23 Has uncertainty been considered in the interpretation of the results?  Yes fully
    Q24 Has a clear study conclusion been presented? Yes fully

    The FSA Online Supermarket Trial - Strengths of the Research Output

    • this is a well-planned, well-implemented and well-reported randomised controlled trial of the impact of overt and covert ordering interventions on sustainable consumption choices.
    • the study concludes that “analysis of the control condition showed that the positioning of products had no effect on (consumer) choices.”
• the three-arm, between-subjects design with randomisation is well suited to the study’s central research question.
• careful attention has been given to sample size and power, to appropriate quota sampling, and to a linear regression analysis that is both appropriate and well presented (an illustrative sketch of this kind of design and analysis follows this list). There is also good attention to participants’ socio-demographic characteristics.
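
By way of illustration, the sketch below shows how a sample size calculation for a three-arm trial and a linear regression with treatment dummies might be run. Everything in it is assumed for the purpose of the example (a small Cohen's f of 0.1, simulated outcomes, invented arm means); it is not the trial's own code, data or analysis plan.

```python
# Illustrative only: simulated data, not the FSA trial's actual dataset or code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.power import FTestAnovaPower

# Approximate total sample size for a three-arm trial, assuming a small
# effect (Cohen's f = 0.1), alpha = 0.05 and power = 0.8.
n_total = FTestAnovaPower().solve_power(effect_size=0.1, alpha=0.05,
                                        power=0.8, k_groups=3)
print(f"Total participants required (approx.): {n_total:.0f}")

# Simulate a simple outcome (e.g. number of sustainable items chosen)
# for control, overt ordering and covert ordering arms.
rng = np.random.default_rng(42)
arms = ["control", "overt", "covert"]
df = pd.DataFrame({
    "arm": np.repeat(arms, 400),
    "choices": np.concatenate([
        rng.normal(2.0, 1.0, 400),   # control
        rng.normal(2.3, 1.0, 400),   # overt ordering
        rng.normal(2.1, 1.0, 400),   # covert ordering
    ]),
})

# Linear regression with treatment dummies (control arm as the reference category).
model = smf.ols("choices ~ C(arm, Treatment(reference='control'))", data=df).fit()
print(model.summary())
```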

    Issues for consideration

    This trial addresses consumer choice under simulated online supermarket conditions and, as the authors note, this simplifies many of the real-life considerations. The authors also note that “more field research in real supermarket environments is required to establish the external validity of the effects of pop-ups on behaviour” (page 40).

    Concluding Assessment

Notwithstanding the simulated nature of this trial, this research output presents a solid evidence base on how consumers behave in response to different presentations of sustainable consumption choices. This is a high-quality report of a well-planned, well-implemented and well-reported randomised controlled trial. 

    Research Output 5 - Kitchen Life 2 Literature Review (FSA, 2022b)

The aim of the Kitchen Life 2 project is to identify the key behaviours relating to food safety that occur in domestic and business kitchens, as well as the factors that may reduce the likelihood of enacting recommended food safety and hygiene behaviours. This literature review seeks to identify key behaviours, actors, triggers and barriers in domestic and business kitchens from the existing evidence, and to inform the development of behavioural interventions and risk assessment models.

Table 2.9 indicates that the Kitchen Life 2 Literature Review scored ‘High’ on seven dimensions of the GSR Self-Assessment tool and ‘Medium’ on five dimensions. Two dimensions (‘based on best design, given constraints’ and ‘data made available where possible’) were rated as ‘Low’. Annex 2E presents the peer reviewer’s comments for all of these ratings. 

    Table 2.9 Kitchen Life 2 Literature Review. Assessment using the GSR Self-Assessment Tool (footnote 6)

    Rigorous and Impartial

Based on sound methodology and established scientific principles: Medium
Quality assured: Medium
Based on best design, given constraints: Low
Conclusions are clearly and adequately supported by data: Medium

    Relevant

Anticipates future policy issues as well as addressing current ones: High
Answers clear and researchable questions: High
Contributes to all stages of the policy and delivery process: Medium
Delivers solutions that are viable, actionable and represent value for money: Medium

    Accessible

Published: High
Data made available where possible: Low
Clear and concise: High
Related to existing work in field: High

    Legal and Ethical

Complies with relevant legislation: High
Complies with GSR ethical guidelines: High

Assessment using the FSA's Quality Assessment Toolkit (QAT)

The FSA’s Quality Assessment Toolkit does not include a checklist for assessing the quality of literature reviews. It does, however, provide links to the ‘Preferred Reporting Items for Systematic Reviews and Meta-Analyses’ (PRISMA) guidelines and checklist via the EQUATOR Network. The ‘issues for consideration’ below are therefore based on the PRISMA guidelines.

    Kitchen Life 2 Literature Review - Strengths of the Research Output

• the review addresses a clear and researchable question and is an informative literature review. It is presented clearly, reasonably concisely, and is easy to read. 
    • the ‘Recommendations’ section summarises the key findings and ‘take home’ messages clearly and well. These are supported by the data that were collected, but given the somewhat narrow search procedures (see below) there may be some risk of bias.
    • this review focuses on key behaviours relating to food safety in domestic and business kitchens within the context of the FSA’s policy response to the COVID pandemic. Its ‘key recommendations’ about what people need to know, and should do, have considerable implications for current and future policy issues.
    • this review identifies a number of activities in domestic and business kitchens that may require action to improve food hygiene. 
    • this review was quality assured internally by the FSA social research team and by members of the Advisory Board on Social Science. 

    Issues for consideration

• traditional stand-alone literature reviews are no longer considered to be “based on sound methodology and established scientific principles” (GSR 2018). Given that this review had over eight months to be completed, the best design would have been a narrative systematic review following PRISMA procedures and guidelines. Alternatively, it could have used Rapid Evidence Assessment methodology, which would have enhanced its search and analytical approach.
    • the review uses a limited search string and only two electronic databases (Scopus and Web of Science). It also does not appear to have used double screening or double data extraction. 
• this review does not include a ‘flow diagram’ showing how the initial search yields were reduced to the finally included studies (the sketch after this list illustrates the kind of counts such a diagram reports). There is no list of excluded studies with the reasons for their exclusion, and no attempt to rate the included studies in terms of their strength of evidence.
• these limitations do introduce a risk of bias into the review.
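
By way of illustration, the sketch below tallies the kind of screening counts a PRISMA flow diagram reports. All of the numbers are invented for the example and are not taken from the Kitchen Life 2 review.

```python
# Hypothetical PRISMA-style flow counts; every figure here is invented for illustration.
records = {
    "identified_from_databases": 1250,   # e.g. combined Scopus and Web of Science hits
    "duplicates_removed": 180,
    "excluded_at_title_abstract": 930,
    "full_texts_assessed": 140,
    "excluded_at_full_text": 95,         # reasons for exclusion would be listed in the review
}

screened = records["identified_from_databases"] - records["duplicates_removed"]
included = records["full_texts_assessed"] - records["excluded_at_full_text"]

print(f"Records screened: {screened}")
print(f"Full texts assessed: {records['full_texts_assessed']}")
print(f"Studies included in the review: {included}")
```

Reporting these counts, together with the reasons for exclusion at each stage, is what allows readers to judge how comprehensive and unbiased the search and selection process was.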

    Concluding Assessment

Despite using a form of evidence review that is no longer considered to be “based on sound methodology and established scientific principles”, this review provides some valuable insights into “the key behaviours relating to food safety that occur in domestic and business kitchens, as well as the factors that may reduce the likelihood to enact recommended food safety & hygiene behaviours”. Its contribution to the evidence base on Kitchen Life would have been enhanced had it followed established and up-to-date scientific principles for reviewing evidence.

    Research Output 6: Psychologies of Food Choice - Public views and experiences around meat and dairy consumption (FSA, 2022c)

This research output explores UK public views and experiences around meat and dairy consumption, including the key drivers of participants’ chosen dietary approach. It presents findings from qualitative remote ethnography research with 24 members of the UK public, conducted during July and August 2021, plus nine peer-to-peer interviews conducted by the main sample participants with their friends and family.

    Table 2.10 indicates that this study scored ‘High’ on nine of the criteria of the GSR Self-Assessment Tool (see above). Five of the criteria were rated as ‘Medium’. Annex 2F presents the peer reviewer’s comments for all of these ratings. 

Table 2.10 Psychologies of Food Choice - Public views and experiences around meat and dairy consumption: Assessment Using the GSR Self-Assessment Tool 

    Rigorous and Impartial

Based on sound methodology and established scientific principles: High
Quality assured: Medium
Based on best design, given constraints: High
Conclusions are clearly and adequately supported by data: High

    Relevant

Anticipates future policy issues as well as addressing current ones: Medium
Answers clear and researchable questions: High
Contributes to all stages of the policy and delivery process: Medium
Delivers solutions that are viable, actionable and represent value for money: Medium

    Accessible

Published: High
Data made available where possible: Medium
Clear and concise: High
Related to existing work in field: High

    Legal and Ethical

Complies with relevant legislation: High
Complies with GSR ethical guidelines: High

The Psychologies of Food Choice report fully met most of the quality criteria of the FSA’s Quality Assessment Toolkit for qualitative research (Table 2.11). The only criterion that was not ‘fully met’ was the use of a relevant checklist from the EQUATOR Network in the reporting of the qualitative research. The reporting provided in the report does, however, meet many of these standards. Annex 2F presents the peer reviewer’s comments for all of these ratings. 

    Table 2.11 Psychologies of Food Choice - Assessment using the FSA's Quality Assessment Toolkit (QAT)

    Question Answer
Q1 Title and year: Psychologies of Food Choice - Public views and experiences around meat and dairy consumption (2021) -
Q2 Has a clear research need been outlined? Yes fully
Q3 Has a precise research question/aim been specified? Yes fully
Q4 Is the research design: experimental, cross-sectional, longitudinal, case study? Ethnographic and behavioural
Q5 Is the research method: qualitative, quantitative, both? Qualitative
Q6 Is there a good match between the research question/aim, research design and research method? Yes fully
Q7 Is the study population and setting specified? Yes fully
Q8a Is the sampling method: purposive, convenience, theoretical, other, not specified? Purposive sampling
Q8b Is the sampling method: simple random sampling, stratified sampling, quota sampling, convenience sampling, other, not specified? Quota sampling
    Q9 Is the sampling method appropriate for addressing the research question? Yes fully
    Q9a Is the target sample size justified? Yes fully
    Q9b Has a sample size calculation been conducted? Not applicable
    Q10 Are the research instruments valid and reliable? Yes fully
Q11a Is the analytical approach: thematic, grounded theory, framework analysis, other, not specified? Thematic
Q11b Is the analytical approach: chi-square test, correlation test, t-test or analysis of variance, linear regression, logistic regression, survival analysis, time series analysis, meta-analysis, other, not specified? Not applicable
    Q12 Is there a good match between the analytical approach, the research method and the research question? Yes fully
Q13 Has a relevant checklist from the EQUATOR Network been used in the reporting of the results? Partly met
    Q14 Have descriptive data on the characteristics of participants been presented? Yes fully
    Q15 Have two or more researchers been involved in the analysis process (for example, through double coding)? Yes fully
    Q16 Is there consistency between the data presented and the themes? Yes fully
    Q17 Have similarities and differences between participants been explored (for example negative cases)? Yes fully
    Q18 Did participants provide feedback on the findings (for example, member checking)? Yes fully
Q19 Have descriptive data on exposures/interventions and potential confounders been presented? Not applicable
Q20 Have unadjusted and adjusted point estimates and confidence intervals been presented alongside statistical significance? Not applicable
    Q21 Has generalisability been considered in the interpretation of the results? Yes fully
    Q22 Has causality been considered in the interpretation of the results? Not applicable
Q23 Has uncertainty been considered in the interpretation of the results? Yes fully
    Q24 Has a clear study conclusion been presented? Yes fully

    Psychologies of Food Choice - Strengths of the research output

    • this study shows the value of ethnographic and mixed methods qualitative research in identifying the drivers of behaviour and the triggers to eating differently.
• the use of the ‘Capability, Opportunity, Motivation - Behaviour’ (COM-B) model is welcome because it addresses the need for behavioural insights when seeking to change people’s food consumption.
    • the study concludes that a one-size-fits-all policy approach is inappropriate for understanding how to influence food consumption behaviour. 
    • the study was undertaken to a reasonably high degree of quality. It has identified some of the sources of the uncertainty in the relationship between drivers and behavioural outcomes.

Issues for consideration

    Given the length of the published report the authors were unable to present detailed transcript data or field notes, which are important features of ethnographic and mixed qualitative methods. These could have been provided in a separate Technical Report. The report does, however, provide good summary data on fieldwork observations and participants’ responses in interviews and groups.

    This study did not intend to contribute to all stages of the policy and delivery process. It does, however, indicate that the policy and delivery processes need to be more nuanced and contextualised.

Concluding Assessment

    This report provides important information on some of the things that need to be in place to change people’s food consumption behaviour. It has been undertaken to a reasonably high standard. There may be more data and insights available from this study that could not be included in this report given length limitations.

    Chapter summary

    Six research outputs of the FSA social science team have been reviewed using the GSR Self-Assessment tool and the FSA’s Quality Assessment Toolkit (QAT). Each report is of a high standard, though some issues for consideration have been identified. 

The criteria used by the GSR Self-Assessment tool assess research outputs at a high level of generality and do not identify details about methodology or the quality of analysis at a sufficiently granular level. The GSR Self-Assessment tool also requires an assessment of whether research outputs “anticipate future policy issues as well as addressing current ones”, “contribute to all stages of the policy and delivery process” and “deliver solutions that are viable, actionable and represent value for money”. Whilst these are important features of research and analysis for government, not all research has the opportunity to do this, especially where research is contracted externally. 

The FSA’s Quality Assessment Toolkit provides a more appropriate and sensitive tool for assessing the quality of social research. This reinforces the recommendation (Chapter 3) that, going forward, the FSA’s Quality Assessment Toolkit should be used as the main means of assessing the quality of social science research at the FSA, and in other government departments and agencies.

    The GSR Code “sets out seven key principles that all GSR members must adhere to in order to ensure research and analysis that is scientifically rigorous, relevant and valued” (GSR, 2018). It provides four standards for assessing the products of GSR research and three standards for assessing people:

    GSR Code Products

    • rigour and impartiality
    • relevance
    • accessibility
    • legal and ethical

GSR Code People

    • performing the role with integrity
    • appropriately skilled and continuously developed
    • outward facing

Each standard has GSR assessment criteria, and indicators are provided for assessing an organisation’s performance against these criteria (Table 3.1). A traffic light rating scale is provided for assessing performance against the GSR Code’s criteria, with ‘green’ representing ‘strong’ performance, ‘amber/green’ indicating ‘development area/strong’, ‘amber’ indicating a ‘development area’ and ‘red’ identifying ‘serious concerns’. Additionally, assessors are required to provide details about any issues or risks, and examples of good practice in their organisation. 

    The full GSR Code for Assessment template, including a self-assessment by the FSA social science team, is available at Annex 3A.

    Table 3.1 GSR Code for Self-Assessment

GSR Code for Products: Standards, GSR Assessment Criteria and Indicators Used

Rigour and impartiality
Criteria: a. Based on sound methodology and established scientific principles; b. Quality assured; c. Based on best design, given constraints; d. Conclusions are clearly and adequately supported by data
Indicators used: Project design; Quality assurance of outputs

Relevance
Criteria: a. Anticipates future policy issues as well as addressing current ones; b. Answers clear and researchable questions; c. Contributes to all stages of the policy and delivery process; d. Delivers solutions that are viable, actionable and represent value for money
Indicators used: Short/long term balance; Departmental business planning; Strategic level sign-off; Impact assessed

Accessibility
Criteria: a. Published; b. Data made available where possible; c. Clear and concise; d. Related to existing work in field
Indicators used: Published; Format

Legal and ethical
Criteria: a. Complies with relevant legislation; b. Complies with GSR ethical guidelines
Indicators used: Ensuring good practice in the commissioning, management and conduct of government social research; Procurement; Data Security; Data Sharing

GSR Code for People: Standards, GSR Assessment Criteria and Indicators Used

Performing role with integrity
Criteria: a. Make best use of available resources; b. Give appropriate methodological and impartial evidence-based advice, challenging where appropriate
Indicators used: Make best use of available resources/achieve value for money; Knowledge management; Open, fair and honest

Outward facing
Criteria: a. Establish effective links with the external research community; b. Actively collaborate with policy/delivery colleagues; c. Actively collaborate with other analytical professions within and across departments
Indicators used: External research community; Other government analysts; Policy/delivery community

Appropriately skilled and continuously developed
Criteria: a. Recruited and promoted in line with GSR recruitment protocol; b. Committed to continuous professional development in line with the CPD requirements
Indicators used: Recruitment and induction; Continuing Professional Development; Career and talent management; Balance and use of skills
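
To make the structure of this template concrete, the sketch below records a purely hypothetical set of ratings against the seven standards using the four-point traffic light scale described above, and flags anything rated below ‘amber/green’ as a development area. The ratings are invented and do not reproduce the FSA team’s actual self-assessment (which is at Annex 3A).

```python
# Hypothetical GSR Code self-assessment record; all ratings are invented for illustration.
RAG_SCALE = ["red", "amber", "amber/green", "green"]  # ordered from 'serious concerns' to 'strong'

ratings = {
    "Rigour and impartiality": "amber/green",
    "Relevance": "green",
    "Accessibility": "amber",
    "Legal and ethical": "green",
    "Performing role with integrity": "amber/green",
    "Outward facing": "green",
    "Appropriately skilled and continuously developed": "amber/green",
}

for standard, rating in ratings.items():
    assert rating in RAG_SCALE, f"Unknown rating: {rating}"
    flag = " <- development area" if RAG_SCALE.index(rating) < RAG_SCALE.index("amber/green") else ""
    print(f"{standard}: {rating}{flag}")
```

Because each standard collapses several criteria into a single rating on a four-point scale, the result is necessarily coarse; the question-by-question QAT checklists discussed later in this chapter give the more granular picture that this report recommends.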

    Peer review of the self-assessment of FSA’s social science

    This peer review was undertaken by the author of this report. A summary of the peer reviewer’s ratings on the GSR Code for Products and People, and those of the social science team, is presented at Annex 3B. There was agreement between the peer reviewer and the self-assessment that the ‘outward facing’ professional standard should be rated as ‘green’. 

The peer reviewer concurred with the team’s self-assessed ‘amber/green’ ratings against four of the standards of the GSR Code (rigour and impartiality, relevance, legal and ethical, performing role with integrity). Two of the professional standards (accessibility, appropriately skilled) were rated as ‘green’ by the team and ‘amber/green’ by the reviewer. 

    No standards were rated by the peer reviewer or the social science team as amber (‘development area’) or red (‘serious concerns’).

    Rigour and impartiality

    The peer reviewer agrees with the amber/green self-rating of the team on this standard. Project specifications and proposals are well reviewed internally. The evidence provided by the team to support this is valid.

The team notes that “external reviewing and QA is undertaken on a case-by-case basis, subject to the pace, complexity and purpose of the work”. Whilst the latter caveat is appreciated, and a degree of proportionality is necessary when working to tight timelines, external peer review and quality assurance are two of the hallmarks of good science. This is undertaken to some extent by the ACSS. It would enhance the quality and credibility of the peer review and assurance process if all published research outputs were also peer reviewed by a wholly independent source. 

The self-assessment notes that “engaging colleagues in exploratory/experimental work” and “accessing good research suppliers at pace and within government procurement processes” are challenges. This may affect the rigour with which these methods are undertaken. That said, the contractors that the team uses to undertake such work have excellent reputations for planning and conducting experimental and quasi-experimental methods, and for delivering in a timely manner. 

The social science team self-rated this standard amber/green, which is appropriate for this standard.

    Relevance

    There is a good balance between the team’s short term (operational) and longer term (strategic) research and analysis. For example, the waves of the FSA’s tracker surveys are undertaken in a timely manner and help to inform policy and delivery issues with good quality real-time data. The horizon scanning work by the social science team in 2020 and 2021 (FSA, 2020; Pierce and Mumford, 2020) is rightly claimed as evidence of its contribution to strategic analysis.

    The GSR Self-Assessment tool’s indicators for ‘relevance’ include “anticipates future policy issues as well as addressing current ones”, “contributions to all stages of the policy and delivery process”, and the delivery of “solutions that are viable, actionable and represent value for money”. Whilst these are important features of research and analysis for government, not all social research has the opportunity to do this.

    It is hard to see how external contractors can meaningfully make these contributions unless they are deeply embedded in the decision-making process of government. This makes it all the more important for the social science team to provide the link between research findings and their potential for policy making and delivery. 

    The self-assessment notes that “turning dissemination into measurable impact” is challenging. This might require linking the findings of FSA’s research outputs on consumers’ reported behaviour with data on food consumption, nutrient intake and the nutritional status of the general population.

    The team also claims, quite rightly, to have good liaison with policy and delivery stakeholders. This should provide opportunities for knowledge translation and knowledge transfer from research outputs to potential policy and practice.
    The social science team self-rated this standard amber/green, which is appropriate for this standard.

    Accessibility

The publication of social science research outputs is timely, readable and informative. The different types of presentational format are well documented in the self-assessment and are generally good. There is a focus on reporting and analysing the findings of research rather than on detailed accounts of the methodology. This is appropriate for the general audience for which these outputs are intended. Summary data and sub-group presentations are presented well in published outputs. 

    The above approach, however, does require some further details to be available about the methodological and technical aspects of published research. This is important for reviewing and quality assuring the approach taken, and for transparency. 

    Technical Reports are frequently, but not always, published by the team. Making available technical details of how research has been conducted should be common practice if the scientific quality of research outputs is to be assured. 

    More detailed information on methodology, data and analysis was readily made available from the social science team for this review on request, which is good practice and commendable.

    The social science team self-rated this standard as green (‘strong’ performance). A rating of amber/green would be more appropriate for this standard.

Legal and ethical

The team follows guidance on commissioning, managing and conducting research well, including attention to GSR ethical guidelines. Data security is handled well and in accordance with civil service requirements. Data sharing is generally good and readily available, notwithstanding the above observation about the consistent availability of technical reports.

Procurement procedures have been identified as a challenge for some contractors and the team. Through good working relationships with procurement colleagues the social science team manages this well. There is room for improvement in the research procurement arrangements, but these are cross-government requirements beyond the control of the social science team. This suggests that the procurement arrangements for government research should be reviewed with the aim of having separate procedures and requirements from those of general government procurement. 

    The social science team self-rated this standard amber/green, which is appropriate for this standard.

    Performing role with integrity

    The team certainly meets the GSR criteria for a green rating for being open, fair and honest and considering the added value of a project before undertaking new research. The claim in the self-assessment that the social science team “works with policy stakeholders to understand the existing evidence base” was substantiated by the stakeholders’ interviews.

Identifying existing and emerging research is currently undertaken by the team using literature reviews and rapid evidence assessments. There is some room for improvement in terms of using up-to-date methods of evidence synthesis such as systematic reviews and evidence gap maps. This involves identifying and using existing systematic reviews and other evidence synthesis products (footnote) as well as commissioning or undertaking some (rapid) reviews in-house. This may require some professional development training. 

    Sharing of information gained by gathering existing evidence is generally good. Knowledge management also requires good storage, file management and retrieval of evidence. The team has improved this in recent months.

    The social science team self-rated this standard amber/green, which is appropriate for this standard.

    Appropriately skilled and continuously developed

    All indications are that the team is recruited and promoted in line with GSR protocols. Recruitment of researchers from the external research and evaluation community has enhanced the team’s experience and expertise.

There is some imbalance of grades in the current structure of the social research team (six PROs, six SROs, three ROs), which gives the appearance of the team being top-heavy. That said, there is evidence of most routine research tasks being undertaken by PROs and SROs as well as ROs. Future recruitment might focus on adding more ROs to the team. 

    Some technical skills might require development (see Chapter 4) including quantitative skills and methods of evidence synthesis. Staff receive CPD opportunities in line with GSR guidelines. Procurement requirements sometimes limit the choice of CPD opportunities, but otherwise talent management is good.

    The social science team self-rated this standard as green (‘strong’ performance). A rating of amber/green would be more appropriate for this standard.

    Outward Facing

    The self-assessment of the team’s links with the external research community, other government analysts and policy delivery colleagues is fair and accurate. This was confirmed in the interviews with internal and external stakeholders and is well documented in Chapter 2.

    The social science team self-rated this standard as green, which is appropriate for this standard.

    Summary of the peer review

This peer review generally concurs with the self-assessment by the social science team using the GSR Self-Assessment Code. Some of the peer reviewer’s ratings of the team’s performance against the Self-Assessment Code’s standards differ from the team’s own, and some areas for improvement have been suggested.  

    Assessing the GSR Assessment Code

    The GSR Assessment Code aims to ensure that the professional standards are met in all GSR’s products and people. It does this at a rather high level of generality and the indicators do not really capture the quality of research outputs at a sufficiently granular level.

    Prior to this review, most of the social science team was not aware of the GSR Self-Assessment Code. It does not appear to have been routinely used in other government departments either, according to interviews with the social science team and both internal and external stakeholders. 

    Most of the team used the tool for the first time to undertake the FSA self-assessment for this review. They found it unwieldy and ‘clunky’, and the Red, Amber, Green scoring categories insufficiently nuanced. The tool was also seen as conflating standards, such as ‘rigour’ and ‘impartiality’, when assessing the methodological quality of research. 

    The peer reviewer for this report also found the Self-Assessment Code difficult to use. The GSR assessment criteria and their associated indicators are not always well-aligned, nor sufficiently detailed to assess the work and outputs of government social research, especially in terms of assessing the rigour of social science methodology. 

Despite its aim to assess whether research products are “based on sound methodology and established scientific principles”, the Assessment Code does not indicate how this is to be determined in terms of technical detail. This has been remedied by the new FSA Quality Assessment Toolkit (QAT) (FSA, 2023).

FSA Quality Assessment Toolkit (QAT)

This toolkit provides a comprehensive means of ensuring that social science outputs are procured, managed, assessed and validated in a structured and rigorous way. It was co-created by the FSA’s Advisory Committee on Social Science (ACSS) and University College London. This involved focus groups with FSA staff and “multiple rounds of feedback”, as well as piloting with several study protocols, research reports and tender specifications (ACSS, 2023). 

The ‘Guidance’ section of the QAT has three parts:

    • Part 1 contains guidance for producing, assessing, and procuring research. 
    • Part 2 contains guidance for research management and dissemination (primarily relevant for producing and procuring research).
    • Part 3 contains additional guidance for procuring research.

    The toolkit covers quantitative and qualitative social research methods and provides checklists for each approach (see Chapter 2). It also provides links to the EQUATOR Network (footnote) for additional resources for procuring, assessing and reporting research, and for assessing other study types.
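
To illustrate how a question-by-question checklist of this kind can be applied in a structured way, the hypothetical sketch below records a few answers against paraphrased QAT-style questions. The question wording, answer categories and helper class are assumptions made for illustration, not the FSA’s own implementation of the toolkit.

```python
# Hypothetical, simplified representation of a QAT-style checklist record.
# Question wording is paraphrased and answers are invented for illustration.
from dataclasses import dataclass

ANSWER_CATEGORIES = {"Yes fully", "Partly met", "Not applicable"}

@dataclass
class ChecklistItem:
    question: str
    answer: str

    def __post_init__(self):
        # Reject any answer that is not one of the expected categories.
        if self.answer not in ANSWER_CATEGORIES:
            raise ValueError(f"Unknown answer category: {self.answer}")

assessment = [
    ChecklistItem("Has a clear research need been outlined?", "Yes fully"),
    ChecklistItem("Has a relevant EQUATOR Network checklist been used in reporting?", "Partly met"),
    ChecklistItem("Has a sample size calculation been conducted?", "Not applicable"),
]

fully_met = sum(item.answer == "Yes fully" for item in assessment)
print(f"{fully_met} of {len(assessment)} items answered 'Yes fully'")
```

Recording an explicit answer for every question is what allows an assessor to see precisely which aspects of a study (for example, reporting checklists or sample size justification) were only partly met, which is the granularity this report finds lacking in the GSR Self-Assessment tool.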

Guidance and checklists for undertaking systematic reviews of evidence are not included in the QAT framework, but links are provided via the EQUATOR Network to the ‘Preferred Reporting Items for Systematic Reviews and Meta-Analyses’ (PRISMA) guidelines and checklist. 

The FSA Quality Assessment Toolkit (QAT) was used alongside the GSR Self-Assessment Code to appraise the research outputs of the FSA social science team (Chapter 2). On the basis of this, it is strongly recommended that, going forward, the Quality Assessment Toolkit be used as the main means of assessing the quality of social science research at the FSA and across government. 

The GSR Self-Assessment Code could still be used to assess the broader dimensions of professional standards, such as accessibility, legal and ethical issues, and the recruitment and professional development of researchers. 

    Summary

    In addition to peer reviewing the self-assessment of FSA’s social science team (using the GSR Code and Self-Assessment Tool) this chapter has also appraised the structure and applicability of the GSR Self-Assessment Tool itself. It is concluded that the tool operates at a rather high level of generality and the indicators do not really capture the quality of research outputs at a sufficiently granular level.

The recently published FSA Quality Assessment Toolkit (QAT) has also been reviewed and appraised. It is strongly recommended that, going forward, the Quality Assessment Toolkit be used as the main means of assessing the quality of social science research at the FSA and across government. 

    The GSR Technical Skills Framework

The GSR Technical Skills Framework (GSR, 2022) outlines the technical expertise expected of members of the GSR profession, and how this applies at each grade of researcher. The framework “comprises technical skills and the use and promotion of social research” around the following dimensions:

    • knowledge of research methods, techniques, and their application.
    • identifying research needs, designing and specifying research.
    • analysis and interpretation.
    • managing and commissioning social research. 
    • leading, communicating and championing social research.
    • understanding government context.
    • learning and development.

    A full list of the skills and abilities expected for the grades currently employed by the FSA social research team – Research Officer, Senior Research Officer, Principal Research Officer – is presented at Annex 4A. The social science team (N=15) completed an online survey of the technical skills specified for each grade. 

It should be noted that the technical skills contained in the GSR Technical Skills Framework are fairly general and do not test researchers’ knowledge of, or ability to undertake, specific methodological techniques. The survey responses are also self-reported, and no independent verification of knowledge or ability was possible.

    Findings

    Research Officers (ROs)

    The three ROs in the social science team reported ‘good ability’ or ‘very good ability’ in most of the technical skills required for their grade. The full list of the technical skills in which the ROs expressed having ‘some ability’ or ‘no ability’ is presented at Annex 4B. These were mainly methodological skills, though some were organisational (e.g., “uses the GSR network to increase awareness of cross cutting research possibilities”).  

    Two of the three ROs reported having ‘no ability’ in:

    • working knowledge of relevant data analysis packages, particularly SPSS and Excel, and qualitative packages. Packages to be determined by the particular role and job content.
    • introductory level knowledge of data science techniques.
    • having knowledge of quality assurance methodologies required for analytical work, and understanding the context and relevance of quality assurance products, such as the Aqua Book.
    • persuading others to support the research process, for example, industry bodies to release necessary information or policy customers of the value of social research.

    Given that RO is the entry level into the Government Social Research Service it is perhaps not surprising that ability in all the required technical skills is not yet fully developed. This suggests some areas in which professional development would be appropriate.

    Senior Research Officers

    There were no technical skills in which the six SROs thought they had ‘no ability’ (Annex 4B).

    Two of the six SROs reported only ‘some ability’ in:

    • Using a range of analytical techniques to carry out in-house analysis and briefing work.
    • Having up-to-date knowledge of methodological developments including the role of innovative methodologies; applying these methods when and where appropriate; making use of appropriate new developments from other analytical professions and outside the Civil Service where relevant.
• Demonstrating sufficient technical ‘authority’ by taking the lead in recommending solutions to fill strategic gaps.

    These three skills represent the more methodological aspects of social research, though the question of having technical ‘authority’ may also be a matter of experience.

    Just one of the six SROs reported having only 'some ability' in the skills associated with commissioning, influencing, representing GSR and managing research independently. 

    Three of the SROs offered comments on the range of technical skills that are expected for their grade by the framework. Two of these comments referred to the limited amount of research undertaken by the team in-house:

    “Most of our work is contracted out so some of the technical skills we have less experience of as we don't do as much in-house research”.

    “My understanding of the competency framework is that you don't have to hit every element to perform well at your grade. For example, most work at the FSA is commissioned, so those who have spent their social research career solely at the FSA will have less experience in in-house research. However, this doesn't mean that they aren't able to do it, they've just not had the opportunity to do so”.

    “As a department, I feel that opportunities to do in-house research should be made available to avoid researchers who do have the experience becoming de-skilled in this area”.

    These comments suggest that the contracting out of much of the FSA’s social research may limit researchers’ opportunities to maintain and develop their technical research skills. The balance of in-house versus contracted-out social research might need reviewing to avoid skills fade.

    The third comment from an SRO was about training in the technical skills of social research:

    “I'd like more training on technical analytical skills (e.g., regression analysis) and when it's appropriate to use these techniques. I'd like more training on evaluation skills”.

    Principal Research Officers

    The six PROs reported ‘good ability’ or ‘very good ability’ to undertake most of the technical skills required for their grade by the GSR Technical Skills Framework (Annex 4A).  

Annex 4B presents the skills in which some of the PROs reported having only some or no ability. Two responses stand out. One is the apparently limited ability of four PROs to “use the GSR network effectively to actively pursue options for cross cutting research”. Three of the six PROs thought they had only “some ability” with this skill, and a fourth PRO claimed to have no such ability.  

    The second notable response was that four of the six PROs reported having only ‘some ability’ to “take the lead on a number of ‘technical’ matters within the wider GSR/analytical community, for example, this could be methodological or evidence based”. 

Given that PROs provide the leadership and management of SROs and ROs, some of whom also expressed limited ability to undertake these skills, these would seem to be areas in which the professional development of some PROs would be appropriate.

Similarly, the responses of two PROs indicating that they have only ‘some ability’ to “encourage, coach and support others to adopt the latest social research methods and data science techniques into their work” suggest further consideration of their professional development. 

    Summary

    The team reported ‘good ability’ or ‘very good ability’ to undertake most of the technical skills required across all three grades. 

    All three grades, however, reported having only ‘some ability’ or ‘no ability’ with a few of the methodological and organisational skills required by the GSR Technical Skills Framework. This indicates a need for some professional development of some of the technical skills mentioned above. 

    This chapter presents the views of the FSA social science team on its strengths and areas for improvement in terms of ensuring the scientific rigour, relevance and value of the FSA’s social research. The standards for assessing these professional requirements of social science are presented in the GSR Code for Products and People (Chapter 3).

    Face-to-face group interviews were undertaken in October 2022 with all members of the FSA social research team, which at the time was made up of six Principal Research Officers (three of whom are part-time), six Senior Research Officers, and three Research Officers (one of whom was a GSR placement student).

    Participation in the group interviews was voluntary and was conducted under the requirements of the GSR’s Ethical Assurance Guidance for Social and Behavioural Research in Government (GSR 2021). All fifteen members of the social research team consented to participate.

    Commissioning Research

    The FSA social science team leads on identifying and commissioning research, which requires effective collaboration with policy and procurement colleagues as well as contractors. 

    What do FSA social researchers think they do well in terms of commissioning?

    Identifying the policy issues requiring social research and clarifying the objectives and anticipated outcomes of a policy or programme are activities that the social science team thinks it does well. As one Principal Research Officer put it: “once we have a clear idea of what is required, we can signpost quickly what has to be done and who are likely to be the best people to undertake that project”.

    Commissioning social research takes place within the context of a demand for fast-paced delivery of work and slow government procurement processes. The social science team has numerous routes to commissioning including call-off contracts with vetted contractors to deliver research and analysis quickly without sacrificing quality. The team’s relationship with FSA’s procurement team is seen as productive. 

    The team keeps up to date with innovative research methodologies from academic and commercial research communities. Members of the team attend a wide range of seminars and training courses, arrange seminars and workshops, and discuss new and upcoming research methods with contractors. The team also has well developed relationships with a broad range of analysts inside and outside government. Several team members have had research experience outside of the civil service, which brings additional skills and expertise. 

To improve the commissioning and procurement process the team has developed a process map, which is said to have made these operations run more smoothly. FSA’s Areas of Research Interest (ARI) steering groups also help the social science team link with project technical leads and policy colleagues. 

    What do FSA social researchers think they do less well in terms of commissioning?

Notwithstanding many positive and productive relationships, working with policy colleagues and other commissioners of research can present challenges. Policy colleagues are said to not always understand the skills and roles of the social science team. Consequently, they do not always approach the team early enough for precise objectives and questions to be agreed. This is more difficult when the team needs to engage with many different stakeholders who have opposing demands. Policy colleagues may also not always fully recognise how long it takes to procure, deliver and quality assure research to a high standard.

    Managing Research

    The team manages the fine details and logistics of social research and works with policy colleagues and contractors to ensure that research is delivered on time, on budget and avoids mission drift. 

    What do FSA social researchers think they do well in terms of managing social research?

    The team thinks it is good at establishing how research is going to be used and adapting the research specification and procurement accordingly. If the research is for communications purposes the breadth and depth will usually differ from when it is being used to shape and evaluate policy. 

    Good day-to-day management of research requires keeping contracts on track, developing good relationships with contractors, and collaborating with a range of stakeholders. This might require frank feedback to contractors such as saying: “we were expecting this to be a bit more thorough” or “we need more detail from you about x”. This is another thing that the team believes it does well, as was verified by some stakeholders (see Chapter 1).

    The broad range of stakeholders with which the team engages includes not only policy makers across government and contractors but also academics, food suppliers, consumers, third sector organisations and local authorities across England, Wales and Northern Ireland. Every stakeholder requires careful management and collaborative relationships. Managing this range of stakeholders is something the social science team believes it does well.

    Quality control of research outputs is another task about which the team is positive. This requires reviewing and critically appraising research outputs, editing them, identifying policy implications and signing them off only when they reach an acceptable standard of quality.   

    What do social researchers think they could improve on in terms of managing research?

    The overwhelming response to this question from the group interviews was “time”. A slower pace of planning, commissioning and delivering research would allow the team to understand policy issues in greater detail, design the appropriate research, and deliver higher quality research throughout. If the team is approached earlier then it will be in a better position to respond effectively.

There was clear recognition, however, that this is seldom possible when undertaking research in and for government. Working to tight timelines and (sometimes) ill-defined questions comes with the territory. A reconsideration of the system for procuring research to support policy making seems timely and appropriate. Previous calls to match policy needs and analytical provision (e.g. Campbell et al., 2007: 33) do not seem to have been heeded.  

There is also recognition that the pressure of time requires good research “housekeeping”. The team has recently completed a revision of its systems for saving and storing project documents in an organised and shared way. It is anticipated that this will enhance the service that the social science team provides.

    Research dissemination and impact is something else that the team would like to improve upon. The social science team has a very good record of publishing its research outputs, but impact can only be partially assessed by how often its outputs are read, referred to and used for decision making. Impact also requires monitoring and evaluating the uptake of the FSA social research team’s outputs and identifying their effects on dietary and hygiene-promoting behaviour.  This, however, would add even further demands on time and resources. 

    GSR Code for People

    The group interviews also explored the professional standards required by the GSR Code for People (GSR 2018).

    Performing with integrity

This professional standard includes making “the best use of resources and giving appropriate methodological and impartial evidence-based advice, challenging where appropriate”. 

    There were different understandings within the team about the meaning and implications of “making the best use of resources”. For some members it suggests “delegating to my team without losing control of quality”, “performing my role with efficiency” and “not being under-utilised and not doing too much”. For others it means not undertaking research in areas where there is already an established body of evidence. This includes using the library of publications and ‘back office’ materials that the team has established, as well as its own institutional memory. Others highlighted that “we publish everything, we have a good record, so we can easily find our published work”. 

    There was universal agreement that “giving appropriate methodological and impartial evidence-based advice [and] challenging where appropriate” was central to what the team does routinely, and in accordance with the GSR’s Ethical Assurance for Social and Behavioural Research in Government (GSR, 2021).

    Outward Facing

    This professional standard includes establishing “effective links with the external research community, collaborating with policy delivery colleagues and with other analytical professions within and across Departments”.

Evidence of the team’s achievements against this standard includes its active participation in Government networks such as the cross-Government Evaluation Group, as well as external organisations such as the Social Research Association (SRA) and the Market Research Society (MRS). The team has good relationships with analysts in other government departments, such as the Department for Environment, Food and Rural Affairs (DEFRA), which were reciprocally valued by colleagues at DEFRA.

    Attendance at seminars and conferences in the UK and abroad, and having good links with non-Government organisations concerned with food quality and food insecurity such as the Food Foundation, were also mentioned as evidence of being outward facing. So too was membership of the International Social Science Liaison Group (ISSLG) which brings the team into regular contact with FSA-equivalent organisations in the United States, New Zealand, Canada and Australia, plus the European Food Safety Authority (EFSA). 

    The Advisory Committee on Social Science (ACSS) was also mentioned as a source of expert knowledge and advice. The ACSS “provides expert strategic advice to the FSA on its use of the social sciences, including new and emerging methods, processes and systems to interrogate data, to deliver the FSA's objectives” (ACSS, 2022). 

    Appropriately skilled and continuously developed

    This professional standard includes being recruited and promoted in line with GSR recruitment protocol and committed to continuous professional development in line with the CPD requirements.   

The recruitment policy of FSA social research aligns with the GSR Recruitment Guidance (GSR, 2010). Analysts who meet the standards required can apply to be ‘badged’ as GSR members. FSA social research encourages badging, and most members are badged, or are in the process of becoming badged, including on the basis of their experience in external research agencies.

    The GSR Recruitment Protocol (GSR, 2023) outlines the qualifications required and other eligibility criteria for joining the Government Social Research service. This includes an ‘experience route’ that accepts applications from people with an undergraduate and postgraduate degree in any subject and at least four years’ social research practice experience. The latter can include experience working in a research agency, market research agency or specialised research team.

    There was agreement amongst the social science team that recruiting people from outside the civil service brings a wealth of experience and different perspectives to the work of the group. Working with people from other professions, such as communications, was also seen as beneficial. 

    Committed to CPD

    GSR researchers must complete at least 100 hours of CPD per year, including on-the-job learning, though there is some flexibility across grades. 

Course participation in CPD is sometimes restricted by procurement rules. For example, if Civil Service Learning (CSL) offers a course that looks similar to one provided by another organisation, FSA researchers must take the CSL course. Sometimes the CSL provision is at a lower technical level than that desired by FSA social researchers.

Researchers can attend non-civil service courses or training, but they must first obtain quotes from three other providers. Where there are not three other providers, the time taken to establish this might result in the civil service course being full. This can be frustrating for social researchers.

    The social science team suggested, however, that the FSA is very supportive of staff undertaking professional development and training, possibly more so than other Government Departments, albeit within FSA’s HR rules and requirements. 

    Summary

    The group interview with the social science team revealed a self-assured and confident group which is proud of how it works and what it achieves. It sees itself as strong in terms of both commissioning and managing social research, despite the challenges of the timelines involved in their work, and the need to establish focused and answerable questions with policy colleagues. 

    The team was most eloquent when considering the ‘outward facing’ dimension of the GSR Code for People. It was able to provide a wide range of examples of how it has effective links with the external research community and other analytical professions within Government, and of positive collaborations with policy and delivery colleagues notwithstanding timeliness issues. The team was also confident about its quality assurance procedures.

    Improving the dissemination of research findings and establishing their impact is an area that social researchers thought could be improved if sufficient time were available.

    This review has attempted to assess the contribution that the FSA social science team makes to the FSA and its mission, and to identify what it does well, where there may be need for improvement, and what might be the direction of future learning and professional development.

    What does the social science team do well?

    1. The FSA’s social science team is a confident group that is well-regarded by the majority of internal and external stakeholders, and provides a robust evidence base on the social aspects of the FSA’s mission to ensure that food is safe, is what it says it is, and is healthier and more sustainable. 

    2. The team was described by stakeholders as having “a huge amount of in-house expertise” and providing a “variety of really interesting high-quality research that contributes towards the evidence base for our policies”. This included the “ability to provide quick and detailed comments on things where input was required”, and “some really strong areas of consumer insight that allows us to make very powerful statements as an organisation”.  
    3. The team’s qualitative analysis and their evaluation of the equality issues surrounding food policy and food insecurity were highly valued, as were its exploration of consumers’ attitudes and perceptions on food products. The ‘lived experience’ research of the team was also highly valued.  
    4. The team’s ability to identify and clarify the problems to be researched or evaluated, and to articulate the business needs for research or evaluation, was also recognised positively by internal and external stakeholders. So too was their capacity to challenge policy colleagues and contractors in order to specify research and evaluation that can be delivered in a timely manner and with quality.
    5. The research outputs reviewed for this report were found to be generally of a high standard in terms of the methodologies used, their design, execution and reporting. 
    6. The project management of the FSA social science team was generally good and appreciated by policy colleagues and contractors alike.
7. The social science team meets most of the GSR Code for Products and People well. The team and its work were rated ‘green’ or ‘amber/green’ against measures of: rigour and impartiality, relevance, accessibility, legal and ethical practice, performing role with integrity, being appropriately skilled and continuously developed, and being outward facing.

    Where is there room for improvement?

    The data and analysis presented in this review has identified areas where there is room for improvement.

    1. The involvement of the team with policy colleagues does not always happen early enough. Consequently, research objectives, questions and approaches can be ill-defined or considered too late for appropriate specificity, delivery and quality assurance. 

    Managing expectations of policy colleagues regarding how long it takes to procure, deliver and quality assure research to high standards can also be problematic. Resolving different stakeholders’ opposing demands or conflicting needs for social research can present additional challenges for procuring and delivering high quality research.

    Recommendation 1: Heads of profession from all of the Government analytical services should establish with senior policy makers the importance of early and continuous involvement of researchers in the development of policies, programmes and projects.

    2. Procurement procedures for research have been identified as a challenge for some contractors and the social science team. They have been described as “inappropriate for procuring research and evaluation” and “not fit-for-purpose”. These procedures are cross-government requirements and beyond the control of the FSA social science team or other analytical professions across government. 

    Recommendation 2: The procurement arrangements for government research should be reviewed with the aim of having separate procedures and requirements from those of general government procurement.

    3.    Identifying the impacts of social research outputs is something the social science team would like to improve. This requires not only monitoring the uptake of the FSA social research team’s outputs, but also identifying and assessing their effects on dietary and hygiene-promoting behaviour. 

    Recommendation 3: The senior management of the social science team should identify ways in which the impacts of their research and analysis can be identified and evaluated. This might involve linking the data and findings of the FSA’s social research outputs to other sources of data on dietary behaviour.

4. This review has identified that not all published research outputs have a separate technical report providing in-depth details of how research projects were designed, how samples and research instruments were selected, or how the analysis was undertaken.

    Proportionality in the provision of such technical details is an issue, especially given the pace at which research has to be commissioned, delivered and quality assured. However, providing the technical basis of research outputs is one of the hallmarks of good science and should be common practice.

    Recommendation 4: Technical details of how research has been conducted should be made available as common practice if the scientific quality of research outputs is to be assured. This is also a matter of transparency and accountability.

    5.    External peer reviewing and quality assurance of research outputs is “undertaken on a case-by-case basis, subject to the pace, complexity and purpose of the work” (footnote 1). This is usually undertaken by the FSA’s Advisory Committee for Social Science (ACSS) and academics on the FSA’s Register of Experts. 

Proportionality is also an issue in the provision of peer reviewing of research outputs. External peer reviewing, however, is another hallmark of good science and should be common practice. Where research outputs are intended for publication, and/or as a potential evidence base for decision making, wholly independent peer review should be a matter of good practice.

    Recommendation 5: All published social science outputs, and those that will provide a potential evidence base for decision making, should be peer reviewed by wholly independent experts as a matter of good practice.

6. Some members of the social science team have a need for further professional development. This includes professional development in methodology, leadership and coaching, to stay abreast of the latest methods of social research and research management.

    It is understood that plans are underway for some professional development of the social science team. The professional development of social researchers would also benefit from greater flexibility in selecting training courses at the appropriate level of technical expertise from organisations within and outside of the CSL prospectus.

Recommendation 6: All members of the team should be able to take the CPD training of their choice, within or outside of the CSL provision, to keep informed of the latest developments in research methods and coaching. There would seem to be a particular need for CPD in quantitative research methods, systematic review methodology and the coaching of staff.

    7. Most FSA social research is outsourced, so most of the team’s work is commissioning, managing and quality assuring research, rather than undertaking data collection and analysis. This may have some limiting effects on social researchers’ abilities to maintain and develop their skills in research and analysis. 

    Recommendation 7: The FSA social science senior management team should review the balance of in-house versus contracted-out social research and ensure that all members of the team have the opportunity to maintain and improve their social research skills.

    8. The GSR Self-Assessment Tool aims to ensure that GSR’s professional standards are met in all of its products and people. It does this at a rather high level of generality and the indicators do not really capture the quality of research outputs at a sufficiently granular level. 

    The Self-Assessment Tool also requires all research products to be assessed in terms of their implications and solutions for policy and delivery. This is often beyond the scope of most external contractors. 

The GSR Self-Assessment Tool might best be used to assess the broader dimensions of professional standards, such as accessibility, legal and ethical requirements, and recruitment and professional development. This, however, depends on the extent to which the GSR Self-Assessment Tool is currently used to assess the professional practice and outputs of social science in government; this review suggests that such use may not be extensive.

    The scoring categories of red, amber, green were seen as insufficiently nuanced and as conflating standards. This is especially so in terms of assessing the rigour of social science methodology. The FSA Quality Assurance Toolkit was found to be more appropriate to appraise the rigour of research outputs as well as the production, assessment and procurement of research.

Recommendation 8: It is strongly recommended that the FSA Quality Assurance Toolkit should be used as the main means of assessing the quality of social science research at the FSA and in other government departments.

    FSA GSR Review: Interviews with Internal Stakeholders - Topic Guide

    1. What has been your involvement with FSA’s social researchers over the past two years? [Prompt: On which policies, programmes or projects have you been involved with FSA’s social researchers?]
    2. How have the FSA’s social researchers contributed to your work? [Prompt: What sorts of tasks have FSA’s social researchers undertaken?] [Prompt: What sorts of skills have FSA’s social researchers contributed?] Ask about GSR principles/technical framework/UCL toolkit
    3. How do you rate the contribution of FSA’s social researchers to your work? [Prompt: On a scale of ‘Poor’/‘Adequate’/‘Good’/‘Excellent’]
    4. What are the most valuable contributions that FSA’s social researchers have made to your work?
    5. What are the least valuable contributions that FSA’s social researchers have made to your work?
    6. In what ways, if any, do you think FSA’s social researchers could improve their contribution to your work?
    7. Do you have any other comments on the work of FSA’s social researchers?

    FSA GSR Review: Interviews with External Stakeholders - Topic Guide

1. What has been your involvement with FSA’s social researchers over the past two years? [Prompt: On which policies, programmes or projects have you been involved with FSA’s social researchers?]
2. How have the FSA’s social researchers contributed to your work? [Note: Is this agency from a) the food services sector, or b) the public health/local authority sector?] [Prompt: What sorts of tasks have FSA’s social researchers undertaken with your agency?] [Prompt: What sorts of skills have FSA’s social researchers contributed to your agency?] Ask about GSR principles/technical framework/UCL toolkit
    3. How do you rate the contribution of FSA’s social researchers to your agency’s involvement with the FSA? [Prompt: On a scale of ‘Poor’/‘Adequate’/‘Good’/‘Excellent’]
    4. What are the most valuable contributions that FSA’s social researchers have made to your work?
    5. What are the least valuable contributions that FSA’s social researchers have made to your work?
    6. In what ways, if any, do you think FSA’s social researchers could improve their contribution to your work?
    7. Do you have any other comments on the work of FSA’s social researchers?

    Output: Food and You 2 Wave 4 Technical report and Key findings [combined assessment]

Authors (Key Findings report): Armstrong, B., King, L., Clifford, R., Jitlal, M., Ibrahimi Jarchlo, A., Mears, K.

Authors (Technical Report): Gallop, K., Smith, P., Ward, K., Horton, S., Candy, D., Hossein-Ali, H., Peto, C., and Lai, C.

    Date: August 2022

    Assessment of Food and You 2 Wave 4 using the GSR code

    Rigorous and impartial

Based on sound methodology and established scientific principles (Rating: High): The technical report by IPSOS provides extensive evidence of meticulous conceptual and methodological preparation and testing before, during and after fieldwork. The participation of FSA researchers and the FSA Advisory Group in this methodological work is exemplary.
Quality assured (Rating: High): Quality assurance was certainly built into the survey’s design, execution and reporting, both by IPSOS internally and by FSA researchers.
Based on best design, given constraints (Rating: High): The survey design, development and execution are excellent. The attention to detail was considerable and the range of questions addressed by Wave 4 is admirable.
Conclusions are clearly and adequately supported by data (Rating: High): The conclusions of the ‘Key Findings’ report of the Wave 4 F&Y2 survey have been clearly presented in the Executive Summary and they are well supported by the data presented in the main body of the report. This has been done in a systematic and rigorous way. It represents the ‘Findings’ or ‘Results’ section of a scientific paper in an academic journal. What is missing is a ‘Discussion’ and ‘Implications for Policy and Practice’ section. (See the ‘Relevant’ section below.)

    Relevant

Anticipates future policy issues as well as addressing current ones (Rating: High): The Key Findings report and the Technical Report address policy issues about food content, food choice and the behavioural drivers of healthy eating. The Key Findings report indicates many areas where the public’s awareness of food content, the food supply chain and policy measures such as the Food Hygiene Rating Scheme is considerable. The Key Findings report also identifies areas where the public’s attitudes about sustainable eating are not as one might have anticipated – for example, “around a third of respondents thought that eating more fruit and/or vegetables (38%) contributed most to a sustainable diet” (page 76). These findings might help policy makers at the FSA, the Department of Health and DEFRA decide how current and future food policy issues might be prioritised.
Answers clear and researchable questions (Rating: High): The Technical and Key Findings reports develop the conceptual and methodological approach of the F&Y2 survey around the policy and behavioural questions of the FSA and DEFRA. Hence, they address clear and researchable questions.
Contributes to all stages of the policy and delivery processes (Rating: Medium): Contributing to all stages of the policy and delivery process is beyond the scope of the Food and You 2 survey. However, the evidence base that the F&Y2 survey provides is a sound empirical foundation on which discussions about food policy and delivery can be based.
Delivers solutions that are viable, actionable and represent value for money (Rating: Low): The Key Findings and the Technical Report are less concerned with delivering solutions or determining value for money than with establishing the scientific evidence base of the public’s knowledge, attitudes and behaviours in relation to food, food safety and food security. Hence, there are few solutions that are viable, actionable and represent value for money.

    Accessible

Published (Rating: High): Publication of the findings of the F&Y2 survey has been extensive and of a high quality. The publication of both the Technical Report and the Key Findings is clear and well presented.
Data made available where possible (Rating: High): There is an abundance of relevant and appropriate data from the development and piloting of the study (Technical Report), and the data are presented in a systematic and rigorous way in the Key Findings report. Both reports are easy to read and comprehend.
Clear and concise (Rating: High): The technical report is very clear. Though not concise (68 pages), it provides all the technical detail necessary to understand the F&Y2 survey. It is an excellent ‘back office’ record of the scientific background to this important substantive survey. Although it is almost 100 pages in length, the good and clear presentation of the data in the Key Findings report makes it clear and readable.
Related to existing work in field (Rating: High): This technical report builds upon earlier survey work of the FSA and uses the legacy of that work very well. It has also incorporated the USDA 10-item US Adult Food Security module. The Key Findings report also refers well to this other existing work.

Legal and ethical

Complies with relevant legislation (Rating: High): This Technical Report and the Key Findings report comply with GDPR legislation, and both reports are mindful of the legislative framework and obligations within which FSA (and DEFRA) operates.
Complies with GSR ethical guidelines (Rating: High): Both reports comply with the GSR ethical guidelines for the development, execution, analysis and reporting of scientific research. They go to considerable lengths to meet these requirements.

FSA QAT Assessing Research Reports Checklist: Food and You 2 Wave 4 Key Findings

    Checklist 2: Assessing research reports

    Q1. Title, lead author and year
     Food and You 2: Wave 4 Key Findings. Lead author: Dr Beth Armstrong, August 2022.
    Q2. Has a clear research need been outlined?  
     Yes – fully - This report builds upon, and replaces, FSA’s face-to-face Food and You survey, the Public Attitudes Tracker and the Food Hygiene Rating Scheme Tracker survey. Hence, a clear research need has been outlined to meet key issues about a range of FSA’s policies.
    Q3. Has a precise research question/aim been specified?  
     Yes – fully - This report addresses a very clear set of questions about consumers’ knowledge, attitudes and behaviours related to food safety and other food issues.
    Q4. Is the research design… 
    Cross-sectional - This report is on Wave 4 of a cross-sectional study
    Q5. Is the research method… 
    Quantitative 
    Q6. Is there a good match between the research question/aim, research design and research method?  
Yes – fully - The match between the research questions/aims, research design and research method is excellent, largely due to careful preparation by FSA social researchers in collaboration with the FSA Advisory Board and the contractor (IPSOS).
    Q7. Is the study population and setting specified?  
    Yes – fully -  The study covers consumers in England, Wales and Northern Ireland, and local communities within these countries.

    If Q5 = Qualitative, go to Q8a. If Q5 = Quantitative, go to Q8b. If Q5 = Both, go to Q8a and Q8b.

    Q8b. Is the sampling method… 
    Stratified sampling.

    Go to Q9. 
    Q9. Is the sampling method appropriate for addressing the research question? 
Yes – fully - The study design is well stratified to include relevant sub-groups of food consumers in each country, and uses the Index of Multiple Deprivation for England (only) effectively.
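For illustration only, the short Python sketch below shows a proportional allocation of a stratified sample of the general kind described above; the strata, population shares and total sample size are hypothetical and are not taken from the F&Y2 documentation.

# Hypothetical example: proportional allocation of a total sample across IMD quintiles.
strata_shares = {"IMD quintile 1": 0.22, "IMD quintile 2": 0.21, "IMD quintile 3": 0.20,
                 "IMD quintile 4": 0.19, "IMD quintile 5": 0.18}   # assumed population shares
total_sample = 4000                                                 # assumed total sample size

allocation = {stratum: round(total_sample * share) for stratum, share in strata_shares.items()}
print(allocation)   # e.g. {'IMD quintile 1': 880, 'IMD quintile 2': 840, 'IMD quintile 3': 800, ...}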

    If Q5 = Qualitative, go to 9a. If Q5 = Quantitative, go to 9b. If Q5 = Both, go to Q9a and Q9b.

    Q9b. Has a sample size calculation been conducted? 
    Yes – fully - A great deal of work has gone into establishing a sample size that is large enough to provide statistical power.
    Go to Q10.
    Q10. Are the research instruments valid and reliable?  
Yes – fully - Again, a great amount of preparatory work has been undertaken by the contractor, working in collaboration with FSA social researchers and other analysts, to ensure that the research instruments are valid and reliable.
    If Q5 = Qualitative, go to Q11a. If Q5 = Quantitative, go to Q11b. If Q5 = Both, go to Q11a and Q11b.

    Q11b. Is the analytical approach… 
    Chi-square test, correlation test, t-test or analysis of variance 

    Go to Q12. 
    Q12. Is there a good match between the analytical approach, the research method and the research question?
Yes – fully - This report uses mainly percentage differences in consumers’ knowledge, attitudes and behavioural responses across socio-demographic and other sub-groups. Differences of 10 percentage points or more between groups are mainly reported and are tested for statistical significance at the 5% level (p<0.05). Some differences between respondent groups are included where the difference is less than 10 percentage points, when the finding is notable or of interest. This provides a good match between the analytical approach, the research method and the research question.
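For illustration only, the Python sketch below shows how a gap between two sub-group percentages might be checked against a reporting convention of this kind, using a standard two-proportion z-test from the statsmodels library; the counts and sample sizes are hypothetical, and the specific test is an assumption made for the sketch rather than the survey’s documented procedure.

# Hypothetical counts of respondents giving a particular answer in two sub-groups.
from statsmodels.stats.proportion import proportions_ztest

count = [390, 296]   # respondents giving the answer in sub-groups A and B (hypothetical)
nobs = [780, 760]    # sub-group sample sizes (hypothetical)

stat, p_value = proportions_ztest(count, nobs)
diff_pp = (count[0] / nobs[0] - count[1] / nobs[1]) * 100

# Mirror the reporting convention: flag differences of at least 10 percentage points
# that are statistically significant at the 5% level.
if abs(diff_pp) >= 10 and p_value < 0.05:
    print(f"Reportable difference: {diff_pp:.1f} percentage points (p = {p_value:.3f})")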
    Q13. Has a relevant checklist from the EQUATOR Network been used in the reporting of the results?  
    Yes – fully -  This report meets the guidelines of the EQUATOR network for quantitative methods. That is, there is a basic presentation and analysis of results using descriptive tables and figures (e.g., histograms) that are useful for conveying the results. The tables and figures are concise and easy to read. The statistical commentary brings the numbers to life and does so in a fairly simple and straightforward way. Important differences between different socio-demographic groups are noted in the commentary. 
    Q14. Have descriptive data on the characteristics of participants been presented?
    Yes – fully - Considerable descriptive data are provided on the characteristics of participants both in this ‘Key Findings’ and in the accompanying ‘Technical Report’.

    If Q5 = Qualitative, go to Q15. If Q5 = Quantitative, go to Q19. If Q5 = Both, go to Q15 and Q19.
    Q19. Have descriptive data on exposures/interventions and potential confounders been presented?
Yes – fully - The report provides data on respondents’ awareness of FSA’s programmes and policies aimed at developing and maintaining food standards (which are the ‘exposures and interventions’ of this study). It also analyses the contribution of other government programmes, such as the Healthy Starts voucher programme, to consumers’ knowledge, attitudes and behaviours.

    Go to Q20.

    Q20. Have unadjusted and adjusted point estimates and confidence intervals been presented alongside statistical significance? 
Yes – partly - This report mainly presents percentage distributions of respondents’ knowledge, attitudes and behaviours. This provides an indication of the range of results, which have been weighted to reflect socio-demographic factors. This is not the same as presenting unadjusted and adjusted point estimates and confidence intervals. However, it does make the presentation of the findings easy to read and comprehend, especially for those who may be less oriented towards statistical detail.
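For illustration only, the Python sketch below shows what reporting a weighted point estimate with an approximate 95% confidence interval would involve; the responses and weights are invented, and the use of a Kish effective sample size is an assumption made for the sketch rather than the survey’s documented method.

import numpy as np

# Hypothetical data: 1 = respondent gave the answer of interest; survey weights alongside.
responses = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 0])
weights = np.array([1.2, 0.8, 1.0, 0.9, 1.1, 1.3, 0.7, 1.0, 0.9, 1.1])

p_hat = np.average(responses, weights=weights)       # weighted point estimate
n_eff = weights.sum() ** 2 / (weights ** 2).sum()    # Kish effective sample size
se = np.sqrt(p_hat * (1 - p_hat) / n_eff)            # approximate standard error
low, high = p_hat - 1.96 * se, p_hat + 1.96 * se     # approximate 95% confidence interval

print(f"{p_hat:.1%} (95% CI {low:.1%} to {high:.1%})")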

    Go to Q21. 

    Q21. Has generalisability been considered in the interpretation of the results?  
Yes – fully - This report provides sufficient data on how generalisable the findings are across the populations of England, Wales and Northern Ireland. The report clearly points out that the data are “typically reported only where the absolute difference is 10 percentage points or larger and is statistically significant at the 5% level (p<0.05)”. It also reports that “some differences between socio-demographic and other sub-groups are included where the difference is less than 10 percentage points, when the finding is notable or judged to be of interest.” This clarifies both the generalisability and the limitations of the findings.
    Q22. Has causality been considered in the interpretation of the results? 
    Not applicable - This is not an experimental or quasi-experimental study. Hence, it does not attempt to provide evidence of causality. The data presented are correlational rather than causal.
    Q23. Has uncertainty been considered in the interpretation of the results? 
    Yes – fully 
    Q24. Has a clear study conclusion been presented?  
    Yes – fully - There is an excellent ‘Summary of Key Findings’ at the beginning of the report. These are well presented with clarity, and in large print that is nicely spaced, over just five pages.
     

    Outputs: FSA Small and Micro FBO Tracking Survey Wave 3 2021

    Authors: IFF Research

    Date: March 2022

    Assessment of FSA Small and Micro FBO Tracking Survey Wave 3 2021 using GSR code

    Rigorous and impartial

Based on sound methodology and established scientific principles (Rating: High): This survey was undertaken after relevant and careful cognitive testing and sample design/planning. Considerable effort went into sample preparation – for example, research to undertake telephone lookups using external data suppliers (Market Location and REaD Group) and internal desk research. Weighting of the sample was also undertaken carefully and appropriately.
Quality assured (Rating: Medium): This report was quality assured by FSA internal researchers and external consultants. FSA research reports are quality assured via peer reviewer comments directly on draft reports and discussions in meetings, rather than with a separate report of comments.
Based on best design, given constraints (Rating: High): The survey design was appropriate and built upon previous waves of the FBO Tracker, with some additional questions (with the agreement of the FSA).
Conclusions are clearly and adequately supported by data (Rating: High): The conclusions are laid out clearly at the end of the report, and a summary of the conclusions is presented in the Executive Summary. In both cases the conclusions are adequately supported by the data.

    Relevant

Anticipates future policy issues as well as addressing current ones (Rating: Medium): This report certainly addresses how recent and current policy issues of the FSA, and the wider UK Government, have impacted on small and micro businesses. It does not really anticipate future policy issues.
Answers clear and researchable questions (Rating: High): This research report addresses, and answers, clear and researchable questions about:
• the implications of the UK’s exit from the European Union (EU) for small and micro enterprises.
• attitudes towards regulation, deepening insights into and knowledge of small and micro enterprises.
• trust in the FSA and the extent to which the FSA is considered a modern, accountable regulator.
Contributes to all stages of the policy and delivery process (Rating: Medium*): Whilst this report does not directly contribute to all stages of the policy and delivery process, it does ‘unpack’ the evidence on how the UK’s exit from the European Union (EU), the regulatory structures of the FSA, and trust in the FSA have developed since the 2019 Tracker survey. This may have implications for FSA policy and practice.
Delivers solutions that are viable, actionable and represent value for money (Rating: Low): This report presents clear conclusions about the findings of the survey. It does not venture into delivering solutions or establishing value for money. This is not the responsibility or role of the survey company (IFF) which undertook this survey, and it does not appear to have been in the survey specification.

    Accessible

Published (Rating: High): The Small and Micro Food Business Operator (FBO) Tracking Survey: Wave 3, 2021 Technical Report is published and available at: https://doi.org/10.46756/sci.fsa.sty242
Data made available where possible (Rating: High): Summary data are available in this report and the accompanying Technical Report. These are presented clearly and extensively.
Clear and concise (Rating: High): Although this report overall is not really concise (approximately 100 pages), it is clearly presented. Also, the individual chapters are reported concisely.
Related to existing work in field (Rating: High): This report builds upon previous waves of this annual tracking survey that were conducted in 2018 and 2019. Hence, it relates to existing work in the field, and it extends this earlier work with additional questions.

Legal and ethical

Complies with relevant legislation (Rating: High): This report seems to comply with relevant legislation that outlines the FSA’s regulatory role, and its law enforcement function on food crimes and food hygiene.
Complies with GSR ethical guidelines (Rating: High): This survey has been undertaken in compliance with the GSR ethical guidelines.

    * External contractors are not always in a position to “anticipate future policy issues as well as addressing current ones”, “contribute to all stages of the policy and delivery process” and “deliver solutions that are viable, actionable and represent value for money”. Hence, a low or medium score reflects the limitation of using the GSR Self-Assessment tool for assessing the quality of research outputs.

FSA QAT Assessing Research Reports Checklist, FSA Small and Micro FBO Tracking Survey Wave 3 2021: Checklist 2: Assessing research reports

    Q1. Title, lead author and year
     FSA Small and Micro FBO Tracking Survey Wave 3 2021, Author: IFF Research, March 2022
    Q2. Has a clear research need been outlined?  
    Yes – fully -  A clear research outline is presented about the implications of the EU Exit on small and micro enterprises, attitudes towards FSA regulation, and trust in the FSA.
    Q3. Has a precise research question/aim been specified?  
    Yes – fully - The survey has been designed to address questions about the EU Exit on small and micro enterprises, attitudes towards FSA regulation, and trust in the FSA.
    Q4. Is the research design… 
    Longitudinal 
    Q5. Is the research method… 
    Quantitative 
    Q6. Is there a good match between the research question/aim, research design and research method?  
    Yes – fully - This survey addresses the research aims using an appropriate design and research  methods.
    Q7. Is the study population and setting specified?  
    Yes – fully -  This survey clearly specifies that the population in question is small and micro Food Business Operators (FBO) in England, Wales and Northern Ireland.
    If Q5 = Qualitative, go to Q8a. If Q5 = Quantitative, go to Q8b. If Q5 = Both, go to Q8a and Q8b.

    Q8b. Is the sampling method… 
    Stratified sampling 
    Go to Q9. 
    Q9. Is the sampling method appropriate for addressing the research question? 
Yes – fully - The sample selected allows sub-group analyses by type of FBO in the constituent countries of England, Wales and Northern Ireland.
    If Q5 = Qualitative, go to 9a. If Q5 = Quantitative, go to 9b. If Q5 = Both, go to Q9a and Q9b.
    Q9a. Is the sampling method appropriate for addressing the research question?
Yes – fully
    Q9b. Has a sample size calculation been conducted? 
    Yes – fully -  The sample sizes in England, Wales and Northern Ireland have been calculated for statistical power and are adequate to undertake analysis of different types of FBO in England, Wales and Northern Ireland. Details are presented in the text and Annex 2 of the report
    Go to Q10.
    Q10. Are the research instruments valid and reliable?  
    Yes – fully -  The research instruments have been developed using good cognitive fieldwork amongst a small subset of businesses. A total of 700 ‘mainstage 1’ interviews were conducted via a Computer-Assisted Telephone Interviewing (CATI) methodology. The data was weighted to be representative of the in-scope micro and small FBO population across England, Northern Ireland and Wales at the time of sampling. This represents good fieldwork planning and development.

    If Q5 = Qualitative, go to Q11a. If Q5 = Quantitative, go to Q11b. If Q5 = Both, go to Q11a and Q11b.
    Q11b. Is the analytical approach… 
    Time series analysis 
    Q12. Is there a good match between the analytical approach, the research method and the research question?
    Yes – fully -  The analytical approach builds upon Waves 1 and 2 of the Tracker Survey and it has included additional questions to address policy and business issues since the 2019 Tracker (and the COVID pandemic).
    Q13. Has a relevant checklist from the EQUATOR Network been used in the reporting of the results?  
    Yes – partly -  This survey does not explicitly use the EQUATOR Network checklist for surveys, but it certainly follows most of the requirements of this checklist.
    Q14. Have descriptive data on the characteristics of participants been presented?
    Yes – fully - This survey provides descriptive details of the different participants amongst the service activities sector, larger businesses, sole traders and those with a higher FHRS rating certificate.
    If Q5 = Qualitative, go to Q15. If Q5 = Quantitative, go to Q19. If Q5 = Both, go to Q15 and Q19.
    Q19. Have descriptive data on exposures/interventions and potential confounders been presented?
    Yes – partly -  This is a single wave report on a longitudinal survey and not an experimental or quasi-experimental study. It does investigate the consequences of the UK exit from the EU and, to some extent the COVID-19 pandemic, but it does not attempt statistical modelling of other potential confounders. This would require a different research design and analytical approach.
    Go to Q20.
    Q20. Have unadjusted and adjusted point estimates and confidence intervals been presented alongside statistical significance? 
No - The survey results are presented mainly as percentages of respondents’ different responses. There are no unadjusted and adjusted point estimates and confidence intervals, though the results are based on weighted data. The distributions of responses are often classified as “most likely”, “more likely than average” or “particularly likely”.
    Go to Q21. 
    Q21. Has generalisability been considered in the interpretation of the results?  
    Yes – fully -  The survey has been designed to enable findings to be generalisable across different FBO businesses and the different countries (England, Wales and Northern Ireland). At the same time the authors are careful to note where and when caution should be taken when interpreting these results due to small base size.
    Q22. Has causality been considered in the interpretation of the results? 
Not applicable - Causality is hard to establish with survey data that do not have counterfactual samples. This survey recognises this and merely reports descriptive percentages of responses to a wide range of questions. The report does not go beyond analysis that is possible with survey data alone.
    Q23. Has uncertainty been considered in the interpretation of the results? 
    Yes – partly  - This survey acknowledges where the results are sometimes based on small sub-samples and, hence, where the results are uncertain. It does not over-interpret the findings of the survey.
    Q24. Has a clear study conclusion been presented?  
Yes – fully - The Conclusions section of this report is clear and cautious. It is balanced and indicates where the survey responses are mixed in terms of the consequences of (for instance) the UK’s exit from the EU and the resultant impacts on business activity.
     


    Outputs: Value of the FHRS: Businesses, Consumers and Local Authorities

    Authors: Ipsos

    Date: May 2023

    Assessment of Value of the FHRS: Businesses, Consumers and Local Authorities using GSR code

    Rigorous and impartial

Based on sound methodology and established scientific principles (Rating: High): IPSOS and FSA researchers worked closely together on this project to establish the best and most feasible combination of online workshops, deliberative engagement, in-depth interviews and scenario setting to elicit the views of businesses, consumers and local authorities about the value and workings of the FHRS. This generated surveys with appropriate sample sizes and representativeness that capture the diversity of businesses, consumers and LAs. There is good evidence of careful methodological planning by IPSOS and the FSA researchers.
Quality assured (Rating: Medium): This survey was quality assured internally by FSA social researchers at the commissioning, analytical and reporting stages. There was no external quality assurance of these three surveys. This is regrettable given the importance of independent peer review of research outputs. Other research outputs of the FSA social science team have been independently peer reviewed (footnote 1).
Based on best design, given constraints (Rating: High): Given that the aim of this assessment was “to understand in more detail how Local Authorities (LAs), businesses and consumers feel about the current FHRS”, and that it had to capture views of the FHRS in England, Wales and NI, the chosen qualitative design was appropriate and well thought-through. Sample sizes were also appropriate for qualitative research, with over-sampling where necessary. The range of methods (online workshops, in-depth interviews, scenario setting etc.) was also well chosen.
Conclusions are clearly and adequately supported by data (Rating: High): The conclusions are presented clearly and succinctly and are adequately supported by the data.

    Relevant

Anticipates future policy issues as well as addressing current ones (Rating: High): This qualitative survey contributes to the FSA’s responsibility for food safety across England, Wales and NI. It provides an evidence base for current and future discussion of policy issues surrounding the FHRS.
Answers clear and researchable questions (Rating: High): This survey does indeed address clear and researchable questions about the value of the FHRS to businesses, consumers and local authorities, and possible areas of change for the regulatory approach of the FSA.
Contributes to all stages of the policy and delivery process (Rating: High): This survey certainly informs how the current FHRS is operating in practice and how it might be modified going forward. By doing so from the perspective of businesses, consumers and local authorities, it has the potential to contribute to the appraisal, implementation and delivery processes of the FHRS.
Delivers solutions that are viable, actionable and represent value for money (Rating: Medium): This survey does not deliver solutions; that is not really its purpose. However, it does identify and test empirically views on some potential changes to the FHRS (for example, using third-party independent audits and internal audits, the use of remote inspections, reduced physical inspections, and assessing supermarkets and other large or multi-site businesses as a whole business). These are viable and actionable policy options, for which the value for money would need to be assessed separately using economic appraisal methods.

    Accessible

Published (Rating: High): These three reports on the FHRS have been published by the FSA.
Data made available where possible (Rating: Medium): There is a great deal of summary data in each of the three sub-reports (businesses, consumers, local authorities) on the responses to the online workshops and in-depth interviews. These are generally well presented, with key themes, concepts and principles having been identified. There is a summary of the sampling methods used for the Consumers’ survey (Appendix 1), but not for the businesses or local authorities’ surveys. Greater detail about sampling and the wider methodology was readily forthcoming from the FSA social research team on request. It would have been advisable to make these background methodological details publicly available on the FSA website, preferably in the form of a Technical Report, as is the case with some of the other research outputs of the FSA social research team.
Clear and concise (Rating: High): Given the breadth of data collected across three samples (businesses, consumers, local authorities), the three reports on the Value of the FHRS are both clear and concise. The Executive Summary and the Summary Conclusions are also well presented in a clear and concise manner.
Related to existing work in field (Rating: Low): Apart from the statement at the beginning of each report indicating that “as part of its work on the Achieving Business Compliance (ABC) programme, the FSA wanted to understand in more detail how Local Authorities (LAs), businesses and consumers feel about the current Food Hygiene Rating Scheme (FHRS)”, there is little or no link to other relevant research work on the FHRS. This does not really lessen the value of these three reports, but it would have been useful to indicate the other related FSA research projects on the FHRS (e.g. the ‘FHRS Consumer Attitudes Tracker’ surveys, and the ‘Qualitative research to explore consumer attitudes to food sold online’).

Legal and ethical

Complies with relevant legislation (Rating: High): These reports comply with GDPR legislation.
Complies with GSR ethical guidelines (Rating: High): These three surveys do comply with GSR ethical guidelines.

FSA QAT Assessing Research Reports Checklist, FSA Food Hygiene Rating Scheme Surveys: Checklist 2: Assessing research reports

    Q1. Title, lead author and year
Value of the FHRS: Businesses, Consumers and Local Authorities. Lead author: IPSOS, May 2022.

    Q2. Has a clear research need been outlined?  
    Yes – fully -  The research need for these three surveys was expressed in terms of the FSA wanting to a) understand in more detail how Local Authorities (LAs), businesses and consumers feel about the current FHRS, and b) capture consumer views on potential changes to the regulatory approach.
    Q3. Has a precise research question/aim been specified?  
    Yes – fully - For each of the three surveys (businesses, consumers, local authorities) clear and precise questions have been made from the outset.
    Q4. Is the research design… 
    Cross-sectional - Comparative (across England, Wales and Northern Ireland). 
    Q5. Is the research method… 
    Qualitative 
    Q6. Is there a good match between the research question/aim, research design and research method?  
    Yes – fully -  The research design and qualitative methods are a good match with the aims of the research and the questions addressed. The sections on methodology and sampling indicate an appreciation of how to undertake a qualitative survey.
    Q7. Is the study population and setting specified?  
    Yes – fully -  The study population includes the populations of England, Wales and Northern Ireland. The samples are well designed to capture these different populations and their distribution by key demographic factors.
    If Q5 = Qualitative, go to Q8a. If Q5 = Quantitative, go to Q8b. If Q5 = Both, go to Q8a and Q8b.

    Q8b. Is the sampling method… 
Purposive sampling.
    Go to Q9. 
    Q9. Is the sampling method appropriate for addressing the research question? 
    Yes – fully - The samples for each survey – of businesses, consumers and local authorities - are purposively selected to reflect these entities and different characteristics of each (e.g. type and size of business; demographic factors; local authority areas).
    If Q5 = Qualitative, go to 9a. If Q5 = Quantitative, go to 9b. If Q5 = Both, go to Q9a and Q9b.
    Q9a. Is the sampling method appropriate for addressing the research question?
Yes – fully - The sample sizes are appropriate for qualitative research, and there is some over-sampling of businesses with a low FHRS score (1-3), which represent a small proportion of businesses.


    Go to Q10.
    Q10. Are the research instruments valid and reliable?  
    Yes – fully.

    If Q5 = Qualitative, go to Q11a. If Q5 = Quantitative, go to Q11b. If Q5 = Both, go to Q11a and Q11b.
    Q11b. Is the analytical approach… 
    Thematic analysis 

Go to Q12.
    Q12. Is there a good match between the analytical approach, the research method and the research question?
    Yes – fully - The analytical approach has been designed and executed to identify the themes, concepts and principles of respondents’ experiences of the FHRS. It has done this well.
    Q13. Has a relevant checklist from the EQUATOR Network been used in the reporting of the results?  
    Yes – partly -  Not explicitly, though the reporting of these three surveys follows closely the structure of the EQUATOR Network guidance document for qualitative research (O'Brien B C, Harris I B, Beckman T J, et al, 2014). 
    Q14. Have descriptive data on the characteristics of participants been presented?
    Yes – partly - There is less descriptive data on the characteristics of participants in the final reports of these three surveys than in the background methodology responses provided by the contractors for the FSA research team. It would be good practice to make available these methodological details perhaps in the form of a separate Technical Report.
    Q15. Have two or more researchers been involved in the analysis process (e.g., through double coding)?
    Cannot say. This information is unavailable
    Go to Q16.
    Q16. Is there consistency between the data presented and the themes?
    Yes – fully - The themes, concepts and principles underlying respondents’ views and experiences of the FHRS are fully consistent with the data presented
    Go to Q17.
    Q17. Have similarities and differences between participants been explored (e.g., negative cases)?
    Yes – fully - The analysis of the qualitative data has identified positive and negative themes in respondents’ views and experiences of the FHRS. It has also shown that these different views and experiences reflect the background characteristics of respondents (e.g. type and size of business, urban and rural setting, etc)
    Go to Q18.
    Q18. Did participants provide feedback on the findings (i.e., member checking)?
    No - There is no indication that the analysis and findings of these three reports have been fed back to the businesses, consumers or local authorities involved. Further information on this would be helpful.
    Go to Q21.
    Q21. Has generalisability been considered in the interpretation of the results?  
    Yes – fully - The reports have revealed generalised findings within the context of a qualitative survey, for example, identifying themes, concepts and principles that were revealed across nations and sub-groups of the population. There was also attention given to where there was a diversity of views and experiences about the FHRS. The balance between generalisability and context specificity has been presented rather well. 
    Q22. Has causality been considered in the interpretation of the results? 
Not applicable - Establishing causality is not appropriate with a design that is neither experimental nor quasi-experimental. Thematic analysis is what is offered and what is appropriate.
    Q23. Has uncertainty been considered in the interpretation of the results? 
    Yes – partly - The results of these three surveys have been reported with due caution.  Uncertainty is indicated to some extent given the diversity and the context specificity of the findings, but this is perhaps best expressed as caution rather than uncertainty.
    Q24. Has a clear study conclusion been presented?  
Yes – fully - All three studies that make up this overall review of the FHRS have presented clear conclusions.

    Outputs: Testing the impact of overt and covert ordering interventions on sustainable consumption choices: a randomised controlled trial on an online supermarket.

    Authors: Kantar Public's Behavioural Practice

    Date: May 2022

    Assessment of Research outputs using GSR code

    Rigorous and impartial

Based on sound methodology and established scientific principles (Rating: High): This trial is well designed with good attention to sample size, appropriate comparison groups (three-way), valid and reliable indicators and other research instruments, and is well reported with appropriately cautious interpretation.
Quality assured (Rating: Medium): This report was quality checked internally by two FSA social researchers and one external expert. The format of the QA process consists of reviewers’ comments using track changes within the margins of the final report. A separate QA report, preferably by two external reviewers, would be more rigorous and appropriate.
Based on best design, given constraints (Rating: High): An RCT design is the best approach to establish the net effects of an intervention over a counterfactual. Hence this was the best design. Given the budget constraints (mentioned by the contractors), this trial was undertaken to a high standard.
Conclusions are clearly and adequately supported by data (Rating: High): The conclusions of the trial are presented clearly as ‘Key Findings’ in the Executive Summary (page 6), and they are adequately supported by data. It would have been a good idea to present the ‘Key Findings’ again at the end of the full report.

    Relevant

Anticipates future policy issues as well as addressing current ones (Rating: Low): The report locates the topic of overt and covert ordering interventions on sustainable consumption choices within the context of the UK government’s National Food Strategy (Dimbleby, 2021) and the need “to understand how interventions in online shopping environments affect consumer choices in relation to the sustainability of products”. Apart from this very brief mention of the policy context there is no anticipation of future or current policy issues. This does not appear to have been part of the trial’s specification.
Answers clear and researchable questions (Rating: High): This report poses the central research question as “how interventions in online shopping environments affect consumer choices in relation to the sustainability of products”. More specifically it asks “whether a specific choice architecture intervention – displaying products in an ascending order of their carbon footprint – in an online supermarket environment can shift consumer choices towards more sustainable options compared to when products are randomly ordered”.
Contributes to all stages of the policy and delivery process (Rating: Low): By indirect implication the findings of this report might contribute to how food policy and delivery might be developed, but the contribution is implicit and opaque. It does not contribute to all stages of the policy and delivery process.
Delivers solutions that are viable, actionable and represent value for money (Rating: Low): The report is on the technical aspects and key findings of the trial. It does not address solutions that are viable, actionable and represent value for money.

    Accessible

Published (Rating: High): This report of the online supermarket trial was published in May 2022. It has a good and readable Executive Summary as well as a full report of its methods and findings.
Data made available where possible (Rating: High): Summary data tables are fully presented in the body of the report and in its Appendices.
Clear and concise (Rating: High): Given the complexity and detail of the intervention, and the trial methodology and procedures, the report can be considered clear and concise. Technical details are presented relatively clearly and the Executive Summary is very clear and concise.
Related to existing work in field (Rating: Medium): There is a brief mention of “the few existing studies, which were based on behaviour in bricks-and-mortar environment using hard-copy menus” and of studies of prompting people to make healthier food decisions using pop-ups. Otherwise, there is little or no reference to existing work in the field.

Legal and ethical

Complies with relevant legislation (Rating: High): This report complies with GDPR legislation.
Complies with GSR ethical guidelines (Rating: High): This trial complies with GSR ethical guidelines.

    * External contractors are not always in a position to “anticipate future policy issues as well as addressing current ones”, “contribute to all stages of the policy and delivery process” and “deliver solutions that are viable, actionable and represent value for money”. Hence, a low or medium score reflects the limitation of using the GSR Self-Assessment tool for assessing the quality of research outputs.

    FSA Quality Assurance Toolkit - Online Supermarket Trial: Checklist 2: Assessing research reports

    Q1. Title, lead author and year 

    Testing the impact of overt and covert ordering interventions on sustainable consumption choices: a randomised controlled trial.  Kantar Public’s Behavioural Practice. May 2022

    Q2. Has a clear research need been outlined?  
Yes – fully - This research has been undertaken to address the need for evidence on how consumers respond to different ways of presenting information about the sustainability of food products. It responds to the fact that online supermarkets constitute an increasingly large share of grocery shopping: 12.6% of grocery sales were made online in March 2022, compared with 8.0% three years earlier.
    Q3. Has a precise research question/aim been specified?  
Yes – fully - The central research question was how interventions in online shopping environments affect consumer choices in relation to the sustainability of products. Specifically, does overt, covert or no ordering of information about the sustainability of food products make a difference to consumer choice?
    Q4. Is the research design… 
    Experimental.

    Q5. Is the research method… 
    Quantitative 
    Q6. Is there a good match between the research question/aim, research design and research method?  
    Yes – fully -   The three-arm between-subjects design with randomisation is an appropriate research design for the central research question of the study. 
    Q7. Is the study population and setting specified?  
    Yes – fully -   The study population is specified as online grocery shoppers who are aged over 18 in England, Wales, and Northern Ireland. They are then selected by population characteristics of age, gender and ethnic group (White, Asian, Black, Mixed, Other).
    If Q5 = Qualitative, go to Q8a. If Q5 = Quantitative, go to Q8b. If Q5 = Both, go to Q8a and Q8b.

    Q8b. Is the sampling method… 
    Quota sampling. 
    Go to Q9. 
    Q9. Is the sampling method appropriate for addressing the research question? 
    Yes – fully -  The authors note “as no official statistics were available on the specific demographic breakdown of online grocery shoppers in the targeted areas, we used quotas plus screening questions to get a sample close to a representative sample of the target group”.
    If Q5 = Qualitative, go to 9a. If Q5 = Quantitative, go to 9b. If Q5 = Both, go to Q9a and Q9b.
    Q9a. Is the sampling method appropriate for addressing the research question?
Yes – fully - The authors note that “the panel provider sent out new invites to potential participants in batches until the planned sample size was reached”. Large sample sizes were collected that were appropriate for three-way randomisation.

    Go to Q10.

    Q9b. Has a sample size calculation been conducted?

Yes – fully - Sample size was calculated based on a power simulation, run using a logistic regression model. The authors report a power of 0.999 to detect a difference of 8%, and a power of 0.843 to detect a difference of 5%.
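For illustration only, the Python sketch below shows a simulation-based power calculation of this general kind using the statsmodels library; the simplified two-arm set-up, baseline choice probability, effect size, sample size and number of simulations are all assumptions made for the sketch and are not the contractor’s specification.

import numpy as np
from statsmodels.api import Logit, add_constant

rng = np.random.default_rng(42)

def simulated_power(n_per_arm=2500, p_control=0.40, effect=0.05, n_sims=200, alpha=0.05):
    # Estimate power to detect `effect` by repeatedly simulating the trial and
    # fitting a logistic regression of choice on treatment arm.
    significant = 0
    for _ in range(n_sims):
        arm = np.repeat([0, 1], n_per_arm)                      # 0 = control, 1 = intervention
        p = np.where(arm == 1, p_control + effect, p_control)   # assumed choice probabilities
        y = rng.binomial(1, p)                                  # simulated binary choices
        fit = Logit(y, add_constant(arm)).fit(disp=0)           # logistic regression on arm
        significant += fit.pvalues[1] < alpha                   # intervention term significant?
    return significant / n_sims

print(f"Estimated power: {simulated_power():.2f}")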
    Q10. Are the research instruments valid and reliable?  
Yes – fully - Six products in each product category were used because this study focuses on the ordering of products, namely position effects, and six products should give enough variation in terms of product position. Given a fixed budget, there is a trade-off between the number of products in each category and the number of categories. Participants were also asked about their environmental concern, attitudes towards nudges, normative attitudes towards shopping sustainably, whether they eat meat, and demographics.

    If Q5 = Qualitative, go to Q11a. If Q5 = Quantitative, go to Q11b. If Q5 = Both, go to Q11a and Q11b.
    Q11b. Is the analytical approach… 
    Linear Regression. 

Go to Q12.
    Q12. Is there a good match between the analytical approach, the research method and the research question?
    Yes – fully. 
    Q13. Has a relevant checklist from the EQUATOR Network been used in the reporting of the results?  
Yes – partly - Although a relevant checklist from the EQUATOR Network was not used, the reporting of the results was well structured and relevant.
    Q14. Have descriptive data on the characteristics of participants been presented?
    Yes – fully - Baseline demographic characteristics are presented fully in Appendix D. 

Q19. Have descriptive data on exposures/interventions and potential confounders been presented?

    Yes - fully -  Descriptive data on exposures/interventions and potential confounders have been presented in the ‘Procedure’ section and in Appendix B (Products Lists). 

    Go to Q20.

    Q20. Have unadjusted and adjusted point estimates and confidence intervals been presented alongside statistical significance?

Yes – fully - Confidence intervals (CI) are reported around odds ratios (OR) throughout the data analysis.

    Go to Q21. 

    Q21. Has generalisability been considered in the interpretation of the results?  
    Yes – fully - The authors note that “the results still come from an online experiment completed by panellists, which potentially threatens the ability to generalise from our results to the real-life situation we are studying (external validity) and which could be better dealt with using a field trial”. They also mention potential threats to generalisability at various points in the analysis and discussion sections of the report.
    Q22. Has causality been considered in the interpretation of the results? 
    Yes fully - A causal pathway linking inputs to activities, outcomes and longer-term outcomes has been developed. This is commendable. Caution is advised by the authors about interpreting the ‘real world’ external validity and causality of the findings given the trial’s simulated nature.
    Q23. Has uncertainty been considered in the interpretation of the results? 
Yes – fully - The authors note: “We used logistic mixed-effects models, which included separate error terms for participant and product category, allowing us to incorporate additional uncertainty in the estimates of intervention effects associated with variation between participants and categories.”
    Q24. Has a clear study conclusion been presented?  
Yes – fully - The report concludes with a list of ‘Key Findings’ and an overall conclusion that “there was no effect of the covert ordering intervention on the probability of choosing more sustainable products versus less sustainable products, compared with the control arm (OR = 0.97, 95% CI 0.88-1.07, p-value = 0.533)”.
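For illustration only, the Python sketch below shows how an odds ratio and its 95% confidence interval of this kind are obtained from a fitted logistic regression using the statsmodels library; the data are invented, a simple non-mixed model is used in place of the report’s mixed-effects specification, and the output will not reproduce the report’s figures.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
arm = rng.integers(0, 2, size=4000)                    # 0 = control, 1 = covert ordering (hypothetical)
y = rng.binomial(1, np.where(arm == 1, 0.41, 0.40))    # 1 = chose a more sustainable product (hypothetical)

fit = sm.Logit(y, sm.add_constant(arm)).fit(disp=0)
odds_ratio = np.exp(fit.params[1])                     # odds ratio for the intervention arm
ci_low, ci_high = np.exp(fit.conf_int()[1])            # 95% CI transformed to the odds-ratio scale

print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")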

    Outputs: Kitchen Life 2: Literature Review of Food Safety & Hygiene Behaviours in Domestic & Commercial Kitchens.

Authors: Weller, J., Kaptan, G., Bhandal, R., & Battachery, D., Leeds University Business School and Basis Social

    Date: May 2022

    Assessment of the Kitchen Life 2: Literature Review using GSR Code

    Rigorous and impartial

    Rigorous and impartial Rating Comments
    Based on sound methodology and established scientific principles Medium This literature review falls short of a systematic review in that it uses a limited search string and only two databases (Scopus and Web of Science). It also does not appear to use double screening or double data extraction. In fairness, this output is presented as a literature review and not a systematic review, but this is less that a sound methodology or established scientific practice nowadays.
    Quality assured Medium This review was quality assured internally by the FSA social research team and by members of the Social Science Advisory Board. It was not reviewed by wholly external peer reviewers using the usual standards of practice.
    Based on best design, given constraints Low Given that this review had over eight months to be completed, the best design would have been a narrative systematic review following Cochrane/Campbell methodological procedures and guidelines. Alternatively, it could have used Rapid Evidence Assessment methodology, which would have enhanced its search and analytical approach.
    Conclusions are clearly and adequately supported by data Medium This review does not have a ‘Conclusions’ section, though the section on ‘Recommendations’ summarises the key findings and ‘take home’ messages clearly and well. These are supported by the data collected, but given the somewhat narrow search procedures there may be some risk of bias in what is presented.

    Relevant

    Relevant Rating Comments
    Anticipates future policy issues as well as addressing current ones High This review focuses on key behaviours relating to food safety in domestic and business kitchens within the context of the FSA’s policy response to the COVID pandemic. Its ‘key recommendations’ about what people need to know and do have considerable implications for current and future policy issues.
    Answers clear and researchable questions High  This review seeks “to identify the key behaviours relating to food safety that occur in domestic and business kitchens, as well as the factors that may reduce the likelihood to enact recommended food safety & hygiene behaviours”. This would appear to be its main research question and it is researchable.
    Contributes to all stages of the policy and delivery process Medium This review has implications for the further development and implementation of the FSA’s policies on food hygiene. To this extent it does contribute to the policy process. However, given its limitations with searching for appropriate evidence its findings should be used with some degree of caution.  
    Delivers solutions that are viable, actionable and represent value for money Medium This review identifies a number of activities in domestic and business kitchens that may require action to improve food hygiene and ensure healthier eating. It does not “deliver solutions”, nor is this its purpose or role. The latter is surely the responsibility of policy makers at the FSA and local authority inspectors.

    Accessible

    Accessible Rating Comments
    Published High This literature review was published by the FSA in October 2021. 
    Data made available where possible Low This review does not include a ‘flow diagram’ of how the initial search yields were reduced to the studies that were finally included. There is no list of excluded studies with the reasons for their exclusion. The studies included in the review are well documented as part of the narrative of the report.  There is no attempt to rate the included studies in terms of their strength of evidence.
    Clear and concise High This research output is presented clearly and reasonably concisely. It is also easy to read.
    Related to existing work in field High The content of this report is by definition related to existing research and evaluation in the field, including meta-analyses.
    Legal and ethical Rating Comments
    Complies with relevant legislation High This report seems to comply with relevant legislation that outlines the FSA's regulatory role, and its law enforcement function on food crimes and food hygiene. 
    Complies with GSR ethical guidelines High This review has been undertaken in compliance with the GSR ethical guidelines. 

    FSA QAT Assessing Research Reports Checklist
    Kitchen Life 2: Literature Review of Food Safety and Hygiene Behaviours Checklist 2 – Assessing research reports

    The FSA’s Quality Assessment Toolkit does not include a checklist for assessing the quality of literature reviews. It does provide links to the ‘Preferred Reporting Items for Systematic Reviews and Meta-Analyses’ (PRISMA) guidelines and checklist via the Equator Network. The ‘issues for consideration’ are based on the PRISMA guidelines.

    Outputs: Psychologies of Food Choice: Public views and experiences around meat and dairy consumption

    Authors: Caitlin Connors, Melanie Cohen, Sam Saint-Warrens, Fan Sissoko, Francesca Allen, Harry Cerasale, Elina Halonen, Nicole Afonso Alves Calistri & Claire Sheppard.

    Date: October 2021

    Overall Rating: A high quality research output.

    Assessment of the Psychologies of Food Choice:
    Public views and experiences around meat and dairy consumption using GSR Code

    Rigorous and impartial

    Rigorous and impartial Rating Comments
    Based on sound methodology and established scientific principles Medium This study is based on ‘remote’ ethnography and mixed methods iterative qualitative research, and has been designed, executed and reported in accordance with established principles. Given the length and format of its final report it does not present the amount of detailed observational and interview data one might expect from this approach. Nonetheless, it does present summary data from which the main themes of the report are generated.
    Quality assured Medium The report was quality assured by the FSA social research team with some external peer review comments using track changes.
    Based on best design, given constraints High This is more of a mixed-methods iterative qualitative investigation than what is commonly considered ethnography, but its design is appropriate and well executed.
    Conclusions are clearly and adequately supported by data High This study does indeed meet this dimension. It makes clear conclusions, draws attention to the non-linear complexity of a capability-motivation-opportunity behavioural model, and supports claims with well collected and well-reported qualitative data.

    Relevant

    Relevant Rating Comments
    Anticipates future policy issues as well as addressing current ones Medium To some extent this study meets this standard, but this was not its primary task. The team had ‘senior expertise in food and public policy research’ and it anticipates that “overly linear and simplistic policy responses are inappropriate given the complexity of the capability-motivation-opportunity behavioural model”. This point is valid.
    Answers clear and researchable questions High The purpose of the study is clear. It “aims to explore the situational, social, emotional and psychological roles of meat and dairy and how these influence buying and eating decisions”. It asks: “what motivates consumers to cut down on or cut out meat and dairy, and their experiences in doing so".
    Contributes to all stages of the policy and delivery process Medium This study does not intend to contribute to all stages of the policy and delivery process. That is not its primary purpose. It does, however, address some of the implications of the complexity of the capability-motivation-opportunity behavioural model for policy making. A one-size-fits-all policy approach is inappropriate. The policy and delivery process needs to be more nuanced and contextualised. 
    Delivers solutions that are viable, actionable and represent value for money Medium This study does not meet this standard directly. It does, however, provide some ‘Implications for Policymakers’ based on the “rich lived experience evidence on what shapes meat and dairy consumption behaviours for a wide variety of UK residents”. It delivers knowledge of the potential barriers and behavioural facilitators that influence meat and dairy consumption.

    Accessible

    Accessible Rating Comments
    Published High Freely available and downloadable as a PDF from the FSA website. 
    Data made available where possible Medium Given the length of the published report the authors were unable to give detailed transcript data or field notes. They do, however, provide good summary data on fieldwork observations and participants’ responses in interviews and groups. 
    Clear and concise High The report of this study is well presented in a clear and concise manner. 
    Related to existing work in field High The report identifies and uses other existing work in the field, including research on the COM-B Model of behaviour. It also has good linkage to the early literature review findings of the University of Bath team.
    Legal and ethical Rating Comments
    Complies with relevant legislation High This report seems to comply with relevant legislation that outlines the FSA's regulatory role, and its law enforcement function on food crimes and food hygiene. 
    Complies with GSR ethical guidelines High This study complies with GSR ethical guidelines. 

    * External contractors are not always in a position to “anticipate future policy issues as well as addressing current ones”, “contribute to all stages of the policy and delivery process” and “deliver solutions that are viable, actionable and represent value for money”. Hence, a low or medium score reflects the limitation of using the GSR Self-Assessment tool for assessing the quality of research outputs.

    FSA QAT Assessing Research Reports Checklist
    Psychologies of Food Choice: Public views and experiences around meat and dairy consumption Checklist 2 – Assessing research reports

    Q1. Title, lead author and year 

    Psychologies of Food Choice: Public views and experiences around meat and dairy consumption, Connors, C., October 2021

    Q2. Has a clear research need been outlined?  
    Yes – fully - The research need for this study was to fill some gaps in the existing knowledge of the drivers of behaviour and the triggers to try eating differently.
    Q3. Has a precise research question/aim been specified?  
    Yes – fully -  The precise aim of the study was “to explore the situational, social, emotional and psychological roles of meat and dairy and how these influence buying and eating decisions”.
    Q4. Is the research design… 
    Ethnographic and behavioural.

    Q5. Is the research method… 
    Qualitative 
    Q6. Is there a good match between the research question/aim, research design and research method?  
    Yes – fully - The use of ethnographic and mixed-methods qualitative research to study a ‘capability-motivation-opportunity’ model of behaviour is a good match.
    Q7. Is the study population and setting specified?  
    Yes – fully -   The population is food consumers in England. The sample was 33 participants, and the setting was at-home tasks and online interviews and workshops.
    If Q5 = Qualitative, go to Q8a. If Q5 = Quantitative, go to Q8b. If Q5 = Both, go to Q8a and Q8b.

    Q8a. Is the sampling method… 
    Purposive sampling. 

    Q8b. Is the sampling method...

    Quota sampling - Structured quotas and screeners were agreed with the FSA team. 
    Go to Q9. 
    Q9. Is the sampling method appropriate for addressing the research question? 
    Yes – fully -  The sample represented a broad cross section of the UK: including residents of England, Wales and Northern Ireland, and representing a range of variables including age, gender, lifestage, household composition, meat consumption habits, ethnicity and religion.
    If Q5 = Qualitative, go to 9a. If Q5 = Quantitative, go to 9b. If Q5 = Both, go to Q9a and Q9b.
    Q9a. Is the sampling method appropriate for addressing the research question?
    Yes – fully - The sampling was based on the questions addressed and on expertise in recruiting complex quotas for rigorous social policy research.

    Go to Q10.

    Q9b. Has a sample size calculation been conducted?

    Yes – fully - Sample size was calculated based on a power simulation, run using a logistic regression model. The authors used a power of 0.999 to detect a difference of 8%, and a power of 0.843 to detect a difference of 5% (an illustrative sketch of this kind of power simulation follows this checklist).
    Q10. Are the research instruments valid and reliable?  
    Yes – fully - The research instruments included a set pro-forma using the COM-B structure, structured reviews of data, a record of ‘key findings’ from brainstorm sessions, and structured analysis documents. These are valid and reliable for iterative qualitative research.

    If Q5 = Qualitative, go to Q11a. If Q5 = Quantitative, go to Q11b. If Q5 = Both, go to Q11a and Q11b.

    Q11a. Is the analytical approach...

    Thematic analysis.

    Go to Q12. 
    Q12. Is there a good match between the analytical approach, the research method and the research question?
    Yes – fully -  The analytical approach and the methods used for data collection and analysis are appropriate for this type of ethnographic and iterative qualitative research. 
    Q13. Has a relevant checklist from the EQUATOR Network been used in the reporting of the results?  
    No - The study would have benefited from using the ‘standards for reporting qualitative research: a synthesis of recommendations’ (in the EQUATOR Network). The reporting that is provided meets many of these standards.

    Q14. Have descriptive data on the characteristics of participants been presented?
    Yes – fully - These descriptive data are presented on page 4 of the report. 

    If Q5 = Qualitative, go to Q15. If Q5 = Quantitative, go to Q19. If Q5 = Both, go to Q15 and Q19.

    Q15. Have two or more researchers been involved in the analysis process (for example, through double coding)?
    Yes – fully - The team shared and compared emerging findings throughout fieldwork (via WhatsApp). After each fieldwork week, the full research team (sometimes including FSA colleagues) conducted a structured review of data, noting new emerging questions or gaps and comparing audience groups.
    Go to Q16.
    Q16. Is there consistency between the data presented and the themes?
    Yes – partly - The data presented were mostly extracts of verbatim data from interviews and workshops. These were generally consistent with the themes identified, though there is an element of de-contextualisation of these verbatim data.

    Go to Q17.

    Q17. Have similarities and differences between participants been explored (e.g., negative cases)?
    Yes – fully -  The authors found that usage of definitions of meat and dairy alternatives labels was wildly inconsistent between participants, and often even within single participants’ language. 
    Go to Q18.
    Q18. Did participants provide feedback on the findings (i.e., member checking)?
    Yes – partly - The workshops allowed some degree of participant feedback.

    Go to Q21.


    Q21. Has generalisability been considered in the interpretation of the results?  
    Yes – fully - The main generalised finding of this study was that there is considerable variety, difference and conflict in understanding food products and in dietary choices. The study concludes that a one-size-fits-all policy approach is inappropriate for understanding how to influence food consumption behaviour.
    Q22. Has causality been considered in the interpretation of the results? 
    Yes – partly and indirectly - Causality in the interpretation of the results is not mentioned explicitly, nor is it the focus of ethnographic research. However, the observation that “not one participant in this work reported a ‘clear line’ of behaviour across the COM-B framework”, and that “a one-size-fits-all policy approach is inappropriate for understanding how to influence food consumption behaviour”, both suggest that there is no direct causality between drivers/triggers and food consumption behaviour.
    Q23. Has uncertainty been considered in the interpretation of the results? 
    Yes – partly -  The corollary of the above observations (Q22) is that there is uncertainty in the relationship between drivers and outcomes of behaviour. 
    Q24. Has a clear study conclusion been presented?  
    Yes – fully - The study concludes that a one-size-fits-all policy approach is inappropriate for understanding how to influence food consumption behaviour. 
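    The answer to Q9b above describes a simulation-based sample size calculation. As an illustration only, and not the authors’ actual procedure, the Python sketch below estimates power by repeatedly simulating two trial arms, fitting a logistic regression and counting how often the assumed difference is detected at the 5% significance level. The baseline rate, assumed difference, arm size, number of simulations and function name are all hypothetical.

    # Illustrative sketch of a simulation-based power calculation - not the authors' code.
    # All numbers (baseline rate, difference, arm size, number of simulations) are placeholders.
    import numpy as np
    import statsmodels.api as sm

    def simulated_power(n_per_arm, p_control, diff, n_sims=500, alpha=0.05, seed=1):
        """Estimate power: the share of simulated trials in which a logistic
        regression detects a difference of `diff` between arms at level `alpha`."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(n_sims):
            arm = np.repeat([0, 1], n_per_arm)                   # 0 = control, 1 = intervention
            p = np.where(arm == 1, p_control + diff, p_control)  # assumed response probabilities
            y = rng.binomial(1, p)                               # simulated binary outcomes
            X = sm.add_constant(arm.astype(float))
            result = sm.Logit(y, X).fit(disp=False)
            if result.pvalues[1] < alpha:                        # p-value for the arm coefficient
                hits += 1
        return hits / n_sims

    # For example, the power to detect an 8-percentage-point difference from a 50% baseline:
    print(simulated_power(n_per_arm=600, p_control=0.50, diff=0.08))

    Larger assumed differences or larger samples give higher estimated power, which is how the same design can yield different power figures for the 8% and 5% differences mentioned above.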

    Department: Food Standards Agency (Social Science Team)

    Name of Head of Profession: Joanna Disson

    Date completed: April 2023

    How to complete the template

    Red, Serious concerns

    • you have very limited/no control over application of this indicator and/or
    • systems not in place or in development and/or
    • you know or suspect that practice is highly variable

    Amber, Development area

    • you have some control over application of this indicator and/or
    • procedures may be in development and/or
    • practice is reasonably consistent, but you may have some concerns

    Green, Strong

    • you have sufficient control over the application of this indicator and/or
    • you have clear procedures/guidance in place (if appropriate) and/or
    • you are confident that these are known and applied most of the time

    GSR Products

    Rigorous and Impartial

    • based on sound methodology and established scientific principles

    • quality assured
    • based on best design, given constraints
    • conclusions are clearly and adequately supported by data
    Indicators Self-assessment rating Evidence to support assessment rating  Issues or risks Good practice
    Project design
    RED: Projects not formally reviewed (either internal or external review) at the design stage of a research project.
    AMBER: Some projects are reviewed at design stage, but practice is inconsistent or processes in development.
    GREEN: Processes in place for appropriate formal review – internal or external to organisation - at design stage of all projects to ensure that decisions on methods and methodology represent the best options, given available budgets and time constraints.
     
    Amber to Green

    Project specs are always reviewed internally – at least by G7, often G6.

    External peer reviewers may be appointed at the beginning of many projects, and review the project specs.

    This isn’t currently applied across every single project: a decision is made on a case-by-case basis subject to the pace, complexity and purpose of the work.

    Risk that the quicker turnaround project designs aren’t reviewed to the same extent as other projects. This may be appropriate as there is a need for proportionality, but there could be a more consistent, formal process to decide which projects need external peer review, and for which internal review is sufficient.

    Processes are in place, for example the ACSS QA Gateway for (almost) all new projects; there are some exceptions to this (for example, very low value and/or quick-turnaround survey work).

    The new ACSS QA gateway process may bring this up to a “green”, by ensuring that new projects are externally reviewed at design stage.


    Recently have begun uploading project plans to the Open Science Framework: https://osf.io/

    Quality assurance of outputs

    RED: No formal procedures in place to quality assure outputs.
    AMBER: Processes in place for internal quality assurance but no formal mechanisms in place to ensure that outputs are appropriately independently reviewed.
    GREEN: Processes and guidance in place to advise on appropriate quality assessment at completion, including external and internal peer review as and when necessary, to ensure that methods and analysis have been rigorously executed, and conclusions are clearly
    and adequately supported by data.

    Amber to Green

    Most project outputs are externally peer-reviewed by the ACSS or the FSA’s Register of Experts.

    This isn’t currently applied across every single project: a decision is made on a case-by-case basis subject to the pace, complexity and purpose of the work. When there is sufficient capability in- house, outputs are quality assured internally.
    Some work has its own Steering / Advisory Group.
    Quality assurance also sought from cross- governmental colleagues working on similar issues (for example, behavioural Handwashing project review by cross-government handwashing group)
     

    -

    Register of Specialists and ACSS demonstrate well-established mechanisms for seeking peer review.

    The new UCL Quality Assurance Toolkit is a further opportunity to embed good practice.
     

    Overall rating (1 to 10): 8

    Relevant

    • anticipates future policy issues as well as addressing current ones
    • answers clear and researchable questions 
    • contributes to all stages of the policy and delivery process
    • delivers solutions that are viable, actionable and represent value for money.
    Indicators Self-assessment rating Evidence to support assessment rating Issues or risks Good practice
    Short/long term balance
    RED: The balance of short and long term social science work is not appropriate for the requirements of the evidence base.
    Little or no contribution to departmental horizon scanning work or attempt to anticipate future research needs.
    AMBER: There is a reasonable balance of long and short-term social science research but improvements necessary. Some attempt to anticipate future issues, possibly through GSR contribution to departmental horizon scanning work.
    GREEN: There is a good balance of long and short-term social science research. Future policy needs are anticipated, and GSR contributes to departmental horizon scanning work.
    Amber to Green

    We regularly monitor wider food trends and compile any stories that are of interest in a document on our shared drive. Every few weeks, we circulate the most interesting / relevant news stories across AU and also do an internal Yammer post.

    We conduct timely research on our rapid testing platform via the call off contract with Ipsos, for example, Alt Proteins report, Healthy sustainable diets report.

    We have started doing a monthly publication for the consumer insights tracker, increasing frequency so that customers can always quote the latest figures.

    We also have a suite of long-term tracking surveys, for example, Food and You 2, Consumer Insights Tracker, FBO Tracker.

    More work is done for the short term than the long term. 

    We led and contributed to various pieces of FSA horizon scanning work in 2020 and 2021, covering not only the short term but also the medium to long term, for example FSA Covid-19 Horizon Scanning and social media listening.

    We delivered a whole programme of work at pace to meet evidence needs during Covid, for example, the Covid tracker and the handwashing tracker. 

    To inform the development of the new third pillar of the FSA’s strategy, we have published evidence reviews and research on healthy sustainable diets and developed new questions that were added to F&Y2.

    Departmental business planning
    RED: Little or no involvement in the development of departmental business plan, including input and impact indicators, or input to how performance against priorities will be managed.
    AMBER: GSR provides ad hoc analysis to inform development of departmental business plan, including input and impact indicators, and measuring of performance against priorities.
    GREEN: GSR is proactive and successful in providing analysis and advice for departmental business plan, including input and impact indicators, and development, monitoring and delivery. 
    Amber to Green

    Performance resources report which utilises Food and You 2 and Consumer Insights Tracker data

    Our data contributes to the published Annual Report and Accounts

    We conducted an internal piece of research to improve the FSA Diversity and Inclusion Evaluation Action Plan, which has been published. 

    We produced a whole tranche of work in good time to inform the new third pillar of the new FSA Strategy (for example, Public interests, needs and concerns around food)

    -

    Liaising with policy stakeholders on F&Y2 questionnaire to cover emerging areas of interest,
    for example, GE, GM foods

    SS team member on each of the cross-FSA Area of Research Interest (ARI) Steering Groups. 

    Regular catch-ups with key OGDs/research funders, for example Defra.

    Strategic level sign-off
    RED: There are no mechanisms in place for research programmes and projects to be signed off at policy and/or Ministerial/equivalent senior post for example, CEO level.
    AMBER: Some individual projects receive policy and/or Ministerial/equivalent senior post for example, CEO sign off, but there is scope for more engagement at this level and plans are in place to address this.
    GREEN: There is policy and/or Ministerial/equivalent senior post for example, CEO sign off for all research programmes and projects.
    Amber to Green

    Portfolio/ARI Steering Groups comprised of the policy stakeholders sign- off the bid for funding for the project.

    Investment Board (IB) process – all projects have to be signed off at Director level for funding approval.

    Our commissioning process – policy teams approach us to conduct work

    ACSS QA Gateway signs off new work design

    -

    We get sign off from policy area leads for final outputs, for example fieldwork materials and reports. The Future Publication Panel (FPP) process requires project reports to be signed off by a DD panel for all publications not considered low risk by the head of SERD.

    We brief the top of the office on new publications, including the
    research methodology and its limitations, so CEO/CSA Chair have sight prior to publication. 
     

    Impact is assessed
    RED: No post-hoc assessment of the impact of research. Difficult for HoPs to provide evidence of the impact of social research on policy and delivery.
    AMBER: Some ad-hoc assessment of research but not consistently applied, or processes may be in development.
    Some evidence of the impact of social research.
    GREEN: Appropriate processes are in place to assess the usefulness/impact of the research and the extent to which it has answered policy questions, including feedback from policy/delivery colleagues on project completion. HoPs are able to produce examples of policy relevant research for both current and future policy issues.
    Amber to Green

    EPI forms help us establish impact throughout the lifecycle of a project.

    Social Science Impact Log, which tracks our dissemination activities and how our data has been used.

    Food and You 2 feedback survey sent to policy and other stakeholders (not yet sent – draft survey uploaded as evidence).

    Tracking DOIs

    -

    Engaging with external research users.

    Post project review of impact – following up with policy to understand which impacts we anticipated have been realised / which recommendations implemented. 

    Overall rating (1 to 10): 8

    Accessible

    • published
    • data made available where possible
    • clear and concise
    • related to existing work in field
    Indicators Self-assessment rating Evidence to support assessment rating Issues or risks Good practice

    Published

    RED: Outputs are not routinely published according to GSR Publication Guidance. Embedding the guidance is an aspiration but there is still work to do on promoting and integrating it within the organisation. No record of exceptions is kept. No clear communications plans.
    AMBER: Outputs are usually published according to GSR Publication Guidance but some work to do on promoting its use and embedding it into organisational procedures. A record of exceptions is kept but this is not comprehensive. Communications plans in development.
    GREEN: Research outputs are routinely published to time according to GSR Publication Guidance, and all exceptions are recorded. GSR Publication Guidance is promoted and integrated into organisational procedures. Clear communications plans in place. 

    Green

    Social research reports published on the FSA website under ‘Research reports’ in line with accessibility guidance (now in HTML format).

    Social research reports and data tables are published in line with accessibility guidelines. The FSA Intranet has a whole area and pages dedicated to accessibility, with guidance and templates (to use internally and share with contractors) and training videos. We also have a folder in the Social Science Teams channel with accessibility guidance, resources and templates. All members of the social science team have attended accessibility training (provided by the FSA comms team), and refresher sessions and lunch and learns are run frequently.

    Procurement specification advice can be found on the Intranet, and procurement specification templates are shared with contractors which states that: in line with the Government’s Transparency Agenda which aims to encourage more open access to data held by government, the Agency is developing a policy on the release of underpinning data from all of its science- and evidence-gathering projects. Data should be made freely available in an accessible format, as fully and as promptly as possible. 

    Project officers update the SERD publication tracker on a monthly basis with estimated dates of publication to track research publications and comms.

    All publications (unless granted special exception) go through our Future Publications Panel process to ensure publication is handled appropriately and received sign off from each Director. This process also checks for assurance of accessibility (see FPP form)

    DOI (Digital Object Identifier) codes are assigned to each published report to help users identify the object online (for example, publication in online journals) and act as a unique persistent identifier for publications. 

    FSA research is published as standard in line with our open and transparent principles, usually (though not always) within 12 weeks of receiving the final output, as per GSR publication guidance.
    The majority of our research has a clear communications plan in place; where a comms plan is not in place, there is usually a clear reason for this.

    We usually have a rationale for when publications do not follow the GSR publication principles, but our exceptions are not typically recorded.

    We could do more to promote the GSR publication guidance internally, and ensure it’s used.
    GSR share the guidance with Perm Secs every so often to remind them. It’s sometimes hard to make this a reality – though as a non-ministerial department this isn’t too big an issue.

    Principle 2 of the publication principles guidance states that protocols and analysis plans should usually be developed and published in advance of any study being started. We don't usually do this (although trial protocols were published on the Open Science Framework). 

    Example comms plans from recently published research projects: The UK Public’s interests, needs and concerns around food
    A rapid evidence review on sustainability

    Working groups / workshops with FSA colleagues across different teams, with reports shared and published.
    For example the Climate Change and Consumer Behaviours workshop with colleagues from Policy and Strategic Insights. 

    All data published via FSA data catalogue, or for smaller projects data is available on request (F&Y2 data also published on UKDS).

    A quarterly FSA Science newsletter is sent out to around 700 external stakeholders to flag research and science news.
    Example from June send out saved here. This demonstrates our
    work on the sub-principle around 'relating to existing work in field'. Our SERD stakeholder list also enables us to connect with the wider network (OGDs, academia, consumer, trade associations, learned societies, research agencies etc.)

    Cross- governmental networks and workshops (e.g. upcoming Household Food Insecurity workshop) and ACSS working groups to share learning and ensure our research is accessible and known (related to
    existing work in field). 

    Format

    RED: Outputs are not sufficiently accessible for all stakeholder groups. Local guidance for authors is not available.
    AMBER: There is a move towards clear, concise and more accessible outputs but practice is inconsistent or in development.
    GREEN: All outputs are written in clear, concise and jargon free language with short summaries routinely produced. Key messages are easily identifiable and understood. Guidance is available for authors of reports and enforced by GSR project managers.

    Green

    Transparency policy (published on the Intranet here) shows that openness is one of the core values of the FSA.

    All research reports are published as HTML now and checked by Samantha Merrett (our accessibility lead in Comms) prior to publication and data tables are shared with John Clowes (data team) for publishing on the FSA data catalogue. Our KIMS team also assess each new dataset for suitability of publication using an online form (process outlined here).

    The Welsh language policy published on our Intranet details how we are legally obliged to provide all services in Welsh. We liaise with the Welsh Translation Unit to check requirements when publishing (N.B. not always required but a part of the process).

    All reports routinely include an ‘Executive summary’ so that readers can identify the key messages. Larger projects such as Food and You 2 and Kitchen Life 2 have webpages that summarise the project so that readers do not have to download the full report.

    We do usually ask for 1-3-25 reports in our research specifications but could embed this further. 

    Example reports with executive and/or short summaries included:

    The UK Public’s interests, needs and concerns around food (Executive summary included at start of report)

    Food and You 2 (main findings summary included on webpage in addition to full final report and technical report. Executive summary also included at start
    of report).

    Consumer Insights Tracker.

    Rapid evidence review on sustainability is an example report with both an executive summary published as part of the report and a standalone 3-page summary for internal use, so that key messages are easily identifiable and understood. 

    Overall rating (1 to 10): 9

    Legal and ethical

    • complies with relevant legislation
    • complies with GSR ethical guidelines
    Suggested indicators Self-assessment rating Evidence to support assessment rating Issues or risks Good practice
    Ensuring good practice in the commissioning, management and conduct of government social research
    RED: GSR research does not uphold the principles outlined in the GSR Ethics guidance. There is no clear process for identifying and assessing risks which may compromise these principles, or for obtaining further relevant advice and clearance for projects where necessary.
    AMBER: The GSR Ethics principles are upheld as far as the head of profession is aware, but the process for identifying and assessing risks which may compromise these principles, and for obtaining further relevant advice and clearance for projects could be better embedded. Risks may not be reviewed and managed throughout the life of a project, or there may be inconsistent application of process between commissioned and in-house research.
    GREEN: GSR research upholds the GSR Ethics
    principles. There is a clear process for identifying and assessing risks which may compromise these principles, and for obtaining further relevant advice and clearance for projects where necessary. These processes are not a one-off but ensure that risks are reviewed and managed appropriately throughout the life of both commissioned and in-house projects.
    Amber to Green

    For all externally commissioned work, the research specification requires tenderers to consider ethics using the GSR ethical guidelines. We highlight particular areas and additional sensitivities for consideration.

    We have, on occasion, sought external ethical advice. In addition, our recent behavioural trials have used Kantar’s ethical panel, and we had extensive ethical procedures for, for example, Kitchen Life 2.

    Ethical issues are included in our discussions regarding new work at the ACSS QA Gateway.

    Principle 1: Research should have a clear user need and public benefit
    •    Investment Board process for approval of research projects, to ensure they offer value for money and align with the needs of the organisation
    •    Policy of transparency - all research is published and presented to internal FSA stakeholders. Research methods used are outlined in all publications along with findings.

    Principle 2: Research should be based on
    sound research methods and protect against bias in the interpretation of findings

    •    ACSS Steering Group - all new research projects are reviewed by the ACSS to ensure that the research questions and methods used are sound.
    •    QAT asks some high level questions about methods and justification

    Principle 3: Research should adhere to data protection regulations and the secure handling of personal data
    •    PIA form completed for some projects
    •    Data protection processes form part of the tendering process
    •    Data protection response for call off contract checked by call off contract lead

    Principle 4: Participation in research should be based on specific and informed consent
    •    Informed consent is obtained for all research projects which are conducted.
    •    Good example from KL2 where multiple layers of consent were gained in businesses.

    Principle 5: Research should enable participation of the groups it seeks to represent
    •    Aim to gain a representative sample based on geography, SES, ethnicity, age.
    •    F&Y2 have paper based and web data collection options

    Principle 6: Research should be conducted in a manner that minimises personal and social harm

    •    Currently, research is predominantly conducted with those who are less at risk of harm. When conducting research with businesses/organisations where case studies are used, they are provided with the opportunity to review them to ensure that reputational harm is not being caused.
     

    For GSR ethics principles:

    1.    None
    2.    None
    3.    New team members may not be aware of PIA guidance. Team training session to be arranged
    4.    None
    5. Considerations for enabling participation for those who are harder to reach (for example, those who don't have internet access or those with English as a second language)

    6. Agree processes for engaging with vulnerable groups and for asking questions on the sensitive topics listed. 

    Training on aspects such as the 6-point plan and support for researchers (both internal and external).
     

    1.    Research is prioritised through ARI steering groups to ensure projects with a clear policy and business need are prioritised.

    2.    Methods reviewed through ACSS. Research work packages/ protocols signed off prior to work commencing.

    3.    Data protection part of tendering process. PIA process in place, with guidance on when this should take place.

    4. Ethics included in tendering process. There is a check list for the privacy notices and the participant information sheet.

    5. Our call off contractor always offers to provide tablets/dongles for those who are not web-enabled so that they can take part in research. The FSA has a Welsh translation unit for surveys to be translated if requested. 

    Procurement

    RED: Social research is not consistently procured in line with GSR Guidance on the Procurement of Government Social Research. Good practice is not followed to reduce burdens to suppliers or maximise value for money and use to the customer.
    AMBER: Inconsistent application, or knowledge of, GSR Guidance on the Procurement of Government Social Research and application of good practice.
    GREEN: GSR research is conducted in line with GSR guidance on the Procurement of
    Government Social Research, backed by departmental procedures and the advice of procurement experts. Unless there is a strong, justifiable reason, all contracts are awarded as the result of competition. Good practice in procurement is followed to ensure that processes are proportionate, burden on suppliers and commissioners is reduced as far as possible, the supplier market is developed, and processes are operationalised to maximise value for money and to meet the needs of policy customers.

    Amber to Green

    Unless it is not possible due to time constraints, all tenders go through the full competitive tendering process.

    Procurement team are involved in all tendering processes and moderate the panels.
    Individual scores are then agreed by consensus.

    High value tenders have an external impartial expert on the panel to help ensure quality and value for money. 

    Some contracts go straight to the call off contractor due to time constraints.

    -

    Data security
    RED: No systems or guidance in place for handling, storing and sharing data. There is concern that GSR members are not fully aware of their responsibilities.
    AMBER: Systems and guidance in development, GSR members’ awareness of data security being addressed as a priority.
    GREEN: Systems and guidance in place for handling, storing and sharing data in line with CO core minimum standards and relevant legislation. GSR members are fully aware of their responsibilities and actively manage contractors’ data security and handling processes.

    Amber to Green

    Systems and guidance in place for handling, storing and sharing data

    All contractors must provide details of how they will securely store, process and share personal data in their tender application form/ work package response. 

    Training on data protection/security is not provided to GSR members beyond the generic information management course.  Mandatory Civil Service data security e-course every year. 
    Data sharing
    RED: Little or no thought to how data can be used or shared beyond the original purpose of its collection. Inhibits use of data by others and does not use existing data appropriately/where possible.
    AMBER: Some sharing of data takes place but on an ad hoc basis. Secondary analysis is carried out, but more use could be made of existing sources.
    GREEN: Ensure data resources are made best use of by us and others, by making them openly available as far as possible. Data management and sharing considered as part of planning process for new data collection projects, recognising different levels of data sensitivity.
    GSR members proactively support the re-use of data and make systematic use of data archives where appropriate (for quant and qual data).
    Amber to Green

    Data tables are routinely published on the FSA website as part of FSA core principles of being an open and transparent science-led organisation.
    We systematically use data from archives and other sources where appropriate, for example for scoping evidence reviews and for our horizon scanning work.

    -

    Data sharing is planned via, and complemented by, our project comms plans, agreed with Comms.

    Overall rating (1 to 10): 8

    Performing role with integrity

    • make best use of available resources
    • give appropriate methodological and impartial evidence-based advice, challenging where appropriate
    Suggested indicators Self assessment rating Evidence to support assessment rating Issues or risks Good practice
    Make best use of available resources/achieve value for money
    RED: There are no formal processes in place to consider the added value of a project, taking into account the evidence base. Value for money is not monitored throughout the life of projects.
    AMBER: Inconsistent practice in reviewing the added value of new projects prior to undertaking new research. Value for money is not regularly monitored throughout the life of projects. Processes are in development to address these issues.
    GREEN: GSR members routinely consider the added value of a project before undertaking new research. Value for money is routinely monitored throughout the life of a project.
    Amber to Green

    Business case processes, for example VFM

    ACSS QA gateway process, for example VFM

    Procurement processes, for example VFM

    Work with policy stakeholders to understand the existing evidence base.
    Conduct REAs/literature reviews to understand the evidence base and evidence gaps and use this to inform commissioning.

    Ensuring documents are up to date

    Link across projects to ensure making best use of commissioned work (for example, secondary analysis of Kitchen Life 2 data, citizen science linking up with research ARIs/stakeholders).

    EPI form, which includes elements on making best use of available resources.

    Steering groups to ensure cross-org knowledge is leveraged.

    Knowledge management

    RED: Knowledge management is patchy with no mechanisms in place to encourage GSR members to keep up with, share and retain knowledge (including research/methodological/policy developments) at individual or organisational levels.
    AMBER: Most GSR members undertake knowledge management activities, including keeping up with research/methodological/policy developments relevant to work area. No mechanisms yet in place to help promote the sharing and retention of knowledge at the individual or organisational levels.
    GREEN: GSR members routinely keep up with emerging research/methodological/policy developments relevant to work area.
    This activity may be written into their objectives. Mechanisms are in place to ensure that organisational knowledge is
    shared and retained as appropriate.

    Amber to Green

    Social Science Library to capture existing research

    L and D channel library of resources

    See section on ‘Appropriately skilled and continuously developed’

    Ongoing maintenance of resources

    Degree to which GSR members are embedded into policy areas (for example, invited to relevant meetings which don’t have explicit research needs, have documents shared etc.)

    Production of summary papers of existing FSA research in relevant policy areas (for example, FHS consumers' information preferences, labelling).

    Membership of cross government groups (for example, on behavioural research and evaluation) 

    Open, fair and honest
    RED: GSR staff are not given appropriate support and guidance to enable them to earn trust and respect of users of government social science research, research participants and the wider public. Appropriate mechanisms do not exist to ensure learning received from stakeholders is acted upon.

    AMBER: GSR staff are given some support and guidance in this area, and learning is acted upon in most cases, however, more could be done to support staff and to develop skills. There is some concern that GSR staff are not always making the right judgements in how they deal with stakeholders.
    GREEN: GSR members work to gain the trust and respect of users of government social science research, research participants and the wider public, and are given appropriate support and guidance to do this. As a profession, GSR deals openly and fairly with research customers and other stakeholders, acting upon feedback and information received. GSR members use good judgement to balance rigour and relevance, build constructive relationships within and outside their profession, and perform their challenge
    role appropriately.

    Green

    Insights are published. We measure trust in the FSA and track the reputation of the FSA (FSA Reputation tracker, Food and You 2)

    Membership of cross-government groups (for example, on behaviour research and evaluation)

    Disclosure of data and reports for transparency. 

    -

    In primary research conducted by the team, we follow best practice principles as laid out by bodies such as the GSR, SRA and MRS when engaging with research participants (for example, ensure suitable measures in place for data protection, anonymity etc.)

    Use of stakeholder list (OGD, academics, NGOs) ensures we are open in communicating insights

    Overall rating (1 to 10): 8

    Appropriately skilled and continuously developed

    • recruited and promoted in line with GSR recruitment protocol

    • committed to continuous professional development in line with the CPD handbook
    Suggested indicators Self assessment rating Evidence to support assessment rating Issues or risks Good practice
    Recruitment and induction
    RED: GSR members not recruited in line with GSR Recruitment Guidance. No formal arrangements for local induction. New recruits not encouraged to attend central GSR induction.
    AMBER: Some GSR members are recruited in line with GSR protocols. Local induction processes are in place, but these are not applied consistently for all new recruits. New recruits are encouraged to attend central GSR induction.
    GREEN: All members are recruited and promoted in line with the GSR Recruitment Guidance using either locally managed procedures or by drawing on a centrally managed process. [For corporate members this should apply to all members recruited from point organisation joined GSR].
    Formal procedures are in place for inducting new recruits to organisation, and to GSR if recruited externally.
    External recruits encouraged to attend
    central GSR induction, and this is routinely taken up when available. 
     
    Green

    Excel document of team qualifications, years of experience etc.

    -

    All our members are recruited in line with the GSR recruitment protocol.
    Continuing Professional Development
    RED: Development plans not routinely produced and members not routinely achieving and/or recording 100 hours of CPD. No discussion of current/ future skills needs at unit and/or organisational level. Perceived or actual lack of development opportunities locally. Few CPD opportunities identified or promoted at unit/organisational level.
    AMBER: Development plans are generally produced. Evidence that some GSR members are achieving and/or logging 100 hours of CPD. Skills needs at unit and/or organisational level may have been discussed but no formal plans are in place. Some development opportunities are identified and promoted at unit/organisational level.
    GREEN: Development plans are routinely produced. Evidence that most GSR members are achieving and/or logging 100 hours of CPD. Opportunities exist for members to discuss their professional CPD on an annual basis.
    Recognition at unit and/or organisational level of current and future skills needs and plans are in place to meet those
    needs. There is an emerging culture of development and CPD opportunities are routinely identified and promoted at unit/organisation level. 
     
    Green

    Individual CPD logs
    Team CPD log
    SERD capability plan - including for social science. 

    The individual, with the support of their line manager, is expected to produce their own development plan.

    We have made an active choice not to conduct a skills audit.

    Individual CPD logs

    On a team level and individual level we have a lot of CPD and have now created a team evidence log for this

    L&D Teams chat channel to share events. Feedback given to the rest of the team from individuals regarding L&D. 

    Career and talent management

    RED: HoP has little or no control over facilitating/supporting career moves of GSR members. Little guidance offered to members about career management and how to gain broader/deeper experience within/external to the organisation. Little/no promotion of leadership development amongst GSR members at any level. Access to talent management opportunities is limited.
    AMBER: HoP has some influence in facilitating/supporting career moves of GSR members but in organisational context more could be done.
    Opportunities such as leadership development, talent management, broader experience and take-up of generic analyst roles are available but these could be better promoted.
    Promotion of leadership skills development focuses only on senior members.
    GREEN: HoP plays active role (as appropriate) in managing the career development of GSR members and
    promotes opportunities to develop and demonstrate leadership skills at all levels within the GSR community. HoP identifies and develops those with the potential to progress to senior positions, including SCS. Access to talent management opportunities is open and transparent. Other opportunities e.g. for broader/deeper experience, or take-up of generic analyst roles, are identified and promoted in the context of career management.

    Amber to Green 

    HoP, with team, enables all individuals to gain experience of different work areas (as best as possible subject to business needs) as well as line management opportunities

    Fast Streamers have successfully rotated within the team to date

    2 members of staff have recently been seconded and loaned to other departments.
     

    Managed moves are not really feasible due to workload / resourcing pressures

    The small and centralised nature of the team means there are limitations on internal career moves (for example, we have no GSR SCS)
     

    Regular career conversations with team members as part of line management responsibilities, for example talent and career conversations, and mentoring.

    All GSR members are encouraged to play an active role in X-govt groups & GSR working groups. Several are currently active.

    Access to talent management opportunities, for example Civil Service talent and accelerated development schemes.

    A badging exercise will be held in 2023 for non-GSR members. 

    Balance and use of skills
    RED: Not all GSR members have roles where they are able to routinely use and apply research skills and knowledge.
    AMBER: GSR members use a wide range of research skills and knowledge when commissioning and managing research but there is limited opportunity to undertake primary data collection and/or secondary analysis and/or research synthesis.
    GREEN: Most GSR members are given the opportunity to undertake primary data collection and/or secondary analysis and/or research synthesis, in addition to applying research skills throughout the commissioning and managing process.
     
    Amber to Green

    All team members (including non-GSR) are constantly using research skills and knowledge whether via project delivery or providing advice to policy.

    Where possible (subject to making the best use of limited resources) staff conduct research and analysis. 

    Resource limitations / workload pressure restrict the opportunity to undertake work in-house. However, budget restrictions in the next FY may necessitate that we do more in-house. Examples of primary data collection / secondary analysis / research syntheses include the Consumer Insights Tracker, published handwashing research, and in-house REAs.

    Overall rating (1 to 10): 8

    Outward facing

    • establish effective links with the external research community

    • actively collaborate with policy/delivery colleagues
    • actively collaborate with other analytical professions within and across departments
    Suggested indicators Self-assessment rating Evidence to support assessment rating Issues or risks Good practice

    External research community: 

    RED: Some links with the external research community pursued but not very well developed. No plans in place to identify more. Potential for external engagement not realised.
    AMBER: Some well-established links with external research community. Further links identified and plans may be in place to follow up, but not systematic. More emphasis needs to be placed on this kind of activity, for example engaging with both individual external experts and external research bodies.
    GREEN: GSR members engage actively in developing links with the external research community in their work/policy area. Strategic plans in place to support this activity, for example an external social research advisory committee in place, joint funding of research programmes or centres with external research bodies, consulting externally on research programme. 

    Green

    Advisory Committee for Social Science (ACSS) and working groups in place to advise on research programme / projects and provide links with academics working in food research.

    Regular presentation of our research to ACSS and other FSA committees (ACMSF, WFAC, NIFAC).

    Research fellowships and PhD studentships in place. Research fellow embedded in the social science team, providing links with academia.

    Register of Specialists in place and used to identify and commission peer reviewers.

    Joint funded projects with UKRI.

    Active engagement and dissemination of research to the external research community including academics, research agencies and third sector organisations doing their own research, for example WRAP, Food Foundation, Trussell Trust. The Consumer Insights Tracker excels at this.

    -

    Dissemination of research using external stakeholder list, Science newsletter, blogs, news articles and through presenting at external events / conferences e.g. MRS Behavioural science conference, LSE public policy conference, ESRC festival, International Food Regulatory Analysis conference, British Feeding and Drinking Group Conference, NI Consumer Council Board, GenPopWeb2 webinar, ISSLG conference, Innovation Insights, Exchange conference (IIEX Europe), GSR event (see impact log). 

    Other government analysts

    RED: Few opportunities for GSR members to work with, or gain knowledge of, the other analytical professions. Limited engagement with other analytical HoPs.
    AMBER: Some notable interaction and working by GSR members with the other analytical professions. Mechanisms for planning or facilitating joint working are in their infancy. Established contact/engagement with other analytical HoPs.
    GREEN: GSR members are aware of the contribution of the other analytical professions, know when and how to engage them and there are good examples of working together where appropriate. Mechanisms for the joint planning of analysis are well in development or have been implemented. Routine and constructive contact with other analytical HoPs.

    Green

    Regular engagement with wider Analytics Unit (AU) through regular unit level meetings with other analytical professions (and their HoPs) including statistics, operational research, economics and strategic insights (intelligence)

Collaboration with other analytical professions within the FSA on joint projects, for example, the Consumer Insights Tracker, F&Y2, Kitchen Life 2 and evaluation work

    -

    Regularly seek out other professions for advice or support when appropriate for example, statistics, economics

Collaboration with other analysts across government in the UK (for example, Defra, FSS, DHSC) on research projects and cross-government reports, and internationally (through the ISSLG)
     

    Policy/delivery community
    RED: Little interaction with policy/delivery community and structural arrangements do not promote close working relationships.
AMBER: GSR members do have links with the policy/delivery community but more could be done to increase the visibility and input of GSR into the policy cycle.
    GREEN: GSR members act as educators, internally and externally, promoting the profession of GSR to key stakeholders and working collaboratively with these stakeholders to ensure the relevance, comprehensiveness and applicability of social research output.
    Appropriate structural arrangements are in place for the allocation and location of social researchers, which allow close working relationships with key stakeholders.
Green

Team members have good links with the FSA policy community and are regularly approached by Policy for analytical input. Subject to resources, we could play a more active educative role with policy teams (for example, running introductory/training sessions for policy).

Research and Evidence Programme Steering Groups in place allow close working relationships with Policy. Project working groups in place to support collaboration with Policy on research projects from start to finish.

    Overall rating (1 to 10): 9

    GSR Code’s Standards for Products

    GSR Code's Standards for Products Self Assessment rating by FSA Social Science Team Peer Reviewer's Ratings Peer Reviewer's comments
    Rigour and impartiality - project design, quality assurance of outputs A/G A/G
    • project specs and proposals are well reviewed internally. External reviewing and QA is undertaken on a case-by-case basis, as is acknowledged in the self-assessment. A more consistent approach to external QA is required. 
• the ACSS is an independent expert committee of the Food Standards Agency and provides peer review of outputs. The self-assessment claims that “the new ACSS QA gateway process may bring this up to a green”; the new UCL Quality Assurance Toolkit is a further opportunity to embed good practice in quality assurance.
    • the evidence presented notes that “engaging colleagues in exploratory/experimental work” and “accessing good research suppliers at pace and within government procurement processes” is a challenge.
    • the rating of amber/green is appropriate for this standard.

    Relevance:

    Short/long term balance

    Departmental business planning

    Strategic level sign-off

    Impact assessed

A/G A/G
    • there is indeed a good balance between short term (operational) and longer term (strategic) research and analysis. The FSA horizon scanning work by the social science team in 2020 and 2021 is rightly claimed as a case in point.
• future policy needs are not always anticipated in published research outputs, and external contractors can rarely be held responsible for doing this. The FSA social science team links well with policy customers to establish the relevance of its research to current and future policy issues.
• documentation of these four indicators is good and valid. The evidence presented to support the self-assessment notes that “turning dissemination into measurable impact” is another challenge for the social science team; links to the National Diet and Nutrition Survey are suggested.
    • a rating of amber/green is appropriate for this standard.
    Accessibility - published, format

A/G A/G
    • publication of social science research outputs is timely, readable and informative. Presentational format is generally good. Summary data are presented well in published outputs. The evidence provided in support of this standard is considerable. 
    • more detailed information about methodology is often, but not always, provided in a separate ‘Technical Report’. This should be common practice. 
    • a rating of amber/green would be more appropriate for this standard.

    Legal and ethical

Ensuring good practice in the commissioning, management and conduct of government social research

    Procurement

    Data Security

    A/G A/G
    • the social science team follows guidance on commissioning, managing and conducting research well, including attention to GSR ethical guidelines. 
    • procurement procedures have been identified as a challenge for some contractors. Through good working relationships with procurement colleagues the social science team manages this well. There is room for improvement in the research procurement arrangements, but these are beyond the control of the social science team. 
    • external impartial expertise is used for ‘high value tenders’, the threshold for which is unclear.
    • data security is handled well and in accordance with civil service requirements.
    • data sharing is generally good and readily available. A consistent approach to publishing technical reports would enhance this standard.
• a rating of amber/green is appropriate for this standard.

    GSR Code’s Standards for People

     

    GSR Code's Standards for People Self Assessment rating by FSA Social Science Team Peer Reviewer's Ratings Peer Reviewer's comments

    Performing role with integrity

    Make best use of available resources/achieve value for money

Knowledge management

    Open, fair and honest

    A/G A/G
    • the social science team certainly meets the GSR criteria for a green rating for being open, fair and honest and considering the added value of a project before undertaking new research. 
• identifying existing and emerging research is undertaken by the team using literature reviews and rapid evidence assessments. There is some room for improvement in terms of using up-to-date methods of evidence synthesis, such as systematic reviews and evidence gap maps. Sharing of the information gained from gathering existing evidence is generally good.
• knowledge management also requires good storage, file management and retrieval of evidence. The team has worked to improve this in recent months.
• a rating of amber/green is appropriate for this standard.

    Appropriately skilled and continuously developed:

    Recruitment and induction

    Continuing Professional Development

    Career and talent management

    Balance and use of skills

A/G A/G
• all indications are that SR team members are recruited and promoted in line with GSR protocols.
• recruitment of researchers from the external research and evaluation community has enhanced the experience and expertise of the SR team.
• there is some imbalance of grades in the current structure of the SR team (six PROs, six SROs and three ROs). That said, there is evidence of routine research tasks being undertaken by PROs and SROs. Future recruitment might focus on adding more ROs.
• most GSR members are given the opportunity to apply a balance of research and analysis skills. Some technical skills might require development.
• staff receive opportunities for continuing professional development in line with GSR guidelines. Procurement requirements sometimes limit the choice of CPD opportunities, but otherwise talent management is good.
• a rating of amber/green would be more appropriate for this standard.

    Outward facing:

    External research community

    Other government analysts

    Policy/delivery community

G G
• the self-assessment's reporting of the SR team's links with the external research community, other government analysts and policy/delivery colleagues is fair and accurate. This was confirmed in the interviews with the SR team and with internal and external stakeholders.
    • a rating of green is appropriate for the standard.

    GSR Technical Skills Framework (footnote 1)

    Research Officer

    Technical Skills Domain Research Officer - Technical skills
    Knowledge of research methods, techniques and application of these in small scale research projects
• knowledge of research and analytical methodologies and ability to demonstrate practical application of both qualitative and quantitative approaches through project-based work; for example, a suitable level of experience would be that obtained from a 1st or 2:1 degree or a postgraduate qualification in a social science discipline that includes a substantial element of social research methods and training
    • working knowledge of a range of research methods and an awareness of new innovative methods, including within the data science field.
    • broad awareness of the role of quantitative and qualitative social research methods and their application, i.e., knowing when their application is appropriate and when it is not
    • uses both qualitative and quantitative approaches to undertake small in-house pieces of work while under supervision
• carries out analytical tasks under direction
    • prepares accurate statistics
    Identifying research needs, designing and specifying research
    • designs small scale and less complex research projects for either in-house or commissioned projects; understands how to get things done in the Civil Service
    • helps line manager identify areas for new research
    • writes and designs draft research specifications for less complex projects
    • defines research questions, and re-defines where necessary
    Analysis and interpretation
• makes use of different sources of information and carries out basic analysis of key data sets by producing frequencies and cross tabulations; interprets the key findings from this (an illustrative sketch follows this list)
    • uses computer software in the analysis and presentation of information
    • working knowledge of relevant data analysis packages, particularly SPSS and Excel, and qualitative packages. Packages to be determined by the particular role and job content
    • introductory level knowledge of data science techniques
    • accurately interprets data (verbal & numerical) and research papers, for example, makes an accurate interpretation of the key findings from a literature search
    • summarises verbally and numerically expressed research information accurately
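As a purely illustrative aside (not FSA code and not part of the GSR framework), the 'frequencies and cross tabulations' indicator above could look like the following minimal sketch using Python's pandas library; the dataset and variable names ('region', 'food_hygiene_concern') are invented for illustration, and equivalent output can be produced in SPSS or Excel, as the framework notes.

    # Minimal, illustrative sketch only: a frequency table and a cross
    # tabulation from a small, invented survey extract, using pandas.
    # Column names are hypothetical and not taken from any FSA dataset.
    import pandas as pd

    # Hypothetical respondent-level data
    df = pd.DataFrame({
        "region": ["England", "Wales", "England", "NI", "Wales", "England"],
        "food_hygiene_concern": ["High", "Low", "High", "High", "Low", "Medium"],
    })

    # Frequency table for a single variable
    frequencies = df["food_hygiene_concern"].value_counts()

    # Cross tabulation of region by concern level, expressed as row proportions
    crosstab = pd.crosstab(df["region"], df["food_hygiene_concern"], normalize="index")

    print(frequencies)
    print(crosstab.round(2))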

    Managing and commissioning social research

    • aware of key departmental procurement procedures
    • judges accurately the merits of less complex research tenders
    • supports team members in managing more complex external research projects
    • working knowledge of ethical issues in research; knowledge of GSR principles on research ethics and SRA's guidance; with support, is able to apply these when developing appropriate solutions/ proposals
    • working knowledge of legal requirements surrounding research, particularly data protection and the Freedom of Information Act; knows when to seek further support
    • sensitive to legal requirements that may surround particular departmental research agenda; knows when to seek further support
    • demonstrates attention to detail in checking information/evidence for accuracy and validity, for example, checking an interim research report from a contractor
    • critically assesses research findings against an established standard or specification
    • uses logic to evaluate new ideas and alternatives, for example, critically assesses new hypotheses or new methodologies
    • has knowledge of quality assurance methodologies required for analytical work and understands the context and relevance of quality assurance products, such as the Aqua Book.
    Leading, communicating and championing social research
• contributes to the wider GSR network
    • works with other social researchers and/or analysts
    • delivers appropriate and timely analysis to support policy making and policy implementation, seeking support from senior colleagues where appropriate
    • clear knowledge of the format and style required to report research results; able to report research information clearly in writing
    • persuades others to support the research process, for example, industry bodies to release necessary information or policy customers of the value of social research
    Understanding government context
    • knows when to consult with others, for example, with GSR colleagues, other analytical professions or policy colleagues
    • uses the GSR network to increase awareness of cross cutting research possibilities
    • understands the role and social research needs of policy divisions and can link that to own output
    • tailors approach and frames research output in a way that is directly relevant to the customer's needs
    • understands the policy context of own work
    Learning and development
    • identify opportunities to expand knowledge and breadth of skills
    • commit time to learning and sharing knowledge with others around approaches to research

    Senior Research Officer

    Technical Skills Domain Senior Research Officer - Technical Skills
    Knowledge of research methods, techniques and application of these in small scale research projects
• as Research Officer, plus around two years' successful experience of working in applied social research in which strong analytical skills have been clearly demonstrated. This can include work towards successful completion of a postgraduate qualification with a strong research/research methods component
    • thorough and detailed knowledge of the main quantitative, qualitative and evaluation research methods and experience of their use in more complex projects
    • understands the pros and cons of different research methods so can advise, critique and make independent direct use of same
    • uses a range of analytical techniques to carry out in-house analysis & briefing work
    • up-to-date knowledge of methodological developments including the role of innovative methodologies; applies these methods when and where appropriate; makes use of appropriate new developments from other analytical professions and outside the Civil Service where relevant
    • incorporates the latest techniques into their work where appropriate, and champions innovation and embraces new ways of working.
    • good knowledge & application of analytical techniques to address key questions
    • assesses the suitability for purpose of alternative research methods
    • brings a fresh approach to devising research methods, for example, designing questionnaires, modifying a methodology
    • generates imaginative and useful hypotheses which can be tested
    Identifying research needs, designing and specifying research
    • draws upon a track record of designing medium sized or more complex projects to translate a policy question into a viable research specification or in-house project.
    • meets the social research needs of several divisions and areas, for example, through consulting with other stakeholders and analysts at an early stage, clarifying objectives and setting deliverable goals
• clarifies and agrees research objectives and translates broad project aims into researchable questions, for example, turns policy requirements into well-designed research specifications
    • accurately identifies where there are gaps in the evidence base and makes sound recommendations for how this can be managed
• reframes a vague or unhelpful research question to one that can provide outputs that meet the customer’s needs
    • incorporates best practice guidance into research specification and management
    Analysis and interpretation
    • weighs up competing sources of data and identifies a clear line to take
    • assesses relevance of research information to the task in hand
    • identifies salient points and trends from research or other information and draws out sound, logical inferences, for example, picks out key messages from dense data sets
    Managing and commissioning social research
    • experience also of having had direct responsibility for management of commissioned research or for undertaking social or other relevant research projects of a significant scale. The above also needs to include some experience of having worked with other analysts, for example, economists, statisticians
    • specifies, commissions and manages research projects; works within agreed budgets
    • understands the basics of research planning, including bidding timetables, thinking ahead and liaising with policy divisions
    • assesses whether a contractor’s report is based on a sound approach and robust analysis
    • as RO but for more complex projects. Overall an ability to manage independently the entire procurement process for all but the most complex projects, including budgetary requirements
• takes responsibility and action to ensure that the legal and ethical compliance needs of research projects are met; knows when to seek further support on legal or ethical issues
    • liaises successfully with ethics committees, and other monitoring/ compliance committees, for particular projects
    • applies GDPR principles where appropriate to particular projects
• manages research processes and contractors, ensuring quality of results and methodological rigour
    • has knowledge of, and incorporates quality assurance processes within all analytical work, making use of relevant quality assurance products, such as the Aqua Book
    • monitors and reviews performance and progress of research contractors and anticipates necessary action
    Leading, communicating and championing social research
    • seen as a knowledgeable voice (for example, helping others understand what social research can achieve)
    • may support other social researchers
    • delivers appropriate and timely analysis to support policy making and policy implementation
    • communicates written and oral information clearly; avoids unnecessary use of jargon and technical terms
    • accurately and thoroughly evaluates complex information for the purposes of advice or recommendation; does so in a timely fashion
    • communicates professional judgements regarding the application of social research methods; defends position/viewpoint in the face of opposition or challenge
• helps customers make full use of available social research evidence, even when it is not perfect
    • able to stimulate interest in social research and its applications; persuades others such as senior civil servants of the value of social research to the policy process
    • able to reassure colleagues on social research issues
    • contributes effectively to research steering groups and advisory boards
    Understanding government context
    • makes use of the GSR network to explore opportunities for cross cutting research
    • develops effective communication links with other social researchers to provide appropriate collaborative support to the policy process
    • looks beneath the surface of a request for advice or a piece of research, thinks laterally and explores different angles critically and analytically
    • demonstrates sufficient technical ‘authority’ by taking the lead in recommending solutions to fill strategic gaps
• conducts a risk analysis of an evidence base to ensure advice is sound, for example, understands the trade-offs in balancing quality and timing of project delivery
Learning and Development
• proactively identifies and expands knowledge and breadth of experience where professional skills are less developed
• commits time to learning and sharing knowledge with others around new techniques and innovative methods

    Principal Research Officer

    Technical Skills Domain Principal Research Officer - Technical Skills
    Knowledge of research methods, techniques and application of these
• as for Senior Research Officer, plus an established track record (around three or more years) in designing, carrying out or managing social research, and providing research-based advice and briefings
    • established track record of developing and managing research projects employing the full range of research methods
    • has used or actively considered the use of the latest methods in recent projects, as appropriate
    • supports SROs/ROs on selection of methods and can deal with more complex problems without detailed knowledge of project
    • provides an overview of research methods for a wide portfolio of projects, and provides a supervisory/ sounding board for team leaders where appropriate

    Identifying research needs, designing and specifying research

    • sets out clear research objectives and expected outcomes; defines key delivery objectives for staff / department
    • thinks around a problem; reframes it; questions assumptions, for example, able to reframe a research question to maximise assistance to customer
    • looks beyond immediate issue – identifies trends, areas for further research or analysis, links
    • proactive in helping policy divisions and directorates identify their information needs and evidence gaps; translates unfocussed requests and ideas into effective research design.
    Analysis and interpretation
    • provides and supervises briefing activity based on analytical work
    • reports complex and often conflicting research information to senior non-specialists/customers, assists them to isolate key facts, discern trends and draw implications
    • evaluates and integrates research information from a variety of sources to come to logical conclusions

    Managing and commissioning social research

    • understands and can work within the budgetary requirements at the research project level
    • as SRO but can draw upon more extensive experience of research procurement, contractor and financial management
• promotes multi-disciplinary working; understands what other analysts can contribute (for example, economists, statisticians) and how their own (& own team’s) work fits in
    • supports RO’s and SRO’s in resolving legal and or ethical issues
    • as SRO, but with an overview knowledge to apply to multiple, or complex, or potentially controversial or high profile projects
    • manages quality and product assurance issues on projects, developing quality and product assurance requirements with customers and analysts prior to projects commencing. 

    Leading, communicating and championing social research

    • manages and supports other social researchers through their career
    • draws upon extensive experience of the design, development, commissioning and management of projects and in-house analytical activity to ensure appropriate social research input to policy decisions
    • ensures the provision of appropriate and timely analysis to support policy making and policy implementation
    • makes objective and timely decisions based on best available evidence and sound analysis
    • uses evidence-based arguments, even when under pressure
    • gets to the heart of an issue; subjects information to a thorough analysis to ensure high quality decisions and recommendations
    • provides impartial and balanced advice, using sound application of knowledge and expertise; for example, communicates understanding of policy realities, but still represents social research evidence
    • shapes customers' expectations and needs by educating them about what social research can achieve
    • is an enthusiastic advocate of social research; can sell an idea or argument
    • raises the level of debate by encouraging greater co-operation and communication between researchers across the department and further afield; provides links between academic and GSR colleagues
    • encourages, coaches and supports others to adopt the latest social research methods and data science techniques into their work
    Understanding government context
    • works in partnership with other analysts and departments to achieve joint customer goals
    • co-operates and works well with others in the pursuit of social research goals
    • uses the GSR network effectively to actively pursue options for cross cutting research
    • identifies gaps in the social research evidence base that relates to key policy objectives and suggests methodologically robust ways to fill them
    • takes considered risks and assesses and manages the risks; is not deterred by incomplete or inconclusive data
    • takes the lead on a number of ‘technical’ matters within the wider GSR/ analytical community, for example, this could be methodological or evidence base.
    Learning and Development
• is proactive in keeping abreast of new methodological and technical developments and how they might be used within the department in different policy contexts

     

    Research Officers (N=3)

    Research Officers: Technical Skills Some ability No ability
    Working knowledge of a range of research methods and an awareness of new innovative methods, including within the data science field. 1/3 -
    Carries out analytical tasks under direction. 1/3 -
    Prepares accurate statistics. - 1/3
    Designs small scale and less complex research projects for either in-house or commissioned projects; understands how to get things done in the Civil Service 1/3 -
    Helps line manager identify areas for new research. 1/3 -
    Makes use of different sources of information and carries out basic analysis of key data sets by producing frequencies and cross tabulations; interprets the key findings from this. 1/3 -
    Working knowledge of relevant data analysis packages, particularly SPSS and Excel, and qualitative packages. Packages to be determined by the particular role and job content - 2/3
    Introductory level knowledge of data science techniques - 2/3
    Accurately interprets data (verbal & numerical) and research papers, for example, makes an accurate interpretation of the key findings from a literature search 1/3 -
    Aware of key departmental procurement procedures 1/3 -
    Working knowledge of legal requirements surrounding research, particularly data protection and the Freedom of Information Act; knows when to seek further support. 1/3 -
    Uses logic to evaluate new ideas and alternatives, for example, critically assesses new hypotheses or new methodologies 1/3 -
    Has knowledge of quality assurance methodologies required for analytical work and understands the context and relevance of quality assurance products, such as the Aqua Book. - 2/3
    Persuades others to support the research process, for example, industry bodies to release necessary information or policy customers of the value of social research - 2/3
    Uses the GSR network to increase awareness of cross cutting research possibilities 1/3 -
    Commit time to learning and sharing knowledge with others around approaches to research 1/3 -

Senior Research Officers (N=6)

     

    Senior Research Officers: Technical Skills Some ability No ability
    Uses a range of analytical techniques to carry out in-house analysis & briefing work. 2/6 -
    Up-to-date knowledge of methodological developments including the role of innovative methodologies; applies these methods when and where appropriate; makes use of appropriate new developments from other analytical professions and outside the Civil Service where relevant. 2/6 -
    Incorporates the latest techniques into their work where appropriate, and champions innovation and embraces new ways of working. 1/6 -
    Draws upon a track record of designing medium sized or more complex projects to translate a policy question into a viable research specification or in-house project. 1/6 -
Clarifies and agrees research objectives and translates broad project aims into researchable questions, for example, turns policy requirements into well-designed research specifications. 1/6 -
    Accurately identifies where there are gaps in the evidence base and makes sound recommendations for how this can be managed 1/6 -
    Weighs up competing sources of data and identifies a clear line to take 1/6 -
    Understands the basics of research planning, including bidding timetables, thinking ahead and liaising with policy divisions. 1/6 -
    Overall an ability to manage independently the entire procurement process for all but the most complex projects, including budgetary requirements. 1/6 -
    Liaises successfully with ethics committees, and other monitoring/ compliance committees, for particular projects. 1/6 -
    Able to stimulate interest in social research and its applications; persuades others such as senior civil servants of the value of social research to the policy process. 1/6 -
    Contributes effectively to research steering groups and advisory boards. 1/6 -
Demonstrates sufficient technical ‘authority’ by taking the lead in recommending solutions to fill strategic gaps. 2/6 -
Conducts a risk analysis of an evidence base to ensure advice is sound, for example, understands the trade-offs in balancing quality and timing of project delivery. 1/6 -

Principal Research Officers (N=6)

    Principal Research Officers: Technical Skills Some ability No ability
    Promotes multi-disciplinary working; understands what other analysts can contribute (for example, economists, statisticians) and how their own (& own team’s) work fits in 1/6 -
    Shapes customers' expectations and needs by educating them about what social research can achieve 1/6 -
    Is an enthusiastic advocate of social research; can sell an idea or argument 1/6 -
    Raises the level of debate by encouraging greater co-operation and communication between researchers across the department and further afield; provides links between academic and GSR colleagues. 1/6 -
    Encourages, coaches and supports others to adopt the latest social research methods and data science techniques into their work 2/6 -
    Uses the GSR network effectively to actively pursue options for cross cutting research 3/6 1/6
    Takes considered risks and assesses and manages the risks; is not deterred by incomplete or inconclusive data. 1/6 -
    Takes the lead on a number of ‘technical’ matters within the wider GSR/ analytical community, for example, this could be methodological or evidence base. 1/6 -
    Takes the lead on a number of ‘technical’ matters within the wider GSR/ analytical community, for example, this could be methodological or evidence base. 4/6 -
Is proactive in keeping abreast of new methodological and technical developments and how they might be used within the department in different policy contexts 1/6 -