FSA GSR Review: FSA Response
This is the FSA response to the GSR Social Science Review report.
Foreword
We welcome this independent external review by Dr Philip Davies and its recommendations, and we are grateful to him for his expert consideration.
Dr Davies is an independent public policy consultant and Associate Fellow at the University of Oxford. He is also a Senior Research Fellow at the American Institutes for Research, Executive Director of the Campbell Collaboration and Deputy Director of Systematic Reviews at the International Initiative for Impact Evaluation. Dr Davies is a former senior civil servant and Deputy Head of Government Social Research (GSR).
Background
The FSA is committed to delivering quality research and evidence: one of our guiding principles in the FSA Strategy is that we are science and evidence led. We have detailed plans to deliver continuous improvement, setting out priority areas for developing our people, methods, processes and governance. As part of this commitment, we commissioned an independent assessment of the Social Science team (people) and its outputs (products) against the professional standards of the GSR Code.
The objectives of the review were:
- To assess the contribution that the FSA social science team makes to the FSA and its mission, to identify what it does well and where it could improve, and to make recommendations for Continuing Professional Development (CPD);
- To assess the GSR Code for People and Products and the use of the GSR Self-Assessment tool to appraise social science outputs.
The approach taken was:
- Interviews with a range of internal and external stakeholders (including from FSA, other government departments, academia, and the private sector);
- Appraisal of a range of research reports;
- Appraisal of the team's self-assessment against the GSR Code;
- An online survey of the team's technical skills using the GSR Competency Framework;
- A group interview with the social science team.
Conclusions and recommendations with FSA responses
What does the social science team do well?
1. The FSA’s social science team is a confident group that is well-regarded by the majority of internal and external stakeholders, and provides a robust evidence base on the social science aspects of the FSA’s mission to ensure that food is safe, is what it says it is, and is healthier and more sustainable.
2. The team was described by stakeholders as having “a huge amount of in-house expertise” and providing a “variety of really interesting high-quality research that contributes towards the evidence base for our policies.” This included the “ability to provide quick and detailed comments on things where input was required” and “some really strong areas of consumer insight that allows us to make very powerful statements as an organisation.”
3. The team’s qualitative analysis and its evaluation of the equality issues surrounding food policy and food insecurity were highly valued, as was its exploration of consumers’ attitudes towards and perceptions of food products. The team’s ‘lived experience’ research was also highly valued.
4. The team’s ability to identify and clarify the problems to be researched or evaluated, and to articulate the business needs for research or evaluation, was also recognised positively by internal and external stakeholders. So too was its capacity to challenge policy colleagues and contractors in order to specify research and evaluation that can be delivered in a timely manner and to a high quality.
5. The research outputs reviewed for this report were found to be generally of a high standard in terms of the methodologies used, their design, execution and reporting.
6. The project management of the FSA social science team was generally good and appreciated by policy colleagues and contractors alike.
7. The social science team meets most of the GSR Code for Products and People well. The team and its work were rated ‘Green’ or ‘Green/Amber’ against measures of: rigour and impartiality; relevance; accessibility; legal and ethical practice; performing role with integrity; being appropriately skilled and continuously developed; and outward facing.
FSA response
We welcome the identification of these areas of strength for our team, and we will continue to maintain and develop our performance in these areas. For example, where we are rated ‘Green/Amber’, we will work to improve our performance to achieve ‘Green’ ratings.
Where is there room for improvement?
1. Early policy involvement
The involvement of the team with policy colleagues does not always happen early enough. Consequently, research objectives, questions and approaches can be ill-defined or considered too late for appropriate specificity, delivery and quality assurance. Managing expectations of policy colleagues regarding how long it takes to procure, deliver and quality assure research to high standards can also be problematic. Resolving different stakeholders’ opposing demands or conflicting needs for social research can present additional challenges for procuring and delivering high quality research.
Recommendation 1
Heads of profession from all of the Government analytical services should establish with senior policy makers the importance of early and continuous involvement of researchers in the development of policies, programmes and projects.
Response 1
The Analytics Unit, including the FSA’s Heads of Profession (social science, economics, statistics, operational research), will continue to meet with the FSA’s senior policy, strategy and operational colleagues as part of business as usual, and will in addition present the Analytics Unit offer in dedicated meetings.
Our Research and Evidence Steering Groups provide us with a cross-FSA forum to discuss and agree future evidence requirements, and the use of new processes will help us better identify research impacts before work commences.
2. Procurement procedures
Procurement procedures for research have been identified as a challenge for some contractors and the FSA social science team. They have been described as “inappropriate for procuring research and evaluation” and “not fit-for-purpose”. These procedures are cross-government requirements and beyond the control of the FSA social science team or other analytical professions across government.
Recommendation 2
The procurement arrangements for government research should be reviewed with the aim of having separate procedures and requirements from those of general government procurement.
Response 2
We agree that current government-wide arrangements for procurement are a barrier to efficient research delivery. FSA Procurement have been engaging closely with cross-government activity in this area, including the Transforming Public Procurement project and the Procurement Bill currently being debated by Parliament, to further improve procurement opportunities in the future.
3. Identifying impacts
Identifying the impacts of social research outputs is something the social science team would like to improve. This requires not only monitoring the uptake of the FSA social research team’s outputs, but also identifying and assessing their effects on dietary and hygiene-promoting behaviour.
Recommendation 3
The senior management of the social science team should identify ways in which the impacts of their research and analysis can be identified and evaluated. This might involve linking the data and findings of the FSA’s social research outputs to other sources of data on dietary behaviour.
Response 3
Maximising our science impact is a priority area in the Science, Evidence and Research Directorate (SERD) Science Capability Plan 2022-2025. We will continue developing our impact assessment processes to identify intended research impacts pre-project as well as monitor impact post-project. We will seek opportunities for combining datasets and triangulating evidence to enhance the impact and visibility of our work. This will include joining up with other government departments to explore data opportunities.
4. Transparent technical details
This review has identified that not all published research outputs have a separate technical report providing in-depth details of how research projects are designed, how samples and research instruments are selected, or how analysis is undertaken. Proportionality in the provision of such technical details is an issue, especially given the pace at which research has to be commissioned, delivered and quality assured. However, providing the technical basis of research outputs is one of the hallmarks of good science and should be common practice.
Recommendation 4
Technical details of how research has been conducted should be made available as common practice if the scientific quality of research outputs is to be assured. This is also a matter of transparency and accountability.
Response 4
We have now developed a methodology reporting guide for our contractors and for in-house use, which will be reviewed by the ACSS Quality Assurance Working Group. The guide draws on sources including the Quality Assurance Toolkit, and will help ensure that technical aspects are reported transparently and consistently across our published research outputs. Our published technical reporting will more consistently provide in-depth details of how research projects are designed, how samples and research instruments are selected, and how analysis is undertaken. We will embed this by incorporating it into our specification and reporting templates.
5. External peer reviewing
External peer reviewing and quality assurance of research outputs is “undertaken on a case-by-case basis, subject to the pace, complexity and purpose of the work.” This is usually undertaken by the FSA’s Advisory Committee for Social Science (ACSS) and academics on the FSA’s Register of Experts. Proportionality is also an issue in the provision of peer reviewing of research outputs. External peer reviewing, however, is another hallmark of good science and should be common practice. Where research outputs are for publication, and/or provide a potential evidence base for decision making, wholly independent peer review should be a matter of good practice.
Recommendation 5
All published social science outputs, and those that will provide a potential evidence base for decision making, should be peer reviewed by wholly independent experts as a matter of good practice.
Response 5
In consultation with our Chief Scientific Advisor and ACSS, we are developing a decision-making matrix for peer review, to ensure a consistent approach across SERD. This will align with the risk-based approach outlined in the Aqua Book. In addition, we are adapting our existing peer review templates, using the QA toolkit checklists, to ensure a more consistent approach to peer review across SERD. We will continue to keep our register of specialists under ongoing review for fresh independent insights and expertise.
6. Specialist CPD
There is a need for professional development for some members of the social science team. This includes professional development in methodology, so that they stay abreast of the latest methods of social research and research management and are therefore more confident in supporting less senior members of staff on methodological matters. It is understood that plans are underway for some professional development of the social science team. The professional development of social researchers would also benefit from greater flexibility in selecting training courses at the appropriate level of technical expertise from organisations within and outside of the Civil Service Learning (CSL) prospectus.
Recommendation 6
All members of the team should be able to take the CPD training of their choice, within or outside of the CSL provision, to keep informed of the latest developments in research methods and coaching. There would seem to be a particular need for CPD in quantitative research methods and systematic review methodology.
Response 6
The FSA is committed to enabling ongoing access to professional skills development, and analytical capability is a priority area in our capability plans. For example, we are commissioning a bespoke training course for the whole social science team, covering quantitative methods and systematic reviews as well as newer and innovative methods. This will be supplemented by other activities such as pro-bono quantitative skills training from one of our suppliers and peer training sessions with statistician colleagues. Team and individual training will be reviewed quarterly by the Social Science leadership team and by line managers, for example at extended check-ins, to ensure we meet the needs of both the organisation and the individual.
7. Skills maintenance
Most FSA social research is outsourced, so most of the team’s work is commissioning, managing and quality assuring research, rather than undertaking data collection and analysis. This may have some limiting effects on social researchers’ abilities to maintain and develop their skills in research and analysis.
Recommendation 7
The FSA social science senior management team should review the balance of in-house versus contracted-out social research and ensure that all members of the team have the opportunity to maintain and improve their social research skills.
Response 7
The FSA senior management team will continually review business needs, balancing in-house versus contracted-out social research, to help ensure that all members of the team can maintain and improve their social research skills. For example, we will seek opportunities to conduct in-house secondary data analysis and so improve our quantitative skills, with support from our Statistics Team and Statistics Research Fellow. Where possible, internal and external shadowing will take place to help maintain research skills.
8. Self-assessment tool
The GSR Self-Assessment Tool aims to ensure that GSR’s professional standards are met in all of its products and people. It does this at a rather high level of generality, and its indicators do not capture the quality of research outputs at a sufficiently granular level. The Self-Assessment Tool also requires all research products to be assessed in terms of their implications and solutions for policy and delivery, which is often beyond the scope of most external contractors.

The GSR Self-Assessment might best be used to assess the broader dimensions of professional standards, such as accessibility, legal and ethical requirements, and recruitment and professional development. This, however, depends on the extent to which the GSR Self-Assessment is currently used to assess the professional practice and outputs of social science in government; this review suggests that such use may not be extensive.

The scoring categories of red, amber and green were seen as insufficiently nuanced and as conflating standards, especially when assessing the rigour of social science methodology. The FSA Quality Assurance Toolkit was found to be more appropriate for appraising the rigour of research outputs, as well as the production, assessment and procurement of research.
Recommendation 8
It is strongly recommended that going forward the FSA Quality Assurance Toolkit should be used as the main means of assessing the quality of social science research at the FSA and in other government departments.
Response 8
We have embedded the Quality Assurance Toolkit within the Social Science team and are working with the wider FSA Analytics Unit to adapt it to meet quality assurance needs across professions. Use of the toolkit to assess the quality of social science will be complemented by FSA’s annual review of our Analytical Function Standards.
We are working with the wider GSR community and the Government Analysis Function to help build community knowledge, by disseminating the Quality Assurance Toolkit as a good practice tool across government, alongside our new methodology guide for contractors.
In addition, we are collaborating with the cross-government GSR Code Working Group to develop a comprehensive self-assessment tool for individuals.
Conclusion
We are pleased to accept the recommendations made. We are committed to continuous improvement of our social science evidence and will review progress against the recommendations made in 12 months’ time.
Revision log
Published: 10 August 2023
Last updated: 10 August 2024