Shifting toward healthy and sustainable diets: How to optimise evidence use for policy and practice

Technical report: Results

This section provides the results from both the rapid review of evidence use literature (the barriers and enablers) and the primary qualitative research.

Last updated: 16 November 2022

The findings were analysed using an iterative thematic analysis process, which involved listening to recordings, reviewing and synthesising facilitator notes, and collaboratively drafting (within the research team) a practitioner guide entitled ‘Guiding Principles, Promoting healthy and sustainable diets: How to effectively generate and translate evidence’, referred to hereafter as the ‘Guiding Principles’.

Rapid Review: Conceptual Framework for Understanding Diet-Change Evidence Use 

The following section describes the results from the rapid review of evidence use literature. First, current practice for applying evidence to policy and practice is described through ‘the evidence use process’. The different stages of this process – generation, translation, dissemination, adoption and implementation – are commonly referred to throughout the literature and are deeply integrated throughout the project. In line with the What Works approach described in the scoping review, this section provides an idealised version of the evidence use process and discusses how the ‘ideal’ seldom reflects the ‘practical’ experience of applying evidence to policy and practice. Next, a summary of terms which feature throughout the project is provided.

Following this, the barriers to and enablers for evidence use are provided in two tables, each listing the barrier/enabler, a brief description and the relevant stage(s) of the evidence use process. These barriers and enablers were found exclusively through the rapid evidence review and were not informed by the primary qualitative research. During the write-up stage, both these findings and the primary research findings were synthesised to develop the Guiding Principles. The barriers are organised by the COM-B model presented in the scoping review and loosely colour-coded by evidence use stage (reflected in Figure 5). The enablers tend to cut across the different evidence use process stages, actor groups and sectors, but are also listed by evidence use process stage as far as possible. Table 3 also includes the relevant barrier(s) for each enabler, in direct reference to Table 2. Not all barriers have a corresponding enabler and some enablers may address multiple barriers; however, this category demonstrates the close relationship between the two and also the complexity of evidence use. The literature also demonstrated that for some sectors, health practitioners and third sector actors in particular, there are specific enablers that may improve the application of evidence to practice. These results are provided in separate tables. The section ends with a brief discussion about the applicability of the discovered barriers and enablers to diet shift evidence use.

What are current practices for applying evidence to policy and practice?

The process for the application of research evidence into policy and practice is complex, and understanding how this process works in ‘the real world’ is critical for evidence to be successfully adopted and implemented (footnote 1). Evidence generators, especially academics and researchers, commonly have an “idealised” understanding of how evidence is used to inform decision-making in that they believe it to be “rational”, “predictable”, “linear” and “direct” (footnote 2). In practice, however, this process is usually messy, unpredictable, iterative and non-linear (footnote 3). While this is true of any policy issue or field, food systems are inherently complex and wide-reaching, so these characteristics are particularly pronounced in the case of diet shift evidence (footnote 4).

The following section outlines the idealised evidence use process and describes each stage: evidence generation (1); evidence translation, including message crafting and communication (2); evidence dissemination (3); evidence adoption (4) and evidence implementation (5). Figure 5 below presents a graphical representation of this idealised evidence use process produced by the authors, informed by the literature bodies set out in the methods.

Figure 5. Evidence-to-Policy/Practice Process

Evidence generation, evidence translation, evidence dissemination, evidence adoption, evidence implementation and review and evaluate.

Source: Authors
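
For readers who want a compact reference to the idealised sequence in Figure 5, the short sketch below is a purely illustrative addition (not part of the original framework); the `EvidenceStage` name and ordering are assumptions for illustration only, and, as the text stresses, the real process is iterative and non-linear rather than a strict pipeline.

```python
from enum import Enum

class EvidenceStage(Enum):
    """Idealised stages of the evidence-to-policy/practice process (Figure 5).

    In practice the process is messy, iterative and non-linear; this ordering
    reflects the 'ideal' described in the text, not observed behaviour.
    """
    GENERATION = 1      # creating new evidence (primary or secondary research)
    TRANSLATION = 2     # message crafting and communication
    DISSEMINATION = 3   # delivering the message to the appropriate audience
    ADOPTION = 4        # integrating evidence into policy and/or practice decisions
    IMPLEMENTATION = 5  # converting policy/practice into action 'on the ground'

# The idealised pipeline, with review and evaluation feeding back into generation.
IDEALISED_PROCESS = list(EvidenceStage)
print([stage.name for stage in IDEALISED_PROCESS])
```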

Stage 1: Evidence Generation

The first stage of the evidence-to-policy/practice process is evidence generation (see Figure 5). In general terms, this can be understood as the creation of evidence based on research.  It may involve:

  • conducting primary research;
  • collating and synthesising existing evidence to provide new insights; or
  • assessing and evaluating existing evidence (footnote 5)

Evidence generation fundamentally involves the creation of new evidence, through either primary (first-hand) or secondary (using existing evidence) research. There are a variety of different generation mechanisms, which depend on the project’s aims and objectives as well as the researcher’s approach and relationship to the end user. Each mechanism has its own challenges, benefits and relevance to certain audiences. Appendix A provides a table of common ways to generate evidence, along with their associated challenges, benefits and applicability to different participant groups.

Stage 2: Evidence Translation

The second stage of the evidence-to-policy/practice process is evidence translation (see Figure 5). 

Evidence translation has been defined as “an active process through which actors identify, filter, interpret, adapt, contextualise and communicate evidence for the purpose of policy [and practice]” (footnote 6). This comprises two components: message crafting and communication.

  • message crafting is the process of critically reviewing the data to identify and filter the relevant research findings, interpret the results, and adapt and contextualise them for the appropriate audience (footnote 7)
  • communication is the process of identifying the appropriate audience and formatting and packaging the evidence into a deliverable message (i.e. the aesthetics, style, language and type of mechanism through which the evidence will be conveyed) so that it can be effectively received (see Table 2). For this to occur, the message must be understandable, accessible and clear, which requires the translator to make judgments on the audience’s knowledge base, interests and priorities (footnote 8). Additionally, the target audience must have the capacity and motivation to act on the message (i.e. evidence) and choose to advocate for it to be adopted into policy and/or practice (see adoption and implementation). Communication is often combined with evidence dissemination in the literature.

The context, clarity and accessibility of the communication mechanisms are critical to how effectively the messages from the research are received by the target audience (see barriers) (footnote 9).  To address this, evidence translators may choose to use multiple mechanisms to more effectively reach their audience(s). For example, toolkits, briefs and seminars are common outputs from a research project which communicate to different audiences (footnote 10).  Appendix B lists popular evidence communication and dissemination mechanisms, along with their challenges, benefits, target audiences and effectiveness based on the available evidence.

Stage 3: Evidence Dissemination

The third stage of the idealised evidence-to-practice/policy process is evidence dissemination, which is closely related to evidence translation and the two are often combined as one step in the literature (footnote 11).  Evidence dissemination is the task of delivering the message to the appropriate audience or individual (see Figure 5). The target audience must have the knowledge, capacity and motivation to act on the message (i.e. research findings) and advocate for it to be adopted into policy and/or practice (footnote 12).  Additionally, the disseminated message must be accessible, relevant and timely in order to be received by the audience.

Stages 4 & 5: Evidence Adoption and Implementation

The fourth and fifth stages of the idealised evidence-to-policy/ practice process are evidence adoption and implementation (see Figure 5). Adoption and implementation are the stages where decision-makers review evidence, choose whether or not to integrate it and convert it into deliverable actions. Together, these two stages can be understood as a tipping point for the application of evidence into policy and/or practice.

  • adoption is the act of integrating research evidence into policy and/or practice. It occurs when evidence findings are reviewed by the appropriate audiences, judged as useful and considered when designing policy and/or practice actions. The influence that the evidence has on the final policy and/or practice may vary, but in order to be successfully adopted the evidence must have some influence on the decision-making process. Evidence adopters are decision makers who receive the translated evidence and have the capacity to effect change (footnote 13)
  • implementation is the conversion of the policy and/or practice into action ‘on-the-ground’. It involves deciding how to pursue the policy and/or practice, converting it into actionable steps (including who is responsible for delivery) and delivering it to the public in the appropriate setting/environment. Evidence implementers are the ‘on-the-ground’ actors who deliver the final policy and/or practice (to the public) (footnote 14)

Process of Diet Shift Evidence into Policy and Practice

In food, evidence is generated not only by academic institutions and other research organisations, but also by government itself. Policy is not necessarily made by governments to be implemented by public sector implementers; commercial practitioners (individual companies/chains and peak bodies) are also ‘policy-makers’ in the sense that they set internal and industry/sector policies and introduce interventions (e.g. certification, labelling, voluntary commitments on reformulation and advertising). So, unlike in more defined stakeholder groups such as health professionals and their relationship to implementation science, the evidence pathway is not sequential from evidence generation to policy to practice. The diversity of food system actors, and thus the end-users of diet shift evidence, means that there is no single pathway from evidence to policy and practice. Likewise, for diet shift the evidence-implementation process is messy, blurred, indirect and often difficult to predict (footnote 15). These qualities lead to a strong potential for gaps or discontinuous points throughout the evidence use process. Appendix C provides an overview of these potential gaps.

Summary of terms

Informed by the evidence on evidence use and the diet shift evidence ecosystem described above, a range of terms are utilised in the project. The section below recaps the definitions of these terms, organised according to evidence use process stage and actor group.

Evidence use process

Evidence Generation: The creation of evidence based on research, which may involve:

  • conducting primary research
  • collating and synthesising existing evidence
  • assessing and evaluating existing evidence

Evidence Translation: The active process through which evidence use stakeholders craft and communicate research evidence for the purposes of policy and/or practice

  • message crafting: the process of critically reviewing data to identify and filter the relevant research findings, interpret the results, and adapt and contextualise them for the appropriate audience
  • communication: the process of identifying appropriate audience(s) and formatting evidence into a deliverable message that can be effectively received by end-users

Evidence Dissemination: The delivery of the translated message to the appropriate audience or individual, who must have the knowledge, capacity and motivation to act on it

Evidence Adoption: The act of integrating research evidence into policy and/or practice

  • occurs when evidence findings are:
  • received and reviewed by the appropriate audiences 
  • judged as useful
  • considered when crafting new policy and/or practice actions 
  • *the influence evidence has on the final policy and/or practice may vary, but in order to be successfully adopted the evidence MUST have some influence on the decision-making process

Evidence Implementation: Conversion of the policy and/or practice into action ‘on-the-ground’

  • involves deciding how to pursue the policy and/or practice
  • converting it into actionable steps (including who is responsible for delivery) 
  • delivering it to the public in the appropriate setting/ environment

Actors and Actor Groups

Evidence Generators: Any evidence use actor or actor group that creates new evidence, including both:

  • primary researchers who develop new data sets to create new evidence (i.e. academics, scientists, professional researchers, think tanks) AND
  • applied researchers who review existing data and reframe it to create new evidence (i.e. corporations, government bodies, NGOs)

Evidence Translators: Evidence use stakeholder(s) that identifies, filters, interprets, adapts, contextualises or communicates evidence for the purpose of policy and practice

Evidence End-users: Policymakers and practitioners with the capacity to adopt and/or implement evidence into final policy and/or practice, including both adopters and implementers

Stakeholder Groups:

Evidence use stakeholder: an individual, organisation or group of actors that serve a role in the evidence use process and generally includes: generators, translators and end-users

Participant stakeholder: an individual, organisation or group of actors that serve a role in the diet shift evidence use process and have been identified as relevant to this project (see Table 4)

What are the barriers to evidence use for policy and practice?

Barriers to evidence use in policy and practice cut across the different stages of the evidence use process and the different stakeholder groups. Broadly, 15 barriers have been identified that appear throughout the relevant literature and can be applied to diet shift evidence use. These are listed in Table 2, organised according to the COM-B model, with brief descriptions and references to the different stages of the evidence use process to which they apply. The table descriptions are drawn from the literature, sometimes adapted to fit the project focus on diet shift evidence use.

Table 2: Barriers in the evidence use process (footnote 16)

COM-B component Barrier Description Evidence use process stage

Capability

Physical (for example structural and organisation)

Time The generator/translator’s time availability to conduct research and craft and communicate messages Generation, translation

Capability

Physical (for example structural and organisation)

Time The end-user’s time availability to receive, review and decide whether to integrate evidence Translation especially communication, dissemination, adoption

Capability

Physical (for example structural and organisation)

Resources The generator’s availability of resources (i.e. budget, equipment, technology, ‘man-power’) to conduct research Generation

Capability

Physical (for example structural and organisation)

Resources The end-user’s availability of resources (i.e. budget, equipment, facilities, technology, ‘man-power’) to deliver policy/practice actions Adoption, implementation

Capability

Physical (for example structural and organisation)

Organisational complexity The impact of complex organisational and hierarchical structures (‘bureaucracy’) for evidence use stakeholder groups, which can result in ineffective collaboration and communication and incoherence between:
  • Policy/practice AIMS and IMPACTS
  • Different end-user bodies (for example, departments, sectors)
  • Different scales (for example, national, local)
Adoption, implementation

Capability

Psychological (for example skills and knowledge/knowledge management)

Comprehensibility The ‘understandability’ of the message, especially its language (for example, colloquial vs. jargon), which should be clear, concise and above all understandable Translation especially message crafting, adoption

Capability

Psychological (for example skills and knowledge/knowledge management)

Inappropriate skills and/or knowledge 

and/or

lack of skills and/or knowledge

Generator and translator skills and knowledge about: 
  • evidence-to-policy/practice process
  • end-users’ knowledge base and needs
  • effective communication (for example, clarity, understandability, active vs. passive voice, using simple language, etc.)
  • political/cultural/practical context
  • rigorous research methods (for ‘in-house’ researchers)
Generation, translation

Capability

Psychological (for example skills and knowledge/knowledge management)

Inappropriate skills and/or knowledge 

and/or

lack of skills and/or knowledge

End User skills and knowledge, which are dependent on individual and situational circumstances. They may lack skills and/or knowledge about:
  • disciplinary assumptions and context
  • topic background and context (for example, may not understand contextual differences and complexity)
  • how to read/ understand research writing and data (for example, jargon)
Adoption, implementation

Capability

Psychological (for example skills and knowledge/knowledge management)

Unmanageable volume(s) of evidence The End User's overload of information; an inability for the end-user to effectively identify, understand and filter relevant evidence due to limited ‘attentive capacity’ (see below) and large amounts of research; especially associated with ‘push’ generation.
*Also contributes to ‘Time' above, ‘Attentive Capacity’ below
Generation, adoption (for example, causes evidence to be automatically rejected), implementation

Capability

Psychological (for example skills and knowledge/knowledge management)

Ineffective presentation of evidence Inability of TRANSLATORS to identify and interpret relevant findings and communicate them effectively (for example, form, formatting, language, aesthetics).
Important considerations include:
  • content and language
  • sentence structure and grammar
  • form (for example, mechanism) and format/design
  • aesthetics
  • timing (for example, when evidence is presented)
Translation, adoption and implementation

Opportunity

Limited access to credible evidence Translator and end user lack of access to evidence that is clear, verifiable and peer-reviewed due to:
  • scientific journal subscriptions (lack thereof and affordability)
  • poor quality ‘in house’ research due to
  • lack of skills/knowledge/training
  • time pressures
  • routinised research methods
  • lack of formal evaluation (i.e. inability to judge effectiveness of interventions)
Translation, Dissemination, adoption and implementation
Opportunity Attentive capacity The limited energy an individual generator/translator or end-user can expend on a particular task (for example, focus); affected by:
  • situational and personal factors (for example, access to resources, partisan bias, feeling unwell, etc.) which change over time
  • competing messages, desires, needs, responsibilities/ demands
  • time pressures
Generation, Dissemination, adoption and implementation
Opportunity Unequal coverage

Biased or slanted inclusion of research findings and counter-perspectives, influencing both content and availability.

Content: 
“the extent to which the communication describes the most important options and their potential outcomes, for example, who is affected, which outcomes are included, short- and long-term benefits and harms, and uncertainties” (footnote 17)
Availability (for example, publication bias):
instances when journals prioritise certain fields or types of studies (for example, quantitative versus qualitative)

Translation, adoption and implementation
Motivation Lack of Salience  Failures in timeliness or relevance of evidence with respect to current policy and practice priorities and its applicability to the intended context Dissemination, adoption, implementation
Motivation Biases, Attitudes and Perceptions Neutrality: the actual and perceived balance of coverage in research findings, impacted by 
  • generator bias and content FRAMING (i.e. identification and interpretation of evidence) and
  • time and space restrictions (see Capacity above)
Generation, translation, adoption, implementation
Motivation Biases, Attitudes and Perceptions

Communication Environments: (footnote 18) the cultural and normative contexts that influence how messages are received, including:
Competition for attention:
the overload of available information, not all of it credible, that exists online and in the media
Political Polarisation:  the integration of scientific evidence with partisan opinions
Status Quo bias: behavioural phenomenon of individuals seeking comfort in maintaining the status quo in times of controversy rather than pursuing change
*Linked to Cultural differences below

Generation, translation, dissemination, adoption, implementation
Motivation Biases, Attitudes and Perceptions

All actors within the evidence use process experience cognitive biases which are rooted in individual perception and information processing; these include:

Confirmation Bias: when an individual seeks out evidence that confirms a previously held assumption (for example, a person who eats meat seeks out evidence that meat consumption is more nutritionally balanced than a vegan diet)
Selection Bias: when an individual selectively chooses to pay attention only to evidence that reinforces their beliefs/worldview and ignores evidence that challenges it
Blind Spot Bias: the tendency to recognise bias in others’ judgments but not one’s own

Generation, translation, adoption, implementation
Motivation Biases, Attitudes and Perceptions

The ‘knowledge/ doing’ gap:
the tension between generators’ (especially academic researchers’) idealised or ‘ivory tower’ understanding of the ‘real world’ and the lived experiences of practising end-users
Impact Example: 
Practitioners see an academic as ‘the expert’ who will solve the organisation’s problems, so become deferential and take on the role of an observer rather than a participant in the research process; the practitioner then becomes dependent on the academic rather than benefiting from resource sharing/skill transfer
*Linked to Trust below

Generation, adoption, implementation
Motivation Biases, Attitudes and Perceptions

Prestige: the concern of generators (especially academics) about the value of evidence generation to their career, i.e. as less valuable than academic research for publication; also raises organisational concerns for institutional independence

Generation
Motivation Trust and Transparency

The END-USERS’ perceived credibility (i.e. “the perceived quality, validity and scientific adequacy of people, processes and knowledge exchanged”) (footnote 19) of research evidence, which depends on the perceived credibility of the message itself, the communication mechanism and the authority of the generator/translator (and may change over time):
Distrust (for example, misinformation, controversial evidence, poor relationships) can cause false causal attributions for end-users and the public which are extremely difficult to change

Two-way communication (for example, feedback from audience members) positively influences trust and transparency due to

  • more inclusive methods
  • perception of process as ‘more fair’
  • perception of research outputs as more legitimate, less biased and more representative
Generation, translation, adoption, implementation
Motivation Complexity and uncertainty The loss of context and caveats for research findings, due to time pressures and limited 'attentive capacity' (see above), causes findings to seem more conclusive than they actually are. Translation, implementation
Motivation Complexity and uncertainty Conflicting, unclear and/or unavailable evidence (for example, GMOs, reduced meat consumption, etc.) contributing to distrust and increased bias (see above) Generation, translation, adoption, implementation
Motivation Complexity and uncertainty

Timescales: uncertainty about the salience and impact of evidence over time (for example, shifts in policy/priorities, adaptation to unexpected crises, etc.) and variable timetables for generators/translators (especially academics) and end-users; for example, academics work on longer timetables whereas end-users often work based on specific (time-sensitive) needs

Generation, dissemination, adoption, implementation
Motivation Complexity and uncertainty Uncertainty over the credibility of information, its relevance and the future direction of policy or practice Adoption, implementation
Motivation Complexity and uncertainty Unpredictability of future events, needs and contexts causes radical changes to be less acceptable socially and politically (see ‘biases’) Adoption, implementation
Behaviour change impacts Variable impacts of implementation actions Differences in impact and contextual applicability across socio-demographic groups, influenced by:
  • socioeconomic status
  • cultural norms
  • location
  • age
  • gender 
  • race
Adoption, implementation
Behaviour change impacts Variable impacts of implementation actions Single-measure success scales (for example, GDP, obesity rate reduction, etc.) used to determine intervention impact oversimplify complex challenges and add to perceptions of ‘silver-bullet’ solutions; multi-criteria measures are perceived to be ‘best practice’ and are more commonly recommended (footnote 20) Implementation
Behaviour change impacts Variable impacts of implementation actions Specificity and applicability of implementation actions (for example, general rules vs. specific solutions):
conflicting or mismatching evidence between generalised research findings/ national guidance and local context
*contributes to evidence ‘complexity’ and ‘distrust’ (see above)
Adoption, implementation

Source: Authors informed by literature, including: Grimshaw et al. (2012); Brick et al. (2018); Atkins et al. (2017);  Schoen et al. (2017); Warira et al. (2017), etc.
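
Because the web version of Table 2 flattens its four columns, the sketch below (an illustrative addition with hypothetical field and variable names, not part of the report) shows one way each row could be captured as a structured record and filtered by evidence use stage; the two example entries paraphrase rows from the table.

```python
from dataclasses import dataclass, field

@dataclass
class Barrier:
    """One row of Table 2: a barrier to evidence use."""
    com_b_component: str   # Capability, Opportunity, Motivation or Behaviour change impacts
    name: str              # short barrier name
    description: str       # paraphrased description
    stages: list[str] = field(default_factory=list)  # evidence use process stages affected

# Two example entries paraphrased from Table 2.
BARRIERS = [
    Barrier("Capability (physical)", "Time",
            "The generator/translator's time availability to conduct research and craft messages",
            ["generation", "translation"]),
    Barrier("Opportunity", "Attentive capacity",
            "The limited energy an actor can expend on a particular task",
            ["generation", "dissemination", "adoption", "implementation"]),
]

def barriers_at_stage(stage: str) -> list[str]:
    """Return the names of barriers that apply at a given evidence use process stage."""
    return [b.name for b in BARRIERS if stage in b.stages]

print(barriers_at_stage("generation"))  # ['Time', 'Attentive capacity']
```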

What are the enablers of evidence use for policy and practice?

This section outlines key enablers of effective evidence use found in the literature. It is important to note, however, that effective evidence use depends closely on the purpose, desired outcomes and context of the research. For each enabler, the relevant stakeholder group, the applicable stage of the evidence use process and the related barrier(s) that could be addressed are listed. Two participant stakeholder groups, health practitioners and third sector actors (see Table 1), were the subject of specific literature, so a separate section is included with additional enablers/interventions that are particular to these evidence users.

Table 3. Cross-sector Enablers of Evidence Use

Enabler Description Relevant Actor Group(s) Evidence use process stage Related Barrier(s)
Clarity Findings and recommendations should be clear and concise; (footnote 21) discussion should be kept short to avoid overwhelming the audience with information and complexity (footnote 22).  Language used should be selected to match the knowledge base of the audience and common terms and phrases should be prioritised over jargon (footnote 23) Generators, Translators Generation, Translation, Dissemination, Adoption Comprehensibility, attentive capacity, inappropriate/lack of skills/knowledge, ineffective presentation
Adapt to the audience Consideration should be given to the resources, needs, capacity and interests of the audience, and materials tailored accordingly (footnote 24).  Using multiple mechanisms (email, seminar, toolkits, etc.) can ensure the information caters to different learning styles, as can balancing auditory and visual presentations (footnote 25). Regularly follow up and communicate with busy policymakers and practitioners throughout the stages of the research to increase engagement and interest (footnote 26).  Provide quick summaries and take-aways to aid with comprehension (footnote 27) Generators, Translators Generation, Translation, Adoption, Implementation Comprehensibility, inappropriate/lack of skills/knowledge, unmanageable volumes of evidence, attentive capacity, ineffective presentation, salience, biases, complexity and uncertainty, variable impacts.
Use of visuals Aesthetically pleasing and easy-to-understand visuals help with quick and easy information processing (footnote 28); headings, graphs, tables, charts, icons and infographics save space and convey complex information quickly (footnote 29); use contrasting colours and be consistent with designs and formatting (footnote 30). Generators, Translators Translation, Adoption, Implementation Comprehensibility, attentive capacity, ineffective presentation, biases, complexity and uncertainty.
Selecting frames Framing occurs when a communicator emphasises specific aspects of a topic, which in turn influences how the topic is understood by the audience (footnote 31). As with any evidence interpretation, carefully consider how much emphasis to give, and to which aspects of the data (footnote 32). Evidence is often perceived to be ‘neutral’ rather than ‘persuasive’, but framing influences which message is conveyed (footnote 33).  Because of this, decide whether to be an ‘issue advocate’ (for example, persuasive) or an ‘honest broker’ (for example, as neutral as possible) (footnote 34).  Be explicit about what is evidence and what is interpretation within the message (footnote 35).  Despite these risks, frames can be used to guide audiences to clear conclusions (footnote 36), presenting evidence in a way that appeals to policymakers and practitioners while demonstrating its relevance (footnote 37) Generators, Translators Translation, Adoption Comprehensibility, unmanageable volumes of evidence, attentive capacity, ineffective presentation, unequal coverage, salience, biases, trust, complexity and uncertainty.
Timing Be strategic about when to present research and make it as convenient and accessible as possible (footnote 38).  Frequent and ongoing communication throughout the project is often more useful than one summative presentation at the end (footnote 39).  Use update emails with key takeaways concisely summarised, ‘bitesize’ presentation sessions and informal conversations over coffee or lunch to keep the audience engaged in the research and receptive to evidence findings (footnote 40) Generators, Translators Dissemination, Adoption, Implementation Unmanageable volumes of evidence, attentive capacity, ineffective presentation, salience, biases, complexity and uncertainty.
Engaging with 'the practical' The policymaking process is often idealised as linear and predictable (footnote 41).  In practice, however, policymaking is usually messy, complicated and non-linear (footnote 42).  To effectively influence policy and practice, understand how these processes work and identify at which stages evidence will have the most impact (footnote 43).  Communicating at convenient opportunities; identifying the most relevant person or audiences with the capacity to influence change; and tailoring the messages to suit that audience can make the difference between evidence being adopted or rejected (footnote 44) Generators, Translators, end users Generation, Translation, Dissemination, Adoption, Implementation Resources, organisational complexity, inappropriate/lack of skills/knowledge, unmanageable volumes of evidence, attentive capacity, ineffective presentation, limited access, salience, biases, trust and transparency, complexity and uncertainty.
Building and sustaining relationships Relationships are critical to effective communication and have large impacts on trust, message clarity and relevance (footnote 45).  Build more engagement and project credibility by working directly with higher management (in both industry and policy) (footnote 46).  Develop diverse networks and contacts by taking advantage of informal channels such as coffee, lunchtime seminars and distributing research PDFs via email (footnote 47).  When ‘cold calling’ journalists, policymakers or practitioners, always include a quick self-introduction, a clear statement of why that person is being contacted and a clear ask that is within the person’s work remit and interests (footnote 48).  Put in the effort early on to build these relationships and sustain them over time to gain direct experience with the practical decision-making process and adapt to the audience more effectively (footnote 49) Generators, Translators, end users Generation, Translation, Dissemination, Adoption, Implementation Organisational complexity, comprehensibility, attentive capacity, ineffective presentation, limited access, salience, biases, trust and transparency, complexity and uncertainty.
Salience and relevance Policymakers and practitioners are more receptive to evidence when it is salient and relevant to their interests and priorities (footnote 50).  Consider the needs, political and social context of the research topic and the capabilities (in terms of resources, time and decision-making ability) of both the research team and the audience (footnote 51).  Learn about the decision-making process(es) to strategically provide evidence on topics that are timely and already of interest to decision-makers (footnote 52).  Likewise, stay up-to-date on current policy practices and consider the current political landscape in research design to stay relevant (footnote 53). Generators, Translators Generation, Translation, Adoption, Implementation Attentive capacity, ineffective presentation, salience, biases, trust and transparency.
Building capacity

Both policy makers/practitioners and research teams have restricted capacities in terms of  resources, time availability and knowledge base (footnote 54).  Early career researchers (ECRs) in particular can struggle with effectively translating and communicating evidence for adoption into policy and practice (footnote 55).  Tailored training for researchers, based on their research stage and knowledge of practical decision-making processes, is one enabling strategy to address this (footnote 56).  Likewise, increased provision for research funding and incentives for research contributions could help address resource and time constraints for researchers (footnote 57).  

Training for policymakers and practitioners on specialised research topics, understanding complexity and reading scientific reports would likewise enable better translation and comprehension of evidence (footnote 58)

Generators, Translators, end users Generation, Translation, Adoption, Implementation Resources, comprehensibility, inappropriate/lack of skills/knowledge, attentive capacity, limited access, salience, biases, trust and transparency.

 Source: Schoen et al. (2017)
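
Table 3’s ‘Related Barrier(s)’ column makes the enabler-to-barrier relationship many-to-many: one enabler can address several barriers, and not every barrier has a corresponding enabler. As a purely illustrative sketch (the mapping below is a paraphrased excerpt and the names are assumptions, not part of the report), that mapping can be inverted to ask which enablers bear on a particular barrier:

```python
# Paraphrased excerpt of the enabler-to-barrier mapping in Table 3 (illustrative only).
ENABLER_TO_BARRIERS = {
    "Clarity": ["comprehensibility", "attentive capacity", "ineffective presentation"],
    "Use of visuals": ["comprehensibility", "attentive capacity", "ineffective presentation", "biases"],
    "Timing": ["unmanageable volumes of evidence", "attentive capacity", "salience"],
}

def enablers_for(barrier: str) -> list[str]:
    """Invert the mapping: which enablers may help address a given barrier?"""
    return [enabler for enabler, barriers in ENABLER_TO_BARRIERS.items() if barrier in barriers]

print(enablers_for("attentive capacity"))  # ['Clarity', 'Use of visuals', 'Timing']
```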

Enablers identified for specific stakeholder groups

Along with the more general findings on evidence use enablers outlined above, the review highlighted some stakeholder specific findings related to health professionals and third sector practitioners. 

Health Professionals

Implementation science describes ‘various concerted strategies (also referred to as implementation interventions, facilitators, enablers, etc.) to influence the implementation process in order to achieve desired changes in clinical practice’ including those listed in Table 4 below (footnote 59)

Table 4: Enablers of evidence use for Health Professionals
Enabler/Intervention Details Effectiveness (footnote 60)
Printed Educational Materials Published or printed recommendations for clinical care, including clinical practice guidelines, audio-visual materials, and electronic publications. 
Target: knowledge and potential skills gap; no evidence they target motivation
Moderately effective
Educational meetings Conference, lectures, workshops, traineeships Moderately effective
Education outreach Use of a trained person to meet with providers to give information with the aim of changing provider practices. Moderately effective for simple behaviours (for example, prescription)
Local opinion leaders Target the knowledge and skills of their peers; this is not a formal position, but arises from reputation in the field and activities. Moderately effective
Audit and feedback ‘Summary of clinical performance of healthcare over a period of time’. Performance can be identified via medical records, computerised databases or observation. Targets healthcare providers’ perceptions of performance levels. Moderately effective
Reminders Prompts on paper or computer screen for health professionals to recall information Moderately effective when baseline compliance was low.
Tailored interventions Strategies to improve professional practice which prospectively take into account barriers to change (e.g. information management, clinical uncertainty) Highly effective
Multi-faceted interventions Interventions targeting multiple barriers.  Unclear - more research needed.

Source: Authors, from Grimshaw et al. (2012)

Third Sector
Enabler/Intervention Details
Knowledge Brokers As intermediaries between academics, practitioners and policy makers. 
Secondment Long-term relationships with non-academic partners, including secondment opportunities for academic staff members
Project Advisory Groups Academics can be brought onto CSO Boards, Steering Groups or Advisory Panels. The Carnegie UK Trust recommends the use of Project Advisory Groups including policy and practice partners relevant to the research project, as a means of informing the research, promoting impact and developing relationships.
Long-term relationships Relationships should be sustained between research projects. 

 Source: Schoen et al. (2017)

Diet shift enablers and barriers

The barriers and enablers described in the preceding section typically relate to evidence use for policy, although most, if not all, could also be applied to evidence use for practice. As stated in the scoping review, there is very little available literature on current evidence use practices and on barriers to and enablers of evidence use in practice, especially in food systems. For this reason, the researchers identified barriers and enablers that cut across different sectors and actor groups throughout the evidence use process. Sector-specific enablers for relevant diet shift actor groups (health practitioners and third sector actors) were also included. Broadly speaking, all of the above barriers and enablers could, in theory, apply to diet shift evidence users. To determine whether or not that is the case in practice, and whether there are any barriers and enablers that are specific to diet shift and not included, the project also included primary qualitative research with 30 food policymakers and practitioners in England.

Primary Research findings

This section sets out the findings from the primary research, which included a series of interviews with government (national and local), private and third sector representatives, two practitioner workshops (retail and local practitioners), retailer discussions, and a series of feedback sessions to refine our results. The authors conducted a thematic analysis to draw out key themes from the data. 

The primary data collection combined with the rapid evidence synthesis provided the evidence for the eight Guiding Principles (see Guiding Principles document). From the thematic analysis coupled with the co-creative feedback sessions, a set of key themes emerged for both better evidence generation and better evidence translation. The themes are presented below alongside the associated Guiding Principles and illustrative quotes. In addition, a set of barriers and enablers to evidence use has also been identified below.

Better Evidence Generation

Four key categories have been identified for better evidence generation, including the need to: 

  • practice more interdisciplinary food systems approaches
  • employ greater co-creative and inclusive approaches to evidence generation to develop genuine partnerships with stakeholders
  • develop greater understanding of the policy process, actors and politics
  • ensure credibility of research design and data

Practice more interdisciplinary food systems approaches

Guiding Principle 1

A common theme throughout the workshops and interviews across all sectors was the need for more interdisciplinary food systems evidence generation to tackle complex challenges facing the food system.  One of the policy maker informants explains:

“Just to emphasise again the importance of embracing a food systems approach to everything that we do to develop comprehensive modelling and scenario analysis, that there is consideration of socioeconomic impacts and that we embrace multidisciplinary approaches in the way we develop our evidence base. Yes, we developed the evidence jointly with a wide range of academic communities. We need to make sure that different viewpoints are considered and that's part of the evidence, a systems approach allows you to hear different viewpoints. Sometimes we tend to treat evidence as academic findings and, again, you can get very important insights from talking to any stakeholder that is not an academic and that's part of the evidence base as well. So it's developing that common understanding of improving a situation again considering the viewpoints of the whole system and not just one particular group” (Policy Maker informant).

Employ greater co-creative and inclusive approaches to evidence generation to develop genuine partnerships with stakeholders (including evidence brokers, citizens etc.)

Guiding Principle 2

This was a common theme raised in the primary data collection in both the interviews and workshops and is illustrated by one of our informants as follows:

“Our Children's Future Food Inquiry was quite a good example, where we combined a lot of new analysis of national data, information from evidence brokers plus we did a big consultation and we had lots with young people from all different demographics, and we had that sort of integrated throughout and that was a very visually accessible report that you could just dip in and out of” (Third Sector Food Organisation Informant).

Develop greater understanding of the policy process, actors and politics

Guiding Principle 3

This was illustrated during our workshop with small and medium-sized food retailers, who discussed in depth how they work with their trusted suppliers to source evidence to inform their food procurement policy. This shows the importance of understanding the processes in different sectors:

“To decide what to source we have conversations with our suppliers based on customers’ concerns; our suppliers generally have some criteria for what ‘sustainable’ means in mind (i.e. local, free-range, carbon footprint, etc.). We make a big thing of Local sourcing particularly for meat. It's important we know where the meat comes from and our local supplier can provide the required traceability and quality we prefer. UK meat has higher quality and welfare standards than US and Australia sourced meat”.

Ensure credibility of research design and data

Guiding Principle 4

A common theme in both the workshops and interviews was the importance of credible research design and data. One informant explained:

“I would say there are some trusted organisations that we would look to that publish evidence. Things like the IPPC, and other panels like that who produce credible data. I think methodology is the key thing. I would always look at what methodology was used, what assumptions within that methodology.  If you can understand that methodology in more detail, so for example, if it is an LCA, what are the system boundaries, etc.? That's probably the most important thing. In terms of something like interviews or focus groups, or something where you're talking to the public, what I would want to see there is how many people have you spoken to? What type of people have you spoken to? Then how that's brought through really into the report or the study”.

Better evidence translation

We have identified from the primary data two key categories for better translation which relate to our Guiding Principles, including the need to:

  • enhance evidence presentation and communication through easy-to-follow guides and language, being visual and concise, and timing dissemination for optimum impact; and
  • enhance skill development for evidence generators and users.

Enhance evidence presentation and communication with easy-to-follow guides and language, being visual and concise, and timing dissemination for optimum impact

Guiding Principles 5, 7, 8

This theme was supported by a number of illustrative examples in our primary data collection. First, from an interview with one of our informants:

“It's a really short process infographic, but it shows the different steps involved in, well, between a chicken sandwich that an individual might eat, and global biodiversity loss, essentially. It shows the chicken sandwich, then it shows the soy that's being grown. It shows the rainforest being cut down, and then that link to the loss of biodiversity. I feel that because that's quite a complicated message to communicate, having pictures, and having it in that process as simple language is a really good way to get that message across - and is relatable as well, with the chicken sandwich”.

Second, from one of our workshop participants:

“The Liverpool Good Food Plan has no published document: it’s an interactive website complemented by five short animations that are voiced by people with lived experience; it was six months of work that did not have a written output. It’s critical to engage with the media and others outside of the established network, but this is increasingly difficult due to the ‘adversarial nature’ of media communication today”.

Enhance skill development for evidence generators and users

Not included in guiding principles because it is an organisational/ systemic issue that cannot be fully addressed by evidence generators

A key theme emerging in the interviews and workshops was the need for more capacity building and training for both evidence users and generators. This is shown below in our illustrative quotes.

“So it is for us very important that we can communicate complex technical information into clear messages that can be translated into policy action. That's probably one of the most important skills for the evidence specialists working on policy. We spend a lot of time building training capacity in this area.  Make sure nothing is lost in that translation and the translation reflects all the technical complexity that might be involved in the development of the evidence. So it is quite a challenging and difficult skill to accomplish and needs training and learning”

“I've said this many times before, for a long time, that I don't think there's any difference between policymaking, policy delivery and scientific progress. It's just that, let's say, we tend not to recognise it that way, but I think we would all benefit hugely from recognising that actually, we're all scientists in a sense, and we're all trying to develop a more objective view of the world. There needs to be much more emphasis on skill development for both evidence users and generators and much more joint understanding of the respective roles and much more joint training”.