Be visual and explore different formats
Uninspiring design or poorly presented data means evidence is less likely to be noticed and understood.
Aesthetically pleasing and easy-to-understand visuals help policymakers and practitioners process information quickly and easily (footnote 1). Improving the visual appeal of your evidence can range from simple changes, such as using headings, to inserting graphs, tables and charts, or using icons and infographics to save space and convey complex information quickly (footnote 2). Analysis of the use of visuals such as diagrams has shown that their inclusion is associated with higher citation rates for scientific papers. Contrasting colours and consistent design and formatting (footnote 3) can also improve the chances of your evidence being communicated effectively.
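Where a chart is the right visual, these same principles (a message-led heading, contrasting colours, consistent formatting, minimal clutter) can be applied in whatever tool you use. The sketch below is a minimal illustration only, assuming Python with matplotlib; the dataset, labels and output file name are placeholders, not real findings.

```python
# Minimal sketch of the visual-design principles above, using matplotlib.
# All data, labels and the output file name are illustrative placeholders.
import matplotlib.pyplot as plt

findings = {"Intervention A": 42, "Intervention B": 35, "Control": 18}

fig, ax = plt.subplots(figsize=(6, 4))

# Contrasting, colour-blind-safe colours, applied consistently.
bars = ax.bar(list(findings.keys()), list(findings.values()),
              color=["#0072B2", "#E69F00", "#999999"])

# A heading that states the message, not just the topic.
ax.set_title("Intervention A reached the most participants")
ax.set_ylabel("Participants reached (%)")

# Label values directly on the bars so readers do not have to estimate.
ax.bar_label(bars, fmt="%d%%")

# Strip visual clutter that slows comprehension.
for side in ("top", "right"):
    ax.spines[side].set_visible(False)

fig.tight_layout()
fig.savefig("findings_chart.png", dpi=200)
```

The same checklist (a message-led title, labelled units, consistent colours, no clutter) applies whether the chart is produced in code, a spreadsheet or a design tool such as Canva.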
Presenting the evidence in an exciting way (such as through video, social media or a personal experience) makes it more likely to engage and connect with audiences, particularly those who are time-poor. Icons can be helpful, but care needs to be taken to ensure they are understandable and representative of the concept they refer to (footnote 4).
There are creative software programmes, such as Canva, which you might want to utilise to make your evidence outputs more visual. You might also want to consider employing the services of a professional designer to make your findings more visually appealing.
“We developed a set of animations as it’s critical to engage with people outside of the established media network; there are a lot of mistruths told about issues around diet and environment. The media seems very ‘adversarial in nature’ on this topic.” – Community Shop owner and working group member of the Liverpool Good Food Plan.
Consider the best format to communicate your message (and consider using multiple formats)
Different evidence users have different resources, needs, capacities and interests, so materials should be designed accordingly (footnote 5). Using multiple mechanisms, and balancing auditory and visual presentations, can also ensure evidence caters to different learning styles. Evidence generators should also consider digital inequality, particularly when end-users are individual citizens (footnote 6). Delivering a piece of evidence in multiple formats (emails, webinars, workshops, summaries, videos, etc.) improves the likelihood that it will reach the user and therefore be actioned. Be clear about why you are using a particular communication format, and ensure each format provides links for those who want to find out more, including the scientific papers a message is based on.
“We like the launch of a report offering new insights that has synthesised complex evidence, combined with a webinar. You don’t have to read the full report; you can just jump onto a one-hour webinar to get the evidence summary and new insights. A good example was the launch of the OECD report ‘Making Better Policies in Food Systems’, which is over 200 pages long. They launched the report and, in partnership with the academic group N8 Agrifood, presented a webinar with insights from responders, and for a retailer the whole webinar was so useful. This approach saved us a lot of time.” – Food Retailer
“To me policy brief says ‘boring’, the average Joe wouldn’t be reading it” – Academic
"Long documents don't do anyone any good in this area I think is the really key thing. Nobody wants to read a 15-page systematic review on something -- and I say no one, the people who are decision makers, the people who are extraordinarily busy. What they will do is send people off to check evidence, depending on the person. Some people will be very keen to know where is this coming from, especially if they're challenging that position." – Regional Public Health Network
There is varying evidence of effectiveness for different formats. For example, many evidence generators and translators are strongly encouraged to produce policy briefs based on their work, but in reality the evidence of their effectiveness in influencing policy or practice is poor. Table 2 below provides examples of different mechanisms, describing each one alongside the challenges and benefits of using it and an indication of its effectiveness based on the available literature. The mechanisms are colour-coded by effectiveness: red for not effective, yellow for somewhat effective and green for fairly effective.
Table 2. Mechanisms for evidence communication and dissemination (footnote 7)
Mechanism | Description | Challenges | Benefits | Effectiveness |
---|---|---|---|---|
Briefs | “A concise standalone document that prioritises a specific policy issue and presents the evidence in a non-technical and jargon-free language; in general, the purpose is to distil or synthesise evidence with the intention of influencing thinking and actions of policy actors” (footnote 8) | Clarity and maintaining concise messaging; bias; comprehension and unpredictable knowledge base of audience | Relevant and salient (often commissioned); easy comprehension; direct engagement on a specific topic | Valued by participants but little demonstrated impact on policy or practice; largely ineffective for addressing institutional or structural barriers to evidence engagement |
Blogs and Social Media | Quick summaries and highlights of key findings from scientific research, written colloquially | Clarity and maintaining concise messaging; credibility and bias; relevance and salience | Open access; easy comprehension; convenient | Effective for reaching a wide audience and building awareness; unclear or mixed evidence of influence on policy and practice (footnote 9) |
Conferences and Seminars | Formal oral and (sometimes) visual presentations (in person and virtual) of evidence to a group | Engagement; clarity and maintaining concise messaging; comprehension and unpredictable knowledge base of audience | Common venue; often funded; recognition | Ineffective for influencing policy and practice |
Data visualisation | Using design principles to communicate complex information (for example graphs, charts and icons) | Clarity; balancing complexity with concision; bias | Easy comprehension; engaging; accessible | Highly effective when done well (footnote 10) |
Toolkits | Practical guides or handbooks on possible ways to adopt and implement evidence | Clarity; coverage; relevance and usefulness | Easy comprehension; practical to adopt | Moderately effective when tailored to audience needs |
Different users may find different mechanisms useful or familiar. For example, the third sector organisation Incredible Edible is now a large activist network, but the initiative was actually spurred by a TED Talk its founders watched; they adapted the model described in the talk to create the Incredible Edible Project.
Practical examples: Visual Communication
The following examples illustrate different methods of visual communication:
- the Liverpool Good Food Plan has no published document: it is an interactive website complemented by five short animations voiced by people with lived experience; the six months of work behind it produced no written output.
- the Food Systems ‘Flower’ Figure is a ‘visual thinking tool’ created to help policymakers and practitioners consider the food system as a whole, and to support them in identifying connections between activities, outcomes and the related policies. The content of the Figure is grounded in the literature on food systems and food policy, and the design itself was co-created with a professional designer. The Figure has been utilised across policy, practice and academia. [include thumbnail of the diagram]
Checklist
- could you make your evidence more aesthetically pleasing and easy-to-understand through the use of visuals?
- could you present your evidence in an exciting way, such as through video, social media, or a personal experience?
- if using icons, are you confident they are understandable and representative?
- have you considered using a professional designer to help communicate your evidence?
- are there different formats you could utilise (emails, webinars, workshops, summaries, videos, etc.) to improve the likelihood your evidence will reach the user and therefore be actioned?
- could you employ multiple mechanisms and a balance of auditory and visual presentations, to cater to different learning styles?
- are you familiar with the varying evidence of effectiveness for different formats?
- have you considered digital inequality, particularly if your end-users are individual citizens?
Footnotes

1. Phoenix, J. H., Atkinson, L. G. and Baker, H. (2019) ‘Creating and communicating social research for policymakers in government,’ Palgrave Communications, 5(98). See also: Brick, C. (2020) What works for What Works: How to communicate effectively, Gov.uk [Blog]; Brick, C. and Freeman, A. L. J. (2020) Communicating evidence for policy makers in icons and tables: What works? Preprint. University of Amsterdam.
2. Brick, C. (2020) What works for What Works: How to communicate effectively, Gov.uk [Blog]; Bazalgette, L. (2019) Supporting evidence-informed children’s social work: Creating a What Works toolkit, Gov.uk [Blog].
3. Brick, C. (2020) What works for What Works: How to communicate effectively, Gov.uk [Blog]; Brick, C. and Freeman, A. L. J. (2020) Communicating evidence for policy makers in icons and tables: What works? Preprint. University of Amsterdam; Bazalgette, L. (2019) Supporting evidence-informed children’s social work: Creating a What Works toolkit, Gov.uk [Blog]; Phoenix, J. H., Atkinson, L. G. and Baker, H. (2019) ‘Creating and communicating social research for policymakers in government,’ Palgrave Communications, 5(98).
4. Brick and Freeman (2021).
5. Phoenix, J. H., Atkinson, L. G. and Baker, H. (2019) ‘Creating and communicating social research for policymakers in government,’ Palgrave Communications, 5(98).
6. Bazalgette, L. (2019) Supporting evidence-informed children’s social work: Creating a What Works toolkit, Gov.uk [Blog]; Breckon, J. and Dodson, J. (2016) ‘Using evidence: What works? A discussion paper,’ Alliance for Useful Evidence.
7. Balian, E. V. et al. (2016) ‘Supporting evidence-based policy on biodiversity and ecosystem services: recommendations for effective policy briefs,’ Evidence & Policy, 12(3) [restricted access]; Breckon, J. and Dodson, J. (2016) ‘Using evidence: What works? A discussion paper,’ Alliance for Useful Evidence.
8. Balian, E. V. et al. (2016) ‘Supporting evidence-based policy on biodiversity and ecosystem services: recommendations for effective policy briefs,’ Evidence & Policy, 12(3) [restricted access].
9. Breckon, J. and Dodson, J. (2016) ‘Using evidence: What works? A discussion paper,’ Alliance for Useful Evidence.
10. Langer, L., Tripney, J. and Gough, D. (2016) ‘The science of using science: researching the use of research evidence in decision-making,’ UCL Institute of Education.
Revision log
Published: 17 October 2022
Last updated: 17 October 2023