What Difference Does a Policy Brief Make?
Penelope Beynon, Christelle Chapoy, Marie Gaarder and Edoardo Masset of the Institute of Development Studies, the International Initiative for Impact Evaluation and NORAD carried out an intriguing randomised controlled study involving over 800 experts to try to answer this question.
The policy brief used in the study was based on the report: Masset, E.; Haddad, L.; Cornelius, A. and Isaza-Castro, J. (2011) A Systematic Review of Agricultural Interventions that Aim to Improve Nutritional Status of Children. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London. The review included fisheries interventions.
One conclusion was that including an opinion piece with the policy brief “seemed to help in reinforcing certain key messages more than others. Respondents who did not receive an opinion piece tended to identify other key messages within the text of the policy brief.”
Download the full report here
Here is the summary of their results:
Research has potential to improve the lives of the world’s vulnerable people if it is appropriately referred to in decision-making processes. While there is a significant industry of activity each year to communicate research findings, little systematic research has tested or compared the effectiveness of such efforts either for changing beliefs or for prompting action. Using a randomised control design, this study explored the effectiveness of one popular research communication tool, a policy brief, and queried whether different versions of a brief bring about different results.

We find that the policy brief had little effect on changing the beliefs of readers who held strong prior beliefs on entering the study, but had some potential to create evidence-accurate beliefs among readers holding no prior beliefs. Also, when it comes to beliefs, the impact of the policy brief seems to be independent of the specific form of the policy brief. However, different versions of the brief (versions that include a research Opinion with or without a suggestion that the opinion is from an Authoritative source) do achieve different results when it comes to prompting actions. We find that other factors internal and external to the brief (gender of the reader, reader’s self-perceived level of influence and the extent to which the reader feels ‘convinced’ by the brief) are also linked to action.

This first-of-its-kind study has implications for how research communication experts design policy briefs, how they understand and enable readers to act as knowledge brokers in their particular environment, and how we evaluate research communication going forward.