A range of techniques and methods exist to assemble and present research findings in a way that will be ‘useful’ to policymakers. In public health, three of the most popular are Health Impact Assessments, systematic reviews, and economic decision-making tools (including cost-benefit analysis and scenario modelling). Despite these broadly shared goals, the three methodologies have developed distinct, and often parallel, ‘epistemic cultures’ (Knorr-Cetina), through mailing lists, training courses, journals and conferences devoted to each one. In a recent article we conceptualised all three as examples of ‘evidence tools’, arguing that despite their differences they all assemble, assess and present evidence in an effort to influence decision-making processes. Paradoxically, we found that despite this explicit aim, very little attention had been paid to how policymakers actually experience these tools. Drawing on Katherine’s interviews with public health policymakers, we found that, in policy practice, evidence tools are perceived as useful when they:
save time, especially where the work has been carried out by others
can be adapted to different contexts
convey credibility to external audiences
offer clear, quantified answers and/or predictions of likely policy outcomes
Scenario modelling, which is widely perceived to have been a critical factor in the introduction of minimum unit pricing for alcohol in Scotland, was described as particularly appealing because it predicted a very specific, quantified benefit (for example, the potential number of lives saved). This was described as ‘gold dust’ in the political process. However, most research users frankly admitted that they had little understanding of how the modelling produced this figure. In contrast to researchers who have found that policymakers value transparency in their evidence tools, we argue that in public health policy, at least in this particular example, this opacity was far from a drawback: the ‘black magic’ of modelling actually appeared to enhance its appeal.
The practical, technical advice often offered to researchers to make their findings more ‘useful’, or ‘impactful’, tends to present failures of evidence-based policy as a supply-side issue: research findings are not relevant enough, are too wordy, or are buried in obscure academic journals. In contrast, examining how policy actors describe using tailor-made ‘evidence tools’ highlights the complicated role evidence plays within the inevitably political and democratic process of policymaking.