Table 1 - uploaded by Anna Williamson
Definitions of key terms

Source publication
Article
Full-text available
The importance of utilising the best available research evidence in the development of health policies, services, and programs is increasingly recognised, yet few standardised systems for quantifying policymakers' research use are available. We developed a comprehensive measurement and scoring tool that assesses four domains of research use (i.e. i...

Contexts in source publication

Context 1
... scoring tool is a comprehensive checklist that lists the key subactions of the four research use domains (Figure 1 provides an example of the checklist for tactical use). These subactions are the essential features or actions of each research use domain (see Table 1 for definitions of key terms and examples). For example, subactions of tactical research use include using research to validate a predetermined decision, or using research to persuade stakeholders to support a decision. ...
Context 2
... in health economics to determine what health products and services patients prefer, and the attributes driving these preferences [52,53,[56][57][58][59]. In traditional conjoint analysis, respondents rate combinations of subactions called profiles (see Table 1 for definitions). This is an ecologically valid approach, because each type of research use is composed of several smaller actions [28]. ...
Context 3
... identified the subactions of each research use domain, the next step involved dividing each subaction into its levels (Table 1). Levels in conjoint analysis refer to the possible values of a subaction [50]. ...
Context 4
... the levels of subactions was a necessary step before conducting the conjoint analysis, so that profiles could be created. Profiles are combinations of subaction levels (Table 1 and Additional file 2). The final list of subactions and their levels for each research use domain is displayed in Figure 2. ...
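The profile-construction step described in these contexts (enumerating combinations of subaction levels) can be sketched as a full-factorial enumeration. The subaction names and levels below are hypothetical placeholders, loosely modelled on the tactical-use examples in Context 1; the actual subactions and levels are those listed in Figure 2.

```python
from itertools import product

# Illustrative subactions and their levels for one research use domain.
# Names and levels are hypothetical, not taken from the published tool.
subactions = {
    "validate_decision": ["not done", "done"],
    "persuade_stakeholders": ["not done", "partially done", "done"],
}

# A profile is one combination of subaction levels; the full-factorial
# set is the Cartesian product of the level lists.
names = list(subactions)
profiles = [dict(zip(names, combo)) for combo in product(*subactions.values())]

print(len(profiles))  # 2 levels x 3 levels = 6 profiles
```

In practice, conjoint studies usually present only a fractional-factorial subset of these profiles to keep the rating task manageable, which is why the full enumeration is a starting point rather than the final design.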

Citations

... In developing the instrument, the researcher drew on established standards identified from the literature, direct experience of Tanzanian homes for older people, and established international standards. Therefore, the audit instrument was based on previously identified domains, standards, and corresponding criteria (Makkar et al., 2015). In this study, domains were referred to as fields (Audit instrument). ...
Article
Full-text available
Problems experienced in homes for older people in Tanzania highlighted the need for a situation analysis of healthcare standards to identify the baseline of care provided to residents in these homes. This study conducted a situation analysis of structure healthcare standards and associated criteria with the aim of contributing to improved quality of care for residents in homes for older people in Tanzania. Thirty-two homes for older people in Tanzania were audited using an audit instrument that included seven fields, 26 structure standards, and 262 associated criteria. The analysis showed that overall, the homes were non-compliant with healthcare structure standards and associated criteria. The Tanzanian Government should urgently introduce measures to address the missing standards and associated criteria.
... In this regard, one of the necessary interventions is to enhance the understanding of knowledge translation in this group. Knowledge translation includes a wide range of activities that can be used to effectively convey the message of the research to the target group (in this case, the policy-makers) [62,63]. Researchers sometimes tend to overestimate their knowledge translation activities, so it is necessary to train and familiarize them with all the aspects of knowledge transfer and translation as well as the methods for evaluating these activities, which can be highly effective in operationalization and use of research results [64]. ...
Article
Full-text available
Background Providing valid evidence to policy-makers is a key factor in the development of evidence-informed policy-making (EIPM). This study aims to review interventions used to promote researchers’ and knowledge-producing organizations’ knowledge and skills in the production and translation of evidence to policy-making and explore the interventions at the individual and institutional level in the Iranian health system to strengthen EIPM. Methods The study was conducted in two main phases: a systematic review and a qualitative study. First, to conduct the systematic review, the PubMed and Scopus databases were searched. Quality appraisal was done using the Joanna Briggs Institute checklists. Second, semi-structured interviews and document review were used to collect local data. Purposive sampling was used and continued until data saturation. A qualitative content analysis approach was used for data analysis. Results From a total of 11,514 retrieved articles, 18 papers were eligible for the analysis. Based on the global evidence, face-to-face training workshops for researchers were the most widely used intervention for strengthening researchers’ capacity regarding EIPM. Target audiences in almost all of the training programmes were researchers. Setting up joint training sessions that helped empower researchers in understanding the needs of health policy-makers had a considerable effect on strengthening EIPM. Based on the local collected evidence, the main interventions for individual and institutional capacity-building were educational and training programmes or courses related to the health system, policy-making and policy analysis, and research cycle management. To implement the individual and institutional interventions, health system planners and authorities and the community were found to have a key role as facilitating factors.
Conclusion The use of evidence-based interventions for strengthening research centres, such as training health researchers on knowledge translation and tackling institutional barriers that can prevent well-trained researchers from translating their knowledge, as well as the use of mechanisms and networks for effective interactions among policy-makers at the macro and meso (organizational) level and the research centre, will be constructive for individual and institutional capacity-building. The health system needs to strengthen its strategic capacity to facilitate an educational and training culture in order to motivate researchers in producing appropriate evidence for policy-makers.
... Recorded impacts were then classified according to the typology described in Table 2. Descriptive statistics were used to report research impacts by category of impact and type of research. (Makkar et al. 2015). Peer-reviewed research from other locations, as well as that conducted in NSW, was included. ...
Article
Full-text available
Current assessments of research impact have been criticized for capturing what can be easily counted not what actually counts. To empirically examine this issue, we approached measuring research impact from two directions, tracing forwards from research and backwards from policy, within a defined research-policy system (childhood obesity prevention research and policy in New South Wales, Australia from 2000 to 2015). The forward tracing research impact assessment component traced a sample of 148 local research projects forward to examine their policy impacts. Of the projects considered, 16% had an impact on local policy and for a further 19%, decision-makers were aware of the research, but there was no evidence it influenced policy decisions. The backward tracing component of the study included an analysis of research use across three policy initiatives. It provided a more nuanced understanding of the relative influence of research on policy. Both direct uses of specific research and indirect uses of research incorporated as broader bodies of knowledge were evident. Measuring research impact from both directions captured the diverse ways that research was used in decision-making. Our findings illustrate complexities in the assessment process and in real-life policymaking trajectories. They highlight the role that timing of assessment plays in perception of impacts and difficulties attributing longer-term impacts to specific research. This study supports the use of models where politics and complex system dynamics shape knowledge and its influence on decision-making, rather than research being the primary driver for policy change.
... • identify and explore the nature of policy determinants in the development and implementation of the Future Directions strategy, using the Analysis of Determinants of Policy Impact (ADEPT) determinants interview guide (Rütten, Röger, Abu-Omar and Frahsa, 2009), and • assess the use of evidence in the development of the Future Directions strategy using a validated tool based on the Supporting Policy In health with Research: an Intervention Trial (SPIRIT) Action Framework, called the Staff Assessment of enGagement with Evidence (SAGE) interview tool (Makkar et al., 2015). ...
Article
Full-text available
Background: In the Australian state of New South Wales nearly 60,000 approved applicants are waiting for social housing. Future Directions for Social Housing is a response to this challenge. This collection of housing programs aims to provide more social housing, support and incentives for leaving social housing and a better social housing experience. This document presents the protocol of the evaluation of these programs and the overarching Future Directions Strategy. Methods/Design: The evaluation will use a Type 1 effectiveness-implementation hybrid design, with an integrated, dual focus on assessing the effectiveness of Future Directions and better understanding the context for reform implementation. Program effectiveness will be examined using quasi-experimental techniques applied to linked administrative data. The implementation context will be examined via program level data, qualitative interviews and focus groups with stakeholders and tenants. Some quantitative survey and administrative data will also be used. Findings from the implementation evaluation will be used to inform and interpret the effectiveness evaluation. Economic evaluations will also be conducted. Discussion: This methodology will produce a high-quality evaluation of a large, complex government program which aims to facilitate rapid translational gains, real-time adoption of effective implementation strategies and generate actionable insights for policymakers.
... In this regard, one of the necessary interventions is to enhance the understanding of knowledge translation in this group. Knowledge translation includes a wide range of activities that can be used to effectively convey the message of research to the target group (in this case, the policymakers) (27,28). The study by Majdzadeh et al. showed that researchers tend to overestimate their knowledge translation activities, making it necessary to train and familiarize them with all the aspects of knowledge transfer and translation as well as the methods of evaluating these activities, which can be highly effective in the operationalization and use of research results (29). ...
Preprint
Full-text available
Background: Providing appropriate information to policymakers by strengthening evidence-based capacity is a key factor in the development of evidence-informed policy-making (EIPM). This study aims to examine the necessary interventions in the Iranian health system for empowering researchers and knowledge-producing organizations to strengthen EIPM. Methods: This qualitative study was conducted using interviews and document review. The views and experiences of interviewees were extracted through semi-structured interviews. Purposive sampling was used and continued until data saturation. Thematic framework analysis and MAXQDA 12 software were used for data analysis. Results: Necessary interventions for empowering researchers and knowledge-producing organizations were categorized into health system interventions, community-based interventions, organization interventions, and individual interventions. Conclusion: Incompatibility of health policy decisions with scientific evidence derived from research highlights the importance of creating a common language among health policymakers and researchers. In this regard, developing scientific and practical interventions, educating health researchers on knowledge translation, and using mechanisms and networks for effective interaction will be constructive.
... The application of conjoint analysis in the study, which combined a real-life example of EBI with multifaceted and multilevel attributes, provided insights into the value that decision-makers place on features of a given intervention package. The method has increasingly been applied in implementation research because, compared to conventional prioritization methods, conjoint analysis has several advantages: first, the method offers greater realism, grounds attributes in concrete descriptions, and extends the idea of side-by-side comparisons [15]; second, instead of 'stated importance', the method provides more scientific rigor by quantifying 'derived importance' values for each attribute or feature in the process of decision-making [20][21][22][23]; and third, conjoint analysis offers the potential of using a simulation model to predict how hospital stakeholders would respond to a new EBI or changes to existing intervention models [24]. The relatively short data collection time, the respondents' positive evaluation, and the meaningful results all suggested the feasibility of this method in assessing intervention adoption preferences among hospital stakeholders. ...
... The stakeholder preferences identified in the study are not limited to the stigma-reduction intervention. On the contrary, they provide implications for EBI adoption for other projects and in other contexts; (2) assigning the component levels of the attributes: it was suggested that the levels of attributes should be stated in concrete terms [22]. Therefore, we have provided specific examples of component levels for some of the attributes, including personnel involvement, duration of the intervention, and format. ...
Article
This study used conjoint analysis, a marketing research technique, to investigate hospital stakeholders’ decision-making in adoption of evidence-based interventions (EBI). An efficacious hospital-based stigma-reduction intervention was used as a ‘product’ to study adoption of EBI. Sixty hospital directors in Fujian, China evaluated the likelihood of adopting the EBI in their hospitals by rating across eight hypothetical scenarios with preferred and non-preferred levels of seven attributes, including (1) administrative support, (2) cost, (3) personnel involvement, (4) format, (5) duration, (6) technical support, and (7) priority alignment with the hospital. A hierarchical generalized linear model was fit to the likelihood of intervention adoption for the eight scenarios, with the seven attributes served as independent variables. Monetary cost of intervention implementation (impact score = 2.12) had the greatest impact on the directors’ reported likelihood of adopting the EBI, followed by duration of the intervention (impact score = 0.88), availability of technical support (impact score = 0.69), and flexibility of format (impact score = 0.36). The impact scores of other attributes were not statistically significant. Conjoint analysis was feasible in modeling hospital directors’ decision-making in adoption of EBI. The findings suggested the importance of considering cost, duration, technical support, and flexibility of format in development and dissemination of interventions in healthcare settings.
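The estimation idea in the abstract above — deriving a per-attribute impact score from ratings of scenarios composed of preferred and non-preferred attribute levels — can be sketched with a simple regression. This is a simplification: the study fit a hierarchical generalized linear model, and the data and coefficients below are synthetic, generated purely for illustration, not the study's results.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic design: 60 raters x 8 scenarios, each scenario coded as
# preferred (1) / non-preferred (0) on 3 attributes. All values fabricated.
n_raters, n_scenarios, n_attrs = 60, 8, 3
X = rng.integers(0, 2, size=(n_raters * n_scenarios, n_attrs)).astype(float)

# True part-worths used to simulate adoption-likelihood ratings; the
# fitted coefficients ("derived importance") should recover them.
true_w = np.array([2.0, 0.9, 0.4])
y = 3.0 + X @ true_w + rng.normal(0, 0.5, size=X.shape[0])

# Ordinary least squares with an intercept column (a flat stand-in for
# the study's hierarchical GLM, which also models rater-level variation).
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
impact = coef[1:]  # estimated impact of switching each attribute to its preferred level
```

Ranking the entries of `impact` reproduces the kind of ordering reported in the abstract (e.g. cost mattering most), except that here the ordering is built into the simulated `true_w`.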
... Findings add to the literature by revealing no association between reported research use barriers and actual research use in policy, and by highlighting access to consultants and researchers as useful strategies for increasing the quality of the evidence used in policy. ... (4) appraising research quality in terms of methodological rigour and validity, (5) generating new research and/or data analyses, and (6) interacting with researchers [33]. According to the Framework, if the policymaker performs one or more of these actions, and relevant research is obtained, this research can then be used in four different ways in policymaking: (1) instrumental use whereby research evidence directly informs policy [15,36,37], (2) conceptual use where research is used to clarify understanding about the policy issue [38][39][40], (3) tactical use where research evidence is used to help justify and/or persuade others to support a predetermined decision [40,41], or (4) imposed use where research evidence is used due to legislative, funding, or organisational requirements [42]. ...
... The interview format allows for in-depth exploration of whether and how research was used in the development of the document and barriers and facilitators to its use. An empirically derived scoring system has been developed for SAGE [40,41]. The scoring checklist breaks down each of the ten measured domains (six research engagement actions and four types of research use) into the essential features or main actions associated with them (subactions). ...
Article
Full-text available
Background Much has been written about the use of evidence in policy; however, there is still little known about whether and how research is engaged with and used in policy development or the impact of reported barriers and facilitators. This paper aims to (1) describe the characteristics of 131 policy documents, (2) describe the ways in which research was engaged with (e.g. was searched for, appraised or generated) and used (e.g. to clarify understanding, persuade others or inform a policy) in the development of these policy documents, and (3) identify the most commonly reported barriers and facilitators and describe their association with research engagement and use. Methods Six health policy and program development agencies based in Sydney, Australia, contributed four recently finalised policy documents for consideration over six measurement periods. Structured, qualitative interviews were conducted with the policymakers most heavily involved in developing each of the 131 policy documents. Interviews covered whether and how research was engaged with and used in the development of the policy product and any barriers or facilitators related to this. Interviews were scored using the empirically validated SAGE tool and thematically analysed. Descriptive statistics were calculated for all key variables and comparisons made between agencies. Multiple regression analyses were used to estimate the impact of specific barriers and facilitators on research engagement and use. Results Our data shows large variations between policy agencies in the types of policy documents produced and the characteristics of these documents. Nevertheless, research engagement and use was generally moderate across agencies. A number of barriers and facilitators to research use were identified. No barriers were significantly associated with any aspects of research engagement or use. 
Access to consultants and relationships with researchers were both associated with increased research engagement but not use. Thus, access to consultants and relationships with researchers may increase the extent and quality of the evidence considered in policy development. Conclusions Our findings suggest that those wishing to develop interventions and programs designed to improve the use of evidence in policy agencies might usefully target increasing access to consultants and relationships with researchers in order to increase the extent and quality of the research considered, but that a greater consideration of context might be required to develop strategies to increase evidence use. Electronic supplementary material The online version of this article (10.1186/s13012-019-0886-2) contains supplementary material, which is available to authorized users.
... At the outset, we located only a few measures relevant to the capacity of agencies to find and use research, and these had variable levels of psychometric testing [36][37][38]; none of these measures aligned well with the variables in the SPIRIT Action Framework. As part of our conceptual and methodological platform, we therefore developed and tested three new measures aligned to the SPIRIT Action Framework to measure changes in individual staff (Seeking, Engaging with and Evaluating Research (SEER)) [39], the organisation (Organisational Research Access, Culture and Leadership (ORACLe)) [40], and the policy products produced (Staff Assessment of enGagement with Evidence (SAGE)) [41][42][43] (Table 1). All of these measures performed well in psychometric testing; however, it became evident that, because policy products can take over a year to develop, SAGE would not provide dependable measures of change in an intervention with short- to medium-term follow-up. ...
Article
Full-text available
Background This paper describes the trial of a novel intervention, Supporting Policy In health with evidence from Research: an Intervention Trial (SPIRIT). It examines (1) the feasibility of delivering this kind of programme in practice; (2) its acceptability to participants; (3) the impact of the programme on the capacity of policy agencies to engage with research; and (4) the engagement with and use of research by policy agencies. Methods SPIRIT was a multifaceted, highly tailored, stepped-wedge, cluster-randomised trial involving six health policy agencies in Sydney, Australia. Agencies were randomly allocated to one of three start dates to receive the 1-year intervention programme. SPIRIT included audit, feedback and goal setting; a leadership programme; staff training; the opportunity to test systems to facilitate research use in policies; and exchange with researchers. Outcome measures were collected at each agency every 6 months for 30 months. Results Participation in SPIRIT was associated with significant increases in research use capacity at staff and agency levels. Staff reported increased confidence in research use skills, and agency leaders reported more extensive systems and structures in place to support research use. Self-report data suggested there was also an increase in tactical research use among agency staff. Given the relatively small numbers of participating agencies and the complexity of their contexts, findings suggest it is possible to effect change in the way policy agencies approach the use of research. This is supported by the responses on the other trial measures; while these were not statistically significant, on 18 of the 20 different measures used, the changes observed were consistent with the hypothesised intervention effect (that is, positive impacts).
While more work is needed to establish the replicability and generalisability of these findings, this trial suggests that building staff skills and organisational structures may be effective in increasing evidence use. Electronic supplementary material The online version of this article (10.1186/s12961-018-0408-8) contains supplementary material, which is available to authorized users.