Category: Dissemination / Implementation
Keywords: Measurement | Implementation
Presentation Type: Symposium
There is a need for valid and reliable measures of implementation-related constructs; however, practitioners are unlikely to use these measures if they are not pragmatic. Glasgow and Riley suggest that pragmatic measures are important to stakeholders, of low burden for respondents and staff, 'actionable,' and sensitive to change. These criteria have considerable face validity, but they were not informed by stakeholder input or a systematic integration of the literature. The aim of this study was to develop a literature- and stakeholder-driven operationalization of the pragmatic measurement construct for use in implementation science and related fields. To accomplish this, we conducted 1) a systematic review, 2) semi-structured interviews (n=7), 3) a concept mapping process (n=24), and 4) a two-round Delphi process with stakeholders (n=26) with experience in behavioral health and implementation research and practice. The systematic review and semi-structured interviews were conducted to generate a preliminary list of criteria for the pragmatic measurement construct (e.g., low cost, brief), and yielded 47 items after duplicates were removed. Concept mapping was conducted to produce conceptually distinct clusters of the pragmatic measurement criteria and to yield item- and cluster-specific ratings of their clarity and importance. The 47 criteria were meaningfully grouped into four distinct categories: 1) useful (e.g., "informs decision making"), 2) compatible (e.g., "the output of routine activities"), 3) easy (e.g., "brief"), and 4) acceptable (e.g., "offers relative advantage"). Average ratings of clarity and importance for each criterion were used to trim the list prior to the multi-round Delphi process, which was intended to further refine the set of criteria and obtain stakeholder consensus on their clarity and importance.
The two-round Delphi process yielded consensus on all but one item, and qualitative comments provided during the process supported consensus on that remaining item as well. The final set will be used to develop quantifiable pragmatic rating criteria that can be used to assess measures in implementation research and practice.
Vice President of Clinical Training, Evidence-Based Practices, and Research and Evaluation
Hathaway-Sycamores Child and Family Services
Friday, November 17
12:45 PM – 1:45 PM