NIDILRR is always looking for new ways to determine the merit, worth, or value of its program funding mechanisms, its grants, and the research, development, training, and knowledge translation projects conducted under the auspices of grants studying a particular disability-related problem area.
To that end, NIDILRR has used various summative evaluation approaches over the years.
Defining Summative Evaluation
Michael Scriven’s definition is: Evaluation determines the merit, worth, or value of things. The evaluation process identifies relevant values or standards that apply to what is being evaluated, performs empirical investigation using techniques from the social sciences, and then integrates conclusions with the standards into an overall evaluation or set of evaluations (Scriven, 1991).
Reference: Scriven, M. (1991). Evaluation thesaurus (4th ed.). Newbury Park, CA: Sage.
Summative Evaluation is the process of determining the merit, worth, or value of programs or projects at the end of their operating cycle.
Summative Evaluation at NIDILRR
- 2009
NIDILRR asked the National Academy of Sciences—National Research Council to conduct an external evaluation of NIDILRR (known as NIDRR then) and its grantees.
Staff of the National Academy of Sciences interviewed NIDILRR staff and worked with NIDILRR management and data staff. This interaction greatly facilitated the refinement of the study’s research questions and the identification and use of data sources.
- 2012
The results of the National Academy of Sciences summative evaluation were published in a 2012 report, Review of Disability and Rehabilitation Research: NIDILRR Grantmaking Processes and Products. Two noteworthy conclusions from the report are summarized below:
- The National Academy of Sciences (NAS) Committee found that 75% of outputs from a sample of 30 grantees were rated good to excellent. A total of 148 outputs produced by the 30 grantees in the study were rated using the quality criteria developed by NAS.
The NAS Committee offered NIDILRR a number of recommendations aimed at process improvement:
Priority-Setting Advisory Council
Recommendation 3-1: NIDILRR should fulfill the statutory mandate to form and utilize a standing disability and rehabilitation research advisory council to advise on the priority-setting process and provide input for priority setting.
Strategic Planning
Recommendation 3-2: NIDILRR should use a structured, consistent, and inclusive strategic planning process to develop its long-range plans and priorities.
Establishment of a Standard Calendar
Recommendation 3-3: NIDILRR should utilize a standard calendar for the setting of priorities, publication of notices inviting applications, submission of applications, and peer review meetings to improve the efficiency of the process.
Soliciting Applications
Recommendation 3-4: NIDILRR should expand its efforts to disseminate notices inviting applications to new potential applicants, including developing a communication strategy to ensure that the notices reach new audiences.
Enhancements to the Peer Review Process
Recommendation 4-1: NIDILRR should further strengthen the peer review infrastructure by expanding the pool of high-quality reviewers; establishing standing panels, or formal cohorts of peer reviewers with specialized knowledge and expertise as appropriate for the program mechanisms; enhancing reviewer training; and improving the consistency of NIDILRR staff facilitation of panel meetings and the quality of feedback provided to grantees.
Reducing Reviewer Burden
Recommendation 4-2: NIDILRR should streamline the review process in order to reduce the burden on peer reviewers.
Use of Consumer Peer Reviewers
Recommendation 4-3: NIDILRR should continue to have consumer representation in the peer review process and establish procedures to guide the participation of those without scientific expertise.
Grant Management
Recommendation 5-1: NIDILRR should continue to focus efforts on improving its grant monitoring procedures and specific elements of its overall grant management system that impact grantee-level planning, budgets, and the quality of outputs.
Recommendation 5-2: NIDILRR should review the requirements placed upon technical innovation grants and large multisite studies to ensure that planning, reporting, supervisory, and technical assistance requirements fit their particular circumstances.
Quality of Outputs
Recommendation 6-1: Although close to 75 percent of outputs were rated as good to excellent (i.e., 4 or higher on the seven-point quality scale), NIDILRR should make it clear that it expects all grantees to produce the highest-quality outputs.
Recommendation 6-2: NIDILRR should consider undertaking bibliometric analyses of its grantees’ publications as a routine component of performance measurement.
Defining Future Evaluation Objectives
Recommendation 6-3: NIDILRR should determine whether assessment of the quality of outputs should be the sole evaluation objective.
Reviewer Expertise
Recommendation 6-4: If future evaluations of output quality are conducted, the process developed by the committee should be implemented with refinements to strengthen the design related to the diversity of outputs, timing of evaluations, sources of information, and reviewer expertise.
Improving Use of the Annual Performance Report
Recommendation 6-5: NIDILRR should consider revising its APR to better capture information needed to routinely evaluate the quality and impacts of outputs, grants, or program mechanisms. Such efforts might include consolidating existing data elements or adding new elements to capture the quality criteria and dimensions used in the present summative evaluation.
Recommendation 6-6: NIDILRR should investigate ways to work with grantees to ensure the completeness and consistency of information provided in the APR.
- 2013
Abt Associates, Inc. completed its five-year evaluation plan for NIDILRR
The plan included a set of research questions aimed at assessing the effectiveness and efficiency of NIDILRR’s operations as well as the quality and impacts of NIDILRR-funded activities and products. These questions are grouped around four domains:
- Domain 1 questions explore the quality and impacts of NIDILRR-funded activities and products
- Domain 2 questions explore the effectiveness of NIDILRR’s process for determining grant priorities
- Domain 3 questions explore the effectiveness of NIDILRR’s funding decision process
- Domain 4 questions explore the effectiveness of NIDILRR’s process of grants monitoring
Each of these domains has a subset of questions to be addressed as part of the overall evaluation. Abt Associates then identified the tools that needed to be developed to answer these questions. The potential tools are listed below:
- Analysis of the APR data
- Grantee and trainee survey
- Bibliometric analysis
- NIDILRR staff survey
- Expert panel review of grantee products
- NARIC data abstract
- NIDILRR partner survey
- Knowledge Translation feasibility study
- Advisory board survey
- Peer review survey
As part of the plan, Abt Associates, Inc. also developed a schedule of evaluation activities that identifies the programmatic focus for each year’s analysis, the specific questions to be addressed, and the data sources.
- 2014
Abt Associates, Inc. began the second phase of this contract by developing the tools for collecting the data to address key questions
Substantial work went into this phase because many data collection tools were developed. Each tool was tailored for use in the appropriate situation and designed to help answer the domain-specific research questions (see 2013) that guide and inform the evaluation plan.
In sum, the planning phase of the project was successfully completed in 2014.
- 2015
Abt Associates began the evaluation of the NIDILRR program portfolio
NIDILRR already collects extensive outcome data on funded projects while they are active. Consequently, the evaluation focuses on the longer-term outcomes of NIDILRR programs as well as on the effectiveness and efficiency of the grant-making process. Specifically, the objective of the evaluation is to address the following questions:
- What are the longer-term benefits of NIDILRR funding for individuals with disabilities and their caregivers, the research community, federal partners, and other stakeholders?
- How can NIDILRR administer its programs most efficiently?
Each year, the evaluation will focus on a portion of the NIDILRR program portfolio and on an assessment of some aspect of the grant-making process, such as strategic planning, internal management, or peer review.
In year one of the five-year evaluation, Abt Associates examined three programs: the Disability and Business Technical Assistance Centers (now known as ADA National Network Centers), Knowledge Translation, and Small Business Innovation Research. In addition, Abt Associates evaluated participant satisfaction with the entire application process and with the funding level and duration.
Abt Associates engaged in several data collection activities:
- Conducted an online survey of all Principal Investigators (PIs) funded by the three programs under review to determine whether they had been able to continue the work funded under their grants and to document the grants’ benefits
- Interviewed users of NIDILRR-funded work, other than the grantees, to validate and supplement the information on benefits provided by the PIs
- Conducted a bibliometric analysis of publications citing Knowledge Translation (KT) grants to characterize PI productivity and influence
- Performed web searches for the companies cited by SBIR grantees to collect additional information on commercialization
- Surveyed all recent NIDILRR applicants to determine the strengths and weaknesses of the application process
- 2016
Abt Associates published NIDILRR Survey Support for Evaluation Activities: Year 1 Report, part of NIDILRR’s five-year evaluation plan
The report evaluates the Disability and Business Technical Assistance Centers (DBTACs) program, now known as the ADA National Network Centers; the Knowledge Translation program; and the Small Business Innovation Research program. Feedback was obtained from two user groups: principal investigators funded under these grant programs and customers of these grant programs.
The report also evaluates participant satisfaction with the entire application process and with the funding level and duration.
Evaluation Contact
Have questions about NIDILRR Summative Evaluation Efforts?
Email William Schutz (william.schutz@acl.hhs.gov), NIDILRR’s evaluation specialist.