2018 State Standard of Excellence

Leading Examples

Results for America classified state governments’ data-driven and evidence-based practices, policies, programs, and systems as “leading” examples based on whether the examples met the requirements of the criteria question; were in effect in April 2018; were verifiable with publicly available information; and the extent to which the examples exhibited four characteristics: breadth, depth, legal framework, and interconnectedness. See the Methodology section (p. 14) of the full report for more details.

1. Strategic Goals

Did the Governor have public statewide strategic goals?

Colorado



The Colorado Governor’s Office publishes statewide strategic goals and statewide and agency-specific outcomes on its performance dashboard. The Governor’s annual budget request links these goals to specific agency activities and outcomes.

2. Performance Management / Continuous Improvement

Did the state or any of its agencies implement a performance management system aligned with their statewide strategic goals; with clear and prioritized outcome-focused goals, program objectives, and measures; and did it consistently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance?

Tennessee



Tennessee’s data and performance website, Transparent TN, features statewide performance dashboards with specific sub-goals, targets, and performance data. The site also includes fiscal data on agencies’ programmatic spending and other expenditures, and publicizes strategic goals in the areas of education and workforce development; fiscal strength and efficient government; health and welfare; jobs and economic development; and public safety.

3. Data Leadership

Did the governor’s office or any state agency have a senior staff member(s) with the authority, staff, and budget to collect, analyze, share, and use high-quality administrative and survey data—consistent with strong privacy protections—to improve (or help other entities including but not limited to local governments and nonprofit organizations improve) federal, state, and local programs? (Example: chief data officer)

Indiana



A 2017 Indiana law established the position of chief data officer (p. 8) with the budget, staff, and authority to (1) coordinate data analytics and data transparency for state agencies; (2) advise state agencies regarding best practices for data maintenance, security, and privacy; and (3) oversee the Indiana Management Performance Hub, which uses state data, such as the Education and Workforce Development database, to provide “analytics solutions tailored to address complex management and policy questions enabling improved outcomes.”

4. Data Policies / Agreements

Did the state or any of its agencies have data-sharing policies and data-sharing agreements—consistent with strong privacy protections—with any nonprofit organizations, academic institutions, local government agencies, and/or federal government agencies which were designed to improve outcomes for publicly funded programs, and did it make those policies and agreements publicly available? (Example: data-sharing policy, open data policy)

Washington

Multiple agencies

The Washington Education Research and Data Center operates under a memorandum of understanding that identifies how data will be collected and shared among partners, with a strong focus on protecting individual privacy. The center brings together eleven partners, including other state agencies and nonprofits, to compile education and workforce data to improve student achievement and workforce outcomes.

5. Data Use

Did the state or any of its agencies have data systems consistent with strong privacy protections that linked multiple administrative datasets across state agencies, and did it use those systems to improve federal, state, or local programs?

Kentucky


Multiple agencies

A 2013 Kentucky law established the Kentucky Center for Education and Workforce Statistics, which collects and links high-quality, actionable data from five state agencies to improve education and workforce programs in the state. By providing data sets, publishing reports, and fulfilling research requests, the center provides state-specific insights with appropriate data privacy and data access measures. It has more than 40 staff members who are dedicated to “developing reports, responding to research requests, and providing statistical data about these efforts so policymakers, agencies, and the general public can make better informed decisions” (p. 7). The center is run by an executive director with oversight from a board composed of participating state agencies. The center has also developed a research agenda for 2017–2019.

6. Evaluation Leadership

Did the governor’s office or any state agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them? (Example: chief evaluation officer)

Colorado



Colorado’s lieutenant governor serves as the state’s chief operating officer and is responsible for working with agencies on the state’s performance management, process improvement, accountability, and transparency. In compliance with Colorado’s State Measurement for Accountable, Responsive and Transparent Government (SMART) Act, the lieutenant governor oversees the Governor’s Dashboard with the goal of improving services for residents. The lieutenant governor’s office also spearheaded the launch of the Colorado Evaluation and Action Lab, which is helping departments evaluate their programs.

7. Evaluation Policies

Did the state or any of its agencies have an evaluation policy, evaluation plan, and research/learning agenda(s) and did it publicly release the findings of all completed evaluations?

Massachusetts


Single agency

8. Evaluation Resources

Did the state or any of its agencies invest at least 1% of program funds in evaluations?


Results for America was not able to identify any states with leading or promising examples for this criterion.

9. Outcome Data

Did the state or any of its agencies report or require outcome data for its state-funded programs during their budget process?

New Mexico



A 1999 New Mexico law (p. 5) requires all New Mexico state agencies to submit annual performance-based budget requests which include (1) the outputs and outcomes from each program, (2) performance measures and performance targets for each program, and (3) an evaluation of the program’s performance. This information is released annually in the state’s policy and fiscal analysis, which includes individual agency performance reports (pp. 87–129) and information on the cost effectiveness of different programs (pp. 15–20, 49–50).

10. Evidence Definition and Program Inventory

Did the state or any of its agencies release a common evidence framework, guidelines, or standards to inform its research and funding decisions and make publicly available an inventory of state-funded programs categorized based on at least two tiers of evidence?

Minnesota


Multiple agencies

Under a 2015 Minnesota law (section 13), the Minnesota Management and Budget Office developed numerous inventories and cost-benefit analyses of evidence-based programs. These inventories cover adult criminal justice, mental health, child welfare, juvenile justice, and substance use. As part of these inventories, the state developed evidence definitions to categorize interventions into four tiers: proven effective, promising, theory-based, or no effect. Further, Minnesota published a guide for using evidence in policymaking to help policymakers use “the effectiveness of previously implemented policies or programs to inform management, policy, and budget decisions.”

11. Cost-Benefit Analysis

Did the state or any of its agencies assess and make publicly available the costs and benefits of public programs?

Washington

Multiple agencies

A 2013 Washington State law (pp. 105–106) directed the Department of Corrections, in consultation with the Washington State Institute for Public Policy (WSIPP), to (1) compile an inventory of existing programs; (2) determine whether its programs were evidence-based; (3) assess the effectiveness, including a cost-benefit analysis, of its programs; and (4) phase out ineffective programs and implement evidence-based programs. As a result of this and similar laws, WSIPP has published hundreds of cost-benefit analysis reports over the past 10 years.

12. Use of Evidence in Grant Programs

Did the state or any of its agencies: (1) invest at least 50% of program funds in evidence-based solutions or (2) use evidence of effectiveness when allocating funds to eligible grantees (including local governments) from its five largest competitive and noncompetitive grant programs?

Oregon


Multiple agencies

A 2003 Oregon law states that the Oregon Department of Corrections, the Oregon Youth Authority, the Oregon Youth Development Division, and “the part of the Oregon Health Authority that deals with mental health and addiction issues” shall (1) “spend at least 75 percent of state moneys that the agency receives for programs on evidence-based programs” by 2011, (2) perform cost-benefit analyses, and (3) compile a biennial program inventory with results from funded programs.

13. Innovation

Did the state or any of its agencies have staff, policies, and processes in place that encouraged innovation to improve outcomes?

California


Multiple agencies

The California Health and Human Services Agency’s Let’s Get Healthy California Innovation Challenge 2.0 awarded grants to 12 community-based initiatives to advance California’s goal of becoming the healthiest state in the nation by 2022. In the selection process, applications were scored based on data use (“the extent to which data was effectively used to inform, target, and evaluate the innovation”) and effectiveness (“the extent to which the innovation’s results were achieved or show promise of being successful with the intended population”), among other criteria.

In 2011, the California Franchise Tax Board launched the Enterprise Data to Revenue project, a multiyear tax system modernization to increase efficiency and improve services for California taxpayers. The five-year project generated approximately $3.7 billion in additional revenue, with recurring additional revenue of $1 billion annually, through five components: an automatic tax return service, a new data warehouse and analytics tool that incorporated legacy tax data, a new customer service interface, an improved case management system, and enhanced tools for collections. Launched in 2016, the second phase of the project will build on these improvements.

California’s Eureka Institute “guides, supports, and integrates innovation and drives continuous improvement throughout state government” as a way to improve the impact of the state’s programs. The institute trains state employees in leadership, the use of open data, and Lean techniques, which are designed to improve customer service.

14. Contracting for Outcomes

Did the state or any of its agencies enter into performance-based contracts and/or use active contract management (frequent use of data and regular communication with providers to monitor implementation and progress) to improve outcomes for publicly funded programs?

Rhode Island


Single agency

Since 2015, Rhode Island’s Department of Children, Youth, and Families has worked to reform and restructure the department’s procurement processes in four areas: improving service delivery through strategic planning, embedding results-driven procurement in new contracts, improving performance through active contract management practices, and supporting results-driven contracting practices through technical resources, tools, and processes for staff. As part of this initiative, the department executed $90 million in results-driven contracts that require providers to meet outcome goals rather than output metrics. This work has reduced the number of children in group care by nearly 20%, reduced the number of children in state custody through improved preventive services, expanded the services available to families and children, and improved the department’s procurement process.

15. Repurpose for Results

Did the state or any of its agencies shift funds away from any practice, policy, or program which consistently failed to achieve desired outcomes?

Minnesota


Multiple agencies

A 2014 Minnesota law (subdivision 7) requires the Minnesota Department of Human Services to use the Self-Support Index to monitor each county’s performance in assisting clients to become self-sufficient. Counties that meet performance targets receive a 2.5% bonus payment from the state, whereas counties that perform below the expected target must submit a performance improvement plan. In counties where “no improvement is shown by the end of the multiyear plan, the county’s or tribe’s allocation must be decreased by 2.5 percent” [256J.626(7)(a)(2)].

A 2016 Minnesota law (section 14, line 15.21) allows savings from reduced sentences for minor drug offenders to be applied to evidence-based drug and mental health treatments for people in prison and under supervised release. The evidence supporting this law comes from the Department of Corrections’ own research, which found (p. 26) that drug treatment reduces recidivism.