Government Program Evaluation and Design

As evaluation practices increase, what role can design play?

The design world’s role in social innovation has grown tremendously over the past ten years. A human-centered approach to design yields products and services that are sensitive and responsive to the needs of the populations they serve. Designers use ethnographic and social science methods to get their creations right the first time and avoid user error after launch. Recently, trends in government program administration offer an opportunity for designers to participate after launch as well, in program evaluation.

Government programs are increasingly required to build evaluation into their budgets, ensuring quality and measuring impact on a continuing basis. The federal government is nudging budget allocation by setting aside grant funding for evaluation. This incentivizes building on programs that are demonstrably successful and should improve government initiatives overall. Evaluation typically takes the form of a randomized controlled trial: researchers randomly assign some people to enroll in a program, organize a control group of individuals who are not enrolled, and then measure the differences in outcomes between the two groups.
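The random-assignment logic described above can be sketched in a few lines of Python. Everything here is hypothetical: the candidate pool, the outcome measure, and the assumed program effect are all placeholders, not data from any real evaluation.

```python
import random
import statistics

# Illustrative sketch of a randomized controlled trial with made-up data:
# randomly split a candidate pool into a program group and a control group,
# then compare average outcomes between the two.

random.seed(42)

candidates = list(range(100))  # hypothetical pool of eligible individuals
random.shuffle(candidates)     # random assignment guards against selection bias
program_group = candidates[:50]
control_group = candidates[50:]

def measure_outcome(person, enrolled):
    # Placeholder outcome; a real evaluation would use observed data
    # such as employment status or recidivism.
    base = random.gauss(50, 10)
    return base + (5 if enrolled else 0)  # assume a modest program effect

program_outcomes = [measure_outcome(p, True) for p in program_group]
control_outcomes = [measure_outcome(p, False) for p in control_group]

# The difference in group means estimates the program's effect.
effect = statistics.mean(program_outcomes) - statistics.mean(control_outcomes)
print(f"Estimated program effect: {effect:.1f}")
```

In practice an evaluator would also test whether the measured difference is statistically significant, but the core idea is this simple comparison between randomly assigned groups.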

Massachusetts is engaging in this evaluation-based approach, launching the largest social impact bond in the United States. The state is partnering with Roca on what is known as the Massachusetts Juvenile Justice Pay for Success Project, which aims to keep at-risk young men out of prison and employed. Roca is working with three other organizations for advisory services and upfront financial support, but under the terms of the social impact bond, the program must set and meet specific, tangible goals in order to receive state funding. If randomized controlled trials show that the goals are met, the Commonwealth will contribute $11.7 million in funding procured from the US Department of Labor.

Opportunities for designers

David Bornstein’s analysis of these evaluation programs argues that federal agencies fall short in maintaining standards of rigor and in packaging evidence in accessible and timely ways. David Leonhardt’s article on the topic similarly cautions that “none of this work is sexy” and that those who oppose the process insist that measurement is too hard.

These shortcomings create an intriguing opportunity for human-centered designers. Visual designers can convey evidence so that it is digestible and compelling. They can also create toolkits and guides that help organizations navigate the complex measurement process. Design researchers can use ethnographic methods to give quick but rich insight into how programs affect participants. Design thinkers can streamline the various components of these initiatives, taking a systems approach to the products and services involved. Finally, designers can help translate findings into opportunity areas for improving programs going forward.

It’s an exciting time for social innovation as sectors work together and experiment with ways to create more intentional and iterative programs. Let’s figure out, as a design industry, how we can join in this realm of possibility.