
Utilizing Evidence-Based Approaches in Juvenile Justice

By Dennis Giever, PhD, Indiana University of Pennsylvania

Juvenile justice agencies striving to improve their programming face a number of challenges. The movement toward evidence-based practice is well established, but an agency attempting to adopt a proven set of approaches must often weigh conflicting concerns. The overall goal is typically to establish an evidence-based program that will be effective within a particular jurisdiction, yet there is rarely a "one size fits all" approach. Agencies operate under differing budget constraints and often have existing programs that must be either modified or abolished. They therefore face hard choices about which evidence-based practice or program will work best. Should they simply adopt an established program? Or can they use research on what works to bring their existing practices into line with it? Many communities already have programs in place and, as they move toward evidence-based practice, would prefer to modify those programs rather than start over. Many established programs are also expensive to implement, and some jurisdictions lack the resources to adopt them. Finally, some may question whether research conducted in other jurisdictions will hold in their own community.

One question agencies must ask is whether "brand-name" programs are superior to more "generic" programs. Brandon Welsh, Michael Rocque, and Peter Greenwood (2014) addressed this very question, using a series of decision trees to compare the two approaches. Brand-name programs, in their terminology, are programs that have been identified by expert groups as "proven" or "promising" and are implemented with a high degree of fidelity. Generic programs, in contrast, draw on the findings of meta-analyses as a basis for improving existing programs.

The researchers examined three comparable evidence-based programs: two that could be classified as "brand-name" and one that was a collection of "generic" programs. Their decision tree models yielded a number of interesting findings. Two models were presented. The first did not include the Standardized Program Evaluation Protocol (SPEP), and in it the brand-name programs held a large advantage because of the difference in program completion rates (70% for the brand-name programs versus only 20% for the generic ones); such a model clearly favored the brand-name programs. The researchers then took it a step further and applied the SPEP to the generic programs within the decision tree. This, in a sense, removed program completion from the model, and in this case all three programs produced highly favorable expected values.
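
To make the role of completion rates concrete, the following is a minimal sketch, in Python, of the expected-value arithmetic a decision tree of this kind performs. The payoff values are hypothetical placeholders chosen purely for illustration; only the 70% and 20% completion rates come from the discussion above, and the actual models in Welsh, Rocque, and Greenwood (2014) contain more branches and cost considerations than this toy version.

    # Sketch of expected-value weighting in a two-branch decision tree.
    # Payoffs (100 if a youth completes the program, 0 if not) are
    # hypothetical; only the completion rates are taken from the article.

    def expected_value(p_complete, payoff_completed, payoff_dropped):
        # Weight each branch's payoff by its probability and sum the branches.
        return p_complete * payoff_completed + (1 - p_complete) * payoff_dropped

    # Model 1: completion rates differ, so the brand-name program dominates.
    brand_name = expected_value(0.70, payoff_completed=100, payoff_dropped=0)
    generic = expected_value(0.20, payoff_completed=100, payoff_dropped=0)
    print(brand_name, generic)  # 70.0 20.0

    # Model 2: if a protocol such as the SPEP lifts generic programs to a
    # comparable completion rate, the gap between the programs closes.
    generic_spep = expected_value(0.70, payoff_completed=100, payoff_dropped=0)
    print(generic_spep)  # 70.0

Under these assumptions, the brand-name advantage in the first model flows entirely from the completion-rate difference, which is exactly why removing that difference in the second model produces comparable expected values across all three programs.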

So what does this all mean? As the two models demonstrate, the result is a mixed bag. In the first model, the brand-name program clearly had the highest expected value, but as the second model shows, much of the difference between the programs was attributable to completion rates. Beyond this, a number of other differences between brand-name and generic programs must be considered. The cost of brand-name programs, for example, is often much higher: they are commercially developed and owned and must be purchased as a package, which the researchers note can cost a community as much as $500,000 per year. There is also the added cost of transitioning from existing programs to a new evidence-based process. Brand-name programs may require that existing programs be shut down, raising startup costs, whereas generic programs typically involve modifying existing practices and may allow a more fluid transition. That said, brand-name programs are often held to a higher standard and are more likely to be evaluated with a robust research design. In sum, both approaches have advantages and disadvantages, and much depends on the community, its budget, and its existing efforts.

Each agency has a number of difficult choices to make, and the overarching question is: what approach should we take? In an article entitled "Evidence-Based Policies, Programs, and Practices in Juvenile Justice," Douglas Thomas and his colleagues (2016) developed a list of sound principles that can and should be applied to any program, whether a community or agency is considering a brand-name or a generic program. Although the study focuses on state support centers, its recommendations apply to any jurisdiction implementing evidence-based practices. The authors found that a program's success likely depends more on the fidelity and process with which it is implemented than on its type; after all, both brand-name and generic programs have been tested and have demonstrated their effectiveness. Which route is taken often depends on factors such as cost, location, the number of youth to be served, and what will work best for a particular community. Five recommendations from Thomas and his colleagues are germane in either case.

Recommendations

The first is strong leadership.  As with any effort, having a strong leadership team is critical to program success.  Leadership can take many forms, but the authors recommend a leadership team that consists of stakeholders from a variety of professions at all organizational levels.  These teams should have a strong commitment to the success of the initiative.

The second recommendation is a clear and unambiguous vision. The stakeholders must help develop this vision and must all agree on what is meant by being evidence-based, on the acceptable standards of evidence, and on who is accountable for maintaining those standards. The vision must clearly state goals, objectives, activities, and expected outcomes.

The third recommendation is flexible support center development. The authors recommend flexibility that allows the inclusion of existing programs and practices that have demonstrated, or can demonstrate, effectiveness. Flexibility also applies to a jurisdiction's ability to collaborate with partners such as universities or other research organizations.

The fourth recommendation is a well-designed data infrastructure. Every evidence-based practice rests on well-documented evidence of program effectiveness, so the capacity to collect, process, and apply data is critical.

The final recommendation is collaborative relationships: create an environment conducive to collaboration, in which practitioners and researchers can communicate and learn from each other.

References

Thomas, D., Hyland, N., Deal, T., Wachter, A., and Zaleski, S. (2016). Evidence-Based Policies, Programs, and Practices in Juvenile Justice: Three States Achieving High Standards through State Support Centers. Pittsburgh, PA: National Center for Juvenile Justice.

Welsh, B. C., Rocque, M., and Greenwood, P. W. (2014). Translating research into evidence-based practice in juvenile justice: Brand-name programs, meta-analysis, and key issues. Journal of Experimental Criminology, 10, 207-225.
