CrimeSolutions uses rigorous research to inform practitioners and policy makers about what works in criminal justice, juvenile justice, and crime victim services. Because we set a high bar for the scientific rigor of the meta-analyses we use, many meta-analyses do not meet our criteria and we are unable to rate the practice.
For the practices listed below, our reviewers determined that the available evidence was insufficient to assign a rating. Practices appear on this list because the research evidence about them is insufficient, not because of any known weaknesses in the practices themselves.
Each practice is listed with references to the meta-analysis or meta-analyses that were reviewed and an indication of why each was rejected. See Reasons for Rejecting Meta-Analyses below.
Practice Title | Topic | Meta-analysis/Reason to Reject
---|---|---
Cognitive-Behavioral Therapy for Aggressive Behavior in Youth | Juvenile Justice |
Alternative Education Programs | Juvenile Justice |
Beverage Alcohol Price and Tax Levels on Drinking | Crime Prevention |
Community-Oriented Policing (COPS) | Law Enforcement |
Positive Youth Development Programs | Juvenile Justice |
Preventing Repeat Domestic Burglary Victimization | Crime Prevention |
Prison Privatization | Corrections |
Restorative Justice Conferencing for Adult Offenders | Corrections |
Sex Offender Registration and Notification | Corrections |
Training for Community Supervision | Corrections |
Reasons for Rejecting Meta-Analyses
Inadequate Design Quality: Meta-analyses can be rejected if they do not provide enough information about their design, or have significant limitations in the design (for example, they did not assess the methodological quality of the included studies or did not properly weight the results of the included studies), such that it is not possible to place confidence in the overall results of the review.
Low Internal Validity: Meta-analyses can be rejected if they have low internal validity, meaning that the overall mean effect size was based on results from included studies whose research designs are not free from threats that could bias the effect estimate. Randomized controlled trials (RCTs) have the strongest inherent internal validity. The internal validity score of a meta-analysis will be lower as the proportion of included studies with non-RCT designs (i.e., quasi-experimental designs) increases.
Lacked Sufficient Information on Statistical Significance: Meta-analyses may be rigorous and well designed, but they are rejected if they do not provide sufficient information to determine whether the mean effect sizes were statistically significant or non-significant. The statistical significance of a mean effect size is needed to determine an outcome's final evidence rating; the sketch below illustrates how a weighted mean effect size and its significance are commonly computed.
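For readers unfamiliar with the statistical terms used above, the following is a minimal, hypothetical sketch (not CrimeSolutions' actual review methodology) of how a fixed-effect, inverse-variance weighted mean effect size and its statistical significance are commonly computed in a meta-analysis. All effect sizes and variances shown are made up for illustration.

```python
import math

# Hypothetical per-study effect sizes (e.g., standardized mean differences)
effect_sizes = [0.30, 0.15, 0.45, 0.10]
# Hypothetical per-study sampling variances
variances = [0.04, 0.02, 0.09, 0.03]

# Inverse-variance weights: more precise studies (smaller variance) count more.
weights = [1.0 / v for v in variances]

# Fixed-effect weighted mean effect size.
mean_es = sum(w * es for w, es in zip(weights, effect_sizes)) / sum(weights)

# Standard error of the pooled estimate and a two-tailed z-test.
se = math.sqrt(1.0 / sum(weights))
z = mean_es / se
p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

print(f"Weighted mean effect size = {mean_es:.3f} (SE = {se:.3f}), z = {z:.2f}, p = {p:.4f}")
```

Inverse-variance weighting gives more precise studies greater influence on the pooled estimate; a meta-analysis that simply averages effect sizes without such weighting is one example of the design limitations described above.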