Data Collection
Before embarking on DEI evaluation efforts, programs not currently collecting demographic data will want to determine how best to do so. The National Center for State Courts has published a helpful resource for collecting race and ethnicity data. Court ADR programs also typically collect other data that can be useful in evaluating DEI efforts, such as the number of cases referred to ADR, the number of mediations or arbitrations conducted, time to resolution and ADR outcomes. For help with participant surveys, programs can look to the Model Surveys Toolkit created by Resolution Systems Institute (RSI) and the American Bar Association. In particular, they may want to consult the Model Party Survey.
Evaluation Process
DEI initiatives can be assessed at three levels of depth. These levels are illustrated in the following two hypothetical examples.
Example 1
A foreclosure mediation program determines that Latinx homeowners are not represented proportionately in the group of homeowners who contact the program, as compared to homeowners overall who are facing foreclosure. The program decides on steps to address this issue, such as enlisting a social services agency to reach out to the community.
The first level of assessment is to monitor progress on the steps taken to address this issue. For example, what outreach was conducted?
The second is to determine whether those steps are working. In this case, how many people did the social services agency reach through its efforts?
The third level of analysis examines the impact of the actions taken. For example, the program would examine the race/ethnicity makeup of those who contacted the program after the social services agency began its outreach to see whether the percentage of Latinx homeowners increased.
Example 2
A small claims mediation program becomes aware of an issue when Juan, a self-represented defendant who is deaf, is referred to his court’s mediation program. The court emails Juan about the program and directs him to its website, which features a video on the program. When Juan starts watching the video, he discovers it has no subtitles or downloadable text. The ADR program responds by making closed captioning for all program videos one of its action steps.
The program’s first level of assessment is to verify that it has added closed captioning, enabling people who are deaf to review the video on its website.
The next level of assessment is to make sure those steps have worked. If the court decided to use YouTube’s automatic closed captioning feature (which is often error-prone) to address the lack of access for people with hearing disabilities, people may still be unable to understand the video. To find out whether people with hearing disabilities are having difficulty, courts can provide an opportunity for feedback, such as a pop-up question on the website, a focus group of similarly situated people, or an end-of-program participant survey.
The third level would be to examine the impact of the steps taken to see whether they led to a change in experience. For example, did adding closed captioning leave participants who are deaf or hard of hearing feeling better able to navigate the mediation process?
All programs should conduct the first two levels of assessment. Those that go further will learn much more about the effectiveness of their programs. For more information on conducting evaluations, see RSI’s Guide to Program Success.