L&D Evaluation Belief #4: We can’t understand the why behind the results

This post is part of a series on beliefs about social experimentation; if you missed the first post, start at the beginning of the series here.

Belief 4:

Evaluation findings are of little value because the “black box” just reveals that an intervention is effective or not, but nothing about why.

Stacey’s Counter:

This is a valid belief held by many social science skeptics, and I struggle with this very issue myself. I want to show my clients why we did or did not find impact from an L&D intervention. There are a couple of approaches you can take to maximize stakeholders’ confidence in the evaluation:

Constantly communicate and socialize findings
One approach I take to manage my clients’ comfort level is to communicate continually and socialize findings frequently. I do not wait to reveal findings in an annual report. When I discover an important finding, I communicate it to my client through formal or informal channels: I pick up the phone, send an email, or draft a semi-formal memorandum. This approach empowers my clients to make real-time decisions. Get people to agree on the facts first; then a logical conclusion can be drawn, and politics and personal agendas drop out of the discussion.

Evaluation design dominates analysis
L&D evaluators should employ the strongest research design possible to answer the relevant impact questions, and then employ the best analytic strategies for that design. That is, design trumps analysis. A well-chosen evaluation design does more to get inside “the black box” and reveal which program features drive impact than even the best analytic innovations. For instance, predictive analytics (e.g., regression models) is the hot analytic approach right now, but these techniques are not appropriate or necessary for answering every business impact question. The bottom line: do not design your L&D evaluation around popular analytic techniques. Base your design on your evaluation and business questions. Answering your stakeholders’ questions and providing the “why” dominates sexy analytics!
