At some point, most of us have been or will be required to attend a workshop, seminar, or assembly (usually through school or work) focused on some social issue — topics like "diversity", "bullying/harassment", "gender/racial equality", etc.
However, it's frequently unclear whether these seminars actually do anything to improve the school/work/whatever environment. The goal is usually simple: create a dialogue and promote change. But I have yet to see that result in the real world.
For example, I attended a seminar on alcohol last year, required by my college. After two hours of corny videos on the subject, interrupted periodically by the facilitator's awkward attempts to spark dialogue, I felt like most attendees started treating it as a joke and took the issue less seriously than before.
I'm really curious how everybody else feels about this kind of thing. Do you think these initiatives can have a tangible positive impact? What would need to change to make that happen? Have you experienced any that were actually successful?