Brainshark is a great tool for interacting with learners and testing their knowledge of a given subject. If you've been seeking to up the ante on the quality of education and retention you get from Brainshark, consider becoming more introspective about the questions you ask. The quality of your questions directly impacts the experience and benefit for both the end user and the organization. Allow me to share a few questions I've asked myself and found helpful when composing learning modules.
What’s the purpose of the question?
Basic recall? Concept mastery? Developing critical thinking skills? Testing for recall is far easier than other types of testing or knowledge validation.
Am I providing feedback based on the answers provided?
One of the best tools in Brainshark is the ability to provide feedback for individual answers, or to branch the presentation based on the answers given. While this is certainly more work, the benefits can be well worth the effort.
What type of questions am I asking?
There are advantages and disadvantages to each of the major question types (open, closed, true/false, multiple choice, essay, fact-finding, opinion, follow-up, etc.). Most resources suggest asking a variety of question types to "mix it up." When mixing, it's best to group questions of the same type together (all of the true/false, then the multiple choice, and so on). Brainshark's ever-increasing inventory of question types can help provide variety for the learner.
Do you find yourself sometimes asking questions just for the sake of asking questions?
I know I have been guilty of this on occasion, just to reach a respectable number of questions.
Do I ask leading questions?
Leading questions are phrased in a way that suggests or gives away the answer, and therefore discourages the student from thinking on their own. A common variant is a multiple-choice question with four possible answers where one choice is obviously wrong; that's only marginally better than a true/false question with 50/50 odds.
Am I getting all of the information out of the results/reports that I can?
Spending time learning the ins and outs of the numerous reporting options is time well spent.
Is there a poorly written, gotcha, or misleading question?
You can visually scan the reports to see whether a significant number of people chose the same incorrect answer. When that happens, it usually points to one of two scenarios: either the question was confusing or poorly written, or more training is needed on the topic.
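If you export your results to a spreadsheet, the same scan can be automated. Below is a minimal sketch that flags questions where a large share of learners chose the same wrong answer. The CSV layout (columns "question", "answer", "correct") and the function name are assumptions for illustration, not Brainshark's actual export format; adjust them to match the report you actually download.

```python
# Sketch: flag questions where many learners picked the same wrong answer.
# Assumes a hypothetical CSV export with columns "question", "answer",
# "correct" (true/false) -- adapt to your real report columns.
import csv
from collections import Counter, defaultdict

def flag_suspect_questions(path, threshold=0.3):
    """Return (question, wrong_answer, share) for questions where at least
    `threshold` of all responses were the same incorrect answer."""
    responses = defaultdict(list)  # question -> list of (answer, was_correct)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            responses[row["question"]].append(
                (row["answer"], row["correct"].strip().lower() == "true"))

    flagged = []
    for question, answers in responses.items():
        wrong = Counter(a for a, ok in answers if not ok)
        if not wrong:
            continue  # everyone answered correctly
        answer, count = wrong.most_common(1)[0]
        share = count / len(answers)
        if share >= threshold:  # e.g. 30% gave the same wrong answer
            flagged.append((question, answer, share))
    return flagged
```

Any question this flags is worth a second look: rewrite it if it's confusing, or revisit the training content if the question is fair.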
Rhetorical question: Are you spending enough time composing effective questions?
The above is a bare-bones primer and thought exercise on questions. There is a wealth of material on this subject, and I hope I've piqued your interest to dig a little deeper into this important topic.
Bottom line: If we expect more from our learners, we have to spend more time crafting and honing our questions. Asking good questions takes work; that's not something we can ask of the learner, only of ourselves.