Risks ahead

In this age of technology, where things are getting smaller, faster and cheaper by the second, organisations are becoming more demanding of their L&D functions. Our clients have always wanted top quality for little money in a short space of time, but this expectation has become over-inflated in recent years.  As a result we are seeing learning solutions developed cheaply and quickly using everything from offshore development resources to rapid authoring tools.  These can work well when there is good, solid instructional design behind a solution, but sadly this is becoming rarer as technology advances and the pressure to deliver cheaper solutions in shorter time frames increases.

According to Wikipedia, “Instructional Design (also called Instructional Systems Design (ISD)) is the practice of maximising the effectiveness, efficiency and appeal of instruction and other learning experiences. The process consists broadly of determining the current state and needs of the learner, defining the end goal of instruction, and creating some “intervention” to assist in the transition.”

The three areas highlighted in this definition are where bad instructional design breeds:

  • The current state and needs of the learner
  • Defining the end goal of instruction
  • Creating an “intervention” to assist in the transition.

Let’s look at each of these in more detail to find out what goes wrong.

The current state and needs of the learner

A Learning Needs Analysis (LNA) is often considered a big job and something no-one really wants to do unless they have to.  When suggesting an LNA to clients, the usual responses include:

  • it will take too long
  • we did one last year
  • people won’t complete it
  • we don’t want to bother people with another survey
  • we know what they need.

The truth is:

  • an LNA can be created, completed and analysed within two working days if you have a resource available to do it
  • if you carried out an LNA last year, it’s now out of date
  • anyone who is likely to complete an LNA will do so on receiving the invitation to participate, not a week later, so you can ask for responses within 24 hours
  • people like to be asked about things which affect them directly, such as how they want to learn
  • never assume that you know what someone needs without asking them first
  • assessing the requirements of a learning program before spending valuable budget on its development is simply due diligence, and it helps ensure maximum return on investment (ROI).

The LNA Survey

Even in cases where an LNA has been carried out, there are generally a few key elements missing or not analysed in depth, which can mean that the very first part of the design process is fatally flawed.  In addition to questions around specific skills or business knowledge, here are some suggestions for areas to cover to find out what your learners really need:

  • Build some open-ended questions into your LNA.  If you ask yes or no questions, you will only get yes or no answers.  Open-ended questions like “how do you feel about carrying out training via elearning?” or “where do you think the process breaks down?” can tell you an awful lot about your target audience, the problem and how you can design learning to fix it.
  • IT experience and comfort levels.  For elearning and IT training this is an essential line of questioning.  Just because we all use a PC doesn’t mean we are comfortable with it or enjoy it.  Is there pre-training that needs to be done before learners can start the main learning?
  • Previous learning experiences.  Find out what was good and bad about previous learning experiences.  This can give you inspiration for your design and tells you a lot about how your target audience likes to learn, which will promote engagement.  If this is an elearning project, find out about other elearning experiences, because if these were bad you are starting on the back foot in terms of getting buy-in to your solution.
  • Learning styles.  Learning styles are often talked about in the design of learning, but how many solutions include a range of activities which truly address the learning styles of their target audience?  For example, rolling out elearning with no audio to a group of learners who are predominantly auditory will be less effective than elearning with audio accompaniment.  This may be a limitation of IT infrastructure, but it will have a direct impact on the effectiveness of the learning and needs more consideration.
  • Time to learn.  If learners are expected to complete self-paced learning of any kind, it is important to find out what time they have to do so.  What does their role involve?  If they are out on the road or have a very demanding day-to-day role, it may be better to deliver learning via podcast or printable material that can be accessed easily on demand.
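To illustrate why the analysis step of an LNA need not blow out the two-working-day estimate above, here is a minimal sketch of tallying closed-question responses in plain Python.  The field names (“it_comfort”, “preferred_format”) are hypothetical placeholders; substitute whatever questions your own survey asks, and note that open-ended comments still need a human read-through.

```python
# Minimal sketch: tallying closed LNA survey questions.
# Field names below are hypothetical examples, not a prescribed schema.
from collections import Counter

responses = [
    {"it_comfort": "low", "preferred_format": "podcast",
     "open_comment": "no time at my desk to learn"},
    {"it_comfort": "high", "preferred_format": "elearning",
     "open_comment": "previous course had too much text"},
    {"it_comfort": "low", "preferred_format": "podcast",
     "open_comment": "elearning feels isolating"},
]

def summarise(responses, field):
    """Count the answers given to one closed question."""
    return Counter(r[field] for r in responses)

comfort = summarise(responses, "it_comfort")
formats = summarise(responses, "preferred_format")
print(comfort.most_common())   # → [('low', 2), ('high', 1)]
print(formats.most_common())   # → [('podcast', 2), ('elearning', 1)]
```

A tally like this surfaces design decisions quickly – here, for instance, a mostly low-comfort audience preferring podcasts would argue against an elearning-only solution.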

Survey alternatives

If, for whatever reason, you are unable to use a questionnaire as a means of identifying learning needs, there are some alternative options:

  • Make phone calls to a random sample of your target audience and ask questions.  These can take less than five minutes each and they:
    • are a great PR exercise for L&D
    • make people feel like someone cares about what they think
    • often elicit more meaningful and in-depth responses than you would have received via an online survey
  • Focus group – set up a meeting with a group of your target audience to ask questions and get ideas around what they would like to see in training
  • Use records to gather information, e.g. OHS records will show information about accidents in the workplace which will tell you who needs training, what they need to be trained on and where the problem is occurring.

Defining the end goal of instruction

If you don’t know where you’re going, how will you know when you get there?

Instructional goals have been described in various ways over the years, including:

  • learning objective and learner outcomes
  • learning objective and enabling objectives
  • general instructional objective and specific learner outcomes
  • training aim and objectives

And the list goes on.  While defining objectives and outcomes is considered fundamental in our business, why do we still see the word “understand” used in their definition?  However you prefer to label them, there are two questions to ask when defining instructional goals for learning:

  • what is the overall purpose of the training?
  • what measurable behaviours will be assessed to show that the purpose has been achieved?

“Measurable” is the key word in the paragraph above.  Action verbs should be used to define learner outcomes, which in turn will be used as assessment criteria.  Here are a few examples of dos and don’ts when it comes to writing clear, measurable learner outcomes:

Good verbs      Bad verbs
----------      ---------
Identify        Understand
Describe        Know
List            Comprehend
Perform         Be aware of
Complete        Consider
Recognise       Realise

When writing outcomes, ask yourself “how can I prove that the learner is able to do this?”

Creating an “intervention” to assist in the transition

The “intervention” is the really exciting piece for an instructional designer – it’s the time to take everything they know and weave their magic into the learning solution.  Or so you would think.  Some examples of bad instructional design I have seen in recent times include, but are certainly not limited to:

  • Boring learning.  Unleash your creative side!  What is the program treatment?  What theme and approach will be applied?  Can you create a game, or a race, or use characters, or give out points/rewards?  Even IT system simulation training can be fun when you put your creative hat on!
  • Lecture-based solutions.  These just tell the learner what they need to know across 400 PowerPoint slides; they don’t encourage learning to take place.  The person working hardest here is the facilitator, when it should be the learners.  Ask a question and invite a discussion, then give the answer.  Better still, make it a quiz or time trial – energise your audience!  In elearning this is called a page-turning solution, but the same principles apply.
  • Inconsistencies in instruction.  Whether you are designing elearning or classroom-based courses with manuals and handouts, define your style guide early and stick to it.  Learners need to be introduced to navigation and instruction types at the start of learning so they are comfortable working through material.  In a large program this is particularly important as learners may move through a number of different lessons or modules.  Standardise common instructions like “Click File” so that this is used consistently through materials, rather than “Click File” in one place and “Click the File menu” in another.
  • Getting carried away with technology.  When designing elearning, there is so much great functionality available that it can be tempting to design an all-singing, all-dancing solution.  But what value is it adding to the learning?  Every single interaction with your learner has to be meaningful, so make it engaging but leave your Director’s chair at home.
  • Using elearning as “the” solution.  There is no doubt that elearning has revolutionised the L&D offering, but it is important to remember that this is a method of delivery which in most cases should be used as part of a blended learning solution.  It is not a solution in its own right, so by all means work it into your design but consider where the human aspect of learning will play a vital role.
  • Training as a “fix all”.  Any transfer of knowledge will require support to implement back on the job, so who will make sure this happens?  Part of your learning program may require buy-in from managers and other resources to ensure that learners have the support they need back on the job to implement their new skills and elicit the required change in behaviour.

None of the above should add any significant time to the design phase of your elearning project, but it can add significant value.

Having read this post, is there anything you will do differently on your next project?