Learning Blog

Random despatches from places where L&D meets software and systems

Developing the GROUNDED way

The following description of the GROUNDED methodology for project managing e-learning first appeared in a much longer article called "Managing your AML awareness and training responsibilities using e-learning", which appeared in

Anti-Money Laundering: A Guide for Financial Services Firms, ed. Tim Gough, ISBN 1 904339 78 6

One of the problems inherent in e-learning is that people become so involved with the mechanics that they lose sight of the high level view of the project and its rationale. It’s not just about the build of the screens; keep focused on the bigger picture. Bear in mind that, regardless of your decision on whether to outsource or not, you will need to reserve time for project management – otherwise you might not get the result you were hoping for.

Before you get too enmeshed in the mechanics, please consider the following process I call "GROUNDED", which can help you avoid some of the more obvious pitfalls and put in place a firm foundation on which you can build e-learning that is fit for purpose:

  • Goals
  • Requirements
  • Objectives, scripts and storyboards
  • Usability and effectiveness
  • N-versioning
  • Delivery
  • Evaluation
  • Deployment

Obviously the work in each stage will be shared between you, others in your organisation and your development partner. Try to work out who is going to be responsible for what – or what the mix is – as early as possible.

Don’t subcontract everything out! You will need to do a certain amount of managing yourself; if you don’t have time to complete full documentation at each stage, for example, at least ensure that you’ve done enough thinking and talked to the right people. Rushing the earlier stages, in particular, could prejudice the quality of your product.

Goals and Requirements

What you need to do


State your high level business drivers; what you want your audience to know; how you want them to change their behaviours. Do this in a way that resolves any potential conflicts and don’t forget to identify critical success factors. Define what success would look and feel like.


How you do it


Talk to the relevant people in your organisation. If you have "opposite numbers" in other organisations, talk to them too.


What could happen if you don’t


People might say:

"We discovered that what we wanted wasn’t really what was needed"

(or vice versa)

"Why are we doing this?"


In the first part of the GROUNDED process you need to put together a clear statement of the project’s intent and the value it expects to add. If possible, get the relevant people to sign it off, or otherwise buy into it. This could help you find the extra resources and help you may need.


Objectives, scripts and storyboards

What you need to do


Write and agree a hierarchy of objectives in the form

"by the end of this section, you will be able to …"


How you do it


Using Microsoft Word in Outline view really helps you to work out the chronology of the piece (by moving items up and down) as well as what belongs under what (promote, demote).
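
If you would rather keep the outline somewhere scriptable, the same promote/demote structure can be sketched in code. The fragment below is purely illustrative (none of it comes from the original article, and all the objective wording is invented): it holds a small hierarchy of AML-flavoured objectives as nested tuples and renders it as an indented "you will be able to" outline.

```python
# Illustrative only: a hierarchy of learning objectives as nested
# (text, children) tuples -- the equivalent of Word Outline view's
# promote/demote levels. Every objective here is invented.

aml_objectives = (
    "describe your obligations under the AML regulations", [
        ("explain what constitutes a suspicious transaction", [
            ("complete a Suspicious Transaction report", []),
        ]),
        ("identify your Money Laundering Reporting Officer", []),
    ])

def render(objective, depth=0):
    """Return the outline as a list of indented 'you will be able to' lines."""
    text, children = objective
    lines = ["  " * depth + "- you will be able to " + text]
    for child in children:
        lines.extend(render(child, depth + 1))
    return lines

print("\n".join(render(aml_objectives)))
```

Moving an objective to a different parent, or reordering children, is then just a matter of editing the nested structure – the rendered outline stays consistent automatically.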


What could happen if you don’t


People might say:

"The solution didn’t encapsulate what we were trying to do", or

"It wasn’t very well structured", or

"It jumped around a lot"


Nicely ordered objectives form the underpinning of your e-learning project. Now you, or your development partners, can review your goals and requirements, start to write objectives and then begin to flesh these out in the form of scripts and/or storyboards. Remember the best films usually come from the best scripts; try to develop the best possible script which will discharge your higher level goals and requirements.

Usability and effectiveness

What you need to do


Achieve the best possible design, ensuring that your planned AML programme is easy to use, learnable and generally fit for purpose


How you do it


Get feedback on your scripts and/or storyboards using a combination of user trials, interviews, expert walkthroughs and/or best practice benchmarking


What could happen if you don’t


People might say:

"The training wasn’t well enough designed; it didn’t really change anything", or

"I couldn’t be bothered with it", or

"It was too hard so I gave up"


You can save a lot of money at this stage using only pencil and paper. Before you start building the software, you should make sure that your input materials discharge the project’s goals and requirements, clearly put over all the objectives ("chunked" appropriately, in the right sequence) and help your people learn what they need to know and what they need to do.


N-versioning

What you need to do


Achieve the best possible build given the resources available


How you do it


Build an agreed number of prototypes, each evaluated in turn, with suggestions properly subjected to cost/benefit analysis


What could happen if you don’t


People might say:

"The training wasn’t put together as well as it could have been", or

"The screens were a real mess", or

"It wasn’t very engaging", or

< something unrepeatable >


"N-versioning" is the only word I could think of to make the GROUNDED acronym work. But it is an important part of the process. It simply means building a sequence of prototypes that get better and better as you refine your ideas and incorporate more and more suggestions from your colleagues and users.

The process goes something like this:

Think very carefully about how you present your prototypes to the people whose opinions you value. Be aware that they will often notice the things you don’t want them to notice and won’t notice the things you do. Also, try to ask only open questions at the beginning of the interaction; that way you minimise the danger of suppressing unanticipated insights. If you have specific hypotheses you want to test ("should this come before or after that?"), keep them for later on in the conversation.

In theory, n-versioning is a never-ending process as you’ll always be able to improve what you do; in practice, more everyday concerns will limit how much you can do. There are no hard and fast rules for how many prototypes you should aim for on the road toward your first release but you should certainly not underestimate the value that can accrue from allowing yourself enough time for improvements. Don’t let n equal 1!

Piloting your solution will be necessary. This constitutes a more formal and extensive version of what you should have been doing all along. It can be looked upon in two ways. Internally, not only is it an opportunity to correct any remaining errors or inconsistencies, it’s also an opportunity to capitalise on the things you’ve done well. Externally, piloting can help to ease your rollout, ensure buy-in from your user population, reduce anxiety, diminish resistance, and so on. Try to pilot your application in as near to real contexts of use as possible. This means observing the appropriate people using the appropriate machines in the appropriate venues at appropriate times. Also, don’t pilot your application in a vacuum. If you intend it to be used with accompanying hardcopy material, for example, and/or instructor-led support, pilot that at the same time.

Try to work out what’s happening both objectively (e.g. are people’s scores improving?) and subjectively (e.g. how do people feel about it?). Fundamentally, ask yourself if it’s working and what you could do to make it better. A good pilot reveals many and varied issues; if you learn nothing from your pilot, that probably means it wasn’t conducted very well. A far more likely outcome is that you’ll be swamped with ideas for improvements. Be disciplined about applying cost/benefit thinking at this point, as you are likely to be nearing your release date.
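
To make that cost/benefit discipline concrete, here is a hypothetical sketch – the suggestions, benefit scores and effort estimates are all invented – that ranks pilot feedback by estimated benefit per day of effort, so the best-value fixes rise to the top of the list:

```python
# Illustrative only: rank improvement suggestions from a pilot by
# estimated benefit against estimated cost. All figures are made up.

suggestions = [
    # (description, estimated benefit 1-10, estimated cost in days)
    ("Fix wrong answer key in question 7",      9, 0.5),
    ("Re-record all narration with new voice",  4, 10.0),
    ("Add a glossary of AML terms",             6, 2.0),
    ("Animate the title screen",                2, 3.0),
]

def prioritise(items):
    """Sort suggestions by benefit per day of effort, best value first."""
    return sorted(items, key=lambda s: s[1] / s[2], reverse=True)

for description, benefit, cost in prioritise(suggestions):
    print(f"{benefit / cost:5.1f}  {description}")
```

However rough the estimates, even a simple ranking like this makes it much easier to defend what you will – and won’t – change before release day.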


Delivery

What you need to do


Ensure the timely and satisfactory rollout of the materials


How you do it


  • Brainstorming triggers and barriers
  • Workflow checking
  • Pilot tests
  • What-if scenario planning


What could happen if you don’t


People might say:

"It was a good programme but the rollout was a disaster", or

"I never got to use it. Was it any good?"


Although you should aim to get your release right first time on time, you should also be prepared in case you don’t. What will you do if you’ve missed out an essential piece of information? Or if someone points out an error in one of your multiple-choice questions? Try to formulate contingency plans in advance of your release day.

Depending on what your users expect and the promises you’ve made, you may have to offer support, whether by phone, fax, letter, email or in person. Although very difficult, try to judge what kind and level of support you will need to provide – another good reason to pilot – and plan accordingly.


Evaluation

What you need to do


Assess end-user satisfaction and measure the programme against its intended business benefits. Quantify who’s learned what


How you do it


Testing and assessment within the product itself, online satisfaction surveys


What could happen if you don’t


People might say:

"Did it really make any difference?", or

"I don’t suppose people were that bothered", or

"How much did it cost?!?"


Some ideas for improvements will come naturally, others have to be elicited. In either case you should try to find out how your application is faring out there in the real world. This means collecting and analysing more data, both of which can be done in a number of different ways.

Try to find a representative cross-section of your end-users and ask them what they liked and didn’t like about your application. Occasionally, you will of course receive feedback that is negative. Try to be philosophical and concentrate on those improvements which are going to make the biggest difference while simultaneously factoring cost into your decision making. Also, remember that you can’t please all of the people all of the time.

You might like to consider moderating a focus group, principally to generate some hypotheses to test on a larger section of your user population. You don’t necessarily need videos and two way mirrors to unearth interesting views; you could simply invite a few people for a sandwich and take some notes. Afterwards, if at all possible, try to get people to quantify – perhaps by means of a questionnaire – their thoughts and opinions. These, together with some carefully chosen objective test scores demonstrating who’s understood what (see Section 4.3 above) should help you demonstrate the cost / benefit to the relevant people in your organisation.

Try to collect your subjective data (what they thought of it all) by means of an online survey, which could be administered at the end of the course. You might include questions such as:

  • How confident are you that you understand what is required of you by the Money Laundering procedures (on a scale of 1 to 10 where 1 means "not very confident" and 10 means "very confident")?
  • How confident are you that you understand when and how to fill in a Suspicious Transaction report (on a scale of 1 to 10 where 1 means "not very confident" and 10 means "very confident")?
  • Do you know the identity of your Money Laundering Reporting Officer? Yes/No
  • How easy was this course (on a scale of 1 to 10 where 1 means "very hard" and 10 means "very easy")?
  • How relevant was this course to you and your job (on a scale of 1 to 10 where 1 means "very irrelevant" and 10 means "very relevant")?
  • How would you rate the quality of this course (on a scale of 1 to 10 where 1 means "very bad" and 10 means "very good")?
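
Once the survey responses are in, the arithmetic is straightforward. As a purely illustrative sketch (the question labels, scores and threshold are all invented), here is how you might tally the 1-to-10 answers and flag any question scoring poorly enough to need attention:

```python
# Illustrative only: compute a mean score per survey question and flag
# anything below a chosen threshold. All data here is invented.
from statistics import mean

responses = {
    # question label: list of 1-10 scores from respondents
    "confidence: ML procedures":  [8, 7, 9, 6, 8],
    "confidence: STR completion": [5, 4, 6, 5, 5],
    "course difficulty (ease)":   [7, 8, 6, 7, 9],
    "relevance to job":           [9, 8, 9, 7, 8],
}

def summarise(data, threshold=6.0):
    """Return {question: mean score} plus the questions needing review."""
    means = {q: mean(scores) for q, scores in data.items()}
    flagged = [q for q, m in means.items() if m < threshold]
    return means, flagged

means, flagged = summarise(responses)
for question, score in means.items():
    marker = "  <-- review" if question in flagged else ""
    print(f"{question:30s} {score:4.1f}{marker}")
```

Paired with the objective test scores, a summary like this gives you the quantified picture the relevant people in your organisation will want to see.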


Deployment

What you need to do


Ensure that all the learning from the project is captured, communicated and exploited


How you do it


Focus groups, depth interviews, show and tells, evangelism (or the reverse!)


What could happen if you don’t


People might say:

"The business didn’t really exploit the benefits", or

"It was OK for a one-off", or

"What ever happened to that AML programme …?"


Deployment means:

  • making sure everyone who should have received the e-learning did receive it;
  • paving the way for improved future versions;
  • sharing the learning: what went well, what could have been better;
  • identifying other compliance opportunities in your organisation which could benefit from similar treatment, such as Systems and Procedures, Conduct of Business, and Approved Persons;
  • identifying opportunities outside of compliance; and so on.

Subsequent withdrawal is typically the phase which receives the least amount of attention. Without a proper withdrawal procedure, users can become confused with multiple versions, all of which you’ll have to support. Implement a means of calling your projects back in once their time is up.

By now you will have realised that there’s a great deal more to managing your AML e-learning project through its life cycle than just the construction of a few screens of content. It may seem that the different phases detract from the fun part, but the real value comes from paying careful attention throughout and getting a good result.