Friday, 21 November 2014

The Power of Blurting

I was in a very lively and enjoyable #pkmchat this week on Twitter and a new sort of theory took shape (at least momentarily) in my grey matter - or rather it grew out of what I was discussing with others.  There were two distinct lines of thought around blogging and the sharing of thoughts, ideas and information.  One was along the lines that if you want your shared information to be useful it needs to be well thought out, accurate and well written - which, you have to say, sounds very reasonable and professional.  It’s just that it leaves me kind of cold.  I’m all for factual information when you need it - things like legal settings and black-and-white rules and regs are often only approachable that way - but even when the subject matter is highly factual our thought process doesn’t need to be constrained like that.  I’ve approached my blogs in a different way, as those of you who actually read them will undoubtedly know.  I don’t go out of my way to write stuff that is incorrect either in knowledge or grammar (although of the latter I’m surely guilty on a regular basis), but I don’t treat it as the single most important thing either.  For me writing and sharing isn’t about Nigel knows best (nor do I usually write about myself in the third person, so apologies there too!), but more ‘hey, this is what I think’ or maybe ‘what about this?’.  It’s about the challenge, the thought itself, the idea of questioning what ‘knowledge’ alone will get you.  It’s about blurting.

I don’t proofread what I write, which is probably self-evident from the typos and spelling mistakes that are sure to litter my posts.  I don’t care if I read someone else’s posts and they have minor errors that don’t detract from the message they’re trying to put out there.  I don’t care because to me it’s not important.  This leads me to my main point, I guess: the old ideal that ‘Knowledge is Power’ is… well… it’s wrong.  I know that there are some people who focus life-long learning on the pursuit of knowledge, but I can’t help thinking this is misdirected.  If learning were just about amassing knowledge then science, technology and progress would be stuck with what we have.  The best learners don’t really seek to know everything or understand everything, they seek to make sense of things - yes, this can come from knowledge, but only if you remember that knowledge itself isn’t a permanent thing - and that’s a good thing, not a bad thing.  Remember too that in the majority of learning theories knowledge is actually the lowest form of learning.  Being able to simply recall information does not equate to high-level learning - let alone power.  The old theory of holding on to knowledge and not sharing it, so that you have something others don’t, is predicated on the amazing value of knowledge alone.  Once we dispel that myth we can start to share without fear - that’s when we actually start to empower both ourselves and others, and knowledge put into action starts to gain some strength.

My theory of Evolve, which I’ve shared recently, is that we are not designed to be knowledge-storing machines; we’re designed to evolve.  It’s in our nature to seek improvements and adjustments, and that’s what we can tap into to amplify learning.  If we put too much value on the purity of knowledge we have to make sure what we’re sharing is correct - in the purest science that may seem right, but our very theories are evolving - that’s how science moves on.  Same with language.  Some words we scoff at today will be commonplace tomorrow; language is evolving and so should we with it.  It’s also the most natural thing to do.

So the power of blurting you could also call the power of sharing.  Or maybe the power of sharing your ideas with others.  My take is that by sharing what you think, more than what you know, you’re actually sharing something far more useful to the growth of those around you.  By sharing and seeing responses we also evolve our own theories (I’ve definitely been known to change my mind, even on theories I started!).

My last point is that evolution doesn’t have to involve the creator.  No, not touching on religion here, but if you were the one who came up with a concept and others take it in different directions… that’s okay.  No, it’s better than okay - it’s great.  The chances are that if you continue to contribute past a point you’re anchoring your own views and not letting the theories go where they go.

So next time you want to blog on something, I say blog.  If you don’t know much about it that’s okay too - if you have an idea, share it and see what happens.  Share. Blurt. Learn (but don't bother over-checking) :)

Thursday, 20 November 2014

From ADDIE to SAM to Evolve in Learning Technologies

I recently heard someone wanting to roll back to a nine-step or even eleven-step process for producing learning materials - it wasn’t elearning, but that shouldn’t really make a huge difference, should it, as elearning is 90% learning (give or take a letter).  I was actually fairly shocked by this; as an agile proponent I’m always amazed by how failing and struggling organisations blame process and try to solve their issues by adding more process and more complexity to already overly complex situations.  The problem with being very process driven is that you can’t actually have a process that encompasses every possible situation in the most minute detail; what you in fact end up with is the old operations manual stuck on a shelf gathering dust whilst individuals carry on doing things the way they always have.  Now I’m not saying that you should ditch all processes (although the anarchist in me would sometimes like to see it), but you do need to reduce your process and the steps required to get things done if you want a more efficient flow.

A few years back the ADDIE process was considered the king of instructional design.  It was a no-nonsense, common-sense approach to designing learning where we started by Analysing the problem, then Designed and Developed the learning before Implementing and then Evaluating it.  It really does make a lot of sense to have this sort of ideal behind making training - I mean, can you imagine the outcome if you never analysed or evaluated?  Yes, that was pretty much all training 20 years ago (and I bet a fair bit of it is still around today) - ‘this is the way we’ve always done it’ training, without anyone questioning the why, let alone the need for it.  The thing is, though, that ADDIE is still a bit of a cumbersome beast with many hand-off stages that can cause issues.  If you apply the process as written, the iterations take a long time to come about.  It’s great that you’re analysing the problem, but just like front-loading training, if you front-load your analysis things can get missed and forgotten - not to mention the likelihood that things actually change whilst you’re going through the process.  For example, you could be analysing some legislative piece that you need to include in training, have that all set, finish the design and start the development, and the legislation changes (as legislation likes to do).  What do you do now?  According to the model you continue through and this will get picked up in the evaluation - but that’s no good if we want to deliver effective training that complies with the new legislation.  Alternatively we groan and go back to the start of the project - or, what often happens in reality, we do a bit of reverse engineering, chuck in the new changes and go from there.  Whatever we do, the process isn’t going to work as well as we’d like in an ideal world.  And that’s where the problem with ADDIE-type processes lies for me; they rely on an ideal which seldom matches the reality of the modern world.
I’ve talked about the technological acceleration we’re currently living under, but there’s always a constant change in any business in any organisation that we need to be aware of and to tap in to.

That brings us to a more modern and agile form of instructional design: SAM.  SAM isn’t just a more modern and friendly-sounding buddy; it’s a streamlined, more responsive process for our ID.  SAM stands for Successive Approximation Model, which as the name suggests means that rather than designing the perfect fit for an ideal world, we hit the ground running and then continually improve what we’ve done to get it right.  If ADDIE is a linear approach with a single cycle, SAM is an iterative approach that relies on as many iterations as necessary to achieve your aim.

My problem with SAM is two-fold: firstly it’s a model, and all models inherently rely on ideal situations to work; secondly its focus is heavily on the iterations, and in reality too many iterations cause many projects to fail and run out of budget.  I’m not anti-SAM, I just believe that rather than having discrete iterations we should concentrate on a more analogue approach of evolving what we’re doing.  For me, whether we’re cycling around a model or working in a linear fashion, we’re still working in a pre-formed-solution type of environment.

All that sounds great, but this blog isn’t called the Nth Degree just because my name starts with N (although… yeah… partly); it’s because we take things a little further (and sometimes a little too far!).  For me the SAM ideals are great, but they’re still ideals, and essentially I think we can boil it all down to a single thing: evolving.  I’ve been discussing this model a little with like-minded individuals around the world (yes, the beauty of Twitter) and I’m still not convinced whether my own model is a great idea or I’m just a bit lazier when it comes to ID (… and project management and work as a whole).  Anyway, let me test it out on you - I’m grateful for any comments.  The theory for Evolve is this: you don’t waste inordinate amounts of time in complex design and analysis phases that always seem out of date by the time you complete them.  You don’t write reams of documentation about the design, or indeed about much of anything.  At the same time, forget about mocking up a solution that you have to keep going backwards and forwards over in a gazillion iterations.  Forget complex storyboarding and design docs; forget days and hours of formal training needs analysis, interviewing every man and his dog before you get going, complex evaluation scripts and numerous re-designs.  Instead you work together.  Collaboratively.  If it sounds a bit like SAM, it is, but there’s a limit to it.  In SAM the stakeholder kind of knows what they want and you keep at it until you get it right, in a process akin to trial and error, and your ultimate success is measured by the stakeholder.  In Evolve the very aims are developed collaboratively, so the success is shared and you reach that point together.  In my theory that means fewer iterations - more minor tweaks as you work towards the goal.
It’s not just developing a storyboard together like you might in ADDIE but actually taking it back to the very essence of the story and coming up with it together.

Yeah I know, now that I read my own stuff it comes out a bit hammy.  Maybe it’s just SAM without the catchy name, but I think it has principles of both ADDIE and SAM rolled in there - it’s not really the model that matters so much as the way we apply it and work together.  If people are truly invested in a project then they have a greater interest in its outcomes - if they actually feel they’re responsible for the concept and idea then you’re already halfway there.  I’ve worked in a creative team where we’ve come up with an idea together and it’s always been accepted and successful with the team involved; if you can include your ‘clients’ in that team then buy-in is a given and the iterations will reduce regardless of how you define the ‘project’.

Of course there’s a big flaw in trying to make all learning technologies projects evolve; simply put, some people just don’t want to work that way with you.  Some clients like you to go away and do the work they think their money is paying for, and see their job as purely to evaluate what you do.  With that type of client you’re probably best with the ADDIE style.  If they’re really keen to put more into the project, look at SAM, and if they’re really invested and want to work with you then I think Evolve is the model-less model that could really pay dividends.  Of course you can apply an Evolve mindset to an ADDIE or SAM project too - as always it’s about knowing who you’re working with and adjusting your style to match.

Evolve.  A wry grin at myself for defining a way of removing process and creating a freer, more productive environment without having to do the boring bits of work!  Somebody is bound to shoot this down - maybe I will on one of my more process-heavy days :)

Tuesday, 11 November 2014

Assessments and your Learning Technologies

Recently I've been on a flipping mission to get people to flip learning, and I talked about the flipped LMS in detail.  What that really means is that we need to concentrate heavily on the LMS as a tool for assessing learning (rather than holding all the knowledge and learning).  If you're going to achieve that, it stands to reason that your assessments need to be well designed.  The good news is your learning system is pretty much made to achieve this - tracking stuff is what it's good at, after all.

If you're going to properly assess learning it really does help to have a good understanding of learning levels. There are for sure a few models out there, but if we start by looking at Bloom's Taxonomy (a classification of learning) it gives us a simple starting point.  The thing to notice here is that knowledge sits at the lowest level.  What that means in simple terms is that if we set the regurgitation of knowledge as our assessments, we're limiting what our assessment actually tests to the lowest levels.  Think about tests you've done where the only outcome is to recall basic facts.  The big issue is that this doesn't really test beyond that: do you know what it really means, or how to apply that knowledge?

'Knowledge is power' is a misdirection in my opinion.  It's not absolutely wrong, because it's a contributing factor for sure (and the base of the power pyramid perhaps), but actually just knowing 'stuff' probably isn't going to get the job done on its own.  An electrician, for example, can hold all the technical knowledge, legislation and standards, but if they can't actually do the work required, such as stripping wire and operating the tools of the trade, they are entirely limited in their abilities - then you'd have some 'power' issues!  So knowing is at the bottom, closely followed by comprehension or understanding.  Again, great that you understand, but until you really apply your knowledge it's not going to hit the higher notes.  Application is the real turning point in assessments - it shows the ability to take what's been learned and start to demonstrate an improvement in abilities.  After all, that's what we're really after in learning solutions, isn't it?  At least at some level we're capability building, and that relies on at least some level of application.
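To make that ordering concrete, here's a minimal sketch of Bloom's levels as an ordered scale, with a helper that flags whether an assessment item aims above simple recall. The level names come from Bloom's original taxonomy; the function and variable names are just made up for this illustration, not from any LMS or library.

```python
# Bloom's Taxonomy as an ordered scale, lowest level first.
# Illustrative only - the level names are Bloom's; everything
# else is invented for this sketch.
BLOOM_LEVELS = [
    "knowledge",      # recall facts
    "comprehension",  # explain what they mean
    "application",    # use them to do the work
    "analysis",       # break complex situations down
    "synthesis",      # pull the pieces together into a plan
    "evaluation",     # judge effectiveness
]

def aims_above_recall(level: str) -> bool:
    """True if an assessment targets application or higher."""
    return BLOOM_LEVELS.index(level) >= BLOOM_LEVELS.index("application")
```

So a quiz that only asks learners to recite the standards sits right at the bottom of the list, while the electrician actually stripping wire is at least at application.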

From here there are a number of higher levels of learning - from being able to analyse more complex issues through to synthesis and evaluation.  The good news is that these fit very neatly into types of assessment that we can put together and track in our LMS.  First up, let's think about scenarios, as they hold the key to lots of good assessments.  I wrote a piece on scenarios many moons ago and most of that holds true.  By setting a decent scenario you can certainly get learners to analyse information, and by taking it further they can evaluate the effectiveness too.  There are also plenty of tools out there to achieve this, like the Storyline series and Captivate - but even using LMS built-in tools like the excellent Quiz tool in Moodle and Totara you can create engaging scenarios.  One thing to realise is that a multi-choice test doesn't have to mean a simple knowledge check.  If you write your scenarios well there's actually nothing wrong with presenting the options as choices - you can then get learners to analyse the situation and select an appropriate course of action.  I love getting them to reflect on and evaluate their choices later too.

Another good type of assessment to use is the synthesis path.  For an LMS this is harder to fit into a standard quiz-type assessment, but most learning systems will allow you to add uploads for assessment.  I've seen this most effectively used when asking learners to develop a plan or course of action.  One good example you may have seen is the 'Fire Plan', where you work through a fire plan with your children in case of a fire - escape routes etc.  It's a good example of synthesis in that you put together the plan based upon your learning, pulling in all the important 'stuff' that is required.  The disadvantage is that this requires manual grading, but at times a meaningful assessment requires this.  Of course you can flip this around and have a forum-type assessment where you actually get other learners to evaluate each other's plans.  You're then hitting all the high notes, with synthesis and evaluation displayed.  I've also seen this work really well using groups and a wiki-type approach - that is, where the group puts together a plan and modifies it before submitting.  Again, using social learning techniques, peer review is another valuable tool (and there's less emphasis on the trainer again).
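The peer-review version of this could be sketched as simple data: each learner uploads a plan, other learners score it, and the final mark comes from the peer scores. To be clear, this is a hypothetical illustration - the field names, file names and scoring scheme are all invented, not a Moodle or Totara format.

```python
# Hypothetical sketch of a peer-reviewed synthesis assessment:
# each learner uploads a plan and other learners score it.
# All names, files and numbers here are invented for illustration.
from statistics import mean

submissions = {
    "alice": {"plan": "fire-plan-alice.pdf", "peer_scores": [7, 8, 6]},
    "bob":   {"plan": "fire-plan-bob.pdf",   "peer_scores": [9, 8]},
}

def peer_grade(learner: str) -> float:
    """Average the peer scores given to a learner's uploaded plan."""
    return round(mean(submissions[learner]["peer_scores"]), 1)
```

The point isn't the arithmetic - it's that the evaluating is done by the learners themselves, so the evaluators are exercising the top of the taxonomy while the authors exercise synthesis.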

Another important thing it's easy to forget is that not all learning has to be demonstrated through an exam, quiz or assignment-type activity.  One of the best ways to evaluate the effectiveness of training is through on-job assessment.  Use your learning system to track how someone is performing on the job by having assessments that are completed by a supervisor.  I've seen this work really well where the assessments have been completed on a tablet, straight into the LMS.
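An on-job assessment like that is really just an observation checklist recorded against the learner. Here's a minimal sketch of the idea - every field name and criterion below is hypothetical, invented purely to show the shape of the record a supervisor might complete on a tablet.

```python
# Hypothetical on-job observation record completed by a supervisor.
# All field names and criteria are invented for this illustration.
observation = {
    "learner": "bob",
    "assessor": "supervisor-jane",
    "date": "2014-11-11",
    "criteria": {
        "isolates circuit before work": True,
        "uses correct tools": True,
        "completes test sheet": False,
    },
}

def competent(obs: dict) -> bool:
    """Sign the learner off only when every criterion was observed."""
    return all(obs["criteria"].values())
```

Because one criterion wasn't observed, this record wouldn't sign the learner off yet - the supervisor reassesses on the job rather than setting another quiz.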

Finally, assessments are a really important part of the learning process - it's important to recognise that assessments aren't just knowledge checks and that they can actually be a part of the learning itself.  Those of you who have read some of my older posts will know that I'm actually a big fan of pull-type learning, and the assessment can be used to drive the learning rather than just measure it...

Thursday, 6 November 2014

Flipped Classroom, Flipped LMS, Flipped Conference... Flipped Motivation?

So if you've followed any of my recent blogs you'll know I'm a fan of improving learning by looking at the techniques we use - flipping is one of those.  When we flip things like the classroom, it's about really getting the most out of the facilitators rather than just using them to dump knowledge - the face-to-face time is used to apply rather than just regurgitate.  With flipping the LMS I talked about how we can't expect the LMS to hold all the knowledge either, and that our real focus there should be on the assessments.

A recent conversation with +Ryan Tracey led us to stumble upon a great idea... the flipped conference.  The simple concept: less presentation and more interaction, in the sessions themselves and not just in the breaks.  It's really just the flipped classroom idea I guess, but if the theory is good enough for classrooms and 'lecture' theatres then it applies equally well to conferences.  It's again about sharing and collective knowledge and experiences rather than continuing along with the all-knowledgeable teacher and know-nothing student model, which should be put as far behind us as possible.

So now for another new flip: Flipped Motivation.  This came from another discussion (yes, walking the walk of learning means I usually get ideas from others!), this time with +Kari Scrimshaw at the end of the #NZATD conference.  I was commenting on the old idea of learning motivation, what over the years has been termed WIIFM (what's in it for me).  It's the idea that if you want people to be motivated to do your learning then you have to show them something they gain from it: promotion, pay rise, carrot-and-stick ideas.  We then moved on to the healthier ideas of altruism and doing things for others rather than yourself.  Funny thing is, if you flip ME you get WE; flipping motivation, should we be asking what's in it for we instead of me?
Call me an optimist, but I think most people actually do want to do something that benefits others.  Isn't this a lot of the theory behind social learning?  You comment on wikis or contribute to discussion boards or Twitter chats not as your own sounding board or to broadcast, but to try and raise knowledge for others as well as yourself.  It's all about the interactions, so focusing on and helping others is perhaps the greatest way to achieve the biggest gains out of social learning.

I'm really against the idea that knowledge is power and that you get it by hoarding knowledge and keeping it to yourself.  Knowledge only has power when it's shared and applied, and if it can be shared and applied for the greater good then even better.