Here’s a useful assessment rubric created by the University of Wisconsin – Stout that can be applied to ePortfolios. I would consider adding links within the criteria to exemplars of best practice but I think it provides a solid basis for evaluating student work.
Today is all about pinning down the most appropriate types of assessments for this subject. Yesterday I think I got a little caught up in reviewing the principles of good assessment – which was valuable, but that effort might be better applied to reviewing and refining the ideas that I come up with.
For what it’s worth, these are the notes that I jotted down yesterday that I want to bear in mind with these assessments. DDLR Assessment ideas
Looking over the four versions of this subject that my colleague J has run in the last two years has been particularly enlightening – even if I’m not entirely clear on some of the directions taken. The course design changed quite substantially between the second and third iterations – from a heavily class-based activity and assessment focus to more of a project-based structure. (For convenience I’ll refer to the subjects as DDLR 1, 2, 3 and 4)
DDLR 1 and 2 provide an incredibly rich resource for learning to use eLearn (our Moodle installation) and each week is heavily structured and scaffolded to guide learners through the process of developing their online courses. The various elements of the units of competency are tightly mapped to corresponding activities and assessment tasks – more so in DDLR 2. (Image from the DDLR subject guide)
I have to wonder however whether the course provides too much extra information – given the relatively narrow focus on designing and developing learning resources. Getting teachers (the learner cohort for this subject) to learn about creating quizzes and assignments in Moodle is certainly valuable but are these truly learning resources? This may well be one of the points where my approach to this subject diverges.
The shift in approach in DDLR 3 and DDLR 4 is dramatic. (As far as a diploma level course about designing learning resources might be considered dramatic, at least.) The assessments link far more closely to the units of competency and all save the first one are due at the end of the subject. They are far more formally structured – template based analysis of the target audience/learners, design documents, prototypes and finished learning resources, as well as a reflective journal.
It does concern me slightly that this subject has a markedly lower rate of assessment submission/completion than the two preceding ones. That said, this subject is often taken by teachers more interested in the content than in completing the units of competency, and that may just have been the nature of this particular cohort.
This new assessment approach also seems far more manageable from a teaching/admin perspective than the previous ones, which required constant grading and checking.
My feeling is that this is a more sustainable approach but I will still look for ways to streamline the amount of work that is required to be submitted.
The next step was to map the various elements of competency to assessment items. The elements for the two units of competency are written differently enough that they need to be considered separately (unfortunately), but they both still broadly sit within the ADDIE (Analyse, Design, Develop, Implement, Evaluate) framework. ADDIE seems like a useful way to structure both the course and the assessments, so I have mapped the elements to this. I have also highlighted particular elements that are more indicative of outputs that might be assessed. Working through the analysis process will be quite dry (and could potentially come across as slightly patronising), so finding an engaging approach to this will be important.
(I’m also quite keen to bring digital badges into this process somehow, though that’s a lower priority at the moment)
Finally, I had a few ideas come to me as I worked through this process today that I might just add without further comment.
DDLR / DDeLR ideas
Get the class to design and develop a (print-based?) checklist / questionnaire resource that might be used to address the DDLR 1 and DDeLR 1 UoCs. Get someone else in the class to use it to complete their Analysis phase.
Can I provide a range of options for the forms the assessment/resource pieces might take?
Try to develop a comprehensive checklist that teachers can use on the resources that they produce, to raise the overall quality of resources at CIT. (Again, this could be a student-led tool – the benefit of this is that it makes them think much more about what a good resource requires – does this meet any UoCs??)
Convert the print-based Analysis document into a web resource – book tool or checklist maybe? Also possibly fix the print-based one first – starting from a deliberately badly designed, faulty version. (Lets me cover some readability / usability concepts early)
How much of this subject is leading the learners by the hand? How much is about teaching them how to use eLearn tools?
Could one of the learning resources be about developing something that teaches people how to use a particular eLearn tool???
Need to identify what kinds of resources teachers can make. Good brainstorm activity in week 1.
Think about the difference between creating a learning resource and finding one and adding it to your course. (Still important but tied to the UoC?)
If I give teachers the option to use previously developed resources (authenticity issues??), they should still provide some kind of explanatory document AND/OR edit the resource and discuss what changes they made and why.
Need to consider the relative strengths and weaknesses of the various types of tools.
In-class feedback of learning resources to better support the evaluation and implementation based elements of competency.
One activity (possible assessment) could be for learners to gather the information needed to do an analysis from a partner in the group (and vice versa). This might lead to a more critical examination of what information is being sought. The learner might even provide suggestions for design/development?
It’s been ten years since I last taught a formal subject. (But I’ve run a bucket-load of workshops and provided a lot of one-to-one training and support in that time)
I thought it might be useful to document my process as I continue to design and develop this course over the next three weeks. Obviously I’ve already spent a fair amount of time looking over the units of competency (linked above) – the holy documents within VET that define exactly what a learner needs to be able to demonstrate at the end of the course. These also outline the types of evidence that can be used to demonstrate competency and offer additional suggestions for interpreting the elements that make up the units of competency.
(As a side note, I know a number of people in Higher Education – the university sector – who shudder when they hear competency mentioned, but it has been interesting to note how frequently it does seem to be coming up in discussions of the future of adult learning lately)
I’ve also spent a decent amount of time looking over the courses designed and delivered by my colleagues and dug down into the approaches that they have taken – as well as having long chats with them and other people on my team. (The logical thing to do would be to just tweak and re-deliver their old course but where’s the fun in that?)
There is also a practical consideration in refreshing my own course design and development skills. I have even toyed with the idea of trying to gamify the entire course but that seems unnecessarily over-ambitious. Maybe next year, when I have a better sense of how this subject runs in a conventional form.
So I started by blocking out exactly what it is that I need to accomplish in these three weeks. (Clearly other things will arise that will take priority but it is currently holidays so I have two weeks – boss free also – to get stuck into things like this with hopefully minimal disturbance) The final week is left free for feedback, editing and contingencies.
Assessment seems the logical place to start as I know what I need to cover, I just need to make sure that the learners provide enough evidence that they know it as well.
As with all good lists, the first item is to make the list. Nice to get a quick win on the board. From there it seemed prudent to revisit the subject I recently took as a student about assessment for some inspiration and that has given me some handy tools and processes that I might have eventually arrived at myself but not nearly as quickly. Once I’ve designed the assessments I think I’ll come back to this to make sure I haven’t missed anything vital.
Ensuring that the assessments are targeted at the appropriate Australian Qualifications Framework (AQF) level – 5 in this instance – and that every element of both units of competency is addressed are the key factors here.
Taking another look at the assessments that the previous teacher of this subject – Jo – designed comes next. She’s run this subject four times now and so has had a good opportunity to refine her assessment tools. Being part of the Education Design and Technology team, we all maintain high quality online courses and Jo has invited me to make use of anything I find in her courses. (Thanks Jo – with four courses and subject guides to pore over, I might be some time)
This post by Cathy Moore (and another that I came across not too long ago here at Computing Education Blog ) struck a chord with me. In essence, they are both saying that learners can benefit from having their skills and knowledge tested right from the beginning of a subject. Whether it involves participating in a scenario or completing some kind of formative assessment, putting this activity up front lets your learners see what they are expected to know, what they don’t currently know, and why this is a relevant and worthwhile part of their studies. The odds are pretty good that they will fail the scenario or quiz or whatever the first time around, but as long as we make it clear that this is OK and that it’s just a part of learning, the memories of this experience will give context and meaning to everything else that they learn afterwards. I took this approach perhaps a little inadvertently in a digital literacy course that I trialled last year. I wanted to test the value of a particular quiz