Category Archives: assessment

Ed Tech must reads – column #20

First published in Campus Morning Mail 1st Feb 2022

How not to write about HyFlex or online learning from Bryan Alexander

While most academic discourse follows intellectually rigorous conventions, there is one area that seems resistant to them. Commentary about technology enhanced and online learning, particularly from those who are new to it, often reveals a lack of understanding of the field and dwells instead on anecdata and laments for the good old days. Bryan Alexander steps through some of the most common flaws in these kinds of pieces in this entertaining post that calls for better conversations about this space. 

Reverse engineering the multiple-choice question from The Effortful Educator

Multiple-choice questions (MCQs) are invaluable for making assessment at scale manageable and providing learners with quick feedback about their understanding of material. As learning tools though, they can be superficial and rarely reflect authentic uses of knowledge. The alternative approach to MCQs laid out in this post asks students to craft questions that use provided answers instead – the Jeopardy! approach to quizzing perhaps. While it may be more labour-intensive to assess, this adds a richness to these kinds of questions.

Framework for Ethical Learning Technology from ALT

As the education technology market has grown and usage has become the norm, valid questions have been raised about factors beyond learning and teaching benefits. What are the drivers for businesses and university leadership in using these tools? How do we ensure that the focus stays on what learners need? The UK’s Association for Learning Technology (ALT) is developing a framework in four quadrants – Awareness, Professionalism, Care and Community, and Values – to help guide thinking in this brave new world.

Contemporary Approaches to University Teaching MOOC 2022 from CAULLT

Many universities offer some form of educational development to their teachers, but if yours doesn’t or you would like to supplement it, this MOOC developed by 10 Australian universities under the auspices of the Council of Australasian University Leaders in Learning and Teaching is a particularly rich free course to consider. Enrolments for the 2022 offering (28/2 to 29/7) are now open. It covers everything from Teaching your first class to Collaborative learning and The politics of Australian Higher Education.

Best puzzle games // 10 indie puzzle games you need to try from Cutie Indie Recs

I’ve long believed that education can learn a lot from game design in terms of creating engaging and enriching learning experiences. This nine-minute video from Cutie Indie Recs showcases some of the incredible variety and creativity that can be found in PC and mobile games now. I’m not entirely sure how to convert these to teaching but maybe inspiration will strike.

Ed Tech must reads – Column 12

First published in Campus Morning Mail Tuesday 2nd November 2021

A heutagogical approach for the assessment of Internet Communication Technology (ICT) assignments in higher education from International Journal of Educational Technology in Higher Education (Open Access)

With students increasingly identifying as online content creators, and with academic publishing slowly evolving, it makes sense to harness Internet platforms in their education. Lynch, Sage, Hitchcock and Sage here outline some formal structures to support a more self-determined form of assessment, where learners are as mindful of the external audience for the resources they create in their courses as they are of their teachers. This article offers a comprehensive guide to the theory behind this approach as well as some exemplar rubrics. The only issue I might take is with the breathless excitement about this as a new mode – not to toot my own horn, but I had my students posting blogs for assessment a decade ago. Perhaps without the rich theoretical framework, though.

Bringing Clinical Simulation & Active Learning Strategies into the Classroom During COVID-19 from Healthy Simulation

Medical disciplines have long been leaders in the adoption of technology enhanced learning and teaching, with a particular need to give learners as much authentic practical experience as possible while also being safe and logistically feasible. In this informative but brief post, Amy Curtis describes the practical changes that were required in a university nursing program in the south-east of the US in response to COVID-19.

Administrators are not the enemy from The Chronicle of Higher Education

Brian Rosenberg is the President in Residence at the Harvard Graduate School of Education and pulls no punches in this strongly worded cri de coeur – the subheading is “Faculty contempt for nonfaculty employees is unjustified and destructive”. It isn’t a long read but covers a decent amount of ground about academia, from the primacy of expertise to toxic behaviour in hierarchies.

Introducing design thinking online to large business education courses for twenty-first century learning from Journal of University Teaching & Learning Practice

Vallis (USyd) and Redmond (USQ) discuss the application of design thinking principles – in essence, a more human-centred angle on problem solving – to teaching in business disciplines. In this case study they interview academics and students in a first-year course to delve into its usefulness in practice and find some handy benefits.

Opinion: There’s nothing appealing about the Metaverse from Game Developer

When Facebook is in the news it can be easy to tune out these days, but this opinion piece from Bryant Francis about Mark Zuckerberg’s rebranding of the parent company as ‘Meta’ and their roadmap for a remarkably Second Life-like, all-encompassing virtual social world is worth a read. While this isn’t about the educational applications of such a space, it points out a number of the logical flaws and so-what questions that aren’t yet being discussed enough.

Ed Tech must reads – Column 11

First published Campus Morning Mail, Oct 26th 2021

Team-based quizzes on no budget from Amanda loves to audit

Australia’s favourite lecturer on auditing, Amanda White at UTS, integrates technologies into her teaching to inspire and engage her students. In this post, she shares her approach to creating weekly branching quizzes that are taken first individually and then in small groups to create opportunities for collaborative learning via multiple attempts. She discusses how she has created a solution that bypasses LMS quiz limitations but retains accountability.

Support Designer-Teacher Collaboration in Educational Game Design Using Learning Science Principles from Ma and Harpstead, CHI-PLAY 2021 proceedings

A common concern held about educational technologies is that the tech is prioritised above the pedagogy. This work in progress from Ma and Harpstead (Carnegie Mellon University), presented recently at the Computer-Human Interaction in Play conference, outlines their work on educational game design support frameworks linked closely to evidence-based learning science principles. Given the potential of educational games to create rich, authentic learning experiences, this work shows great promise.

Vale Mihaly Csikszentmihalyi from Jane McGonigal (Twitter)

For people with an interest in learner engagement, motivation and productivity, the loss of Csikszentmihalyi last week was a sad moment. His 1990 book Flow: The Psychology of Optimal Experience developed the idea of a ‘flow state’, the sweet spot between challenge and skill where people find themselves fully absorbed in an activity. This has been highly influential on education, game design, and games in education. The comments below this tweet from McGonigal, an influential figure in serious game design thinking, offer a taste of the impact his work had.

UTS Video Meetup #10 Podcasting, Live-streaming and designing educational media Tues Oct 26, 2:00 pm – 3:30 pm (AEDT)

This video meetup this afternoon features academics and learning designers from a range of organisations presenting about using educational video (Mark Parry, AISNSW), live-streaming on Twitch (Jamie Chapman, UTAS), Learner generated digital media (Beverley Myles, OpenLearning) and podcasts as learning and teaching resources (Fidel Fernando, Macquarie Uni).

Towards a taxonomy of assessment types – webinar/workshop Thurs Oct 28th 12 noon (AEDT)

Hans Tilstra (Keypath Australia) leads what should be a lively set of activities intended to lead towards a meaningful taxonomy of assessment types in modern tertiary education. This is the final ASCILITE TELedvisors Network webinar for 2021 and caps off a stellar year of these events.

Ed tech must reads – column 3

Originally published in Campus Morning Mail 31st Aug 2021

Identifying, Evaluating, and Adopting New Teaching and Learning Technologies from Educause Review

The most common questions/complaints that I hear as an education technologist from academics wanting to use a new tool in their teaching revolve around the time it takes to add them to institutional systems. “But the nice salesperson told me that it only takes 30 mins to install – why has it been 6 weeks already?” This article from Pat Reid draws back the curtain on many of the things that need to happen behind the scenes to ensure that an education technology is fit for purpose, supportable and able to meet an institution’s many needs. It offers some useful insights into the practical realities that are frequently overlooked in discussions of learning technologies.

Ranking Multiple-Choice Answers to Increase Cognition from The Effortful Educator

Multiple-choice quizzes are a mainstay of online learning because they provide opportunities for learners to check their understanding of course material without the workload overhead to teachers of manually grading hundreds of responses. Legitimate concerns are raised though about whether MCQs test recall vs understanding and how authentic they are in relation to the use of knowledge in practice. This post draws on research in the cognitive sciences to suggest an alternative approach to MCQs, asking students to explain why they think the options are right or wrong. There are clearly workload implications but it’s thought-provoking.

Meet the man behind Tveeder, the no-frills live TV transcript that became an Australian media hero from The Guardian

Captioning and transcription of video for accessibility, and also as a learning resource, has come to the fore in recent years. Tveeder is a Melbourne-based tool that aggregates the captioning feeds from Australian free-to-air TV in real time, for free. Given that many people parse text more quickly than video, and prefer to do so, this offers a handy resource for capturing relevant, real-world information that could be used in many teaching scenarios.

Myth No More – Student Blackmailed by Cheating Provider from The Cheat Sheet

This email exchange between a student and a contract cheating service, shared by academic integrity newsletter The Cheat Sheet, highlights the very real risks students take when they choose to use these services.

Academics talk about The Chair – new podcast

The new Netflix series The Chair, a six-episode dramedy about the wheeling and dealing in an English department at a mid-level American university, has unsurprisingly sparked much discussion in academia. Local Higher Ed notables Inger Mewburn, Narelle Lemon, Megan McPherson and Anitra Nottingham forensically and amusingly dissect the show episode by episode – definitely worth a listen.

Ed Tech must reads – Column 2

Originally published in Campus Morning Mail on 23rd Aug 2021

Current trends in online delivery and assessment in ANZ from @michael_sankey

The Australasian TEL world’s Mr Everywhere, Prof Michael Sankey, recently presented the findings of several ACODE surveys of HE institutions to the Blackboard APAC conference. Unsurprisingly, it shows the sector in the midst of significant change – not entirely brought on by the pandemic but certainly accelerated by it. This wide-ranging slide deck covers the variety of approaches to online exam proctoring, intentions for the lecture, micro-credentials and the kinds of communication and collaboration tools that institutions are using to support student learning.

Pearson unveils Pearson+ platform to address costly college textbook process from ZDNet

After moving from Disney (home of streaming platform Disney+), new Pearson publishing CEO Andy Bird has launched Pearson+, a subscription service for textbooks for US college students. They can either rent a single digital textbook for $9.99 per month or 1500 books for $15.99. What implications could this Brave new direction have for students? Might they find themselves losing access to books if authors get Tangled up in legal actions with the publisher? Will Pearson turn the textbook landscape Inside Out? Time will tell but either way, it’s good to see an organisation not Frozen in place.

Discussing the Stanford AI report on education from @BenPatrickWill on Twitter

Artificial Intelligence (AI) is a hot topic in many spaces and education isn’t spared. University of Edinburgh ed tech researcher Ben Williamson examines a hefty report published last week by Stanford University about work underway to train computer models to ‘understand’ teachers, students and more in this deep Twitter thread. Will an algorithm one day be able to meaningfully replicate the interactions at the heart of good learning and teaching?

Webinar: Rescued from HERDSA21 – Technology’s role in enabling feedback and assessment Thursday 26/8 12 noon AEST

This year’s HERDSA conference was sadly cancelled but planned presentations keep popping up anyway. The ASCILITE TELedvisors Network hosts two of these on Thursday, with Deakin’s Ameena Payne showcasing the benefits and challenges of audio/video feedback and Griffith’s Diana Tolmie discussing the use of ePortfolios among music students. These webinars are always free and recordings are posted to the TELedvisors’ YouTube channel.

Creating art with AI from @artgallerai

On the less daunting side of AI, there are many new tools that let creators work with the bizarre imagination of computers to create beautiful and surreal images. The @GallerAi account on Twitter feeds the VQGAN+CLIP algorithm random poetic phrases like “Deep space dive bar” or “Golden Trojan Horse love bomb” and shares the resulting otherworldly images.

Thoughts on: Reflecting or Acting? Reflective practice and continuing professional development in UK Higher Education (Clegg, Tan and Saeidi, 2002)

Clearly one of the key ingredients in enhancing teaching practice is teacher professional development and a vital element of deriving meaning from this is reflective practice.

It is at this point however that we need to be cautious of the evangelisers of reflective practice as a global solution. “Reflecting or Acting? Reflective Practice and Continuing Professional Development in UK Higher Education” by Sue Clegg, Jon Tan and Saeideh Saeidi (2002) takes a methodical look at the use of reflection and notes that the use of reflective practice in CPD at the time (I’m not sure how much it has evolved since) wasn’t suited to all learners and needed to be anchored in actions taken to be particularly meaningful.

Reflective practice is valued for acknowledging “the importance of artistry in teaching” (p.3), which seems even more important in 2016 than it was in 2002 with the rise of big data and analytics in education sometimes seeming determined to quantify and KPI-ify every single facet of teaching and learning. (Can you tell that I’m more of a qual than a quant?)

Clegg et al investigated the use and value of reflective practice amongst academic staff in accredited CPD between 1995 and 1998. In broad terms (spoiler alert) they tied it to four types of practices/behaviours that reflected the learning preferences and teaching circumstances of the teachers. These preferences – either for ‘writerly’ reflection or not – and the circumstances (which impacted their ability to act on new teaching knowledge) had a significant part to play in how valuable reflection was to them.

The ‘action’ part is at the core of the question that Clegg et al are pursuing. They draw on Tomlinson (1999) in assuming that “the relationship between reflection and action is transparent with reflection-on-action leading to improvement and change” (p.4). This idea has been of interest to me recently because I’ve been involved with the HEA fellowship scheme at my university which appears to have a different focus, seemingly sans action. (I’ll discuss this further in future posts as engaging Fellows seems as though it is going to be an important part of my ongoing quest/research)

As for the learning preference side of the equation, one of the simultaneous strengths and failings of the widely followed reflective practice approach is the emphasis on a very ‘writerly’ style of reflection. Here the paper refers to Bleakley (2000), who has “argued for greater attention to the form of writing and a greater self-awareness of literary accomplishments of narrating and confessional.” The authors note, however, that “our data suggested that some practitioners fail to write or only write as a form of ex post facto justification for accreditation purposes”. Based on the feedback from some of the participants who struggled with the writing element of the task, this can be linked in part to the disciplinary orientation of the learners (i.e. quant vs qual backgrounds) and in some cases to gender-role perceptions – “the feminine reflective side as opposed to the more active masculine doing side of practice” (p.18).

These key factors allowed the authors to sort participants into four groups, based on their practices.

  • Immediate action – participants put new ideas into practice directly after the CPD workshops (and before reflection) (more often novice practitioners)
  • Immediate reflection – participants reflected on their own practices directly after CPD workshops (more often experienced practitioners) – they also found less value in the workshops  in terms of new knowledge
  • Deferred action – some participants were unable to act on knowledge gained in workshops due to organisational/time constraints (this limited their ability to reflect on the impact of new knowledge on their new actions/practices)
  • Deferred reflection – largely participants who struggled to engage with the reflection activity in its current format. Many only did it for accreditation purposes and so saw little benefit in it.

Clegg et al take pains to emphasise that their research is about starting a conversation about the interrelationship between action and reflection and the need to maintain this link. They don’t draw any other conclusions but I think that even by simply looking at on-the-ground interaction with reflective practice, they have given us something to think about.

Reading this paper sparked a few random ideas for me:

  • Perhaps design thinking might offer a way to bridge the gap between the ‘teaching as a craft’ and ‘teaching as an empirical science with hard data’ viewpoints, by applying a more deliberate and structured way of thinking about pedagogy and course design
  • Are there ways that we can foster writing (and some reflection) as a part of every day ongoing CPD for academics? (Without it being seen as a burden? There probably needs to be a goal/outcome/reward that it leads to)
  • Decoupling reflection from action – particularly when action comes in the form of making improvements to practice – gives people less to reflect on and might lead to too much navel gazing.
  • A large part of the work being done on reflective practice by one of my colleagues is focusing on the impact that it has on teacher self-efficacy. Tying it to professional recognition boosts confidence which is valuable but is there a risk that this can in turn lead to complacency or even over-estimation of one’s competence?
  • My personal philosophy when it comes to theory and practice is that none will ever hold all of the answers for all of the contexts. I believe that equipping ourselves with a toolbox of theories and practices that can be applied when needed is a more sustainable approach but I’m not sure how to describe this – one term that I’ve considered is multifocal – does this seem valid?
  • One concern that I have about this study is the large number of contextual factors that it tries to accommodate. These include: “how participants understood their activity including reflective practice, their motivations for joining the course, how they made sense of their decisions to complete or not complete, and whether they thought of this as a conscious decision” (p.7). On top of this there was the level at which the CPD was being conducted (novice teachers vs supervisors), disciplinary and gender differences as well as learning preferences. Maybe it’s enough to acknowledge these but it seems like a lot of variables.
  • Reflection shared with peers seems more valuable than simply submitted to assessors.
  • Even when reflective writing is a new, ‘out of character’ approach, it can be seen as valuable even though it can take learners time to ease into it. Supporting some warm up exercises seems like it would be important in this case.
  • It’s worth taking a harder look at exactly what forms meaningful reflections might take – is there just one ‘writerly’ way or should we support a broader range of forms of expression?
    Audio? Video? Dank memes?
    “Virtually all the descriptions of keeping a journal or gather materials together suggested that they somehow felt they had not done it properly – qualifying their descriptions in terms of things being just scrappy notes, or jottings, or disorganised files, or annotated e-mail collections. Such descriptions suggest that participants had an ideal-typical method of what reflective practice should look like. While the overt message from both courses was that there was no one format, it appears that despite that, the tacit or underlying messages surrounding the idea of reflective practice is that there is a proper way of writing and that it constitutes a Foucauldian discipline with its own rules” (p.16-17)
  • Good reflection benefits from a modest mindset: “one sort of ethos of the course is it requires you to, I don’t know, be a bit humble. It requires you to take a step back and say perhaps I’m not doing things right or am I getting things right, and throw some doubt on your mastery…” (p.17)
  • This is more of a bigger picture question for my broader research – To what extent does the disciplinary background shape the success (or orientation) of uni executives in strategic thinking – qual (humanities) vs quant (STEM)?


The Try-a-Tool-a-Week Challenge: Week 1 – Socrative (vs Kahoot)

Kelly Walsh over at EmergingEdTech seems like quite the Ed Tech advocate and he has started an ongoing series of posts for the next three months focusing on a range of tools.

He has asked people to try the tool and post some comments on his blog. So, what the hell, I’m happy to see where this might go. First up is a basic classroom quiz tool called Socrative.

At first glance, this reminds me of Kahoot, which I’ve looked at before. Socrative appears to use a more serious design style, eschewing the bright colours and shapes of Kahoot for more muted tones. Overall, the Socrative interface is a little more user friendly for both the student and teacher, with a clean, simple and logical design.

Creating a basic quiz in Socrative was a very straightforward process and it was nice to be able to create all of the questions on the same page. I did encounter some problems with creating a multiple-choice question – for some reason it took repeated clicks (and some swearing) in the answer field before I was able to add answers. Editing the name of the quiz wasn’t intuitive either but overall, the process was simpler than with Kahoot.

Running the quiz went reasonably well; however, I did encounter a number of bugs related to network connectivity (3G) and an initially buggy version of the quiz that seemed to crash the entire system. (I had inadvertently added a true/false question twice, once with no correct answer identified. Clumsy on my part perhaps, but I would kind of expect this to be picked up by the tool itself.)

I liked the fact that the student sees both the questions and the answers on their phone and that the feedback appears there as well. Socrative gives three options for running the quiz – Student paced with immediate feedback (correct answers shown on the device upon answering), Student paced – student navigation (the student works through all questions and clicks submit at the end) and Teacher paced, where the teacher takes students through question by question. In the final two options, feedback appears only on the teacher’s computer (presumably connected to a data projector / smart board).

Overall, I rate the usability, look and feel of Socrative above Kahoot, but the connectivity issues are a concern, and I’d say that Kahoot offers a slightly more fun experience for learners by playing up the gamified experience with timers and scoring.


A hierarchy of digital badges – Level 1 Accredited

Part of me thinks it’s a really dumb idea to try to identify a hierarchy for digital badges, and particularly to try to name them. Because the people out there that don’t get badges are often the same kinds of people that get fixated on names for things and let the names blind them to the function or purpose of the thing. (This is why we start getting things called micro-credentials and nano-degrees. Personally I would’ve called them chazzwozzers but that’s just me)

Maybe hierarchy isn’t even the right term – taxonomy could work just as well but I do actually believe that some badges have greater value than others – determined by the depth and rigour of their metadata and their credibility with an audience. (Which isn’t to say that some educators mightn’t find classroom/gamified badges far more valuable in their own practice).

In the discussions that I’ve seen of digital badges, advocates tend to focus on the kinds of badges that suit their own needs. Quite understandable of course but it does feel as though this might be slowing down progress by setting up distracting side-debates about what a valid badge even is.

Here is a quick overview of the badge types that I have come across so far. If I’ve missed something, please let me know.

Level 1 – Accredited 

Accredited badges recognise the attainment of specific skills and/or knowledge with the highest level of accountability. The required elements of these skills are identified in fine detail, multiple auditable assessments are conducted (and ideally reviewed) and supporting evidence of the badge recipient’s skill/knowledge is readily available.

I work in the Vocational Education and Training (VET) sector in Australia, where every single qualification offered is built on a competency-based education framework. Each qualification comprises at least eight different Units of Competency, which are generally broken down into four or five elements that describe highly specific job skills.

VET is a national system meaning that a person undertaking a Certificate Level 4 in Hairdressing is required to demonstrate the same competence in a specific set of skills anywhere in the country. The system is very tightly regulated and the standards for evidence of competence are high. Obviously, other education sectors have similarly high standards attached to their formal qualifications.

Tying the attainment of a Level 1 badge to an existing accredited education/training program seems like a no-brainer really. The question of trust in the badge is addressed by incorporating the rigour applied to the attainment of the existing qualification and having a very clearly defined set of achieved skills/knowledge offers the badge reader clarity about the badge earner’s abilities.

For example, a badge for Apply graduated haircut structures could easily be awarded to a hairdressing apprentice on completion of that Unit of Competency in the Certificate III in Hairdressing. It would include the full details of the Unit of Competency in the badge metadata, which could also include a link to evidence (photos/video/teacher reports) in the learner’s ePortfolio.
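To make that a little more concrete, here is a rough sketch of what the metadata for such a badge might look like, loosely modelled on the Open Badges assertion vocabulary. The unit code, URLs, issuer details and field values are purely illustrative assumptions, not an actual implementation.

```python
# Illustrative only: a possible Open Badges-style assertion for a VET unit of
# competency. Field names loosely follow the Open Badges 2.0 vocabulary; the
# unit code, URLs and issuer details below are made up for this example.
badge_assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "recipient": {"type": "email", "identity": "apprentice@example.com", "hashed": False},
    "issuedOn": "2016-03-01T00:00:00+10:00",
    "badge": {
        "type": "BadgeClass",
        "name": "Apply graduated haircut structures",
        "description": "Unit of competency from the Certificate III in Hairdressing",
        "criteria": {
            # Link out to the full unit of competency so the badge reader can
            # see exactly which elements and performance criteria were assessed.
            "id": "https://training.example.gov.au/units/HYPOTHETICAL123",  # hypothetical unit record
        },
        "issuer": {
            "type": "Issuer",
            "name": "Example Registered Training Organisation",
            "url": "https://rto.example.edu.au",
        },
    },
    "evidence": [
        {
            # Evidence sits in the learner's ePortfolio: photos, video,
            # teacher observation reports and so on.
            "id": "https://eportfolio.example.edu.au/learner123/graduated-haircuts",
            "narrative": "Photos and assessor observation reports from the practical assessment.",
        }
    ],
}
```

The point of the sketch is simply that the rigour lives in the metadata: the criteria field points back to the nationally defined unit, and the evidence field points to the learner's own artefacts, so the badge reader doesn't have to take the claim on trust.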

I use a VET example because that’s what I know best (and because it seems a natural fit for badges) but obviously, any unit in a formal qualification would work just as well.

Next post, I’ll look at Level 2 – Work skills


ePortfolio grading rubric

Here’s a useful assessment rubric created by the University of Wisconsin – Stout that can be applied to ePortfolios. I would consider adding links within the criteria to exemplars of best practice but I think it provides a solid basis for evaluating student work.

https://www2.uwstout.edu/content/profdev/rubrics/eportfoliorubric.html

[Screenshot: ePortfolio assessment rubric]

Final thoughts on DDLR / DDeLR

It feels like I’ve been banging on about this subject forever now but with assessments now finalised, it seems like a good time for a final wrap up.

In broad terms, I was a little disappointed with my students. It might have been a bad time of year to run this subject, with its demanding workload, but the majority of them seem to have only put in the absolute barest effort needed to pass. Assessment instructions which I thought were pretty clear weren’t followed and most of the reflections lacked any great insight. I had to ask many of them to rework and resubmit their assessments just to meet the minimum requirements.

What this does make me ask is whether this is the fault of my students or me.

As I haven’t taught formal classes for more than a decade, there are a lot of things that I haven’t had to deal with in teaching an ongoing subject with rigorous formal assessment. I also have a tendency at times to over-complicate things because it just seems like it makes them better. This probably also extends to my communication with my students and my expectations of them.

Fortunately, I am still keen to try this again.

Even during the marking process, as I had to walk away from the computer and swear at the walls, I was constantly reshaping the course structure, the assessments and the class activities in my mind to help avoid some of the issues that were arising. The fact that a handful of the "good" students were able to understand and follow my instructions suggests that I’m on the right track at least and am not entirely to blame, but the fact that more than a few got things quite wrong tells me that there is more work to be done.

I need to make it clearer that when students are creating draft learning resources, they actually need to be resources – things, objects – rather than broad and loose activity plans for a class. I need to explain clearly that the final learning resources should be the same as the draft learning resources but improved based on testing and feedback. To be honest, these things seemed so self-evident to me that I couldn’t conceive of anyone not getting it, but there we are.

I tried to put into practice a number of ideas that I’ve encountered in the education design community about getting students more involved in designing parts of their own assessments but this really just confused more people than it helped. (Which was a shame as I do believe that it is a valid and valuable approach)

I tried to give my learners freedom to follow their particular learning needs and interests, but for the most part this just gave them the opportunity to follow the path of least resistance and allowed for some fairly lazy work. I also should have factored into my thinking that the first week of a class is often going to be plagued by technical (logins not working) and administrative hassles, and made allowances for this by not expecting too much work to be achieved in the first week. (That said, we have a strong need to demonstrate engagement in class activities to receive funding for students who later drop out, and I was certainly able to prove that.)

I think next time around there will need to be a little less freedom, a bit more structure and a lot more clarity and simplicity.

On the whole I am happy that I have managed to get these teachers doing things they haven’t done before and I think they have developed useful skills and knowledge. I’d just like to do more.