Sometimes you need to spend hours poring over a list of 20,000+ academic journals looking for those related to education in business and economics. I’d advise against it.
Here are the ones that I found, so you don’t have to.
I’m not a discipline specialist, so I can’t speak to quality, but I’ve included their ratings, which will hopefully help.
This blog post will probably be of interest to maybe five other people (including my Mum) but it’s part of a process that I’ve decided on for tracking what I’m doing and maintaining some accountability, so here we are. It’s kind of the beauty of blogging – the whole “long tail” thing – that ultra-niche voices can still have a platform. (And in defence of my own posts, this will be far from the least coherent post that I have read this week.)
So in my last post I mentioned that I’m putting processes and systems in place to help me work better on my PhD. (Well, my PhD proposal, I should say – this has to be approved before I actually embark on the proper research itself.) As I’ve mentioned, my topic is currently very broad, but I expect that over the next year it will come into much sharper focus. The absence of deadlines, however, has meant that I’ve felt I’ve been drifting from one shiny topic to the next. People keep telling me that this is by far the best stage of a PhD and that this opportunity will probably never come again, but there are still things that I need to get done in this time, and having a clearer vision of what they are and when they should be done soothes my soul.
Much of what I’ve done is create an increasingly granular series of tasks and put them in three different tools – the Excel project plan, Wunderlist and Trello. In essence they are all glorified to-do lists but with varying functionality, including attaching teams/people, documents and calendar items, and integrating them with other productivity systems like Slack.
Managing all three seems a little like triple handling, but the spreadsheet should now mainly be about denoting progress, and I like the whole-year perspective that it offers. IFTTT lets me automatically create Wunderlist tasks when I add new cards in Trello, so this should also simplify matters.
(That’s Trello – it looks pretty plain at the moment but has given me a much needed roadmap for the different topic areas that I plan to investigate month by month)
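(For the tinkerers: if IFTTT ever stops cooperating, the same hand-off could be scripted directly. Below is a minimal sketch in Python of what that might look like – it assumes hypothetical credentials and list IDs, uses Trello’s public REST API to read cards, and stands in a placeholder function for the Wunderlist side rather than claiming anything about that service’s exact endpoints.)

```python
# Rough sketch only: mirror new Trello cards into a to-do list.
# Credentials, IDs and the task-creation step are hypothetical placeholders.
import requests

TRELLO_KEY = "your-api-key"      # hypothetical
TRELLO_TOKEN = "your-api-token"  # hypothetical
TRELLO_LIST_ID = "abc123"        # the Trello list being watched (hypothetical)


def fetch_trello_cards(list_id):
    """Return the cards currently on a Trello list via the public REST API."""
    response = requests.get(
        f"https://api.trello.com/1/lists/{list_id}/cards",
        params={"key": TRELLO_KEY, "token": TRELLO_TOKEN},
    )
    response.raise_for_status()
    return response.json()


def create_todo_task(title):
    """Placeholder for creating the matching task in Wunderlist (or any to-do app)."""
    print(f"Would create to-do task: {title}")


def sync(seen_ids):
    """Create a to-do task for each Trello card we haven't seen before."""
    for card in fetch_trello_cards(TRELLO_LIST_ID):
        if card["id"] not in seen_ids:
            create_todo_task(card["name"])
            seen_ids.add(card["id"])


if __name__ == "__main__":
    sync(seen_ids=set())
```

In practice the IFTTT applet does exactly this kind of “new card → new task” mapping without any code, which is why I’m happy to leave it to do the work.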
The July topic – Universities as organisations – aligns well with both a (workplace) university strategic review and an upcoming ACODE University Benchmarking event that I’m hoping will offer some tangible insights into the ‘state of the actual’ in TELT practices. It also ties in to the ongoing discussion/review of ed tech, TELT and the associated governance structures that I’ve been leading at my university. (The powers that be are expecting a report with some recommendations in early August.)
The overall to-do list has gotten a little smaller – for now – though it needs updating and there are some ongoing tasks tied to digitising the myriad handwritten ideas and questions that come to me pretty much day and night now.
All this aside, it’s still hard to shake the feeling that I haven’t made a lot of progress in the last few weeks – or, more accurately, that I haven’t made as much progress as I would’ve liked in terms of reading and writing – but I feel far more ready to do this in a more effective and productive way now.
On a day to day level though, the reading that I’ve been doing and the ideas that I’m starting to synthesise are really starting to feed into my professional practice and are giving me more confidence in the decisions and plans that I’m putting forward.
Clearly one of the key ingredients in enhancing teaching practice is teacher professional development and a vital element of deriving meaning from this is reflective practice.
It is at this point, however, that we need to be cautious of the evangelisers of reflective practice as a global solution. “Reflecting or Acting? Reflective Practice and Continuing Professional Development in UK Higher Education” by Sue Clegg, Jon Tan and Saeideh Saeidi (2002) takes a methodical look at the use of reflection and notes that the uses of reflective practice in CPD current at the time (I’m not sure how much they have evolved since) aren’t suited to all learners and need to be anchored in action to be particularly meaningful.
Reflective practice is valued for acknowledging “the importance of artistry in teaching” (p.3), which seems even more important in 2016 than it was in 2002 with the rise of big data and analytics in education sometimes seeming determined to quantify and KPI-ify every single facet of teaching and learning. (Can you tell that I’m more of a qual than a quant?)
Clegg et al investigated the use and value of reflective practice amongst academic staff in accredited CPD between 1995 and 1998. In broad terms (spoiler alert), they tied it to four types of practices/behaviours that reflected the learning preferences and teaching circumstances of the teachers. These preferences – either for ‘writerly’ reflection or not – and the circumstances (which affected their ability to act on new teaching knowledge) had a significant part to play in how valuable reflection was to them.
The ‘action’ part is at the core of the question that Clegg et al are pursuing. They draw on Tomlinson (1999) in assuming that “the relationship between reflection and action is transparent with reflection-on-action leading to improvement and change” (p.4). This idea has been of interest to me recently because I’ve been involved with the HEA fellowship scheme at my university which appears to have a different focus, seemingly sans action. (I’ll discuss this further in future posts as engaging Fellows seems as though it is going to be an important part of my ongoing quest/research)
As for the learning preference side of the equation, one of the simultaneous strengths and failings of the widely followed reflective practice approach is the emphasis on a very ‘writerly’ style of reflection. Here the paper refers to Bleakly (2000), who has “argued for greater attention to the form of writing and a greater self-awareness of literary accomplishments of narrating and confessional.” The authors note, however, that “our data suggested that some practitioners fail to write or only write as a form ex post facto justification for accreditation purposes”. Based on the feedback from some of the participants who struggled with the writing element of the task, this can be linked in part to the disciplinary orientation of the learners (i.e. quant vs qual backgrounds) and in some cases to gender-role perceptions – “the feminine reflective side as opposed to the more active masculine doing side of practice” (p.18).
These key factors allowed the authors to sort participants into four groups, based on their practices.
Immediate action – participants put new ideas into practice directly after the CPD workshops (and before reflection) (more often novice practitioners)
Immediate reflection – participants reflected on their own practices directly after CPD workshops (more often experienced practitioners) – they also found less value in the workshops in terms of new knowledge
Deferred action – some participants were unable to act on knowledge gained in workshops due to organisational/time constraints (this limited their ability to reflect on the impact of new knowledge on their new actions/practices)
Deferred reflection – largely participants that struggled to engage with the reflection activity in its current format. Many only did it for accreditation purposes so saw little benefit in it.
Clegg et al take pains to emphasise that their research is about starting a conversation about the interrelationship between action and reflection and the need to maintain this link. They don’t draw any other conclusions but I think that even by simply looking at on-the-ground interaction with reflective practice, they have given us something to think about.
Reading this paper sparked a few random ideas for me:
Perhaps Design thinking might offer a way to bridge the gap between the ‘teaching as a craft’ and ‘teaching as an empirical science with hard data’ viewpoints by applying a more deliberate and structured way of thinking about pedagogy and course design
Are there ways that we can foster writing (and some reflection) as a part of every day ongoing CPD for academics? (Without it being seen as a burden? There probably needs to be a goal/outcome/reward that it leads to)
Decoupling reflection from action – particularly when action comes in the form of making improvements to practice – gives people less to reflect on and might lead to too much navel gazing.
A large part of the work being done on reflective practice by one of my colleagues is focusing on the impact that it has on teacher self-efficacy. Tying it to professional recognition boosts confidence which is valuable but is there a risk that this can in turn lead to complacency or even over-estimation of one’s competence?
My personal philosophy when it comes to theory and practice is that no single theory or practice will ever hold all of the answers for all contexts. I believe that equipping ourselves with a toolbox of theories and practices that can be applied when needed is a more sustainable approach, but I’m not sure how to describe this – one term that I’ve considered is multifocal – does this seem valid?
One concern that I have about this study is the large number of contextual factors that it tries to accommodate. These include: “how participants understood their activity including reflective practice, their motivations for joining the course, how they made sense of their decisions to complete or not complete, and whether they thought of this as a conscious decision” (p.7). On top of this there was the level at which the CPD was being conducted (novice teachers vs supervisors), disciplinary and gender differences, as well as learning preferences. Maybe it’s enough to acknowledge these but it seems like a lot of variables.
Reflection shared with peers seems more valuable than simply submitted to assessors.
Even when reflective writing is a new, ‘out of character’ approach, it can be seen as valuable, even though it can take learners time to ease into it. Supporting some warm-up exercises seems like it would be important in this case.
It’s worth taking a harder look at exactly what forms meaningful reflections might take – is there just one ‘writerly’ way or should we support a broader range of forms of expression?
Audio? Video? Dank memes? “Virtually all the descriptions of keeping a journal or gather materials together suggested that they somehow felt they had not done it properly – qualifying their descriptions in terms of things being just scrappy notes, or jottings, or disorganised files, or annotated e-mail collections. Such descriptions suggest that participants had an ideal-typical method of what reflective practice should look like. While the overt message from both courses was that there was no one format, it appears that despite that, the tacit or underlying messages surrounding the idea of reflective practice is that there is a proper way of writing and that it constitutes a Foucauldian discipline with its own rules” (p.16-17)
Good reflection benefits from a modest mindset: “one sort of ethos of the course is it requires you to, I don’t know, be a bit humble. It requires you to take a step back and say perhaps I’m not doing things right or am I getting things right, and throw some doubt on your mastery…” (p.17)
This is more of a bigger picture question for my broader research – To what extent does the disciplinary background shape the success (or orientation) of uni executives in strategic thinking – qual (humanities) vs quant (STEM)?
I’ve been thinking that a core theme of my research – looking at how to support TELT practices in Higher Education – is Continuity and Change. This is a tiny bit tongue in cheek, referencing a deliberately meaningless slogan used initially in the HBO series Veep but later briefly embraced by the Australian Government.
It seems useful because it sums things up fairly well: initiating change to new TELT practices where necessary but also supporting (and incrementally evolving) existing practices when they are already effective.
The Learning and Teaching Support Network (LTSN) Generic Centre – which no longer appears to exist but may have become something else – created a wonderfully thoughtful guide to implementing change in Higher Education in 2003 called “Change Thinking, Change Practices”.
I’ve been poring over this for the better part of a week because it is absolutely packed with insights both from theory (drawing heavily on Social Practice Theory) and a number of case studies. It up-ended a few of my own long-held ideas about implementing change (the need to win hearts and minds before getting started for one) and I think it’s well worth investing the time to read through if you are involved in or considering change in your institution.
Change in a higher ed institution can come from the top-down (a.k.a centre-periphery – the executive), bottom-up (teachers) or middle-out (departments, education support teams). These different sources of change become very important because they reflect different philosophical approaches to change. As with most things, I’d suggest that an approach drawing from all three is most valuable.
The paper identifies five common views of change that feed into these.
Technical/Rational – the top level identifies a need for change, makes a policy and a plan and the plan is enacted precisely
Resource allocation – Change needs resourcing and once this is provided, change will just occur
Diffusionist: Epidemiological – Change is driven by experts and early adopters that can successfully communicate the value of the change and inspire uptake
Kaizen or continuous quality improvement – change is driven incrementally from the bottom (practitioners) working in communities of practice to identify needs in their area
Models using complexity – sponsors (otherwise undefined) of change create the conditions needed for change to flourish by providing resources and knowledge.
Unsurprisingly perhaps, none of these views makes me entirely happy, given my pesky view that the educational ecosystems of institutions are messy and that we need to take a holistic approach to working with them. Fortunately this seems to be the position taken by the guide.
Rather than summarise the whole thing, I’ll explore the themes that emerged in determining conditions for success.
Flexibility
The change that is initially identified and planned for is rarely the change that you’ll end up with. This is generally a good thing because it means that as more people have become involved in the process, they have taken some ownership of it and better informed it. Having the flexibility to allow change to take its own course can generate wider acceptance.
The guide repeatedly comes back to the idea of viewing change as a process (‘changing’) rather than an outcome (‘CHANGE’)
“The innovation was ‘fuzzy’ enough to appeal to a variety of interests and points of view, even competing ones” (p.24)
Contextual awareness and understanding
This brings us neatly to the vital importance of understanding the local needs, history and practices of the place where the change is to be implemented. The guide stresses that incremental change at a departmental level has higher rates of success and provides a number of valuable case studies in support of this.
There’s a relatable but entirely frustrating contradiction about implementing change in a localised context: while change proposals with a solid backing of evidence and knowledge are more widely valued, there is simultaneously a resistance to external influences.
…colleagues will often balk at change unless it was ‘invented here’; they’ll discount foreign innovations. NIH (not invented here) breaks change forces (p.33)
I’ve already seen this on a number of occasions in my time in Higher Education: I’d get excited about something that I’d seen being done elsewhere that seemed particularly relevant to our needs, only to have it met with the most uninterested of mehs. This often surprised me coming from people I would assume to be open to knowledge and all good ideas, but that downplays the tribal/parochial nature of these kinds of organisations.
This in turn led me to a side-thought: is it harder to drive change in an institution that is perceived to be (and considers itself) at the top of the heap? When your branding and culture push the idea of being an elite institution, does this simultaneously facilitate NIH thinking in addition to diminishing the perceived urgency of change?
Incentives
A lot of factors come to bear on practitioner willingness to engage with new practices. The extent to which they have been involved in formulating the change is clearly a significant part, as is their understanding of its benefits. These intrinsic motivators provide deeper engagement with change but take longer. Extrinsic motivators, whether they be direct inducements (more time or resources) or policy directives will get results more quickly but at a shallower level.
I’ve long believed that it is vital to win hearts and minds before embarking on change processes but this guide makes a compelling case that “there is a lot of value in using tools and expertise to change practices: beliefs can follow” (p.21)
This makes sense to me on the level that giving people a lived experience of a change in practice can give them a deeper understanding of it.
Capacity / support
Whatever changes are proposed, it is essential that practitioners have the capacity to enact them. (Evidently this isn’t as obvious as it sounds). Change that builds on existing practice (scaffolded, essentially) thus becomes far more likely to succeed than entirely new practices.
A combination of training, Community of Practice support and the involvement of local support experts – such as education designers and technologists – is essential either way.
Resources / tools
The other facet that seems obvious is the need for adequate resourcing for the project, particularly tools that are fit for purpose. This guide speaks at length about working with lecturers in the planning phase to collaboratively design and build tools (e.g. a new form of rubric) that can be used in practice to implement the changes.
This has the added benefit of creating more relevant and robust tools that incorporate local, contextual needs.
Communication
“Don’t assume that the way you think of an innovation is the way it will be understood on the ground” (p.19)
Language can also be loaded – “for many academic staff, the word ‘quality’ itself had come to symbolise additional administrative burdens which detracted from rather than enhanced their core work” (p.25)
HE institutions are fueled by words – using them well can mean the difference between failure and success. (No pressure)
Accountability mechanisms
A key element in successfully implementing a change process is remembering that it is more about the act of changing, so in some ways it never entirely ends. Putting a rigorous evaluation process into place that is clear about what is to be measured and how makes a massive difference.
There are a lot of other invaluable tips and strategies for effective change processes in this guide, informed by theory and evidence from case studies. It expands greatly on the phases of implementation, considering them as pre-adoption (gathering requirements), adoption (gaining support) and implementation. I compare this with the Ako Aotearoa model described by Akelma (2012) of initiation (arguably pre-adoption/adoption), implementation and institutionalisation.
If you have any involvement whatsoever with change in your HE institution, you need to read this paper.
It was always my intention that researching in the area that I work in would help me to shape my professional practice (and it is) but I’ve been surprised lately at how much things are flowing in the other direction. I’ve been thinking a lot lately about what is needed to make an educational project successful and how we know that learners have actually benefitted.
This is partially coming from the big picture work that I’m doing with my peers at the university, looking at what we’re doing and why, and partially from my own college, which has recently launched a Teaching and Learning Eminence Committee/project to look into what we’re doing with teaching and learning. I wasn’t initially invited onto the committee (it’s all academics), which speaks to some of the ideas that have been emerging in some of my recent posts (the academic/professional divide) as well as the fact that I need to work on raising the profile of my team* and understanding of our* capacity and activities in the college.
Anyway, while trawling through the tweetstream of the recent (and alas final) OLT – Office of Learning and Teaching – conference at #OLTConf2016, I came across a couple of guides published recently by Ako Aotearoa, the New Zealand National Centre for Tertiary Teaching Excellence, that fit the bill perfectly.
One focusses on running effective projects in teaching and learning in tertiary education. It’s kind of project-managementy, which isn’t always the most exciting area for me, but it offers a comprehensive and particularly thoughtful overview of what we need to do to take an idea (which should always be driven by enhancing learning) through the three key phases identified by Fullan (2007 – as cited in Akelma et al, 2011) in the process of driving educational change – initiation, implementation and institutionalisation. The guide – Creating sustainable change to improve outcomes for tertiary learners – is freely available on the Ako Aotearoa website, which is nice.
I took pages and pages of notes and my mind wandered off into other thoughts about immediate and longer term things to do at work and in my research, but the key themes running through the guide were treating change as a process rather than an event, being realistic, working collectively, being honest and communicating well. It breaks down each phase into a number of steps (informed by case studies) and prompts the reader with many pertinent questions to ask of themselves and the project along the way.
The focus of the guide is very much on innovation and change – I’m still thinking about what we do with the practices that are currently working well and how we can integrate the new with the old.
The second guide – A Tertiary practitioner’s guide to collecting evidence of learner benefit – drills down into useful research methodologies for ensuring that our projects and teaching practices are actually serving the learners’ needs. Again, these are informed by helpful case studies and showcase the many places and ways that we can collect data from and about our students throughout the teaching period and beyond.
It did make me wonder whether the research mindset of academics might conventionally be drawn from their discipline. Coming from an organisation with an education and social science orientation, one might expect an emphasis on the qualitative (and there are a lot of surveys suggested – which I wonder about as I have a feeling that students might be a little over-surveyed already) but the guide actually encourages a mixture of methodologies and makes a number of suggestions for merging data, as well as deciding how much is enough.
Definitely some great work from our colleagues across the ditch and well worth checking out.
I shared some thoughts and summarised some of the discussions tied to the issues we face in supporting and driving institutional change, working with organisational culture and our role as professional staff experts in education design and technology.
There’s still much to talk about. Technology and what we need it to do, practical solutions both in place and under consideration / on the wishlist, further questions and a few stray ideas that were generated along the way.
Technology:
Unsurprisingly, technology was a significant part of our conversation about what we can do in the education support/design/tech realm to help shape the future of our institutions. The core ideas that came up included what we are using it for, and how we sell it to – and instil confidence in it among – our clients: teachers, students and the executive.
The ubiquity and variety of educational technologies means that they can be employed in all areas of the teaching and learning experience. It’s not just being able to watch a recording of the lecture you missed or to take a formative online quiz; it’s signing up for a course, finding your way to class, joining a Spanish conversation group, checking for plagiarism, sharing notes, keeping an eye on at-risk students and so much more.
It’s a fine distinction but Ed Tech is bigger than just “teaching and learning” – it’s also about supporting the job of being a teacher or a learner. I pointed out that the recent “What works and why?” report from the OLT here in Australia gives a strong indication that the tools most highly valued by students are the ones that they can use to organise their studies.
Amber Thomas highlighted that “…better pedagogy isn’t the only quality driver. Students expect convenience and flexibility from their courses” and went on to state that “We need to use digital approaches to support extra-curricular opportunities and richer personal tracking. Our “TEL” tools can enable faster feedback loops and personalised notifications”
Even this is just the tip of the iceberg – it’s not just tools for replicating or improving analogue practices – the technology that we support and the work we do offer opportunities for new practices. In some ways this links back closely to the other themes that have emerged – how we can shape the culture of the organisation and how we ensure that we are part of the conversation. A shift in pedagogical approaches and philosophies is a much larger thing than determining the best LMS to use. (But at its best, a shift to a new tool can be a great foot in the door to discussing new pedagogical approaches.)
“It is reimagining the pedagogy and understanding the ‘new’ possibilities digital technologies offer to the learning experience where the core issue is” (Caroline Kuhn)
Lesley Gourlay made a compelling argument for us to not throw out the baby with the bathwater when it comes to technology by automatically assuming that tech is good and “analogue” practices are bad. (I’d like to assume that any decent Ed Designer/Tech knows this but it bears repeating and I’m sure we’ve all encountered “thought leaders” with this take on things).
“we can find ourselves collapsing into a form of ‘digital dualism’ which assumes a clear binary between digital and analogue / print-based practices (?)…I would argue there are two problems with this. First, that it suggests educational and social practice can be unproblematically categorised as one or the other of these, where from a sociomaterial perspective I would contend that the material / embodied, the print-based / verbal and the digital are in constant and complex interplay. Secondly, there perhaps is a related risk of falling into a ‘digital = student-centred, inherently better for all purposes’, versus ‘non-digital = retrograde, teacher-centred, indicative of resistance, in need of remediation’.” (Lesley Gourlay)
Another very common theme in the technology realm was the absolute importance of having reliable technology (as well as the right technology.)
“Make technology not failing* a priority. All technology fails sometime, but it fails too often in HE institutions. Cash registers in supermarkets almost never fail, because that would be way too much of a risk.” (Sonia Grussendorf)
When it comes to how technology is selected for the institution, a number of people picked up on the tension between having it selected centrally vs by lecturers.
“Decentralize – allow staff to make their own technology (software and hardware) choices” (Peter Bryant)
Infrastructure is also important in supporting technologies (Alex Chapman)
Personally I think that there must be a happy medium. There are a lot of practical reasons that major tools and systems need to be selected, implemented, managed and supported centrally – integration with other systems, economies of scale, security, user experience, accessibility, etc. At the same time we also have to ensure that we are best meeting the needs of students and academics in a host of different disciplines, and are able to support innovation and agility. (When it comes to the selection of any tool I think that there still needs to be a process in place to ensure that the tool meets the needs identified – including those of various institutional stakeholders – and can be implemented and supported properly.)
Finally, Andrew Dixon framed his VC elevator pitch in terms of a list of clear goals describing the student experience with technology which I found to be an effective way of crafting a compelling narrative (or set of narratives) for a busy VC. Here are the first few:
They will never lose wifi signal on campus – their wifi will roam seamlessly with them
They will have digital access to lecture notes before the lectures, so that they can annotate them during the lecture.
They will also write down the time at which difficult sub-topics are explained in the lecture so that they can listen again to the captured lecture and compare it with their notes. (Andrew Dixon)
Some practical solutions
Scattered liberally amongst the discussions were descriptions of practical measures that people and institutions are putting in place. I’ll largely let what people said stand on its own – in some cases I’ve added my thoughts in italics afterwards. (Some of the solutions I think were a little more tongue in cheek – part of the fun of the discussion – but I’ll leave it to you to determine which)
Culture / organisation
Our legal team is developing a risk matrix for IT/compliance issues (me)
(We should identify our work) “not just as teaching enhancement but as core digital service delivery” (Amber Thomas)
“we should pitch ‘exposure therapy’ – come up with a whole programme that immerses teaching staff in educational technology, deny them the choice of “I want to do it the old fashioned way” so that they will realise the potential that technologies can have…” (Sonja Grussendorf)
“Lets look at recommendations from all “strategy development” consultations, do a map of the recommendations and see which ones always surface and are never tackled properly.” (Sheila MacNeill)
“Could this vision be something like this: a serendipitous hub of local, participatory, and interdisciplinary teaching and learning, a place of on-going, life-long engagement, where teaching and learning is tailored and curated according to the needs of users, local AND global, actual AND virtual, all underscored by data and analytics?” (Rainer Usselman)
“…build digital spaces to expand our reach and change the physical set up of our learning spaces to empower use of technology…enable more collaborative activities between disciplines” (Silke Lange)
“we need a centralised unit to support the transition and the evolution and persistence of the digital practice – putting the frontliners into forefront of the decision making. This unit requires champions throughout the institutions so that this is truly a peer-led initiative, and a flow of new blood through secondments. A unit that is actively engaging with practitioners and the strategic level of the university” (Peter Bryant)
In terms of metrics – “shift the focus from measuring contact time to more diverse evaluations of student engagement and student experience” (Silke Lange)
“Is there a metric that measures teaching excellence?… Should it be designed in such a way as to minimise gaming? … should we design metrics that are helpful and allow tools to be developed that support teaching quality enhancement?” (David Kernohan) How do we define or measure teaching excellence?
“the other thing that we need to emphasise about learning analytics is that if it produces actionable insights then the point is to act on the insights” (Amber Thomas) – this needs to be built into the plan for collecting and dealing with the data.
Talking about the NSS (National student survey) – “One approach is to build feel-good factor and explain use of NSS to students. Students need to be supported in order to provide qualitative feedback” (David Kernohan) (I’d suggest that feedback from students can be helpful but it needs to be weighted – I’ve seen FB posts from students discussing spite ratings)
“We should use the same metrics that the NSS will use at a more granular levels at the university to allow a more agile intervention to address any issues and learn from best practices. We need to allow flexibility for people to make changes during the year based on previous NSS” (Peter Bryant)
“Institutional structures need to be agile enough to facilitate action in real time on insights gained from data” (Rainer Usselmann) – in real time? What kind of action? What kind of insights? Seems optimistic
“Institutions need at the very least pockets of innovation /labs / discursive skunk works that have licence to fail, where it is safe to fail” (Rainer Usselmann)
“Teachers need more space to innovate their pedagogy and fail in safety” (Silke Lange)
“Is it unfair (or even unethical) to not give students the best possible learning experience that we can?…even if it was a matter of a control group receiving business-as-usual teaching while a test group got the new-and-improved model, aren’t we underserving the control group?” (me)
“I can share two examples from my own experiences
An institution who wanted to shift all their UG programmes from 3 year to 4 year degrees and to deliver an American style degree experience (UniMelb in the mid 2000s)
An institution who wanted to ensure that all degree programmes delivered employability outcomes and graduate attributes at a teaching, learning and assessment level
So those resulted in;
a) curriculum change
b) teaching practice change
c) assessment change
d) marketing change ” (Peter Bryant)
“One practical option that I’m thinking about is adjusting the types of research that academics can be permitted to do in their career path to include research into their own teaching practices. Action research.” (Me) I flagged this with our Associate Dean Education yesterday and was very happy to hear that she is currently working on a paper for an education focussed journal in her discipline and sees great value in supporting this activity in the college.
“I think policy is but one of the pillars that can reinforce organisational behaviour” (Peter Bryant) – yes, part of a carrot/stick approach, and sometimes we do need the stick. Peter also mentions budgets and strategies; I’d wonder whether these don’t so much change behaviour as support change already embarked upon.
Technology
“let’s court rich people and get some endowments. We can name the service accordingly: “kingmoneybags.universityhandle.ac.uk”. We do it with buildings, why not with services?” (Sonia Grussendorf) – selling naming rights for TELT systems just like buildings – intriguing
We need solid processes for evaluating and implementing Ed Tech and new practices (me)
Pedagogical
“Could creating more ‘tailored’ learning experiences, which better fit the specific needs and learning styles of each individual learner be part of the new pedagogic paradigm?” (Rainer Usselman) (big question though around how this might be supported in terms of workload)
“At Coventry, we may be piloting designing your own degree” (Sylvester Arnab)
“The challenge comes in designing the modules so as to minimise prerequisites, or make them explicit in certain recommended pathways” (Christopher Fryer)
I went on to suggest that digital badges and tools such as MyCourseMap might help to support this model. Sylvester noted that he is aware that “these learning experiences, paths, patterns, plans have to be validated somehow”. Learner convenience over pedagogy – or is it part of pedagogy, in line with adult learning principles of self-efficacy and motivation? In a design-your-own-degree course, how do we ensure that learners don’t just choose the easiest subjects – how do we avoid the trap of having learners think they know enough to choose wisely?
“digital might be able to help with time-shifting slots to increase flexibility with more distributed collaboration, flipped teaching, online assessment” (George Roberts)
“At UCL we are in the midst of an institution-wide pedagogic redesign through the Connected Curriculum. This is our framework for research-based education which will see every student engaging in research and enquiry from the very start of their programme until they graduate (and beyond). More at http://www.ucl.ac.uk/teaching-learning/connected-curriculum
The connected bit involves students making connections with each other, with researchers, beyond modules and programmes, across years of study, across different disciplines, with alumni, employers, and showcase their work to the wider world…
There is strong top-down support, but also a middle-out approach with faculties having CC fellows on part time secondments to plan how to introduce and embed the CC in their discipline.
From a TEL perspective we need to provide a digital infrastructure to support all of this connectivity – big project just getting going. Requirements gathering has been challenging… And we’re also running workshops to help programme and module teams to design curricula that support research-based and connected learning.” (Fiona Strawbridge) – liking this a lot, embedding practice. What relationship do these fellows have with lecturers?
“I am imagining that my research, personal learning environment would fit perfect with this approach as I am thinking the PLE as a toolbox to do research. There is also a potential there to engage student in open practice, etc.” (Caroline Kuhn)
“There may be a “metapedagogy” around the use of the VLE as a proxy for knowledge management systems in some broad fields of employment: consultancy, financial services, engineering…” (George Roberts) (which I’d tie to employability)
“We need to challenge the traditional model of teaching, namely didactic delivery of knowledge. The ways in which our learning spaces are currently designed -neat rows, whiteboard at front, affords specific behaviours in staff and students. At the moment virtual learning spaces replicate existing practices, rather than enabling a transformative learning experience. The way forward is to encourage a curricula founded on enquiry-based learning that utilise the digital space as professional practitioners would be expected to” (Silke Lange) – maybe but none of this describes where or how lecturers learn these new teaching skills. Do we need to figure out an evolutionary timeline to get to this place, where every year or semester, lecturers have to take one further step, add one new practice?
“Do not impose a pedagogy. Get rid of the curricula. Empower students to explore and to interact with one another. The role of the teacher is as expert, navigator, orienteer, editor, curator and contextualisor of the subject. Use heuristic, problem-based learning that is open and collaborative. Teach students why they need to learn” (Christopher Fryer)
This is but a cherry-picked selection of the ideas and actions that people raised in this hack but I think it gives a sense of some of the common themes that emerged and of the passion that people feel for our work in supporting innovation and good practices in our institutions. I jotted down a number of stray ideas for further action in my own workplace as well as broader areas to investigate in the pursuit of my own research.
As always, the biggest question for me is that of how we move the ideas from the screen into practice.
Further questions
How are we defining pedagogical improvements – is it just strictly about teaching and learning principles (i.e. cognition, transfer etc) or is it broader – is the act of being a learner/teacher a part of this (and thus the “job” of being these people which includes a broader suite of tools) (me)
What if we can show how learning design/UX principles lead to better written papers by academics? – more value to them (secondary benefits) (me)
“how much extra resource is required to make really good use of technology, and where do we expect that resource to come from?” (Andrew Dixon)
Where will I put external factors like the TEF / NSS into my research? Is it still part of the organisation/institution? Because there are factors outside the institution like this that need to be considered – govt initiatives / laws / ???
Are MOOCs for recruitment? Marketing? (MOOCeting?)
“How do we demonstrate what we do will position the organisation more effectively? How do we make sure we stay in the conversation and not be relegated to simply providing services aligned with other people’s strategies” (arguably the latter is part of our job)
“How do we embed technology and innovative pedagogical practices within the strategic plans and processes at our institutions?” (Peter Bryant)
Further research
Psychology of academia and relationships between academic and professional staff. (Executive tends to come from academia)
“A useful way to categorise IT is according to benefits realisation. For each service offered, a benefits map should articulate why we are providing the service and how it benefits the university.” (See https://en.wikipedia.org/wiki/Benefits_realisation_management ) (Andrew Dixon)
Leadership and getting things done / implementing change, organisational change
How is organisational (particularly university) culture defined, formed and shaped?
Actor-network theory
Design research
Some ideas this generated for me
Instead of tech tool based workshops – or in addition at least – perhaps some learning theme based seminars/debates (with mini-presentations). Assessment / Deeper learning / Activities / Reflection
Innovation – can be an off-putting / scary term for academics with little faith in their own skills but it’s the buzzword of the day for leadership. How can we address this conflict? How can we even define innovation within the college?
What if we bring academics into a teaching and learning / Ed tech/design support team?
Telling the story of what we need by describing what it looks like and how students/academics use it in scenario / case study format offers a more engaging narrative
What is the role of professional bodies (E.g. unions like the NTEU) in these discussions?
Are well-off, “prestigious” universities the best places to try to innovate? Is there less of a driving urge, no pressing threat to survival? Perhaps this isn’t the best way to frame it – a better question to ask might be: if we’re so great, what should other universities be learning from us to improve their own practices? (And then, would we want to share that knowledge with our competitors?)
“I was thinking about the power that could lie behind a social bookmarking tool when doing a dissertation, not only to be able to store and clasify a resource but also to share it with a group of likeminded researcher and also to see what other have found about the same topic.” (Caroline Kuhn) – kind of like sharing annotated bibliographies?
Bigger push for constructive alignment
I need to talk more about teaching and learning concepts in the college to be seen as the person that knows about it
In conclusion
I’d really like to thank the organisers of the Digital is not the future Hack for their efforts in bringing this all together and all of the people that participated and shared so many wonderful and varied perspectives and ideas. Conversation is still happening over there from what I can see and it’s well worth taking a look.
I mentioned a couple of weeks ago that I’ve embarked on some sort of ramshackle process of evaluating what we’re doing in terms of Ed Tech and design with some of my fellow Ed Techs and Designers in the colleges and central team. This is with a view to finding ways to work together better, build relationships and ultimately make some recommendations to the higher-ups that may or may not be acted upon. (At the very least I’m optimistic that people on the ground will communicate and collaborate better and with a renewed clarity.)
In some ways, we’re racing the clock, as our VC has started his consultation tour as the first part of his review/reform/something process. Best case scenario is that we’ll be able to feed our findings/opinions/fervent wishes into his process and change might be kickstarted. Worst case is – well, let’s not think about that. Something with dragons and ice zombies or something.
So we had our second discussion today and were able to successfully identify six core themes with some attendant issues and questions to press on with for more in-depth investigation. The goal is to try to come up with something tangible for each theme every two weeks, through a combination of online and in-person discussions. This will ideally give us a greater sense of what we’re about (I hate to use the term mission statement but perhaps something less ethereal), which will inform some revised terms of reference for the lower-level parts of the ed tech governance structure. (This is where I’m expecting the greatest resistance, but who knows.)
These are the themes that we have arrived at. (If you feel that we’ve missed something or over-estimated the importance of something, please feel free to leave a comment.)
Is it eLearning, blended learning, technology enhanced learning (and teaching), online learning or just plain old teaching and learning? Why? Are we about education innovation or education support? (It’s not simply about the language either – this can be quite political).
What are we ultimately trying to achieve for the learners, the academics, the university, etc?
Is there a set of key principles that guides us?
How do we define, encourage and support best practices in teaching and learning? (And in other areas?) How can we best serve teachers and learners? Is it strictly about the cognitive, pedagogical aspects of teaching and learning or do other factors need to come in to the training and advice that we offer including accessibility, equity and pastoral care?
What can we do as humble (yet expert) professional support staff to be listened to? How do we take a more substantive role in the decision making processes that directly affect us?
What can we do between our various colleges and teams to work together more effectively and share our skills and knowledge? How can we support wider dissemination of ideas in the university and in the wider education design/technology community?
What can we do to build better relationships between the colleges and central teams and to increase understanding of each other’s needs and obligations? Can we simplify the decision making process to streamline approvals for changes and new initiatives?
How can we make the elements of the existing governance structure work more effectively together and better utilise the resources available?
These are some sensitively phrased questions and ideas to get started – this process is going to be complicated by virtue of the range of different stakeholders with competing priorities and differences of opinion will be inevitable. My hope is that by keeping focus on the mutual benefits – and sticking to the discussion topics – progress will be made.
This is the padlet in progress – you should be able to add things but not change them.
(I should mention that some of the themes were inspired/expanded by the discussions in the “Digital is not the future” hack – particularly the question of expertise)
The Office for Learning and Teaching (OLT) is – now was – an Australian government body intended to support best practice in enhancing teaching and learning in the Higher Education sector.
It funded a number of research projects, which in 2013 included “What works and why? Understanding successful technology enhanced learning within institutional contexts” – driven by Monash University in Victoria and Griffith University in Queensland and led by Neil Selwyn.
Rather than focussing on the “state of the art”, the project focuses on the “state of the actual” – the current implementations of TELT practices in universities that are having some measure of success. It might not be the most inspiring list (more on that shortly) but it is valuable to have a snapshot of where we are, what educators and students value and the key issues that the executive face (or think they face) in pursuing further innovation.
The report identifies 13 conditions for successful use of Tech Enhanced Learning in the institution and with teachers and students. (Strictly speaking, they call it technology enabled learning, which grates with me far more than I might’ve expected – yes, it’s ultimately semantics but for me the implication is that the learning couldn’t occur without the tech and that seems untrue. So because this is my blog, I’m going to take the liberty of using enhanced)
The authors took a measured approach to the research, beginning with a large scale survey of teacher and student attitudes toward TEL which offered a set of data that informed questions in a number of focus groups. This then helped to identify a set of 10 instances of “promising practice” at the two participating universities that were explored in case studies. The final phase involved interviewing senior management at the 39 Australian universities to get feedback on the practicality of implementing/realising the conditions of success.
So far, so good. The authors make the point that the majority of research in the TELT field relates to more cutting edge uses in relatively specific cohorts and while this can be enlightening and exciting, it can overlook the practical realities of implementing these at scale within a university learning ecosystem. As a learning technologist, this is where I live.
What did they discover?
The most prominent ways in which digital technologies were perceived as ‘working’ for students related to the logistics of university study. These practices and activities included:
Organising schedules and fulfilling course requirements;
Time management and time-saving; and
Being able to engage with university studies on a ‘remote’ and/or mobile basis
One of the most prominent practices directly related to learning was using digital technologies to ‘research information’; ‘Reviewing, replaying and revising’ digital learning content (most notably accessing lecture materials and recordings) was also reported at relatively high levels.
Why technologies ‘work’ – staff perspectives
The most frequently nominated ways in which staff perceived digital technologies to be ‘working’ related to the logistics of university teaching and learning. These included being able to co-ordinate students, resources and interactions in one centralised place. This reveals a frequently encountered ‘reality’ of digital technologies in this project: technologies are currently perceived by staff and students as having a large, if not primary, role in enabling the act of being a teacher or student, rather than enabling the learning.
Nevertheless, the staff survey did demonstrate that technologies were valued as a way to support learning, including delivering instructional content and information to students in accessible and differentiated forms. This was seen to support ‘visual’ learning and to benefit students who wanted to access content at different times and/or in different places.
So in broad terms, I’d suggest that technology in higher ed is seen pretty much exactly the same way we treat most technology – it doesn’t change our lives so much as help us to live them.
To extrapolate from that then, when we do want to implement new tools and ways of learning and teaching with technology, it is vital to make it clear to students and teachers exactly how they will benefit from it as part of the process of getting them on board. We can mandate the use of tools and people will grumblingly accept it but it is only when they value it that they will use it willingly and look for ways to improve their activities (and the tool itself).
The next phase of the research, looking at identified examples of ‘promising practice’ to develop the “conditions for success”, is a logical progression, but looking at some of the practices used, it feels like the project was aiming too low. (And I appreciate that it is a low-hanging-fruit / quick-wins kind of project, and people in my sphere are by our nature more excited by the next big thing, but all the same, if we’re going to be satisfied with the bare minimum, will that stunt our growth?) In fairness, the report explicitly says “the cases were not chosen according to the most ‘interesting’, ‘innovative’ or ‘cutting-edge’ examples of technology use, but rather were chosen to demonstrate sustainable examples of TEL”
Some of the practices identified are things that I’ve gradually been pushing in my own university so naturally I think they’re top shelf innovations 🙂 – things like live polling in lectures, flipping the classroom, 3D printing and virtual simulations. Others however included the use of online forums, providing videos as supplementary material and using “online learning tools” – aka an LMS. For the final three, I’m not sure why they aren’t just considered standard parts of teaching and learning rather than something promising. (But really, it’s a small quibble I guess and I’ll move on)
The third stage asked senior management to rank the usefulness of the conditions of success that were identified from the case studies and to comment on how soon their universities would likely be in a position to demonstrate them. The authors seemed surprised by some of the responses – notably the resistance to the idea of taking “permissive approaches to configuring systems and choosing software”. As someone “on the ground” who bumps into these kinds of questions on a daily basis, this is where it became clear to me that the researchers have still been looking at this issue from a distance and with a slightly more theoretical mindset. There is no clear indication anywhere in this paper that they discussed this research with professional staff (i.e. education designers or learning technologists) who are often at the nexus of all of these kinds of issues. Trying to filter out my ‘professional hurt feelings’, it still seems a lot like a missed opportunity.
No, wait, I did just notice in the recommendations that “central university agencies” could take more responsibility for encouraging a more positive culture related to TEL among teachers.
Yup.
Moving on, I scribbled a bunch of random notes and thoughts over this report as I read it (active reading) and I might just share these in no particular order.
Educators is a good word. (I’m currently struggling with teachers vs lecturers vs academics)
How do we define how technologies are being used “successfully and effectively”?
Ed Tech largely being used to enrich rather than change
Condition of success 7: “the uses of digital technology fit with familiar ways of teaching” – scaffolded teaching
Condition of success 10: “educators create digital content fit for different modes of consumption” – great but it’s still an extra workload and skill set
dominant institutional concerns include “satisfying a perceived need for innovation that precludes more obvious or familiar ways of engaging in TEL” – no idea how we get around the need for ‘visionaries’ at the top of the tree to have big announceables that seemingly come from nowhere. Give me a good listener any day.
for learners to succeed with ed tech they need better digital skills (anyone who mentions digital natives automatically loses 10 points) – how do we embed this? What are the rates of voluntary uptake of existing study skills training?
We need to normalise new practices but innovators/early adopters should still be rewarded and recognised
it’s funny how quickly ed tech papers date – excitement about podcasts (which still have a place) makes this feel ancient
How can we best sell new practices and ideas to academics and executive? Showcases or 5 min, magazine show style video clips (like Beyond 2000 – oh I’m so old)
Stats about which tools students find useful – the data is frustratingly simple. The highest rating tool is “supplementing lectures, tutorials, practicals and labs” with “additional resources” at 42% (so do 58% not find it useful? – hardly a ringing endorsement)
Tools that students were polled about were all online tools – except e-books. Where do offline tools sit?
Why are students so much more comfortable using Facebook for communication and collaboration than the LMS?
60% of students still using shared/provided computers over BYOD. (Be interesting to see what the figure is now)
Promising practice – “Illustrating the problem: digital annotation tools in large classes” – vs writing on the board?
conditions for success don’t acknowledge policy or legal compliance issues (privacy, IP and copyright)
conditions for success assume students are digitally literate
there’s nothing in here about training
unis are ok with failure in research but not teaching
calling practices “innovations” signals them as non-standard or exceptions – good point. Easier to ignore them
nothing in here about whether technology is fit for purpose
Ultimately I got a lot out of this report and will use it to spark further discussion in my own work. I think there are definitely gaps and this is great for me because it offers some direction for my own research – most particularly in the role of educational support staff and factors beyond the institution/educator/student that come into play.
Update: 18/4/16 – Dr Michael Henderson of Monash got in touch to thank me for the in-depth look at the report and to also clarify that “we did indeed interview and survey teaching staff and professional staff, including faculty based and central educational / instructional designers”
Which kind of makes sense in a study of this scale – it’s certainly easy enough to pare back elements when you’re trying to create a compelling narrative in a final report, I’m sure.