Halfway into the slightly manic process of reorganising my Scrivener notes for my PhD thesis proposal, I wondered if I wasn’t using it to avoid the actual work. I was painstakingly working through a host of references (some with annotations – mainly from the abstract, I suspect) that I had added early on from my initial proposal and largely just dumped into my broad categories without much further thought. I haven’t since come back to them or considered how relevant or useful they are, or what I plan to do with them.
My larger goal in this exercise was to try to find some kind of structure for my thinking – I’m increasingly aware that the Higher Education ecosystem is intricate and complex and most if not all of the moving parts impact on each other far more than the current literature seems to acknowledge. I’m still not sure what the best way to represent this is, but I’m hoping that creating some order will help me to place the myriad random thoughts and questions that I’ve come up with so far in something more manageable.
Which seems to be a point that I often reach in large projects (none as large as this, admittedly) before losing interest and moving on to something new. As I thought about this, I worried that I was doing this exact thing here.
Fortunately, I wasn’t. I eliminated a number of papers that seemed relevant on the surface but really weren’t, I found a few more that I’d completely forgotten, which I’ve put into the high-priority reading list, and I think that now I have a place for everything and everything in its place. There’s a section for the actual writing (broken up by broad topics), a section for notes (broken up by broad topics which I’d say will get more and more subtopics), a section for quotes and references (with sub-sections for individual papers) and some general miscellaneous pages for ‘other’ stuff. What works best about this for me is the way that it lets me quickly jump around the proposal when something useful needs to be jotted down, and the side-by-side structure of Scrivener lets me easily copy-paste chunks. It looks a bit like this.
The other part of this process that was useful was finding a brief paper that has the same focus as my research, which gave me some assurance that I’m on track with my ideas as well as a chance to check whether I’m missing anything vital. (It turns out that I think they are missing a few things, which is obviously good for me. I’ll post about this one shortly.)
I was recently invited by @UQKelly – Kelly Matthews of the University of Queensland – to attend the National Students as Partners Roundtable on a glorious Brisbane Spring day. (I am almost as grateful for the chance to escape a particularly bleak Canberra day as for the exposure to some interesting ideas and the wonderful people working in this space.) This isn’t an area that I’ve had much to do with, and I was invited to bring a critical friend/outsider perspective to proceedings as much as anything.
Students as Partners (which I’ll shorten to SaP because I’ll be saying it a lot) more than anything represents a philosophical shift in our approach to Higher Education; it doesn’t seem like too great a stretch to suggest that it almost has political undertones. These aren’t overt or necessarily conventional Left vs Right politics but more of a push-back against a consumerist approach to education that sees students as passive recipients, in favour of the development of a wider community of scholarship that sees students as active co-constructors of their learning.
It involves having genuine input from students in a range of aspects of university life, from assessment design to course and programme design and even aspects of university governance and policy. SaP is described as more of a process than a product – which is probably the first place that it bumps up against the more managerialist model. How do you attach a KPI to SaP engagement? What are the measurable outcomes in a change of culture?
The event itself walked the walk. Attendance was an even mixture of professional education advisor staff and academics, and I’d say around 40% students. Students also featured prominently as speakers, though academics did still tend to take more of the time, as they had perhaps more to say in terms of underlying theory and describing implementations. I’m not positive, but I think that this event was academic-initiated and I’m curious what a student-initiated and student-owned event might have looked like. None of this is to downplay the valuable contributions of the students; it’s more of an observation about the unavoidable power dynamics in a situation such as this.
From what I can see, while these projects are about breaking down barriers, they often tend to be initiated by academics – presumably because students might struggle to get traction in implementing change of this kind without their support, and students might not feel that they have the right to ask. Clearly many students feel comfortable raising complaints with their lecturers about specific issues in their courses, but suggesting a formalised process for change and enhancements is a much bigger step to take.
The benefits of an SaP approach are many and varied. It can help students to better understand what they are doing and what they should be doing in Higher Education. It can give them new insights into how H.E. works (be careful what you wish for) and help to humanise both the institution and the teachers. SaP offers contribution over participation and can lead to greater engagement and the design of better assessment. After all, students will generally have more of a whole-of-program/degree perspective than most of their lecturers and a greater understanding of what they want to get out of their studies. (The question of whether this is the same as what they need to get out of their studies is not one to ignore, however, and I’ll come back to this). For the students who are less engaged in this process, at the very least the extra time spent discussing their assessments will help them to understand those assessments better. A final benefit for students of actively participating in the SaP process is the extra skills that they might develop. Mick Healey developed a map of the different facets of teaching and learning that SaP enables students to engage with. A suggestion was made that this could be mapped to more tangible general workplace skills, which I think has some merit.
As with all things, there are also risks in SaP that should be considered. How do we know that the students who participate in the process are representative? Several of the students present came from student politics, which doesn’t diminish their interest or contribution, but I’d say that it’s reasonable to note that they are probably more self-motivated, and driven by a different range of factors, than some of their peers. When advocating for a particular approach in the classroom or assessment, will they unconsciously lean towards something that works best for them? (Which everyone does at some level in life). Will their expectations or timelines be practical? Another big question is what happens when students engage in the process but then have their contributions rejected – might this contribute to disillusionment and disengagement? (Presumably not if the process is managed well, but people are complicated and there are many sensitivities in Higher Ed.)
To return to my earlier point, while students might know what they want in teaching and learning, is it always what they need? Higher Ed can be a significant change from secondary education, with new freedoms and responsibility and new approaches to scholarship. Many students (and some academics) aren’t trained in pedagogy and don’t always know why some teaching approaches are valuable or what options are on the table. From a teaching perspective, questions of resistance from the university and extra time and effort being spent for unknown and unknowable outcomes should also be considered. None of these issues are insurmountable but need to be considered in planning to implement this approach.
Implementation was perhaps my biggest question when I came along to the Roundtable. How does this work in practice and what are the pitfalls to look out for? Fortunately, there was a lot of experience in the room and some rich discussion about a range of projects that have been run at UQ, UTS, Deakin, UoW and other universities. At UoW, all education development grants must now include a SaP component. In terms of getting started, it can be worth looking at the practices that are already in place and what the next phase might be. Most if not all universities have some form of student evaluation survey. (This survey is, interestingly, an important part of the student/teacher power dynamic, with teachers giving students impactful marks on assessments and students reciprocating with course evaluations, which are taken very seriously by universities, particularly when they are bad).
A range of suggestions and observations for SaP implementations were offered, including:
Trust is vital, keep your promises
Different attitudes towards students as emerging professionals exist in different disciplines – implementing SaP in Law was challenging because content is more prescribed
Try to avoid discussing SaP in ‘teacher-speak’ too much – use accessible, jargon-free language
Uni policies will mean that some things are non-negotiable
Starting a discussion by focusing on what is working well and why is a good way to build trust that makes discussion of problems easier
Ask the question of your students – what are you doing to maximise your learning?
These images showcase a few more tips and a process for negotiated assessment.
There was a lot of energy and goodwill in the room as we discussed ideas and issues with SaP. The room was set up with a dozen large round tables holding 8-10 people each, and there were frequent breaks for table discussions during the morning and then a series of ‘world cafe’ style discussions at tables in the afternoon. On a few occasions I was mindful that some teachers at the tables got slightly carried away in discussing what students want when there were actual, real students sitting relatively quietly at the same table, so I did what I could to ask the students themselves to share their thoughts on the matters. On the whole I felt a small degree of scepticism from some of the students present about the reality vs the ideology of the movement. Catching a taxi to the airport with a group of students afterwards was enlightening – they were in favour of SaP overall but wondered how supportive university executives truly were and how far they would let it go. One quote that stayed with me during the day as Eimear Enright shared her experiences was a cheeky comment she’d had from one of her students – “Miss, what are you going to be doing while we’re doing your job?”
On the whole, I think that a Students as Partners approach to education has a lot to offer and it certainly aligns with my own views on transparency and inclusion in Higher Ed. I think there are still quite a few questions to be answered in terms of whether it is adequately representative and how much weighting the views of students (who are not trained either in the discipline or in education) should have. Clearly a reasonable amount but students study because they don’t know things and, particularly with undergraduate students, they don’t necessarily want to know what’s behind the curtain. The only way to resolve these questions is by putting things into practice and the work that is being done in this space is being done particularly well.
For a few extra resources, you might find these interesting.
Sometimes posting a research progress update can be like jumping on the scales after a weekend of eating cake – it’s important to do to maintain accountability but you know it’s not going to be pretty. This is one of those times.
As you can tell by my recent posting history, it’s been a while since I read and reflected upon anything. Since my last update however, I have had a PhD progress review where the panel was satisfied with how I’m going and took me off probation and I also attended the ePortforum conference in Sydney, two days of talking and learning about what is being done in Higher Ed. with ePortfolios.
I also read a chapter of a book my supervisor (hi Peter) wrote about teacher attitudes towards education technology which got me thinking much more about methodology than I have been to date. There’s a strangeness to reading (and particularly writing) about one’s supervisor’s writing – a lot of different conflicting feels. Am I obliged/expected to fall into line with his ideas and/or particular areas of interest? (I don’t think so – he’s been remarkably chilled about what I’m doing. Offering thoughts and suggestions, of course but I’ve never felt pressured). Is it ok if I disagree with something that he’s said in his writing? (Again, I think that if I was able to present a solid argument, it would be fine. That said, I’ve not come across anything yet that hasn’t been eye-opening, as you would hope for from a mentor/supervisor). If I read too much of his work, does it get weird or obsequious?
On the one (rational) hand, you approach a supervisor because you think that their interests/methods will inform yours and presumably align (or vice versa), so why wouldn’t you? But on the other (emotional) hand, have I had some kind of need to explore the other literature first, to come to some of my own conclusions before being shaped too much by his take on things? (In the same way that a filmmaker on a remake might go back to the original novel but not watch the first film that came from it?) Even Peter said that I didn’t necessarily need to read this particular book, as it’s from 2002 and attitudes to ed tech have no doubt shifted since then. He suggested instead that I look at who has cited it.
I’m really glad that I did read it though because, as I mentioned, the methodological ideas gave me a lot to think about – largely in getting tutors to describe their grading process almost as a stream of consciousness in real time, which was also recorded so that they could watch the recording and add a layer of reflection later. This may well be a common methodology, but it’s not one that I’ve come across in the reading that I’ve done to date. I’ll post something about this chapter soon anyway.
I’ve also been working away on an application to upgrade myself from Associate Fellow of the HEA to Senior Fellow. This requires a lot of reflective writing (around 7000+ words) and has been useful in thinking in greater depth about my own professional practices and ‘learning journey’. (I always feel a little bit hippy using that expression but I haven’t come across a better one). So this application has taken up a decent chunk of my time as well.
I have also – because clearly I have a lot of free time on my hands – been slowly nudging forward the formation of a Special Interest Group through HERDSA (but not solely for HERDSA members, I think) that is focused on Education Advisors. (a.k.a. Education Support Staff – academic developers, ed designers, learning technologists etc.) We had a great lunchtime conversation (vent?) about some of the issues that we face, which aligned particularly with many of the papers that I have posted about here in the last couple of months. I suspect that one of the trickiest parts will be explaining to teaching academics that this isn’t a group for them. I guess this is one of the things that we’ll need to pin down in the formation of it. It’s far from a new idea – there are a range of city and state based parallels in varying states of activity – but having a national (transnational, to include NZ) body isn’t something I’ve seen before. The funny thing is that while this is important to me, I felt like I had already moved on from some of the issues/ideas that came up in the conversation yesterday, in pivoting to research academic staff now and their issues and concerns. But I’m pretty sure I can walk and chew gum at the same time.
At this stage of looking at the matter of professional staff and academic staff in Higher Education, I feel that I’m somewhat flogging a dead horse and everything that needs to be said, has been said. So why am I still looking at this paper? Initially I was concerned that it grated on me because it doesn’t fit with my current narrative that there are significant cultural factors in universities that make it unnecessarily difficult for professional staff – particularly those in education support roles – to be heard when it comes to discussing teaching and learning.
If this were the case, I’d clearly not be doing my best work as a scholar – open to new information and willing to reconsider my world view in the face of it. Having looked over the paper a few times now though, I have to say that I think it’s just not that great a piece of research. A number of assertions are made that simply aren’t supported by the evidence presented, and some of the reasoning seems specious. Events from four years prior to the publication date are referred to in the future tense, but there is no discussion of whether they happened or what the consequences were.
Assuming that this is poor research – or perhaps poor analysis – it makes me happy that I’ve reached a point where I can identify bad work but also a little concerned that I’m wrong or I’m missing something because this was still published in a peer reviewed journal that I’ve found a lot of good work in previously. (Then again, I assume that most journals have their own favoured perspectives and maybe this was well aligned with it). I searched in vain to find other writing by the author but she appears to be a ghost, with no publications or notable online presence since the paper came out.
In a nutshell: based on an anonymous online survey of 29% of all staff – academic and professional – at her institution, which included questions about demographics, perceptions of the nature of their roles, the ‘divide’ and the value of different types of staff in relation to strategic priorities, the author concludes that there is minimal dissension between academic and “allied” staff, and that most of what little there is, is felt by the allied staff.
Now it’s entirely reasonable that this may well be the case, but there are a few elements of the paper that seem to undermine the author’s argument. Wohlmuther asks survey participants about their perceptions of a divide but doesn’t dig directly into attitudes towards other kinds of staff, which McInnis (1998), Dobson (2000) and Szekeres (2004) all identified as central factors. She looks at the perceptions of contributions of academic and allied staff members to the strategic goals of the organisation, which obliquely explores their ‘value’ within the organisation, but it seems limited. Given the ambiguous value of some higher level strategic goals (Winslett, 2016), this would seem to tell an incomplete story.
The greatest weakness of the paper to my mind is that ‘allied’ and ‘academic’ work roles are unclear.
Survey respondents were asked what percentage of their time they spent on allied work and what percentage of their time they should spend on allied work. The term ‘allied work’ was not defined. It was left to the respondent to interpret what they meant by allied work (p.330)
With no further examination of the responses via focus groups or interviews, this alone (to me anyway) seems to make the findings murky.
She found that only 29% of staff (all staff? that is unclear) felt that there was “good understanding and respect for the significance of each others roles and all staff work well together” (p.331) across the institute, yet she doesn’t take this to be an indicator of division.
Looking over the paper again, these are probably my main quibbles, and perhaps they aren’t so dramatic. This tells me that I still have a way to go before I can truly ‘read’ a paper properly, but I’m on the way.
This is just a quick one because I’m getting on a roll and am going to try to skim read 6 papers this weekend and properly read one that I’ve already started. I’ve been quite conscious of the fact that while I’m doing some good (it seems) deep reading, it’s taking a fair while and looking at the bibliographies even in journal papers makes me mindful of the fact that coming up with a useful (and read) list of 50+ papers requires a little getting the lead out.
Happily, I’ve found a contemporary paper (2016) by Greg Winslett of UNE that I’m hopeful will give me a recent take on the issues addressed in the three papers I looked at from the turn of the century. (There are also a host of recent citations that seem pretty pertinent)
Winslett’s paper – still from the Journal of Higher Education Policy and Management (I’m worried about drawing too often from the same journal well, but what can you do) – is about “The struggle to satisfy need: exploring the institutional cues for teaching support staff”.
I like two things about this already: the term ‘teaching support staff’ seems more suitable than the ‘education support staff’ that I’ve been favouring (although I am sad to lose the ESP acronym), and the fact that this is about how TSSs can take guidance from university strategies. (We’re in the middle of a strategic revamp at present, so there’s much to think about.)
I did also quite like the fact that a paper co-written by my supervisors Peter and Lina was cited. There was a funny moment of “oh, I know them”
I’m also mindful of the fact that I’m leaning very heavily on papers about and writers from the Australian Higher Education sector. I think I’m ok with this for now but will probably need to consider this in the way that I shape my research questions.
My cool uber-boss, our Associate Dean Education (hi Bronwen) mentioned that I’ve been tweeting a lot about the professional/academic staff divide lately. I felt compelled to clarify that I wasn’t trying to make any particular point or that I have any issue, it’s just where my research is sitting at the moment – and I guess I’m noticing more when other people are tweeting about it.
(I’m scheduled to move on to Unis as Organisations next Friday – I’m not 100% clear what I mean by this but I think it includes education ecosystems among other things). My way of thinking is also such that I’m more interested in the search for solutions than in dwelling on any possible issues in terms of any divide or tensions between academics and professionals. The way I see things, we are where we are and that part can’t be changed, but by trying to understand it, we can see which bits are working and which can be improved.
I suspect this isn’t going to be the last time that thinking critically about academia in an academic way raises eyebrows.
Things are definitely feeling better in researchland – this weekend I’ve read 3 papers and 2 blog posts, have blogged about the post and have another post brewing that will capture key ideas from the three papers.
I think my choice of papers has helped me here: I’ve been looking at the divide between professional and academic staff in higher education, and this has been a comparatively theory-lite experience, with far less epistemology and pedagogy to unpack than normal.
The papers are all at least a decade (and one is closer to two) old and have left me asking regularly – ok well that’s pretty interesting but where are we now? Have your promised or hoped for changes eventuated or has the academe stubbornly dug in?
Perhaps it was the papers that I chose/found, but the role of Education Support People/Professionals is barely even acknowledged, and this certainly gives me something to move on with. Developing a broader understanding of attitudes and the impact of external changes (largely governmental in these papers, in terms of higher expectations for accountability and professionalism) has definitely given me a greater feel for the environment and issues.
I have noted that all three papers came from the same journal, the Journal of Higher Education Policy and Management, which seems like a logical place for discussions of the operational side of universities, and I’ll be interested to see whether the question of the role and value of professional staff is considered in journals relating to other aspects of Higher Ed.
I have a meeting booked with my supervisor on Thursday and I’m feeling like I might even have something of substance to discuss, seemingly for the first time in a while. This isn’t to say that the other meetings weren’t productive but I feel much more like I’ve been doing proper scholarship this time around.
Sometimes you need to spend hours poring over a list of 20,000+ academic journals looking for those related to education in business and economics. I’d advise against it.
Here are the ones that I found, so you don’t have to.
I’m not a discipline specialist, so I can’t speak to quality but I’ve included their ratings which will hopefully help.
It was always my intention that researching in the area that I work in would help me to shape my professional practice (and it is) but I’ve been surprised lately at how much things are flowing in the other direction. I’ve been thinking a lot lately about what is needed to make an educational project successful and how we know that learners have actually benefitted.
This is partially coming from the big picture work that I’m doing with my peers at the university looking at what we’re doing and why, and partially from my own college, which has recently launched a Teaching and Learning Eminence Committee/project to look into what we’re doing with teaching and learning. I wasn’t initially invited onto the committee (it’s all academics), which speaks to some of the ideas that have been emerging in some of my recent posts (the academic/professional divide) as well as the fact that I need to work on raising the profile of my team* and understanding of our* capacity and activities in the college.
Anyway, while trawling through the tweetstream of the recent (and alas final) OLT – Office of Learning and Teaching – conference at #OLTConf2016, I came across a couple of guides published recently by Ako Aotearoa, the New Zealand National Centre for Tertiary Teaching Excellence, that fit the bill perfectly.
One focusses on running effective projects in teaching and learning in tertiary education. It’s kind of project-managementy, which isn’t always the most exciting area for me, but it offers a comprehensive and particularly thoughtful overview of what we need to do to take an idea (which should always be driven by enhancing learning) through three key phases identified by Fullan (2007 – as cited in Akelma et al, 2011) in the process of driving educational change – initiation, implementation and institutionalisation. The guide – Creating sustainable change to improve outcomes for tertiary learners – is freely available on the Ako Aotearoa website, which is nice.
I took pages and pages of notes and my mind wandered off into other thoughts about immediate and longer term things to do at work and in my research, but the key themes running through the guide were treating change as a process rather than an event, being realistic, working collectively, being honest and communicating well. It breaks down each phase into a number of steps (informed by case studies) and prompts the reader with many pertinent questions to ask of themselves and the project along the way.
The focus of the guide is very much on innovation and change – I’m still thinking about what we do with the practices that are currently working well and how we can integrate the new with the old.
The second guide – A Tertiary practitioner’s guide to collecting evidence of learner benefit – drills down into useful research methodologies for ensuring that our projects and teaching practices are actually serving the learners’ needs. Again, these are informed by helpful case studies and showcase the many places and ways that we can collect data from and about our students throughout the teaching period and beyond.
It did make me wonder whether the research mindset of academics might conventionally be drawn from their discipline. Coming from an organisation with an education and social science orientation, one might expect an emphasis on the qualitative (and there are a lot of surveys suggested – which I wonder about as I have a feeling that students might be a little over-surveyed already) but the guide actually encourages a mixture of methodologies and makes a number of suggestions for merging data, as well as deciding how much is enough.
Definitely some great work from our colleagues across the ditch and well worth checking out.
I shared some thoughts and summarised some of the discussions tied to the issues we face in supporting and driving institutional change, working with organisational culture and our role as professional staff experts in education design and technology.
There’s still much to talk about. Technology and what we need it to do, practical solutions both in place and under consideration / on the wishlist, further questions and a few stray ideas that were generated along the way.
Technology:
Unsurprisingly, technology was a significant part of our conversation about what we can do in the education support/design/tech realm to help shape the future of our institutions. The core ideas that came up included what we are using it for and how we sell and instill confidence in it in our clients – teachers, students and the executive.
The ubiquity and variety of educational technologies means that they can be employed in all areas of the teaching and learning experience. It’s not just being able to watch a recording of the lecture you missed or to take a formative online quiz; it’s signing up for a course, finding your way to class, joining a Spanish conversation group, checking for plagiarism, sharing notes, keeping an eye on at-risk students and so much more.
It’s a fine distinction but Ed Tech is bigger than just “teaching and learning” – it’s also about supporting the job of being a teacher or a learner. I pointed out that the recent “What works and why?” report from the OLT here in Australia gives a strong indication that the tools most highly valued by students are the ones that they can use to organise their studies.
Amber Thomas highlighted that “…better pedagogy isn’t the only quality driver. Students expect convenience and flexibility from their courses” and went on to state that “We need to use digital approaches to support extra-curricular opportunities and richer personal tracking. Our “TEL” tools can enable faster feedback loops and personalised notifications”
Even this is just the tip of the iceberg – it’s not just tools for replicating or improving analog practices – the technology that we support and the work we do offers opportunities for new practices. In some ways this links back closely to the other themes that have emerged – how we can shape the culture of the organisation and how we ensure that we are part of the conversation. A shift in pedagogical approaches and philosophies is a much larger thing than determining the best LMS to use. (But at its best, a shift to a new tool can be a great foot in the door to discussing new pedagogical approaches.)
“It is reimagining the pedagogy and understanding the ‘new’ possibilities digital technologies offer to the learning experience where the core issue is” (Caroline Kuhn)
Lesley Gourlay made a compelling argument for us to not throw out the baby with the bathwater when it comes to technology by automatically assuming that tech is good and “analogue” practices are bad. (I’d like to assume that any decent Ed Designer/Tech knows this but it bears repeating and I’m sure we’ve all encountered “thought leaders” with this take on things).
“we can find ourselves collapsing into a form of ‘digital dualism’ which assumes a clear binary between digital and analogue / print-based practices (?)…I would argue there are two problems with this. First, that it suggests educational and social practice can be unproblematically categorised as one or the other of these, where from a sociomaterial perspective I would contend that the material / embodied, the print-based / verbal and the digital are in constant and complex interplay. Secondly, there perhaps is a related risk of falling into a ‘digital = student-centred, inherently better for all purposes’, versus ‘non-digital = retrograde, teacher-centred, indicative of resistance, in need of remediation’.” (Lesley Gourlay)
Another very common theme in the technology realm was the absolute importance of having reliable technology (as well as the right technology).
“Make technology not failing* a priority. All technology fails sometime, but it fails too often in HE institutions. Cash registers in supermarkets almost never fail, because that would be way too much of a risk.” (Sonja Grussendorf)
When it comes to how technology is selected for the institution, a number of people picked up on the tension between having it selected centrally vs by lecturers.
“Decentralize – allow staff to make their own technology (software and hardware) choices” (Peter Bryant)
Infrastructure is also important in supporting technologies (Alex Chapman)
Personally I think that there must be a happy medium. There are a lot of practical reasons that major tools and systems need to be selected, implemented, managed and supported centrally – integration with other systems, economies of scale, security, user experience, accessibility etc. At the same time we also have to ensure that we are best meeting the needs of students and academics in a host of different disciplines, and are able to support innovation and agility. (When it comes to the selection of any tool I think that there still needs to be a process in place to ensure that the tool meets the needs identified – including those of various institutional stakeholders – and can be implemented and supported properly.)
Finally, Andrew Dixon framed his VC elevator pitch in terms of a list of clear goals describing the student experience with technology which I found to be an effective way of crafting a compelling narrative (or set of narratives) for a busy VC. Here are the first few:
They will never lose wifi signal on campus – their wifi will roam seamlessly with them
They will have digital access to lecture notes before the lectures, so that they can annotate them during the lecture.
They will also write down the time at which difficult sub-topics are explained in the lecture so that they can listen again to the captured lecture and compare it with their notes. (Andrew Dixon)
Some practical solutions
Scattered liberally amongst the discussions were descriptions of practical measures that people and institutions are putting in place. I’ll largely let what people said stand on its own – in some cases I’ve added my thoughts in italics afterwards. (Some of the solutions I think were a little more tongue in cheek – part of the fun of the discussion – but I’ll leave it to you to determine which)
Culture / organisation
Our legal team is developing a risk matrix for IT/compliance issues (me)
(We should identify our work) “not just as teaching enhancement but as core digital service delivery” (Amber Thomas)
“we should pitch ‘exposure therapy’ – come up with a whole programme that immerses teaching staff in educational technology, deny them the choice of “I want to do it the old fashioned way” so that they will realise the potential that technologies can have…” (Sonja Grussendorf)
“Let’s look at recommendations from all “strategy development” consultations, do a map of the recommendations and see which ones always surface and are never tackled properly.” (Sheila MacNeill)
“Could this vision be something like this: a serendipitous hub of local, participatory, and interdisciplinary teaching and learning, a place of on-going, life-long engagement, where teaching and learning is tailored and curated according to the needs of users, local AND global, actual AND virtual, all underscored by data and analytics?” (Rainer Usselmann)
“…build digital spaces to expand our reach and change the physical set up of our learning spaces to empower use of technology…enable more collaborative activities between disciplines” (Silke Lange)
“we need a centralised unit to support the transition and the evolution and persistence of the digital practice – putting the frontliners into forefront of the decision making. This unit requires champions throughout the institutions so that this is truly a peer-led initiative, and a flow of new blood through secondments. A unit that is actively engaging with practitioners and the strategic level of the university” (Peter Bryant)
In terms of metrics – “shift the focus from measuring contact time to more diverse evaluations of student engagement and student experience” (Silke Lange)
“Is there a metric that measures teaching excellence?… Should it be designed in such a way as to minimise gaming? … should we design metrics that are helpful and allow tools to be developed that support teaching quality enhancement?” (David Kernohan) How do we define or measure teaching excellence?
“the other thing that we need to emphasise about learning analytics is that if it produces actionable insights then the point is to act on the insights” (Amber Thomas) – this needs to be built into the plan for collecting and dealing with the data.
Talking about the NSS (National student survey) – “One approach is to build feel-good factor and explain use of NSS to students. Students need to be supported in order to provide qualitative feedback” (David Kernohan) (I’d suggest that feedback from students can be helpful but it needs to be weighted – I’ve seen FB posts from students discussing spite ratings)
“We should use the same metrics that the NSS will use at a more granular levels at the university to allow a more agile intervention to address any issues and learn from best practices. We need to allow flexibility for people to make changes during the year based on previous NSS” (Peter Bryant)
“Institutional structures need to be agile enough to facilitate action in real time on insights gained from data” (Rainer Usselmann) – in real time? What kind of action? What kind of insights? Seems optimistic
“Institutions need at the very least pockets of innovation /labs / discursive skunk works that have licence to fail, where it is safe to fail” (Rainer Usselmann)
“Teachers need more space to innovate their pedagogy and fail in safety” (Silke Lange)
“Is it unfair (or even unethical) to not give students the best possible learning experience that we can?…even if it was a matter of a control group receiving business-as-usual teaching while a test group got the new-and-improved model, aren’t we underserving the control group?” (me)
“I can share two examples from my own experiences
An institution who wanted to shift all their UG programmes from 3 year to 4 year degrees and to deliver an American style degree experience (UniMelb in the mid 2000s)
An institution who wanted to ensure that all degree programmes delivered employability outcomes and graduate attributes at a teaching, learning and assessment level
So those resulted in;
a) curriculum change
b) teaching practice change
c) assessment change
d) marketing change ” (Peter Bryant)
“One practical option that I’m thinking about is adjusting the types of research that academics can be permitted to do in their career path to include research into their own teaching practices. Action research.” (Me) I flagged this with our Associate Dean Education yesterday and was very happy to hear that she is currently working on a paper for an education focussed journal in her discipline and sees great value in supporting this activity in the college.
“I think policy is but one of the pillars that can reinforce organisational behaviour” (Peter Bryant) – yes, part of a carrot/stick approach, and sometimes we do need the stick. Peter also mentions budgets and strategies; I wonder whether these actually change behaviour or more often support change already embarked upon.
Technology
“let’s court rich people and get some endowments. We can name the service accordingly: “kingmoneybags.universityhandle.ac.uk”. We do it with buildings, why not with services?” (Sonja Grussendorf) – selling naming rights for TELT systems just like buildings – intriguing
We need solid processes for evaluating and implementing Ed Tech and new practices (me)
Pedagogical
“Could creating more ‘tailored’ learning experiences, which better fit the specific needs and learning styles of each individual learner be part of the new pedagogic paradigm?” (Rainer Usselmann) (big question, though, around how this might be supported in terms of workload)
“At Coventry, we may be piloting designing your own degree” (Sylvester Arnab)
“The challenge comes in designing the modules so as to minimise prerequisites, or make them explicit in certain recommended pathways” (Christopher Fryer)
I went on to suggest that digital badges and tools such as MyCourseMap might help to support this model. Sylvester noted that he is aware that “these learning experiences, paths, patterns, plans have to be validated somehow”. Learner convenience over pedagogy – or is it part of pedagogy, in line with adult learning principles of self-efficacy and motivation? In a design-your-own-degree course, how do we ensure that learners don’t just choose the easiest subjects – how do we avoid the trap of having learners think they know enough to choose wisely?
“digital might be able to help with time-shifting slots to increase flexibility with more distributed collaboration, flipped teaching, online assessment” (George Roberts)
“At UCL we are in the midst of an institution-wide pedagogic redesign through the Connected Curriculum. This is our framework for research-based education which will see every student engaging in research and enquiry from the very start of their programme until they graduate (and beyond). More at http://www.ucl.ac.uk/teaching-learning/connected-curriculum
The connected bit involves students making connections with each other, with researchers, beyond modules and programmes, across years of study, across different disciplines, with alumni, employers, and showcase their work to the wider world…
There is strong top-down support, but also a middle-out approach with faculties having CC fellows on part time secondments to plan how to introduce and embed the CC in their discipline.
From a TEL perspective we need to provide a digital infrastructure to support all of this connectivity – big project just getting going. Requirements gathering has been challenging… And we’re also running workshops to help programme and module teams to design curricula that support research-based and connected learning.” (Fiona Strawbridge) – liking this a lot, embedding practice. What relationship do these fellows have with lecturers?
“I am imagining that my research, personal learning environment would fit perfect with this approach as I am thinking the PLE as a toolbox to do research. There is also a potential there to engage student in open practice, etc.” (Caroline Kuhn)
“There may be a “metapedagogy” around the use of the VLE as a proxy for knowledge management systems in some broad fields of employment: consultancy, financial services, engineering…” (George Roberts) (which I’d tie to employability)
“We need to challenge the traditional model of teaching, namely didactic delivery of knowledge. The ways in which our learning spaces are currently designed -neat rows, whiteboard at front, affords specific behaviours in staff and students. At the moment virtual learning spaces replicate existing practices, rather than enabling a transformative learning experience. The way forward is to encourage a curricula founded on enquiry-based learning that utilise the digital space as professional practitioners would be expected to” (Silke Lange) – maybe but none of this describes where or how lecturers learn these new teaching skills. Do we need to figure out an evolutionary timeline to get to this place, where every year or semester, lecturers have to take one further step, add one new practice?
“Do not impose a pedagogy. Get rid of the curricula. Empower students to explore and to interact with one another. The role of the teacher is as expert, navigator, orienteer, editor, curator and contextualisor of the subject. Use heuristic, problem-based learning that is open and collaborative. Teach students why they need to learn” (Christopher Fryer)
This is but a cherry-picked selection of the ideas and actions that people raised in this hack but I think it gives a sense of some of the common themes that emerged and of the passion that people feel for our work in supporting innovation and good practices in our institutions. I jotted down a number of stray ideas for further action in my own workplace as well as broader areas to investigate in the pursuit of my own research.
As always, the biggest question for me is that of how we move the ideas from the screen into practice.
Further questions
How are we defining pedagogical improvements – is it strictly about teaching and learning principles (i.e. cognition, transfer etc.) or is it broader – is the act of being a learner/teacher part of this (and thus the “job” of being these people, which includes a broader suite of tools)? (me)
What if we can show how learning design/UX principles lead to better written papers by academics? – more value to them (secondary benefits) (me)
“how much extra resource is required to make really good use of technology, and where do we expect that resource to come from?” (Andrew Dixon)
Where will I put external factors like the TEF / NSS into my research? Is it still part of the organisation/institution? Because there are factors outside the institution like this that need to be considered – govt initiatives / laws / ???
Are MOOCs for recruitment? Marketing? (MOOCeting?)
“How do we demonstrate what we do will position the organisation more effectively? How do we make sure we stay in the conversation and not be relegated to simply providing services aligned with other people’s strategies” (arguably the latter is part of our job)
“How do we embed technology and innovative pedagogical practices within the strategic plans and processes at our institutions?” (Peter Bryant)
Further research
Psychology of academia and relationships between academic and professional staff. (Executive tends to come from academia)
“A useful way to categorise IT is according to benefits realisation. For each service offered, a benefits map should articulate why we are providing the service and how it benefits the university.” (See https://en.wikipedia.org/wiki/Benefits_realisation_management ) (Andrew Dixon)
Leadership and getting things done / implementing change, organisational change
How is organisational (particularly university) culture defined, formed and shaped?
Actor-network theory
Design research
Some ideas this generated for me
Instead of tech tool based workshops – or in addition at least – perhaps some learning theme based seminars/debates (with mini-presentations). Assessment / Deeper learning / Activities / Reflection
Innovation – can be an off-putting / scary term for academics with little faith in their own skills but it’s the buzzword of the day for leadership. How can we address this conflict? How can we even define innovation within the college?
What if we bring academics into a teaching and learning / Ed tech/design support team?
Telling the story of what we need by describing what it looks like and how students/academics use it in scenario / case study format offers a more engaging narrative
What is the role of professional bodies (E.g. unions like the NTEU) in these discussions?
Are well-off, “prestigious” universities the best places to try to innovate? Is there less of a driving urge, no pressing threat to survival? Perhaps this isn’t the best way to frame it – a better question to ask might be – if we’re so great, what should other universities be learning from us to improve their own practices? (And then, would we want to share that knowledge with our competitors?)
“I was thinking about the power that could lie behind a social bookmarking tool when doing a dissertation, not only to be able to store and clasify a resource but also to share it with a group of likeminded researcher and also to see what other have found about the same topic.” (Caroline Kuhn) – kind of like sharing annotated bibliographies?
Bigger push for constructive alignment
I need to talk more about teaching and learning concepts in the college to be seen as the person that knows about it
In conclusion
I’d really like to thank the organisers of the Digital is not the future Hack for their efforts in bringing this all together and all of the people that participated and shared so many wonderful and varied perspectives and ideas. Conversation is still happening over there from what I can see and it’s well worth taking a look.
The Office for Learning and Teaching (OLT) is – now was – an Australian government body intended to support best practice in enhancing teaching and learning in the Higher Education sector.
It funded a number of research projects, which in 2013 included “What works and why? Understanding successful technology enhanced learning within institutional contexts” – driven by Monash University in Victoria and Griffith University in Queensland and led by Neil Selwyn.
Rather than focussing on the “state of the art”, the project focuses on the “state of the actual” – the current implementations of TELT practices in universities that are having some measure of success. It might not be the most inspiring list (more on that shortly) but it is valuable to have a snapshot of where we are, what educators and students value and the key issues that the executive face (or think they face) in pursuing further innovation.
The report identifies 13 conditions for successful use of Tech Enhanced Learning in the institution and with teachers and students. (Strictly speaking, they call it technology enabled learning, which grates with me far more than I might’ve expected – yes, it’s ultimately semantics but for me the implication is that the learning couldn’t occur without the tech and that seems untrue. So because this is my blog, I’m going to take the liberty of using enhanced)
The authors took a measured approach to the research, beginning with a large scale survey of teacher and student attitudes toward TEL which offered a set of data that informed questions in a number of focus groups. This then helped to identify a set of 10 instances of “promising practice” at the two participating universities that were explored in case studies. The final phase involved interviewing senior management at the 39 Australian universities to get feedback on the practicality of implementing/realising the conditions of success.
So far, so good. The authors make the point that the majority of research in the TELT field relates to more cutting edge uses in relatively specific cohorts and while this can be enlightening and exciting, it can overlook the practical realities of implementing these at scale within a university learning ecosystem. As a learning technologist, this is where I live.
What did they discover?
The most prominent ways in which digital technologies were perceived as ‘working’ for students related to the logistics of university study. These practices and activities included:
Organising schedules and fulfilling course requirements;
Time management and time-saving; and
Being able to engage with university studies on a ‘remote’ and/or mobile basis
One of the most prominent practices directly related to learning was using digital technologies to ‘research information’; ‘reviewing, replaying and revising’ digital learning content (most notably accessing lecture materials and recordings) was also reported at relatively high levels.
Why technologies ‘work’ – staff perspectives
The most frequently nominated ways in which staff perceived digital technologies to be ‘working’ related to the logistics of university teaching and learning. These included being able to co-ordinate students, resources and interactions in one centralised place. This reveals a frequently encountered ‘reality’ of digital technologies in this project: technologies are currently perceived by staff and students as having a large, if not primary, role in enabling the act of being a teacher or student, rather than enabling the learning itself.
Nevertheless, the staff survey did demonstrate that technologies were valued as a way to support learning, including delivering instructional content and information to students in accessible and differentiated forms. This was seen to support ‘visual’ learning, and to benefit students who wanted to access content at different times and/or different places.
So in broad terms, I’d suggest that technology in higher ed is seen pretty much exactly the same way we treat most technology – it doesn’t change our lives so much as help us to live them.
To extrapolate from that then, when we do want to implement new tools and ways of learning and teaching with technology, it is vital to make it clear to students and teachers exactly how they will benefit from it as part of the process of getting them on board. We can mandate the use of tools and people will grumblingly accept it but it is only when they value it that they will use it willingly and look for ways to improve their activities (and the tool itself).
The next phase of the research, looking at identified examples of ‘promising practice’ to develop the “conditions for success”, is a logical progression, but looking at some of the practices used, it feels like the project was aiming too low. (And I appreciate that it is a low-hanging-fruit / quick-wins kind of project, and people in my sphere are by nature more excited by the next big thing, but all the same, if we’re going to be satisfied with the bare minimum, will that stunt our growth?) In fairness, the report explicitly says “the cases were not chosen according to the most ‘interesting’, ‘innovative’ or ‘cutting-edge’ examples of technology use, but rather were chosen to demonstrate sustainable examples of TEL”
Some of the practices identified are things that I’ve gradually been pushing in my own university so naturally I think they’re top shelf innovations 🙂 – things like live polling in lectures, flipping the classroom, 3D printing and virtual simulations. Others however included the use of online forums, providing videos as supplementary material and using “online learning tools” – aka an LMS. For the final three, I’m not sure how they aren’t just considered standard parts of teaching and learning rather than something promising. (But really, it’s a small quibble I guess and I’ll move on)
The third stage asked senior management to rank the usefulness of the conditions of success that were identified from the case studies and to comment on how soon their universities would likely be in a position to demonstrate them. The authors seemed surprised by some of the responses – notably the resistance to the idea of taking “permissive approaches to configuring systems and choosing software”. As someone “on the ground” that bumps into these kinds of questions on a daily basis, this is where it became clear to me that the researchers have still been looking at this issue from a distance and with a slightly more theoretical mindset. There is no clear indication anywhere in this paper that they discussed this research with professional staff (i.e. education designers or learning technologists) who are often at the nexus of all of these kinds of issues. Trying to filter out my ‘professional hurt feelings’, it still seems a lot like a missed opportunity.
No, wait, I did just notice in the recommendations that “central university agencies” could take more responsibility for encouraging a more positive culture related to TEL among teachers.
Yup.
Moving on, I scribbled a bunch of random notes and thoughts over this report as I read it (active reading) and I might just share these in no particular order.
Educators is a good word. (I’m currently struggling with teachers vs lecturers vs academics)
How do we define how technologies are being used “successfully and effectively”?
Ed Tech largely being used to enrich rather than change
Condition of success 7: “the uses of digital technology fit with familiar ways of teaching” – scaffolded teaching
Condition of success 10: “educators create digital content fit for different modes of consumption” – great, but it’s still an extra workload and skill set
dominant institutional concerns include “satisfying a perceived need for innovation that precludes more obvious or familiar ways of engaging in TEL” – no idea how we get around the need for ‘visionaries’ at the top of the tree to have big announceables that seemingly come from nowhere. Give me a good listener any day.
for learners to succeed with ed tech they need better digital skills (anyone who mentions digital natives automatically loses 10 points) – how do we embed this? What are the rates of voluntary uptake of existing study skills training?
We need to normalise new practices but innovators/early adopters should still be rewarded and recognised
it’s funny how quickly ed tech papers date – excitement about podcasts (which still have a place) makes this feel ancient
How can we best sell new practices and ideas to academics and executive? Showcases or 5 min, magazine show style video clips (like Beyond 2000 – oh I’m so old)
Stats about which tools students find useful – the data is frustratingly simple. The highest-rated tool is “supplementing lectures, tutorials, practicals and labs” with “additional resources”, at 42% (so do 58% not find it useful? – hardly a ringing endorsement)
Tools that students were polled about were all online tools – except e-books. Where do offline tools sit?
Why are students so much more comfortable using Facebook for communication and collaboration than the LMS?
60% of students still using shared/provided computers over BYOD. (Be interesting to see what the figure is now)
Promising practice – “Illustrating the problem: digital annotation tools in large classes” – vs writing on the board?
conditions for success don’t acknowledge policy or legal compliance issues (privacy, IP and copyright)
conditions for success assume students are digitally literate
there’s nothing in here about training
unis are ok with failure in research but not teaching
calling practices “innovations” signals them as non-standard or exceptions – good point. Easier to ignore them
nothing in here about whether technology is fit for purpose
Ultimately I got a lot out of this report and will use it to spark further discussion in my own work. I think there are definitely gaps and this is great for me because it offers some direction for my own research – most particularly in the role of educational support staff and factors beyond the institution/educator/student that come into play.
Update: 18/4/16 – Dr Michael Henderson of Monash got in touch to thank me for the in-depth look at the report and to also clarify that “we did indeed interview and survey teaching staff and professional staff, including faculty based and central educational / instructional designers”
Which kind of makes sense in a study of this scale – certainly easy enough to pare back elements when you’re trying to create a compelling narrative in a final report I’m sure.