A strength in my role as a Learning Designer has been supporting teams and colleagues with the availability, adoption, evaluation and use of learning technologies to enhance their teaching content and design practice.
For each academic team I work with, I run ‘tools selection’ sessions. In most cases this begins with a whistle-stop tour of the various activity tools available in the OU VLE, presented from a non-technical standpoint and focusing instead on the pedagogic affordances each tool can enable or enhance from a distance learning perspective. The aim is not to let the tools determine the teaching, but to help academic colleagues consider the breadth of design options available to them, rather than restricting their thinking to what they already know.
From there we’ll usually move on to the specific activities and student skills the team are looking to develop, then match potential tools, technologies, activities and examples to those, always weighing the pros and cons.
Something I always stress (particularly with teams keen to play with the whizzier tools) is the need for a strategic plan about which tools are used when in the module, considering the additional cognitive load and digital leap for students each time a new tool or interaction is introduced. My feeling (supported by evaluating tool engagement through our Analytics 4 Action work) is that the more tools and toys thrown into the mix, the more student focus on the accompanying activities is diluted, and engagement with them suffers.
For teams a little more au fait with the technology to hand, I’ll often run a ‘what’s new and cool?’ session, updating them on recent additions and changes to the toolset (something I keenly keep abreast of, liaising closely with our fabulous Learning Systems colleagues) to hook them in. Back in 2020, I ran a session like this for the School of Health, Wellbeing and Social Care, which included additional group activities inviting academics to ‘theorycraft’ hypothetical activities within their discipline areas using the design affordances of tools they had not previously encountered.
I use the TEL In Practice (TiP) Tools resources I have developed to supplement my support for module teams in their selection of tools and technologies, as discussed in the ‘An understanding of the constraints and benefits of different technologies’ section.
Once a module is out in the wild, with students getting stuck into the teaching and tools, I help teams evaluate the deployment, engagement with and impact of the various learning technologies they have used. This is done within our Analytics for Action (A4A) framework and its data support meetings.
While working as a Media Coordinator on the OpenCreate project I was responsible for helping with the deployment, testing and adoption of the OpenCreate direct authoring system.
Regarding the deployment of bleeding-edge learning technology, I've recently been working on testing, observing and evaluating the Digital Working Environment Exploratory Project (DWEEP), which is looking to test the viability of virtual environments for use within universities. I led on the underlying research paper, which you can check out at the Learning Innovation website.
Finally, I recently helped to prototype, and then design and write, a large section of our team's new VLE-based induction, with my section focusing on VLE and production tools and the Learning Design practitioners' toolkit.
'Deploying technologies' generated with Adobe Firefly
'Strength in technology' generated with Adobe Firefly
While I think this area is one of my strengths, I do feel there is still a lot of space for me to develop in it.
I came into the role with a lot of technical knowledge and ideas around how the available tools could function, but less understanding of the importance of foregrounding the pedagogy when discussing them with academic authors. The learning curve on this was quite steep during my first year in the Learning Design team, with my initial keenness to share my understanding of the tools brushing up against the responsibilities of other roles, and not really pushing the ‘learning design’ side of design. Following some sharp feedback on a learning design guide to our quizzing systems, I made a conscious effort to refocus on their use in activity design.
This became much easier after starting my Pedagodzilla podcast, which boosted my confidence discussing tools in a pedagogic design context. While I’m much improved in this area, my understanding of pedagogy in practice still has a lot of space to grow, particularly at the macro (module and qualification) activity design level.
From time to time I can also let my enthusiasm for the available technology get the better of me, and I have had to walk back an accidental ‘this is the best thing since sliced bread’ sales pitch on a new tool or interaction when my championing of it has led a team to use it in a less than ideal setting. I’m particularly guilty of this with the new Audio Recording tool, which I’m conscious lines up with my own interests (podcasting!) and has great potential for student activities around presentation and communication practice, as well as alternative approaches to reflection. It does, however, introduce some tricky accessibility considerations, particularly in an assessment setting.
I’m now making a conscious effort to be more balanced in how I present learning technology, without losing the underlying energy and enthusiasm that I feel can be necessary to get teams to consider exploring different technologies in their module designs. I’ve still got a way to go with this, but I think I’m gradually getting better, and the more strategic approach to tool use being taken by the teams I work with suggests it is percolating through. Bringing the data I've gathered through my data support meetings into those discussions has really helped.
Feedback from academic author following support with the design of interactive quizzes - November 2019
Some lovely feedback from one of our academic authors, around the support and guidance I gave in the design of some really neat and rich quiz activities within this module. They went on to use the available question types in some awesome and engaging ways, and I think this particular section of the module ended up being very strong.
Impact: During evaluation (data support) we found that this iCMA (interactive computer-marked assessment, essentially a rich interactive quiz) had a great submission rate and good student scores (not shareable here for obvious reasons). This module went on to have great student outcomes in this and future presentations, and I think part of that is down to the engaging nature of the content I helped design, and the general fabulousness of the academic team.
Feedback from organisers following a demonstration of the new Learning Design tool and Q&A with the Editorial team - Sept 2021
Feedback from colleagues in Editorial, following an introduction and walkthrough of the new Learning Design tool:
"Hi Mike and Mark, Thank you SO much for your terrific talk today – you were both great! I heard some very positive feedback after the session (from various DDEs and EPMs), so well done to both of you. 😊 If you are open to the idea, I think it would be great to have you back to talk about A4A (and maybe show that lovely animation you created). Just an idea at this stage… but would a slot in early 2022 work for both of you? " - P Hoffman
"Yes, well done both, and thank you! It was a really interesting topic, and it links really well with the pilot collaboration project that XXXX will be telling us about tomorrow. You did brilliantly! Mike, in your introduction I did change your ‘pop culture core’ to ‘pop culture twist’, not sure if you noticed, but I was trying not to giggle as I read it out! Thanks again, and yes, definitely would love to have you both back for an A4A talk!" - P Mumford
Impact: The session was a little daunting as we had a massive audience, but I think it went really well, and it has started laying the groundwork for the deployment of the tool across our production activities. Making it easier for editors to quickly check the workload of a tagged or handed-over unit and generate evidence around it will be massive in managing student workloads over the next few years.
Digital Working Environment Exploratory Project (DWEEP) Evaluation strategy
The following screenshot shows the initial (rough) pitch for an evaluation strategy, taken from the wider document. I used a basic activity theory framework as a starting point to identify the 'unknowns' related to the research question, which we could then build an objective strategy around. It's a tricky project, and I've been leading the evaluation side of it by proxy, which has been a challenging but interesting experience so far. We're hoping to have the final form of this report published via ORO (Open Research Online) and the Scholarship Exchange before the end of the year.
Impact: We started this project with a very, very open brief, a tight deadline and no clear way forward. By sitting down and breaking the problem into manageable chunks using the method below, I managed to actually get the ball rolling with our evaluation efforts.
Digital Working Environment Exploratory Project (DWEEP) Research paper and report
The DWEEP report was published! I wrote a good chunk of it, and worked my backside off in a mad week up to the deadline coordinating the rest. Is it perfect? No. Was it enough to help Andrew McDermott write a good summary? Yes.
I'm pleased with the end result overall. I think it presents a balanced case around the pros and cons of using virtual environments within the OU. My general feeling about the whole Metaverse thing is that it's probably going to come eventually, but the technology and landscape aren't quite in place yet. It will be interesting to see whether Meta (formerly Facebook) can survive their gamble on it.
You can check out the research paper over at the Learning Innovation website.
Impact: You can read the report linked above for our findings, but I think it's also worth discussing how it impacted practice. While our ultimate conclusion was that virtual workspaces aren't yet universally suitable to replace our video calls, we did find that in specific use cases they could be beneficial. Since then I've set up and distributed half a dozen headsets within the Learning Design team, and am coordinating some small-scale team collaborative activities. I've also run some standard meetings using Horizon Workrooms, and I've got to say, it feels pretty legit. You really do feel like you're sharing a space.
Learning Designers' induction: Tools and Technology topic, and case study
Around a year ago I put together a prototype and proposal showing how we could make our induction programme for new starters in the team more scalable by flipping the learning to a blended approach using our VLE, one that could also support existing staff with their own practice.
One design process and a lot of sweat later, we've got the first (rough) version in front of a cohort of four new starters. I was involved in the design of the whole thing, but directly wrote and built the topic around Tools and Technology as it's my wheelhouse. I think it's come out pretty well so far, with lots of activities and opportunities to interact with the VLE.
I've written a case study on the overall planning and design process (linked), which includes the initial impact evaluation.
Impact: I've had some really good feedback from many of the new starters who've come through this site in the last year. A few colleagues have also let me know that they've used videos I created in this section to discuss tools with academic teams.
Here's a screenshot of the top level study planner for the topic, without wanting to give too many OU secrets away.
Supporting deployment of VLE tools using evidence
Graph generated to illustrate engagement with a specific VLE tool over the course of a presentation. This has subsequently been used to advise other teams in the deployment and use of the tool. The specific context of this example is that the tool was introduced as an optional activity in week 5, with the idea being that students would continue to dip in and out of it for the remainder of the presentation. Unfortunately, the links to assessment and overall alignment weren't clear to students in this first presentation, and you can see that engagement only ever peaked at 18% of the cohort. This is one of several data examples I'd refer to.
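For illustration, here is a minimal sketch of the kind of calculation that sits behind a graph like this, assuming a hypothetical CSV of weekly active-user counts against cohort size (this is not the actual A4A tooling, and the file and column names are placeholders, not real module data):

```python
# Illustrative sketch only: not the actual A4A tooling, and the data file is hypothetical.
# Assumes a CSV with one row per presentation week: week, active_users, cohort_size.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("tool_engagement_by_week.csv")
df["engagement_pct"] = 100 * df["active_users"] / df["cohort_size"]

fig, ax = plt.subplots()
ax.plot(df["week"], df["engagement_pct"], marker="o")
ax.axvline(x=5, linestyle="--", color="grey", label="Tool introduced (week 5)")
ax.set_xlabel("Presentation week")
ax.set_ylabel("% of cohort using the tool")
ax.set_title("Weekly engagement with the VLE tool")
ax.legend()
plt.show()
```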
Impact: This example has served as a useful bit of data to support my discussions around integrating VLE tools with quite a few teams. The point I always come back to is that unless students see a crystal-clear reason to engage with something (i.e. links to assessment), they're more likely than not to bypass it, even if actual interaction with that tool and the associated activity would be really valuable to the overall learning. I'd like to think that, thanks to this, I've guided a few teams away from slapping in a tool arbitrarily, and instead had them focus on how to line it up with the overall experience.
A commitment to exploring and understanding the interplay between technology and learning
Helps academic authors understand the affordances of the available learning technologies, how to select the right ones, and how best to use them to enhance the student experience.
Published research paper on working in virtual environments.
A commitment to keep up to date with new technologies & a commitment to communicate and disseminate effective practice
Keeps up to date with the latest technologies available within the OU, and disseminates effective practice through tools selection and 'what's new' sessions with module teams and asynchronous resources such as the TiP (TEL In Practice) tools site.
Delivers blended learning to new Learning Designers around VLE tools and the practitioners' toolkit, as part of their induction.
Published research paper on working in virtual environments (so good I mentioned it twice).
An empathy with and willingness to learn from colleagues from different backgrounds and specialist options
Responsive to how colleagues use technology in practice
Continually iterating approach to tools based on feedback from users