Since I started in the role, one area where I've always felt at home is elbow-deep in the data. In a previous life, I did BIK (benefit in kind) and fuel tax calculations for a couple of the UK's biggest construction companies - so I had to learn how to deal with big, horrible datasets and do big, horrible maths with them. I am not a mathematician; I just about scraped a B at GCSE. Still, I learned a bucket of stuff about how to process and visualise data, and boy howdy has it come in useful since.
As Learning Designers, we try to ground our advice in evidence - and, being a primarily online university, we have a wealth of data to help us understand our students: who they are, how they're doing, and what is and isn't engaging them. It's no substitute for seeing the whites of their eyes in a classroom, of course, but heck, that's showbusiness.
We hold rich data on student demographics, qualification alignment, and success - and I'll usually bring this into design conversations with teams right at the beginning, in the form of summarised visualisations. I'll then build on this quantitative data with the academics' qualitative understanding of their audience (as many have been teaching in the subject area for yonks). Having established a shared understanding of the cohort, I'll move on to the design considerations and challenges introduced by its characteristics, asking 'what do we need to do to meet the needs of this cohort?'. It doesn't stop there, though: I'll then prompt authors to consider the outliers, and interrogate those considerations and challenges to check we wouldn't be inadvertently disadvantaging other groups. You can see some top tips on designing in inclusivity put together by my colleagues over at the OU Learning Design blog.
With module design and production in full swing, I've then had the pleasure of working with our Curriculum Design Student Panel (CDSP - a panel of volunteer students who trial and feed back on activities and designs) to prototype and user-test some of our ideas and assumptions. We'll then amend our designs, and feed back to the students and thank them. They're a smashing bunch, and have helped me and the teams I work with test activity threads, assumptions around reading speeds - and even brainstorm new module names.
Once the module is live and out in the wild, I'll swing into action like a data visualisation Spider-Man - running a series of data support meetings as part of our Analytics 4 Action (A4A) framework. This process is about helping module teams identify who's studying their modules and how they're doing - which bits are going great and can be shared as best practice, and which are having a wobble and need an intervention. On top of the quantitative data I present (drawn from our analytics systems), I also work with teams to design qualitative, targeted surveys (Real Time Student Feedback - RTSF), aimed at both querying and supporting students.
On top of all of this, I also help deliver regular training sessions, showing authors and production colleagues how to access, use and interpret the data at our disposal, in order to best understand our students.
'Student data' generated by Adobe Firefly
'Time poor student' generated by Adobe Firefly
With so much of my work centred around understanding our students, what have I learned about them? Quite a bit, and not enough at the same time:
Time poor: OU students tend to be at a slightly later stage in life than the classic idea of the late-teens student, with many in full-time or part-time employment, or with caring responsibilities. I can barely find time to work and podcast, so my hat goes off and my heart goes out to the huge number of students studying at full-time intensity while juggling all of this. It's one of the reasons we have to be so hot on workloads during design.
Diverse: In every sense of the word. We're a very accessible university, and we attract folk from all walks and stages of life - with a broad spread across any demographic measure you care to pick. The complex mix of motivations and starting points is a big driver in our inclusive design work.
Complex needs: We have one of the highest proportions of students with disabilities of any university in the UK. Our students can have needs and challenges that we need to understand, and build into our overall approaches to teaching and tech.
While I feel my big-picture understanding of OU students is quite robust, and has been a boon in helping module teams make good design decisions, what I do feel the lack of is visibility and understanding of individual students. With the exception of the student panel, I'm a few steps removed from direct student contact - contact which would be invaluable for understanding more, and for tailoring designs better.
Something I'm continuing to wrestle with is the very narrow insight our quantitative data gives into the experience of our students. In the past I've fallen into the trap of extrapolating narratives from that data, which has ultimately said more about my own baked-in assumptions than about what is actually being shown. Over the last year (and data cycle) I've been making an effort to present the data sans narrative, then build hypotheses with the module chairs and identify qualitative ways to test them. Slowly slowly catchy mousy.
While I took to the technology quite quickly, I have greatly benefited from support and guidance from our team's wonderful Data Analysts - in particular Shingirai Shumba and Carl Small, who entertained my often bizarre data queries, and taught me how to interrogate the back-end data with SAS-EG and visualise the resultant mess through Power BI.
Overall though, I'm chuffed with my data work. It's been something tangible I can clutch when other elements of the Learning Design role have felt a little ephemeral. Would I pursue data as a full-time career? Heck no. There's only so many times you can type =VLOOKUP into spreadsheets before you lose your marbles.
Example of data support slides (student demographics)
As the data is sensitive I can't show too much detail of what I deliver in Data Support sessions. The slide below should give you a flavour though, showing the age demographic composition of a cohort of students for a particular module - and a few KPIs we keep track of.
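For a rough sense of what sits behind a slide like that, here's a minimal sketch of the kind of summary it's built from - in Python/pandas for illustration, rather than the SAS-EG and Power BI we actually use. The column names, age bands, and KPI definitions are all hypothetical, not the OU's real schema or measures.

```python
import pandas as pd

# Hypothetical anonymised extract - column names are illustrative only.
students = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "age":        [19, 24, 31, 38, 47, 62],
    "completed":  [True, True, False, True, False, True],
    "passed":     [True, False, False, True, False, True],
})

# Bucket ages into the kind of bands you'd see on a demographics slide.
bands = pd.cut(
    students["age"],
    bins=[0, 21, 30, 40, 50, 120],
    labels=["Under 21", "21-29", "30-39", "40-49", "50+"],
)
composition = bands.value_counts(normalize=True).sort_index() * 100
print(composition.round(1))  # % of the cohort in each age band

# A couple of illustrative KPIs of the sort a module team might track.
completion_rate = students["completed"].mean() * 100
pass_rate = students.loc[students["completed"], "passed"].mean() * 100
print(f"Completion: {completion_rate:.1f}%  Pass (of completers): {pass_rate:.1f}%")
```

In practice the real work happens upstream (joining and cleaning the analytics extracts) and downstream (turning summaries like these into the visualisations on the slide), but the shape of the summary is the same.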
Impact: The data support sessions I've led focus on identifying where things are going great and can be championed as best practice, or where things aren't quite going to plan and need an in-presentation intervention. Since 2014 I've investigated data and chaired these sessions for nearly 20 modules, and have supported module teams in identifying and implementing tweaks to improve student outcomes.
Data support evaluation
Impact: The following is an extract from the Analytics for Action Impact report (2022), which demonstrates the effectiveness and impact of our data support work. I chaired three of the modules supported during that academic year, and co-chaired three others.
Example of CDSP analysis
The following (anonymised) extracts show sections of a report on a CDSP (Curriculum Design Student Panel) activity I ran with a module team, testing the effectiveness and alignment of PDP (personal development planning) activities in draft material. It also shows some of the recommendations I made based on the analysis.
Impact: Many of the suggested tweaks were made, with the Editor and me working closely to refine the related unit. We found that, despite this, engagement with the PDP materials once they went live was not as high as we'd hoped (even with it being an optional pathway). That said, feedback from students who had engaged was largely positive, and supported the argument that OneNote as a tool needed to be de-prioritised (it introduced all manner of faff).
Example of data training
The opening (agenda) slide from the data training that I (and several of my colleagues) run on a regular basis for colleagues across the university. I've got dozens of these under my belt over the last few years.
Impact: I've run a lot of data training sessions, and feel I've had a big impact in helping faculty colleagues understand and navigate their module data. Here's some feedback I've had from the sessions I've run (on a scale of 1 (Strongly Disagree) to 5 (Strongly Agree)):
I expect that most staff will need formal training to use this tool, based upon my experience in this session: 4.33
Using the data tools will help to improve the quality of my teaching: 4.0
Using the data tools will help to enhance the effectiveness of the teaching: 4.0
The instructors were knowledgeable and engaged with the attendees: 4.67
The instructors provided clear instructions on how to engage with the activities: 4.67
Overall, I'm satisfied with the training session: 4.67
A commitment to exploring and understanding the interplay between technology and learning
Brings student usage of technology in live presentation into data and evaluation discussions
Identifies patterns and trends in student learning using data technology
A commitment to keep up to date with new technologies & an empathy with and willingness to learn from colleagues from different backgrounds and specialist options
Works with Data Analysts to improve own ability to interrogate background data and visualise it through professional data tools such as Power BI.
A commitment to communicate and disseminate effective practice
Supports module teams in understanding student demographic and progression data, and uses this as an evidence base to effect interventions with live modules, and to inform the design decisions of modules in production.
Trains internal staff in the effective use of the OU's data tools.