As a new report from the Higher Education Commission calls on universities to catch up with international competitors in their use of learning analytics, Ed Foster, Student Engagement Manager in the Centre for Academic Development & Quality at Nottingham Trent University, explains how his institution is already responding to the challenge.
Learning analytics is a potentially transformative technology for the sector. Students, staff and HEIs could all benefit from the increased information and early warning indicators provided. That said, probably the most important word in the title of the Higher Education Commission’s Report ‘From Bricks to Clicks: The Potential of Data and Analytics in Higher Education’ (see HEi-know Briefing Report 281) is ‘potential’. We’ve seen other potentially transformative technologies that haven’t yet lived up to their promise or perhaps never will.
The promise of learning analytics is that it might enable us to deliver personalisation at scale. It can be difficult to spot the one student who is struggling if they don’t ask for help. Learning analytics might mean that we can identify when they are struggling early enough to intervene, even if they don’t actively seek out help themselves.
However, if institutions implement learning analytics correctly, it’s also a resource that students can use themselves. If students can see easily that they are at risk of underperforming, learning analytics can help them take charge of their own learning and get back on track. Of course, they can choose not to. Students should still have the right to make unwise choices; we would argue that learning analytics can give them realistic information about that course of action and no more.
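The early-warning idea described above can be illustrated with a minimal sketch: compare each student's recent activity against the cohort and flag anyone whose engagement falls well below the average. The function name, the use of login counts as the engagement measure, and the threshold are all illustrative assumptions for this sketch, not a description of NTU's Dashboard or the StREAM model.

```python
# Hypothetical sketch of engagement flagging: all names, measures and
# thresholds here are illustrative assumptions, not the actual StREAM model.
from statistics import mean

def flag_low_engagement(logins_per_week, threshold=0.5):
    """Return IDs of students whose weekly logins fall below
    `threshold` times the cohort average."""
    cohort_avg = mean(logins_per_week.values())
    return sorted(
        student for student, logins in logins_per_week.items()
        if logins < threshold * cohort_avg
    )

cohort = {"s1": 12, "s2": 9, "s3": 2, "s4": 11}
print(flag_low_engagement(cohort))  # → ['s3']
```

In practice a real system would draw on many engagement signals (VLE access, library use, attendance) and weight them carefully, but the principle is the same: surface the quiet outlier early enough to intervene.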
This report is timely. The UK is behind both the US and Australia in its adoption of learning analytics; 98 per cent of respondents to the quoted Heads of E-Learning survey are only working towards implementing learning analytics or are yet to start. The authors succinctly lay out some of the challenges facing the sector. Any institution developing learning analytics will need to overcome a series of technological, ethical, process and policy challenges. Importantly, the authors suggest work is needed both by individual institutions and by sector bodies such as HESA and Jisc.
Our own experience suggests that the journey is worth the effort. The NTU Student Dashboard is now in its second full year of use. It provides students and staff with accurate and timely information about student engagement and we are starting to see evidence that dashboard use positively changes student behaviours.
Developing the Dashboard has been a complex process involving a large multidisciplinary team. Students, academics, educational developers, technologists, researchers, and experts in equality and diversity, ethics and widening participation have worked with our external supplier, DTP Solutionpath, to build a resource to fit our needs. As we progress further into the project, we still face many questions about how we integrate the data the dashboard provides into the University's normal working practices.
There are a number of key questions for anyone interested in implementing learning analytics. Chief among these is: 'why do you believe that having more, and more timely, information is going to change anything?' We have seen evidence that some of our students have changed their behaviour in response to their dashboard data. However, in focus groups these were often students who were already highly engaged with their courses. Simply assuming that all students will be their own change agents is unrealistic.
It's vital to focus on what happens next. At the point where learning analytics identifies that a student is at risk of failing or underperforming, what systems or support exist? We are working as part of an Erasmus+ project with KU Leuven and Universiteit Leiden to test which interventions successfully support students once the dashboard identifies that they have low engagement. There is a considerable amount of work still to come on the role of the tutor, communication with students, dashboard design and any number of other avenues. However, we feel that this next step is essential to fully exploit the potential of the resource. We would suggest that the successful implementation of learning analytics will still require messy, complicated interactions in the real world.
Ed Foster is the Student Engagement Manager in the Centre for Academic Development & Quality at Nottingham Trent University. He is the project lead for the Student Dashboard and the ABLE Erasmus+ Project. His research interests are student transition, engagement and retention. NTU use the DTP Solutionpath StREAM tool to power the Dashboard.
© 2013 Media FHE, all rights reserved