As a new report from the Higher Education Commission calls on universities to catch up with international competitors in their use of learning analytics, Ed Foster, Student Engagement Manager in the Centre for Academic Development & Quality at Nottingham Trent University, explains how his institution is already responding to the challenge.
Learning analytics is a potentially transformative technology for the sector. Students, staff and HEIs could all benefit from the increased information and early warning indicators provided. That said, probably the most important word in the title of the Higher Education Commission’s Report ‘From Bricks to Clicks: The Potential of Data and Analytics in Higher Education’ (see HEi-know Briefing Report 281) is ‘potential’. We’ve seen other potentially transformative technologies that haven’t yet lived up to their promise or perhaps never will.
The promise of learning analytics is that it might enable us to deliver personalisation at scale. It can be difficult to spot the one student who is struggling if they don’t ask for help. Learning analytics might mean that we can identify when they are struggling early enough to intervene, even if they don’t actively seek out help themselves.
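The article does not describe how such early-warning indicators are computed, but the idea can be sketched. The following is a minimal, purely illustrative example (the scoring rule, names and threshold are assumptions, not NTU's actual Dashboard logic): flag students whose recent engagement falls well below the cohort norm.

```python
# Illustrative sketch of an early-warning rule of the kind a learning
# analytics system might apply. The engagement scores, threshold and
# function name are hypothetical, not taken from the NTU Dashboard.
from statistics import mean, pstdev

def flag_at_risk(engagement, z_threshold=-1.0):
    """Return IDs of students whose engagement score falls more than
    |z_threshold| standard deviations below the cohort mean."""
    scores = list(engagement.values())
    mu, sigma = mean(scores), pstdev(scores)
    if sigma == 0:  # identical scores: no outliers to flag
        return []
    return [sid for sid, s in engagement.items()
            if (s - mu) / sigma < z_threshold]

# Example cohort: weekly counts of VLE logins, resource views, etc.
cohort = {"s1": 42, "s2": 38, "s3": 5, "s4": 40, "s5": 37}
print(flag_at_risk(cohort))  # → ['s3']
```

In practice a real system would combine several engagement signals over time and route flagged students to tutors rather than act automatically, which is exactly the "what happens next" question the article turns to below.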
However, if institutions implement learning analytics correctly, it is also a resource that students can use themselves. If students can easily see that they are at risk of underperforming, learning analytics can help them take charge of their own learning and get back on track. Of course, they can choose not to. Students should still have the right to make unwise choices; we would argue that learning analytics can give them realistic information about that course of action, and no more.
This report is timely. The UK is behind both the US and Australia in its adoption of learning analytics: 98 per cent of respondents to the quoted Heads of E-Learning survey are only working towards implementing learning analytics, or have yet to start. The authors succinctly lay out some of the challenges facing the sector. Any institution developing learning analytics will need to overcome a series of technological, ethical, process and policy challenges. Importantly, the authors suggest work is needed both by individual institutions and by sector bodies such as HESA and Jisc.
Our own experience suggests that the journey is worth the effort. The NTU Student Dashboard is now in its second full year of use. It provides students and staff with accurate and timely information about student engagement and we are starting to see evidence that dashboard use positively changes student behaviours.
Developing the Dashboard has been a complex process involving a large multidisciplinary team. Students, academics, educational developers, technologists, researchers, and experts in equality and diversity, ethics and widening participation have worked with our external supplier, DTP Solutionpath, to build a resource to fit our needs. As we progress further into the project, we still face many questions about how we integrate the data the Dashboard provides into the University's normal working practices.
There are a number of key questions for anyone interested in implementing learning analytics. Chief among these is: why do you believe that having more, and more timely, information will change anything? We have seen evidence that some of our students have changed their behaviour in response to their dashboard data. However, in focus groups these were often students who were already highly engaged with their courses. Simply assuming that all students will be their own change agents is unrealistic.
It is vital to focus on what happens next. At the point where learning analytics identifies that a student is at risk of failing or underperforming, what systems or support exist? We are working as part of an Erasmus+ project with KU Leuven and Universiteit Leiden to test which interventions successfully support students once the dashboard identifies that they have low engagement. There is a considerable amount of work still to come on the role of the tutor, communication with students, dashboard design and any number of other avenues. However, we feel that this next step is essential to fully exploit the potential of the resource. We would suggest that the successful implementation of learning analytics will still require messy, complicated interactions in the real world.
Ed Foster is the Student Engagement Manager in the Centre for Academic Development & Quality at Nottingham Trent University. He is the project lead for the Student Dashboard and the ABLE Erasmus+ Project. His research interests are student transition, engagement and retention. NTU use the DTP Solutionpath StREAM tool to power the Dashboard.
© 2013 Media FHE, all rights reserved