The use of big data to improve the student experience is a rich seam that universities are increasingly mining. In this Good Practice Briefing, HEi-know looks at a variety of approaches that have been taken by eight universities to collect and make use of data to enhance learning, and provide better support and feedback for students.
The wealth of higher education data available is vast and the various methods of collecting it are becoming ever more sophisticated. Examples include module level exam results, real-time data on student attendance, frequency of access to virtual learning environments (VLE) and library services, and the level of student contact with tutors.
Applications of the data are equally varied, covering student engagement and retention, early intervention, tailored support, relationship management and assessing student, staff and departmental performance. Universities are increasingly attempting to analyse students’ behaviour – through their digital footprint – so they can anticipate students’ prospects of success, or otherwise.
Students paying some of the highest tuition fees in the world are demanding more for their money but at the same time, in order to make academic progress, significant amounts of independent learning are required from them. As a result, universities are striving to ensure undergraduates remain engaged and are offered the personalised support that advances in digital technologies and communication make possible.
In an increasingly competitive market, using data and learning analytics to advance student success is seen as imperative. But universities are equally aware that it is not a panacea.
Recent research into the use of learning analytics, such as For the Record: Learning Analytics and Student Engagement, advises universities to treat mined data with a degree of caution. Log-ins and time spent on the VLE are not necessarily a proxy for effective independent study; and time spent online, and the usage of various tools, can mean different things for different students. As Paul Haskell-Dowland, the architect of the student support system at Plymouth University, puts it: “We can check to see if a student takes a book but not if they have read it.”
UNIVERSITY OF EAST LONDON
Predictive analytics are not just about improving retention rates, nor are they the preserve of institutions that suffer from low rates; they represent an opportunity to improve the overall student experience, according to Charles Prince, Director of the Centre for Student Success at the University of East London (UEL).
Predictive analytics can help make learning and support services more targeted, identifying students according to their characteristics, and understanding the risk of disengagement or drop out. Since the 2016/17 academic year UEL has invested in a predictive analytics system and platform, to better understand students in a way that allows tutors to shape educational offerings and career support.
The system has allowed the university to assess its retention rate by looking at students enrolled at the institution over the past five years who have dropped out. A predicted score is calculated by an algorithm which evaluates all student records and identifies the characteristics of those students who continue and complete their studies, and of those who do not. The system then applies those outcomes to currently enrolled students, controlling for their characteristics.
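UEL has not published the details of its algorithm, but the logic described above can be sketched in a few lines. The sketch below is purely illustrative: the characteristics (`entry_qual`, `commuter`) and the records are invented, and a production system would use a proper statistical model rather than raw group rates.

```python
from collections import defaultdict

def historical_continuation_rates(records):
    """Continuation rate for each combination of student characteristics
    seen in past cohorts (here: entry qualification and commuter status)."""
    totals = defaultdict(lambda: [0, 0])  # profile -> [continued, total]
    for r in records:
        key = (r["entry_qual"], r["commuter"])
        totals[key][1] += 1
        if r["continued"]:
            totals[key][0] += 1
    return {k: continued / n for k, (continued, n) in totals.items()}

def predicted_score(student, rates, default=0.5):
    """Apply historical outcomes to a currently enrolled student;
    fall back to a neutral score for unseen profiles."""
    return rates.get((student["entry_qual"], student["commuter"]), default)

# Invented historical records, five-year lookback
history = [
    {"entry_qual": "BTEC", "commuter": True, "continued": False},
    {"entry_qual": "BTEC", "commuter": True, "continued": True},
    {"entry_qual": "A-level", "commuter": False, "continued": True},
    {"entry_qual": "A-level", "commuter": False, "continued": True},
]
rates = historical_continuation_rates(history)
print(predicted_score({"entry_qual": "BTEC", "commuter": True}, rates))     # 0.5
print(predicted_score({"entry_qual": "A-level", "commuter": False}, rates)) # 1.0
```

In practice such a model would be validated against held-out cohorts before its scores were used to target support.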
“Predictive analytics helps us to identify which students to focus our limited resources on, in order to better understand their needs and respond to their actual live behaviour, rather than just their feedback after it’s too late,” said Charles Prince.
The data can also have a powerful predictive ability, depending on the subject area and cohort. For example, students in one programme and one year might have a very different set of predictors than students in another. Students who might be more at-risk in the modelling could include those without GCSEs, or those who have failed one class.
These data points are important for the Centre for Student Success (CfSS) team, who can target students based upon their levels of risk. Evidence of these kinds of interventions is also important in Teaching Excellence Framework (TEF) submissions.
Another area that has seen improvements at UEL is the expanded use of diagnostic testing with students. Instead of assessing every student, staff are able to target those who might need the diagnostic, and provide them with plans to help support their studies. So far, students who have engaged with the diagnostic have seen a 3.5 percentage point increase in their assessment grades.
When and how to give advice to struggling students can be important elements in how successful it is. Predictive analytics can enable support staff to assess whether advice is effective and what components of the approach are directly related to retention goals.
“Predictive analytics will allow us to determine the correct pattern of these meetings, and also which type of advising has better results with different students. It allows us to ensure that students are being assigned to appropriate advisors and advising approaches, to help them to achieve and also to get the biggest return on our investment,” said Prince.
Data collection is also helping to target students with on-campus jobs and internships, taking into consideration predicted risk scores, bursary qualification, and other data.
“It ensures that the career support we provide increases their attendance and engagement in the classroom this year, but ultimately, also encourages them to return to the institution next year,” said Prince. “The programme has been in effect this first year and we will be able to measure its impact on student retention.”
MANCHESTER METROPOLITAN UNIVERSITY
At Manchester Metropolitan, data services have been used to support an institution-wide commitment to a four-week deadline for returning marks and feedback on work, and have helped create an environment which supports student engagement and scholarship.
At the beginning of the project there were no particular institutional expectations about marking or feedback, or guidelines about the time to mark and return student work. Scores in the National Student Survey led the university to review its systems and the procedures that supported them.
A new system for logging and tracking submissions was rolled out across departments. The Coursework Receipting System (CRS) provides students with bar-coded coversheets for tracking paper submission of assessments and records submission of assignments online via the VLE.
Unit leaders now have clear responsibility for managing the marking and moderation data and ensuring that final grades have been entered into the Student Record System within four weeks of the assignment submission deadline, at which point the grades are automatically released to students.
The project also acted on a recommendation to focus more effort on the re-assessment period: the time between the board of examiners’ meetings and the deadline for re-submission of failed work or the re-sitting of failed examinations.
All faculties are now required to put in place a clear plan for this period. It means that staff are now on rota to ensure that someone is available to answer student queries and that all students who are being given a reassessment opportunity are telephoned. Student support staff are also briefed about arrangements and staff availability. There is also a consistent approach to providing reassessment information and resources in the VLE.
Students now have clear, personalised information about when marks and feedback will be received. Return of marks to students happens automatically via a feed to their Moodle area from the Student Record System. Electronic submissions are normally returned to students in the same way.
As part of this evolution, MMU wanted to encourage students to reflect on their own performance and create a personal action plan for the future. Tutors are also required to reflect on the effectiveness of each part of the assessment cycle from setting to the return of work. Data about unit reassessments is now included in performance monitoring and provision of support for individual units.
“This will enable heads of department and programme leaders to target resources more quickly on units which staff and students perceive as ‘difficult’ in assessment terms,” said a university spokesman.
GLYNDWR UNIVERSITY
Retention of students is a strategic priority for Glyndwr University, in Wrexham, Wales, and is one of the institution’s key performance indicators.
Using a range of data sources, Glyndwr has constructed a basket of indicators to identify the departments, programmes and student groups with poor levels of retention, progression and completion.
HESA provides institutional level data but it was also important to identify programmes and modules which required interventions to improve the student experience and to boost retention and progression.
Data on student satisfaction, continuation, recruitment and student feedback is used to inform funding decisions for each programme. This annual commissioning model contributes to devolving responsibility for improving recruitment, retention, progression, satisfaction, completion and attainment to schools, departments and programme leaders.
Teams of staff are engaged in analysing programme and module level data and discussing the implications. The institution keeps staff up-to-date with the performance of their modules and programmes via a dedicated website: programme leaders can see how their area is performing in relation to key performance indicators, and they are rated as Green (indicating strongest performance), Amber or Red.
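Glyndwr's actual thresholds are not given in the briefing; a minimal sketch of how a KPI value might be mapped to a Green/Amber/Red rating, with invented thresholds, could look like this:

```python
def rag_rating(value, green_at, amber_at, higher_is_better=True):
    """Map a KPI value to a Green/Amber/Red rating against two thresholds.
    For KPIs where lower is better (e.g. withdrawal rate), set
    higher_is_better=False and the comparison is mirrored."""
    if not higher_is_better:
        value, green_at, amber_at = -value, -green_at, -amber_at
    if value >= green_at:
        return "Green"
    if value >= amber_at:
        return "Amber"
    return "Red"

# Hypothetical programme-level continuation rate (%): Green >= 90, Amber >= 80
print(rag_rating(93, green_at=90, amber_at=80))  # Green
print(rag_rating(84, green_at=90, amber_at=80))  # Amber
print(rag_rating(71, green_at=90, amber_at=80))  # Red
```

A dashboard would apply a rule like this per indicator, so a programme can be Green on satisfaction but Red on progression.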
In the School of Aeronautical and Mechanical Engineering, for instance, which has a diverse student body, including local, European and international students, an initial review of the data identified poor retention rates in the school as a whole, and in particular programmes and modules.
A closer examination of the data identified poor progression between levels 4 and 5 as a major problem. In particular, students who had failed an assessment disengaged over the summer period and did not turn up for re-sits in September.
Another area of concern was the performance of full-time Foundation Degree programmes. These were found to have low recruitment and high attrition rates, particularly as a result of non-progression from level 4 to 5. This threatened the pipeline of students to the engineering degree as the Foundation Degree is a significant feeder for the course for local students with low entry qualifications.
To address these problems, identified through an analysis of the data, a range of interventions were made, including curriculum review and a re-focused programme; an intensive induction; additional attention to the entry level and development of maths skills; and a summer support pilot for students with referrals and re-sits.

The university has also developed a student feedback system to actively seek views from students during their courses. Information is collected via module feedback, through course representatives feeding into the newly formed Student Representative Council, Staff Student Consultative Committees and informally from staff. When students withdraw from the university they must complete a pro-forma and provide a reason why they are leaving, giving further insight into student experiences. On top of this, the University has undertaken some telephone follow-up to gain a fuller understanding of the student perspective.
As a result of this work, there has been a 10-15 percentage point reduction in non-completion across the university, while Aeronautical and Mechanical Engineering has seen an improvement of 11.1 per cent. National Student Survey performance in Engineering has seen increases of up to 9 per cent.
A university spokesman said: “The improvement in retention is a result of the cumulative effect of a raft of interventions, but central to this is being informed by data about specific programmes and student groups where further research and action is required.”
LOUGHBOROUGH UNIVERSITY
Loughborough University has developed an easy-to-use relationship management system called Co-Tutor, which holds data on student welfare, progression and attendance.
It allows tutors to input conversations or meetings with students as free text, pre-defined statements and tags. Non-attendance is highlighted by emailing ‘at risk’ flags to personal tutors and class lecturers and automated emails to students.
For example, students with less than 50 per cent attendance are automatically flagged, allowing tutors to access other relevant information and context-sensitive links to professional services in the institution, such as the finance office, disability services and others.
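As an illustration of this kind of rule, the sketch below flags students whose recorded attendance falls below 50 per cent. The data layout and function names are assumptions for illustration, not Co-Tutor's actual implementation:

```python
def attendance_rate(register):
    """register: list of booleans, one per timetabled session attended/missed."""
    return sum(register) / len(register) if register else 1.0

def flag_at_risk(students, threshold=0.5):
    """Return IDs of students whose attendance falls below the threshold,
    as candidates for 'at risk' emails to tutors and automated student emails."""
    return [sid for sid, register in students.items()
            if attendance_rate(register) < threshold]

students = {
    "s001": [True, True, False, True],    # 75% attendance - not flagged
    "s002": [False, False, True, False],  # 25% attendance - flagged
}
print(flag_at_risk(students))  # ['s002']
```

In a live system the flag would also carry links to the contextual records and professional services mentioned above, rather than just an ID.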
The notification system ensures that tutors can make timely decisions about putting in place interventions to improve retention and progression for those deemed at risk.
The system, which contains details of academic and pastoral records for all 19,000 registered students, can also be used to communicate with a whole cohort and is employed to manage other activities such as industrial placements.
In essence Co-Tutor helps staff to manage their relationships with students, and provides them with information and reminders to enhance this connection.
The logging of information also allows managers to be sure that all students are receiving suitable levels of pastoral care and academic support, reinforcing the personal responsibility members of staff have for supporting and guiding their students, and providing high quality information and advice.
Monitoring reports make the frequency and quality of support provided by staff to students completely transparent. Regular reports highlight elements such as staff online activity, total number of comments per student and total number of student/staff meetings both missed and attended. They also highlight patterns, such as the distribution of alert flags, the frequency of comments, meetings and emails and the percentage of attendance across programme or module, year group or level of study.
Co-Tutor is credited with improving the student experience and changing students’ behaviour. Data from the five years following the system’s introduction showed the number of modules where attendance was monitored rose from 62 to 260 and the average attendance rate rose from about 65 per cent to 70 per cent.
Data analysis at the university shows a correlation between attendance and the degree level achieved. Students graduating with a first and a 2:1 are most likely to attend more than 91 per cent of the time, those gaining a 2:2 tend to attend between 71 and 80 per cent of the time, and those with a third are most likely to attend between 61 and 70 per cent of the time. The analytics provide an evidence-base for putting resources into attendance interventions with an aim to boost degree attainment.
HUDDERSFIELD UNIVERSITY
Data analysis at Huddersfield University has provided the institution with vital information about underachieving groups of students and led to an award-winning intervention that targets those who need it most.
In order to make even more progress on closing the white-BME attainment gap, Huddersfield has invested heavily in an exploration of its data to gain a thorough understanding of patterns of achievement and underachievement across the university and of the characteristics of those students who are at greatest disadvantage.
By analysing the data, it was discovered that modules and courses that had recruited students chiefly through vocational paths, for example BTEC routes, had lower results and lower retention. They also found that those courses had significant numbers of British Bangladeshi and Pakistani students.
In response, the university introduced the Flying Start initiative in 2017, designed to benefit these ethnic groups, as well as male students, entrants with vocational qualifications, those from low socio-economic groups and commuting students who live at home.
The programme overhauled the start of the first-year curriculum to promote a stronger sense of belonging among 900 undergraduates.
For two weeks, undergraduates attend sessions from 9am-5pm, five days per week and take part in an intense, subject-specific and participative programme.
It has two objectives: enhancing students’ academic abilities and exploring identity and belonging. There is a mix of organised social time and supervised independent study. A sense of community and identity within the classroom is created with the message that: “the university is you”.
Staff are encouraged to be very aware of who their students are and what they are doing. Work is also undertaken to build relationships not only between staff and students but also between students through peer work and group interactive work.
In its first year, the programme has seen statistically significant results. Flying Start students report stronger relationships compared with non-Flying Start students, and male Flying Start students also report significantly higher levels of belonging and engagement compared with male students on other courses.
The evidence suggested the intervention, which won the “course and curriculum” category of the 2018 Guardian higher education awards, has had a positive impact on outcomes. Tutors noted a dramatic difference in students' levels of confidence compared with previous years, and that students were much more likely than previous cohorts to ask for help, to contribute in group sessions and to engage in critically reflective activities.

Cheryl Reynolds, project manager and senior lecturer at Huddersfield, said that the multifactorial analysis of the university’s own data provided the impetus for action.
“The Flying Start programme encouraged tutors to devise a range of activities that encouraged students to identify with their peers and with their subject discipline, to see themselves as academics and to appreciate the intense effort required to fulfil their potential,” she said.
The initiative initially focused on accountancy and finance, biological and chemical sciences, contemporary art and illustration, computer games programming, law and sports science. This year all schools will offer some Flying Start courses, and two will do so for all of their first-year undergraduates.
LIVERPOOL JOHN MOORES UNIVERSITY
Universities have made significant investments to provide lecture capture systems to meet student demand. While the practice generates a certain amount of controversy, it is regarded as useful for embedding knowledge and revision and can relieve stress for the increasing number of students who have to work and, as a result, sometimes miss classes.
Institutions have developed different approaches, with some choosing a university-wide policy, while others allow decisions at department level.
Teams across universities are now beginning to evaluate how well lecture capture works, what it is being used for and are trying to pin down how it can be improved.
One such project at Liverpool John Moores University looked at the use of lecture capture software. Outcomes of this investigation showed that while many sessions were recorded, a significant number of lectures were not being released to students.
Analysis of access data showed that most students used the recordings strategically, watching them repeatedly to help them understand particular aspects of the lecture. Sections that related to assessment requirements, or to an activity that would be carried out in the laboratory, were also watched more frequently.
In focus groups students confirmed that recordings were useful, particularly where they were “struggling to understand a principle”. As a result of this analysis, the role of the software has shifted from lecture capture to education enhancement, with more flexible pedagogical approaches to the technology supported and promoted through the university’s Teaching and Learning Academy.
This has resulted in an increase in the number of staff releasing recordings and students accessing them. Recordings have become shorter and more precise, focused on strategic aspects of the delivery for which repeated instructions or explanation are useful. Access data also shows that students are using recordings to support their studies throughout the semester rather than just for revision.
Martin Hanneghan, from the university’s Department of Computer Science, said that a year-long observational study into lecture capture, which led to the publication of a paper, found no evidence of a negative impact on attendance at lectures that are captured electronically and subsequently made available online. In fact, the students who physically attended lectures were the most likely to watch the recorded versions, more often than those who did not attend. The study also highlighted the most effective uses of recordings.
“Lecture capture becomes useful when it is used for bite-sized sections of learning, under five minutes or so,” said Hanneghan. “There is no point in subjecting students to a 90-minute video of a tutor talking as many will switch off quickly, or fast forward to try and find the juicy bits in much the same way as we have changed our TV viewing habits.”
The University’s monitoring of student performance and experience and key performance indicators is managed through the University’s centralised online dataset, WebHub. It provides extensive analytical capabilities and monthly updated reports on the University’s key performance indicators, alongside those of comparator universities.
These are regularly accessed by 1,400 staff across the University at all levels, from module leaders to the Senior Management Team. Content includes every type of data recorded on the University student records system, including average tariff points, employability and degree class outcomes of graduates, yearly retention and completion rates, and weekly updated student engagement and feedback data, all accessible at University, faculty, school, programme and module level.
NOTTINGHAM TRENT UNIVERSITY
Nottingham Trent University (NTU) has found that giving students access to their own data via the Dashboard - an award-winning learning analytics resource for students and staff - encourages greater self-reflection about engagement with their studies. It also encourages healthy competition, in that some students try to beat their own scores of the previous week, the class average, or even compare engagement with their peers.
Ed Foster, Student Engagement Manager at NTU, said: “Through the Dashboard, we give students the opportunity to think for themselves and look at how they are doing compared to their peers. Students who engage more with the resource tend to do better. There is a really clear line that students who log in to it more regularly are more likely to be academically successful. That is our starting point.”
For example in 2016/17, 90.1 per cent of first year students who logged in to the Dashboard 20+ times in the year progressed to the second year, compared to only 43.3 per cent of those who logged in once; 57.8 per cent of students who logged in 20+ times also achieved the equivalent of a 2:1 or 1st at the end of the year, compared to only 31.3 per cent of students who logged in once.
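Figures like these come from banding students by login count and comparing progression rates across the bands. The sketch below re-creates that calculation with made-up data (the band boundaries and cohort are invented, not NTU's):

```python
def progression_by_login_band(students, bands=((20, "20+"), (1, "1-19"), (0, "0"))):
    """Group first-years by Dashboard login count and report the share of
    each band that progressed to the second year. `students` is a list of
    (login_count, progressed) pairs; `bands` maps a floor to a label."""
    counts = {label: [0, 0] for _, label in bands}  # label -> [progressed, total]
    for logins, progressed in students:
        for floor, label in bands:  # bands listed highest floor first
            if logins >= floor:
                counts[label][1] += 1
                if progressed:
                    counts[label][0] += 1
                break
    return {label: (p / t if t else None) for label, (p, t) in counts.items()}

# Invented cohort: (logins, progressed to second year)
cohort = [(25, True), (30, True), (22, False), (2, True), (1, False), (0, False)]
print(progression_by_login_band(cohort))
```

Note this shows an association, not causation: the article's own caveat that engaged students both log in more and do better applies equally here.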
The Dashboard is designed to be used by both students and staff. It gives students reassurance or an incentive to work harder, but the same information is also designed to be used by personal tutors and other staff. The University expects staff to look at the data to decide whether students need further support, and to use the information in the Dashboard as part of any intervention.
The University stresses to students that the Dashboard is meant to be the starting point for a conversation rather than a definitive commentary. Students are given information about how the Dashboard works during their induction, and a short film is played to first years to give them an idea of how they should engage with it.
On the whole, staff are very positive about the resource, although significant challenges remain about using data and metrics as part of the tutoring conversation. Staff would like as much certainty as possible about factors such as the risk of non-progression, but this certainty needs balancing against the need to act early.
“If staff perceive there to be an error, it’s irritating for them as their time is precious,” said Ed Foster. “The data will never be perfect. We can never say ‘this student in November is 100 per cent likely to drop out in January’, for instance. There will always be some ambiguity. You might have an individual who buys all their own text books, is rarely on campus but works furiously. Potentially they’re going to do very well, despite what the data is pointing to. Everybody can remember someone who did brilliantly but you never saw them in classes. But the truth is that 99 per cent of students who don’t come to class and don’t appear to be engaging won’t do very well.”
The Dashboard can spark action and the aim is for that to turn into a conversation between the student and their tutor about the best way forward.
“We don’t want staff to think ‘oh, I’ve got to talk to you because the Dashboard says you’re in trouble’, we want this to be an ongoing conversation about what can be done to help students achieve their goals,” said Foster. “So it’s not just restricted to when there may be a problem, it is much wider than that.”
The NTU Student Dashboard is powered by the Solutionpath StREAM tool. The University has worked on learning analytics since 2013 and has further work to do in activities such as discipline-specific algorithms and student goal-setting.
UNIVERSITY OF GREENWICH
Greenwich University is one of a number of institutions that have signed up to give students access to the Study Goal App, developed by Jisc.
The app is the equivalent of an academic “fitness tracker”, using many of the same motivational aspects that are present in Fitbits and similar devices.
It tracks learning activities, draws attention to patterns of behaviour, allows for comparisons with the activities and progress of other learners and enables the user to set study goals.
Students are given a score for engagement, calculated from details of their digital footprint. Attainment, as measured by marks and grades, is also displayed in the app.
It has a feature that permits students to register their attendance for a given learning activity (e.g. a class at a set time) by logging in and entering a four-digit code generated through the system and enabled by the class tutor. Students can look at their own VLE activity for a specific module and then see a comparison with the module average or with identified friends. They can also see in graphical form VLE Activity and recorded weekly attendance over the past four weeks.
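The briefing does not describe the app's internals; a rough sketch of how such a code-based check-in might work (all class and method names here are invented) is:

```python
import secrets

def generate_session_code():
    """Issue a four-digit code once the tutor enables the session."""
    return f"{secrets.randbelow(10000):04d}"

class AttendanceRegister:
    def __init__(self):
        self.sessions = {}  # session_id -> active code
        self.attended = {}  # session_id -> set of student ids

    def open_session(self, session_id):
        """Tutor enables check-in; the system generates the code to display."""
        code = generate_session_code()
        self.sessions[session_id] = code
        self.attended[session_id] = set()
        return code

    def register(self, session_id, student_id, code):
        """Student logs in and enters the code shown in class."""
        if self.sessions.get(session_id) == code:
            self.attended[session_id].add(student_id)
            return True
        return False

reg = AttendanceRegister()
code = reg.open_session("CS101-wk3")
print(reg.register("CS101-wk3", "s001", code))  # True: correct code recorded
```

A real deployment would also expire codes after the session and rate-limit attempts to stop code-sharing.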
It allows target setting for study activities which might, for instance, include: “read for two hours per day”, or “work on assignment 1 for two hours a day.”
If students consent, the app prompts action through email or text message alerts in order to keep goals in mind and sends congratulatory messages as a reward.
Greenwich points out that the app, and other learning analytics, are only useful as one element of a package of teaching and learning support.
Providing richer information to personal tutors allows them to discuss progress in regular meetings with students, to contact students to check they feel on track, or to arrange a meeting to review progress.
The app has won support from the university’s student union, which describes it as “a useful tool to support the student-tutor partnership, which is at the heart of education”.
While outlining the positive benefits of learning analytics, Greenwich has recognised that for some students, the comparative element of the app can cause anxiety. Tutors have been advised to suggest to students who have become stressed through comparing themselves to others via the app that they opt out of it until they “feel better able to deal with it”.
Plans to further develop the use of big data are in the pipeline at Greenwich.
“Learning Analytics can provide predictive indicators for achievement by comparing a learner's patterns of activity and achievement with those of previous groups of students,” said a spokesman. “It is our aim to introduce these. We expect these indicators to help us identify more easily those who may need additional support with their academic studies, which will enable us to contact them to see how we can help.”
This Good Practice Briefing was first published on HEi-know on 4th March 2019.
© 2013 Media FHE, all rights reserved