Our Higher Education team have a long history of harnessing data to improve graduate outcomes. Having supported universities with their Destinations of Leavers from Higher Education (DLHE) survey, produced every iteration of the affectionately known Long DLHE, and worked on its recent successor, it’s fair to say the team are experts in graduate outcomes, and indeed Graduate Outcomes (NB the subtle capitalisation nuance).

When reflecting on how the UK higher education sector can harness data to improve graduate outcomes, we often draw on our experience. As Andrew outlined in his blog last week, our team have intimate knowledge of several large datasets available to UK institutions, such as the Employer Skills Survey. However, many other data sources can support the process of improving teaching and learning, and in turn, the outcomes of higher education for students.


Working closely with several universities to unpack and investigate research questions unique to their own institutional context has given us insight into the power and potential, and also the challenges, associated with harnessing their own data sources. In many cases institutions suffer from a lack of interdepartmental system integration, the legacy of outdated digital structures, and the huge cost of the investment needed to improve and update technical infrastructure.

In their 2017 report entitled ‘Learning analytics in higher education’, JISC present a definition of learning analytics alongside eleven case studies of emerging uses of learning analytics in the US, Australia and the UK. They describe learning analytics as ‘the measurement, collection, analysis and reporting of data about the progress of learners and the contexts in which learning takes place’.

The report describes several benefits and drivers compelling universities to improve their use of learning analytics, positively impacting both students and institutions. These include:

  • Providing lecturers with information on the quality of their course material
  • Understanding how course content is being used by students
  • Reducing the volume of resits each semester
  • Communicating students’ educational goals
  • Providing accurate and timely student feedback
  • Empowering students to become more reflective learners
  • Improving retention by identifying those students who may be more likely to drop out

Case studies featured include the ‘Signals’ project at Purdue University in Indiana, US, designed to help students understand their course progress early enough to be able to seek help and either improve their likely grade or withdraw from the module in time to select an alternative. At the New York Institute of Technology (NYIT), staff used learning analytics to identify at-risk students, developing their own model and dashboard with the help of the counselling staff who support students. Via this process NYIT has been able to identify at-risk students with a high degree of accuracy.
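To give a flavour of what such a model involves, the sketch below shows a deliberately simplified at-risk scoring approach in Python. It is not NYIT’s actual model (which is not described in detail here); the indicators, weights and threshold are all illustrative assumptions. In practice an institution would fit a statistical model to its own engagement and outcome data.

```python
# Illustrative sketch only: a simple weighted risk score over engagement
# indicators commonly used in learning analytics. The feature names,
# weights and threshold are assumptions for illustration, not a real model.

def risk_score(attendance, vle_logins_per_week, avg_grade):
    """Return a 0-1 risk score; higher means more likely at risk."""
    # Normalise each indicator to 0-1, where 1 is the strongest risk signal.
    low_attendance = 1.0 - min(attendance, 1.0)            # attendance as a fraction
    low_engagement = 1.0 - min(vle_logins_per_week / 10.0, 1.0)
    low_attainment = 1.0 - min(avg_grade / 100.0, 1.0)     # grade out of 100
    # Weights are illustrative; a fitted model would learn these from data.
    return 0.4 * low_attendance + 0.3 * low_engagement + 0.3 * low_attainment

def flag_at_risk(students, threshold=0.5):
    """Return the IDs of students whose risk score exceeds the threshold."""
    return [sid for sid, feats in students.items()
            if risk_score(*feats) > threshold]

students = {
    # id: (attendance fraction, VLE logins per week, average grade out of 100)
    "s001": (0.95, 8, 72),
    "s002": (0.40, 1, 45),
    "s003": (0.70, 4, 60),
}
print(flag_at_risk(students))  # flags the low-engagement student, "s002"
```

A dashboard like NYIT’s would surface scores of this kind to counselling staff, who then decide how to intervene; the value lies as much in the workflow around the model as in the model itself.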

Another case study featured Nottingham Trent University, which produced a Student Dashboard designed to facilitate dialogue between students, their personal tutors and support staff. The dashboard was widely used across the institution and was reported to have a positive impact on student engagement, creating a step change in organisational culture towards a more data-driven approach to managing the student experience.

The report goes on to illustrate the complexities of learning analytics technical architecture, as shown in Figure 1 below. This systematic approach, designed by JISC, is described as a basic solution that allows institutions to experiment with learning analytics within the context of their own institution, with all of the stakeholders and audiences involved.

Figure 1: Learning analytics architecture

The case studies presented in this report highlight the significant opportunity UK universities have to harness data about the progress of their own learners, analysis of which has been shown to positively impact graduate outcomes. However, the complexities that surround harnessing these data are as abundant as the opportunities they can provide. Support from sector organisations like JISC, sharing examples of best practice from the UK and internationally, is invaluable in making the use and understanding of learning analytics commonplace throughout the UK sector.

Written by Elizabeth Shepherd, Director in our Higher Education team
