Press "Enter" to skip to content

6th UK Learning Analytics Network Meeting – Newport 11/05/2016

As always, the views expressed here are my own; these notes are taken live, so please forgive errors, and do send me corrections / additions.
Programme:
10:25 – 10:50 Update on Jisc’s learning analytics programme

Michael Webb and Paul Bailey, Jisc

10:50 – 11:10 Implementing Welsh language support in learning analytics systems

Rob Wyn Jones, Jisc

11:25 – 12:30 Getting your data right for learning analytics

Richard Palmer, Tribal & Rob Wyn Jones, Jisc

13:30 – 14:15 Learning analytics at the Open University: an update on developments

Kevin Mayles, OU

14:15 – 15:00 Attendance monitoring systems as a source for learning analytics data – ethics and logistics

Chris Price, Aberystwyth

15:20 – 15:55 Learning analytics: a perspective from down under

Cathy Gunn, University of Auckland

Slides from this event will be made available via the Jisc LA blog. I'll add a link here in the future.

UPDATE
Slides and notes from the event: https://analytics.jiscinvolve.org/wp/2016/05/22/notes-and-presentations-from-6th-uk-learning-analytics-network-event-in-newport/

10:25 – 10:50 Update on Jisc’s learning analytics programme

Michael Webb and Paul Bailey, Jisc

Paul:
82 institutions have expressed interest in the Jisc project. Jisc has engaged with 35 of them; 26 have undertaken the readiness exercise, with the rest due by September.
One of the biggest parts of the project at present is the legal side of data access.

Next year they are going to change the focus of the discovery phase.

  1. Workshop
  2. Self-Assessment
  3. Institutional Readiness
  4. Signed-up for service
  5. Implementation Support

The aim is to put more focus on stage 5 and streamline stages 1–4.

Currently starting a Library Analytics project to look at what data can be used and how.

Michael:
Quick refresh on what Jisc are trying to achieve, the structure of the services and how they connect (see my previous blog posts for details).

xAPI, the UDD (Unified Data Definitions), the LRW (Learning Records Warehouse) API and the plugins (Blackboard and Moodle) should all be ready soon (May 2016).
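For context, an xAPI statement is a JSON record with an actor, a verb and an object. A minimal sketch of the kind of statement a VLE plugin might emit, written here as a Python dict; the names and identifiers are illustrative, not Jisc's actual vocabulary:

```python
# A minimal xAPI statement (actor / verb / object) as a Python dict.
# The student, verb and activity identifiers below are illustrative,
# not Jisc's actual xAPI profile.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Student",
        "mbox": "mailto:student@example.ac.uk",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/launched",
        "display": {"en-GB": "launched"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://vle.example.ac.uk/course/101",
        "definition": {"name": {"en-GB": "Introductory Module"}},
    },
}
```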

Testing with one HEI's data and checking its portability, etc.
Starting the initial testing of the student app.
Jisc felt they could add an extra process and are now starting to look at the Data Explorer.
Three streams on this:

  1. HEIDI Labs
  2. Own BI tools
  3. Explorer data viz/dashboard

These should lay the groundwork for something that will hopefully be adopted globally.

<Addendum>
Jisc have come up with draft T&Cs for access and use of personal data for the project ("Protection of personal data made available by the University of xxxxxxx to Jisc"). These were drawn up with lawyers, and Jisc have managed to get them down to two pages of A4.

10:50 – 11:10 Implementing Welsh language support in learning analytics systems

Rob Wyn Jones, Jisc

Dysgu Dadansoddol / Learning Analytics

The staff dashboard, alert system and student app all require Welsh language support.
Translation templates available to institutions for local customisation.
Potential to develop automated translation / access interface(s).
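As a rough illustration of the translation-template idea (not the project's actual file format), interface string keys could map to per-language text that institutions override locally. The key names and fallback logic here are assumptions; only the "Dysgu Dadansoddol" translation comes from the talk:

```python
# Hypothetical translation template: interface string keys mapped to
# per-language text. Institutions could override individual entries.
STRINGS = {
    "app.title": {"en": "Learning Analytics", "cy": "Dysgu Dadansoddol"},
    # ... further keys supplied by the template ...
}

def t(key: str, lang: str = "en") -> str:
    """Look up a string, falling back to English if no translation exists."""
    entry = STRINGS.get(key, {})
    return entry.get(lang, entry.get("en", key))

print(t("app.title", "cy"))  # -> Dysgu Dadansoddol
```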

Has been working with UNICON OpenDashboard and Tribal Student Insight.
Also the UNICON Student Success Planner (SSP) and the LA Student App.

Translation has been context-sensitive – not just thrown into Google Translate.

<Break for Tea and Coffee>

11:25 – 12:30 Getting your data right for learning analytics

Richard Palmer, Tribal & Rob Wyn Jones, Jisc

Rob:
The more historical data (18–36 months), the better your models will be.
Validating the current predictive model requires a minimum of 12 months of historical data.

UDD applies to HE and FE data.
Similar structure to the HESA student return.

Some feedback from FE indicates the UDD might not be as unified as hoped – discussion and exploration are ongoing.

For the predictive models, institutions can provide anonymised data – but this is a bit more work.

In the future they are likely to standardise data input on XML and/or JSON rather than accepting CSV, etc.
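For institutions holding CSV exports, that conversion is mechanical. A minimal sketch, assuming invented column and field names (the real UDD field names will differ):

```python
import csv
import json

# Convert a CSV export into JSON records. Column names ("id", "course")
# and output field names are invented for illustration only.
with open("students.csv", newline="") as f:
    records = [
        {"student_id": row["id"], "course": row["course"]}
        for row in csv.DictReader(f)
    ]

print(json.dumps(records, indent=2))
```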

Will try to keep alignment with HESA where possible. It would be great if HESA updated their structure through their current project and the UDD could match it, so institutions could format their data once and kill two birds with one stone.

Blackboard will have two plugins – one for current data and one for access to historical data.

If there is little or no historical data, then Jisc can look at other models.

Data transformation – in-house or using an off-the-shelf offering – is being investigated.

We were asked whether we foresee any issues with near-live data capture or timely data extraction for LA; resource implications; access to student assessment information; and other sources of information / indicators of student engagement or activity.

Richard:
The LA processor and staff dashboard are being looked at by Tribal.
Richard is here today to discuss data structure and getting it into the data warehouse, as a lot of HEIs use SITS.
The data in SITS needs to be merged with activity data from, for example, the VLE, door access and library usage. Can Tribal sort this data for you?

They have found assessment data to be more predictive, but it is also more complicated to collect.
When was it due, when did they submit, when was the submission window open, was there a reassessment, what were the mark(s)?
There are many ways of storing this data and in many locations.

Tribal could account for the different ways of storing this data so that it can be discussed in a unified way; a sketch of that kind of normalisation follows below.
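A hedged sketch of how the assessment events Richard lists (due date, submission date, reassessment, marks) might be normalised into comparable features; the field names are mine, not the UDD's actual schema:

```python
from datetime import date
from typing import Optional

# Field and feature names are illustrative, not an actual schema.
def assessment_features(due: date, submitted: Optional[date],
                        mark: Optional[float], reassessment: bool) -> dict:
    """Normalise one raw assessment record into system-agnostic features."""
    return {
        "submitted": submitted is not None,
        "days_late": (submitted - due).days if submitted else None,
        "mark": mark,
        "reassessment": reassessment,
    }

print(assessment_features(date(2016, 5, 1), date(2016, 5, 3), 62.0, False))
# {'submitted': True, 'days_late': 2, 'mark': 62.0, 'reassessment': False}
```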

Top Tips for Data Preparation:

  • Understand where and how you store your information
  • Aim for the most robust data, not the easiest to extract
  • Understand how administrative processes are represented in your systems
  • Consider how you treat special circumstances
It seems that daily data transfer to the data warehouse is sufficient. Slower than that would affect the interventions, and faster than that wouldn't really be of much benefit.

When building the predictive model, they use around 75% of the data for training and then test the model on the 25% it has not seen, for which the outcomes are already known.
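In scikit-learn terms (not necessarily the tooling Tribal actually use), that hold-out validation looks roughly like this; the data and model choice are stand-ins:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Stand-in data; in practice X would be engagement/assessment features
# and y the known outcomes (e.g. completed vs withdrawn).
X, y = make_classification(n_samples=1000, random_state=0)

# Hold back 25% of the data that the model never sees during training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Logistic regression is an arbitrary choice for illustration.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(accuracy_score(y_test, model.predict(X_test)))
```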

<break for lunch>

13:30 – 14:15 Learning analytics at the Open University: an update on developments

Kevin Mayles, OU

Three focal points:

  1. Analysis and creation of insight
  2. Processes that impact student success
  3. Availability of data

They broke these down further to help guide their work.
Predictive Analytics at the OU
The first model was created to predict student numbers. It provided a probability of each student completing their module (typically 30 or 60 credits), passing it, and returning at the start of the next module.
The second model predicts weekly whether a student will submit their next assignment. It combines four models that vote on the predictive outcome.
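The OU's exact model mix isn't described in the talk; as a rough illustration, a four-model voting ensemble in scikit-learn might look like this, with stand-in data and arbitrary model choices:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Stand-in data: 1 = will not submit the next assignment, 0 = will submit.
X, y = make_classification(n_samples=500, random_state=0)

# Four dissimilar models whose majority vote gives the weekly prediction;
# the OU's actual four models are not described in the talk.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(random_state=0)),
        ("nb", GaussianNB()),
        ("dt", DecisionTreeClassifier(random_state=0)),
    ],
    voting="hard",  # majority vote across the four models
).fit(X, y)

print(ensemble.predict(X[:5]))
```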
Measures:

  • Accuracy of predictions
  • Volume of predictions
  • Timeliness of predictions

Accuracy: % of all predictions that are correct
Precision: % of non-submission predictions that turn out to be true
Recall: % of actual non-submitters identified
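Treating "will not submit" as the positive class, these metrics fall straight out of the confusion counts; the numbers below are made up:

```python
# Hypothetical confusion counts, with "will not submit" as the positive class.
tp, fp, fn, tn = 30, 10, 20, 140

accuracy = (tp + tn) / (tp + fp + fn + tn)  # % of all predictions correct
precision = tp / (tp + fp)  # % of non-submission predictions that come true
recall = tp / (tp + fn)     # % of actual non-submitters identified

print(f"accuracy={accuracy:.2f} precision={precision:.2f} recall={recall:.2f}")
```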
Whilst their accuracy in predicting non-submitters didn't change during a module, their precision increased.
There was interesting data on non-submitters and the last time they accessed / engaged with the system: comparing last engagement with the prediction, the mean was -14 days from the prediction, with a standard deviation of 80 days.
40% of students interviewed said they had started thinking about withdrawing a month before they did.
[I think the slides might help explain this better]
Based on an analysis of predictions for non-completers, a third of all new predictions fall in a six-week window of opportunity around the last engagement date.
The OU are using peer super-users to train new users, so they can share "out in the field" experience.

They do not currently have a picture of whether it is improving student retention.

Tutors find this a useful tool to understand student engagement.

It has helped tutors identify students who needed help.

They are very explicit about how they use students' data, and make this information available to the students.

14:15 – 15:00 Attendance monitoring systems as a source for learning analytics data – ethics and logistics

Chris Price, Aberystwyth

They have been recording student attendance in various ways since 2003.

Chris started out asking: why do they get to week 7 before he is the first to know that a fresher has only been to 10% of their lectures? And what can they do to change this?

The emphasis is pastoral.

How do you keep an eye on large classes?
What are the ethical issues to monitoring student attendance and performance?

Once an attendance issue is identified, escalation goes:

  1. Personal Tutor
  2. Year Tutor
  3. Director – who sends an "URGENT" letter (clearly addressed to the student) to both term-time and home addresses – a percentage of these end up being opened by mum, which has a positive effect on the student's attendance / engagement.

To improve problem detection: automate the recording process and automate the monitoring.
Card readers in teaching spaces cross-reference against the timetable.
They use four readers for a 250-seat room.

Staff can see individual attendance as a % of the classes a student should have attended.

Administrators can see those with a low percentage, and rules can be created; from the system, emails can be sent and meetings arranged with the appropriate staff to support the student. A minimal sketch of this cross-referencing and rule logic follows below.
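A minimal sketch, assuming invented thresholds, identifiers and data shapes, of the swipe-versus-timetable cross-reference and a low-attendance rule:

```python
# Hypothetical records: timetabled sessions and card swipes per student.
timetable = {"s123": 40, "s456": 40}  # sessions each student should attend
swipes = {"s123": 35, "s456": 3}      # sessions where their card was read

THRESHOLD = 0.25  # invented cut-off for escalation

for student, expected in timetable.items():
    attendance = swipes.get(student, 0) / expected
    if attendance < THRESHOLD:
        # In the real system this would email / arrange a meeting with
        # the appropriate member of staff to support the student.
        print(f"{student}: {attendance:.0%} attendance -- flag for follow-up")
```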

15:20 – 15:55 Learning analytics: a perspective from down under

Cathy Gunn, University of Auckland

Three major national analytics projects.

Undertaking research looking at student motivation and autonomous learning.

Building an evidence base for teaching and learning design using learning analytics data.
4 institutions, 8 case studies, 15 interviews, 1 survey

Broad areas:

  • Student retention – monitor & support progress
  • Student engagement – observe interactions, understand learning, promote achievement

SRES (Student Relationship Engagement System) – one to look into.

Feedback on course design and better evaluation tools

Promoting adoption:

  • Lower barriers to entry
  • Avoid controversy (while it's being sorted out)
  • Practice-based & practice-focused approach – to address common questions & challenges

Scenarios derived from case studies & interviews
