
JISC’s 4th Learning Analytics Network Event

The following are my notes and thoughts from this event.  Please feel free to comment pointing out errors or suggesting additions.  I will try to clean them up post-event.

Housekeeping taken care of.  Interesting that Bradford's campus is smoke-free except for designated shelters.

10:15
Introduction to the day – Niall Sclater, Jisc

Rundown of what to expect today and a reminder to network and explore others' expectations and experiences.
————

10:30
A working demonstration of Jisc’s learning analytics solution for UK further and higher education – Michael Webb, Jisc (MW)

Aims of project:
Application of big data techniques to help learners meet their goals.

Developed a ‘plug-and-play’ system to bring this all together.

Data Collection

  1. About the student
    1. Demographics etc.
  2. Activity Data
    1. Interaction with HE systems etc.
      1. e.g. VLE interaction
      2. Library book loans
    2. Using TinCan (xAPI)
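As an aside, a TinCan / xAPI activity statement is just a small JSON document with an actor, a verb and an object. A minimal, illustrative sketch in Python – the identifiers and email domain are my own invention, not Jisc's actual recipe:

```python
import json

def make_xapi_statement(student_id, verb, activity_iri):
    """Build a minimal xAPI (TinCan) statement describing one learner
    interaction, e.g. a VLE page view or a library book loan.
    All identifiers here are illustrative placeholders."""
    return {
        "actor": {
            "objectType": "Agent",
            "mbox": "mailto:%s@example.ac.uk" % student_id,
        },
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/" + verb,
            "display": {"en-GB": verb},
        },
        "object": {"objectType": "Activity", "id": activity_iri},
    }

stmt = make_xapi_statement(
    "s1234567", "viewed", "https://vle.example.ac.uk/course/101/week1")
print(json.dumps(stmt, indent=2))
```

In the real service these statements would be posted to a learning record store rather than printed.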

Student Insight

Provided by Tribal – Target Date November 2015 – 98% ready to go.
Tool for doing one specific thing – predicting withdrawals and dropouts.

On accessing this system you are presented with charts depicting the university, then a hierarchical structure beneath it. These show risk scores indicating how at risk students are of failing – both en masse and with the ability to drill down to an individual student.
This can then be expanded to see which data in the model indicate where the student is failing.
Some of these a student cannot change – e.g. enrolment and demographics.
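The session didn't describe the internals of Tribal's model, but as a toy sketch of the general idea – combining static (enrolment/demographic) and activity features into a single risk score – something like the following, with invented feature names and weights:

```python
import math

def risk_score(features, weights):
    """Logistic squash of a weighted feature sum; higher = more at risk.
    Feature names and weights are invented for illustration only."""
    z = sum(weights.get(name, 0.0) * value
            for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

r = risk_score({"low_vle_activity": 1.0, "late_submissions": 1.0},
               {"low_vle_activity": 1.2, "late_submissions": 0.8})
```

In practice the weights would be fitted from historical outcomes – which is presumably why several years of data are needed before the predictions settle down.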

[I asked whether we want to be collecting and processing demographic data – and thus having to show it to a student.  This would result in us saying that if you are of a certain demographic you are less likely to pass – MW agreed and suggested this would be an HEI's own decision, but followed up to say that early indications showed this data was important]

[Question – how much data do you need to provide for the predictions to be most accurate – Answer – 3 years worth of data is required]

[Question – how fine is the data?  Can it show the difference between a practical and a written component? – Answer – Yes it can look at this.]

Example of use from this tool – a member of staff noticed that a demographic (ethnic) group was identified as performing poorly on a particular topic.  The tutor was able to see this and provide additional support in this area.

The tool shows a prediction history, which reveals if a student was predicted to fail but has since changed their engagement, causing their risk factor to fall.

[Question – we're only being shown two areas of data collection, but we don't yet know which data sources are important – Answer – yes, it's currently based on the metrics found most useful by early partners]

Student Success Plan

Target Date – Q1 2016 – 70% ready
This is a case management system. It is based on an existing solution, but that is very USA-centric, so Jisc are making it more UK-centric and multi-language (Welsh etc.).
The person looking at this would be the student welfare officer, personal tutor or retention manager (basically someone whose job it is to look out for the student).
This system receives an alert sent out from one of the other systems and assigns that to the relevant person (tutor). This will allow them to see a degree of information as to why the alert was generated.

From this the tutor can go and see more data about the student to assess what might be the best course of action for this student.

Around 10 traffic-light indicators can be shown that might indicate where the student needs attention.  These could be things like: how long before the deadline does the student submit their work?
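One such indicator could be sketched as a simple threshold function – the thresholds here are invented for illustration, not taken from the demo:

```python
def traffic_light(hours_before_deadline, red_below=24, amber_below=72):
    """Toy red/amber/green flag for one indicator mentioned in the
    session: how long before the deadline a student submits work.
    Thresholds are invented for illustration."""
    if hours_before_deadline < red_below:
        return "red"
    if hours_before_deadline < amber_below:
        return "amber"
    return "green"
```

The real system would presumably need per-module thresholds, as the room noted below.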

From there you can email the student (keeping a journal showing the support with the student).

The basic version of this follows a 'freemium' model, i.e. the basics are free but there is a lot more functionality that might require purchase.

[We’re being asked to come up with the 10 traffic light indicators on our tables]

Ideas from room:
Pre-entry information – either single or multiple box – e.g. did they attend visit day, engage with pre-arrival content etc.
Do they have a job – Number of hours working
Are they a member of a group or society?
Digital literacy

Test scores being different

How well a student is doing against their own set goals

Social engagement?

Student-student interactions – e.g. group work

Attendance
VLE Usage
Need to be able to adapt to each module
Research needs to be done on data to find the best areas to monitor

Learning Locker

Target Date – NOW – 99% ready

Data Warehouse
This offering will be cloud storage, though you will be able to install your own.

Receives xAPI statements.

Demoed the Moodle plugin.
Blackboard are now developing an equivalent, which will be licensed to all of the UK through Jisc.

Student App

This is completely bespoke new development.
Discarded messaging your tutor etc. and focused on the analytics.  They based the app on the style of fitness apps.
Best used when the HEI is signed up to the Data Warehouse, but it can be used standalone with reduced functionality.

Friends – shows your friends and allows for motivational messages, e.g. "Mr X did 6 hours on this and has improved their score – why don't you try?"

Engagement and attainment
You are in the top x% etc.
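The "top x%" figure is presumably just a percentile-style calculation over the cohort – a hedged sketch of one way to compute it:

```python
def top_percent(my_score, cohort_scores):
    """'You are in the top x%' figure of the kind the app mock-ups show:
    the share of the cohort scoring at or above you. Purely illustrative –
    the app's actual calculation was not described."""
    at_or_above = sum(s >= my_score for s in cohort_scores)
    return 100.0 * at_or_above / len(cohort_scores)

# e.g. top_percent(90, [50, 60, 70, 80, 90]) -> 20.0
```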

Self-declared data
Equivalent of logging your food in a fitness app

Student set targets

Rewards

Might be able to include tutor set targets in the future – student might have to choose to accept.

Consent Service

This will be built into the app – but over time.  In the first release, the student doesn't give consent for their self-declared data to be shared except where explicitly stated (e.g. sharing with friends).  Later on, more consent options will start to be built into the app.

Demo Course

They have produced a demo Moodle course with beta documentation, linked to a data warehouse so you can try the student app.

————

12:00
Learning analytics: from vision to implementation – Trevor Meek, University of Bradford

Bradford are using business analytics but not yet learning analytics, so were starting from a blank sheet.
There is no common approach to this across HEIs.

Academics need to be engaged with this project, but typically the processing of data and analytics can be a barrier to them.  Once shown the benefits this usually can be changed.

Change management process needs to be followed.
Right from the start you need to provide an impetus and strategic drive / energy.

The analysis of this data can help drive curriculum development but this shouldn’t be the sole driver for change in the curriculum – i.e. let’s use more of the VLE to give us better analytics.

If we have this data we start to become more accountable for the students' progress.  What happens if we get it wrong? We've told students this is going to help them and that we will be alerted when they are falling behind – yet for one student this fails – what then?

Do we all agree on what LA are?
To help students succeed, increased funding to support student preparation?

What’s working / What’s not working – Student retention, auditing, using data to inform the process and practices of HE

How will staff have time to act on this data?  Does the HEI have sufficient support services to handle those flagged as requiring further support?

Challenges:

  1. Acceptance
  2. Variability
  3. Belief
    1. Is this top down?
  4. Buy-in
  5. Costs (/benefit)
  6. Security
  7. Legality
  8. Longevity

Can LA be simple?
Use data to create SSP (Student Success Plan)
Does it need to be Dynamic?

This is Action Research

International Commonality

  • Predicting student potential
  • Identify effective institutional techniques
  • Analysis of online & offline data
  • Analysis of assessment data
  • Testing and evaluation of curricula
Follow IBM predictions of strategic success 2001
What do we need?
  • Strategy
  • Sustainability
  • Belief
  • Use
  • Longevity
    • (process / models)
  • Support
    • IT, Core staff, management, students

Implementation

  • Try something
    • toe in water
  • Planning
  • Test
  • Evaluate
  • Model – local
  • National / International initiatives
  • Research and Innovate
What do we need to deliver?
  • Personalised learning
  • Increased resources
  • Increased reflection
    • Student reflecting on their own learning
  • Behaviour Traits?

————

12:30
Lunch

————

13:30
Updates from early trials with institutions – Michael Webb, Jisc

8 HEIs on discovery
about 6 done by Christmas

50 HEIs signed up with interest

Discussing the discovery phase – details in documentation on https://courses.alpha.jisc.ac.uk/moodle/

If you enter the discovery phase there is no requirement to continue with JISC afterwards.

————

14:30
Implementing the Code of Practice – Niall Sclater, Jisc

https://www.jisc.ac.uk/guides/code-of-practice-for-learning-analytics

Parents in the USA started to get worried about this and formed groups to prevent it.  This is why Jisc wanted to start the UK off on a better footing by developing the above code.

Eight main areas to consider.

We’re now exploring the question “Who is responsible for learning analytics in your HEI?”

Most feel that the main answer would be the executive team (VC et al.), but most also felt that others were responsible for other key bits.

Next question “Should students be asked for consent to use their data for LA?”

The general feeling was that the consent for the collection of the data was already handled but the consent to act on the data and alert the student and tutor might need to be sought.

Next Question “Does someone in your institution understand the algorithms and metrics?”
Generally there should be someone who has a loose grasp on this to make sure it follows a sensible path.

Next Question “Can students access all the data held about them in your institution?”
They should be able to but probably cannot.

Next Question "Should interventions always be human-mediated?"
Not always, but possibly with two levels of alert, where the second or more severe level should be human-mediated.
It might be possible to give the student the ability to choose the level / number of notifications they receive.
————

15:00
Privacy impact assessments (PIA) for learning analytics – Dean Machin, Trilateral Research

What is a PIA – a systematic attempt to understand the privacy risks.
A Process for assessing the impacts on privacy – culminating in a report with recommendations

Why do PIAs

  • Identify material risks, e.g. unthought-of risks
    • Recommended remedial actions
    • Or Just make an organisation aware of risks
  • Legal requirements
  • Recommendation
  • ICO Code of Practice
  • Public assurance
When looking at LAs
Purpose – Retention / Attainment
Is the data collected required – no extra data should be collected that’s not used
Will it be used for other purposes – if so everyone needs to know about it
Who will have access?
Do students know / accept how the data will be used
To Whom will the data be disclosed?

————

15:15
Jisc’s student app for learning analytics – interactive session

Principles:

  • Comparative – seeing how you compare to the class etc.
  • Social
  • Gamified
  • Private by default
  • Usable standalone
  • Uncluttered

Being shown some basic screen shots / mockups of student app. (too far from screen to take a photo)

App will be open source.

————

16:00
Close
