Photo: big-data_conew1 by luckey_sun, licensed CC BY-SA 2.0

7 tips for developing a learning analytics policy

The University of West London (UWL) published its learning analytics policy in September 2016. In this post I share some tips on developing an institutional learning analytics policy. The UWL policy was written and published in three months. The need for speed arose from the equally rapid implementation of a predictive analytics project at the university, which is also outlined below.

1) Focus on purpose

Be clear about what you are trying to achieve and why you need a policy. If I’m honest, the initial impetus behind our policy was that we knew we ought to have one. We were aware of the advice coming from Jisc and the Higher Education Commission. But for me, a key purpose of policy is to set expectations. I saw the development of the policy, at the start of our learning analytics project, as a way of raising awareness among the project team and the wider institution. Learning analytics is a complex area that throws up many ethical issues, so a policy is essential. Within the policy itself we also focused on purpose. The first principle in our policy is Clarity of Purpose: “The overarching purpose for the use of learning analytics at the University is to help students succeed and achieve their study goals.”

2) Don’t re-invent the wheel

Although institutional learning analytics policies are still relatively few and far between, there are a growing number to build on and many have Creative Commons licences. You are very welcome to make use of the UWL Learning Analytics Policy, which itself draws heavily on the Open University’s Ethical use of Student Data for Learning Analytics Policy. There is also a list of policies maintained by the SHEILA Project. We also used Jisc’s Code of practice for learning analytics, and since our policy was published Jisc have produced a model institutional policy and continue to provide invaluable advice, such as this month’s guidance on consent. Another resource I found useful was the LACE Project’s DELICATE checklist.

3) Involve everyone

The use of learning analytics affects many groups across the institution. I strongly recommend involving all stakeholders in the policy development from the start to avoid problems at a later stage. For the policy to be effective and for the first project to succeed it was important to get buy-in. I chaired a cross-university working group that discussed the issues and drafted the policy. It included representatives from the Students’ Union, teaching staff, administrative staff, the University Secretary, our Academic Registrar, representatives from Student Services and IT Services, as well as our PVC (Education).

4) Start now

As soon as you know a learning analytics project is on the horizon, start work on your policy. The discussions can influence and help with project implementation. We were fortunate to be able to develop and gain approval for the policy fairly quickly; I’m well aware it can take much longer elsewhere. The policy is likely to require changes to institutional processes. For example, to meet our principle of informed consent we had to make changes to the students’ terms and conditions. Although we worked quickly, we had done our homework. Several of us had been keeping up to speed with learning analytics through events such as those organised by the Heads of E-Learning Forum (HeLF) and UCISA, as well as the Learning Analytics Jiscmail discussion list.

5) Take the lead

In my experience our community is very well versed in learning analytics and the related ethical issues. We have contributed much to this growing field already and shouldn’t shy away from taking the lead. I wasn’t a learning analytics expert, far from it, and I was somewhat nervous when I stepped up, insisted that the university needed a policy and volunteered to organise it. However, it turns out I did know more than most (if not all) of the other stakeholders at the university. Policy development can often get left behind when there are the practicalities of project implementation to deal with. Someone needs to ensure it happens.

6) Get senior support

The UWL policy couldn’t have been developed and approved without the support of the University’s senior management. As our project was being led by two members of the university’s Directorate, this was straightforward for me. We are also a relatively small institution, which can help get things done more quickly. The policy had to be approved by the University’s Academic Board. If you’ve not introduced policy at your institution before, you’ll need to familiarise yourself with the relevant committees, processes and people.

7) Plan ahead

Although our policy was initiated by a particular project, we wanted a policy that would apply to all future learning analytics activities. It is the UWL Learning Analytics Policy, not the Predictive Learning Analytics Project policy. The policy needed to both set expectations and be a practical tool. We purposely split our policy into two main elements. Part one is a set of 10 broad principles that all learning analytics activities at the university must follow. These cover consent, privacy, transparency, appropriate use, compliance with legislation and so on. The second part of the policy is a set of requirements that must be completed on a project-by-project basis. Central to this is a Project Requirements Form, in which each project must show how it adheres to the ten principles and, crucially, specify the details for that project, including data in scope, access controls and areas of responsibility.

Our policy was approved by Academic Board in September 2016 and we now have two completed requirements forms, covering two sub-projects within our predictive analytics project.

The UWL Predictive Learning Analytics Project

The initial driver behind the policy development was our student success project which is a partnership with Civitas Learning. We are reviewing data from previous and current students to look for patterns that contribute to success. The insights from the data are used by our central Student Engagement Team and by personal tutors to inform conversations with students. We collect and analyse data from three sources: our student records system, Blackboard and our classroom attendance monitoring system.

As one of the three data sources for the project is Blackboard, the TEL team have been involved in the technical side of the project too. The university has invested in Blackboard’s Open Database so that we can easily provide both the historical data and daily updates to the Civitas analytics engine. Our team are also involved in staff training for one of the Civitas tools, Inspire for Advisors (IfA), which gives our personal tutors a view of insights for their personal tutees. It is currently being rolled out to the second of our eight Academic Schools. The IfA interface is simple and the training is not too technical; it focuses more on process, best practice and issues relating to the learning analytics policy, such as appropriate use of the insights.

UWL analytics image created using Icons from

  • Gear by Gan Khoon Lay from the Noun Project
  • Calendar by anbo from the Noun Project
  • Laptop by BenPixels from the Noun Project
  • Information Files by BenPixels from the Noun Project
  • Solution by Gregor Črešnar from the Noun Project

Matt Lingard is the Head of Technology-Enhanced Learning at the University of West London & a Trustee of ALT. Contact: @mattlingard

If you enjoyed reading this article we invite you to join the Association for Learning Technology (ALT) as an individual member, and to encourage your own organisation to join ALT as an organisational or sponsoring member.
