CSU Learning Analytics Strategy

Version 1.3

Table of Contents

1. Introduction

2. Glossary

3. Purpose and Vision

4. Drivers

5. Values

6. Objectives

7. Strategies

A. Gathering learning analytics data

B. Technical architecture

C. Proactive Analytics Strategies

8. Development and review of policies

9. Governance

10. Specific roles and responsibilities

11. Ethical issues and risk mitigation mechanisms

12. High level indicators of successful analytics usage at CSU

1. Introduction

This strategy covers learning analytics and acknowledges the critical role of the learner, the teacher, and course and subject design in the educational process. Note that learning analytics and academic analytics overlap, and a clear distinction may not always be possible. Learning analytics is situated within the general notion of “analytics” (see Glossary below).

Learning analytics can be uni-dimensional, but the goal is to develop multi-dimensional models in which various factors are integrated and analysed systemically, expressing the agency of those involved in learning and teaching systems at all levels.

The Working Party acknowledges that learning is a complex social activity and that technical methods do not fully capture the scope and nuanced nature of learning. It is also critical that the type and nature of interactions are analysed, not merely the number of interactions.

Aspects of learning analytics are dependent on the availability of “big data” and as such draw on research that uses massive amounts of data.

Learning analytics can be provided to students, teaching staff, student support staff, teaching support staff and administrators to support adaptive practice and adaptive systems. It can be used to support a wide range of activities, from day-to-day teaching and learning activities, to institutional support for at-risk students, to the generation of new insights regarding teaching and learning patterns and effectiveness.

Below is a description of possible uses of learning analytics, which is an adaptation of work by George Siemens (2012).[1]

Focus of analytics / Who Benefits?
Items below can be described as “learning analytics”
Subject-level: alignment with learning experience design, social networks, conceptual development, language analysis / Learners, teaching staff, support staff
Aggregate (big data): predictive modelling, patterns of success/failure / Learners, teaching staff, support staff
Items below can be described as “academic analytics”
Institutional: learner profiles, performance of teaching staff, quality of course and subject design, resource allocation / Administrators, IR, funders, marketing, learners
Regional & National (state/provincial): comparisons between systems / Governments, administrators
International: ‘world class universities’ / National governments (OECD)

Internationally there is a progression towards multi-dimensional and systemic learning analytics that, for example, include data external to the LMS and data from support services. Focussed research occurs through the international Society for Learning Analytics Research (SoLAR).

There are a number of types of learning analytics systems (George Siemens, 2012): dashboards; recommender systems; predictive models; and alerts/warnings/interventions.

Analytics can be provided in the following areas (George Siemens, 2012) that may be applied to students or teaching staff:

  • Analytics around social interactions;
  • Analytics around learning content;
  • Analytics in different spaces;
  • Analytics on interaction with the university system;
  • Analytics on intervention and adaptation; and
  • Assessment of analytics.

Please see Section 10 on roles and responsibilities of various players and for more information on developments at CSU.

The working party established by the ILSC to create this strategy comprised: Assoc Prof Philip Uys (convenor) (DSL); Nina Clemson (P&A); Simon Thomson (Office of Student Services); Liz Smith (Academic Support); Paul Bristow (DIT); Nadine Mckeown (DSL); and Assoc Prof Barney Dalgarno (Sub-Dean L&T, Faculty of Education). The working party was supported by Kate Rose (DSL).

Special input was received from Assoc Prof Alan Bain (Faculty of Education).

2. Glossary

Academic analytics / “The application of business intelligence tools and strategies to guide decision-making practices in educational institutions. The goal ... is to help those charged with strategic planning in a learning environment to measure, collect, decipher, report and share data in an effective manner.”
Note that learning analytics and academic analytics are interrelated and that the course and learning designs determine what a university might wish to analyse through academic analytics.
Analytics / “The use of data, statistical analysis, and explanatory and predictive models to gain insights and act on complex issues.” (EDUCAUSE)
AUSSE / Australasian Survey of Student Engagement
BICC / Business Intelligence Competency Centre
Business Intelligence / A set of methods and techniques that are used by organisations for tactical and strategic decision making.
Business Intelligence Competency Centre / An organisational team that has defined tasks, roles, responsibilities and processes for supporting and promoting the effective use of Business Intelligence across an organisation.
CEQ / Course Experience Questionnaire
Community of practice / A group of people who share a craft, a profession and/or an interest.
Data warehousing / A consolidation of data from a variety of sources that is designed to support strategic and tactical decision making.
DWBI / Data warehousing and business intelligence
ILSC / Information and Learning Systems Committee
Learning analytics / Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. (SoLAR) Note that the learner context referred to above includes relevant computer systems, learning experience design, the role of teaching staff as well as learning and teaching support staff.
Master data / Persistent, non-transactional data that defines a business entity for which there is, or should be, an agreed upon view across the organisation.
SES / CSU Student Experience Survey
UES / University Experience Survey

3. Purpose and Vision

Purpose:

  • To provide a university-wide framework to guide the use of learning analytics at CSU, and to improve our understanding of models of learning and teaching at CSU and its performance, in the context of:
  • International and national developments and drivers, contextualised to CSU;
  • Institutional values, objectives and strategic priorities;
  • Relevant institutional policies, governance, roles and responsibilities; and
  • Ethical issues and institutional risks;
  • Including:
  • High level objectives and strategies;
  • High level indicators of successful usage;
  • Infrastructure and human resource issues;
  • Risk mitigation strategies; and
  • Policy and governance issues.

This Strategy informs the CSU Educational Technology Framework, the DIT ICT Strategy in this area and the work of the BI Steering Committee.

Vision:

  • A high-quality, student-centred learning experience supported by subject, course and institutional analytics that reflect the activities of students, teaching staff and support staff.

4. Drivers

External

  1. Remaining competitive in an increasingly global market;
  2. Increasing public accountability and transparency;
  3. Performance in external benchmarking quality indicators such as CEQ, UES, AUSSE and International Student Barometer;
  4. CSU’s Compact with the Government highlights retention in the context of increasingly diverse intakes;
  5. TEQSA is increasingly focussing on course delivery and support processes, requiring increased accountability;
  6. Evidence of successful use of analytics within higher education internationally (e.g. Purdue University) and in other sectors of society (e.g. business); and
  7. Increasing expectations by students as customers of a rich and responsive online environment.

Internal

The need for:

  1. Improved student retention and progress;
  2. More efficient and effective targeting of teaching and support activities and resources;
  3. Improved understanding of the relationship between models of learning and teaching and performance indicators at CSU;
  4. Evidence-based professional learning of teaching staff;
  5. Prioritisation of investment in learning systems and activities; and
  6. Improved overall satisfaction of CSU students.

5. Values

  1. Alignment – we seek to improve alignment between models of learning and teaching and performance indicators.
  2. Supporting student success – we recognise that the analysis of learning and teaching related behaviours and data can provide valuable insights into the student experience. We use this data for the purpose of supporting student progress and retention and promoting teaching excellence.
  3. Student centred – we place students at the centre of the learning experience by accommodating diverse individual characteristics in the learning process; providing them with choice; and allowing them to be active learners who are capable of managing their own learning, including through the use of analytics.
  4. Continuous improvement – we recognise the learning and teaching environment and needs of students change constantly. We reassess our practice on a regular basis to ensure we are collecting and analysing relevant and actionable data that presents significant opportunities for improving the student learning and teaching experience.
  5. Evidence based decision making – we refine learning and teaching and the support of students in response to differing and changing needs and behaviours and ensure that actions are linked to strong evidence.
  6. Respect – we abide by all related confidentiality and privacy regulations and ensure that actions taken as a result of access to student and staff data are carried out in an ethical, supportive and respectful manner.

6. Objectives

Learning and Teaching

  1. To provide feedback to students on their learning interactions and progress, to improve their likelihood of success, their choice of subjects and their self-management of learning, on the basis of models of learning and teaching at CSU.
  2. To provide alerts and reports relating to student activity and progress to teaching and teaching support staff that would enable and inform appropriate intervention strategies on the basis of models of learning and teaching at CSU.
  3. To provide feedback to teaching staff on the effectiveness of their learning designs and learning and teaching interactions on the basis of models of learning and teaching at CSU.
  4. To provide reports to Course Directors to help inform revisions to the learning designs within subjects and courses as part of curriculum renewal and course reviews using the Smart Tools project as appropriate.
  5. To provide alerts and reports to Heads of School that would enable and inform appropriate management interventions and professional development strategies.

Organisational

  1. To support the student experience through the enhancement of University systems and processes.
  2. To provide information to support institutional strategic planning in the learning and teaching area, including indicators to allow the achievement of learning and teaching objectives to be more effectively monitored.
  3. To provide information to support the improvement of relevant administrative processes.
  4. To allow benchmarking of online learning interactions against other institutions.
  5. To support the selection and prioritisation of systems and services through increased awareness of actual usage by students and staff.
  6. To provide dynamic, adaptive systems informed by analytics to support adaptive learning.

7. Strategies

The CSU Learning Analytics Strategy will be underpinned by a whole-of-institution approach that ensures consistency for all students, combined with the flexibility to allow teaching-staff and school-specific initiatives tailored to the needs of specific student cohorts.

Relevant theories, learning and teaching models, analytical methods and technologies will be used to implement the strategies below.

A. Gathering learning analytics data

At an institutional level we will collect and use data related to areas like the following (a sketch of how such signals might be recorded follows the list):

  1. The design of courses and subjects;
  2. Student activities in relation to learning design;
  3. Historical data, such as a student’s Grade Point Average (GPA), the student’s previous performance in subjects, and historical performance rates for the subject;
  4. Student access of key information (e.g. access to online subject outline within first 3 weeks of session);
  5. Student assessment (e.g. failure to submit an assignment, failure of an assignment);
  6. Student engagement in compulsory online activities (e.g. introductory forum posting, downloading of essential readings);
  7. Student feedback on university wide surveys (e.g. OES, SES, CEQ, UES and AUSSE);
  8. Student participation in online orientation and induction activities;
  9. Staff uploading of online subject outlines within required timeframe;
  10. Staff posting welcome/expectations message in Week 1;
  11. Staff turnaround of assignments within an appropriate timeframe (currently within 21 days);
  12. Teaching staff participation in online communication strategies, i.e. forum, chat or online meeting activity; and
  13. Teaching staff access to teaching resources.

Note: These factors will be reviewed on a regular basis by relevant parties, including Academic Support and DSL, to ensure continued relevance.
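To make the collection areas above concrete, below is a minimal sketch of how a single collected signal might be represented as a typed record. All field names and values are hypothetical illustrations, not an actual CSU schema.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class LearningEvent:
        """One collected learning analytics signal (hypothetical schema)."""
        student_id: str                # pseudonymised identifier, not a name
        subject_code: str              # e.g. "EDU101" (illustrative)
        event_type: str                # e.g. "forum_post", "assignment_submit"
        occurred_at: datetime          # when the interaction happened
        source_system: str             # e.g. "LMS", "assessment system"
        detail: Optional[dict] = None  # event-specific payload

    # Illustrative event: a student submitting an assignment two days early
    event = LearningEvent(
        student_id="s-8f2c",
        subject_code="EDU101",
        event_type="assignment_submit",
        occurred_at=datetime(2013, 3, 18, 22, 41),
        source_system="LMS",
        detail={"assignment": 1, "days_before_due": 2},
    )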

Faculties, schools and teaching staff may develop more specific data collection and response strategies according to the individual needs and characteristics of their cohorts, for instance using online dashboards. Such strategies will normally be agreed by the relevant Head of School or Executive Dean, clearly documenting the purpose of the data collection, the method, the organisational responsibilities for the collection, and the strategies that will be put in place as a result. Such strategies should complement rather than duplicate university-wide processes and, where possible, should integrate existing resources, processes and information.

Predictors of student success will be developed based on cohort data and on national and international good practice.

In addition to predictors of success, further indicators need to be identified that describe deeper student engagement and higher levels of learning, such as critical curiosity; meaning making; creativity; resilience; strategic awareness; and the building of learning relationships (Shum, 2012[2]).
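As an illustration of how such a predictor might be built, the sketch below fits a logistic regression over a few invented engagement features. The feature names, data and output are hypothetical; an actual predictor would be derived from cohort data and validated against national and international good practice, as noted above.

    # Illustrative only: a simple predictor of student success using
    # logistic regression over invented engagement features.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical per-student features: [logins_per_week, forum_posts, gpa]
    X = np.array([
        [5.0, 12, 5.5],
        [0.5,  0, 3.9],
        [3.0,  4, 4.8],
        [7.5, 20, 6.2],
        [1.0,  1, 4.0],
    ])
    y = np.array([1, 0, 1, 1, 0])  # 1 = passed the subject, 0 = did not

    model = LogisticRegression().fit(X, y)

    # Estimated probability of success for a new (hypothetical) profile
    p_success = model.predict_proba([[2.0, 3, 4.5]])[0, 1]
    print(f"Estimated probability of success: {p_success:.2f}")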

B. Technical architecture

Source Systems

  1. All University information systems should be considered as potential analytics source systems unless specifically excluded.
  2. Learning Management and related systems acquired or built by the University (such as Smart Tools and other course design and management tools) must capture data relevant to learning and academic analytics.
  3. Learning Management and related systems acquired or built by the University must provide a reasonable level of ‘operational’ reporting for student and teaching users.
  4. Learning Management and related systems acquired or built by the University must provide direct access to captured learning and academic analytics data for extraction into other systems, such as a data warehouse or other analytic tools.

Enterprise Analytics Systems

  1. Learning and teaching data should be processed and consumed as per the pipeline below. That is, learning and teaching data may exist in transactional data tables, unstructured files or other sources. All data should be extracted from its source location and transformed into data warehouse structures. In most cases, users and tools should consume learning and teaching data from the data warehouse and its associated reporting functionality where needs are not met by the application’s own reports (a minimal sketch of this pipeline follows this list).
  2. Data warehouse structures should ensure that data is stored at the lowest appropriate level of granularity and that dimensions are ‘conformed’ to other student-related data structures, i.e. data needs to be structurally aligned with other university data.
  3. Where application reports exist, data warehouse reports should, as far as practicable, be consistent with the results displayed in source system reports.

  4. Learning Management and related systems could be capable of sophisticated learning analytics that parallel some of the stages in the pipeline described above.
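A minimal sketch of the extract-transform-load flow described in item 1 is given below, using an in-memory SQLite database as a stand-in for both the source system and the data warehouse. All table and column names are hypothetical, not actual CSU structures.

    # Minimal extract-transform-load (ETL) sketch for the pipeline in item 1.
    # SQLite stands in for the source LMS tables and the data warehouse.
    import sqlite3

    src = sqlite3.connect(":memory:")  # stand-in source system
    dw = sqlite3.connect(":memory:")   # stand-in data warehouse

    # Source: a transactional LMS activity table
    src.execute("CREATE TABLE lms_activity (student_id TEXT, subject TEXT, "
                "action TEXT, ts TEXT)")
    src.execute("INSERT INTO lms_activity VALUES ('s-8f2c', 'EDU101', "
                "'forum_post', '2013-03-18')")

    # Warehouse: a fact table at the lowest level of granularity, keyed on
    # conformed student/subject dimensions (greatly simplified here)
    dw.execute("CREATE TABLE fact_learning_event (student_key TEXT, "
               "subject_key TEXT, event_type TEXT, event_date TEXT)")

    # Extract from the source, transform (rename/shape), load into the warehouse
    for student_id, subject, action, ts in src.execute(
            "SELECT student_id, subject, action, ts FROM lms_activity"):
        dw.execute("INSERT INTO fact_learning_event VALUES (?, ?, ?, ?)",
                   (student_id, subject, action, ts))
    dw.commit()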

Enterprise Reports

  1. A suite of standard reports should be developed in consultation with relevant groups and may be embedded within the Learning Management system if the system does not provide appropriate reporting functionality. Additional reports and ad hoc requests will be developed on an as-needs basis.

Access To Analytic Data

  1. Access to data will be controlled by roles, with appropriate access privileges set depending on the sensitivity of the data and the role of the individual. Policies need to be developed to clarify the approvals required for access to data at various levels. Access to analytics systems should provide audit trails of what data is accessed by whom.
  2. Where reports do not require the identification of individuals, data will be aggregated and anonymised to unlink personal identification from the captured data; the core data set will, however, retain the identification data (see the sketch following this list).
  3. A community of practice should be responsible for providing contextual information regarding teaching and learning data and reports, such as the meaning of and limitations on each data element and the appropriate use and interpretation of analytics, and for referring questions to subject matter experts as appropriate.
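Below is a minimal sketch of the aggregation and anonymisation described in item 2: report views are either aggregated so that no individual is identifiable, or pseudonymised with a one-way hash, while the core data set retains the real identifiers. The data, field names and salt are invented for illustration.

    # Sketch of item 2: aggregate and anonymise report data while the
    # core data set retains identification. All data here is invented.
    import hashlib
    from collections import Counter

    core_data = [  # core data set: retains real identifiers
        {"student_id": "11223344", "subject": "EDU101", "logins": 14},
        {"student_id": "55667788", "subject": "EDU101", "logins": 2},
        {"student_id": "99001122", "subject": "ACC200", "logins": 7},
    ]

    def pseudonymise(student_id: str, salt: str = "per-report-secret") -> str:
        """One-way hash so a report row cannot be linked back to a person."""
        return hashlib.sha256((salt + student_id).encode()).hexdigest()[:8]

    # Aggregated, de-identified view for reports that do not need individuals
    logins_per_subject = Counter()
    for row in core_data:
        logins_per_subject[row["subject"]] += row["logins"]
    print(dict(logins_per_subject))  # {'EDU101': 16, 'ACC200': 7}

    # Pseudonymised row-level view where granularity is needed without identity
    report_rows = [{"student": pseudonymise(r["student_id"]),
                    "subject": r["subject"], "logins": r["logins"]}
                   for r in core_data]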

Requesting Analytics Services

This section relates to the creation of reporting channels as well as access to such channels.

  1. Source system reporting tools, e.g. LMS dashboards, traffic lights.
  2. Faculty or School requests authorised by Executive Dean or Head of School for analytics from DSL.
  3. Academic Support requests from P&A.
  4. Committee and executive requests.
  5. Self-serve access to authorised data using pre-built data cubes and visualisation/reporting tools.
  6. Custom requests may frequently evolve into standard reporting services.

C. Proactive Analytics Strategies

In addition to the provision of a variety of reports for students, subject coordinators, course directors, heads of school and teaching support staff, relating to individual or collective student interactions and progress and to staff activities, a number of types of alerts will be generated, delivered either via email or inside the LMS. Some categories of alert will be generated automatically in all cases, while other categories will be configurable by either the student or the subject coordinator.

The following table shows a non-exhaustive list of examples of the types of alerts that could be further developed through close consultation with stakeholders, especially the faculties. In each case, policies need to be developed to make clear whether the display of such alerts is governed by opt-in or opt-out decisions. A sketch of one such alert rule follows the table.

Trigger for alert / How delivered / To whom / Degree to which configurable
Issues with the design of courses and subjects / Through the Smart Tools system / HOS, Subject coordinator / Always generated
Non-reading of student forum postings by subject coordinator for a period of 3 weeks / Message / HOS, Subject coordinator / Always generated
Unanswered messages on subject forum for a period of one week / Message / Subject coordinator / Configurable
Non-access of subject outline by student 2 weeks into the session / Message / Student / Always generated
Non-submission of an assignment by the due date / Message / Student, International Student Support Officer / Configurable
Non-access by a student to a resource marked as important by the subject coordinator / Message within LMS / Student / Configurable
Non-completion of a course or subject survey / Message / Student / Always generated
Learning activity profile indicating that student is at risk of failure / Traffic indicator within LMS and perhaps also Email depending on severity / Student, Subject coordinator / Always generated
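As an illustration, the sketch below evaluates one alert rule from the table (non-submission of an assignment by the due date), honouring a configurable opt-in/opt-out setting before generating the message. The function name, record fields and configuration key are hypothetical.

    # Sketch of one configurable alert rule from the table above.
    from datetime import date

    def check_non_submission(student, assignment, today, config):
        """Return an alert message if the rule fires, else None."""
        if not config.get("non_submission_alerts", True):
            return None  # the recipient has opted out of this alert
        if today > assignment["due"] and not student["submitted"]:
            return (f"Assignment '{assignment['name']}' was due "
                    f"{assignment['due']} and has not been submitted.")
        return None

    student = {"name": "A. Student", "submitted": False}
    assignment = {"name": "Essay 1", "due": date(2013, 4, 1)}
    alert = check_non_submission(student, assignment, date(2013, 4, 3),
                                 config={"non_submission_alerts": True})
    if alert:
        print(alert)  # stand-in for email or in-LMS message delivery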

8. Development and review of policies

Policies around analytics need to be deeply embedded in the core policies of CSU around all of its learning and teaching functions. Possible relationships between learning analytics and a number of CSU policies are identified below. Critical policy areas include student access to analytics; use of historical data; privacy; access to information; and collection and storage of data.