Learning analytics

Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.[1] A related field is educational data mining. General-audience introductions are available from the EDUCAUSE Learning Initiative and UNESCO.[2][3][4]

What is Learning Analytics?

The definition and aims of Learning Analytics are contested. One earlier definition discussed by the community suggested that "Learning analytics is the use of intelligent data, learner-produced data, and analysis models to discover information and social connections for predicting and advising people's learning." [5]

But this definition has been criticised:

  1. "I somewhat disagree with this definition - it serves well as an introductory concept if we use analytics as a support structure for existing education models. I think learning analytics - at an advanced and integrated implementation - can do away with pre-fab curriculum models". George Siemens, 2010.[6]
  2. "In the descriptions of learning analytics we talk about using data to "predict success". I've struggled with that as I pore over our databases. I've come to realize there are different views/levels of success." Mike Sharkey 2010.[7]

A more holistic view than a single definition is offered by the learning analytics framework of Greller and Drachsler (2012).[8] It uses general morphological analysis (GMA) to divide the domain into six "critical dimensions".

A systematic overview of learning analytics and its key concepts is provided by Chatti et al. (2012)[9] and Chatti et al. (2014)[10] through a reference model for learning analytics based on four dimensions: data, environments and context (what?); stakeholders (who?); objectives (why?); and methods (how?).

It has been pointed out that there is a broad awareness of analytics across educational institutions for various stakeholders, but that the way 'learning analytics' is defined and implemented may vary, including:

  1. for individual learners to reflect on their achievements and patterns of behaviour in relation to others;
  2. as predictors of students requiring extra support and attention;
  3. to help teachers and support staff plan supporting interventions with individuals and groups;
  4. for functional groups such as course teams seeking to improve current courses or develop new curriculum offerings; and
  5. for institutional administrators taking decisions on matters such as marketing and recruitment or efficiency and effectiveness measures.[11]

In that briefing paper, Powell and MacNeill go on to point out that some motivations and implementations of analytics may come into conflict with others, for example highlighting potential conflict between analytics for individual learners and organisational stakeholders.[11]

Gašević, Dawson, and Siemens argue that computational aspects of learning analytics need to be linked with existing educational research if the field of learning analytics is to deliver on its promise to understand and optimize learning.[12]

Differentiating Learning Analytics and Educational Data Mining

Differentiating the fields of educational data mining (EDM) and learning analytics (LA) has been a concern of several researchers. George Siemens takes the position that educational data mining encompasses both learning analytics and academic analytics,[13] the latter of which is aimed at governments, funding agencies, and administrators rather than learners and faculty. Baepler and Murdoch define academic analytics as an area that "...combines select institutional data, statistical analysis, and predictive modeling to create intelligence upon which learners, instructors, or administrators can change academic behavior".[14] They go on to attempt to disambiguate educational data mining from academic analytics based on whether or not the process is hypothesis driven, though Brooks[15] questions whether this distinction exists in the literature. Brooks[15] instead proposes that a better distinction between the EDM and LA communities lies in the roots of each community: authorship in the EDM community is dominated by researchers coming from intelligent tutoring paradigms, while learning analytics researchers focus more on enterprise learning systems (e.g. learning content management systems).

Regardless of the differences between the LA and EDM communities, the two areas overlap significantly, both in the objectives of investigators and in the methods and techniques used in investigation. In the MS program in Learning Analytics at Teachers College, Columbia University, students are taught both EDM and LA methods.[16]

History

The Context of Learning Analytics

(This section is adapted, under CC-BY-SA, from the EdFutures.net wiki.)

In “The State of Learning Analytics in 2012: A Review and Future Challenges”, Rebecca Ferguson[17] traces the development of learning analytics through:

  1. The increasing interest in 'big data' for business intelligence
  2. The rise of online education focussed around Virtual Learning Environments (VLEs), Content Management Systems (CMSs), and Management Information Systems (MIS) for education, which saw an increase in digital data regarding student background (often held in the MIS) and learning log data (from VLEs). This development afforded the opportunity to apply 'business intelligence' techniques to educational data
  3. Questions about optimising systems to support learning, in particular: how can we know whether a student is engaged or understanding if we can’t see them?
  4. Increasing focus on evidencing progress and professional standards for accountability systems
  5. This focus gave teachers a stake in analytics, given their association with accountability systems
  6. Thus an increasing emphasis was placed on the pedagogic affordances of learning analytics
  7. This pressure is increased by the economic desire to improve engagement in online education in order to deliver high-quality, affordable education

History of The Techniques and Methods of Learning Analytics

In a discussion of the history of analytics, Cooper [18] highlights a number of communities from which learning analytics draws techniques, including:

  1. Statistics - a well-established means of addressing hypothesis testing
  2. Business Intelligence - which has similarities with learning analytics, although it has historically been targeted at making the production of reports more efficient through enabling data access and summarising performance indicators.
  3. Web analytics - tools such as Google Analytics report on web page visits and references to websites, brands and other key terms across the internet. The finer-grained of these techniques can be adopted in learning analytics to explore student trajectories through learning resources (courses, materials, etc.).
  4. Operational research - aims at design optimisation for maximising objectives through the use of mathematical models and statistical methods. Such techniques are implicated in learning analytics approaches that seek to create models of real-world behaviour for practical application.
  5. Artificial intelligence and Data mining - Machine learning techniques built on data mining and AI methods are capable of detecting patterns in data. In learning analytics such techniques can be used for intelligent tutoring systems, classification of students in more dynamic ways than simple demographic factors, and resources such as 'suggested course' systems modelled on collaborative filtering techniques.
  6. Social Network Analysis - SNA analyses relationships between people by exploring implicit (e.g. interactions on forums) and explicit (e.g. 'friends' or 'followers') ties, online and offline. SNA developed from the work of sociologists like Wellman and Watts, and mathematicians like Barabási and Strogatz. Their work has provided a good sense of the patterns that networks exhibit (small world, power laws), the attributes of connections (in the early 1970s, Granovetter explored connections in terms of tie strength and impact on new information), and the social dimensions of networks (for example, geography still matters in a digital networked world). SNA is particularly used to explore clusters of networks, influence networks, and engagement and disengagement, and has been deployed for these purposes in learning analytics contexts.
  7. Information visualization - visualisation is an important step in many analytics for sensemaking around the data provided - it is thus used across most techniques (including those above).[18]
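As a minimal illustration of the SNA techniques above, the following sketch (hypothetical forum data; no particular SNA tool assumed) builds an undirected interaction network from forum reply pairs and computes normalised degree centrality, one of the simplest measures used to spot central or disengaged participants:

```python
from collections import defaultdict

def degree_centrality(interactions):
    """Normalised degree centrality from (poster, replier) pairs.

    Each pair creates an undirected tie; a person's centrality is the
    fraction of the other participants they are directly connected to.
    """
    neighbours = defaultdict(set)
    for a, b in interactions:
        if a != b:
            neighbours[a].add(b)
            neighbours[b].add(a)
    n = len(neighbours)
    if n <= 1:
        return {p: 0.0 for p in neighbours}
    return {p: len(ties) / (n - 1) for p, ties in neighbours.items()}

# Hypothetical forum reply log: (original poster, replier)
replies = [("ana", "ben"), ("ana", "cho"), ("ben", "cho"), ("ana", "dee")]
centrality = degree_centrality(replies)
# "ana" is tied to all three other participants, so her centrality is 1.0
```

In practice, dedicated SNA toolkits add richer measures (betweenness, clustering, community detection) over the same kind of interaction data.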

History of Learning Analytics in Higher Education

The first graduate program focused specifically on learning analytics was created by Ryan S. Baker and launched in the Fall 2015 semester at Teachers College - Columbia University. The program description states that "data about learning and learners are being generated today on an unprecedented scale. The fields of learning analytics (LA) and educational data mining (EDM) have emerged with the aim of transforming this data into new insights that can benefit students, teachers, and administrators. As one of world's leading teaching and research institutions in education, psychology, and health, we are proud to offer an innovative graduate curriculum dedicated to improving education through technology and data analysis."[19]

Analytic Methods

Methods for learning analytics include:
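One widely used family of methods, predictive modelling of students who may need extra support (a use highlighted earlier), can be sketched as a simple logistic score over hypothetical VLE activity features. This is purely illustrative: the feature names, weights, and bias are invented for the example, not fitted to any data:

```python
import math

def at_risk_probability(logins_per_week, assignments_submitted,
                        weights=(-0.4, -0.9), bias=3.0):
    """Illustrative logistic model: more activity lowers predicted risk.

    Weights and bias are invented for this sketch, not fitted values.
    """
    z = bias + weights[0] * logins_per_week + weights[1] * assignments_submitted
    return 1 / (1 + math.exp(-z))

# A student with little VLE activity scores as higher risk
# than a frequently active one.
inactive = at_risk_probability(logins_per_week=1, assignments_submitted=0)
active = at_risk_probability(logins_per_week=10, assignments_submitted=4)
assert inactive > active
```

Real systems fit such weights from historical outcome data and typically combine many more behavioural and demographic features.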

Analytic outcomes

Analytics have been used for:

Software

Much of the software that is currently used for learning analytics duplicates functionality of web analytics software, but applies it to learner interactions with content. Social network analysis tools are commonly used to map social connections and discussions (see Social network analysis software). Some examples of learning analytics software tools:
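The web-analytics-style functionality described above can be sketched with a minimal example (hypothetical event log; no specific tool's API assumed) that aggregates learner interactions per content item, much as a web analytics package counts page views:

```python
from collections import Counter

def views_per_resource(events):
    """Count 'view' events per learning resource - a web-analytics-style
    report applied to learner-content interactions."""
    return Counter(e["resource"] for e in events if e["action"] == "view")

# Hypothetical VLE clickstream
log = [
    {"learner": "s1", "action": "view", "resource": "week1-video"},
    {"learner": "s2", "action": "view", "resource": "week1-video"},
    {"learner": "s1", "action": "post", "resource": "forum"},
    {"learner": "s2", "action": "view", "resource": "week2-quiz"},
]
report = views_per_resource(log)
# week1-video has 2 views, week2-quiz has 1; the forum post is not a view
```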

Ethics & Privacy

The ethics of data collection, analytics, reporting and accountability has been raised as a potential concern for Learning Analytics (e.g.,[8][25][26]), with concerns raised regarding:

As Kay, Korn and Oppenheim point out, the range of data is wide, potentially derived from: "Recorded activity; student records, attendance, assignments, researcher information (CRIS)."

Thus the legal and ethical situation is challenging and differs from country to country, raising implications for: "Variety of data - principles for collection, retention and exploitation."

In some prominent cases, such as the inBloom disaster, even fully functional systems have been shut down due to lack of trust in the data collection by governments, stakeholders and civil rights groups. Since then, the learning analytics community has extensively studied legal conditions in a series of expert workshops on 'Ethics & Privacy 4 Learning Analytics' aimed at establishing trusted learning analytics. Drachsler and Greller released an 8-point checklist named DELICATE, based on intensive study in this area, to demystify the ethics and privacy discussions around learning analytics:[28]

  1. D-etermination: Decide on the purpose of learning analytics for your institution.
  2. E-xplain: Define the scope of data collection and usage.
  3. L-egitimate: Explain how you operate within the legal frameworks, refer to the essential legislation.
  4. I-nvolve: Talk to stakeholders and give assurances about the data distribution and use.
  5. C-onsent: Seek consent through clear consent questions.
  6. A-nonymise: De-identify individuals as much as possible
  7. T-echnical aspects: Monitor who has access to data, especially in areas with high staff turn-over.
  8. E-xternal partners: Make sure externals provide highest data security standards

The checklist shows ways to design and provide privacy-conforming learning analytics that can benefit all stakeholders. The full DELICATE checklist is publicly available.

Open Learning Analytics

Chatti, Muslim and Schroeder[29] note that the aim of Open Learning Analytics (OLA) is to improve learning effectiveness in lifelong learning environments. The authors refer to OLA as an ongoing analytics process that encompasses diversity across all four dimensions of the learning analytics reference model.[9]

References

  1. "Call for Papers of the 1st International Conference on Learning Analytics & Knowledge (LAK 2011)". Retrieved 12 February 2014.
  2. ELI (2011). "Seven Things You Should Know About First Generation Learning Analytics". EDUCAUSE Learning Initiative Briefing.
  3. Long, P.; Siemens, G. (2011). "Penetrating the fog: analytics in learning and education". Educause Review Online. 46 (5): 31–40.
  4. Buckingham Shum, Simon (2012). Learning Analytics Policy Brief (PDF). UNESCO.
  5. Siemens, George. "What Are Learning Analytics?" Elearnspace, August 25, 2010. http://www.elearnspace.org/blog/2010/08/25/what-are-learning-analytics/
  6. George Siemens in the Learning Analytics Google Group discussion, August 2010
  7. Mike Sharkey - Director of Academic Analytics, University of Phoenix, in the Learning Analytics Google Group discussion, August 2010
  8. Greller, Wolfgang; Drachsler, Hendrik (2012). "Translating Learning into Numbers: Toward a Generic Framework for Learning Analytics" (PDF). Educational Technology and Society. 15 (3): 42–57.
  9. Mohamed Amine Chatti, Anna Lea Dyckhoff, Ulrik Schroeder and Hendrik Thüs (2012). A reference model for learning analytics. International Journal of Technology Enhanced Learning (IJTEL), 4(5/6), pp. 318-331.
  10. Chatti, M. A., Lukarov, V., Thüs, H., Muslim, A., Yousef, A. M. F., Wahid, U., Greven, C., Chakrabarti, A., Schroeder, U. (2014). Learning Analytics: Challenges and Future Research Directions. eleed, Iss. 10. http://eleed.campussource.de/archive/10/4035
  11. Powell, Stephen, and Sheila MacNeill. Institutional Readiness for Analytics: A Briefing Paper. CETIS Analytics Series. JISC CETIS, December 2012. http://publications.cetis.ac.uk/wp-content/uploads/2012/12/Institutional-Readiness-for-Analytics-Vol1-No8.pdf.
  12. Gašević, D.; Dawson, S.; Siemens, G. (2015). "Let's not forget: Learning analytics are about learning" (PDF). TechTrends. 59 (1): 64–71. doi:10.1007/s11528-014-0822-x.
  13. G. Siemens, D. Gasevic, C. Haythornthwaite, S. Dawson, S. B. Shum, R. Ferguson, E. Duval, K. Verbert, and R. S. J. D. Baker. Open Learning Analytics: an integrated & modularized platform. 2011.
  14. Baepler, P.; Murdoch, C. J. (2010). "Academic Analytics and Data Mining in Higher Education". International Journal for the Scholarship of Teaching and Learning. 4 (2).
  15. C. Brooks. A Data-Assisted Approach to Supporting Instructional Interventions in Technology Enhanced Learning Environments. PhD Dissertation. University of Saskatchewan, Saskatoon, Canada, 2012.
  16. "Learning Analytics | Teachers College Columbia University". www.tc.columbia.edu. Retrieved 2015-10-13.
  17. Ferguson, Rebecca. The State of Learning Analytics in 2012: A Review and Future Challenges. Technical Report. Knowledge Media Institute: The Open University, UK, 2012. http://kmi.open.ac.uk/publications/pdf/kmi-12-01.pdf
  18. Cooper, Adam. A Brief History of Analytics: A Briefing Paper. CETIS Analytics Series. JISC CETIS, November 2012. http://publications.cetis.ac.uk/wp-content/uploads/2012/12/Analytics-Brief-History-Vol-1-No9.pdf.
  19. "Learning Analytics". www.tc.columbia.edu. Retrieved 2015-11-03.
  20. Buckingham Shum, S. and Ferguson, R., Social Learning Analytics. Educational Technology & Society (Special Issue on Learning & Knowledge Analytics, Eds. G. Siemens & D. Gašević), 15, 3, (2012), 3-26. http://www.ifets.info Open Access Eprint: http://oro.open.ac.uk/34092
  21. Brown, M., Learning Analytics: Moving from Concept to Practice. EDUCAUSE Learning Initiative Briefing, 2012. http://www.educause.edu/library/resources/learning-analytics-moving-concept-practice
  22. Buckingham Shum, S. and Deakin Crick, R., Learning Dispositions and Transferable Competencies: Pedagogy, Modelling and Learning Analytics. In: Proc. 2nd International Conference on Learning Analytics & Knowledge (Vancouver, 29 Apr-2 May 2012). ACM: New York. pp.92-101. doi:10.1145/2330601.2330629 Eprint: http://oro.open.ac.uk/32823
  23. Ali, L.; Hatala, M.; Gaševic, D.; Jovanovic, J. (2012). "A qualitative evaluation of evolution of a learning analytics tool" (PDF). Computers & Education. 58 (1): 470–489. doi:10.1016/j.compedu.2011.08.030.
  24. Ali, L.; Asadi, M.; Gaševic, D.; Jovanovic, J.; Hatala, M. (2013). "Factors influencing beliefs for adoption of a learning analytics tool: An empirical study" (PDF). Computers & Education. 62: 130–148. doi:10.1016/j.compedu.2012.10.023.
  25. Slade, Sharon and Prinsloo, Paul "Learning analytics: ethical issues and dilemmas" in American Behavioral Scientist (2013), 57(10), pp. 1509-1528. http://oro.open.ac.uk/36594
  26. Siemens, G. "Learning Analytics: Envisioning a Research Discipline and a Domain of Practice." In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, 4–8, 2012. http://dl.acm.org/citation.cfm?id=2330605.
  27. Kristy Kitto,Towards a Manifesto for Data Ownership http://www.laceproject.eu/blog/towards-a-manifesto-for-data-ownership/
  28. Drachsler, H. & Greller, W. (2016). Privacy and Analytics – it’s a DELICATE issue. A Checklist to establish trusted Learning Analytics. 6th Learning Analytics and Knowledge Conference 2016, April 25–29, 2016, Edinburgh, UK.
  29. Mohamed Amine Chatti, Arham Muslim, and Ulrik Schroeder (2017). Toward an Open Learning Analytics Ecosystem. In Big Data and Learning Analytics in Higher Education (pp. 195-219). Springer International Publishing.
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.