More and more trainers and facilitators use online platforms to facilitate learning. This is akin to face-to-face facilitation, but it also has its own dynamics. For example, how do you know whether an online article has been read? Or how much time a participant spends viewing a video or doing an assignment? Which participants connect online? How do you judge whether valuable conversations took place? In a face-to-face setting you can ask participants process questions: "Do you need more time for this discussion?" or "An extra assignment about ... seems to make sense." You can observe interactions, sense the group's enthusiasm, and adjust the learning process gradually as a result.
In an online environment you miss these kinds of observations. Online, however, some data can be helpful: data collected by the system, such as the degree of online activity, the time spent viewing a source, the level reached, or the length of discussions. The value of these data lies in analyzing and interpreting them: what we call learning analytics. We recently experienced the power of learning analytics and gladly share this experience with you.
The case: the Food and Nutrition Security course
We designed and facilitated a five-week online course (a so-called SPOC: Small Private Online Course), "Food and Nutrition Security". About 90 people took part, and we worked with them in Curatr, a learning platform that supports social learning. Four of the five weeks had a specific theme; the fifth week was reserved for working on practical cases brought forward by participants and experts. Participants received a certificate after earning a minimum number of points per week and writing a case reflection. Nineteen people received the certificate.
Online monitoring: using your intuition
During the course, our most important sources of information were feedback from participants on the online platform and e-mail messages, about content as well as process. This gave us a picture of important issues, topics that led to discussion, and effective learning activities. For instance, we received many compliments on the weekly interactive webinars. Another example: after a software upgrade, technical problems arose that discouraged participants. And a number of participants told us that they appreciated the content very much but ran into trouble with new ad hoc tasks at work. By following the discussions and interacting explicitly with a number of participants, we developed a feel for the quality of the learning process. A group of enthusiastic participants began to emerge as a core group. Intuitively we felt that we were on the right track with this online learning program. But ... what could the data tell us?
Online monitoring: analytics
Behind the scenes, Curatr collects a lot of data using the Experience API (xAPI). Beyond general data such as name, surname, e-mail address, organization and function, you also get very good insight into 'user' and 'usage' data. What is the difference between the two?
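To give an idea of what such a record looks like: xAPI stores every event as an "actor did verb on object" statement. Below is a minimal sketch built as a plain Python dict; the participant name, e-mail address and course URL are made-up examples, while the verb ID is one of the standard ADL verbs.

```python
import json

# A minimal xAPI statement: "actor did verb on object".
# Participant details and the object URL are hypothetical examples;
# the verb ID comes from the public ADL verb list.
statement = {
    "actor": {
        "name": "Jane Example",                 # hypothetical participant
        "mbox": "mailto:jane@example.org",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/commented",
        "display": {"en-US": "commented"},
    },
    "object": {
        "id": "http://example.org/spoc/week2/discussion-1",  # hypothetical URL
        "definition": {"name": {"en-US": "Week 2 discussion"}},
    },
}

print(json.dumps(statement, indent=2))
```

A learning platform generates thousands of such statements; the user and usage data discussed below are views over this stream.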
User data are all the qualitative data you collect within the platform:
- All threads started by participants;
- All reactions posted in the various discussions;
- All sources (videos, PDFs, blogs, websites) added by participants;
- All responses to the learning activities offered (e.g. reflection questions, quizzes, open questions).
Usage data are the quantitative data (see illustration):
- How many comments have been posted by participants;
- How many resources have been added by a participant;
- How many comments have been liked, and by whom;
- Who reached which level, and when;
- The number of participants who completed the SPOC.
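Counts like these can be derived directly from the raw activity records. A minimal sketch, with an invented activity log standing in for a real platform export:

```python
from collections import Counter

# Invented activity log: (participant, action) pairs,
# as a learning platform might export them.
events = [
    ("anna", "comment"), ("anna", "comment"), ("anna", "add_resource"),
    ("bert", "comment"), ("bert", "like"),
    ("carla", "add_resource"), ("carla", "comment"),
]

comments_per_participant = Counter(p for p, a in events if a == "comment")
resources_per_participant = Counter(p for p, a in events if a == "add_resource")

print(comments_per_participant)
print(resources_per_participant)
```

The same pattern scales from these toy tallies to any of the usage metrics above.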
Analyzing these data can be done at increasing levels of depth:
- Visualize: you look at the data as such, for instance: 100 participants enrolled in the SPOC, 20 participants completed the quiz in level one. These quantitative data are generally easy to generate from a learning platform.
- Clustering: the data you see can be clustered: all activities of one participant, all threads that were started, all sources that were added. Together they might form a cluster such as 'involvement' or 'quality'.
- Relationships: for instance, "after a mail from the facilitator, activity on the learning platform went up," or "the most important activity on the learning platform is discussing with each other."
- Patterns: activities that repeat themselves over time. For example, by looking at different SPOCs we see that 80% of participants stop watching a video after three minutes.
- Actions for the future: based on the pattern 'participants stop a video after three minutes', we may decide to use only videos shorter than that, or to ensure that the core of the message is conveyed within the first three minutes of the video.
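A pattern such as the three-minute drop-off can be checked with a few lines over the viewing durations. The durations below are invented for illustration:

```python
# Invented viewing durations in seconds, one entry per video view.
durations = [45, 120, 170, 200, 185, 60, 610, 150, 95, 175]

CUTOFF = 180  # three minutes
stopped_early = sum(1 for d in durations if d < CUTOFF)
share = stopped_early / len(durations)

print(f"{share:.0%} of views stopped before three minutes")  # → 70% here
```

If the share is consistently high across courses, that is the kind of repeated pattern that justifies an action such as shortening videos.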
- Most of the participants who ended high on the leaderboard earned their points by contributing to the discussions. This group was also very visible to us, and we had them 'spotted'. However, there turned out to be two participants who earned quite a few points purely by studying sources. They had not responded online and had therefore remained invisible to us, even though they actively participated in the course.
- We were also curious about the sources and questions that entice good discussions. 'Good' is a qualitative concept and difficult to measure, but some threads stood out because of the large number of responses from participants. We analyzed those discussions and discovered that the striking aspect was the contribution of an expert/coach: the fact that the expert herself was involved in the discussion, asking questions and giving her opinion, made the discussion very lively.
- A diverse group took part in this course, and we wondered whether we would see differences in participation between certain "groups" (e.g. working in the Netherlands versus working in Asia, or working for NGOs versus working for governments). This was certainly the case: some "groups" contributed much more explicitly to the social learning than others. Important to know when designing new courses! In the illustration, you can see the visualized total scores of the different groups.
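Comparing participation between groups comes down to aggregating activity counts per group label. A sketch with invented records (the names, groups and counts are made up):

```python
from collections import defaultdict

# Invented records: (participant, group, number_of_contributions)
records = [
    ("anna", "NGO", 14), ("bert", "NGO", 9),
    ("carla", "government", 3), ("dirk", "government", 5),
]

totals = defaultdict(int)   # total contributions per group
counts = defaultdict(int)   # number of participants per group
for _, group, contributions in records:
    totals[group] += contributions
    counts[group] += 1

for group in sorted(totals):
    print(group, totals[group] / counts[group])  # average contributions per participant
```

Averaging per participant, rather than summing, keeps a large group from looking active simply because it is large.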
- Using learning analytics can be very supportive of your role as an online facilitator. You can find confirmation for possible interventions. In the beginning, as an online facilitator, you may be scared or disappointed by the fact that not all participants are active online. Why is that? Is it unclarity or disinterest? If the data show that participants have accessed the available resources but have not commented on the platform, you can react differently and try to stimulate reactions, for instance by specifically inviting those who did not respond to do so.
- It is valuable to combine data from the 'system' with your own feelings and observations, such as participants' reactions by e-mail. Observations and analytics can thus reinforce each other.
- Before you start the learning process, make an inventory of your questions and possible indicators and numbers. What would you like to see? Are you happy with 10 active participants? Which sources are you doubtful about? If you do this before the start, you will know better what information will help you gain insights during the course.
- Make a plan for how often you want to use and analyze data during the course. In the 'Food and Nutrition Security' case we analyzed more deeply after the course. Of course we kept an eye on the obvious data during the course (who logged in and who contributed), but we could have put the data to better use. An example: afterwards we compared the activity of the various user groups. We could have done this earlier and spent more energy involving some of the groups.
- The learning analytics described here took place at course level. You can also go one step further and compare the data from this course with data from other courses. The results of such an analysis can contribute to conclusions that fuel future designs.
- Take time to access the data. Plan it in! Learning analytics is still a relatively new activity for many online facilitators. And certainly if you want to monitor during the course by means of data, it requires a regular look at the dashboard, plus analysis and judgment to determine your next interventions.
- And ... start conversations with the people involved to interpret the data! Look at the data with curiosity and discuss them with those involved. Feeling and intuition remain crucial. Sometimes data offer new perspectives, but often they just confirm a certain intuition; this can then be a catalyst for taking action. In our case we had weekly progress discussions, which might have benefitted from some data interpretation.
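One of the tips above, spotting participants who viewed resources but never commented, can be sketched as a simple set difference over the activity log. The names below are invented:

```python
# Invented activity log: who viewed at least one resource, who commented.
viewed = {"anna", "bert", "carla", "dirk"}
commented = {"anna", "bert"}

# Silently active participants: they studied sources but never posted,
# so they stay invisible unless you look at the data.
silent_but_active = viewed - commented
print(sorted(silent_but_active))  # → ['carla', 'dirk']
```

These are exactly the participants you might invite personally to join the discussion, rather than assuming they dropped out.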