Monday, September 21, 2015

Social media in organizations: the case of the ambulance services in the Netherlands


With great pleasure, I conducted field research together with Sibrenne Wagenaar at one of the ambulance services in the Netherlands. The ambulance service asked us to look for ways to improve internal communication, driven by the fact that technological developments offer new opportunities for collaboration and communication within the organization. The initial question was: how can we improve communication within the organization using new media?

We went in search of new forms of internal communication that fit the organization. We conducted action research that lasted five months and was carried out by a pioneer group of 17 people, guided by us. The action research consisted of internal reflections, research within the organization, visits to other organizations and an experimental phase. The experiences are interesting from the standpoint of how technology influences collaboration within an organization and are likely to be recognizable to other organizations.

Some conclusions
What was especially noticeable is that even if you do not use social media, these media still influence the organization. In this case there was a static intranet with information from management. At the same time, there were a large number of WhatsApp groups within the organization, often invisible to management. Communication in these groups was often not favorable to the organization, in the sense that confusing messages were amplified. If management does not keep an eye on this type of communication via new media, it may strengthen a culture that you do not favour.

The outcome was that the organization may benefit from a social intranet; however, a lot of attention should be devoted to stimulating a new way of communicating, both face-to-face and online. An experiment showed that people did not like to participate in open groups online, but really needed the safety of a small closed group. A social intranet is a big step for this organization and will need a clear vision and several years of implementation.

Download the full article
We wrote an article about our experiences for TVOO in the Netherlands. You may download the article sociale media bij de ambulancedienst, which is in Dutch.

Thursday, September 17, 2015

From intuition to knowing for sure: a case of applying learning analytics at course level

This blog post was written in collaboration with Francois Walgering from MOOCfactory and Sibrenne Wagenaar from Ennuonline.

More and more trainers and facilitators use online platforms to facilitate learning through online interaction. This is akin to face-to-face facilitation but has its own dynamics. For example, how do you know if an online article has been read? Or how much time a participant spends viewing a video or doing an assignment? Which participants connect online? How do you judge whether valuable conversations take place? In a face-to-face setting you can ask participants process questions: "do you need more time for this discussion?" or "an extra assignment about ... seems to make sense." You can observe interactions and sense the group's enthusiasm, and adjust the learning process gradually as a result.

In an online environment you miss these kinds of observations. Online, some data can be helpful: data collected by the system, such as the degree of online activity, time spent viewing a source, levels reached, and the length of discussions. The value of these data lies in analyzing and interpreting them: what we call learning analytics. We recently experienced the power of learning analytics and gladly share this experience with you.

The case: the food and nutrition security course

We designed and facilitated a five-week online course (a so-called SPOC, Small Private Online Course) called "Food and Nutrition Security". About 90 people took part, and we worked with them in Curatr, a learning platform that supports social learning. Four of the five weeks had a specific theme; the fifth week was reserved for working on practical cases, brought forward by participants and experts. Participants received a certificate after earning a minimum number of points per week and writing a case reflection. Nineteen people received the certificate.
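
As a rough illustration of the certification rule, here is a minimal sketch in Python. The point threshold, data layout and function name are hypothetical; the actual values and bookkeeping lived in the Curatr platform.

MIN_POINTS_PER_WEEK = 100  # hypothetical threshold; the real value was configured in the platform

def earns_certificate(weekly_points, wrote_case_reflection):
    # True if each of the five weeks meets the minimum and a case reflection was written
    return (
        len(weekly_points) == 5
        and all(points >= MIN_POINTS_PER_WEEK for points in weekly_points)
        and wrote_case_reflection
    )

print(earns_certificate([120, 150, 101, 130, 110], wrote_case_reflection=True))  # True
print(earns_certificate([120, 150, 80, 130, 110], wrote_case_reflection=True))   # False: week 3 too low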

Online monitoring: using your intuition
During the course, our most important sources of information were feedback from participants on the online platform and e-mail messages, about content as well as process. We had a picture of the important issues, the issues that led to discussion, and the effective learning activities. For instance, we received many compliments on the weekly interactive webinars. Another example: after an upgrade of the software, technical problems arose which discouraged participants. And a number of participants informed us that they appreciated the content very much, but got into trouble with new ad hoc tasks at work. By following discussions and through explicit interaction with a number of participants, we developed a feel for the quality of the learning process. A group of enthusiastic participants began to emerge as a core group. Intuitively we felt that we were on the right track with this online learning program. But ... what can the data tell us?

Online monitoring: analytics
Behind the scenes, Curatr collects lots of data using the Experience API (xAPI). Beyond general data such as name, surname, e-mail, organization and function, you also get very good insight into 'user' and 'usage' data. What is the difference between user and usage data?
[Figure: progress analytics]
User data are all the qualitative data you collect within the platform:

  • All threads started by participants;
  • All reactions posted in the various discussions;
  • All sources (videos, PDFs, blogs, websites) added by participants;
  • Any responses to the learning activities offered (e.g. reflection questions, quizzes, open questions).

Usage data are the quantitative data (see illustration); the sketch after this list makes the distinction concrete:

  • How many comments have been posted by participants;
  • How many resources have been added by each participant;
  • How many comments have been liked, and by whom;
  • Which participant reached which level, and when;
  • The number of participants who completed the SPOC.
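
To make the user/usage distinction concrete, here is a minimal sketch in Python. The statements follow the xAPI actor/verb/object convention, but the verbs, names and the way of aggregating are illustrative assumptions, not Curatr's actual internals.

from collections import Counter

# Simplified xAPI-style statements (actor/verb/object); values invented for illustration.
statements = [
    {"actor": "participant_01", "verb": "commented", "object": "thread/week1-intro"},
    {"actor": "participant_01", "verb": "commented", "object": "thread/week2-cases"},
    {"actor": "participant_02", "verb": "added",     "object": "resource/fao-report.pdf"},
    {"actor": "participant_03", "verb": "liked",     "object": "comment/456"},
]

# User data: the qualitative content itself, e.g. which threads a participant commented in.
threads_by_p01 = [s["object"] for s in statements
                  if s["actor"] == "participant_01" and s["verb"] == "commented"]
print(threads_by_p01)

# Usage data: quantitative summaries derived by counting the same statements.
comments_per_participant = Counter(s["actor"] for s in statements if s["verb"] == "commented")
resources_added = sum(1 for s in statements if s["verb"] == "added")
print(comments_per_participant, resources_added)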

Learning analytics involves looking at the data, analyzing them and acting on the results. We can perform learning analytics at different levels, and the depth of analysis affects the reliability of the actions we arrive at. You could say that learning analytics roughly consists of five steps (a small sketch with hypothetical data follows the list):

  1. Visualize: You view the data, for instance 100 participants who completed the SPOC, or 20 participants who completed the level-one test. Such data are generally speaking easy to generate from a learning platform.
  2. Clustering: The data that you see can be clustered: all activities of a participant, all threads that have been started, all the sources added. Together they may form a cluster such as 'involvement' or 'quality'.
  3. Relationships: For instance, "after an e-mail from the facilitator, activity on the learning platform went up," or "the most important activity on the learning platform is discussing with each other."
  4. Patterns: Activities that have repeatedly proven themselves. For example, by looking at different SPOCs we see that 80% of participants stop watching a video after three minutes.
  5. Actions for the future: Based on the pattern 'participants stop a video after 3 minutes', we may decide to use only videos shorter than that, or to ensure that the core of the message is conveyed within the first three minutes of the video.
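
The snippet below sketches the first of these steps on hypothetical per-participant records; the field names, weights and numbers are invented, not taken from Curatr.

# Hypothetical per-participant records to illustrate the first analytics steps.
participants = [
    {"name": "A", "comments": 14, "resources_added": 3, "video_seconds_watched": 160},
    {"name": "B", "comments": 0,  "resources_added": 0, "video_seconds_watched": 200},
    {"name": "C", "comments": 7,  "resources_added": 1, "video_seconds_watched": 610},
]

# Step 1, visualize: simple counts you can read straight off the data.
commented = sum(1 for p in participants if p["comments"] > 0)
print(f"{commented} of {len(participants)} participants commented at least once")

# Step 2, cluster: combine several activities into an 'involvement' score (weights are arbitrary).
def involvement(p):
    return p["comments"] + 2 * p["resources_added"]

for p in participants:
    print(p["name"], "involvement:", involvement(p))

# Step 4, pattern: what share of participants stopped a video before three minutes?
stopped_early = sum(1 for p in participants if p["video_seconds_watched"] < 180)
print(f"{stopped_early / len(participants):.0%} stopped watching before three minutes")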

Learning analytics in the case of the Food and Nutrition Security SPOC
[Figure: MOOC score visualization]
In the case of the Food and Nutrition Security course, we took a number of peeks at the data during the course, such as who logged in and the leaderboard. Afterwards, we dived deeper into the data, with extensive analysis and interpretation. This proved really valuable and gave us new insights into how participants took part and into the dynamics of social learning. A few examples:
  • Most of the participants who ended high on the leaderboard earned points by contributing to the discussions. This group was also very visible to us and we had them 'spotted'. However, there turned out to be two participants who earned quite a few points purely by studying sources. They had not responded online and had therefore remained invisible to us, even though they were actively participating in the course.
  • We were also curious about which sources and questions entice good discussions. 'Good' is a qualitative concept and difficult to measure, but some threads stood out because of the large number of responses from participants. We analyzed those discussions and discovered that the striking aspect was the contribution of an expert/coach: the fact that the expert herself was involved in the discussion, asking questions and giving an opinion, made the discussion very lively.
  • A diverse group took part in this course, and we wondered whether we would see differences in participation between certain "groups" (e.g. working in the Netherlands versus working in Asia, or working for NGOs versus working for government). This was certainly the case: some "groups" contributed much more explicitly to the social learning than others. Important to know when designing new courses! In the illustration, you can see the visualized total scores of the different groups; a small sketch of this kind of analysis follows below.
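
A minimal sketch of the kind of after-course analysis described above; the score fields, group labels and thresholds are hypothetical, not the actual course data.

# Hypothetical leaderboard records: points split by how they were earned, plus a group label.
leaderboard = [
    {"name": "P1", "discussion_points": 320, "source_points": 80,  "group": "NGO"},
    {"name": "P2", "discussion_points": 0,   "source_points": 290, "group": "government"},
    {"name": "P3", "discussion_points": 15,  "source_points": 260, "group": "NGO"},
]

# 'Invisible' active participants: high total points, hardly any discussion activity.
for p in leaderboard:
    total = p["discussion_points"] + p["source_points"]
    if total > 200 and p["discussion_points"] < 20:
        print(f"{p['name']} was active ({total} points) but invisible in the discussions")

# Compare total scores per group, as in the illustration above.
totals = {}
for p in leaderboard:
    totals[p["group"]] = totals.get(p["group"], 0) + p["discussion_points"] + p["source_points"]
print(totals)
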
Some of our lessons about using learning analytics as an online facilitator
  • Using learning analytics can be very supportive of your role as an online facilitator. You can find confirmation for possible interventions. In the beginning, as an online facilitator, you may be scared or disappointed by the fact that not all participants are active online. Why is that? Is it lack of clarity, or disinterest? If the data show that participants have accessed the available resources but have not commented on the platform, you may react differently and try to stimulate reactions. You might specifically invite those who did not respond to do so.
  • It is valuable to combine data from the 'system' with your own feelings and observations, such as participants' reactions by e-mail. In this way, observations and analytics can reinforce each other.
  • Before you start the learning process, make an inventory of your questions and possible indicators and numbers. What would you like to see? Are you happy with 10 active participants? Which sources are you doubtful about? If you do this before the start, you will know better what information will help you gain insights during the course.
  • Make a plan for how often you want to pull and analyze data during the course. In the 'Food and Nutrition Security' case we analyzed more deeply after the course. Of course we kept an eye on the obvious data during the course (who logged in and who contributed), but we could have put the data to better use. An example: afterwards we compared the activity in the various user groups. We could have done this earlier and spent more energy on involving some groups.
  • The learning analytics described here took place at course level. You can also go one step further and compare the data from this course with data from other courses. The results of such an analysis can contribute to conclusions that fuel future designs.
  • Take time for accessing the data. Plan it in! Learning analytics is still a relatively new activity for many online facilitators. And certainly if you want to monitor during the course by means of data, it requires a regular look at the dashboard, analysis and judgment to determine your next interventions.
  • And ... start conversations with the people involved to interpret the data! Look at the data with curiosity and discuss them with those involved. Feeling and intuition are crucial. Sometimes data offer new perspectives, but often they just confirm a certain intuition; this may then be a catalyst for taking action. In our case we had weekly progress discussions, which might have benefitted from some data interpretation.
Last but not least ... be careful about how you use your data. Be transparent about your intentions. Tell participants very clearly what you will do with the data. After all, you are working with data on the performance of individuals in a learning environment.