Beyond marks: new tools to visualise student engagement via social networks


Joanne L. Badgea, Neil F.W. Saundersb and Alan J. Canna,*

aSchool of Biological Sciences, University of Leicester, Leicester, UK; bCSIRO Mathematics, Informatics and Statistics, North Ryde, NSW, Australia

(Received 7 July 2011; final version received 12 November 2011; Published: 3 February 2012)

Abstract

Evidence shows that engaged students perform better academically than disengaged students. Measurement of engagement with education is difficult and imprecise, especially in large student cohorts. Traditional measurements such as summary statistics derived from assessment are crude secondary measures of engagement at best and do not provide much support for educators to work with students and curate engagement during teaching periods. We have used academic-related student contributions to a public social network as a proxy for engagement. Statistical summaries and novel data visualisation tools provide subtle and powerful insights into online student peer networks. Analysis of data collected shows that network visualisation can be an important curation tool for educators interested in cultivating student engagement.

Keywords: student engagement; data visualisation; social networks

*Corresponding author. Email: alan.cann@le.ac.uk

Citation: Research in Learning Technology 2012, 20: 16283 - DOI: 10.3402/rlt.v20i0.16283

RLT 2012. © 2012 J.L. Badge et al. This is an Open Access article distributed under the terms of the Creative Commons "Attribution 3.0 Unported (CC BY 3.0)" license (http://creativecommons.org/licenses/by/3.0/) permitting use, reuse, distribution and transmission, and reproduction in any medium, provided the original work is properly cited.

Introduction

Discussion-based education is far from new. The Socratic method, in which the educator engaged with the student through discussion and questioning on a one-to-one basis, gave rise to the viva voce1 style of assessment of knowledge. While this time-honoured method is effective, it is difficult to apply to large groups of students as the staff-student ratio climbs ever higher, particularly for younger educators who are less confident in their depth of knowledge or who have been asked to teach outside their narrow area of expertise. Online discussion boards are a feature of most virtual learning environments (VLEs) (Cann et al. 2006), yet they are frequently underused, used to ask low-level organisational questions, or, if linked to assessment, used grudgingly to get the marks (e.g. Lee, Kim, and Hackney 2011). In contrast, a visit to a computer user area in any university where access to such services is not banned reveals rows of screens displaying the familiar Facebook livery (www.facebook.com). Student online attention is focussed on sites with an “activity stream” architecture, a design that focuses and reinforces attention by returning currently active items to the top of the user's home screen. The contrast between students' engagement with public social network services and online education tools is stark (Jones, Gaffney-Rhys, and Jones 2011).

In a comprehensive literature review of student engagement, Trowler and Trowler (2010) describe how Mann (2001) contrasted engagement with alienation, proposing engagement–alienation as a more useful framework for understanding students' relationship to their learning than the more frequently discussed surface–strategic–deep model of learning (Marton and Säljö 1976). Writing before the rise of social networks, Mann went on to describe the problem of lack of engagement as a failure of communication (Mann 2005). However, engagement is more than participation: it requires emotion and sense-making as well as activity (Harper and Quaye 2009). Descriptions of beliefs around student engagement are interesting but cannot inform educators as to how to guide individual students.

Social networks are rapidly moving beyond their original purpose and are inevitably becoming part of the learner experience (Conole and Alevizou 2010). Minocha (2009) gives numerous examples of the use of social software to promote engagement. The promise of social software to engage distance education students is great (Lester and Perini 2010). According to social learning theory (Bandura 1977), individuals' active engagement functions as an initial motive for achieving desirable learning outcomes. However, most if not all of the published studies of any size concentrate on the softer, social side of student engagement rather than on academic outcomes (Hawe and Ghali 2008; Lefever and Currant 2010; Madge et al. 2009). Yu et al. (2010) suggest that online social networking directly influences university students' learning outcomes, and a more recent study found that time spent on different Facebook activities can be positively or negatively predictive of student engagement, depending on the activity (Junco 2012). Notably, the latter study found a connection between online engagement and engagement in offline activities.

To address whether student engagement in peer-to-peer discussion in an activity stream (“Facebook-like”) environment could inform educators in guiding individual students, we conducted research with two aims: first, to discover whether student contributions to a social network can be used to discern student engagement with education; and second, to investigate how complex data from social network activity can be interpreted effectively using emerging graphical tools, which give a far richer picture of online activity than summary statistics.

Background

All first year undergraduate students in the School of Biological Sciences at the University of Leicester (approximately 250 students per year) take common key skills modules that encompass Information Technology (IT) and numeracy skills. For two years, this module has included assessed use of a portfolio of freely available Web 2.0 tools. In 2009/2010 the authors introduced a social network element to encourage peer support and discussion of scientific and technical issues relevant to the curriculum. In active social networks, friends' status updates form intermittent variable rewards, one of the most powerful methods of operant conditioning (Ferster and Skinner 1957; Zeiler 1968). We sought to derive student engagement with the process of academic reflection from these attributes. Although Facebook is pre-eminent in commanding student attention (Long 2010), we decided not to use this site in order to avoid complications arising from the overlap of social and professional online identities (Baran 2010). Instead, we chose to use the Friendfeed social network (friendfeed.com). None of the 2009/2010 cohort of students entering our degree courses had an existing Friendfeed account and so all created dedicated accounts for their degree course. We linked Friendfeed into the institutional VLE by a link in the navigation sidebar of VLE module sites. Since Friendfeed logins are persistent, whenever students clicked this link in the VLE they moved seamlessly between the VLE and the social network.

Friendfeed has a very similar structure to Facebook, allowing users to post entries that can include links or attached files. Friendfeed was purchased by Facebook in August 2009, and with the merging of technologies, Facebook increasingly resembles Friendfeed. The vast majority of students were already familiar with Facebook and needed little if any instruction in how to use the extremely similar Friendfeed environment. Like the Twitter microblogging service (twitter.com) but unlike Facebook, Friendfeed allows asymmetrical following, that is, non-reciprocal subscriptions. In mathematical terms, this is referred to as a directed graph. Subscribers can “Like” entries from the people they follow, or comment on them in a threaded fashion. Friendfeed accounts can be private or public (controlled by a single checkbox) but all operate on a friend of a friend (FOAF) basis. All students were asked to subscribe to the three staff members involved in supporting the course and in turn, the staff subscribed to each student. The FOAF nature of the Friendfeed system meant that all students in the extended network saw any comments that the staff members made on other student entries, regardless of whether they were subscribed to that student themselves. We asked students to keep their existing Facebook accounts for private and social content, and reserve Friendfeed for education-related reflections and content, in a similar approach to that suggested by previous publications (Duffy 2010; Jones et al. 2010) and informed by our own prior experience. We aimed to promote the creation of a peer-supported network of students that had the potential for discussion-led learning and engagement with course material. This network would be separate from personal social networks, demonstrating the principles of professional development and conduct as scientists.

Methodology

The study described in this paper was approved by the University of Leicester Committee for Research Ethics Concerning Human Subjects prior to commencement. Students were introduced to Friendfeed at the start of the module, given guidance via the institutional VLE on the rationale for sharing information in this way, and on how Friendfeed use would be assessed. This was reinforced in practice throughout the module by academic staff input via Friendfeed itself and in voluntary face to face help sessions if students chose to attend these. Essentially, our approach was one of action research, as defined by Reason and Bradbury (2000), which developed in response to student use of the Friendfeed social network throughout the module.

Students were informed that they would be assessed on the number of contributions they made to Friendfeed averaged over the course of an academic term, with a minimum expectation of four academically-relevant items per week to qualify for the marks allotted. Contributions could be in the form of status updates, comments on others' updates, or shared links; to count for credit, external links had to be accompanied by a short commentary explaining how and why they were relevant to academic study. Academic staff gave students support and feedback by joining in conversations on Friendfeed, using direct (private) messages where appropriate, as well as in weekly face to face support sessions. Marks were awarded for activity over the course of the whole module. At the end of the 10 week assessment period allotted to this course, data was extracted from the Friendfeed application programming interface (API, http://friendfeed.com/api). We used the open-source Gephi tool (http://gephi.org) to visualise the network graphs and perform statistical analysis of the dataset. Gephi is a powerful tool that makes it easy to analyse individual elements of large networks.

All code used for this project is open-source and available at GitHub (http://github.com/neilfws/Friendfeed). Data were extracted using the Friendfeed application programming interface (API, http://friendfeed.com/api/documentation). User data were retrieved, parsed and stored as comma-separated value (CSV) files using a script written in the Ruby programming language. Statistical analyses were performed using the R programming environment (R Development Core Team 2011).
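The Ruby extraction script itself is in the GitHub repository linked above. Purely as an illustrative sketch of the parse-and-store step, the Python fragment below flattens a feed payload into per-user activity counts in CSV form. The payload structure shown here is a simplified assumption for illustration only, not the real Friendfeed API schema.

```python
import csv
import io
import json

# Simplified, assumed payload shape (NOT the actual Friendfeed API schema):
# a list of entries, each with an author, optional comments and likes.
sample_payload = json.dumps({
    "entries": [
        {"from": {"id": "student1"}, "body": "Shared a link on enzymes",
         "comments": [{"from": {"id": "student2"}, "body": "Useful, thanks"}],
         "likes": [{"from": {"id": "student3"}}]},
        {"from": {"id": "student2"}, "body": "Revision tips thread",
         "comments": [], "likes": []},
    ]
})

def entries_to_csv(payload: str) -> str:
    """Flatten a feed payload into CSV rows of (user, entries, comments, likes)."""
    counts = {}  # user id -> [entries, comments, likes]
    for entry in json.loads(payload)["entries"]:
        counts.setdefault(entry["from"]["id"], [0, 0, 0])[0] += 1
        for comment in entry["comments"]:
            counts.setdefault(comment["from"]["id"], [0, 0, 0])[1] += 1
        for like in entry["likes"]:
            counts.setdefault(like["from"]["id"], [0, 0, 0])[2] += 1
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["user", "entries", "comments", "likes"])
    for user in sorted(counts):
        writer.writerow([user] + counts[user])
    return out.getvalue()

csv_text = entries_to_csv(sample_payload)
```

Per-user CSV files of this kind are what the statistical analyses and the Gephi network files were built from.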

Results

Activity

Although they could easily access the account details of the entire cohort via the VLE, most students formed relatively small networks with a median size of 21 people, somewhat smaller than the typical network size on other social services such as Facebook (Facebook Data Team 2009) and Twitter (Goncalves, Perra, and Vespignani 2011). However, each student network overlapped with other networks, and the FOAF “Like” feature on Friendfeed allowed information to be freely transmitted from one network to another. Unlike Facebook, Friendfeed is a directed network that allows asymmetric relationships between subscribers, but in spite of this, the distribution of subscribers was very similar to that of subscriptions (data not shown).

Contributions for credit made by students included shared links pointing at online tools, resources and scientific news, but also status update entries containing reflection on the learning process. Students also received credit for adding comments containing significant reflective content or commentary to existing discussion threads. Figure 1 summarises student activity at the end of the 10 week assessment period. The number of substantive comments showed a greater range than the number of direct entries, reflecting the distinction between students who completed the online exercise simply to earn the credits allotted to it and students who engaged deeply with this approach to networked learning. As with comments, the number of “Likes” awarded across peer networks showed a considerable range. Over 10 weeks, the 134 students whose accounts we were able to analyse in detail via the Friendfeed API produced 5376 entries (ranging from a single word to several hundred words), made 8151 comments, awarded 5232 “Likes” and wrote 199853 words (an average of 1491 each, not including private messages).


Fig 1

Figure 1.  Boxplots of student contributions to Friendfeed. Boxplots showing distributions of student contributions to Friendfeed over the 10 week assessment period.

Figure 2 shows contributions broken down by type and gender. There are clear gender differences for each of the four types of contribution. Statistical analysis confirms that females make significantly more contributions than males (Wilcoxon rank sum test, p<0.05). These data are summarised in Table 1.


Fig 2

Figure 2.  Boxplots of student contributions to Friendfeed by type and gender. Boxplot showing distributions of student contributions to Friendfeed over the 10 week assessment period broken down by type and gender.


Table 1. Friendfeed contributions by gender.
           Entries           Subscriptions     Comments          Likes
           Female   Male     Female   Male     Female   Male     Female   Male
Minimum    1        1        7        1        3        1        2        1
Maximum    108      75       81       44       207      215      168      166
Median     31       21       23       18       47       25       29       16
p-value    0.0008698         1.31e-05          2.965e-05         0.002275
Note: This table shows the minimum, maximum and median values for contribution types by female (n=95) and male (n=74) students, and the p-values of Wilcoxon rank sum tests comparing contribution types by gender. Females make significantly more contributions than males in each category of contribution.

Students were assessed on the number of contributions they made over the course of an academic term, with a minimum expectation of four academically-relevant items per week to qualify for the marks allotted. Female students scored higher marks than males (median mark 100 versus 75), a difference that is statistically significant (Wilcoxon rank sum test, p<0.001). Assessment of contributions to the social network formed only part (40%) of the overall module mark. In contrast, there was no significant gender difference in overall module marks when the social network contributions were excluded (Wilcoxon rank sum test, p>0.05). This finding suggests that although female students make more online contributions than males, typical male students tend to be strategically focussed on meeting assessment criteria.
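The gender comparisons reported here used the Wilcoxon rank sum test, which was run in R (wilcox.test). As a minimal sketch of the underlying computation, the following pure-Python version uses the large-sample normal approximation and applies no tie-variance correction, so its p-values will differ slightly from R's for heavily tied data.

```python
from statistics import NormalDist

def rank_sum_test(x, y):
    """Two-sided Wilcoxon rank-sum test via the normal approximation.
    Illustrative sketch only; no tie-variance or continuity correction."""
    values = list(x) + list(y)
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    j = 0
    while j < len(order):  # assign average (1-based) ranks to tied values
        k = j
        while k + 1 < len(order) and values[order[k + 1]] == values[order[j]]:
            k += 1
        for idx in order[j:k + 1]:
            ranks[idx] = (j + k) / 2 + 1
        j = k + 1
    n1, n2 = len(x), len(y)
    w = sum(ranks[:n1])                       # rank sum of the first sample
    mean = n1 * (n1 + n2 + 1) / 2             # null mean of W
    sd = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5
    z = (w - mean) / sd
    return w, 2 * (1 - NormalDist().cdf(abs(z)))
```

Applied to two per-gender vectors of contribution counts, this returns the rank sum and a two-sided p-value analogous to those in Table 1.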

Overall, Spearman's correlation test shows a strong and statistically significant association between marks awarded for the Friendfeed exercise and other marks awarded (excluding the Friendfeed mark) (r=0.75, p<0.001). This strongly supports the argument that Friendfeed performance is an accurate proxy for student engagement.
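Spearman's correlation is simply the Pearson correlation of the ranked data. The analysis above was performed in R; the following minimal illustrative sketch assumes no tied values (so no tie correction is needed).

```python
def spearman_rho(x, y):
    """Spearman's rank correlation: the Pearson correlation of the ranks.
    Illustrative sketch; assumes no tied values."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    var_x = sum((a - mx) ** 2 for a in rx)
    var_y = sum((b - my) ** 2 for b in ry)
    return cov / (var_x * var_y) ** 0.5
```

Run over paired vectors of Friendfeed marks and other module marks, this yields the r=0.75 style coefficient quoted above.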

However, these crude summary statistics address the behaviour of the entire cohort and are not representative of individual students. For example, each of the distributions of entries, subscriptions, comments and likes is skewed, with many students making relatively few contributions and a few students making very many. In contrast to the summary statistics, the individual who made the highest number of academically-relevant comments was male (Table 1). To achieve a more sophisticated individual assessment of student engagement, we turned to data visualisation tools to tease out individuals' levels of engagement from the mass of data provided by the Friendfeed API.

Network visualisations

De Laat et al. (2006) advocate a three-pronged approach consisting of: Social Network Analysis to find out “who is talking to whom”, Content Analysis through coding teaching and learning activities as a way to find out “what they are talking about”, and Context Analysis focusing on the experiences of the participants to find out “why they are talking as they do”. In this study we have performed social network analysis through data visualisation, content analysis (see above), and minimal context analysis through a course questionnaire and exemplar student comments (see later). The Friendfeed data can be used to visualise three types of network:

  1. Subscriptions (or “following”).

  2. Comments (made and received).

  3. Likes (affirmations which refocus attention by moving the liked item to the top of the activity stream).

In order to investigate network relationships, we used Gephi to visualise interactions in student networks. Although Gephi makes it easy to analyse the behaviour of individuals and groups within large networks, for simplicity we present only a static overview of network structure here. (The live data in Gephi presents a much richer view of the contributions of each individual within the network.)

Figure 3 demonstrates the central role of staff members, who act as “authorities” by subscribing to all students (in green). It also illustrates highly engaged students who act as “hubs”, facilitating network building. These individuals subscribe to many other students, hence the term “hub” (their subscriptions are equivalent to Facebook “friends” or Twitter “followers”). It is important to note that although all the data in this paper have been anonymised for ethical reasons, in the “live” Gephi view the identity of each node is revealed by a popup label when the operator hovers the mouse pointer over it. Thus the identity of individual students can be instantly and easily revealed amid the mass of network data.


Fig 3

Figure 3.  Subscriptions by gender and role. The cohort subscriptions (“following”) network. Female students are shown in red, males in blue and non-students (staff) in green. All network diagrams in this paper were plotted with Gephi using graphml format files derived from the Friendfeed API. Nodes are ranked and sized by degree, and the diagrams are arranged using the Gephi force atlas layout (repulsion strength = 600).

Gephi contains several algorithmic community detection tools, such as modularity maximisation (Newman 2004), which suggests possible divisions of a network based on modularity: groups of nodes with dense internal connections within modules but only sparse connections to other modules. Using the community detection tools in Gephi on these data, three communities emerge (Figure 4). These three communities appear to characterise three different types of student:

  1. A very active group (shown in red in Figure 4), where a lot of the interaction stems from the green “authority” node – that is, contacts with the staff in the network.

  2. A smaller group (coloured purple in Figure 4), who prefer to “talk amongst themselves”.

  3. A mid-sized group (coloured blue in Figure 4), who prefer to remain on the periphery and interact mainly with group (2) rather than with staff.
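The modularity score that these community detection tools maximise can be computed directly for any candidate partition. As an illustrative sketch (not Gephi's optimisation algorithm, and using a toy undirected graph rather than the cohort data):

```python
def modularity(edges, communities):
    """Newman modularity Q for an undirected graph and a node partition:
    Q = sum over communities c of [ l_c/m - (d_c/(2m))^2 ], where l_c is
    the number of intra-community edges and d_c the total degree in c.
    Sketch of the quantity being maximised, not the optimisation itself."""
    m = len(edges)
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    label = {n: c for c, nodes in enumerate(communities) for n in nodes}
    q = 0.0
    for c, nodes in enumerate(communities):
        l_c = sum(1 for u, v in edges if label[u] == c and label[v] == c)
        d_c = sum(degree[n] for n in nodes)
        q += l_c / m - (d_c / (2 * m)) ** 2
    return q

# Toy example: two triangles joined by a single bridge edge,
# partitioned into the two obvious communities.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
q = modularity(edges, [{0, 1, 2}, {3, 4, 5}])
```

For this two-triangle example Q = 5/14 ≈ 0.357, of the same order as the values Gephi reports for the cohort networks in Table 2.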


Fig 4

Figure 4.  Subscriptions by community. Friendfeed subscription network diagram prepared using Gephi as described in Figure 3. Gephi community detection suggested a three-community structure within the overall network in all cases, that is, in the analysis of subscriptions, comments and likes.

In addition to network visualisations, Gephi also provides summary network statistics (Table 2).


Table 2. Gephi network statistics.
Statistic                        Subscriptions   Comments    Likes
In/out degree                    30.821          23.488      26.527
Diameter                         5               6           6
Density                          0.09            0.072       0.08
Modularity (communities)         0.354 (4)       0.339 (3)   0.308 (4)
Average clustering coefficient   0.294           0.236       0.236
Average path length              2.16            2.507       2.442
Note: This table shows network statistics calculated by Gephi for subscriptions, comments and likes networks containing only students (no staff), which are defined as follows:
  • In/Out Degree is the average number of both in and out edges per node; in other words, a measure of how many connections a node has.
  • Diameter is the “longest geodesic”, that is, the greatest shortest-path distance between any pair of connected nodes in the network.
  • Density is the ratio of edges to possible edges, that is, a completely connected graph has density = 1.
  • Modularity ranges from −1 to 1, where positive values indicate more edges within the modules (communities) than expected by chance.
  • Average clustering coefficient is a measure of how connected nodes are, ranging from 0 to 1.
  • Average path length is a measure of network transport efficiency, the average number of steps along the shortest path for all possible node pairs.
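Several of the statistics defined above can be reproduced on any small graph in a few lines. The following is a toy re-implementation on a directed graph, not Gephi's exact algorithms (for example, it reports plain mean out-degree rather than Gephi's combined in/out degree, and measures paths only over reachable pairs):

```python
from collections import deque

def directed_stats(nodes, edges):
    """Simple directed-network statistics analogous to Table 2:
    density, mean degree, average path length and diameter
    (the latter two over reachable node pairs, via BFS)."""
    n = len(nodes)
    adj = {u: [] for u in nodes}
    for u, v in edges:
        adj[u].append(v)
    density = len(edges) / (n * (n - 1))   # fraction of possible directed edges
    mean_degree = len(edges) / n           # average out-degree
    lengths = []
    for s in nodes:                        # BFS shortest paths from each node
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        lengths += [d for t, d in dist.items() if t != s]
    return {
        "density": density,
        "mean_degree": mean_degree,
        "avg_path_length": sum(lengths) / len(lengths),
        "diameter": max(lengths),
    }

# Toy example: a 4-node directed cycle 0 -> 1 -> 2 -> 3 -> 0
stats = directed_stats([0, 1, 2, 3], [(0, 1), (1, 2), (2, 3), (3, 0)])
```

On the 4-cycle this yields density 1/3, average path length 2 and diameter 3, illustrating how the cohort-level figures in Table 2 are derived.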

Qualitative student feedback and their use of the network

In addition to network analysis, an end of module questionnaire was used to elicit student feedback on the exercise. Although some negative views were expressed by a minority of students:

  • Friendfeed was an annoying intrusion into what I wanted to do with my own learning in biology.

  • I found Friendfeed a distraction, and prevented me from doing other work which would have been more useful.

Most of the student feedback was positive:

  • I found Friendfeed useful and enjoyed doing it even though some weeks it wasn't at the top of my to do list and got a bit neglected now and then. It encouraged everyone to communicate and help each other out as we are all doing the same course and also encouraged student, lecturer/tutor communication outside lectures and tutorials.

  • Friendfeed was very useful for addressing queries from students, and for open discussions between students and teachers.

  • Friendfeed is great for improving communication between staff and students.

  • Friendfeed was very effective in motivating me to discuss the topics regarding our lectures and questioning some of them during the Friendfeed posts. It is also a good way to keep up to date with any upcoming assessments and seeing the progress my other colleagues have made motivated me to catch up with them and do well in the assessments. Also, it is a helpful way of getting some questions asked since everyone can contribute and a one to one meeting is not necessary since you can log in to Friendfeed anytime.

Students made good use of the network of peers by asking unprompted questions relating to their studies and courses. Initially these were practically based, such as discussions about which textbooks to buy and where the best prices could be found. Such object-orientated discussions included academic-related news items that enabled students to focus their attention on current news relevant to their courses. Students separated social comments (for Facebook) from academic ones (for Friendfeed) with ease. A very few “inappropriate” (i.e. social rather than academic) comments were dealt with by discreet responses from staff, and these discussions quickly moved to Facebook. Later in the term, when revising for examinations, students began spontaneously to share best practice on different learning techniques.

Further evidence justifying the use of social software to engage students in discussion about both science and their own learning comes from the number of students who continued to use the service beyond the end of the assessment period, and indeed across university vacations, continuing to discuss current scientific issues reported in the media. In contrast to other social services we have tested, which very few students (~1%) continued to use post-assessment, approximately 15% of students continued to use Friendfeed, posting items for discussion and commenting on academic issues. Although still a minority of the cohort, this represents an increase in engagement over that seen previously with other tools, and engaged students have continued to use Friendfeed for over a year since it was formally required.

Discussion

Analysis of the contribution of network structure to learning is an emerging topic in education (Dawson 2010). The work presented here shows how the freely available tools we have developed and used can be applied to data mined from popular social networks to inform educational practices. This study aimed to investigate whether a social network could be established on a professional basis for science students to discuss their work. The connections and interactions between students were analysed using publicly-available network analysis tools. These data show that students formed connections to one another not only as subscribers but also through commenting on each other's contributions to the network and through threaded conversations. Use of this social network curation approach to student engagement has now been taken up by other academic disciplines within this university. This trend is likely to continue, and Longman and Green (2011) argue that the emerging post-print digital culture of knowledge creation and dissemination in higher education is even more demanding of effective and committed teaching than hitherto.

While the number of students still active in the network after the compulsory assessment period is not high, it is an order of magnitude greater than that observed in other similar situations. This observation supports the view that teaching students to use isolated tools is not effective, as there is no narrative rationale that supports and engages them in increasing their understanding beyond completion of basic assessment tasks. “Authentic assessment”, in which marks are awarded for tasks which students perceive to be clearly linked to their course of study rather than designed merely to familiarise them with arbitrary technologies, is also important for engagement. For this reason we required that student contributions to Friendfeed be relevant to the degree course, rather than mere social chat, to count for credit. Our contention, supported by the data presented here, is that object-oriented social networks such as Friendfeed (and Facebook) provide “social glue” which both provides rationale and encourages student engagement with scientific and academic issues. Although this paper investigates the use of the Friendfeed social network, there is no reason why the conclusions drawn cannot be applied in a wider context to any social tool with similar attributes.

A recent study suggests that teaching staff are positioned in a higher proportion of the higher-performing than lower-performing student networks (Dawson 2010). This entirely agrees with our observations and suggests that focussing student attention online has as significant a role to play as face to face interactions in the lecture room or laboratory. The same study also suggests that higher-performing students primarily develop connections with students of similar academic capacity. The implication is that where online communication channels are adopted, teaching staff need to ensure they have adequate network connections with all students, and especially need to cultivate connections with lower-performers and their networks. On social networks such as Twitter it has become accepted to use network interactions as a proxy for engagement (e.g. Huston, Benyoucef, and Weiss 2011).

One of the potential risks of this project was that students would confuse behaviour commonly seen in purely social online networks with a professional representation of their online identity that enhances their reputation. Students were made aware that they could control whether their Friendfeed activity was public or private; approximately two thirds of students opted for a public account. Very few instances of publicly inappropriate online behaviour were observed, possibly because participating students were encouraged to separate their new, professional online identities from pre-existing online social identities. Less successful were attempts to encourage students to engage in peer review of colleagues' online activity as a means of introducing them to this central concept. Even using the private “direct message” channel, students were not willing or able to produce a balanced review that contained appropriate criticism or suggestions for improvement. Instead, they produced overtly complimentary commentaries even when defects were apparent. Although private, these peer reviews were not anonymous, which may have been part of the problem, but we suspect it is more likely that these junior scientists were insufficiently confident in their own abilities to be willing to criticise defects in others' work.

Conclusions

The purpose of this investigation has been to explore whether social network tools such as Friendfeed can be used to move away from teaching isolated facts to isolated students towards a more collaborative approach to science education, in which student peer networks leverage academic input for maximum engagement. The outcomes to date have been sufficiently positive to encourage us to continue to develop a social approach to science education using freely available public networks to support and engage students. No published studies to date have looked specifically at the educational potential of the Friendfeed system, although recent studies have indicated that specific types of Facebook use do correlate with engagement (Junco 2012). This is in agreement with the findings presented here and suggests that social network-mediated academic activity has considerable untapped educational potential.

The data analysed in this paper represent only a fraction of the total interactions between students and between students and teaching staff, since face to face interactions and private backchannel interactions cannot, and ethically should not, be captured and analysed. Nevertheless, network data captured via public APIs and visualised with free open source tools provide a simple and powerful proxy for student engagement which moves beyond crude summary statistics of online behaviour and promises to provide more subtle and effective ways for educators to measure and guide student engagement with academic learning.

Acknowledgements

We are grateful for funding which contributed to this work received from the University of Leicester Teaching Enhancement Fund and the HEA UK Centre for Biosciences.

Notes

1. Viva voce is an oral examination where the student defends their knowledge of a submitted piece of work.

References

Bandura, A. (1977) Social Learning Theory, Pearson Education, New York.

Baran, B. (2010) ‘Facebook as a formal instructional environment’, British Journal of Educational Technology, vol. 41, no. 6, pp. E146–E149.

Cann, A. J. et al. (2006) ‘Assessed online discussion groups in biology education’, Bioscience Education, vol. 8, [online] Available at: http://www.bioscience.heacademy.ac.uk/journal/vol8/beej-8-4.aspx (accessed July 19, 2011).

Conole, G. & Alevizou, P. (2010) ‘A literature review of the use of Web 2.0 tools in Higher Education’, Higher Education Academy Evidence Net, [online] Available at: http://www.heacademy.ac.uk/assets/EvidenceNet/Conole_Alevizou_2010.pdf (accessed July 19, 2011).

Dawson, S. (2010) ‘“Seeing” the learning community: an exploration of the development of a resource for monitoring online student networking’, British Journal of Educational Technology, vol. 41, no. 5, pp. 736–752.

De Laat, M. et al. (2006) ‘Analysing student engagement with learning and tutoring activities in networked learning communities: a multi-method approach’, International Journal of Web-Based Communities, vol. 2, no. 4, pp. 394–412.

Duffy, P. D. (2010) ‘Facebook or Faceblock: cautionary tales exploring the rise of social networking within tertiary education’, in Web 2.0-Based E-Learning: Applying Social Informatics for Tertiary Teaching, eds M. J. W. Lee & C. McLoughlin, Information Science Reference, Hershey, PA, USA, pp. 284–300.

Facebook Data Team (2009) Maintained Relationships on Facebook, [online] Available at: http://www.facebook.com/note.php?note_id=55257228858 (accessed July 19, 2011).

Ferster, C. B. & Skinner, B. F. (1957) Schedules of Reinforcement, Appleton-Century-Crofts, New York.

Goncalves, B., Perra, N. & Vespignani, A. (2011) ‘Validation of Dunbar's number in Twitter conversations’, arXiv:1105.5170v2, [online] Available at: http://arxiv.org/abs/1105.5170 (accessed July 19, 2011).

Harper, S. R. & Quaye, S. J. eds. (2009) Student Engagement in Higher Education, Routledge, New York and London.

Hawe, P. & Ghali, L. (2008) ‘Use of social network analysis to map the social relationships of staff and teachers at school’, Health Education Research, vol. 23, no. 1, pp. 62–69.

Huston, C., Benyoucef, M. & Weiss, M. (2011) ‘Following the conversation: a more meaningful expression of engagement’, E-Technologies: Transformation in a Connected World – Lecture Notes in Business Information Processing, vol. 78, no. 6, pp. 199–210.

Jones, N. et al. (2010) ‘Get out of MySpace! learning in digital worlds: selected contributions from the CAL 09 conference, April 2010’, Computers & Education, vol. 54, no. 3, pp. 776–782.

Jones, J., Gaffney-Rhys, R. & Jones, E. (2011) ‘Social network sites and student-lecturer communication: an academic voice’, Journal of Further and Higher Education, vol. 35, no. 2, pp. 201–219.

Junco, R. (2012) ‘The relationship between frequency of Facebook use, participation in Facebook activities, and student engagement’, Computers & Education, vol. 58, no. 1, pp. 162–171.

Lee, H., Kim, J. W. & Hackney, R. (2011) ‘Knowledge hoarding and user acceptance of online discussion board systems in eLearning: a case study’, Computers in Human Behavior, vol. 27, no. 4, pp. 1431–1437.

Lefever, R. & Currant, B. (2010) How can Technology be Used to Improve the Learner Experience at Points of Transition? Higher Education Academy, [online] Available at: http://www.slideshare.net/rcurrant/how-can-technology-be-used-to-improve-the-learner-experience-at-points-of-transition (accessed July 19, 2011).

Lester, J. & Perini, M. (2010) ‘Potential of social networking sites for distance education student engagement’, New Directions for Community Colleges, vol. 2010, no. 150, pp. 67–77.

Long, A. (2010) ‘Social media overtakes search engines’, Hitwise Intelligence, [online] Available at: http://weblogs.hitwise.com/alan-long/2010/01/post.html (accessed July 19, 2011).

Longman, D. & Green, K. (2011) ‘Digital enlightenment: the myth of the disappearing teacher’, Collected Essays on Teaching and Learning, vol. 4, [online] Available at: http://celt.uwindsor.ca/ojs/leddy/index.php/CELT/article/view/3283 (accessed July 19, 2011).

Madge, C. et al. (2009) ‘Facebook, social integration and informal learning at university: it is more for socialising and talking to friends about work than for actually doing work’, Learning, Media and Technology, vol. 34, no. 2, pp. 141–155.

Mann, S. J. (2001) ‘Alternative perspectives on the student experience: alienation and engagement’, Studies in Higher Education, vol. 26, no. 1, pp. 7–19.

Mann, S. J. (2005) ‘Alienation in the learning environment: a failure of community?’, Studies in Higher Education, vol. 30, no. 1, pp. 43–55.

Marton, F. & Säljö, R. (1976) ‘On qualitative differences in learning – I: outcomes and process’, British Journal of Educational Psychology, vol. 46, pp. 4–11.

Minocha, S. (2009) A Study on the Effective Use of Social Software by Further and Higher Education in the UK to Support Student Learning and Engagement, [online] Available at: http://www.jisc.ac.uk/whatwedo/projects/socialsoftware08 (accessed July 19, 2011).

Newman, M. E. J. (2004) ‘Fast algorithm for detecting community structure in networks’, Physical Review E, vol. 69, no. 6, p. 066133.

R Development Core Team (2011) R: A Language and Environment for Statistical Computing, R Foundation for Statistical Computing, Vienna, Austria. ISBN 3-900051-07-0, [online] Available at: http://www.R-project.org (accessed July 19, 2011).

Reason, P. & Bradbury, H. eds. (2000) Handbook of Action Research: Participative Inquiry and Practice, Sage, Thousand Oaks, CA.

Trowler, P. & Trowler, V. (2010) Research and Evidence Base for Student Engagement, The Higher Education Academy, [online] Available at: http://www.heacademy.ac.uk/resources/detail/studentengagement/Research_and_evidence_base_for_student_engagement (accessed July 19, 2011).

Yu, A. Y. et al. (2010) ‘Can learning be virtually boosted? An investigation of online social networking impacts’, Computers & Education, vol. 55, no. 4, pp. 1494–1503.

Zeiler, M. D. (1968) ‘Fixed and variable schedules of response-independent reinforcement’, Journal of the Experimental Analysis of Behavior, vol. 11, pp. 405–414.
