Australasian Journal of Educational Technology
2008, 24(1), 57-72.

Effective online interaction: Mapping course design to bridge from research to practice

Mary Thorpe
The Open University

Quantitative and qualitative research of a case study course confirmed that the course achieved a highly interactive learning experience, associated with more effective student support and high student retention. Computer conferencing achieved high participation from the beginning and evidence of dialogue and argumentation within online tutor groups. This was achieved not by active tutor moderation but by a sequence of structured tasks. Compendium mind mapping software has been used to represent the design of this sequence of tasks and this has refined interpretation of the research findings. The positive outcomes identified relate not purely to computer conferencing but to an integration of individual and group tasks feeding forward into a well-designed assignment. The usability of case study data relates to the ability of practitioners to compare their own context with that of the case. The visual representation of the design of the task sequence is providing a better bridge from the research to the practice context than the use of general description of findings alone. This is particularly important in an area which has generated a range of sometimes conflicting findings, with weak links to the challenges of course design.


Introduction

This paper has two themes. First is the theme of interaction in online learning and the interpretation of research findings about an undergraduate course that successfully integrated computer mediated interaction into the study process. Second is the theme of educational technology research and the ways in which this might be communicated to practitioners with an interest in applying research based understandings to future design of online learning environments. It is suggested that research might have increased value if it provided more information about the design of the teaching and learning interactions associated with its findings. This would enable the findings reported to be interpreted in relation to the way in which the technology was implemented, and the context of the implementation, rather than to the technology as an abstract concept such as 'computer mediated communication'.

The format of this information is perhaps less important than that it is effectively communicated. Whether drawing on the work on patterns (Goodyear, 2005) or notational forms of learning design (Agostinho, 2006; Conole & Fill, 2005), representations of the events and processes involved are essential if we are to reveal key features of the design embedded in the learning and teaching episodes being researched. The justification for such an approach is twofold: first, that it should lead to a refinement of research outcomes about what the impacts of particular pedagogies are, and second, that it should better support the use of that knowledge by practitioners as they make decisions about what technologies and pedagogical designs to use and adapt in their teaching.

Reeves has argued that instructional technology research has too often ignored issues that are important in current practice and has generally made little or no impact on practitioners (Reeves, 2000). If we consider the research on computer mediated conferencing and interaction as an example, research on learner reactions and perceptions is often presented with insufficient specification of the learning environment and activities that created the context for the research. In a review of research studies of interaction, Bannan-Ritland also identified 'multiple definitions of interactivity... in the primary research reported in educational technology academic journals' (2002, p. 162), arguing that 'only when distinct definitions of interactivity are delineated and types of interaction are clearly identified will the research... progress to provide informative practical guidance for the eLearning design and development community' (p. 172). Furthermore, she reported a large number of studies of interpersonal interaction - learner-learner and learner-instructor types of interaction - with a majority of research studies focusing on asynchronous communication. Using Hirumi's (2002) taxonomy of interaction types, she noted that 'other types of interactivity... were minimally represented in the review, including learner-instruction and most learner-non-human types of interaction (learner-content, learner-interface, and learner-environment)' (Bannan-Ritland, 2002, pp. 172-173). Bannan-Ritland also lists the outcomes of the research reviewed, for example 'interpersonal issues and creation of an initial welcoming atmosphere are important in eLearning courses' and 'cooperative or collaborative activities are perceived to foster interactivity'. However, these general points still leave the practitioner with a very wide range of design choices for implementing these research outcomes.

Thus practitioners may struggle to build on the results of research, being unable to find ways of achieving the positive outcomes that others have reported. A recent review of professional online learning (Maor & Volet, 2007) also notes that most of the empirical research 'reflects trial and error approaches with limited direction to guide instructional design, implementation, and future research'. As Haythornthwaite (2006, p. 9) comments, 'Much has been written in dystopic and utopic terms about the transformative nature of computer-mediated communication', and this is in part a result of the fact that CMC can be implemented in many different ways, in different contexts, with diverse effects. The value of such research findings for the development of practice is therefore limited, unless we incorporate the details of the implementation into the research process and its communication.

This is not to argue that research thus far has nothing to offer practice. Haythornthwaite (2006), for example, has combined analysis of the research findings on collaborative learning with clarification of the many dimensions of collaboration and the practical issues to which they give rise. She also includes evidence of some of the negative impacts, such as overload for staff and students, too little time for trust to develop and the clash between collaboration and individual achievement orientation. She identifies the key differences between cooperation and collaboration, and between knowledge application and knowledge construction, as distinctive forms that involve 'an important fundamental difference in process and goals' (2006, p. 6). The learning design intended to deliver these different forms of interaction will involve decisions about the type and timing of learning tasks, structured or less structured task management, locus of control and the duration of the interaction. Recommendations are made about what should be taken into account in the learning design, if not the design itself.

Other researchers of CMC have also sought to move beyond the early research emphasis on outcomes, specifically whether social interaction was beneficial for learning. The use of a computer mediated environment for interaction led to recognition that the nature of the interaction had to be designed for, and that greater insight was required into what constitutes productive collaborative activity and the processes of learning collaboratively (Littleton & Hakkinen, 1999). Littleton and Hakkinen highlight the developments in research from Piagetian cognitive constructivism, through the social constructivism inspired by Vygotsky's work, to cultural psychology (situated learning). Each perspective has emphasised key features of interaction, such as perspective differences and socio-cognitive conflict between collaborating pairs, the content of discourse, particularly the degree to which argumentation and justifications are involved, and the powerful influence of social representations of self and others. Much of this work, however, is based on either tightly controlled experimental situations or classroom observation. The introduction of computer managed and designed environments opens up new possibilities for collaboration, in terms of who might be involved in collaborations, in what ways, and over what sort of time periods. This complexity adds a new dimension to an already complex situation in which, as Littleton and Hakkinen assert, 'the challenge... is how best to develop an understanding of collaborative learning environments as systemic wholes where all the factors reciprocally affect each other' (1999, p. 30).

Interaction research and research methods

Within distance education, Moore (1989) first identified interaction as a key factor in 'transactional distance'. Interaction, whether between learner and content or between the learner and other learners and/or the tutor, might reduce the negative effects on communication of the distance between teacher and taught. The world wide web, however, has revolutionised what is feasible for distance education, in that learners can be networked and able to interact, whether one to one, one to many or many to many, at any time during their studies, at least in theory. Furthermore, being networked means that interaction with content and with people can be integrated within one virtual learning environment, as this definition makes clear:
networked learning is learning in which ICT is used to promote connections: between one learner and other learners; between learners and tutors; between a learning community and its learning resources (Goodyear et al, 2004)
This therefore overcomes not only the limitations of face to face interaction in distance education, which is likely to be infrequent and some distance away from most students, but also those of the campus, where interactive events are still scheduled and available only at specific times and places. However, availability 'in principle' has not easily translated into availability and use in practice, whether for distance education or for the campus. An ESRC research series on networked learning concluded with a manifesto statement arguing for a move away from the use of ICT for delivery of resources, in favour of interactivity through use of conferencing, email and virtual learning environments (ESRC, 2002).

This lack of effective use of ICT to promote dialogue and interaction has persisted in reports of ICT usage for at least a decade. Within the Open University, student feedback over the same period has demonstrated a lack of takeup by students of opportunities for computer conferencing, unless such opportunities are made virtually compulsory by being tied into the assessment (Kirkwood & Price, 2005). McAlister et al (2004), however, report that the design of interaction and language supports can bring positive outcomes in terms of the use of effective argumentation skills among groups of students studying masters level science. Notwithstanding such experiments, the use of conferencing has not produced easy wins in terms of high levels of participation and quality contribution by students.

The opportunity to pursue these issues arose within the context of a research project at the Open University UK, focusing on the issue of interaction in computer mediated teaching. A sample of thirty-six courses using interaction in varying ways and to varying degrees was selected in order to explore the nature and impact of different kinds of interaction (Thorpe & Godwin, 2006). One of the courses was made the subject of an in depth case study because of the high level of interaction used in its design and the integrated nature of the interaction within both the content and the process of the course.

This course, titled The Environmental Web (U316), was researched using a combination of both quantitative and qualitative methods. Conventional performance indicators suggested that the course was successful in terms of student retention and outcomes. Interviews with tutors and students were then used to identify key factors in the success of the course. These pointed towards the particular way in which students interacted both with content and with each other online. These findings are presented in the account which follows, but we also go further to document one particular activity as an instance of the success of the general pedagogical approach. A notational map and account of one of the most successful of the interactive learning activities on the course is used to specify the tasks and the design of the learning environment in detail. The aim is to provide evidence about what kind of interaction and collaboration generated the findings we report. We also aim to identify processes embedded in the design that appear to have played a highly facilitative role and thus to substantiate what it might be about the design of the activity that drives its success in context.

The course context: The Environmental Web

This course is equivalent to half of a full time year's study and recruits approximately 450 students each year, many of whom study it because it is mandatory within the Environmental Studies degree. Almost all students have studied other courses and some will be close to graduating. The course team asserts that environmental studies is carried out on the web and that students must use the web, not only for gathering information but for evaluating and discussing its significance. The emphasis is carried through into the aims and learning outcomes, summed up by the chair as follows:
Our overall aim is to provide you with the skills needed to develop your own environmental literacy and to take part in informed environmental debate and action, rather than to expand your environmental knowledge as such. (Course introduction).
The study process is led by an Online Activities guide for each of the four blocks of the course, and overall, students are expected to spend at least half their time studying online. Activities done individually require students to be active in searching, evaluating and using information from the web. Students also receive feedback on their responses, and individual activities lead into online discussion and interaction, subsequently contributing to marked assignments. Although there are no tutorials, there is one day school, and tutors interact with students frequently online.

Interactivity in this course can be related to two of Bannan-Ritland's (2002) definitions of interaction - as active involvement by the learner, and as a range of instructional activities and technologies. The course also includes learner-environment interaction (Hirumi, 2002), in that students undertake field observations in their local area of specific types of birds, dragonflies and woodlice, inputting their data to the course biodiversity database which displays it in geographically referenced form for analysis and use in assignments. Students also learn about environmental journalism on the web, drafting their own article and voting on the best in their group and later, the best on the course. Two climate modelling tools also require students to interact with software and use the results in an assignment. Students submit several assignments as web pages, including their final project report, which is developed on a topic of their own choosing and replaces the conventional examination. Students thus interact in diverse ways - with their local environment, with other students on a continuing basis, with their tutor and with course resources and tools. All of Hirumi's types of human and non-human interaction are represented in this way.

The course was researched using both quantitative and qualitative methods. Data about the course overall is discussed first, then issues arising from these findings and from interviews with both tutors and students.

Evidence of the impact of the course design as a whole

One of the key performance indicators for any course within an open access, distance learning system is its completion and pass rate. Table 1 shows that the course completion rate is around 10% higher than the average for all level 3 courses in the Science Faculty and marginally higher than the level 3 average in the Social Science Faculty in two out of three years. In 2005, 75.5% of all students who started the course achieved a credit, the highest percentage on this indicator for any level 3 course in the Science Faculty.

Table 1: Rates of completion compared with faculty averages from Science and Social Science

Course(s) base for calculation                            % of students* who complete
                                                          2003       2004       2005
U316: The Environmental Web                               78.6       74.0       77.6
Science Faculty average for all Level 3 courses           69.2       68.5       66.5
Social Science Faculty average for all Level 3 courses    76.4       77.3       76.5
* Total numbers for % calculation are as in the returns to the UK Higher Education Funding Council.

A survey of thirty-six courses (including The Environmental Web) using computer mediated interaction was carried out in 2004 with a response rate of 47%, using an adapted version of Ramsden's Course Experience Questionnaire (Ramsden, 1991). This contained 36 questions generating seven scales: appropriate assessment, appropriate workload, clear goals and standards, emphasis on independence, good materials, good tutoring and generic skills. Student responses concerning The Environmental Web were highly positive in relation to appropriate assessment and generic skills, with highly ranked scores on items shown in Table 2 (where 1 equals 'strongly disagree' and 5 equals 'strongly agree').

Students' responses showed that the course successfully requires them to apply understanding in completing the assignments and to develop higher level skills, in particular problem solving and team work. These relate to key aims for the course as a whole. The course meets the most demanding test in terms of retention and successful completion of assessment requirements. However, qualitative research was undertaken to explore in more depth the nature of the impact of computer mediated interaction on the course and whether this might play a strong role in its success.

Table 2: Positive responses to The Environmental Web
on items from the Course Experience Questionnaire

Items from the CEQ                                                  Score, TEW*   Median score+   Rank (1=best)
Helped me develop problem solving skills                            3.8           3.6             6
Helped my ability to work as a team member                          3.4           2.1             4
Has sharpened my analytic skills                                    3.8           3.8             13
To do well on this course all you need is a good memory            1.5           1.9             1
More confident about tackling unfamiliar problems                   3.6           3.3             8
The course was more to do with testing memory than understanding   1.4           1.9             2
Helped me to develop the ability to plan my own work                3.6           3.6             11
This course really tries to get the best out of all students        3.9           3.7             8
* The Environmental Web      + based on 36 courses surveyed
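The paper does not describe the computation behind Table 2, but as a rough illustration the figures can be read as item means for the course set against the median across the 36 surveyed courses, with rank 1 denoting the most favourable position (for negatively worded items, a low score is best). The following sketch is illustrative only: the item names, data and ranking rule are assumptions, not taken from the study.

    # Illustrative sketch only: one way course-level item scores and ranks such as
    # those in Table 2 could be derived from raw Course Experience Questionnaire
    # responses. Course names, item names and data are hypothetical.
    from statistics import mean, median

    # responses[course][item] -> list of student ratings on a 1-5 Likert scale
    responses = {
        "U316":    {"problem_solving": [4, 4, 3, 5, 3]},
        "CourseB": {"problem_solving": [3, 4, 3, 3, 4]},
        "CourseC": {"problem_solving": [4, 3, 3, 4, 3]},
    }

    def course_score(course, item):
        """Mean rating for one item on one course."""
        return mean(responses[course][item])

    def median_and_rank(item, target="U316", higher_is_better=True):
        """Median score across all courses and the target course's rank (1 = best).

        For negatively worded items (e.g. 'all you need is a good memory') pass
        higher_is_better=False so that a low score ranks first.
        """
        scores = {course: course_score(course, item) for course in responses}
        ordered = sorted(scores, key=scores.get, reverse=higher_is_better)
        return median(scores.values()), ordered.index(target) + 1

    print(course_score("U316", "problem_solving"))   # e.g. 3.8
    print(median_and_rank("problem_solving"))        # (median across courses, rank)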

The tutor perspective

Five tutors based in three regions (one quarter of all tutors) were interviewed. The regions were selected by the researchers and drew on diverse populations, and all tutors in those regions participated on request. A semi-structured schedule was used; interviews were recorded and transcribed, and the transcripts were checked with respondents. A grounded approach was used in the analysis (Strauss, 1987), with detailed study of each transcript and the building of rich connections between the perceptions communicated in each interview. The interview strategy was not to assume that computer mediated interaction was the most important feature of the course, but to explore tutor perceptions first, to find out how they perceived the important issues. Progressive focusing then enabled exploration of the significance of interaction and the form it took on this course.

Tutors were asked to identify whether particular aspects of the course teaching were key to its success. All tutors picked out aspects associated with interpersonal interaction, with three tutors commenting on the conferencing, the high rate of participation and continuity of student contact with peers as well as the tutor. Tutors also highlighted the beginning of the course as key to its success. The first six weeks of the course focus on a learning activity which feeds into an online debate and culminates in a day school.

...Yes again going back to the beginning of the course, the way it starts, it's very intense at the beginning...that particular aspect of getting everybody involved right at the very beginning really sets the scene for the rest of the course. It blends tutor groups, it gets students involved with other students on a national basis and it starts in a very interesting way where students can get very involved. (Tutor 1)

...it starts well. Looking at information on a particular island so people feel that they get to grips with something and they've got ideas that they can take forward. (Tutor 2)

Students are not given a choice about online participation. All students are allocated to a tutor in groups of approximately 20 students per tutor. They must complete the activities and conference with tutorial group peers if they want to submit the first assignment. Tutors say that this gets students involved right from the start and ensures that they engage with each other and with the course. Tutors were also asked whether students are effectively supported on a course which, unlike most other OU courses, does not have regular face to face tutorials. All responded unhesitatingly that students are much better supported on The Environmental Web than other courses they tutored. The support comes as much from other students, they felt, as from tutors, with accessibility to online help being available at most times of the day and evening. Support can be for any issue that students raise, whether related to course content, assignments, technical difficulties or personal matters - though these are more likely to be raised via one to one emails with a tutor. This finding replicates one of the research outcomes that Bannan-Ritland notes, namely 'Instructors report that they spend more time (interacting) in an eLearning course than in traditional courses' (2002, p172) but also emphasises peer support as much more accessible where CMC is used:
...without question the computer conferencing aspect of the course offers so much support both at a national and tutor group level... the overwhelming feedback from that... was that it had been a huge help. (Tutor 1)

I think I give more support in lots of ways, and they certainly support each other. (Tutor 5)

I think they're more effectively supported because... I'm checking the conferences and my email everyday... if they really needed it for a period of time, they could get day to day support on the course. (Tutor 3)

I think that the students interact far far more... there's more of a sense of the group as a whole contributing and working together to develop their understanding. (Tutor 4)

All tutors were asked to reflect on what made the conferencing on this course work, when experience on other courses can be so disappointing. Most cited the uncompromising approach of the course team and the integration with assessment. However this compulsory aspect went along with high levels of interest and involvement that made the process enjoyable, not merely mandatory:
...with [The Environmental Web] there are marks in [assignments] for contributions and for writing up discussions, so in a way if they want marks... they've got to contribute... but it just seems to be that... students have to to some degree and I think they then enjoy it - most of them enjoy it - and that as I say involvement of most of the tutor group actually stays after the initial really intensive small islands debate that we have that goes on for several weeks, leading up to the day school and the day school brings all that together and the students get to know each other face to face and that's really really valuable because they then carry on the input to the tutor group conference. (Tutor 1)
This tutor and others emphasised the importance of the online activities in generating high levels of participation - again contrasting this with other less successful course approaches. These activities achieved genuine involvement - gave students a reason for being online, as this tutor expressed it:
...with [The Environmental Web] there are all these structured online tutor group conference activities... We're given guidelines... but with other courses we're not given that. It's very much up to individual tutors to get students to try and participate... that's the difference with U316. It's got these structured activities that hold students there and give them a reason for being there. (Tutor 1)
Activities requiring students to discuss and debate online were designed by the course team and did not require a strong moderating role on the part of the tutor. Indeed tutors were asked not to moderate actively but to monitor and intervene only if necessary to bring things back on track. Tutors retained their usual important role of individual contact and support, particularly at the beginning to make sure every student was successfully online, and throughout in terms of responsiveness to electronic contact. Tutors were also able to take local decisions about timing to ensure that the online collaboration worked successfully for their group, as these tutors outlined:
...with the conferencing pretty well everybody is involved, and I don't think anybody is worse off... I'm quite flexible with closing times for example, so if I know people are away they can still come in late to contribute and the students who've done the work by the deadlines for the conference, I make sure that they are not disadvantaged if they haven't picked up something later, but a student who turned up later and contributed will still get credit for that within the mark scheme. (Tutor 4)

I've been encouraged to use (conferencing) flexibly... to the advantage of my students... I try to start this activity as soon as possible and stretch it out a bit to enable everybody to contribute... Start the process a little early, brief a little early... it kind of smoothes out, dampens out the problems for some people and enables things to run more smoothly. (Tutor 3)

Tutor interviews therefore highlighted the richness of a genuinely constructivist pedagogy, where students have to develop their own judgements and apply knowledge in challenging problem solving tasks and collaborative activities. Their emphasis on the activities and the integration with assessment was taken forward into data collection with students, and reported in the next section.

Student feedback on the design of a collaborative learning task

Tutors had identified the start of the course - the first six weeks approximately - as critical to its success. The online activity guides were used to identify the structure and design of study tasks during this period. In brief, students engage in two main phases of activity which feed forward into the first assignment, where 35% of the marks are based on analysis of the outcomes of online interaction in tutor groups. Students prepare for the collaboration by working individually on data collection and documentation. They then upload data and suggestions to the tutor group conference, where they subsequently discuss and draft a consensus statement of demands, as if from the Association of Small Island States (AOSIS) to the United Nations. The online guides are extremely clear, using a template for every task that specifies its name and number, states the learning outcomes and the estimated study time, explains the rationale for the task and sets out clearly what the student has to do. These tasks require students to act and to reflect on the results, representing a form of intrapersonal cognitive interaction (Bannan-Ritland, 2002, p. 172), particularly where student responses generate feedback.

Ten students volunteered to contribute to the research and were contacted by email about their experience of completing these activities. Three were also interviewed by telephone, each interview lasting approximately one hour, using a semi-structured schedule. Interviews were transcribed and sent to students for checking. Most students were found to log on daily, and all had been able to do virtually all of the tasks set out in the series of activities, though study times varied. Student feedback on the collaborative online group activity in particular was very positive. Students were asked whether not having met their fellow students in advance made it difficult to participate. This student identified task clarity as key to being able to 'project themselves socially and emotionally' (Rourke et al, 2001, p.3) and to make effective contributions from the beginning:

Interviewer: So did you find it difficult to contribute... because you hadn't met these people first?
Student: No no not at all. Because in there we had an aim, we had a target so I didn't mind at all that I did not know the fellow students. We just exchanged views...
All were asked whether it had been possible to express differences of view, since previous research suggests students tend to avoid argumentation (McAlister et al, 2004). The mechanism of representing a particular small island state appears to have had the positive effect of enabling students to express their views directly without fear of personal offence. The phased activity design meant that they had researched the vulnerabilities of their island before conferencing, so they used evidence to support their views and engaged genuinely with the needs of their island.
Interviewer: did you find it possible to disagree?
Student: Oh very much so - people did disagree a lot and managed to put forward their points of view a lot, which I really liked, and backed it up with examples... most people's decisions were informed and you could see that.
This group had taken the initiative to use a spreadsheet to plot their views - evidence of self organisation in a context where the group were clear about their task and not dependent on tutor facilitation. It was also possible to disagree with a majority view where that clashed with what was in the interests of the island that a student was representing, as this student made clear:
About half way through we put everything on a spreadsheet to see what kind of opinions were coming forward, and it was quite clear that three issues were coming forward from most people, so you... thought... if you weren't in that consensus you would be in a minority and probably you'd have more sway if you felt able to join the majority... on most of the issues I could but there was one or two issues where I said no there's no way I'm going to compromise on that... I was Haiti, so I was very poor... there was a lot of wealthy islands, so some people didn't have the issues that Haiti did so there was some things that I just couldn't compromise on.
This comment reveals a degree of identification with the island being represented, so that the student feels that her arguments and views relate to something beyond herself or the preferences of other students. It appears to have released her into feeling able to disagree and where necessary, to take an independent position from the group. Another student worked in a group that agreed to work online on a selected date, to improve the process of reaching consensus - another sign of self organisation. She also described a process of reasoned debate where views could be changed:
Student:...We had a discussion about tourism... and that was one of the points that we'd agreed on the Sunday and then after some more of the comments the following week it was changed to not stopping tourism at all but going for eco-tourism and going for high taxes on air flights... so that opened up a separate debate in that area and that was one of the things that we altered the opinion on.
The phased design of the activity motivated students to engage with the evidence about small islands before interacting with their peers, and the role play freed them from unease in putting forward and justifying their positions - they argued for 'their' island, not for themselves.

Communicating research results to practitioners

The findings reviewed reveal a case study which offers practitioners a successful example of computer conferencing that achieved participation from all active students (i.e. those who submitted the first assignment), working in groups that had never met, but where students were willing nevertheless to engage in genuine debate, using evidence for their views and arguing constructively both for and against the consensus of their group.

However, the findings themselves raise many issues for the course designer interested in applying them to the design of online interaction that would achieve similar positive results. A simplified diagram of the activity was used in presentations of the research, to highlight the three stages of individual online data collection, followed by group collaboration and then the assignment. However, this too proved too general to reveal important details. These were revealed by a more detailed mapping using Compendium, an open source mind mapping tool. Compendium is currently being used to document learning designs in use at the Open University (Conole et al, 2007). Here it has proved useful in revealing key aspects of the design that relate to the research findings documented. Accordingly, the results of the research have been presented in workshops for practitioners alongside a Compendium map of the sequence of activities at the beginning of the course that produce such positive outcomes. This is shown in Figure 1, although it should be emphasised that this print version obscures the functionality of the software.


Figure 1: Mapping a sequence of tasks leading to completion of Assignment 1

As Goodyear asserts, academics do not want packaged solutions but 'customisable, reusable ideas' (Goodyear, 2005, p1). Compendium can be used either to clarify one particular task sequence, as here, or to provide a generic account of a number of tasks where there is a broad similarity of structure that can be revealed. In both cases, the practitioner can see the design and reflect on its possible applicability to a different context and similar but not identical learning objectives. The detail that can be incorporated 'behind' each icon can be accessed if necessary, or not viewed, in order to see the main steps in the sequence, the inputs, outputs and support roles.

Compendium has proved useful in the task of communicating from research results back to the design of student tasks. It has required detailed study of the actual tasks that students are required to do, including study time per activity. Each icon in the map in Figure 1 can be expanded on screen to show a text box where such details are included. Of the 8.5 hours required for the whole sequence, for example, 3 hours are allocated to the group online collaboration. Almost all the rest of the time is spent working individually, involving content interaction and intra-personal interaction through searching for, analysing and using data about the small island allocated to the student.

The individual study phase is spread over three main stages, in each of which different types of data have to be collected and used by the student. These stages appear to play an important role in getting students to know their island and therefore to be in a position to represent it and even to identify with it. A much briefer period for data collection and review might not lead to such positive results. The tutor initiates the activity by allocating a different small island to each student in their tutor group. All students are required to log on to a short practice conference, which enables the tutor to chase any 'no shows' and ensure they engage with the main debate. During the individual research phase, students follow four main activity steps and interact with online resources:

Step 1: Gather data on the topography and main features of the island and input that into a prepared table.
Step 2: Gather information about the likely environmental impacts on the island and write this up as a text summary.
Step 3: Collect data on the carbon dioxide emissions of the island and a breakdown of the energy sources used, added again to the data table for this purpose.
Step 4: Draft an initial statement of possible claims by AOSIS and upload it to the tutor group forum.

Students are then ready to engage in online collaboration with members of their tutor group. Their task is to draft a statement as from the Association of Small Island States (AOSIS) to the United Nations, outlining a number of demands and reparations required by AOSIS in light of the environmental vulnerabilities of the small island states that they each represent. Ideally all members of the group should feel able to agree with the statement, but minority views can be expressed and signed by individual members if necessary. The interaction is triggered by the tutor and begins with each student submitting a summary of the vulnerabilities of their island and the demands that would be in its interest. Students are directed to read their peers' postings (each of which will be about a different small island, as researched by the student) and to formulate ideas for the collective demands. Students then debate the issues online with their tutor group and try to reach agreement on their collective statement. This stage may extend over one to two weeks and is allocated 3 hours of study time, though some students spend longer than this.

The final statement agreed by the group is submitted as part of the first assignment in the course, where students receive 35% of their marks for a) writing a review of the discussion; b) explaining their role/contribution to the discussion and how a consensus was reached, or if not why that was not possible; and c) identifying one key item of data from their individual research activities that would support one of the demands their group discussed, explaining the significance of the data and how it supports the demand.
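For readers who think in terms of data rather than diagrams, the information held 'behind' each icon in a map such as Figure 1 can also be sketched as a simple structured record of the task sequence. The sketch below is hypothetical: it is not Compendium's own format, the field names are assumptions, and only the 8.5 hour total and the 3 hour group allocation come from the course itself (the per-step hours are illustrative).

    # A minimal, hypothetical structured record of the Assignment 1 task sequence
    # described above. This is NOT Compendium's file format; field names and the
    # per-step hours are assumptions chosen for illustration only.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Task:
        name: str
        mode: str                  # "individual" or "group"
        hours: float               # estimated study time
        inputs: List[str] = field(default_factory=list)
        outputs: List[str] = field(default_factory=list)

    assignment1_sequence = [
        Task("Gather topography data for allocated island", "individual", 1.0,
             inputs=["web sources"], outputs=["prepared data table"]),
        Task("Summarise likely environmental impacts", "individual", 1.5,
             inputs=["web sources"], outputs=["text summary"]),
        Task("Collect CO2 emissions and energy source data", "individual", 1.5,
             inputs=["web sources"], outputs=["data table"]),
        Task("Draft initial AOSIS claims and post to forum", "individual", 1.5,
             inputs=["own research"], outputs=["posting to tutor group forum"]),
        Task("Debate and agree a collective AOSIS statement", "group", 3.0,
             inputs=["all students' postings"], outputs=["consensus statement"]),
    ]

    total = sum(task.hours for task in assignment1_sequence)   # 8.5 hours in total
    print(f"Total study time: {total} hours")
    print([task.name for task in assignment1_sequence if task.mode == "group"])

Even in this reduced form, the record makes visible the design features discussed above: the weighting of individual preparation relative to group collaboration, and the outputs of each task that feed forward into the next and into the assignment.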

Compendium mapping also creates a single 'view' of all the elements involved in the sequence, such as the input from the tutor that is required, and the role of resources and outputs in achieving the final result. Bringing together all these aspects into one 'view' enables the practitioner to get an overview of what Littleton and Hakkinen refer to as a 'systemic whole' - the many factors that together construct a collaborative learning environment. In using Compendium with practitioners at the OU, we are finding that a team can view a map and immediately reflect on its applicability for themselves, and on the various aspects that might be needed or adapted in their own context (Conole et al, 2007). It stimulates useful reflection at a number of levels - not only on the detail of what happens when, but also on the suitability of tasks in relation to particular learning outcomes and alternative routes to achieving similar ends.

However, a visual map alone is unlikely to satisfy all requirements associated with educational design. This application of Compendium is being used to illuminate research findings and take them a stage closer to the practitioner perspective. It therefore sits within a detailed account of a particular context and rich information about student and tutor response and student achievement. This kind of narrative detail is helpful in making judgements about what from a particular case might be adapted or applied elsewhere (Yin, 2003). The unpredictability of the impact of educational design and the complexity of the process leads Goodyear (2005) to favour Alexander's framework of a pattern language. One element in the pattern framework is to identify the problem for which the design offers a solution.

Addressing this question in relation to the sequence of tasks mapped in Figure 1 reveals that 'the problem' is multiple rather than singular - there are many challenges for which this pattern is a 'solution'.

The response to these problems is not so much a single task as a sequence of linked tasks that together address all the challenges that the course team had to meet at this point in the course. There is no single 'activity type' that alone delivers the positive student response.

These multiple challenges are typical of the task facing practitioners designing a course, and while we can simplify them by splitting them into component tasks, in reality many teams and designers will need to recombine patterns or small scale designs - whatever the favoured terminology - in order to deliver an effective design overall. We have much to learn about the orchestration of tasks and resources, as well as about the design of their component parts.

Design mapping and the interpretation and communication of research

The design of a course selected as a case study of integrated, computer mediated interaction provided a highly structured context which successfully engaged students and supported their achievement of key skills and assessment goals, notably problem solving, team work and tackling unfamiliar problems. Tutor feedback was extremely positive, and interaction was valued primarily in terms of the high level of interpersonal contact and the active study process it delivered. Students were also seen to be much better supported than on other OU courses, even though there were no regular face to face tutorials. These features were seen as key to the success of the teaching, particularly at the beginning of the course. The number of students researched was small, though tutors drew on their experience of approximately a quarter of the students on the course. Nevertheless, generalisation beyond the case depends on the relevance of contextual and design features, rather than on statistical representation.

Research into the student experience confirmed the success of the structured activities at the beginning of the course, and their effectiveness in supporting reasoned argument and constructive difference of opinion. This was achieved through the design of activity and did not depend primarily on the skills of a moderator or facilitator, which previous research has emphasised (Salmon, 2000). The research therefore provides positive evidence in favour of conferencing and collaborative online tasks, even at a stage where students had not met each other.

However it was important to document the design of the course that delivers this positive outcome, in order to identify both the pedagogical strategy used and the way in which a sequence of structured tasks supports effective participation. This was done using Compendium software in addition to prose description. This process of documenting the design of key stages in a successful course helped refine both the interpretation of research findings and their communication to practitioners. Although the qualitative research with tutors and students pointed towards interpersonal interaction as key to the course's success, analysis of the online activities revealed a more complex situation. A sequence of carefully crafted online tasks required students to engage first in content and intra-personal interaction (Bannan-Ritland, 2002) and these forms of interaction were essential to the quality of the online group collaboration they fed into. The mapping also revealed other important elements, namely inputs in the form of detailed explanations in the online guides, outputs that students were required to construct at various points, and the format of the assignment. The findings from the research therefore offer a different perspective from that of studies providing evidence about how CMC in general can play a productive role in supporting learning. They are evidence of how a sequence of tasks requiring diverse forms of interaction effectively combined individual and group work online, to ensure high participation and online argumentation.

While a narrative account of the course design was still required, it was the construction of the map that prompted detailed checking of the design, the inputs and the outputs it required, and led to refinement of the interpretation of the research findings. In communicating the outcome of research to teaching staff, it has been essential to use both a narrative and a visual representation (as in Figure 1) to clarify the contextual factors and pedagogical practices associated with, in this case, positive outcomes from a particular technology in use. Without this detailed mapping it would be more difficult to enable teaching staff to apply the results to the design of their own courses. This is unlikely to be a process of direct reuse but of learning about how a sequence in one context worked, and being stimulated to see how it might be creatively applied elsewhere. The clarification of the visual representation together with the rich detail that is available should the practitioner need it, provides a good basis for judging whether this is (for the practitioner) an instance of 'good design' and one which might be adapted for their own teaching.

Acknowledgements

We wish to thank Professor Jonathan Silvertown, former chair of The Environmental Web, and members of the course team, for their cooperation in the research. We also acknowledge the Andrew W. Mellon Foundation who provided financial support for the research on The Impact of Computer-Mediated Interaction in Higher Education. Reports are available from http://kn.open.ac.uk/workspace.cfm?wpid=2668 [login required]

Some elements in this paper also appear in the Journal of Lifelong Learning Society published by the Korea National Open University, Institute of Distance Education. Thanks also to the AJET reviewers for their helpful comments on an earlier version of this paper.

References

Agostinho, S. (2006). The use of a visual learning design representation to document and communicate teaching ideas. In Who's learning? Whose technology? Proceedings ASCILITE Conference. The University of Sydney. http://www.ascilite.org.au/conferences/sydney06/proceeding/pdf_papers/p173.pdf

Bannan-Ritland, B. (2002). Computer-mediated communication, elearning and interactivity: A review of the research. Quarterly Review of Distance Education, 3(2), 161-179.

Conole, G. & Fill, K. (2005). A learning design toolkit to create pedagogically effective learning activities. Journal of Interactive Media in Education, 2005/08. [viewed 15 Nov 2007] http://www-jime.open.ac.uk/2005/08/

Conole, G., Thorpe, M., Weller, M., Wilson, P., Nixon, S. & Grace, P. (2007). Capturing practice and scaffolding learning design. Paper presented at the EDEN annual conference 2007, New Learning 2.0? Naples, June 2007.

ESRC (Economic & Social Research Council) (2002). Working towards e-quality in networked e-learning in higher education: A manifesto statement for debate. Report of UK ESCR Seminar Series. [viewed 21 July 2006, verified 16 Jan 2008] http://www.csalt.lancs.ac.uk/esrc/manifesto.pdf

Goodyear, P., Banks, S., Hodgson, V. & McConnell, D. (2004). Research on networked learning: Aims and approaches. In P. Goodyear, S. Banks, V. Hodgson & D. McConnell (Eds), Advances in research on networked learning. Dordrecht: Kluwer Academic Publishers.

Goodyear, P. (2005). Educational design and networked learning: Patterns, pattern languages and design practice. Australasian Journal of Educational Technology, 21(1), 82-101. http://www.ascilite.org.au/ajet/ajet21/goodyear.html

Haythornthwaite, C. (2006). Facilitating collaboration in online learning. Journal of Asynchronous Learning Networks, 10(1), 7-23. http://www.sloan-c.org/publications/jaln/v10n1/v10n1_2haythornthwaite.asp

Hirumi, A. (2002). A framework for analyzing, designing and sequencing planned elearning interactions. Quarterly Review of Distance Education, 3(2), 141-160.

Kirkwood, A. & Price, L. (2005). Learners and learning in the twenty-first century: What do we know about students' attitudes towards and experiences of information and communication technologies that will help us design courses? Studies in Higher Education, 30 (3), 257-274.

Littleton, K. & Hakkinen, P. (1999). Learning together: Understanding the processes of computer-based collaborative learning. In P. Dillenbourg (Ed), Collaborative learning: Cognitive and computational approaches. Oxford: Elsevier.

Maor, D. & Volet, S. (2007). Interactivity in professional online learning: A review of research based studies. Australasian Journal of Educational Technology, 23(2), 269-290. http://www.ascilite.org.au/ajet/ajet23/maor.html

McAlister, S., Ravenscroft, A. & Scanlon, E. (2004). Combining interaction and context design to support collaborative argumentation using a tool for synchronous CMC. Journal of Computer Assisted Learning, 20(3), 194-204.

Moore, M. G. (1989). Editorial: Three types of interaction. The American Journal of Distance Education, 3(2), 1-6.

The Open University (2006). The Environmental Web, Course Welcome. [viewed 21 July 2006; login required] http://students.open.ac.uk/u316/course/welcome.cfm

Ramsden, P. (1991). A performance indicator of teaching quality in higher education: The Course Experience Questionnaire. Studies in Higher Education, 16, 129-150.

Reeves, T. (2000). Enhancing the worth of instructional technology research through "design experiments" and other development research strategies. Paper presented at 'International Perspectives on Instructional Technology Research for the 21st Century', a Symposium sponsored by SIG/Instructional Technology at the Annual Meeting of AERA, New Orleans, LA, USA. [verified 26 Dec 2007] http://it.coe.uga.edu/~treeves/AERA2000Reeves.pdf

Rourke, L., Anderson, T. & Garrison, R. (2001). Assessing social presence in asynchronous text-based computer conferencing. Journal of Distance Education, 14(2), 1-16.

Salmon, G. (2000). E-moderating: The key to teaching and learning online. London: Kogan Page.

Strauss, A. L. (1987). Qualitative analysis for social scientists. Cambridge: Cambridge University Press.

Thorpe, M. & Godwin, S. (2006). Interaction and e-learning: The student experience. Studies in Continuing Education, 28(3), 203-221.

Wu, D. & Hiltz, S. R. (2004). Predicting learning from asynchronous online discussions. Journal of Asynchronous Learning Networks, 8(2), 139-152. http://www.sloan-c.org/publications/jaln/v8n2/v8n2_wu.asp

Yin, R. K. (2003). Case study research: design and methods (3rd ed.). Thousand Oaks California: Sage Publications.

Author: Professor Mary S. Thorpe, Institute of Educational Technology,
The Open University, Walton Hall, Milton Keynes, MK7 6AA, United Kingdom.
Email: M.S.Thorpe@open.ac.uk

Please cite as: Thorpe, M. (2008). Effective online interaction: Mapping course design to bridge from research to practice. Australasian Journal of Educational Technology, 24(1), 57-72. http://www.ascilite.org.au/ajet/ajet24/thorpe.html

