Australasian Journal of Educational Technology
2006, 22(4), 548-567.
The introduction of an online learning management system (LMS) raises a number of complex issues involving institutional responses at various levels to the adoption and diffusion of technological change. Issues include those related to governance, management and technical support, as well as to core learning and teaching matters associated with the professional development and teaching of academic staff, and the support of staff and students. This paper draws on two cycles of an evaluation conducted in one institution as WebCT Vista was introduced and piloted, highlighting the key issues that emerged from the evaluation. These issues are considered in the context of a selected model for examining the adoption and diffusion of information and communication technologies (ICTs) in higher education, with a view to analysing the outcomes of the initiative, and guiding future planning.
WebCT Campus Edition (CE) was introduced as a University wide central service in 2002. By late 2003 the new WebCT enterprise product (WebCT Vista) was seen as having the potential to deliver substantial benefits to the University and address deficiencies encountered with the CE version. These deficiencies included:
The pilot project was designed to support the University's commitment to student centredness and flexibility in its learning and teaching programs, taking direction from its Global Development Framework which specified institutional use of opportunities presented through changing technology. The project was sponsored by the Deputy Vice Chancellor (Academic and Planning), with steering committee members from the Office of the sponsor, ITS, CeLTS and the faculties. A reference group consisting of representatives appointed by Deans was responsible for faculty related decision making, and a project team represented key ITS and CeLTS service responsibilities.
Most models initially examined were relevant to the adoption and implementation of an LMS at an institutional level, but did not have the breadth of coverage to constitute an appropriate framework on their own. This was because they often focused on one or more aspects of the process rather than the whole (Burkman, 1987; Ely, 1990; Hall & Hord, 1987; Havelock & Zlotolow, 1995; Palaskas, 2002; Rogers, 2003; Sherry, Billig, Tavalin & Gibson, 2000; Stockdill & Morehouse, 1992; Tessmer, 1990; Zaltman & Duncan, 1977). These models can be categorised under three broad headings: those focusing on the characteristics of the adopters and users of technology; those that address concerns about the environment; and those that consider the change process itself and conditions that support or constrain it.
In the first group, the seminal work on the diffusion of innovation by Rogers (2003) deals with various attributes of diffusion including the rate of adoption, adopter categories, innovation attributes, and the diffusion process itself. Burkman's (1987) model focuses on the maintenance of positive perceptions towards the innovation by prospective adopters. Typical of the second category is the model by Tessmer (1990) which supports an environmental analysis, including physical considerations and patterns of use. Several models in the third category examine the change process in broad terms and facilitate the identification of conditions that support or impede its progress. For example, Stockdill and Morehouse (1992) identify critical factors that facilitate adoption; Havelock and Zlotolow (1995) focus on the various stages of planned change; and Zaltman and Duncan (1977) identify eighteen potential barriers to change. Palaskas (2002) supports ICT based innovation through a framework for the development of technology mediated teaching strategies.
The model which appeared to be most useful for post-adoption analysis of an institutional innovation, with the potential for pre-adoption guidance of future practice, was the RIPPLES (Resources, Infrastructure, People, Policies, Learning, Evaluation and Support) model developed by Surry, Ensminger and Haab (2005). This model comprehensively covers a range of factors for consideration including:
Evaluation strategies in both semesters included analysis of project documentation, along with a range of approaches to gain information directly from respondents. In Semester 1 the evaluation focused on responses from members of staff (though some individual staff members implemented student surveys, and one a tutor survey, and provided their results for inclusion in the evaluation). In Semester 2 both staff and student responses were sought (Table 1). The 'online communication space' was a set of discussion topics on a WebCT (CE) site in Semester 1 (as WebCT Vista stability could not be guaranteed at that stage), and on WebCT Vista in Semester 2. The student survey in Semester 2 was administered by the University's Centre for Higher Education Quality.
Table 1: Evaluation strategies

| Semester 1 | n | Semester 2 | n |
| --- | --- | --- | --- |
| Analysis of project documentation | | Analysis of project documentation | |
| Online communication space (staff) | 215 | Online communication space (staff) | 149 |
| Staff interviews | 24 | Staff focus groups (4) | 19 |
| End of semester staff questionnaire | 5 | Staff questionnaire | 29 |
| Student surveys administered by individual staff members (Pharmacy; Medicine, Nursing & Health Sci; Information Technology) | | Online student survey (administered by the Centre for Higher Education Quality) | |
| Tutor survey administered by individual staff member | 3 | Student focus group | 2 |
Table 2: Evaluation framework issues mapped to elements of the RIPPLES model

| Evaluation framework | RIPPLES model |
| --- | --- |
| Training and professional development issues | People; Learning; Support |
| Pedagogical issues | People; Learning [and teaching] |
| Staff and student support issues | People; Support |
| Administrative issues | Infrastructure; People; Support |
| Technical issues | Infrastructure; People; Support |
| Communication issues | People; Policies; Evaluation; Support |
| Overall response | People; Evaluation; Support |
Four components of support which are important to the successful introduction of learning technologies are identified in the RIPPLES model: training, technical support, pedagogical support, and administrative leadership (Surry, Ensminger & Haab, 2005). Thus, in this context, the above findings highlight the importance of training support, indicating a preference for informal training over formal training, as familiarity with the technology develops. There was also recognition of the need for pedagogical support in the way that the training was framed, which links to the learning component of the model, reflecting some recognition that the use of technology should be primarily driven by the fulfilment of learning needs. The concerns of teaching staff about training and other issues also underline the importance of individual people gaining the skills to use the technology appropriately, suggesting the relevance of stages of concern and levels of use (Hall & Hord, 1987) in adoption patterns.
An effort was made in Semester 2 to broaden the scope of the staff evaluation to cover ways that WebCT Vista was used and to note pedagogically innovative and effective uses which had potential for sharing with other teaching staff. However, staff focus group responses indicated that its main use was to provide lecture notes or off campus materials online, although use also included social interaction and provision of assistance to students. Responses to the Semester 2 online student survey were consistent with this, indicating that the main reasons students used WebCT Vista were to access unit outlines and other unit information, and the calendar and discussion function. In some cases staff reported that they were responding to the demand from students to provide information online, noting the importance of a flexible learning environment for students who are employed, and the potential of providing extension materials for more capable students. Table 3 indicates the most frequent uses and intended uses of the LMS by Semester 2 staff questionnaire respondents.
Responses on pedagogical issues relate most obviously to the learning aspect of the RIPPLES model, though they also reflect concerns and levels of use of people, as indicated above, and the potential for the social engagement of people. Despite recognition of the importance of pedagogy in the use of the technology in responses on training issues, it did not appear to be the centre of attention for teaching staff. Use of the LMS for learning, for the most part, seemed to involve fairly unsophisticated use of the tools available, and in some cases it was used primarily to provide access to information, rather than to engage students directly in an online learning environment. These uses can be compared with the findings of an earlier study of the use of WebCT CE at the University (Weaver, Nair & Spratt, 2005) which suggested that staff focused on the technical, administrative and workload aspects of using the LMS and that many students reported poorly designed sites, little or no feedback from staff, outdated information on sites and broken links. They also reflect the limitations in research and understandings about the pedagogical issues related to LMS use which are evident in the broader higher education community (Coates, 2005; Coates, James & Baldwin, 2005). Nevertheless, there was evidence in the pilot evaluation of some inclusion of online pedagogical components additional to LMS tools, and of intentions to explore the use of LMS tools not currently being implemented.
Just under half of the Semester 2 student survey respondents reported that they were aware that the WebCT Helpdesk was the first point of contact if they had problems, and about four in ten indicated that they knew how to contact the Helpdesk if they needed to. General satisfaction with the services was reported, though nearly two thirds indicated that they had not used or tested them. Of those who had contacted the Helpdesk, the two main purposes were to seek assistance with login problems and to access features or materials on WebCT Vista sites.
As the title of this aspect of the evaluation suggests, responses on this issue relate centrally to the importance of support systems identified by the RIPPLES model, and to their appropriateness for the people using them. In this case, responses referred specifically to technical support and indicated qualified satisfaction with it, despite some clear directions for rethinking support arrangements at faculty level (for teaching and administrative staff). Many students were unable to comment because they had not used the services, which could itself indicate success in other aspects of implementation if this level of support had not been required.
From the perspective of the RIPPLES model, administrative aspects of the evaluation referred particularly to the new roles required of some people (the faculty administrators) as devolution related to the innovation was implemented. Their concerns also raised issues about infrastructure, indicating a need to ensure that this was adequate at faculty level. Responses also raised again the importance of training support associated with the group administration role. Although, at the time the evaluation was implemented, there was evidence of some apprehension related to this new role, no major problems were indicated.
In Semester 1 the main infrastructure issues raised by staff users related to server downtimes (particularly unscheduled downtimes), the speed of the server, and the need to monitor server performance. As a consequence, timing of scheduled downtimes was discussed to find a time which caused minimum disruption to teaching. Disaster recovery planning was recognised as a key issue and addressed. A server administration and archiving policy was also discussed and accepted. Decisions about middleware functionality were made in consultation with staff users to provide for different models of faculty administration, leading to a targeted completion date in mid 2004, which was successfully achieved.
A major concern of users was the balance between server performance and LMS functionality, resulting from the decision to turn off the Mail and Chat functions, and the My Files area, to improve performance. Functionality issues dominated discussion in project documentation, interviews and the project communication space. Major concerns included problems with quiz migration, bugs identified, and student login problems, including those related to the absence of the Java Virtual Machine needed to run the system, and to the need to promote the browser tune up function to ensure correct browser configuration. A particular issue raised during the evaluation, with implications at faculty level relating to login and access, was the need for appropriately supported computer laboratories.
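In essence, the browser tune up was a client-side check of the user's environment against the system's requirements. The sketch below is purely illustrative, assuming a hypothetical checkBrowserEnvironment function rather than reproducing WebCT's actual tune up code, but it indicates the kind of tests such a tool performs before a student attempts to log in:

```typescript
// Illustrative sketch only: a minimal client-side "tune up" check of the kind
// described above. It is not WebCT's actual implementation.
function checkBrowserEnvironment(): string[] {
  const problems: string[] = [];

  // WebCT Vista depended on Java applets, so a missing or disabled
  // Java Virtual Machine was a common cause of login and access failures.
  if (!navigator.javaEnabled()) {
    problems.push("Java is disabled or no Java Virtual Machine is installed.");
  }

  // Login sessions typically depend on cookies being accepted.
  document.cookie = "tuneup_test=1";
  if (document.cookie.indexOf("tuneup_test=1") === -1) {
    problems.push("Cookies are disabled, so login sessions will not persist.");
  }

  return problems;
}

// Report any configuration problems to the student before login.
const issues = checkBrowserEnvironment();
if (issues.length > 0) {
  alert("Browser configuration problems found:\n" + issues.join("\n"));
}
```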
By Semester 2, despite the escalation in service that had been required to accommodate the increased number of units, focus group respondents indicated that access was generally good, although there had been some login problems. However, questionnaire responses suggested that technical problems caused a substantial number of interruptions to preparation time and class time, and staff postings in the online communication space were predominantly concerned with technical issues (130 of 149 postings). As in Semester 1, the focus of most of these was on specific aspects of LMS functionality, rather than the infrastructure itself, though there were some comments about modifications to the service being made without notifying the user community, and there was some tension evident in postings late in the semester as performance problems arose during the assessment period. There was evidence of continuing access difficulties by some tutors. The availability of technical support on a 24/7 basis was requested.
Technical issues formed the recurring theme in relation to Semester 1 staff information about student responses, particularly relating to student problems in logging into sites or accessing site features. In the Pharmacy survey this did not appear to be a major issue, but there were some vigorous complaints about the occasional server downtime. Access was more of a problem in a survey of two units (with 129 responses) in the Faculty of Medicine, Nursing & Health Sciences, with about four in ten students indicating login problems on or off campus, and some experiencing difficulty in accessing site features. Similar access problems were reported in a survey of 30 Information Technology students. In many cases students were not aware of, or had not undertaken, the browser tune up process.
Nearly three quarters of the respondents to the Semester 2 student survey indicated no login problems, with only small percentages of those using various tools reporting problems in accessing them. Over four in five had not performed a browser tune up. Staff focus group respondents in Semester 2 also indicated that students did not have problems, though access and login problems experienced by off campus students were mentioned. However, only half of the staff questionnaire respondents considered that their students were able to access the LMS when required. A third of them suggested that students had reported that technical problems constantly interrupted their work time.
Responses on technical issues relate to a major focus of the RIPPLES model, that of infrastructure, which was also intended as the major technical focus of the evaluation. While infrastructure issues were raised, the preoccupation of users (people) with the functionality of the system highlighted once again the importance of support, both training and technical, since many of the problems experienced were not as major as heightened anxiety levels led users to perceive them to be. In this context, there was also some evidence that staff perceived students' problems to be greater than the students did themselves. While there were some problems as the infrastructure was established and fine tuned, from a technical service provider perspective these issues were not insurmountable, and identifying them was a major purpose of the pilot.
A number of other communication problems were reported in Semester 1 evaluation interviews, both from support staff and teaching staff. Specific concerns, from a central support perspective, included:
All respondents to the Semester 1 staff survey indicated that their students were satisfied with the WebCT Vista learning environment. The Pharmacy student survey also indicated a general sense of satisfaction from students. About a third of the respondents to the student survey implemented in the Faculty of Medicine, Nursing & Health Sciences found it harder to use than the existing WebCT service, while about half the Information Technology students indicated satisfaction, with a staff member posting the following student comment in the online communication space:
I think Vista is absolutely fantastic! Especially since I am quite busy with work, it allows me to catch up on all of the discussions, lecture material, grades and notices in one location, at any computer. Also the ability to save our work to the site is of great importance. The lack of media such as paperwork and computer disks, has made it easier to concentrate on the contents of the subject, rather than 'administration' tasks. I personally would like to see Vista used for every subject I am doing at Monash, and hopefully the lecturers will be as great as [name deleted] has been in supporting the students through Vista.

Only a third of the Semester 2 student survey respondents indicated high satisfaction with the system, nearly half expressing varying degrees of dissatisfaction and about one in five expressing neither satisfaction nor dissatisfaction. A third of staff questionnaire respondents agreed that their students had a positive attitude to WebCT Vista and the same percentage disagreed.
It is acknowledged that questionnaire responses need to be considered in the context of the limited response rates achieved, though comparison of survey data with information derived from other strategies provided a means of triangulation and identification of common themes. A more limiting factor may have been the scope of the evaluation itself. Surry, Ensminger and Jones (2003), in an earlier paper, refer to four areas of evaluation which apply to the introduction of technology: evaluation in relation to learning goals; evaluation of the technology, including ongoing assessment of technology alternatives; evaluation of the integration plan to determine the factors that have facilitated or impeded the introduction of technology; and a benefit/cost evaluation. The evaluation reported here primarily focused on the third of these, though efforts were made to address the first as the major issue underpinning the innovation. Consequently, the overall responses outlined above summarise the responses of people (the users), reflecting the adequacy of technical support more directly than the infrastructure and learning opportunities offered by the innovation.
Similarly, although the existence of the evaluation process indicated acknowledgement of the need to monitor the introduction of the technology, the evaluation did not cover all of the areas suggested by the RIPPLES model, and was not adequately integrated into the project to inform decision making at an institutional level while the pilot progressed. A separate technical evaluation and decision making process was, in fact, driving much of the project, perhaps underpinning the limited emphasis on learning. While technical evaluation (including evaluation of technical alternatives) has continued since the project, this has not been complemented by an ongoing coordinated evaluation of other aspects of the service.
Another notable absence in the evaluation described here is the lack of reference to resources. Surry, Ensminger and Jones (2003, p.14), in explaining the breadth of their model and the need to plan for adequate funding, comment that:
It is interesting to note ... that the adoption and diffusion literature includes very little discussion about the importance of financial resources to the change process. One reason for this may be that most adoption and diffusion models assume that funding has already been secured and an innovation is available for adoption.

As indicated earlier, this latter situation was the case in relation to the WebCT Vista pilot at Monash University.
Two related questions arise from considering the results of the evaluation in the context of the RIPPLES model: what priorities for action do the results suggest (Table 4), and does use of the model in this way suggest potential for its refinement?
Table 4: Review of evaluation results using the RIPPLES model

| RIPPLES element | Review | Priorities for action |
| --- | --- | --- |
| Resources | Not applicable | Not applicable |
| Infrastructure | Addressed by pilot | Low: appropriate infrastructure in place |
| People | Concerns in a number of areas require addressing for transition to a pedagogically effective production service | High: to improve perceptions and use of the LMS |
| Policies | Not specifically articulated at institutional level and not operationalised through appropriate management structures which foreground use of the LMS to improve learning and teaching | High: policies and strategies to link governance, management and administration needed, involving leadership by educators supported by appropriate technical infrastructure |
| Learning | Importance recognised but not the major focus of pilot participants | High: to improve use of the LMS |
| Evaluation | Scope and potential for impact too limited for an institutional initiative | High: needs to be ongoing and better integrated with strategic goals and implementation plans related to pedagogy. Responsiveness to learning and teaching needs identified by student evaluations would also be desirable. |
| Support | Addressed by pilot and evaluation | Medium/low: ongoing monitoring needed |
As indicated, a review of evaluation results using the RIPPLES model suggests that the highest priorities for action relate to people, policies, learning and evaluation. This differs from the findings of Surry, Ensminger and Jones (2003, p.13) who state that according to their study 'technology infrastructure is the single most important factor in integrating technology into the curriculum.' While it could be argued that all subsequent elements are contingent on adequate infrastructure, the findings of this study suggest that these other elements have a high degree of importance, if improvements in learning and teaching are to be achieved.
In response to the second question, use of the model for the purposes described in this paper has raised some issues that might suggest potential for its refinement. Firstly, although the model addresses the importance of using technology for improving learning, there is scope for increasing focus on the implications of this for teaching, since the quality of online teaching is a major factor in successful online learning. While the model acknowledges the need for pedagogical support, this appears to underplay the professional development that is frequently needed for teaching staff to adapt to teaching online (Epper & Bates, 2001). Secondly, there appears to be room for increased focus on management issues relating to the adoption of technology innovations, particularly to accommodate different institutional structures. Appropriate policies are vital but they require complementary implementation processes. The model assumes administrative leadership but does not suggest ways of achieving this. Thirdly, broadening the resources element of the model to include more than fiscal resources seems appropriate, given the resource commitments required at various levels to introduce an institutional technology innovation successfully.
Burkman, E. (1987). Factors affecting utilization. In R. M. Gagne (Ed), Instructional Technology: Foundations (pp. 429-455). New Jersey: Lawrence Erlbaum Associates.
Coates, H. (2005). Leveraging LMSs to enhance campus-based student engagement. Educause Quarterly, 28(1). http://www.educause.edu/apps/eq/eqm05/eqm05110.asp?bhcp=1
Coates, H., James, R. & Baldwin, G. (2005). A critical examination of the effects of learning management systems on university teaching and learning. Tertiary Education and Management, 11, 19-36.
Cummings, R., Phillips, R., Tilbrook, R. & Lowe, K. (2005). Middle-out approaches to reform of university teaching and learning: Champions striding between the top-down and bottom-up approaches. International Review of Research in Open and Distance Learning, 6(1). http://www.irrodl.org/content/v6.1/cummings.html
Ely, D. (1990). Conditions that facilitate the implementation of educational technology innovations. Journal of Research on Computing in Education, 23(2), 298-305.
Epper, R. M. & Bates, A. W. (Eds) (2001). Teaching faculty how to use technology: Best practices from leading institutions. Westport, USA: Oryx Press.
Hall, G. & Hord, S. (1987). Change in schools: Facilitating the process. Albany, NY: State University of New York Press.
Havelock, R. & Zlotolow, S. (1995). The change agent's guide (2nd ed.). Englewood Cliffs, NJ: Educational Technology Publications.
Monash University (2003). Learning and Teaching Plan 2003-2005. Melbourne: Monash University.
Monash University (2006). Monash statistics. [viewed 5 Apr 2006] http://www.monash.edu.au/about/stats.html
Palaskas, C. (2002). A model for selecting technology mediated teaching strategies. Educational Technology Magazine, 42(6), 49-54.
Reeves, T. C. & Hedberg, J. G. (2003). Interactive learning systems evaluation. Englewood Cliffs, NJ: Educational Technology Publications.
Respondus (2000-2006). Assessment, survey and game applications for elearning. http://www.respondus.com/
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York: The Free Press.
Schön, D. A. (1983). The reflective practitioner: How professionals think in action. Aldershot: Ashgate.
Schön, D. A. (1987). Educating the reflective practitioner. San Francisco: Jossey-Bass.
Sherry, L., Billig, S., Tavalin, F. & Gibson, D. (2000). New insights on technology adoption in communities of learners. In C. Crawford et al. (Eds), Proceedings of Society for Information Technology and Teacher Education International Conference 2000 (pp. 2044-2049). Chesapeake, VA: AACE.
Stockdill, S. H. & Morehouse, D. L. (1992). Critical factors in the successful adoption of technology: A checklist based on TCD findings. Educational Technology, 32(1), 57-58.
Surry, D. W., Ensminger, D. C. & Haab, M. (2005). A model for integrating instructional technology into higher education. British Journal of Educational Technology, 36(2), 327-329.
Surry, D. W., Ensminger, D. C. & Jones, M. (2003). A model for integrating instructional technology into higher education. [viewed 5 Apr 2006] http://www.iphase.org/papers/RIPPLES.rtf
Tessmer, M. (1990). Environmental analysis: A neglected stage of instructional design. Educational Technology Research & Development, 38(1), 55-64.
Weaver, D., Nair, S. & Spratt, C. (2005). Experiences of online teaching using WebCT: An institutional evaluation. Paper presented at the Fourth International Conference on Science, Mathematics and Technology Education, 25-28 August, Victoria, Canada.
Zaltman, G. & Duncan, R. (1977). Strategies for planned change. New York, NY: John Wiley and Sons.
Authors: Dr Robyn Benson, Centre for Medical & Health Sciences Education, Building 52, Clayton Campus, Monash University, Victoria, 3800. Email: Robyn.Benson@med.monash.edu.au
Dr Tom Palaskas, Faculty of Business and Law, Footscray Campus (G4.11), PO Box 14428, Victoria University, Victoria, 8001. Email: Tom.Palaskas@vu.edu.au
Please cite as: Benson, R. and Palaskas, T. (2006). Introducing a new learning management system: An institutional case study. Australasian Journal of Educational Technology, 22(4), 548-567. http://www.ascilite.org.au/ajet/ajet22/benson.html