Australian Journal of Educational Technology
2002, 18(2), 208-226.
This paper reports on two cycles of evaluation conducted on a CD-ROM tutorial, Reflex Control of Blood Pressure, during 2000. The evaluation was undertaken as part of a national ASCILITE and CUTSD project, Learning Centred Evaluation of Computer Facilitated Learning Projects in Higher Education.
Traditionally, the function of this reflex has been taught in lectures, using diagrams of a 'black box' to describe what was happening to nerve impulses in the brain. Not surprisingly, students memorised the diagram, but had little understanding of how negative feedback occurred. A common student misconception (reported by academics) was that negative feedback mechanisms would over-correct blood pressure beyond the set point.
Literature and software searches for developments dealing with this area have to date been unsuccessful. Several simple animations of the baroreceptor reflex operating are available, but these are all of the "watch and learn" type, with little or no interactivity, or are merely question and answer exercises. Mathematical simulations have been attempted for some other biological mechanisms, but in the case of blood pressure control, which has a multitude of different factors all contributing to the outcome, it is easy to lose sight of the underlying concept of this reflex. The aim of this project was to develop a model building exercise, which would involve a very simple animation to demonstrate the outcome of student decisions, with no attempt to make this a "real world" or quantitative simulation. This would allow students to explore the consequences of different scenarios, and receive feedback on their decisions. Highly interactive model building exercises of this type are rare, and we have been unable to find anything similar in this field.
The software developed, Reflex Control of Blood Pressure, provides introductory information on the regions of the brain involved in this reflex pathway, definitions of feedback control, and an introduction to the symbols used to represent elements of a neural pathway. The major section of the program is a model building exercise, where students manipulate nervous system components to construct their own reflex pathway, and can then test it by increasing the blood pressure. Feedback in the form of animations and text panels is provided for every combination of elements. Once the model is completed, the effects of postural changes can be investigated. A case study is included, and a paper based Tasks sheet is supplied in the classroom (the questions from this Tasks sheet are also included in the program).
The software enables students to explore the subject by actively experimenting with outcomes of their decisions in a setting that assumes little background knowledge of cardiovascular physiology, yet capitalises on everyday knowledge of human function (e.g. the ability to stand up without fainting). The use of simulated environments like the one in this resource is consistent with the view of the learner as self directed and constructing knowledge afresh every time (Papert, 1996). Recalling Piaget's quote "To understand is to invent", the CD-ROM resource was designed to create conditions for invention, rather than provide ready made knowledge (quoted from Papert, 1996).
In addition, it was intended that students and teachers would use this resource within scheduled tutorial classes in ways that encouraged active discussion and debate about the decisions students made while building their model. Thus, the design of the program is consistent with Vygotsky's theory of the importance of the social context in the development of scientific concepts (Vygotsky, 1962).
In 1999, the Medical course at the University of Melbourne changed to a form of problem based curriculum. The number of lectures and practical classes was drastically reduced, and replaced with a 'Problem of the Week', where students work in groups of 10-12 with a tutor for 4 hours each week. As a result of these course changes, the subject material of blood pressure control is now covered early in second year Medicine, rather than at the end of first year, so was not taught in 1999. The CD-ROM is a recommended self directed learning resource for two weeks of the new problem based learning (PBL) schedule. It is available on all computers in many computer laboratories and specialised PBL rooms across the Faculty. The effects of the change of course structure on how students use the program are not known, leading to a key question: How do students use a tutorial designed for a traditional curriculum in the newly re-designed problem based curriculum?
In the study undertaken for this paper, students were invited to work through the program in a scheduled session, with tutors present for assistance.
In addition, it was always the intention of the developers to design the program to be used by other courses taught in the Department of Physiology, particularly as part of the extensive computer facilitated learning (CFL) curriculum for second year students undertaking the 3 year Bachelor of Science degree. During the current study, the program was used as part of weekly scheduled CFL sessions by both second year Science and second year Biomedical Science students. Both courses are conducted under the more traditional format of scheduled lectures, practical classes and tutorials (in the form of CFL sessions with tutors present for assistance). Biomedical Science is a new 3-year degree course at The University of Melbourne, and this was the first cohort of second year students in this course.
As several modifications were introduced as a result of these previous cycles, it was considered important to repeat the evaluation monitoring the learning environment, to determine whether this CFL package was functional and enjoyable to use. In particular, previous evaluations had revealed that students experienced a great deal of frustration in getting started on the model building exercise. Several causes for this had been identified, one of which was the large cognitive load associated with learning how to use the tools provided, identifying the graphical representation of components of the nervous system, and at the same time trying to build a difficult model (cognitive load is defined as the level of "mental energy" required to process a given amount of information, after Cooper, 1990).
The current study also aimed to evaluate the learning process, to determine whether this CFL was influencing learning as intended, and to begin evaluation of the learning outcome. Some questions on the educational relevance of the project were also included in student surveys, to begin evaluation of the appropriateness of the innovation. We recognise that the findings of this study are affected by the educational context of the tutorial for the different cohorts of students using the program. The tutorial is now a recommended resource for medical students, rather than the originally intended scheduled class, and this will have an effect on how students view the importance of the program for their study.
Characteristically, action inquiry is conducted in a cyclical, or spiral, manner, and involves one or more series of cycles, which can be summarised by the following stages:
In addition, the CD-ROM was also scheduled for use by second year Science and BioMedical Science students in Semester 2. Although both these courses are very different to the new PBL Medical course, evaluation with both groups of students has allowed us to test the impact of the modifications made since Semester 1.
A total of 170 second year Science students undertake weekly 2-hour scheduled CFL classes, with one tutor to about 30 students in each of 5 sessions. The course is of the traditional lecture / practical / tutorial structure, and we have been teaching this course for many years.
BioMedical Science is a new course, which commenced in 1999, and in 2000 we taught the first cohort of 80 second year students. The course is of the traditional format, with a weekly 2-hour timeslot of either CFL tutorial or laboratory practical classes. Students within the BioMedical Science course are drawn from amongst the highest achieving secondary school students, and we anticipated that these students would perform better than students undertaking the more general Science course.
Cycle Two (identified from Cycle One evaluation)
Again, the major foci of this cycle of evaluation included program usage and interface issues, and learning process issues, as well as beginning evaluation into learning outcomes. The first question was whether the modifications implemented as a result of Cycle One evaluation were effective in reducing student frustration and confusion. We were also interested in determining how student use of some of the features of the program may have changed as a result of these modifications. In addition, we attempted to evaluate whether student use of the program had helped achieve the desired learning objectives.
Further elaboration of these questions is included in Table 1.
Action Cycle One - key findings (from data collection and analysis)

1. Program usage / interface issues:
- Students enjoyed the model building exercise, but most found it very challenging, and frustrating to get started.
- Students were confused about the terms used for naming regions of the brain.
- Students did not use the practice screen for its intended purpose.
- Students were confused about the concepts of excitation / inhibition.
- Students forgot the colour codes and graphical representation of the tools.
- Most students did not notice changes in rate of signalling.
- Students were confused about what to do when they finished building their model.
- The built in case study was not viewed by students.
Action Cycle Two - key findings (from data collection and analysis)

1. Program usage / interface issues:
- Students still found the model building challenging, but achievable, and the level of frustration observed was much reduced.
- Most areas of confusion identified in Semester 1 have been resolved by the modifications introduced.
- Students are still experiencing difficulty with the concept of inhibition (particularly the removal of inhibition), but the questions are now more advanced than previously.
- Student performance on the tests showed significant improvement on Questions 2 and 3, but Question 1 appeared to be too easy (most students answered it correctly both before and after the CFL). See Table 4 for test results.
Cycle One was conducted with second year Medical students in a PBL curriculum, in Semester 1, 2000. This CFL was recommended as a self directed learning resource, and the curriculum provides the opportunity for students to select this resource at a time convenient to them. Efforts were made to encourage students to attend a voluntary CFL session, but only 49 students (out of an enrolment of 180) attended during these times. It is possible that some students completed the tutorial at other times, but feedback from PBL tutors indicated that most students did not complete the resource at all.
Cycle Two was conducted in two stages, firstly with 82 second year BioMedical Science students in August, and then with 170 second year Science students in October. Attendances and rates of questionnaire returns are given in Table 2.
Table 2: Attendance and questionnaire returns (Medicine, Semester 1; BioMedical Science, August; Science, October)

Attendance at CFL session: 49; 75; 125
Number of questionnaires returned: 49 (100% of attendees)
Of the 14 questions on the survey, 8 questions focussed on interface or usage of the program, 2 on the learning process, and 3 could be classified as self reported learning outcome questions. The final question asked for suggestions for improvements.
The same basic questionnaire was used in both action inquiry cycles, with the only alterations being on relevance or appropriateness to the curriculum, allowing for differences in the two different types of courses being investigated. All student responses were transcribed, and collated into similar categories.
A major development aim of the project, involving both the computer tutorial and the accompanying paper Tasks sheet, was to encourage group discussions (both student-student and student-tutor) of the problems presented. Accordingly, observations also focussed on the level and quality of discussion in the classroom.
The author was introduced to students as the Educational Programmer in Physiology and as the developer of this CFL, and performed the evaluation of this project as a participant-observer. It was made clear that she had no involvement in student assessment; she also has a strong background in Physiology, and was effectively acting as a senior tutor during all evaluation sessions.
The shortcomings inherent in pre- and post-tests are acknowledged by the authors, but these tests were never intended to be a standalone evaluation strategy; rather, they provided additional information to be used together with data from observations and student feedback.
The evaluation was undertaken primarily by the educational programmer-developer of this tutorial. Inherent bias was minimised by extensive consultation with an external mentor (co-author of this paper), as part of a broader national network of mentors and mentees. An evaluation plan was submitted to an electronic forum, and evaluation strategies from other similar projects were consulted.
The primary stakeholders for the evaluation were the developer and project group, mentor and the ASCILITE and CUTSD network. In addition, reports were also circulated to teaching academics and tutors, and to curriculum coordinators.
All students attending the session completed a written questionnaire at the end of the session, giving a response rate of 100% for this group of students, and observations of student use of particular screens were noted. Attention in this cycle focussed predominantly on issues related to program use and interface design, particularly in relation to modifications introduced as a result of a previous cycle (Weaver et al., 1999).
We were especially interested in any effects of the changes introduced in an attempt to reduce the reported high cognitive load associated with the model building exercise. Several modifications were made in an attempt to reduce this load, the key one being the introduction of a practice screen to encourage familiarity with the tools and animations, etc, involved. This cycle of evaluation concentrated on student use of these new features, and attempted to determine their effectiveness.
Student usage and interface issues
Analysis of the questionnaire, combined with observations and tutor feedback, revealed that students were still experiencing a great deal of difficulty and frustration with building their model, and amongst other findings, were not using the practice screen in the intended manner. Participation in group discussion was low, as most students worked through the program alone. The key findings are summarised in Table 3.
Students reported the best understood parts of the program were in the roles of different efferent neurones, but the number of responses for each different answer to this question was low. Least understood was the naming of regions of the brain, followed closely by problems understanding the concept of inhibition ("Do inhibitory interneurones increase or decrease signal, or have the same amount of signal but different neurotransmitter?"), both of which accorded well with our own observations. Most liked were the animations and visual representations, and the hands on approach of constructing their own model. Least popular was the perceived lack of explanations - many students asked for more directive hints ("Just tell me what to do") and more text they could copy into their workbooks.
Learning process issues
Our second focus of this cycle of evaluation was to determine whether there were any noticeable differences in how students use the program in the new PBL course. Due to the low level of student attendance, it was difficult to make any conclusions about how student use of this resource may have changed. The questionnaire did not ask questions on reasons for attendance at the CFL session or completion of the tutorial, as we had expected the majority of students to use this resource, but this does signal an issue of concern for self directed learning curricula. This resource does not appear to be seen by students in this curriculum as important, and this has been conveyed to the relevant coordinators, but may also indicate a problem with the construction of the PBL course.
Table 3: Issues identified (from Semester 1) and modifications introduced (for Semester 2)
Issue: Students experienced difficulty with the functional names given to regions of the brain, due to our use of terms which are also applied elsewhere.
Modification: The brain regions were re-named, keeping functional terms, but avoiding terms which are also used to describe the end effects of this system.
Issue: Students were confused about the definition of excitatory vs inhibitory - what exactly makes a neurone inhibitory? We had not encountered this problem before, but this confusion is highlighted by the problem of naming of brain regions.
Modification: Existing definitions of excitation and inhibition were clarified, and we attempted to increase the viewing of these definitions by making the hypertext definitions appear automatically at the end of a popular animation, as well as via the existing link.
Practice screen:
Issue: This screen was introduced to reduce the large cognitive load experienced by students starting on the model building exercise. The current evaluation found that students were not using the practice features built into the CFL - mainly because it was not clear that they could do so.
Modification: The title and instructions of the practice screen were changed, to highlight that this screen is for practising using the tools.
Colour and graphical representation:
Issue: Some students reported that while they were building their model, they forgot the colour coding / graphical representation used to depict different elements of the nervous system. (It was hoped that use of the practice screen would familiarise them with this, but these students may not have used that screen.)
Modification: A box showing examples of the nervous system elements was included on the practice screen, and the same box was also included on the main model building screen, accessible by clicking on an 'information' button.
Rate of signalling:
Issue: Most students did not notice the changes in rate/number of signals (action potentials) moving around their model, adding to the confusion about inhibition, since they did not notice when the signal increased or decreased. They were also unaware that they tested their circuit by increasing the original signal (increasing blood pressure at the receptor site) - the button they clicked to do this was named "Test Circuit", which gave no indication of what the test involved.
Modifications: The first time the model is tested and an animation is run, a dialogue box appears to prompt students to consider the rate/magnitude of the signal moving around the circuit. The "Test Circuit" button was renamed "Raise BP", and clearer explanations were provided about how the circuit is tested.
Issue: Students were confused about what to do next when they reached the end of the model building exercise.
Modification: A new screen ("Summary") was included after the model building exercise, to summarise the major points of the tutorial.
Issue: A case study (3 screens of a large cartoon image and a small amount of text) had been included since the last version, and was intended to give some applied context to the main learning issue. The current evaluation revealed that very few students even saw this case study.
Modification: The case study was moved to be the first item in the Contents menu, and screens were re-positioned, so that students who do not use the Contents menu to navigate (i.e. those who just click on the right arrow to proceed through the program linearly) will still view the case study.
26 out of 49 respondents thought the CFL was relevant to their Problem of the Week, and a further 5 students believed it was important material for their course, although not directly relevant to the Problem of the Week. Medical students rated their own understanding of this topic prior to the CFL as 2.8 (1 = understood topic not at all, 5 = understood very well), and this improved to 3.8 after the class.
After analysis of the Semester 1 data, the project group met and decided on a range of modifications to the program. Most of the issues raised by students were considered to be related to various aspects of the program or interface design, and could be addressed by relatively simple re-design of the program. These responses are also summarised in Table 3.
The same evaluation methods were used by both groups, and included a similar questionnaire as previously, observations and tutor feedback, as well as pre- and post-CFL tests, in an attempt to evaluate learning outcomes. All students were encouraged to complete the questionnaire, and no students refused to do this (one Science student left the classroom prior to completing the tutorial) (see Table 2). Again, observations focussed on the effects of modifications introduced since the previous cycle, particularly on whether student use of introductory material and the practice screen was improving their ability to get started on the model building exercise, but also attempted to gather evidence on learning process and learning outcome issues.
Student usage and interface issues
Results from the second cycle of evaluation were very positive. Problems identified earlier with different aspects of the interface or use of the program appear to be largely rectified. Nearly all students used the practice screen in its intended manner, and fully completed the short model building exercise contained there. This appeared to have the desired effect of reducing the cognitive load previously found with the major task of the program, and was reflected in generally much more positive statements about the program from the students. The most popular aspect of the program was again the animated model building exercise, and the high level of interactivity ("Building the circuit myself, rather than just reading and answering questions, required more thought and really improved my learning"). Responses to the question on least popular aspects were far fewer than previously, with the most common response being "Nothing", followed by "Not enough explanation".
Learning process issues
Overwhelmingly, both groups of students believed the CFL was relevant and appropriate for their course ("helped reinforce what was covered in lectures and extended some of the topics covered"; "I wouldn't have understood the lecture material without the tutorial"; "Pitched at a level that can be understood in second year"). Most students also thought it helped to integrate material from different lectures or different subjects ("Autonomic nervous system and cardiovascular physiology were done separately in lectures so this was good to integrate them"; "Covers a number of topics in both our core subject and also aids in consolidation of material from Pharmacology").
Classroom discussion had moved from the technical difficulties of using the program or interpreting the feedback statements, to more physiological investigation of the reflex circuit, and tutors all reported that they were challenged more than they had ever been to resolve wider ranging areas of discussion, as students attempted to integrate the topics covered in this CFL with knowledge from other sources. Other areas of difficulty previously identified (eg. confusion over naming of brain regions) were not reported and so were no longer apparent.
Learning outcome issues
Students reported the best understood areas of the program related to the overall physiological responses and control of blood pressure ("I now understand the effects of changes in blood pressure on sympathetic and parasympathetic activity"), which is a major aim of the innovation. Least understood was the role of the inhibitory interneurone, particularly the concept of decreased inhibition producing an increased response. Observations by both the author and tutors revealed that no students claimed negative feedback would over-correct blood pressure beyond its set point, a common student misconception reported by academic staff teaching in previous years. BioMedical Science student rating of their own understanding of this topic increased from a pre-CFL rating of 2.2 (1 = understood topic not at all, 5 = understood very well) to a post-CFL rating of 3.7. Science students' perception of their understanding increased from 2.4 (pre-CFL) to 3.5 (post-CFL).
A difficulty in analysis arose because there were 2 correct answers to question 2. This was deliberate, as one of the key learning objectives of this program is to identify and understand the two different methods of negative feedback, and we were interested to see whether students understood that there were two mechanisms involved. However, where students selected only one correct option, it is impossible to tell whether they believed there was only one mechanism, or whether they did not read the (bold) instruction at the top of the test that there may be more than one correct answer. The following results are based on students receiving 0.5 marks for one correct answer, and only receiving the full mark for both correct options.
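The partial-credit scheme just described can be sketched as follows (a minimal illustration only; the function name, and the decision to score any incorrect selection as zero, are our assumptions rather than details from the study):

```python
def score_question(selected, correct):
    """Score a multiple-choice question that may have more than one correct option.

    Full mark (1.0) only when all correct options are chosen, 0.5 when
    exactly one of several correct options is chosen, 0 otherwise.
    Treating any incorrect selection as 0 is an assumption; the study
    does not state how wrong selections were handled.
    """
    selected, correct = set(selected), set(correct)
    if selected - correct:  # any incorrect option chosen
        return 0.0
    hits = len(selected & correct)
    if hits == len(correct) and hits > 0:
        return 1.0
    return 0.5 if hits == 1 else 0.0
```

Under this rule, a student who marks only one of the two correct options for Question 2 scores 0.5, whether or not they believed a second mechanism existed.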
Question 1 was not difficult to work out, or at least take an educated guess at, and this is reflected in the group results - a high number of students answered this correctly prior to the CFL, so it was difficult to see a significant improvement in performance. Performance on Question 2 improved similarly in both courses, with the BioMedical science students generally performing better both prior to and after the CFL, but showing about the same level of improvement.
Table 4: Test results (mean score ± SD)

|BioMed - Aug 2000|Pre-CFL|Post-CFL|Significance|
|Question 1|0.85 ± 0.36 (n = 75)|0.89 ± 0.31 (n = 75)|N.S. (p = .496, n = 75)|
|Question 2|0.58 ± 0.26 (n = 75)|0.77 ± 0.25 (n = 74)|p < 0.001 (n = 74)|
|Question 3|0.36 ± 0.48 (n = 75)|0.49 ± 0.50 (n = 74)|N.S. (p = .185, n = 74)|
|Whole test|1.79 ± 0.66 (n = 75)|2.13 ± 0.59 (n = 75)|p < 0.005 (n = 75)|

|Sci - Oct 2000|Pre-CFL|Post-CFL|Significance|
|Question 1|0.84 ± 0.37 (n = 124)|0.90 ± 0.31 (n = 115)|N.S. (p = .18, n = 115)|
|Question 2|0.48 ± 0.27 (n = 123)|0.67 ± 0.34 (n = 117)|p < 0.001 (n = 116)|
|Question 3|0.24 ± 0.43 (n = 123)|0.50 ± 0.50 (n = 115)|p < 0.001 (n = 114)|
|Whole test|1.56 ± 0.58 (n = 124)|2.03 ± 0.59 (n = 117)|p < 0.001 (n = 117)|
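The significance values in Table 4 compare pre- and post-CFL scores for the same cohorts. As a minimal sketch of such a comparison (assuming a paired t test, which the study does not explicitly name; `paired_t` is an illustrative helper, not code from the original analysis):

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic for before/after scores on matched students.

    Returns (t, df); a p-value would then come from the t distribution
    with df degrees of freedom.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    se = stdev(diffs) / math.sqrt(n)  # standard error of the mean difference
    return mean(diffs) / se, n - 1
```

Only students who sat both tests contribute matched pairs, which is consistent with the slightly different n values shown in the significance column.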
Performance on Question 3 was lower than expected, given that it is very similar to a question on the Tasks sheet which students had been working on. It was observed that a few groups of students did not know the definition of haemorrhage - some said it was a bruise, others used the word clot. The question on the Tasks sheet refers to blood loss, so the different term used may have led students to believe this was a completely different question. But since most students completed the test without discussion, as instructed, it is impossible to determine exactly where their difficulty lay.
From observations only, we tend to believe that the CFL session itself was responsible for the improved performance on questions 2 and 3, since we did not observe students referring back to the questions during the class, or discussing these topics more than in similar sessions without tests, even though they had been informed they would be undertaking the same test at the end of the session. Even when specific reference was made in discussions with tutors to points which had been included as options on the multiple choice questions, no students commented on this or seemed to recognise these from the test. However, further evidence is needed before any conclusions can be reached on the effect of the tutorial in meeting the learning objectives.
The original intention was to look at the CFL tutorial and its immediate context, and not at the whole curriculum, however, this evaluation project did identify some issues about the broader context in which these types of application can be used.
The process of action inquiry allowed us to evaluate different versions of the CD-ROM, not just for interface or useability issues, but on the whole learning experience it offered to students, and to systematically identify areas of concern, address these issues in an appropriate manner, and to focus further evaluation rounds on the impact of these changes.
We are seldom able to anticipate all the different ways in which students relate to CFL; consequently, an evaluation framework must be integral to the use of CFL in teaching today. Such a strategy needs to be ongoing throughout any project, adopting a cyclic method geared towards continuous improvement in a systematic way.
Clearly, there was considerable value in the iterative evaluation of this CFL resource as we were able to better understand how students related to this resource and made use of it in their learning. We were unable to fully anticipate the way students understand the software interface or the context in which it is used. Iterative approaches to evaluation, like the one we have employed in this project, can greatly improve the quality of the resource and its use.
Kember, D. & Kelly, M. (1993). Improving teaching through action research. HERDSA Green Guide No. 14.
Papert, S. (1996). The Connected Family: Bridging the digital generation gap. Longstreet Press, Atlanta, Georgia, USA.
Phillips, R., Bain, J., McNaught, C., Rice, M. & Tripp, D. (2000). Handbook for Learning-Centred Evaluation of Computer-Facilitated Learning Projects in Higher Education. ASCILITE / CUTSD publication. [verified 24 Jul 2002] http://cleo.murdoch.edu.au/projects/cutsd99/handbook/handbook.htm
Vygotsky, L. (1962). Thought and Language. MIT Press, Cambridge, Mass., USA.
Weaver, D., Kemm, R., Petrovic, T., Harris, P. & Delbridge, L. (1999). Learning about control systems by model building - A biological case study. In Responding to Diversity: Proceedings of ASCILITE 99, Brisbane, Qld, pp. 381-389. http://www.ascilite.org.au/conferences/brisbane99/papers/weaverkemm.pdf
Weaver, D.A., Delbridge, L.M.D., Harris, P.J., Petrovic, T. & Kemm, R.E. (2000). Blood Pressure: Reflex Control. AD Instruments, Sydney. ISBN 0 7340 2085 6.
Authors: Debbi Weaver, Department of Physiology, The University of Melbourne
Tony Gilding, Teaching and Learning Development Support, James Cook University
Please cite as: Weaver, D. and Gilding, A. (2002). Iterative evaluation of Reflex Control of Blood Pressure. Australian Journal of Educational Technology, 18(2), 208-226. http://www.ascilite.org.au/ajet/ajet18/weaver.html