Australasian Journal of Educational Technology
2011, 27(2), 274-289.
Evaluating computer-based simulations, multimedia and animations that help integrate blended learning with lectures in first year statistics
David L. Neumann, Michelle M. Neumann and Michelle Hood
The discipline of statistics seems well suited to the integration of technology in a lecture as a means to enhance student learning and engagement. Technology can be used to simulate statistical concepts, create interactive learning exercises, and illustrate real world applications of statistics. The present study aimed to better understand the use of such applications during lectures from the student's perspective. The technology used included multimedia, computer-based simulations, animations, and statistical software. Interviews were conducted with a stratified random sample of 38 students in a first year statistics course. The results showed three global effects on student learning and engagement: the technology showed the practical application of statistics, helped with understanding statistics, and addressed negative attitudes towards statistics. The results are examined within a blended learning framework and the benefits and drawbacks of integrating technology during lectures are discussed.
From the perspective of the learner, the blended learning experience is determined by the organisational level at which the approach is applied. For most learning institutions, these are activity level, course level, program level, and institutional level (Graham, 2005). At the activity level, lectures remain the primary means by which teaching is conducted for on campus courses, particularly those with large enrolments. This trend is likely to continue in the foreseeable future due to tradition and the organisational structure of universities. Moreover, Hanson and Clem (2005) argue that many students have a preference for the face to face component of a blended learning experience. As noted by Bligh (2000), the standard lecture format may not be the most effective way to promote thinking and develop attitudes, but changes to lecturing techniques may help to overcome such limitations. Selecting appropriate lecturing techniques is also one way to help lecturers become more effective (Bligh, 2000). As such, there remains considerable scope to explore the use of technology in enhancing the delivery of, and ultimately, the learning outcomes from a lecture.
The application of technology during a lecture has been an area of interest for some time (e.g., Gilroy, 1998; Roberts & Dunn, 1996). The main focus of earlier research was on using computerised methods to present lecture notes (e.g., PowerPoint slides), incorporating multimedia (e.g., videos), using visualisation devices, and accessing the Internet during lectures. More recently, researchers have examined other issues such as downloadable lecture notes and podcasting lecture content and their effects on attendance and learning outcomes (e.g., McGarr, 2009; Traphagan, Kucsera & Kishi, 2010; Wieling & Hofman, 2010). Other technologies that have been developed include computer-based demonstrations that are interactive by allowing the user to change values and see the effects that they have in real time. Less attention has been placed on assessing the effects of using these computer-based simulations during lectures. The present study aimed to investigate this issue, in addition to the use of other technologies, in statistics lectures.
Statistics is a discipline well suited to integrating technology during lectures. Computer-based activities have been developed to allow students to conduct virtual studies and data analysis (Malloy & Jensen, 2001) and simulate statistical concepts (e.g., Dunn, 2004; Lane, 1999; Meletiou-Mavrotheris, 2003; Morris, Joiner & Scanlon, 2002). Such activities are argued to improve student learning outcomes because they can correct misconceptions (Morris, 2001), make use of representations that are dynamically linked (Meletiou-Mavrotheris, 2003; Morris & Scanlon, 2000), and combine declarative knowledge with experiences in working with data (Neumann, 2010). The effects of computer-based activities on quantitative learning outcomes have been assessed for systems such as Stat Lady (Shute & Gawlick-Grendell, 1994; Shute, Gawlick-Grendell, Young & Burnham, 1996), Link (Morris, 2001), a dynamic visualisation of ANOVA, regression and the general linear model (Rey, 2010), and web-based tutorials on statistics (Bliwise, 2005).
In contrast to quantitative evaluations, some researchers have highlighted the importance of collecting qualitative data concerning the learning process and student experiences when technology is used to teach statistics (Jones et al., 1996; Morris, Joiner & Scanlon, 2002; Morris & Scanlon, 2000). This perspective is echoed by researchers in other disciplines. Boud and Prosser (2002), for example, argue that learning arises from student experiences and not from what the technology does. Gunawardena, Lowe and Anderson (1997) also note that qualitative approaches can examine the efficacy of the learning approach and can give insight into the process of learning and knowledge construction from the student's viewpoint. Qualitative approaches can give insight into how technologies in a blended learning approach promote a student centred approach to teaching (Graham, 2005). While survey-based instruments could also provide an alternative means to collect such information, this approach can be limited by the constructs measured in the survey and it cannot give much information about the motivation behind the responses (Gal & Ginsburg, 1994).
Previous research that has used a qualitative approach to examine the integration of technology with face to face teaching of statistics generally has used small classes. Lee (1999) compared two classes of 30 students in which one was taught using a traditional note taking approach and a calculator and the other was taught using hands on activities and statistical software on a computer. During a subsequent interview, the technology rich approach was reported to reduce boredom, be better at relating statistics to research and problem solving, and make statistics more concrete (Lee, 1999). However, it is not clear whether the differences between the classes were due to the technology used, the different teaching approach (note taking vs. hands on activities), or both. Meletiou-Mavrotheris (2003) employed a similar hands on activity approach to Lee (1999), but combined it with a computer software package designed to enhance learning during data analysis. Observations of five students and an examination of the dialogue between them and their instructor indicated that this approach facilitated the learning of statistical concepts. One limitation, as noted by the author, was that the study took place outside of the natural classroom setting. However, the results are consistent with those of Morris et al. (2002) in that students can use computer-based activities in a productive way to learn statistics.
In contrast to the previous research that has examined technology in small classes (Lee, 1999; Meletiou-Mavrotheris, 2003; Morris et al., 2002), Neumann, Hood and Neumann (2008) assessed the use of interactive technology during large lecture classes. Neumann et al. (2008) reported using video demonstrations and computer-based simulations of statistical concepts during the lectures. A qualitative analysis of student feedback suggested that the technology helped learning and provided interaction during the lecture. Although the feedback indicated some benefits, the study had several important methodological limitations. The methods were not clear about how frequently the technology was used, what topics the technology was used for, and whether other blended learning approaches were used during the lectures (e.g., online supply of lecture notes). In addition, the data were based on written feedback obtained from a generic course evaluation questionnaire. The response rates across the three years of analysis were moderately low (17.8% to 45.5%) and it is possible that only certain types of students provided feedback (e.g., only high achieving students).
The present study aimed to examine the integration of technology during statistics lectures. To ensure an authentic investigation, the technology was integrated during actual lecture classes (cf. Meletiou-Mavrotheris, 2003). Moreover, a stratified random sample of students was adopted to ensure that the results were representative of the student population (cf. Meletiou-Mavrotheris, 2003; Neumann et al., 2008). A semi-structured interview approach was used to elicit feedback from students at the end of the course. It was considered that an interview would result in richer and more diverse feedback regarding the use of technology in the lectures than a questionnaire.
Therefore, while it was hypothesised that the same themes that emerged in the study by Neumann et al. (2008) would also be found, it was also expected that additional themes relevant to student learning would be identified. Several processes have been suggested to be important in technology based learning, including information processing learning processes (González & Birch, 2000), visual learning (Davis & Bostrom, 1992; Psotka, Kerst & Westernman, 1993), cognitive style (Carlson, 1991; Riding & Douglas, 1993), communication (Kuehn, 1994), self-efficacy (Ertmer, Evenbeck, Cennamo & Lehman, 1994), and motivation (Lens, 1994). As such, it was hypothesised that some of these themes would also emerge.
Examples of the simulations used are described by Neumann (2010) and screenshots of five simulations used are presented in Figure 1. Similar interactive simulations have been developed by other authors (e.g., Morris et al., 2002; Meletiou-Mavrotheris, 2003; Dunn, 2004). The Rice Virtual Lab in Statistics (Lane, 1999) is an online compilation of such simulations (see http://onlinestatbook.com/rvls.html). The simulations covered the following topics: scatterplots and correlation, probability (flipping a coin over repeated trials), taking random samples and sampling error, the sampling distribution of the mean, confidence interval of the mean when the population standard deviation is not known, hypothesis testing (factors that influenced power/Type II error), confidence interval of the mean when sigma is known, and degrees of freedom.
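The logic behind such simulations is simple enough to sketch. As an illustrative example only (not the authors' actual software), the coin-flipping demonstration of probability over repeated trials reduces to tracking a running proportion of heads, which a student can watch converge towards 0.5:

```python
import random

def coin_flip_simulation(n_flips, seed=None):
    """Simulate repeated fair coin flips, returning the running
    proportion of heads after each flip."""
    rng = random.Random(seed)
    heads = 0
    proportions = []
    for i in range(1, n_flips + 1):
        heads += rng.random() < 0.5  # True counts as 1 (a head)
        proportions.append(heads / i)
    return proportions

props = coin_flip_simulation(10000, seed=42)
print(f"After 10 flips:    {props[9]:.3f}")
print(f"After 10000 flips: {props[-1]:.3f}")
```

Plotting the returned proportions against trial number gives the familiar picture of early volatility settling towards the long-run probability, which is the point such a demonstration makes visually during a lecture.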
The simulations were integrated into the lecture by using them as a tool to illustrate statistical concepts. Take the example of the scatterplot and correlation simulation (see Figure 1, middle left panel). After describing the main features of the simulation, the lecturer could select various real data sets and produce a scatterplot of the data to discuss the graphical representation of association. The correlation for the data was also displayed and this could be related to the scatterplot. Different data sets could be selected to illustrate positive and negative association. A regression line could be drawn to illustrate the principles of least squares regression and prediction. A sliding scale could also be used to change the data dynamically, altering the strength of the association. The sliding scale affected both the calculated correlation and the location of the data points in the scatterplot such that strong, moderate, and weak associations could be discussed. Students were also provided with screenshots of the simulations in their lecture notes and they also had access to the simulations online via the course website. In this way, the students could write notes to accompany the demonstration in class and also gain hands on experience in using the simulation outside of class time.
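The behaviour of the sliding scale can be approximated in a few lines. The sketch below is a hypothetical reimplementation, not the simulation used in the course: it generates (x, y) pairs whose association weakens as a noise parameter grows, and computes the Pearson correlation that the simulation would display alongside the scatterplot:

```python
import random

def pearson_r(xs, ys):
    """Pearson correlation coefficient computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def noisy_data(n, noise, seed=None):
    """Generate (x, y) pairs whose linear association weakens as
    `noise` grows -- the role played by the simulation's sliding scale."""
    rng = random.Random(seed)
    xs = [rng.uniform(0, 10) for _ in range(n)]
    ys = [x + rng.gauss(0, noise) for x in xs]
    return xs, ys

for noise in (0.5, 3.0, 10.0):
    xs, ys = noisy_data(200, noise, seed=1)
    print(f"noise={noise:5.1f}  r={pearson_r(xs, ys):+.2f}")
```

Dragging the slider corresponds to re-running `noisy_data` with a different `noise` value, so students see the scatterplot loosen and the displayed r shrink together, which is exactly the linkage between representations the lecture demonstration exploited.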
Figure 1: Examples of the computer-based simulations and animations that were integrated within the lecture. The computer-based simulations shown illustrated the concepts of probability (top left panel), graphical depictions of qualitative data (top right panel), correlation and regression (middle left panel), confidence interval of the mean when the population standard deviation is not known (middle right panel), and statistical power and errors in hypothesis testing (bottom right panel). The animation (bottom left panel) illustrated concepts related to the confidence interval of the mean.
Overall, 50 students were randomly selected. Five students could not be contacted due to incorrect or out of date contact details. Another five students declined to participate (two of these students received a pass grade and one each received a credit, distinction, and high distinction). One student with a fail grade who agreed to participate on initial contact could not subsequently be contacted and one further student with a pass grade declined to participate when subsequently contacted for the interview. The final sample consisted of 38 students (27 females and 9 males, which reflects the 80% female enrolment in psychology programs) with a mean age of 23.97 years (SD = 7.33). Thirty students had no prior post-secondary school education, four students had completed a diploma, three students had completed a certificate, and one student had completed a bachelor's degree. The students reported attending 91% (SD = 13.14) of the classes in the course.
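For readers unfamiliar with the sampling design, grade-stratified random selection of this kind can be sketched as follows. The roster, grade bands, and per-stratum counts below are hypothetical, chosen only to illustrate the procedure, not the authors' actual records:

```python
import random

def stratified_sample(population, strata_key, n_per_stratum, seed=None):
    """Draw a simple random sample within each stratum so that every
    stratum (here, grade band) is represented in the final sample."""
    rng = random.Random(seed)
    strata = {}
    for member in population:
        strata.setdefault(strata_key(member), []).append(member)
    sample = []
    for label, members in sorted(strata.items()):
        k = min(n_per_stratum, len(members))
        sample.extend(rng.sample(members, k))
    return sample

# Hypothetical roster of (student_id, grade) records
roster = [(i, grade) for i, grade in enumerate(
    ["fail"] * 40 + ["pass"] * 120 + ["credit"] * 90 +
    ["distinction"] * 60 + ["high distinction"] * 30)]

picked = stratified_sample(roster, strata_key=lambda s: s[1],
                           n_per_stratum=10, seed=7)
print(len(picked))  # 10 students drawn from each of the five grade bands
```

Stratifying by grade guards against the self-selection problem noted for Neumann et al. (2008), where feedback may have come disproportionately from high achieving students.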
All interviews were conducted over the telephone and lasted approximately 20 minutes. Permission was obtained to digitally record the interviews to allow for later transcription. A portion of the interview was used to collect student feedback regarding other initiatives in the course (e.g., use of humour); this feedback is reported elsewhere (Neumann, Hood & Neumann, 2009; Neumann, Neumann & Hood, 2010). The interview used a semi-structured format. Initially, the interviewer and student discussed the various ways in which technology had been used during the lectures. The student was next asked a series of closed and open ended questions with follow up questions used to gain further details as required. For example, the students were asked "What general comments do you have about the in class computer-based and video demonstrations of statistical concepts?" Follow up questions were also used and consisted of "Did it help you engage with the material and how? Did it help motivate you to learn about statistics and in what way? What were some positive aspects to it? What were some negative aspects to it? and Would you recommend that it be used in the future and why?" At the end of the interview, demographic information about the student was obtained.
Four additional themes were identified that were each endorsed by 21 to 29% of students. Some students commented that the use of technology Created interest (29%). A somewhat related theme was Engagement (29%). The theme of created interest was related more to the fact that the technology made the lecture and statistics in general less boring. Students commented that it was a less boring way of learning, that it was interesting to see how statistics worked, and some identified the videos as interesting. For engagement, some students stated that the videos and computer-based demonstrations were a good tool to help people engage. Another student noted that the use of technology was more engaging than listening to words for a long period of time. The additional theme of Different approach (29%) reflected that the technology changed the traditional lecture experience. Some students noted that it was different to just sitting in the lecture and listening to someone talk and that this encouraged a different form of learning. The theme of Visual aid to learning (21%) emphasised the visual nature of the technology. Students noted that it was beneficial to "see the graphs being made" and to "show graphically what [the lecturer] was talking about". Some students stated that this visual nature made things easier to remember.
The final four themes were each identified by less than 20% of the sample. The theme of Motivation (18%) suggested that the face to face instruction added to the use of technology in that "you had a real person motivating you, not just a computer". The theme of Enjoyment (16%) indicated that the methods were "fun", "light hearted", "funny", and were "enjoyed". In some cases these comments were made in relation to a specific concept that was being taught. The theme of Broke up content (11%) captured comments that the technology provided a break from the talking and slides in the lecture and from the focus on theory and numbers. Finally, the theme of Lightened the mood (11%) included comments about the stress and tension associated with statistics and also the dry nature of statistics. Students noted that it was a good way of reducing the seriousness of statistics.
Table 1: The labels, definitions, percent of sample that contributed statements, and representative comments for the final themes resulting from the qualitative data coding methods on the use of technology during lectures
| Theme | Definition | % of sample | Representative comments |
|---|---|---|---|
| Practical application | The use of technology increased the connection between statistics and real concepts and showed examples of practical applications | 66% | "It helped reiterate the concepts in a more practical base than just studying theory"; "They were a different way to apply the knowledge so we can relate to it more rather than just equations" |
| Helped understanding | The use of technology helped students to comprehend and learn about research methods and statistics | 60.5% | "It helped conceptualise the point"; "Helped me get my head around it" |
| Created interest | The use of technology increased student interest and attention towards statistical concepts | 29% | "It made you see where it could be more interesting"; "It was a less boring way of learning"; "It was interesting to see how statistics can work" |
| Engagement | The use of technology engaged students in the lecture and learning about statistics | 29% | "It helped me to engage with it"; "You actually got engaged with your lecturer out the front saying what this means so on and so forth" |
| Different approach | The use of technology was a unique way to illustrate statistical concepts to students | 29% | "It was something a bit different to sitting in the lecture listening to someone talk"; "It was a different form of learning" |
| Visual aid to learning | The use of technology helped students to better visualise the statistical concept being learnt | 21% | "If you have a picture in your mind then you can relate to it so I found the class demonstrations very helpful"; "I could visualise it and imagine it happening" |
| Motivation | The use of technology provided a means to motivate students to learn about statistics | 18% | "It was motivational because you had a real person motivating you not just a computer or a book"; "This can motivate, I suppose" |
| Enjoyment | The use of technology was a fun, enjoyable activity | 16% | "By using examples it was more enjoyable than using slides and slides"; "They were fun" |
| Broke up content | The use of technology broke up the content into more manageable amounts of information | 11% | "It gave a break from all the numbers"; "It broke up the class a bit" |
| Lightened the mood | The use of technology lightened the mood of the lecture and helped reduce anxiety and stress in students | 11% | "Stats is very heavy and full on and that made it a little bit like it broke up the tension"; "It broke up the stress in the class" |
A comparison between the present results and those reported by other researchers indicates a number of similarities and differences. Like Lee (1999), the present study found that the technology reduced boredom and showed the practical applications of statistics. Like Meletiou-Mavrotheris (2003) and Neumann et al. (2008), the present results indicate that students perceived that the technology helped the learning of statistics. However, unlike Neumann et al. (2008) no theme emerged to indicate that the technology "provides interaction" during the lecture. The present findings also yielded a larger and more diverse range of themes than can be deduced from previous research. This is likely a result of the methods used. Unlike the written feedback examined by Neumann et al. (2008), the one-on-one interview approach used here was more detailed in that it could seek clarification and elaboration from students as needed. The present study also ensured that a broader and more representative sample of students was used than in Meletiou-Mavrotheris (2003) and Neumann et al. (2008) and this may have also produced a wider diversity of themes.
Pulling the results of the present study together, a conceptual framework may be developed on how integrating technology during the lectures influenced the student learning experience. This framework consists of three major components created by the grouping of themes. These are Learning (helped understanding, visual aid to learning, different approach, broke up the content), Addressing negative attitudes (created interest, engagement, motivation, enjoyment, lightened the mood), and Practical application (practical application). The last of these components is consistent with the aim to contextualise statistics within the discipline of study. It is also consistent with efforts to avoid an overemphasis on theory and calculations (Cobb, 1992). Given that multimedia and computer simulations can be developed in other disciplines, it would be expected that a similar benefit would be afforded by using them as part of a blended learning approach in lectures on subjects other than statistics.
The component of Learning is consistent with the goals of the learner and teacher in that it suggests that a blended learning approach applied to a lecture can help with understanding statistics. The theme of Helped understanding supports prior quantitative evaluations that have shown learning benefits from the use of interactive, computer-based activities (e.g., Bliwise, 2005; Morris, 2001; Shute & Gawlick-Grendell, 1994; Shute et al., 1996). These results indicate that even vicarious experience of these interactive, computer-based activities via blending them into a large class, face to face lecture can help understanding. Enhanced student understanding is not reliant on a more typical blended learning approach where students would have direct interaction with these applications only via independent online access. The way in which computer-based tasks can improve understanding by visual learning (Davis & Bostrom, 1992; Psotka et al., 1993) seems to be one means by which the present use of technology helped learning. The advantage of computer-based simulations is that they can actually show the process in action visually rather than asking students to "imagine" the process (e.g., taking a random sample). The technologies that were used also provided a different approach to teaching and learning. As such, this might be expected to benefit those students who find it difficult to learn from a traditional lecture that consists only of talking and slides. The way in which the use of the technology broke up the lecture may also benefit learning because it divided the lecture into smaller, more manageable amounts of information. It may have provided an opportunity to consolidate what had just been covered.
The final component is Addressing negative attitudes. The negative attitudes that many students hold towards statistics have been documented by several investigators. Up to 80% of students in the social sciences report that statistics courses are the most anxiety-inducing in their degree (Onwuegbuzie & Wilson, 2003). Many students also report a low interest in and overall negative attitudes towards statistics (Neumann & Hood, 2009; Tremblay, Gardner & Heipel, 2000). Such factors can lead students to delay taking statistics courses and this can have a 'knock on' effect in delaying the completion of their degree (Onwuegbuzie, 2004). Holding a negative attitude, low motivation, and anxiety over statistics also predicts a lower level of academic achievement (Tremblay et al., 2000). Considering such findings, the benefits to addressing negative attitudes found in the present research are a welcome effect of the approach used. The reported effects of the approach in lightening the mood in the lecture and increasing enjoyment would benefit those students who are anxious or otherwise negative towards statistics. The way in which the approach created interest, increased motivation, and was engaging would also benefit those students who perceive statistics to be boring and something to be avoided. It is known that a student's attention will decline markedly within 25 minutes of the start of the lecture (Bligh, 2000). The use of blended learning technologies, such as the simulations and multimedia examined in this study, may break the monotony of a lecture and recapture the attention of students.
Graham (2005) reviewed four dimensions of blended learning environments that will impact upon the learning experiences of students: space, time, fidelity, and humanness. In terms of space, the present approach was based on a physical, face to face teaching approach. However, the methods used would easily translate into a more virtual or distributed space. Video podcasting of the lectures would still ensure that the computer-based simulations, demonstrations using SPSS, animations, and videos could be used. A similar extension would apply to the dimension of time. The present evaluation was based on live, synchronous delivery, although podcasting would allow for asynchronous learning based on a variable lag time. In terms of fidelity, the present approach was relatively high by incorporating the visual and auditory senses and it used various types of technology. It lacked a kinaesthetic component because it was not possible for students to use the technology themselves in a lecture theatre. However, the computer simulations were made available online and students also practised using SPSS in tutorial classes. Finally, for the dimension of humanness, the present application scored high due to the face to face lecturing. This would be expected to benefit students who profess a preference for face to face teaching (Hanson & Clem, 2005).
Although the present findings appear to be positive for future applications of technology in a lecture, some limitations should be noted. The present study was designed to examine students' learning experiences resulting from participating in the lectures. While the results suggested that there were benefits, no quantitative measures of learning outcomes or attitude change (e.g., statistics anxiety) were used. Moreover, to determine a causal effect of the use of technologies during lectures on improvements in these variables would require a comparison with a suitable control group (for an example that examined animations during a statistics lecture, see Wender & Muehlboeck, 2003). Nevertheless, the present results are consistent with the idea that student learning and engagement will be improved by implementing the methods used here. Further research would also be required to examine exactly how the approaches used here helped the students to learn the material. This would require an in-depth examination of the learning processes involved when students engage with the blended learning technology. This could be done by observing students interacting with the technology and analysing the dialogue that surrounds its use, as done by Meletiou-Mavrotheris (2003). Finally, a potential limitation of the teaching methods is that they require the instructor to possess suitable computer simulations and multimedia for use in a lecture. To address this problem, lecturers could make use of resources that are freely available on the Internet (e.g., The Rice Virtual Lab in Statistics at http://onlinestatbook.com/rvls.html; Lane, 1999).
From a teaching perspective, it is important that technologies are suitably integrated with the lecture content. Technology should not be used merely for the sake of using it. Instead, a learning outcomes focus should be adopted to guide the selection and use of technology within a blended learning approach. It would be poor teaching practice if a video or computer simulation was merely presented to the students during a lecture without any accompanying discussion. The use of a computer simulation, for example, should be suitably introduced to the students. The lecturer should describe the features of the simulation and how it can be controlled using the buttons and entering values. When the lecturer interacts with the simulation, he/she should ask students to guess what will happen before actually showing the process in action. The desired learning outcomes when using the simulation should be clear to students. When finishing with the simulation, it should be suitably linked to the text-based lecture material in a way that reinforces the desired learning outcome. Making the simulation available online after the lecture will also encourage students to interact with it themselves. This approach is, therefore, likely to be superior in engaging student learning than simply making the computer-based applications available to students online.
Teachers should also be mindful of how to best use multimedia and computer-based demonstrations in a lecture format to enhance student learning outcomes. The results from the present study suggest that improved learning outcomes will be due to the increased attention, interest, and motivation that result from using the technology. These aspects are an important part of the student learning experience and should not be ignored. They would appear to be particularly important for subjects that are traditionally perceived by students as boring or overly theoretical. Statistics, which was the focus of the lectures examined in the present study, is one example. Other subjects that could benefit from increasing student interest and motivation include engineering, information technology, and chemistry.
However, what are arguably of most interest to educators are the cognitive learning processes by which blended learning techniques improve outcomes. The present study has suggested that contextualising the material by showing practical applications of theory may serve to broaden student understanding and allow students to link concepts with existing knowledge. It may also promote a deeper level of processing. The visual nature of the technologies used also emerged as an important mechanism by which learning outcomes can be improved. Students have different learning styles and visual learners would be expected to gain most benefit from seeing theoretical concepts in action.
Finally, the methods used were noted by students to be a different approach and this suggests that the methods may particularly benefit students who do not learn optimally through a traditional lecture-based format. Based on these considerations, the techniques used in the present study would be best applied to material whose practical applications students find difficult to see, material that can be presented visually and dynamically, and material that does not suit the traditional lecture-based format. Further research is needed to more fully understand the specific learning processes by which the technologies used in the present study benefited student learning.
Bliwise, N. G. (2005). Web-based tutorials for teaching introductory statistics. Journal of Educational Computing Research, 33, 309-325.
Boud, D. & Prosser, M. (2002). Appraising new technologies for learning: A framework for development. Educational Media International, 39, 237-245.
Carlson, H. L. (1991). Learning style and program design in interactive multimedia. Educational Technology Research & Development, 39, 41-48.
Cobb, G. (1992). Teaching statistics. In L. A. Steen (Ed.), Heeding the call for change: Suggestions for curricular action. MAA Notes No. 22 (pp. 3-43). Washington, DC: Mathematical Association of America.
Davis, S. & Bostrom, R. (1992). An experimental investigation of the roles of the computer interface and individual characteristics in the learning of computer systems. International Journal of Human-Computer Interaction, 2, 143-172.
DET (2003). Blended learning. NSW Department of Education and Training.
Dunn, P. K. (2004). Understanding statistics using computer demonstrations. Journal of Computers in Mathematics and Science Teaching, 22, 83-103.
Ertmer, P. A., Evenbeck, E., Cennamo, K. S. & Lehman, J. D. (1994). Enhancing self-efficacy for computer technologies through the use of positive classroom experience. Educational Technology Research & Development, 42, 45-62.
Gal, I. & Ginsburg, L. (1994). The role of beliefs and attitudes in learning statistics: Towards an assessment framework. Journal of Statistics Education, 2(2). [verified 14 Apr 2011] http://www.amstat.org/publications/jse/v2n2/gal.html
Gilroy, M. (1998). Using technology to revitalize the lecture: A model for the future. In Issues of education and community colleges: Essays by fellows in the mid-career fellowship program at Princeton University (pp. 1-12). Princeton University, NJ.
González, G. M. & Birch, M. A. (2000). Evaluating the instructional efficacy of computer-mediated interactive multimedia: Comparing three elementary statistics tutorial modules. Journal of Educational Computing Research, 22, 411-436.
Graham, C. R. (2005). Blended learning systems: Definition, current trends and future direction. In C. J. Bonk & C. R. Graham (Eds.), The handbook of blended learning: Global perspectives, local designs (pp. 3-21). New York, NY: Pfeiffer & Company.
Gunawardena, C. N., Lowe, C. A. & Anderson, T. (1997). Analysis of an online global debate and the development of an interaction analysis model for examining social construction of knowledge in computer conferencing. Journal of Educational Computing Research, 17, 397-431.
Hanson, K. S. & Clem, F. A. (2005). To blend or not to blend: A look at community development via blended learning strategies. In C. J. Bonk & C. R. Graham (Eds.), The handbook of blended learning: Global perspectives, local designs (pp. 136-150). New York, NY: Pfeiffer & Company.
Heinze, A. & Procter, C. (2004). Reflections on the use of blended learning. In Education in a Changing Environment Conference Proceedings, University of Salford, Salford: Education Development Unit. [verified 14 Apr 2011; 3.3 MB] http://www.ece.salford.ac.uk/proceedings/papers/ah_04.rtf
Jones, A., Scanlon, E., Tosunoglu, C., Ross, S., Butcher, P., Murphy, P. & Greenberg, J. (1996). Evaluating CAL at the Open University: 15 years on. Computers & Education, 26, 5-15.
Kuehn, S. A. (1994). Computer-mediated communication in instructional settings: A research agenda. Communication Education, 43, 171-183.
Lane, D. M. (1999). The Rice Virtual Lab in Statistics. Behavior Research Methods, Instruments, & Computers, 31(1), 24-33.
Lee, C. (1999). A comparison of students' beliefs and attitude towards statistics between technology-rich environment and traditional lecture. International Conference on Mathematics/Science and Technology (MEST), 1999, 133-138. [viewed 4 Aug 2010] http://www.editlib.org/p/9127
Lens, W. (1994). Personal computers in the learning environment and student motivation. Scandinavian Journal of Educational Research, 38, 219-230.
Malloy, T. E. & Jensen, G. C. (2001). Utah virtual lab: JAVA interactivity for teaching science and statistics on line. Behavior Research Methods, Instruments, & Computers, 33, 282-286.
McGarr, O. (2009). A review of podcasting in higher education: Its influence on the traditional lecture. Australasian Journal of Educational Technology, 25(3), 309-321. http://www.ascilite.org.au/ajet/ajet25/mcgarr.html
Meletiou-Mavrotheris, M. (2003). Technological tools in the introductory statistics classroom: Effects on student understanding of inferential statistics. International Journal of Computers for Mathematical Learning, 8, 265-297.
Morris, E. J. (2001). The design and evaluation of Link: A computer-based learning system for correlation. British Journal of Educational Technology, 32, 39-52.
Morris, E. J., Joiner, R. & Scanlon, E. (2002). The contribution of computer-based activities to understanding statistics. Journal of Computer Assisted Learning, 18, 114-124.
Morris, E. J. & Scanlon, E. (2000). Active learning of statistics: A case study. Association for Learning Technology Journal (ALT-J), 8(1), 80-91. http://repository.alt.ac.uk/id/eprint/323
Neumann, D. L. (2010). Using interactive simulations in assessment: The use of computer-based interactive simulations in the assessment of statistical concepts. The International Journal for Technology in Mathematics Education, 17, 43-51. [verified 14 Apr 2011] http://www98.griffith.edu.au/dspace/bitstream/10072/34144/1/64475_1.pdf
Neumann, D. L. & Hood, M. (2009). The effects of using a wiki on student engagement and learning of report writing skills in a university statistics course. Australasian Journal of Educational Technology, 25(3), 382-398. http://www.ascilite.org.au/ajet/ajet25/neumann.html
Neumann, D. L., Hood, M. & Neumann, M. M. (2008). Strategies that enhance student engagement during the teaching of statistics in psychology programs. In Proceedings of the 43rd APS Conference (pp. 234-238). Melbourne: Australian Psychological Society. http://www98.griffith.edu.au/dspace/bitstream/10072/23367/1/53168_1.pdf
Neumann, D. L., Hood, M. & Neumann, M. M. (2009). Statistics? You must be joking: The application and evaluation of humor when teaching statistics. Journal of Statistics Education, 17(2). http://www.amstat.org/publications/jse/v17n2/neumann.pdf
Neumann, D. L., Neumann, M. M. & Hood, M. (2010). The development and evaluation of a survey that makes use of student data to teach statistics. Journal of Statistics Education, 18(1). http://www.amstat.org/publications/jse/v18n1/neumann.pdf
Onwuegbuzie, A. J. (2004). Academic procrastination and statistics anxiety. Assessment and Evaluation in Higher Education, 29, 3-19.
Onwuegbuzie, A. J. & Wilson, V. A. (2003). Statistics anxiety: Nature, etiology, antecedents, effects, and treatments - a comprehensive review of the literature. Teaching in Higher Education, 8(2), 195-209.
Procter, C. (2003). Blended learning in practice. In Proceedings of Conference on Education in a Changing Environment 2003. Salford, UK: The University of Salford. http://www.ece.salford.ac.uk/proceedings/papers/cp_03.rtf
Psotka, J., Kerst, S. & Westerman, T. (1993). The use of hypertext and sensory-level supports for visual learning of aircraft names and shapes. Behavior Research Methods, Instruments, & Computers, 25, 168-172.
Rey, G. D. (2010). Instructional advice, time advice and learning questions in computer simulations. Australasian Journal of Educational Technology, 26(5), 675-689. http://www.ascilite.org.au/ajet/ajet26/rey.html
Riding, R. & Douglas, G. (1993). The effect of cognitive style and mode of presentation on learning performance. British Journal of Educational Psychology, 63, 297-307.
Roberts, G. & Dunn, P. (1996). Electronic classrooms and lecture theatres: Design and use factors in the age of the mass lecture. In J. G. Hedberg, J. Steele & S. McNamara (Eds.), Learning technologies: Prospects and pathways (pp. 144-152). Canberra: AJET Publications. http://www.ascilite.org.au/aset-archives/confs/edtech96/roberts.html
Shute, V. J. & Gawlick-Grendell, L. A. (1994). What does the computer contribute to learning? Computers & Education, 23, 177-186.
Shute, V. J., Gawlick-Grendell, L. A., Young, R. K. & Burnham, C. A. (1996). An experiential system for learning probability: Stat Lady description and evaluation. Instructional Science, 24, 25-46.
Swan, K. (2009). Introduction to the special issue on blended learning. Journal of the Research Center for Educational Technology, 5(1), 1-3. http://www.rcetj.org/index.php/rcetj/article/view/20/25
Traphagan, T., Kucsera, J. V. & Kishi, K. (2010). Impact of class lecture webcasting on attendance and learning. Educational Technology Research & Development, 58, 19-37.
Tremblay, P. F., Gardner, R. C. & Heipel, G. (2000). A model of the relationships among measures of affect, aptitude, and performance in introductory statistics. Canadian Journal of Behavioural Science, 32(1), 40-48.
Wieling, M. B. & Hofman, W. H. A. (2010). The impact of online video lecture recordings and automated feedback of student performance. Computers & Education, 54, 992-998.
Wender, K. F. & Muehlboeck, J. S. (2003). Animated diagrams in teaching statistics. Behavior Research Methods, Instruments, & Computers, 35, 255-258.
Authors: Associate Professor David Neumann, School of Psychology, Griffith University QLD 4222, Australia.
Michelle Neumann, School of Psychology, Griffith University QLD 4222, Australia.
Dr Michelle Hood, School of Psychology, Griffith University QLD 4222, Australia.
Please cite as: Neumann, D., Neumann, M. & Hood, M. (2011). Evaluating computer-based simulations, multimedia and animations that help integrate blended learning with lectures in first year statistics. Australasian Journal of Educational Technology, 27(2), 274-289. http://www.ascilite.org.au/ajet/ajet27/neumann.html