Australasian Journal of Educational Technology
2012, 28(4), 619-638.
Comparing computer game and traditional lecture using experience ratings from high and low achieving students
Michael Grimley, Richard Green, Trond Nilsen and David Thompson
University of Canterbury
Computer games are purported to be effective instructional tools that enhance motivation and improve engagement. The aim of this study was to investigate how tertiary student experiences change when instruction is computer game based rather than lecture based, and whether experiences differ between high and low achieving students. Participants consisted of two cohorts enrolled in a first year university course (Cohort 1, traditional: male=42, female=17; Cohort 2, computer game: male=42, female=7). Cohort 1 experienced course content as traditional lectures; Cohort 2 experienced course content embedded within a computer game. Csikszentmihalyi's experience sampling method was used to sample the experiences of students in each cohort during instruction. Results showed that the computer game group were more challenged and valued the activity more than the traditional group, but were more inclined to wish they were doing something else. During game mode, high achieving students reported greater concentration but found it harder to concentrate; they also found game mode more sociable and lecture mode more boring. High achievers perceived greater success for lecture mode and found lectures more satisfying. Individual profiles of high and low achieving students for each mode indicated that games afforded better experiences for low achieving students but poorer experiences for high achieving students.
Some commentators revere computer games as revolutionary educational tools (e.g. Prensky, 2001; DeHaan, 2005; Lainema & Nurmi, 2006; Ip, Capey, Baker & Carroll, 2009), suggesting that their promise lies in their interactive, social and highly motivational nature. The intrinsic motivational value of computer games is hard to deny, given the large proportion of children and adults who now play these games in their leisure time: reports indicate that in the US 53% of adults and 97% of teens play video games (Lenhart, Jones & Macgill, 2008). Such motivational value has drawn educators towards using computer games as instructional tools. Malone and Lepper (1987) offer a number of characteristics that stimulate intrinsic motivation, including challenge, curiosity, control, fantasy, competition, cooperation and recognition. These characteristics are displayed in most modern computer games (Prensky, 2001; Dickey, 2011).
Another argument for the use of computer games as instructional tools at the tertiary level is that most students enrolled in university courses are digital natives (Prensky, 2001) or 'Net Geners', a generation of learners said to engage only when learning through interaction, experience and exploration (Oblinger & Oblinger, 2005; Prensky, 2001). It might therefore be reasonable to assume that students in universities would have improved instructional experiences through computer game instruction compared with traditional lectures.
A number of studies have investigated the utility of computer games for instructional purposes, and two recent meta-analyses synthesise these studies. Vogel et al (2006) performed a meta-analysis of studies comparing computer games and interactive simulations with traditional instructional methods, and found that games and interactive simulations produced greater cognitive gains. Sitzmann (2011) conducted a similar analysis and found greater training self-efficacy, declarative knowledge, procedural knowledge and long term retention for computer games and interactive simulations than for traditional instructional methods. Sitzmann (2011) goes on to point out that it is the interactive nature of the game that is important in the learning process, and that this active engagement is what distinguishes computer games from traditional instruction. However, it could be argued that most learning at tertiary level occurs post-instruction and that lectures are merely catalysts for more self-regulated learning, thus negating the need for interactive instruction.
Another viewpoint on the debate between traditional and more contemporary methods of instruction is that it is not merely the instructional technique that promotes learning, but how the learner perceives that technique (Entwistle, 1991; Struyven et al, 2008): instructional techniques that are perceived as encouraging deep learning will also facilitate such learning. In this respect, lectures give the perception of surface learning (Case & Marshall, 2004) and as such may facilitate surface learning while doing nothing for deeper learning and understanding. Further, learners' expectations of learning and of learning environments have been suggested to be important for learning outcomes: if expectations are met, performance may improve (Sander et al, 2000). On this view, computer games may be seen as fun but not suited to the serious business of learning, thus lowering expectations and, as a result, detracting from learning and understanding. However, O'Leary et al (2005) showed that instruction which supported active learning and attained high student satisfaction produced minimal improvement in achievement compared to lecture based instruction, and Hardy et al (2003) emphasised that it is not necessarily instruction that predicts exam achievement, but students' antecedents.
It seems clear that learning is multi-faceted and not fully measurable through traditional tests of academic achievement such as examinations. Kirkpatrick (1994) includes both affective and cognitive variables and describes learner reaction as important. For example, a learner's motivation to engage with the learning material is an important aspect of these reactions, so instructors need to engage students and instil intrinsic motivation to learn. Without motivation, most learning environments are ineffective (Lepper & Chabay, 1985). It is therefore important to understand how student experiences within a course relate to overall achievement in that course.
The majority of studies evaluating the efficacy of computer games focus on cognitive outcomes; more research is needed to evaluate students' affective outcomes. In addition, an instructional technique is not necessarily effective for all learners, so some effort is required to investigate how techniques may be varied according to student characteristics. To date, insufficient empirical research has been conducted to fully validate the use of computer games, especially given the practical constraints of using them for instructional purposes (Connolly, Stansfield & Hainey, 2007). This paper describes some of the findings of a study designed to explore the impact of using computer games to teach first year undergraduates. In particular, it addresses how using a computer game in a tertiary course changes student experiences compared to a lecture approach, and how these experiences differ between high and low achieving students.
This study investigates the changes in learner experience brought about by changes in instructional mode and compares the experiences of high and low achieving students. It asks two important questions: how do student experiences change when instruction is computer game based rather than lecture based, and do these experiences differ between high and low achieving students?
Neverwinter Nights and its toolset (Bioware, 2002) were used to construct the game modules, chosen for the game's graphically advanced content and the relative ease with which original modules can be constructed. Neverwinter Nights is a medieval fantasy role playing game based on the Dungeons & Dragons system. Individual modules constructed to deliver the educational content were embedded into an overall hub module (depicted as Ye Olde University of Canterbury) by placing each content module in a different area of the hub (i.e. within different university departments). The overall narrative cast players as first year students at a medieval University of Canterbury and encouraged them to progress into subsequent years, following the career development of an academic, as they completed modules successfully and gained experience tokens.
During play, characters received experience points for solving problems associated with the educational content, and for end of unit quizzes designed to test their knowledge and understanding of the key educational concepts embedded in the module. As experience points reached certain thresholds, characters gained in power and were able to 'go up a level'. Character level affected almost everything about a character and allowed him or her to achieve much more within the game. The game was set up to allow players to return to modules in their own time to improve upon their original score if they wished, thus encouraging them to revisit the module content. Players progressed through the game by engaging with the various challenges built into each department's module, with the ultimate aim of accruing enough experience points to challenge for the position of Vice Chancellor in the finale (see Figure 1 for screen pictures).
Experience sampling method
Student experiences were rated using the Experience Sampling Method (Hektner, Schmidt & Csikszentmihalyi, 2007), originally designed to capture real time experience and measure feelings of flow. The experience sampling form selected for this study was adapted from that used in the 'Talented Teenagers' study (Csikszentmihalyi, Rathunde & Whalen, 1993, pp. 52-53) and contained subjective questions designed to sample participants' mood, thoughts, general feelings, and feelings about the activity. Table 1 shows the experience indicators contained in the experience sampling forms completed by students. Students completed one experience sampling form per hour of instruction; each form was administered by an objective observer at a random time during the session, predetermined by a random number generator.
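The random scheduling described above might be sketched as follows (an illustrative reconstruction only; the session length and the function name are our assumptions, not details reported by the study):

```python
import random

def beep_minute(session_length_minutes=60, seed=None):
    """Predetermine the random minute (within a hypothetical one-hour
    session) at which the observer administers the sampling form.

    A fixed seed makes the time reproducible, mimicking a schedule
    drawn up before the session starts."""
    rng = random.Random(seed)
    return rng.randrange(session_length_minutes)

# One predetermined sampling time per one-hour session:
print(beep_minute(60, seed=42))
```

With one form per hour of instruction, each session would get its own independently drawn time, so students cannot anticipate the sampling moment.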
Figure 1: Screen pictures of the game environment
Table 1: A list of experience indicators
How well were you concentrating?
Was it hard to concentrate?
How self conscious were you?
Did you feel good about yourself?
Were you in control of the situation?
Were you living up to your own expectations?
Were you living up to others' expectations?

Alert - drowsy
Happy - sad
Irritable - cheerful
Strong - weak
Active - passive
Lonely - sociable
Ashamed - proud
Involved - detached
Excited - bored
Closed - open
Clear - confused
Tense - relaxed
Competitive - cooperative

Challenges of the activity
Your skills in the activity
Was the activity important to you?
Was the activity important to others?
Were you succeeding at what you were doing?
Do you wish you had been doing something else?
Were you satisfied with how you were doing?
How important was this activity in relation to your overall goals?

Did you feel any pain or discomfort as you were beeped?
Rating scores were standardised by creating individual z scores in order to remove individual differences. This procedure "removes differences between individuals in how they respond to each item. These z-scores are created by subtracting the subject's overall mean for the item and then dividing by the subject's standard deviation" (Larson & Delespaul, 1992, p. 75). Three high and three low achieving students within each cohort were compared qualitatively by graphing the aggregated standardised experience scores for all 29 experience indicators to produce individual experience profiles. These six profiles were chosen by taking the three lowest and three highest scoring students who had complete data. Finally, a short interview was conducted with participants in the game condition about their thoughts on using computer games for learning. Three questions were asked:

1. What do you think of learning the course material through playing a game?
2. Would you prefer to learn the course material through a game course or a traditional lecture?
3. Are there any other benefits or negatives using game style learning?

Responses were grouped by broad theme and example responses are reported below.
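The within-person standardisation quoted from Larson and Delespaul (1992) can be sketched in a few lines (a minimal illustration; the function and variable names are our own):

```python
from statistics import mean, stdev

def person_z_scores(ratings):
    """Standardise one participant's repeated ratings of a single item.

    Subtracts that participant's own mean for the item and divides by
    their own standard deviation, removing between-person differences
    in how individuals use the rating scale."""
    m, s = mean(ratings), stdev(ratings)
    return [(r - m) / s for r in ratings]

# A participant who rated "How well were you concentrating?" as 2, 4 and 6
# across three sampling occasions:
print(person_z_scores([2, 4, 6]))  # [-1.0, 0.0, 1.0]
```

After this transformation every participant's scores for an item have mean 0 and standard deviation 1, so a positive index marks an above-average experience for that person, which is how the individual profiles in Figures 2-7 are read.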
| Group | Cohort 1 (Lecture) mean | Cohort 1 (Lecture) range (n) | Cohort 2 (Game) mean | Cohort 2 (Game) range (n) |
|---|---|---|---|---|
| High achievement group | 38.2 | 32-50 (24) | 35.7 | 31-57 (19) |
| Low achievement group | 22.5 | 8-31 (24) | 18.2 | 7-25.5 (19) |
| Total | 30.3 | 8-50 (48) | 26.9 | 7-57 (38) |
| Experience indicator | Lecture M (SD) | Game M (SD) | Effect size |
|---|---|---|---|
| Challenges of the activity | 3.77 (1.75) | 4.69 (1.59) | 0.55 |
| Was the activity important to you? | 4.96 (1.65) | 6.09 (1.48) | 0.72 |
| Was the activity important to others? | 5.36 (1.41) | 6.02 (1.49) | 0.46 |
| Do you wish you had been doing something else? | 4.50 (2.10) | 5.64 (2.09) | 0.54 |
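The reported effect sizes appear consistent with Cohen's d (cf. Cohen, 1988) computed from the cell means and standard deviations with an equal-weight pooled SD. This is a sketch under that assumption, since the paper does not state the formula explicitly:

```python
from math import sqrt

def cohens_d(m1, sd1, m2, sd2):
    """Standardised mean difference (m2 - m1) / pooled SD.

    Pooled SD here is the root mean square of the two group SDs
    (equal-weight pooling); this choice reproduces the reported
    values to two decimal places."""
    pooled_sd = sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (m2 - m1) / pooled_sd

# "Challenges of the activity": lecture 3.77 (1.75) vs game 4.69 (1.59)
print(round(cohens_d(3.77, 1.75, 4.69, 1.59), 2))  # 0.55
```

The same computation recovers 0.72, 0.46 and 0.54 for the other three rows, which supports reading the final column as Cohen's d.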
| Experience indicator | Attainment | Lecture M (SD) | Game M (SD) |
|---|---|---|---|
| How well were you concentrating? | High | 5.42 (1.41) | 6.08 (1.43) |
| | Low | 5.79 (1.14) | 5.21 (1.55) |
| Was it hard to concentrate? | High | 3.16 (1.23) | 3.63 (1.87) |
| | Low | 4.00 (1.54) | 2.92 (1.98) |
| Lonely - sociable | High | 3.41 (0.58) | 3.66 (0.71) |
| | Low | 3.77 (0.81) | 3.27 (0.57) |
| Excited - bored | High | 2.91 (0.64) | 2.59 (0.64) |
| | Low | 2.62 (0.69) | 2.91 (0.78) |
| Were you succeeding at what you were doing? | High | 6.05 (0.98) | 5.98 (1.10) |
| | Low | 5.68 (1.17) | 6.67 (1.10) |
| Were you satisfied with how you were doing? | High | 5.89 (1.05) | 5.50 (1.16) |
| | Low | 5.42 (1.30) | 6.25 (1.18) |
First, a mode by attainment interaction for level of concentration (F(1, 82) = 4.380, p = .039) showed a classic crossover pattern, with high attainment students showing greater concentration for game delivery and low attainment students showing greater concentration for lecture delivery. Second, a significant interaction was displayed for difficulty concentrating (F(1, 82) = 4.711, p = .033), with high attainment students finding it harder to concentrate in the game mode and low attainment students finding it harder to concentrate in the lecture mode. Third, there was an interaction effect between mode and attainment for the level of sociability experienced by students (F(1, 82) = 6.214, p = .015), with high attainment students feeling more sociable in the game mode and low attainment students feeling more sociable in the lecture mode.
Fourth, an interaction effect between delivery mode and attainment level for boredom level (F(1, 82) = 3.951, p = .05) showed that high attaining students found lectures more boring and low attaining students found games more boring. Fifth, an interaction was observed between mode and attainment for perceived level of success (F(1, 82) = 5.044, p = .027) with high attainment individuals showing similar perceived levels of success for both modes and low attaining students showing higher perceived levels of success in the game mode compared to lecture mode. Finally, a mode by attainment interaction was shown for satisfaction level (F(1, 82) = 5.721, p = .019), with high attaining students more satisfied with the lecture mode and low attaining students more satisfied with the game mode.
Lecture mode profiles
Figures 2, 3 and 4 show three pairs of individual lecture experience profiles for students differentiated by their examination result (the three highest versus the three lowest). When viewing these profiles it is useful to note that scores were standardised to a mean of zero and a standard deviation of 1, so a positive index indicates a positive experience and a negative index indicates a negative experience. It is evident from these comparisons that high achieving students generally had more positive experiences than low achieving students.
Figure 2: Individual experience profile of a low attainment student and a high
attainment student for aggregated standardised LECTURE experiences
Figure 3: Individual experience profile for a low attainment student and a
high attainment student for aggregated standardised LECTURE experiences
Figure 4: Individual experience profile for a low attainment student and a
high attainment student for aggregated standardised LECTURE experiences
Game mode profiles
Figures 5, 6 and 7 show three different pairs of individual game experiences for students differentiated by their examination results. Game experience profiles differentiated by attainment level show no clear patterns. Compared to lecture profiles, these high and low achieving students show very similar experiences. It is clear that compared to the lecture experience profiles, game experience profiles show that high achieving students have reduced positive experiences and low achieving students have increased positive (reduced negative) experiences.
Figure 5: Individual experience profile for a low attainment student and a high
attainment student for aggregated standardised GAME experiences
Figure 6: Individual experience profile for a low attainment student and a high
attainment student for aggregated standardised GAME experiences
Figure 7: Individual experience profile for a low attainment student and a high
attainment student for aggregated standardised GAME experiences
"Yes, I found it very enjoyable. Find games are easy to learn."2. Would you prefer to learn the course material through a game course or a traditional lecture?
"Yeah, quite enjoyable."
"It is definitely different. Nothing that I have been exposed to before. It is pretty good and I'm enjoying it. And supposed to wait until the exam to see if it works or not."
"Yeah, I think I remember faster if I doing it yourself. By playing it yourself, it is easier to learn."
"Enjoyable and good for learning. You learn a lot of information faster."
"Quite interesting & refreshing what I did."
"Definitely quite different. Yes, enjoyable."
"Yes, it's alright, it's pretty fun. Yes, it is useful for learning, but you get distracted sometimes from the game."
"It is alright in learning through the game. It is enjoyable, but not sure how much I'm taking in."
"It's quite good. We learn as we play and it's quite fun."
"Yes, I find it interesting for learning."
"I think it is brilliant & wonderful."
"Game of course, because it is more entertaining."3. Are there any other benefits or negatives using game style learning?
"Prefer computer games. More fun."
"I think a mix of both. Because the computer games, play the game from the actual learning, while the lectures you can actually learn, listen to the lecturers, taking notes and go through that a lot. But in the game, all the material can't actually go through, unless you open up the game again and again. So a mixture will be good."
"I think both is fine to me."
"Good idea to use computer game for learning and studying. More practice no more theory."
"Lectures are a bit easier. You don't get so frustrated."
"I prefer computer game, because it is more fun and interesting."
"I don't really know."
"Game probably works for some people, different people have different way, some like to sit there quietly to learn."
"Easier to stick to it. You wouldn't find it easy to give up."
"Means you always get lecturers which is a good thing."
"Will benefit because reinforcing what we learn and means we need to do our reading well to use in the game."
"It does get confusing. Because it is new to me, but I'm sure other people find it easier and some find it harder. So different effects, but I wouldn't say it is negative."
"Bugs and codes... can be very irritating. There is one at the moment which means that no matter how many questions I get right in the module I go back to zero."
Results also indicated that, compared to the lecture experience, students felt that the activity of learning using a computer game was important to them. This may reflect the fact that this was a self-selecting sample, likely to have an interest in computer games and in their use as an instructional tool. In addition, lectures would be more familiar, making the game experience novel and potentially perceived as more important. Interestingly, students in the game mode also indicated that, compared to lectures, they thought learning through games was important to others. However, this question is somewhat ambiguous as it is difficult to ascertain who 'others' might refer to. For instance, 'others' could refer to the researchers, whom the students may have perceived as thinking that the activity was important.
One assertion that many researchers make about game based learning is that it is fun and motivating (Prensky, 2001; Dickey, 2011) and this is often a major reason for pursuing games as an instructional tool. However, students in game mode were more likely to wish that they were 'doing something else' compared to students in the lecture mode. One explanation for this difference may be that the incorporation of computer games for instructional purposes in a formal course renders the fun in the game as inert. Thus, the game becomes just another instructional tool rather than the fun activity that they are used to doing when playing computer games as a leisure activity. If this is the case it is important for instructors who are contemplating using a computer game as an instructional tool to consider its educational potential in terms of what else it can add to the learning environment rather than just something that makes learning fun or intrinsically motivational. The challenging and active nature of the game experience likely adds quality to the learning experience thus maximising instructional time more effectively.
Further to the main effects, six interaction effects indicated that high and low achieving students tended to react differently to the different instructional modes. The first of these concerned how well students were concentrating. High attainment students indicated that they were not concentrating as well in the lecture condition as in the game condition, with the reverse true of the low attainment group, who indicated that they were concentrating well in the lecture condition, but not so well in the game condition. On its own this finding seems puzzling, as one would expect the high attainment group to concentrate well in lectures. However, the second interaction effect may illuminate the first: high achieving students found it harder to concentrate in the game condition than the lecture condition, with the reverse true for low achieving students. This may suggest that low attainment students find the lecture content more difficult to understand, causing them to doubt their concentration levels. In the game condition, low attainment students may believe that they are concentrating well by focusing on the game play (rather than on the passive lecture material), whereas high attainment students may find it easy to concentrate on lecture content whilst finding the added distraction of game interaction frustrating. The assessment outcomes of the low attainment group suggest that they failed to judge the adequacy of their concentration effectively, whereas the high attainment group were more able to cope with both types of instruction.
Another difference in student experience was that high achieving students found the game mode more sociable than lecture mode, with the reverse true for low achieving students. This may be due to low achieving students being more 'off task' in the lecture scenario, adding to their lack of ability to concentrate in a lecture context. In reality, game mode should be much more sociable than lecture mode, because although students worked individually at a computer they worked together in a computer lab, at their own pace, without the need to listen to the lecturer and so were free to talk to each other.
Low attaining students indicated that they were more bored in the game situation compared to high attaining students, who were more bored in the lecture situation. This result is difficult to explain because one would expect the reverse. One explanation for this result may be that students who are struggling with the content of the lectures are having to concentrate hard to comprehend the material, and thus are less likely to be bored. The same students in the game mode may fail to engage fully with the course content, but are able to concentrate on game playing, only to find that it fails to live up to their normal leisure time game play. This is supported by the result showing that low attaining students were inclined to perceive that they were not succeeding during lectures, but were succeeding during game mode. However, contrary to this, high attaining students felt that they were succeeding during both forms of instruction. The final interaction also supports these ideas because low attainment students seemed much more satisfied with their performance in the game mode than in the lecture mode, whereas high attaining students were more satisfied with their performance in the lecture scenario than in the game scenario.
Through the qualitative analysis of individual experiences between modes for high and low achievers, some clear patterns were observed. Low achievers in lecture mode encountered some extreme negative experiences compared to high achievers; these negative extremes were not evident in game mode. In addition, lecture experiences differentiated between high and low achievers, with high achievers showing more positive profiles and low achievers more negative profiles, whereas profiles of high and low achieving students in game mode were much less differentiated. Although it could be argued that high achievers showed a slightly more positive experience profile than low achievers, the profiles were in fact very similar. It seems that the introduction of a computer game instructional mode tended to diminish the quality of experience for high achievers, but improved it for low achievers.
Given this difference, we argue that instructors need to base their choice of instructional strategy on more than the perceived experience of students. It is important that the use of computer games is matched closely to students' learning needs, rather than justified by motivational value alone. In addition, if the learner's perception is important, as suggested by Entwistle (1991) and Struyven et al (2008), then it would be important to distinguish between struggling learners and those who are more accomplished before using innovative techniques such as computer games, in order to establish their likely impact on the learner. It is clear from the responses given by participants during the short interviews that not all students felt they learned best from computer games, highlighting the need to match the instructional mode to the individual learner.
This study has a number of limitations. First, the sample was relatively small and restricted to one course within one university department. Second, the sample was self-selecting and, given the name of the course (Computer Games and Education), was likely to attract students with a vested interest in computer games. Finally, because the field is developing very quickly, the type and sophistication of modern computer games changes rapidly, which in turn is likely to affect the experiences that students gain from different games. This study only looked at one game, which had its flaws: for example, it relied on textual communication, which could be construed as a limitation of this particular game, and students played it in single player mode rather than multi-player mode. These factors are likely to affect the experiences gained by the students.
However, one of the main reasons for employing this particular game was that it came with an extensive toolset which could be utilised by instructors to avoid having to employ expensive computer programmers. As computer games and their toolsets develop instructors will be in a position to build their own educational games. Such advances are now being realised through platforms such as multi-user virtual worlds, notably Second Life and OpenSim which allow games to be easily built within them. It is therefore important that such environments are fully investigated to ascertain their utility for education and in particular examine the experiences of players in educational environments.
Cohen, J. (1988). Statistical power for the behavioral sciences (2nd ed.). Hillsdale, NJ: Erlbaum.
Connolly, T. M., Stansfield, M. & Hainey, T. (2007). An application of games-based learning within software engineering. British Journal of Educational Technology, 38(3), 416-428. http://dx.doi.org/10.1111/j.1467-8535.2007.00706.x
Csikszentmihalyi, M., Rathunde, K. & Whalen, S. (1993). Talented teenagers: The roots of success and failure. Cambridge: Cambridge University Press.
DeHaan, J. W. (2005). Acquisition of Japanese as a foreign language through a baseball videogame. Foreign Language Annals, 38(2), 278-282. http://dx.doi.org/10.1111/j.1944-9720.2005.tb02492.x
Dickey, M. D. (2011). Murder on Grimm Isle: The impact of game narrative design in an educational game-based learning environment. British Journal of Educational Technology, 42(3), 456-469. http://dx.doi.org/10.1111/j.1467-8535.2009.01032.x
Entwistle, N. J. (1991). Approaches to learning and perceptions of the learning environment: Introduction to the special issue. Higher Education, 22, 201-204. http://www.jstor.org/stable/3447172
Hardy, S. A., Zamboanga, B. L., Thompson, R. A. & Reay, D. (2003). Student background and course involvement among first-year college students in introduction to psychology: Implications for course design and student achievement. Psychology Learning and Teaching, 3(1), 6-10. http://dx.doi.org/10.2304/plat.2003.3.1.6
Hektner, J. M., Schmidt, J. A. & Csikszentmihalyi, M. (2007). Experience sampling method: Measuring the quality of everyday life. London: Sage.
Ip, B., Capey, M., Baker. & Carroll, J. (2009). Evaluating coursework in computer games degrees: Students and assessors as virtual characters. Australasian Journal of Educational Technology, 25(1), 80-100. http://www.ascilite.org.au/ajet/ajet25/ip.html
Kirkpatrick, D. L. (1994). Evaluating training programs: The four levels. San Francisco, CA: Berrett- Koehler.
Klopfer, E., Osterweil, S., Grogg, J. & Haas, J. (2009). Using the technology of today in the classroom today: The instructional power of digital games, social networking, simulations and how teachers can leverage them. The Education Arcade. [viewed 13 Sep 2011] http://education.mit.edu/papers/GamesSimsSocNets_EdArcade.pdf
Knight, J. K. & Wood, W. B. (2005). Teaching more by lecturing less. Cell Biology Education, 4, 298-310. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1305892/
Lainema, T. & Nurmi, S. (2006). Applying an authentic, dynamic learning environment in real world business. Computers & Education, 47, 94-115. http://dx.doi.org/10.1016/j.compedu.2004.10.002
Larson, R. & Delespaul, P. A. E. G. (1992). Analyzing experience sampling data: A guidebook for the perplexed. In M. DeVries (Ed.), The experience of psychopathology: Investigating mental disorders in their natural setting. Cambridge, UK: Cambridge University Press.
Lenhart, A., Jones, S. & Macgill, A. (2008). Adults and video games. (Research report). Pew Internet and American Life Project. [viewed 29 Aug 2011]. http://www.pewinternet.org/~/media//Files/Reports/2008/PIP_Adult_gaming_memo.pdf.pdf
Lepper, M. R. & Chabay, R. W. (1985). Intrinsic motivation and instruction: Conflicting views on the role of motivational processes in computer-based education. Educational Psychologist, 20(4), 217-230. http://dx.doi.org/10.1207/s15326985ep2004_6
Malone, T. W. & Lepper, M. R. (1987). Making learning fun: A taxonomy of intrinsic motivations for learning. In R. E. Snow & M. J. Farr (Eds), Aptitude, learning, and instruction. Volume 3: Cognitive and affective process analyses. Hillsdale, NJ: Lawrence Erlbaum Associates, pp. 223-250.
Oblinger, D. G. & Oblinger, J. L. (2005). Educating the Net Generation. EDUCAUSE. http://www.educause.edu/educatingthenetgen/
O'Leary, S., Churley-Strom, R., Diepenhorst, L. & Magrane, D. (2005). Educational games in an obstetrics and gynecology core curriculum. American Journal of Obstetrics and Gynecology, 193(5), 1848-1851. http://www.ajog.org/article/S0002-9378(05)01144-0/abstract
Prensky, M. (2001). Digital Natives, Digital Immigrants. On the Horizon, 9, (5). [viewed 19 August 2011]. http://www.marcprensky.com/writing/Prensky%20-%20Digital%20Natives,%20Digital%20Immigrants%20-%20Part1.pdf
Sander, P., Coates, D., King, M. & Stevenson, K. (2000). University students' expectations of teaching. Studies in Higher Education, 25(3), 309-323. http://dx.doi.org/10.1080/03075070050193433
Sitzmann, T. (2011). A meta-analytic examination of the instructional effectiveness of computer-based simulation games. Personnel Psychology, 64, 489-528. http://dx.doi.org/10.1111/j.1744-6570.2011.01190.x
Struyven, K., Dochy, F., Janssens, S. & Gielen, S. (2008). Students' experiences with contrasting learning environments: The added value of students' perceptions. Learning Environments Research, 11(2), 83-109. http://dx.doi.org/10.1007/s10984-008-9041-8
Vogel, J. J., Vogel, D.S., Cannon-Bowers, J., Bowers, C. A., Muse, K. & Wright, M. (2006). Computer gaming and interactive simulations for learning: A meta-analysis. Journal of Educational Computing Research, 34(3), 229-243. http://dx.doi.org/10.2190/FLHV-K4WA-WPVQ-H0YM
Authors: Michael Grimley, Associate Professor of Education
Faculty of Business and Enterprise,
Swinburne University of Technology, Hawthorn Vic 3122
(formerly School of Education Studies and Human Development, University of Canterbury)
Richard Green, Trond Nilsen and David Thompson
Department of Computer Science and Software Engineering,
University of Canterbury, Private Bag 4800, Christchurch 8140, New Zealand
Please cite as: Grimley, M., Green, R., Nilsen, T. & Thompson, D. (2012). Comparing computer game and traditional lecture using experience ratings from high and low achieving students. Australasian Journal of Educational Technology, 28(4), 619-638. http://www.ascilite.org.au/ajet/ajet28/grimley.html