|Australasian Journal of Educational Technology
2011, 27(4), 727-750.
Using computer-based instruction to improve Indigenous early literacy in Northern Australia: A quasi-experimental study
Colorado State University
Janet Helmer, Tess Lea, Helen Harper, Kalotina Chalkiti, Christine Bottrell
Charles Darwin University
The effectiveness of a web-based reading support tool, ABRACADABRA, in improving the literacy outcomes of Indigenous and non-Indigenous students was evaluated over one semester in several Northern Territory primary schools in 2009. ABRACADABRA is intended as a support for teachers in the early years of schooling, giving them a user-friendly, game-based and evidence-based tool to reinforce their literacy instruction. The classroom implementation of ABRACADABRA by briefly trained and intensively supported teachers was evaluated using a quasi-experimental pretest, post-test control group design with 118 children in the intervention group and 48 in the control group. Children received either a minimum of 20 hours of technology-based intervention or regular classroom teaching. Results revealed that both Indigenous and non-Indigenous students who received ABRACADABRA instruction had significantly higher phonological awareness scores than their control group peers. The effect size for this difference was large (eta squared = .14). This finding remained when controlling for student attendance and the quality of general non-technology-based literacy instruction. Limitations of the study and implications for effective practice in remote and regional contexts are discussed.
The National Report also ushered in an era of increased emphasis on assessment and evaluation in Australia, leading to the introduction of assessment at Years 3, 5, 7 and 9 to measure and report student and school achievement. One issue brought to the fore by the national testing system is the significant 'gap' in academic achievement between Indigenous and non-Indigenous students, with Indigenous students lagging far behind their non-Indigenous counterparts (Masters & Forster, 1997; Ladwig & Sarra, 2009, p. 14). For example, in the Northern Territory (NT), which has the greatest proportion of Indigenous students (45% of students in the NT are Indigenous), only 40% of Indigenous students achieved minimum benchmarks in reading by Grade 3 in 2009, compared to 90% of non-Indigenous students (ACARA, 2010). Wise, da Silva, Webster and Sanson (2005) recommend interventions in the early years of schooling to remediate the literacy gap between Indigenous and non-Indigenous students. Similarly, Leigh & Gong (2008) provide evidence that the gap widens during the school years, which suggests interventions targeted at Indigenous children in early schooling could well have a significant impact.
Reasons for poor literacy outcomes among Indigenous students are complex, and it has been convincingly argued that they are part of a colonial, historical, social and cultural dynamic that resists straightforward delineation (Bourke, Rigby & Burden, 2000; Gray & Hunter, 2000; Louden, et al., 2005). Alongside these important distal factors, two proximal factors that may affect these outcomes, positively or adversely, are attendance and teacher quality. Australian Bureau of Statistics (2009) data show that reasons for Indigenous students' absences from school include cultural events, illness, sorry business (funerals) and community flooding. However, school attendance may also be a response to the quality and continuity of teaching. In 2009, over 550 of the NT's 2133 teachers (26%) left their positions, and teacher turnover is known to be higher in remote communities, in some instances leaving remote schools sorely understaffed (NT DET, 2009).
Whatever the reasons for poor overall literacy performance, Indigenous students' results point to the need for early, intensive and evidence-based reading interventions. This paper investigates the impact of an early literacy intervention, ABRACADABRA (CSLP, 2009), that targets children identified as struggling readers, on Indigenous student literacy in Australia's Northern Territory (NT). The term 'literacy' is used here in the sense elaborated by the Australian National Curriculum Board (ACARA, 2009, p.6), as referring to 'a flexible, sustainable mastery of a set of capabilities in the use and production of traditional texts and new communications technologies using spoken language, print and multimedia.' The ability to read print fluently is one aspect of literacy. A strong consensus in research evidence suggests that among other abilities, fluent word reading underpins this capacity for reading fluency and that this in turn is underpinned by foundational alphabetic skills early in reading acquisition (e.g. Byrne, 1998; Savage et al., 2007).
Punitive attempts to improve attendance either through local 'no school, no pool' policies (Ah Kit, 2004) or by linking welfare payments to student attendance (Behrendt & McCausland, 2008) have had little sustained impact on improving student attendance. Increased school attendance in remote communities appears to be associated with more positive measures such as providing breakfast, creating a welcoming school climate and offering programs in which students are engaged (Batten & Russell, 1995); however, much of the evidence for improved attendance is anecdotal. A number of studies show that poor attendance at school has an adverse effect on student academic achievement generally (Dunn, Cadane & Garrow, 2003; Gray & Partington, 2003; Mellor & Corrigan, 2004), and on the achievement of Indigenous students in particular (Ehrich, et al., 2010; Frigo, et al., 2004). In turn, poor achievement may compound the reasons for poor attendance.
In the NT, attendance and quality of instruction are major factors that impact on student achievement and must be considered when designing and implementing effective programs for Indigenous students. An easily delivered, self-paced literacy program that supports teachers to provide direct early literacy instruction may help counter the effects of frequent student absences and inadequate preparation to teach in the early years.
ABRACADABRA has been shown to enhance student literacy in Canada with a wide range of learners. Since 2004, several randomised controlled trials (RCTs) and quasi-experimental studies have been conducted in Canadian classrooms to measure the impact of ABRACADABRA on the literacy development of kindergarten (transition) and grade 1 students. The Canadian RCT data to date have shown that ABRACADABRA aids typical students in Grade 1 (Savage, Abrami, Hipps & Deault, 2009) as well as children with poor attention (Deault, Savage & Abrami, 2009) and low socio-economic pre-reading students in transition level classrooms (Comaskey, Savage & Abrami, 2009). An RCT comparing two different ABRACADABRA treatments (ABRACADABRA with a focus on synthetic phonics and ABRACADABRA with a focus on analytic phonics) to typical instruction revealed significant advantages for ABRACADABRA students on the key literacy skills of letter-sound knowledge, phonological blending, listening comprehension and reading comprehension (Savage, et al., 2009; Savage, et al., 2008). Results of an RCT conducted in kindergarten, first and second grade classrooms in Canada showed ABRACADABRA students significantly outperformed non-ABRACADABRA students on measures of sight word reading and phonological blending (Savage, et al., 2008).
However, there have been very few rigorously designed studies in Australia investigating the use of computer software to support literacy learning for students in general (Brooks, Miles, Torgerson & Torgerson, 2006); within the Indigenous education domain, information is particularly scarce. What we do know about the effectiveness of computer-based literacy instruction with struggling readers suggests that these programs provide teachers with explicit strategies for teaching literacy, making them potentially useful in remediating literacy gaps (Fish et al., 2008; Garcia & Arias, 2000; Phillips, Clancy-Menchetti & Lonigan, 2008). Computer-based instruction may equalise learning opportunities by increasing the opportunities for all students to participate in literacy activities (Hitchcock & Noonan, 2000; Hutinger, Bell, Daytner & Johanson, 2006). Beyond participation, early readers can achieve greater independence in their literacy development because of the motivational features of technology (Leloup & Ponterio, 2003). Computer-based programs are also believed to be well suited to supplementing reading instruction and, in turn, offering more intensive practice (Magnan & Ecalle, 2006).
As a flexible, evidence-based literacy intervention with a history of positive effects, ABRACADABRA was selected as a promising platform for rigorous research on improving Indigenous literacy in the NT. Given the limited capacity to conduct systematic research in the NT, especially within remote schools, designing a program of research around an established literacy instruction tool enabled our research to build teachers' skills whilst providing confidence that the intervention was unlikely to do harm.
The study was guided by the following research questions:
RQ1. Is there a difference in post-test literacy scores between Indigenous and non-Indigenous students who do and do not receive ABRACADABRA instruction when controlling for pretest scores?

RQ2. Do the differences in research question 1 remain when students' post-test literacy scores are adjusted for attendance and literacy instruction quality?
Prior to pretesting, students were randomly assigned to one of two parallel forms of the GRADE K (A or B) and to whether they received the GRADE K or the PIPS-BLA assessment first. Students who received the GRADE K A at pretest were post-tested using the GRADE K B and vice versa. Students who received the GRADE K first at pretest received the PIPS-BLA first at post-test and vice versa.
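The counterbalancing scheme above can be sketched in a few lines of code. This is a hypothetical illustration only: the student identifiers and the random procedure are assumptions, as the original assignment mechanism was not published in detail.

```python
import random

def assign_counterbalanced(student_ids, seed=2009):
    """Randomly assign each student a pretest GRADE K form (A or B) and a
    pretest instrument order; the post-test uses the alternate form and order."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    assignments = {}
    for sid in student_ids:
        form = rng.choice(["A", "B"])
        first = rng.choice(["GRADE K", "PIPS-BLA"])
        assignments[sid] = {
            "pretest_form": form,
            "posttest_form": "B" if form == "A" else "A",
            "pretest_first": first,
            "posttest_first": "PIPS-BLA" if first == "GRADE K" else "GRADE K",
        }
    return assignments
```

Under this scheme, each student sees both parallel forms and both instrument orders across the two testing occasions, so form and order effects are balanced between pretest and post-test.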
Literacy instruction quality was assessed during bi-weekly site visits by university researchers who used a revised version of the Classroom Literacy Observation Survey instrument (Louden & Rohl, 2003) to evaluate control and ABRACADABRA teachers' literacy lessons. A total of 92 observations were conducted for an average of 6 observations per teacher.
Four GRADE K subscales (phonological awareness, early literacy skills, phoneme-grapheme correspondence and word reading) were used. The GRADE has been shown to have strong internal consistency (.95-.99), high alternate form reliability (.81-.94), and high test-retest reliability (.80) (Williams, 2001).
The PIPS-BLA is a computer-based literacy and numeracy assessment developed by the Curriculum Evaluation Management (CEM) Centre at Durham University in England and consists of three measures: Reading, Maths and Phonics. Test-retest reliability for the UK version of the PIPS-BLA ranged from .91 to .98 and predictive validity for reading and maths was .70 and .65, respectively (Tymms, 2002). Studies have shown the four scales to be internally reliable with Cronbach's alphas of 0.95, 0.93, 0.86 and 0.86 for Reading, Mathematics, Vocabulary and Phonological Awareness, respectively (Merrell & Tymms, 2007).
In Australia, studies have examined the reliability and validity of the PIPS-BLA when used with Indigenous students. Godfrey and Galloway (2004) administered the PIPS-BLA to 191 Indigenous students from government primary, Catholic primary and community primary schools and found Cronbach's alpha was .98 and split-half reliability was .98, leading them to recommend the PIPS-BLA to "...teachers as a reliable instrument to use with Indigenous students" (p. 154).
Literacy teaching practices observation instrument and teacher perceptions
The Literacy Teaching Practices Observation instrument was developed based on the Classroom Literacy Observation Survey (Louden & Rohl, 2003). The CLOS is a 33 item teacher literacy practice observation instrument divided into 6 key areas: Participation, Knowledge, Orchestration, Support, Differentiation and Respect (Louden & Rohl, 2003; Louden, et al., 2005). All items from the CLOS were retained in the Literacy Teaching Practices Observation instrument, but the scale was modified from a checklist of present or absent behaviours to a rating scale from 1 to 5 with 1 being 'Strongly Disagree' and 5 being 'Strongly Agree.' This change was made to better assess the quality of literacy practices, rather than their presence or absence. Two university researchers observed 5 lessons together and their inter-rater reliability was .98 for the overall average test score (.90 for Knowledge, .99 for Participation, .90 for Orchestration, .99 for Support, .97 for Differentiation and .97 for Respect). Cronbach's alpha was .98. Teachers' average total score on the literacy teaching practices observation instrument was used as the literacy instruction quality covariate.
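For readers unfamiliar with the statistic, Cronbach's alpha can be computed from item-level scores as follows. This is a generic sketch using invented ratings, not the study's observation data.

```python
from statistics import pvariance

def cronbach_alpha(item_columns):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

    item_columns is a list of columns, one per item, each holding the scores
    that every observation (e.g. each observed lesson) received on that item.
    """
    k = len(item_columns)
    sum_item_var = sum(pvariance(col) for col in item_columns)
    totals = [sum(obs) for obs in zip(*item_columns)]  # total score per observation
    return (k / (k - 1)) * (1 - sum_item_var / pvariance(totals))

# Two items that always agree yield perfect internal consistency:
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))  # 1.0
```

Values near 1 (such as the .98 reported above) indicate that the instrument's items move together very tightly across observations.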
Teachers' perceptions of ABRACADABRA were gathered during two telephone-based focus groups facilitated by the university researchers at the mid-point and end of the ABRACADABRA implementation. These focus groups were transcribed, analysed for common themes and used to inform the interpretation of our findings.
Throughout the study, researchers observed how ABRACADABRA's multiple levels, activities and entry points allowed teachers to adapt lessons to varying learning needs and school resources. At one remote community school, teachers used a learning centre approach in which four children worked on laptops whilst others worked on complementary literacy activities. At another very remote school, two teachers used an activity centre approach one day a week, with four children using ABRACADABRA on the interactive whiteboard and all children rotating between the learning centres approximately every 15 minutes. At the provincial city schools, teachers used an interactive whiteboard to introduce lessons; most students then moved to a larger computer lab to practise activities on their own computers, whilst students who needed more support received guided practice on the interactive whiteboard with the teacher or teacher assistant. Early in the intervention, teachers found that students required a high level of support as they worked individually or in pairs on the computer. As they became more confident navigating the program, most children were able to take greater responsibility for their learning, needing only minimal teacher support.
A total of fourteen early childhood classrooms from 2 provincial, 2 remote, and 2 very remote schools volunteered to participate in the study. Of the 14 classrooms, 5 were controls and 9 were interventions. The 14 classes were taught by 16 teachers whose years of teaching experience ranged from 2 to 37 (M=15.4, SD=10.5) and who had been early childhood teachers between 1 and 15 years (M=7.0, SD=6.0). Two classes were co-taught by two teachers. The teachers were mostly female (93%).
Figure 1: Participant flowchart for the 2009 ABRACADABRA study
The proportion of Indigenous students was 56.8% in the intervention group and 60.4% in the control group (see Table 1). There was no difference between the intervention and control groups in their Indigenous student compositions (chi-squared = .18, p = .67). Average attendance rates, calculated by dividing the number of school days attended by the total number of school days in Semester 1, ranged from 80% (Indigenous students in the intervention group) to 94% (non-Indigenous students in the control group). Average literacy instruction quality scores, on a scale from 1 to 5 with 1 being 'low' and 5 being 'high,' ranged from 2.23 (Indigenous students in the control group) to 2.74 (Indigenous students in the intervention group). Students in the control group had significantly higher attendance (M = .90, SD = .09) than students in the intervention group (M = .86, SD = .14) (t(159) = 2.09, p = .04). Because schools were permitted to select which classes would serve as the intervention and controls, they may have tended to choose classes viewed as in need of remedial instruction for the intervention. These classes would be more likely to have lower student attendance and pretest literacy scores (although analyses revealed the pretest differences between intervention and control classes were not significant [see Table 2 in Results]). Students in the intervention group also received significantly better literacy instruction (M = 2.54, SD = .70) than students in the control group (M = 2.30, SD = .25) (t(164) = 3.29, p = .001). This finding is likely related to the extra training and support in literacy instruction the intervention teachers received as a result of participating in the study.
Table 1: Number of students, attendance rates and literacy instruction quality by group and Indigenous status

| Group | Status | n (%) | Attendance rate: Mean (SD) | Literacy instruction quality: Mean (SD) |
|---|---|---|---|---|
| Intervention | Indigenous | 67 (56.8%) | .80 (.16) | 2.74 (.77) |
| | Non-Indigenous | 51 (43.2%) | .93 (.06) | 2.28 (.50) |
| | Total | 118 (100%) | .86 (.14) | 2.54 (.70) |
| Control | Indigenous | 29 (60.4%) | .87 (.09) | 2.23 (.25) |
| | Non-Indigenous | 19 (39.6%) | .94 (.06) | 2.41 (.22) |
| | Total | 48 (100%) | .90 (.09) | 2.30 (.25) |
While English as a second language (ESL) classifications were gathered for the students, permission to use this data in the analyses and reporting was not received from the NT Department of Education. From our classroom observations and conversations with teachers, however, we can say that many of the Indigenous students in our study spoke English as a second or third language.
[Table 2: Pretest means (SD) for students who remained in the study versus those who left, on the GRADE K Phonological awareness, Word reading, Early literacy skills and Phoneme-grapheme correspondence subscales. * p<.05; ** p<.01; *** p<.001]
The interaction between Indigenous status and intervention was not significant for any of the analyses; that is, the benefit of the intervention did not differ by Indigenous status. The lack of a significant interaction suggests that ABRACADABRA is more effective than regular literacy instruction in improving phonological awareness for both Indigenous and non-Indigenous students. Table 3 displays the adjusted post-test means and standard errors for the GRADE and PIPS-BLA post-test ability scores and the results of the F-tests.
Table 3: Adjusted post-test ability means (SE) and ANOVA F (eta-squared) results

| Outcome | Status | Intervention: Mean (SE) | Control: Mean (SE) | Intervention vs control: F (eta-squared) | Indigenous status: F (eta-squared) | Intervention x Indigenous status: F (eta-squared) |
|---|---|---|---|---|---|---|
| GRADE K Phonological awareness | Indigenous | .83 (.12) | -.10 (.18) | 25.96*** (.14) | 19.53*** (.11) | 2.67 (<.01) |
| | Non-Indigenous | 1.46 (.14) | .81 (.23) | | | |
| GRADE K Early literacy skills | Indigenous | 2.38 (.12) | 2.54 (.18) | .11 (<.01) | 9.42* (.06) | .38 (<.01) |
| | Non-Indigenous | 2.92 (.14) | 2.87 (.22) | | | |
| GRADE K Phoneme-grapheme correspondence | Indigenous | 1.16 (.19) | 1.01 (.29) | .28 (<.01) | 2.64* (.02) | .01 (<.01) |
| | Non-Indigenous | 1.63 (.22) | 1.54 (.36) | | | |
| GRADE K Word reading | Indigenous | .05 (.16) | .09 (.24) | .06 (<.01) | 9.70** (.02) | .18 (<.01) |
| | Non-Indigenous | .75 (.18) | .60 (.30) | | | |
| PIPS-BLA Reading | Indigenous | .51 (.08) | .64 (.14) | .25 (<.01) | 3.55* (.05) | .80 (<.01) |
| | Non-Indigenous | .88 (.10) | .65 (.15) | | | |
| PIPS-BLA Phonics | Indigenous | 1.12 (.25) | .94 (.38) | .48 (<.01) | .02 (<.01) | .18 (<.01) |
| | Non-Indigenous | 1.16 (.33) | .56 (.77) | | | |

* p<.05; ** p<.01; *** p<.001
Table 4: ANCOVA F (eta-squared) results for the attendance and literacy instruction quality covariates

| Measure | Outcome | Attendance: F; eta-squared | Literacy instruction quality: F; eta-squared |
|---|---|---|---|
| GRADE K (n=161) | Phonological awareness | F=1.03; .01 | F=11.14**; .07 |
| | Early literacy skills | F=4.46*; .03 | F=2.82; .02 |
| | Phoneme-grapheme correspondence | F=.43; .01 | F=3.82; .02 |
| | Word reading | F=5.30*; .03 | F=3.64; .02 |
| PIPS-BLA Literacy | PIPS-BLA Reading (n=144) | F=1.8; .01 | F=.25; .01 |
| | PIPS-BLA Phonics (n=61) | F=.11; .01 | F=2.96; .05 |

* p<.05; ** p<.01
RQ2: Do the differences in research question 1 remain when students' post-test literacy scores are adjusted for attendance and literacy instruction quality?
Attendance and literacy instruction quality were included as covariates in the ANCOVA models because they are highly variable in the NT and top educationalists' lists of reasons for poor Indigenous student literacy (see, for example, Ehrich, et al., 2010; Frigo, et al., 2004; Lyons, Cooksey, Panizzon, Parnell & Pegg, 2006). By partialling out the variability associated with attendance and literacy instruction quality, we would have more power to detect differences between Indigenous and non-Indigenous students, control and intervention students, and their interactions. Non-significant intervention comparisons in the previous analyses would be more likely to be significant after adjusting for attendance and literacy instruction quality as known sources of variance. Conversely, the adjustment could render previously significant findings non-significant. For example, if the difference in literacy ability between Indigenous and non-Indigenous students were largely due to poor Indigenous student attendance, then adjusting for attendance would make Indigenous and non-Indigenous students' scores more similar.
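The adjustment logic can be illustrated with a small sketch. The data and function names here are hypothetical, and the study itself fitted factorial ANCOVA models rather than this simplified procedure; the sketch only shows how group means shift once a covariate such as attendance is held at its grand mean.

```python
def ols_fit(x, y):
    """Least-squares slope and intercept of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def adjusted_group_means(groups, covariate, outcome):
    """ANCOVA-style adjusted means: each group's outcome mean shifted to
    what it would be if the group sat at the grand covariate mean."""
    slope, _ = ols_fit(covariate, outcome)
    grand = sum(covariate) / len(covariate)
    means = {}
    for g in sorted(set(groups)):
        idx = [i for i, gi in enumerate(groups) if gi == g]
        mean_y = sum(outcome[i] for i in idx) / len(idx)
        mean_x = sum(covariate[i] for i in idx) / len(idx)
        means[g] = mean_y - slope * (mean_x - grand)
    return means
```

If post-test scores were driven entirely by attendance, the adjusted means of the two groups would coincide and any apparent group difference would vanish, which is exactly the scenario described above.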
The factorial ANCOVA tests revealed that when controlling for attendance and literacy instruction quality, the results remained the same. ABRACADABRA students had significantly greater post-test scores than control students only on the GRADE K Phonological Awareness (F(1,165) = 30.90, p<.001, eta-squared = .17) (see Table 5).
Similar to the research question 1 findings, there was a significant main effect of Indigenous status for all outcome measures except for the PIPS-BLA Phonics subscale. The interaction between Indigenous status and intervention was not significant for any of the analyses. Of interest is that the F-values decreased (but remained significant) for the GRADE K ELS and word reading Indigenous status comparisons, when attendance and literacy instruction variables were included as covariates. This suggests that attendance and literacy instruction partially accounted for differences between Indigenous and non-Indigenous students' achievement in these two areas.
The quality of literacy instruction and student attendance did not account for differences in phonological awareness between students who did and did not receive ABRACADABRA instruction. This overall finding lends weight to the conclusion that ABRACADABRA is more effective than regular instruction for improving students' phonological awareness.
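The eta-squared effect sizes reported throughout are the proportion of total variance in an outcome attributable to an effect. For a simple one-way comparison this can be sketched as follows (toy data, not the study's):

```python
def eta_squared(groups, outcome):
    """Between-group sum of squares divided by total sum of squares."""
    n = len(outcome)
    grand = sum(outcome) / n
    ss_total = sum((y - grand) ** 2 for y in outcome)
    ss_between = 0.0
    for g in set(groups):
        ys = [outcome[i] for i, gi in enumerate(groups) if gi == g]
        gmean = sum(ys) / len(ys)
        ss_between += len(ys) * (gmean - grand) ** 2
    return ss_between / ss_total
```

By Cohen's widely used conventions, eta-squared of about .01 is small, .06 medium and .14 large, which is why the phonological awareness effect (.14 unadjusted, .17 adjusted) is described as large.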
Table 5: ANOVA and ANCOVA F (eta-squared) results without and with the attendance and literacy instruction quality covariates

| Outcome | Intervention: without covariates | Intervention: with covariates | Indigenous status: without covariates | Indigenous status: with covariates | Interaction: without covariates | Interaction: with covariates |
|---|---|---|---|---|---|---|
| GRADE K Phonological awareness | 25.96***, .14 | 30.90***, .18 | 19.53***, .11 | 23.00***, .13 | 2.67, <.01 | 2.52, .02 |
| GRADE K Early literacy skills | .11, <.01 | .17, <.01 | 9.42*, .06 | 3.98*, .02 | .38, <.01 | .01, <.01 |
| GRADE K Phoneme-grapheme correspond. | .28, <.01 | .71, <.01 | 2.64*, .02 | 2.96*, .02 | .01, <.01 | .66, <.01 |
| GRADE K Word reading | .06, <.01 | .48, <.01 | 9.70**, .02 | 3.50*, .03 | .18, <.01 | .12, <.01 |
| PIPS-BLA Reading | .25, <.01 | .12, <.01 | 3.55*, .05 | 3.97*, .03 | .80, <.01 | 1.91, .01 |
| PIPS-BLA Phonics | .48, <.01 | 1.36, .02 | .02, <.01 | .01, <.01 | .18, <.01 | .01, <.01 |

* p<.05; ** p<.01; *** p<.001
That ABRACADABRA instruction improved phonological awareness is especially promising given 1) phonological awareness is an excellent predictor of outcomes at the end of primary education in the UK and Canada, even after SES and reading ability are considered (see, for example, Savage, Carless & Ferrero, 2007), and 2) direct phonological awareness instruction is particularly important for improving the reading and writing skills of low-SES (NICHD, 2000) and language minority students (Ehri, 2009; Geva & Siegel, 2000), and that in the absence of effective supplementary tools, such instruction is highly dependent on prior teacher expertise. In the Handbook of Research on Literacy and Diversity, Ehri (2009) argues language minority children are more likely to begin school with very little foundational reading knowledge. "Whereas well-prepared children may scarcely be affected by instruction that slights foundational and word reading skills, ill-prepared children may find it devastating and may make little progress" (Ehri, 2009, p. 293).
The ABRACADABRA teachers appeared to be better prepared (through the teacher training) and equipped (with ABRACADABRA's 17 alphabetics activities) than control teachers to directly teach their students how to recognise and manipulate word sounds. In a survey designed to gather teacher perceptions, three of the six teachers commented they lacked training and experience to teach literacy, especially phonological awareness. This was echoed in focus groups conducted during the middle and end of the study. Teachers said ABRACADABRA helped them develop better lesson plans and identify strategies for direct instruction. One teacher said:
I feel I have gotten a better focus and overall understanding to my lessons since I started. Not being trained in early childhood I have very little knowledge about teaching phonemic awareness and ABRA provides me guidance in making sure I include these skills in my literacy lessons. There are lots of different phonic activities that one can be reminded to teach such as syllabication and word building.

Controlling for teacher quality means that the positive findings for phonological awareness cannot be attributed solely to the ABRACADABRA teachers' delivery of higher quality literacy lessons. It seems likely that ABRACADABRA's effectiveness in improving phonological awareness is also related to how it helped teachers deliver direct phonological awareness instruction, with enough variety and flexibility that most students received the recommended 20 hours of direct instruction, which is unlikely to have been provided in the control classrooms. It would appear that in our study the ABRACADABRA tool and associated training up-skilled teachers in phonological awareness instruction in a context where these skills are reportedly lacking.
In contrast to the strong effects of intervention condition on phonological awareness, there was less clear evidence that ABRACADABRA directly affected the outcomes for early reading skills, phoneme-grapheme knowledge, and phonics ability. The main effect of intervention did not approach significance for these analyses. There may be a number of reasons for this pattern. The first is that large effects for phonological awareness in the absence of major changes in word reading have been reported at immediate post-test in studies using ABRACADABRA in high risk, urban, low SES communities in Canada. Comaskey et al. (2009) found exactly this pattern among a sample where many families experience English as an additional language. While the contexts are clearly quite distinct, there may be some commonality between the samples. Both studies contained many students with English as an additional language who are also 'at risk' in terms of SES, and both shared a pattern of response to intervention in which phonological awareness, but not reading ability, showed the strongest growth. If so, it is worth bearing in mind that a follow-up of the Comaskey and colleagues (2009) study by Di Stasio, Savage and Abrami (2010) indicates that one year after the ABRACADABRA intervention closed in Canada, significant effects of intervention were evident on reading comprehension, and medium sized effects of intervention were evident across a range of other literacy measures. It may be speculated that the effects of learning about the metacognitive process of phonological awareness take time to directly impact reading ability. Such a question can only be answered with certainty in the present context by running a follow-up post-test some time after the intervention has formally closed.
Our results may also be limited by the sample size. It is possible that the study was not adequately powered to detect differences in the interaction. This is especially true for the analysis of the PIPS-BLA phonics outcome. The unadjusted ability gain scores indicate it is possible that, had the study included more classrooms, we would have seen non-Indigenous students in the intervention outperform Indigenous students in the intervention, but not in the control group. Relatedly, many of the students in our study spoke English as a second language. The size of our sample and the failure to obtain departmental permission to use the ESL data meant we were unable to include this variable in the analysis, even though it may better account for the differences between Indigenous and non-Indigenous student literacy outcomes.
Another limitation is that teachers of varying experience and skills delivered the ABRACADABRA intervention. While this increased the ecological validity (or naturalness) of the study, it came at the expense of our ability to control the teaching environment. Even while the analyses controlled for overall teacher quality, our measurement of this complex variable was limited to a single instrument. There remains some question as to whether ABRACADABRA may be more effective in the hands of teachers determined to be highly skilled by multiple measures of instruction quality. Other ABRACADABRA studies suggest the quality of ABRACADABRA teaching and how well it is connected to the wider curriculum affects the power of the results. Savage and colleagues (2010) found that teacher variations in the use of ABRACADABRA affected student learning outcomes. Each of three teachers chose to use the program in qualitatively distinct ways that corresponded to the first three stages of Sandholtz, Ringstaff and Dwyer's (1997) technology integration model, namely: Entry, Adoption and Adaptation. Significant differences in growth in literacy between pre- and post-test were associated with technology integration style across all measures of literacy and related language skills. The largest and most widespread effects were evident for an Adaptation group that linked technology content to wider learning themes in the classroom. In terms of overall growth in standardised literacy scores across all six such measures used, Adaptation proved to be 60% more effective than the other teaching methods.
The findings are also limited by the large number of students who left the study (24% for the GRADE and 30% for the PIPS). While attrition was not differential between the ABRACADABRA intervention and control groups, students who left the study had lower pretest scores for all outcomes than students who remained in the study, thus limiting the generalisability of our findings to students whose enrolment and attendance is relatively stable over a semester. The exclusion of the control class that used ABRACADABRA from the analysis is another limitation. Preliminary analyses showed that had this class been included as a control class, ABRACADABRA students would not have performed significantly better than control students on any of the outcome measures. It is impossible to tell whether this would have happened had the control class not used ABRACADABRA.
Finally, the study findings are limited by our ability to determine the type and amount of exposure to ABRACADABRA that causes gains in student literacy. ABRACADABRA is part of a larger software suite, the Learning Toolkit (LTK), which includes a program to monitor and record which activities students use, for how long, and the number of correct responses. However, the LTK was not available until two weeks into the year two study and required teachers and students to log in to a local installation of ABRACADABRA rather than access the free online version. While some teachers did have their students log in after week two, they did not do so consistently, and most teachers chose to use the free online version. The ABRACADABRA usage data was therefore too inconsistent to analyse. Attendance was used as a proxy variable for exposure to ABRACADABRA, but true exposure, in terms of how much time students spent using different activities, was not recorded.
These limitations have been addressed in a recently concluded randomised controlled trial in six NT schools, in which individual students were randomly assigned to ABRACADABRA and ABRACADABRA instruction was delivered by specially trained teachers. With over 350 students participating, the study has greater power to detect effects, and exposure to ABRACADABRA and ESL status are being recorded and will be included in the analyses.
Subject to the results of the small-scale RCT, we further recommend that ABRACADABRA be trialled in a larger study adequately powered to examine questions of its relative effectiveness in provincial, remote and very remote contexts and under varying literacy curricula, for example.
Ah Kit, J. (2004). No School no Pool means Healthy Pools. Northern Territory Government media release, 19th April 2004. http://newsroom.nt.gov.au/2004/20040419_healthy_pools.shtml
Altman, D. G., Schulz, K. F., Moher, D., Egger, M., Davidoff, F., Elbourne, D. et al. (2001). The revised CONSORT statement for reporting randomized trials: Explanation and elaboration. Annals of Internal Medicine, 134, 663-694. http://www.annals.org/content/134/8/663.abstract
Annadale, K., Bindon, R., Handley, K., Johnston, A., Lockett, L. & Lynch, P. (2004). First Steps 2nd Edition Reading Resource Book. http://www.ecurl.com.au/uk/resources/firststepsliteracy.asp
Australian Curriculum and Assessment Reporting Authority (ACARA) (2009). Shape of the Australian curriculum: English. Canberra, ACT: MCEECDYA. http://www.acara.edu.au/verve/_resources/Australian_Curriculum_-_English.pdf
Australian Curriculum and Assessment Reporting Authority (ACARA) (2010). 2009 NAPLAN National Report. Canberra, ACT: MCEECDYA. [verified 25 Jul 2011; 19.3 MB] http://www.nap.edu.au/_Documents/National%20Report/NAPLAN_2009_National_Report.pdf
Ball, E. W. & Blachman, B. A. (1991). Does phoneme awareness training in kindergarten make a difference in early word recognition and developmental spelling? Reading Research Quarterly, 26(1), 49-66. http://dx.doi.org/10.1598/RRQ.26.1.3
Batten, M. & Russell, J. (1995). Students at risk: A review of Australian literature 1980-1994. Camberwell, VIC: Australian Council for Educational Research.
Behrendt, L. & McCausland, R. (2008). Welfare payments and school attendance: An analysis of experimental policy in Indigenous education. University of Technology Sydney. http://www.aeufederal.org.au/Publications/2008/LBehrendtpaper.pdf
Bodrova, E. & Leong, D. J. (2007). Tools of the mind: The Vygotskian approach to early childhood education (2nd ed.). Upper Saddle River, NJ: Merrill/Prentice Hall.
Bourke, C., Rigby, K. & Burden, J. (2000). Better practice in school attendance: Improving the school attendance of Indigenous students. Canberra, ACT: Department of Education, Training and Youth Affairs. http://www.dest.gov.au/NR/rdonlyres/BE155405-345F-4859-BCE6-F0B86A248F37/2506/Attend_Synth.pdf
Brooks, G., Miles, J. N. V., Torgerson, C. J. & Torgerson, D. J. (2006). Is an intervention using computer software effective in literacy learning? A randomised controlled trial. Educational Studies, 32(2), 133-143. http://dx.doi.org/10.1080/03055690500416116
Byrne, B. (1998). The foundation of literacy. Sussex: Psychology Press.
Cavanagh, R. F., Kent, D. B. & Romanoski, J. T. (2005). An illustrative example of the benefits of using a Rasch analysis in an experimental design investigation. Paper presented at the Annual Meeting of the Australian Association for Research in Education: Sydney, NSW. http://www.aare.edu.au/05pap/cav05081.pdf
Centre for the Study of Learning and Performance (CSLP) (2009). The Learning Toolkit [Computer software]. Montreal, Canada: Centre for the Study of Learning and Performance, Concordia University. Version 1.0. http://doe.concordia.ca/cslp/ICT-LTK.php
Comaskey, E. M., Savage, R. S. & Abrami, P. (2009). A randomised efficacy study of Web-based synthetic and analytic programmes among disadvantaged urban Kindergarten children. Journal of Research in Reading, 32(1), 92-108. http://dx.doi.org/10.1111/j.1467-9817.2008.01383.x
Deault, L., Savage, R. & Abrami, P. (2009). Inattention and response to the ABRACADABRA web-based literacy intervention. Journal of Research in Effective Intervention, 2(3), 250-286. http://dx.doi.org/10.1080/19345740902979371
Darling-Hammond, L. & Young, P. (2002). Defining "highly qualified teachers": What does "scientifically based research" actually tell us? Educational Researcher, 31(9), 13-25. http://dx.doi.org/10.3102/0013189X031009013
Davidson, M. & Jenkins, J. R. (1994). Effects of phonemic processes on word reading and spelling. Journal of Educational Research, 87(3), 148-157. http://www.jstor.org/stable/27541912
Department of Education and Training (2011). Enrolment and attendance statistics. Darwin, NT: Northern Territory Government. http://www.det.nt.gov.au/students/at-school/enrolment-attendance/enrolment-attendance-statistics
Department of Employment, Education, and Training (DEET) (2008). Annual Report 2007-08. Darwin, NT: Northern Territory Government. http://www.det.nt.gov.au/__data/assets/pdf_file/0018/4635/AnnualReport.pdf
Department of Employment, Education, Training & Youth Affairs (DEETYA) (1998a). The national report on schooling in Australia. Canberra, ACT: DEETYA. http://www.mceecdya.edu.au/mceecdya/anr_1998,12030.html
Department of Employment, Education, Training and Youth Affairs (DEETYA) (1998b). Literacy for all: The challenge for Australian schools: Commonwealth literacy policies for Australian schools. Canberra, ACT: DEETYA. http://www.dest.gov.au/archive/schools/literacy&numeracy/publications/lit4all.htm
Di Stasio, M., Savage, R. & Abrami, P. (in press). A follow-up study of the ABRACADABRA web-based literacy intervention in Grade 1. Journal of Research in Reading. http://dx.doi.org/10.1111/j.1467-9817.2010.01469.x
Dunn, M. C., Kadane, J. B. & Garrow, J. R. (2003). Comparing harm done by mobility and class absence: Missing students and missing data. Journal of Educational and Behavioral Statistics, 28(3), 269-288. http://dx.doi.org/10.3102/10769986028003269
Ehrich, J., Wolgemuth, J. R., Helmer, J., Oteng, G., Lea, T., Bartlett, C., Smith, H. & Emmett, S. (2010). Attendance, performance and the acquisition of early literacy skills: A comparison of Indigenous and non-Indigenous school children. Australian Journal of Learning Difficulties, 15(2), 131-149. http://dx.doi.org/10.1080/19404150903524580
Embretson, S. E. & Reise, S. P. (2000). Item response theory for psychologists. Mahwah, NJ: Lawrence Erlbaum.
Fish, A., Li, X., McCarrick, K., Butler, S., Stanton, B., Brummit, G., et al. (2008). Early childhood computer experience and cognitive development among urban low-income preschoolers. Journal of Educational Computing Research, 38(1), 97-113. http://dx.doi.org/10.2190/EC.38.1.e
Frigo, T., Corrigan, M., Adams, I., Hughes, P., Stephens, M. & Woods, D. (2004). Supporting English literacy and numeracy learning for Indigenous students in the early years. Indigenous Education, Paper 10. http://research.acer.edu.au/indigenous_education/10
Frontier, A. C. (2008). What is the relationship between student engagement and student achievement? A quantitative analysis of middle school students' perceptions of their emotional, behavioral, and cognitive engagement as related to their performance on local and state measures of achievement. Unpublished Doctoral Dissertation. Cardinal Stritch University, Milwaukee, WI.
Garcia, M. R. & Arias, F. V. (2000). A comparative study in motivation and learning through print-oriented and computer-oriented tests. Computer Assisted Language Learning, 13, 457-465. http://www.tandfonline.com/doi/abs/10.1076/0958-8221%28200012%2913%3A4-5%3B1-E%3BFT457
Godfrey, J. R. & Galloway, A. (2004). Assessing early literacy and numeracy skills among Indigenous children with the Performance Indicators in Primary Schools test. Issues in Educational Research, 14, 144-155. http://www.iier.org.au/iier14/godfrey.html
Gray, B. & Cowey, W. (2004). National Accelerated Literacy Program. http://www.nalp.cdu.edu.au/whatisnalp.htm
Gray, J. & Beresford, Q. (2008). A 'formidable challenge': Australia's quest for equity in Indigenous education. Australian Journal of Education, 52(2), 197-223. Retrieved from http://search.informit.com.au/documentSummary;dn=367622243947317;res=IELHSS
Gray, J. & Hunter, J. (2000). Breaking the cultural cycle: Reframing pedagogy and literacy in a community context as intervention measures for Aboriginal alienation. Paper presented at the Annual Meeting of the American Education Research Association, New Orleans, LA, 24-28 April. http://www.eric.ed.gov:80/ERICWebPortal/contentdelivery/servlet/ERICServlet?accno=ED441345
Gray, J. & Partington, G. (2003). Attendance and non-attendance at school. In Q. Beresford & G. Partington (Eds.), Reform and resistance in Aboriginal education: The Australian experience (pp. 133-163). Crawley, WA: University of Western Australia Press.
Heck, R. H. (2007). Examining the relationship between teacher quality as an organizational property of schools and students' achievement and growth rates. Educational Administration Quarterly, 43(4), 399-432. http://dx.doi.org/10.1177/0013161X07306452
Helmer, J., Bartlett, C., Wolgemuth, J. R., Lea, T. & Emmett, S. (2011). Coaching (and) commitment: Linking ongoing professional development, quality teaching and student outcomes. Professional Development in Education, 37(2), 197-211. http://dx.doi.org/10.1080/19415257.2010.533581
Helmer, J., Wolgemuth, J., Ehrich, J., Bartlett, C. & Lea, T. (under review). Navigating the systemic hurdles in remote Australia: Conducting rigorous early childhood literacy research. Submitted to Issues in Educational Research, March 2011.
Hipps, G., Abrami, P. C. & Savage, R. (2005). ABRACADABRA: The research, design and development of web-based early literacy software. In S. Pierre (Ed.), Développement, intégration et évaluation des technologies de formation et d'apprentissage (DIVA). Innovations et tendances en technologies de formation et d'apprentissage (pp. 89-112). Montreal, QC: Presses Internationales Polytechnique.
Hitchcock, C. L. & Noonan, M. J. (2000). Computer-assisted instruction of early academic skills. Topics in Early Childhood Special Education, 20(3), 145-158. http://dx.doi.org/10.1177/027112140002000303
Hutinger, P. L., Bell, C., Daytner, G. & Johanson, J. (2006). Establishing and maintaining an early childhood emergent literacy curriculum. Journal of Special Education Technology, 21(4), 39-54. http://www.tamcec.org/jset-index/establishing-and-maintaining-an-early-childhood-emergent-literacy-technology-curriculum/
Leigh, A. & Gong, X. (2008). Estimating cognitive gaps between Indigenous and non-Indigenous Australians. Centre for Economic Policy research discussion paper. Canberra: Australian National University. http://ideas.repec.org/p/auu/dpaper/578.html
LeLoup, J. W. & Ponterio, R. (December 2003). Second language acquisition and technology: A review of the research. [verified 25 Jul 2011; 1.79 MB] http://www.cal.org/resources/digest/digest_pdfs/0311leloup.pdf
Louden, W. & Rohl, M. (2003). Classroom literacy observation survey. In W. Louden, M. Rohl, C. Barrat-Pugh, C. Brown, T. Cairney, J. Elderfield, H. House, M. Meiers, J. Rivalland & K. J. Rowe (Eds) (2005). In teachers' hands: Effective literacy teaching practices in the early years of schooling. Canberra, ACT: Australian Government Department of Education, Science and Training. http://www.dest.gov.au/NR/rdonlyres/2CE61B9C-C20B-4529-964B-5953311E5738/10110/In_Teachers_Hands_FINAL_for_web.pdf
Louden, W., Rohl, M., Barrat-Pugh, C., Brown, C., Cairney, T., Elderfield, J., House, H., Meiers, M., Rivalland, J. & Rowe, K. J. (Eds) (2005). In teachers' hands: Effective literacy teaching practices in the early years of schooling. Canberra, ACT: Australian Government Department of Education, Science and Training. http://www.dest.gov.au/NR/rdonlyres/2CE61B9C-C20B-4529-964B-5953311E5738/10110/In_Teachers_Hands_FINAL_for_web.pdf
Lyons, T., Cooksey, R., Panizzon, D., Parnell, A. & Pegg, J. (2006). SiMERR National Survey. National Centre of Science, ICT & Mathematics Education for Rural & Regional Australia. New South Wales: University of New England. http://www.une.edu.au/simerr/pages/projects/1nationalsurvey/index.html
Magnan, A. & Ecalle, J. (2006). Audio-visual training in children with reading disabilities. Computers & Education, 46(4), 407-425. http://dx.doi.org/10.1016/j.compedu.2004.08.008
Masters, G. N. & Forster, M. (1997). Mapping literacy achievement: Results of the 1996 National Schools English Literacy Survey. Canberra, ACT: Department of Education, Employment, Training and Youth Affairs. http://www.dest.gov.au/mla/contents.htm
McGarrigle, J. & Nelson, A. (2006). Evaluating a school skills programme for Australian Indigenous children: A pilot study. Occupational Therapy International, 13(1), 1-20. http://dx.doi.org/10.1002/oti.10
Mellor, S. & Corrigan, M. (2004). The case for change. A review of contemporary research on Indigenous education outcomes. Australian Education Review, Australian Council for Educational Research. Camberwell, Victoria: ACER Press. http://www.acer.edu.au/documents/AER_47-TheCaseforChange.pdf
Merrell, C. & Tymms, P. (2007). What children know and what they can do when they start school and how this varies between countries. Journal of Early Childhood Research, 5, 115-134. http://dx.doi.org/10.1177/1476718X07076679
Moher, D., Schulz, K. F. & Altman, D. G. (2001). The CONSORT statement: Revised recommendations for improving the quality of reports of parallel-group randomised trials. Lancet, 357, 1191-1194. http://dx.doi.org/10.1016/S0140-6736(00)04337-3
Moore, D. S. (1995). The basic practice of statistics. NY: Freeman and Co.
Morgan, G. A., Gliner, J. A. & Harmon, R. J. (2006). Understanding and evaluating research in applied and clinical settings. Mahwah, NJ: Lawrence Erlbaum.
National Institute of Child Health and Human Development (NIHCD) (2000). Report of the National Reading Panel. Teaching Children to Read: An Evidence-Based Assessment of the Scientific Research Literature on Reading and its Implications for Reading Instruction, Report of the Subgroups. Washington, DC: U.S. Government Printing Office. http://www.nichd.nih.gov/publications/nrp/upload/report.pdf
Northern Territory Government Department of Education & Training (2009). Compulsory teaching in English for the first four hours of each school day. http://www.det.nt.gov.au/about-us/policies/documents/schools/compulsory-teaching-in-english-for-the-first-four-hours-of-each-school-day
Northern Territory Government Department of Education & Training (2009). Annual Report, 2008-2009. Darwin, NT: Northern Territory Government. http://www.det.nt.gov.au/__data/assets/pdf_file/0017/8252/FullVersion.pdf
Palincsar, A. S. (2003). Advancing a theoretical model of learning and instruction. In B. J. Zimmerman (Ed.), Educational psychology: A century of contributions (pp. 459-475). Mahwah, NJ: Erlbaum.
Phillips, B., Clancy-Menchetti, J. & Lonigan, C. (2008). Successful phonological awareness instruction with preschool children: Lessons from the classroom. Topics in Early Childhood Special Education, 28(1), 3-17. http://dx.doi.org/10.1177/0271121407313813
Sanders, W. L. & Rivers, J. C. (1996). Cumulative and residual effects of teachers on future student academic achievement. Knoxville, TN: University of Tennessee Value-Added Research and Assessment Center. [not found 28 Jul 2011] http://www.mccsc.edu/~curriculum/cumulative%20and%20residual%20effects%20of%20teachers.pdf
Savage, R. S., Erten, O., Abrami, P., Hipps, G., Comaskey, E. & van Lierop, D. (2010). ABRACADABRA in the hands of teachers: The effectiveness of a web-based literacy intervention in grade 1 language arts programs. Computers & Education, 55(2), 911-922. http://dx.doi.org/10.1016/j.compedu.2010.04.002
Savage, R. S., Abrami, P., Piquette-Tomei, N., Wood, E. & Delevaux, G. (2008). ABRACADABRA: A study in the development, implementation and effectiveness of a web-based literacy resource. A research progress report. Report submitted to the Canadian Council for Learning and the Canadian Language and Literacy Research Network. http://www.ccl-cca.ca/pdfs/OtherReports/Abrami-ABRA-ReportE.pdf
Savage, R., Abrami, P. C., Hipps, G. & Wade, A. (2009). A randomized control trial study of the ABRACADABRA reading intervention program in grade 1. Journal of Educational Psychology, 101, 590-604. http://dx.doi.org/10.1037/a0014700
Savage, R., Carless, S., & Ferrero, V. (2007). Predicting curriculum and test performance at age 11 years from pupil background, baseline skills and phonological awareness at age 5 years. Journal of Child Psychology and Psychiatry, 48, 732-739. http://dx.doi.org/10.1111/j.1469-7610.2007.01746.x
Steering Committee for the Review of Government Service Provision (SCRGSP) (2009). Overcoming Indigenous disadvantage: Key indicators 2009. Canberra: Productivity Commission. http://www.pc.gov.au/__data/assets/pdf_file/0003/90129/key-indicators-2009.pdf
Stronge, J. H., Ward, T. J., Tucker, P. D. & Hindman, J. L. (2007). What is the relationship between teacher quality and student achievement? An exploratory study. Journal of Personnel Evaluation in Education, 20, 165-184. http://dx.doi.org/10.1007/s11092-008-9053-z
Thayer-Smith, R. A. (2007). Student attendance and its relationship to achievement and student engagement in primary classrooms. Unpublished doctoral dissertation. College of William and Mary, Williamsburg, VA.
Tymms, P. (2002). Baseline assessment, value added and the prediction of reading. Journal of Research in Reading, 22, 27-36. http://dx.doi.org/10.1111/1467-9817.00066
Williams, K. T. (2001). Group reading assessment and diagnostic evaluation: Technical manual. Shoreview, MN: Pearson AGS Globe.
Wise, S., da Silva, L., Webster, E. & Sanson, A. (2005). The efficacy of early childhood interventions. Melbourne: Australian Institute of Family Studies. http://www.aifs.gov.au/institute/pubs/resreport14/aifsreport14.pdf
Wolgemuth, J. R., Helmer, J., Emmett, S., Bottrell, C., Lea, T., Bartlett, C., Harper, H., Abrami, P. & Savage, R. (2009). ABRACADABRA! (ABRA) Early Childhood Literacy Project Annual Report No. 2: A quasi-experimental study of the ABRA literacy software in Northern Territory Indigenous classrooms. Charles Darwin University, Darwin, NT. http://www.cdu.edu.au/sspr/documents/ABRA_2009_Annual_Report.pdf
Wolgemuth, J. R., Savage, R., Helmer, J., Harper, H., Lea, T., Abrami, P., Chalkiti, K. & Kirby, A. (under review). The impact of ABRACADABRA on Indigenous and non-Indigenous early literacy in Australia: A multisite randomized controlled trial. Manuscript submitted to Reading Research Quarterly, July 2011.
The Rasch logistic model for dichotomous items calculates the probability of success by a person n on an item i as a function of the ability of the person (θn) and the difficulty of the item (βi) (see Equation 1; Embretson & Reise, 2000).
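Equation 1 does not survive in this version of the text; the standard dichotomous Rasch model implied by the surrounding definitions (person ability θn, item difficulty βi) is:

```latex
P(X_{ni} = 1 \mid \theta_n, \beta_i) = \frac{e^{\theta_n - \beta_i}}{1 + e^{\theta_n - \beta_i}}
```

When θn = βi the exponent is zero and the probability is exactly .50, which is the property used below to locate person abilities on the item-difficulty scale.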
Since a person's ability and item difficulty are located on the same scale in Rasch modelling, the person's ability, θn, is estimated as the point at which the person has a .50 probability of correctly answering an item of difficulty βi; values typically range from -3 to 3. Rasch analysis therefore takes into account the difficulty of the individual test items, yielding a more accurate estimate of students' literacy abilities than their raw scores.
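The logic of locating ability at the .50 probability point can be sketched in a few lines of Python (illustrative only; the function name and values are ours, not the study's analysis code):

```python
import math

def rasch_prob(theta, beta):
    """Probability that a person of ability theta answers an item of
    difficulty beta correctly, under the dichotomous Rasch model."""
    return math.exp(theta - beta) / (1.0 + math.exp(theta - beta))

# When ability equals item difficulty, the probability is exactly .50;
# this is how a person's ability is located on the item scale.
print(rasch_prob(1.2, 1.2))   # 0.5
print(rasch_prob(2.0, -1.0))  # high ability on an easy item: close to 1
```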
Items that did not fit the Rasch model (i.e., students with higher abilities did not perform better on these items than students with lower abilities) were eliminated from the analyses, using the conservative criteria that fit residuals fall between -2.5 and +2.5 and that the Bonferroni-adjusted p-value be < .01. Four items were removed from the analysis of GRADE K form A and five items from form B. Six items were removed from the PIPS-BLA for the same reason. Items were also assessed for Differential Item Functioning (DIF) between Indigenous and non-Indigenous students. No items on the GRADE K or PIPS-BLA instruments displayed significant DIF (Bonferroni-adjusted p < .01).
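The fit-residual screen can be sketched as follows (a minimal illustration with made-up residuals; the Bonferroni-adjusted chi-square check applied in the study is not reproduced here):

```python
def flag_misfitting_items(fit_residuals, lower=-2.5, upper=2.5):
    """Return item ids whose standardised fit residuals fall outside
    the conservative -2.5 to +2.5 band, marking them for removal."""
    return [item for item, r in fit_residuals.items()
            if r < lower or r > upper]

# Hypothetical residuals for four items:
residuals = {"item1": 0.4, "item2": -3.1, "item3": 2.6, "item4": -1.8}
print(flag_misfitting_items(residuals))  # ['item2', 'item3']
```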
The Person Separation Indices for the GRADE K A and B and the PIPS-BLA Literacy were all high, indicating a good fit of the data to the model (rβ = .94, .94 and .93, respectively). Consistent with the separation indices, the mean fit residuals all indicated good fit for the items of each test (GRADE K A, M = -.30, SD = 1.47; GRADE K B, M = -.44, SD = 1.56; PIPS-BLA Literacy, M = -.08, SD = 1.34). Separation indices for the GRADE K subscales were acceptable, ranging from .68 to .83. The separation index for the PIPS-BLA reading subscale was high (.94), but the index for the phonics subscale was moderate (.54).
For all students taking the GRADE K A and B at pre- and post-test, item difficulties ranged from approximately -3 to just over 2, while student abilities ranged from -4.5 to just over 5. The average ability estimates for forms A and B were 0.86 (SD = 1.54) and 0.96 (SD = 1.56), indicating the tests were slightly easy for the students in the study (a well-targeted test would have an average ability estimate of zero) and were sufficiently similar that they could be analysed together.
For all students taking the PIPS-BLA Literacy at pre- and post-test, item difficulties ranged from approximately -4.5 to just over 4, while student abilities ranged from -4 to just over 4.5. The average ability estimate was 0.40 (SD = 1.32), indicating the test was well targeted for the students in the study.
ANCOVA assumptions of normal distributions, linearity, and homogeneity of regression slopes were met for all analyses. In cases where the homogeneity of variances assumption was violated (GRADE K phonological awareness, early literacy skills, and phoneme-grapheme correspondence), we proceeded with the ANCOVAs, as ANOVA is robust to small (ratio of 3 to 1) and moderate (ratio of 4 to 1) violations of variance homogeneity (Moore, 1995).
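The variance-homogeneity check above amounts to comparing the largest and smallest group variances against Moore's (1995) 3:1 and 4:1 tolerance ratios. A minimal sketch, with made-up scores rather than study data:

```python
from statistics import pvariance

def variance_ratio(*groups):
    """Largest-to-smallest ratio of group variances. Moore (1995)
    treats ratios up to about 3:1 (small violation) or 4:1
    (moderate) as tolerable for ANOVA."""
    variances = [pvariance(g) for g in groups]
    return max(variances) / min(variances)

# Hypothetical control and intervention score lists:
control = [10, 12, 11, 14, 9]
treatment = [9, 13, 12, 15, 8]
ratio = variance_ratio(control, treatment)
print(round(ratio, 2), "proceed with ANCOVA" if ratio <= 4 else "caution")
```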
Authors: Dr Jennifer Wolgemuth, Assistant Professor,
Colorado State University, School of Education, Fort Collins, CO 80523, USA
Email: Jennifer.Wolgemuth@colostate.edu Web: http://www.cdu.edu.au/sspr/profiles/wolgemuth-j2.htm
Dr Robert Savage, Associate Professor, McGill University, Faculty of Education, Education Building B183, 3700
McTavish Street Montreal, Quebec, CANADA H3A 1Y2. Email: Robert.savage@Mcgill.edu.ca
Dr Janet Helmer, Research Fellow, Menzies School for Health Research, Darwin NT 0909, Australia.
Email: Janet.Helmer@cdu.edu.au Web: http://www.cdu.edu.au/thenortherninstitute/jhelmer.html
Dr Christine Bottrell, Research Fellow, Charles Darwin University, Darwin NT 0909, Australia.
Email: Christine.Bottrell@cdu.edu.au Web: http://www.cdu.edu.au/thenortherninstitute/cbottrell.html
Associate Professor Tess Lea, QE II Fellow, University of Sydney, Sydney NSW 2006, Australia.
Email: Tess.Lea@sydney.edu.au Web: http://www.cdu.edu.au/thenortherninstitute/tlea.html
Dr Helen Harper, Research Fellow, Menzies School for Health Research, Darwin NT 0909, Australia.
Email: Helen.Harper@cdu.edu.au Web: http://www.cdu.edu.au/thenortherninstitute/hharper.html
Dr Kalotina Halkitis, RARI Implementation Officer,
Northern Territory Department of Children and Families, Casuarina, NT 0810, Australia
Email: Kalotina.Halkitis@nt.gov.au Web: http://www.cdu.edu.au/thenortherninstitute/khalkitis.html
Professor Phil Abrami, Centre for the Study of Learning & Performance LB-589-2,
Concordia University, 1455 DeMaisonneuve Blvd. West Montreal, Quebec, Canada H3G 1M8.
Email: firstname.lastname@example.org Web: http://doe.concordia.ca/cslp/
Please cite as: Wolgemuth, J., Savage, R., Helmer, J., Bottrell, C., Lea, T., Harper, H., Halkitis, K. & Abrami, P. (2011). Using computer-based instruction to improve Indigenous early literacy in Northern Australia: A quasi-experimental study. Australasian Journal of Educational Technology, 27(4), 727-750. http://www.ascilite.org.au/ajet/ajet27/wolgemuth.html