|Australasian Journal of Educational Technology
2005, 21(3), 407-426.
Earlier work, often referred to as the "hole in the wall" experiments, has shown that groups of children can learn to use public computers on their own. This paper presents the method and results of an experiment conducted to investigate whether such unsupervised group learning in shared public spaces is universal. The experiment was conducted with "hole in the wall" (minimally invasive education, or MIE) computers in 17 locations in rural India. Focus groups in each location were tested for computer literacy for 9 months.
Results, which are discussed in the paper, show that groups of children can learn to use computers and the Internet on their own, irrespective of who or where they are. Furthermore, such group self-instruction is as effective as traditional classroom instruction, while being considerably less expensive and independent of teachers and schools. The results point to a new pedagogy for children's education in circumstances where schools and teachers are absent or ineffective for any reason.
The present paper is about the acquisition of computer literacy by children in the age group of 6-14 years. It puts forward the following hypothesis: "if given appropriate access and connectivity, groups of children can learn to operate and use computers with none or minimal intervention from adults".
The paper is in two sections:
Minimally invasive education (MIE) is a pedagogic method, deriving its name partly from the medical term 'minimally invasive surgery' (Mitra & Rana, 2001; Mitra, 2003). The idea of MIE crystallised over a period of time, based on observations and educational experiments conducted at NIIT. The experiments were first conducted in Kalkaji, a suburb of New Delhi, India. A computer was connected to the Internet and embedded into a brick wall near a slum. The media often describe this experiment as "the hole in the wall". It was reported that most of the slum children were able to use the computer to browse, play games, create documents and paint pictures within a few days.

Thus, it was observed that, even in the absence of any direct instruction, mere curiosity led groups of children to explore, which resulted in learning. This, coupled with minimal input from peers, or from anyone familiar with computers, helped the children learn more. This leads us to believe that any learning environment that generates an adequate level of curiosity can cause learning among groups of children. Children's desire to learn, along with their curiosity and peer interaction, drives them to explore the environment in order to satisfy their inquisitiveness. As the children explore their environment, they relate new experiences to previous ones, and thereby new learning takes place (Frontline World, 2002; Education Guardian, 2000; Businessweek Online, 2000; Mitra, 2000; Mitra, 2003; van Cappelle, Evers & Mitra, 2004; Wullenweber, 2001).

Hence, we define MIE as a pedagogic method that uses the learning environment to generate an adequate level of motivation to induce learning in groups of children, with none or minimal intervention from a teacher. In MIE, the role of the teacher is limited to providing, or guiding learners to, environments that generate adequate levels of interest.
A familiar example of MIE is the type of learning that takes place when an appropriate puzzle is given to children with little or no input from others. The computer itself is capable of generating such challenges from time to time.
The original hole in the wall of 1999 has evolved into a brick structure with computers embedded in it. In the rest of this paper we will refer to this arrangement as "Minimally Invasive Education Learning Stations" (MIE learning stations). These have been set up in 22 rural and urban locations across India and similar results are reported through field observations as well as through a GUI Icon Association Inventory test (Mitra, 2003) administered to children. Observations across locations show a learning process of random exploration, collaboration, discovery, vocabulary construction, generalisation, practice and peer tutoring (Inamdar, 2004).
Minimally invasive education (MIE) is based on a paper (Mitra, 1988) that speculated that children can learn to use a computer on their own. Five years of empirical research (Mitra, 1999, 2000, 2001, 2003, 2004) substantiated this speculation. It is an approach that promises to bridge the "digital divide" by helping diverse populations achieve computer literacy. MIE is also a concept with important consequences for education in general. "Minimally invasive" refers to the least possible, negligible, or minimum help required by the child to initiate and continue the process of learning basic computing skills. This minimal amount of help from other children at the MIE learning station is necessary and sufficient for the children to become computer literate. This "help", which is the fundamental aspect of MIE, could be from peers, siblings, friends, or any other child familiar with computers. Children are found to collaborate and support each other. The learning environment is characterised by the absence of adult intervention, and by its openness and flexibility. Children are free to operate the computer at their convenience; they can consult and seek help from any other child or children, and are not constrained by any structured settings. It is observed that children tend to rely upon themselves to generate the necessary learning environment, and to organise themselves for learning. It should be noted that MIE learning stations are located in safe, open public spaces, are easily accessible to children, and have been designed for use by children (Mitra, 2004).
MIE experiments are located in urban slums and rural areas of India, a country with problems of poverty, illiteracy, inadequate infrastructure and a diversity of socio-cultural communities of varied socio-economic status, spread over a large geographical region. This has resulted in a design for MIE learning stations that addresses issues of access to technology, financial resources and cost constraints.
Figure 1: A MIE learning station at Village D. Salhundi, Karnataka, India
MIE experiments clearly indicate that children are able to learn to use computers and the Internet on their own, irrespective of their social, cultural or economic backgrounds (Mitra & Rana, 2001; Mitra, 2004). The diversity of Indian conditions is, ironically, useful for applying these results anywhere in the world.
The background of the parents is equally diverse: daily wage labourers, farmers, shop owners, auto-rickshaw drivers, cottage industry workers and government employees. The men generally have more schooling (up to the 8th grade), while the women are mostly illiterate.
In order to study the impact of MIE learning stations on computer literacy amongst children, we have considered experimental or focus groups, frequent users and control groups from different states. We have grouped the states into four zones, each zone consisting of one or more states.
| Zone | States |
| ---- | ------ |
| South Zone | 2 states: Karnataka and Tamil Nadu |
| North Zone | 3 states: Uttaranchal, Jammu and Kashmir, Uttar Pradesh |
| East Zone | 1 state: West Bengal |
| West Zone | 2 states: Rajasthan and Maharashtra |
In order to develop a test for computer literacy, we started with children who had taught themselves to use computers at some of our earliest playground facilities (Mitra & Rana, 2001). These children had developed their own vocabulary to describe icons in the Microsoft Windows environment. For example, they referred to the mouse cursor as "teer" or "sui", Hindi words for arrow and needle, respectively. The folder symbol was described as a cupboard for keeping other objects. While the words used to describe the icons were chosen from their own language and experience, their descriptions of the functionality of the icons were accurate.
We then constructed a list of the 77 icons present in the Microsoft Windows and Microsoft Office environments. The test (see Figure 2) consisted of a list of these common computer GUI (Graphical User Interface) icons, and persons taking the test were asked to describe the purpose of each icon. It is assumed that the number of correct descriptions of icons is correlated with the IT literacy level of the person taking the test. We decided to test this assumption.
Figure 2: The GUI Icon Association Inventory, sample layout
Not all persons who are adept at using computers use icons. Indeed, many users do not use icons at all but prefer to use drop down menus instead. It would, therefore, appear that the ability to identify the function of an icon may not correlate well with the ability of a respondent to use a computer.
We selected 74 users of computers in an urban environment and administered the Icon Association Inventory (IAI). These users were people who used the standard functions of MS Windows and MS Office, such as word processing, spreadsheets and email. They formed a heterogeneous group of office administrative staff, students of information technology, research assistants and faculty. All were competent and experienced users.
The average score obtained was 49%, with a standard deviation of 18%. The maximum score obtained was 76%, while the minimum was 7%. The results seemed to indicate that users, whether they used icons or not, could guess the function of an icon provided it came from software that they used frequently. For example, a secretary who often uses a word processor would be able to guess the functions of the word processing icons correctly but, if the job did not involve spreadsheets, would not be able to guess the functions of the spreadsheet icons.
Next, we identified two frequent users (Pawan, age 16 and Lalit, age 12) in a slum area of New Delhi, and asked them to study the icons and write down their descriptions. These independent descriptions were then matched for consistency. We then asked the two children to work together to resolve any inconsistencies, and also to describe the functions of the icons they had been unable to describe when working independently. Over a period of a week they developed descriptions of most of the icons they were shown.
The children could not identify the functions of the icons used in the spreadsheet program, MS Excel, because this program was not available to them. These icons were included to check whether the children could arrive at a correct functional description of an unfamiliar icon using guesswork and reasoning alone.
At the end of this exercise, we matched the descriptions provided by the children with the descriptions given in the Windows and Office "Help" files. Finally, we developed a scoring key for the IAI that listed the 'correct' descriptions for each of the 77 icons. An evaluator would compare these descriptions with those provided by a user in response to the test, and make a subjective evaluation of how closely the two matched for any icon and decide whether to award a correct score for that item.
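The scoring procedure described above can be sketched in code. The sketch below is purely illustrative: in the study the matching was done subjectively by a human evaluator, whereas here a crude string-similarity measure stands in for that judgement, and the key entries and the 0.6 threshold are our assumptions, not part of the IAI.

```python
# Illustrative sketch of IAI scoring, NOT the study's procedure: a human
# evaluator's subjective match is replaced here by a string-similarity ratio.
from difflib import SequenceMatcher

# Hypothetical scoring key: icon name -> 'correct' functional description.
SCORING_KEY = {
    "folder": "a container used to store files and other folders",
    "print": "sends the current document to the printer",
    "save": "stores the current document on the disk",
}

def similarity(a: str, b: str) -> float:
    """Return a ratio in [0, 1] of how alike two descriptions are."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def score_response(icon: str, description: str, threshold: float = 0.6) -> bool:
    """Award a correct mark if the description is close enough to the key."""
    return similarity(SCORING_KEY[icon], description) >= threshold

def iai_score(responses: dict) -> float:
    """Percentage of icons described correctly, as the IAI reports it."""
    correct = sum(score_response(icon, desc) for icon, desc in responses.items())
    return 100.0 * correct / len(SCORING_KEY)
```

A real evaluation would, as the paper notes, rely on a person weighing each description against the key rather than a fixed threshold.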
We then administered the test to 9 children in Madangir, a slum area in New Delhi. These children had been exposed to playground computers for 15 days. The results showed that frequent users of computers took less time and identified more icons correctly than others did.
It now became necessary to investigate two aspects of the test:
Figure 3: The GUI Icon Association Inventory Software, a sample screen
Note that while the IAI-S is an objective method for administering the IAI, its use is restricted to urban, English speaking populations, where computers are available to all respondents.
A task based computer literacy (TBCL) test (see Figure 4) has been devised by "Outreach and Extension", University of Missouri and Lincoln University, USA. The test consists of a set of tasks that a person is asked to perform on a computer, followed by an evaluation of the person's performance.
As such, the TBCL test is a literal test of computer literacy. It takes over an hour to complete the test and about 30 minutes to evaluate the results. While we could not find a validation study on the TBCL, we decided that the results of this test were a good benchmark for evaluating the effectiveness of the two versions of the IAI. This decision was based on the fact that a literal test, such as the TBCL, is close to how examinations are conducted in schools, and should be a good measure of actual performance.
Figure 4: Task based computer skills assessment test. Some sample questions.
It is important to mention here that we could not find a validated computer literacy test to measure the IAI against. Hence, we had to rely on the self consistency of the IAI to establish its validity.
We administered three tests, namely the IAI, the IAI-S and the TBCL, to a group of 18 students (all young adults) enrolled for a course in IT skills. These students consisted of an entire, randomly chosen, cohort of students taking these courses in a traditional computer training institute in New Delhi. There were 9 men and 9 women in the sample. The results, to be reported in detail elsewhere, showed all three sets of test results to be highly correlated (in the region of 0.95, with a very low probability of error).
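The consistency check described above amounts to computing pairwise Pearson correlations between the score lists. The sketch below shows the computation; the two score lists are hypothetical, invented only to illustrate it, and are not the study's data.

```python
# Minimal sketch of the pairwise correlation check; the score lists are
# hypothetical illustrations, not data from the study.
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores of the same respondents on two of the tests:
iai_scores = [30, 45, 52, 61, 70, 75]
tbcl_scores = [28, 48, 50, 65, 68, 78]
r = pearson(iai_scores, tbcl_scores)  # strongly related lists give r near 1
```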
We concluded that the GUI Icon Association Inventory in either of its two versions is an effective instrument for measuring computer literacy. The measurement of computer literacy of children at MIE learning stations was, therefore, done using the Icon Association Inventory, a test created and validated for this purpose.
Of the 77 icons in the IAI, 26 icons (in the Excel and Text Format categories) are not present on the computers as configured for use by children at MIE learning stations. Measurements were, therefore, carried out using 51 of the 77 icons. The children were also tested on the remaining 26 icons as a check on the effectiveness of the IAI in measuring computing skills (scores on these icons should remain uniformly low throughout the testing period).
Figure 5: Time intervals for testing
Figure 5 shows the time intervals at which the IAI was administered. It was administered for the first time on the day the MIE learning stations were commissioned for the village children. For the other two groups of formal learners it was first administered on the day of the start of the respective courses. The test was then administered on the 3rd day, the 7th day and every 30 days for a nine month period. The total duration of the testing spanned 248 days, as shown in Figure 5.
Due to unavoidable circumstances, data for students pursuing the professional IT course (DIT) could be collected for a period of only 8 months, and for the regular school students, only 5 months.
We report the results in two sections below. Firstly the results of the measurements on children using MIE learning stations in 17 locations, and secondly, the results of the measurements on the two learner groups in New Delhi, namely, school students and students at NIIT.
Figure 6: National level - performance of MIE LS users in Icon Association Inventory
Figure 7 compares the average scores for the focus, control and frequent user groups over the experimental period. The focus group score is seen to rise from 6.65% to 43.07% while the control group score on the 9th month is seen to be 6.94%. The frequent users score an average of 43.73% in the ninth month. We observe that:
Figure 7: National level performance in Icon Association Inventory
Figure 8: Performances of regular school children in Icon Association Inventory
Figure 9: Performances of professional course students (DIT) in Icon Association Inventory
Table 1a: General attributes of the three groups studied
| | MIE learning stations | Regular school | IT professional school |
| --- | --- | --- | --- |
| Age | 7-14 years; average age 10-11 years | 10-13 years | 18-21 years |
| Gender | Males and females | Males and females | Males and females |
| Background | Majority from economically weaker sections | Lower to middle income groups | Middle to upper income groups |
| Education | Primary to middle | Middle | Undergraduates to graduates (all students have completed 12th grade) |
| Access to computers | Shared public MIE learning stations | Classroom instruction and computer lab | Classroom instruction and computer lab |
| Assessment | Through the IAI | Assignments, exams (theory and practicals), projects; IAI | Assignments, projects, assessments and practicals; IAI |
| Teaching method | Self directed and collaborative, with little or no intervention from adults | Teacher dependent approach | Faculty dependent approach |
| Time spent | No time restriction | 2.5-4 hours per week | 6 hours per week |
| | MIE learning stations | Regular school | IT professional school |
| --- | --- | --- | --- |
| Context | Shared public MIE learning stations | Classroom instruction and computer lab | Classroom instruction and computer lab |
| Access to computers | Usage on average between 10am and 4pm (stations remain open 9am-5.30pm, beyond which they are shut down); no time restriction while the station is open; accessible to all children; children visit before or after school and on holidays, nearly the whole day | 2.5-4 hours per week; classes at fixed times; accessible only to students of a given class | 6 hours per week; classes at fixed times; accessible only to students of the allotted batch |
| Teaching method | Children organise themselves into small groups; each child is both student (learning from others) and teacher (teaching children who know less), so student-teacher boundaries are blurred; help comes from peers, siblings, friends and others; no adult intervention; no formal teaching | Teacher centric approach; entire class taught by one teacher; children not allowed to interact or consult each other during class time | Teacher centric, faculty dependent approach; entire class taught by one teacher; students not allowed to interact or consult each other during class time |
| Qualifications | Mainly primary school children | Teachers are professionally trained and qualified, with teaching experience | Teachers are professionally trained and qualified, with teaching experience |
| Learning methods/strategy | Mainly collaborative learning through observation, modelling, trial and error and self discovery | Individual based learning | Individual based learning |
| Assessment/evaluation | No evaluation except the IAI, which is not seen as assessment; no examinations | Assignments, tests and final examination for both theory and practicals, and projects | Assignments, periodic assessments and practicals; final examination |
| MIE group | Regular school | IT professional group |
| --- | --- | --- |
| Re. 1 per child per day, based on an estimated average of 200 children using each learning station; annual cost Rs. 365 per child | Rs. 1250 per month per child; annual cost Rs. 15,000 | Rs. 17,000 per semester per student; annual cost Rs. 34,000 |
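The annual figures above follow from simple arithmetic on the rates the paper reports, which can be verified directly:

```python
# Annual cost per learner (rupees), derived from the reported rates.
mie_annual = 1 * 365        # Re. 1 per child per day -> Rs. 365 per year
school_annual = 1250 * 12   # Rs. 1250 per month -> Rs. 15,000 per year
it_annual = 17000 * 2       # Rs. 17,000 per semester -> Rs. 34,000 per year

print(mie_annual, school_annual, it_annual)
```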
Table 2: Average IAI scores (%) of the three groups studied

| | MIE group | Regular school | IT professional group |
| --- | --- | --- | --- |
| Inauguration (1st day) | 6.65 | 10.44 | 11.96 |
| 3rd month (62 days) | 22.12 | 24.01 | 23.73 |
| 5th month (124 days) | 29.36 | 35.96 | 34.60 |
| 8th month (217 days) | 38.18 | Not available | 49.17 |
Regarding the learning environment in which all of the three groups accomplish computer literacy, the differences between MIE learning station users and others is significant. The learning method used by MIE learning station users draws upon the expertise of peers, siblings and friends. Each learner is both a learner and a trainer.
Table 2 provides the performance on the IAI for the three groups studied. The MIE learning station users begin at the lowest level of performance, 6.65%, in comparison to regular school students (10.44%) and the IT professional course students (11.96%). By the third month, the three groups are at par. By the eighth month, the IT professional group of students stands at 49% in comparison to MIE learning station users at 38.18%. So, the IT professional group of students performed the best to begin with, which was perhaps not surprising, given their background; however, the other groups had caught up with them by the third month. In this connection, it may be noted that office secretaries score between 30% and 50% in the IAI, as seen from an independent (as yet unpublished) study.
The unstructured, open and flexible environment of the MIE learning station seems to produce comparable levels of computer literacy amongst learners as compared to formal methods. It does so at a considerably lower cost.
This method of acquisition of computer literacy does not depend on the existence of schools or teachers. It is also considerably less expensive than traditional methods of computer education. Therefore, in those circumstances where schools and teachers are absent, MIE learning stations are an adequate substitute. Places affected by natural disasters, such as the recent tsunami in the Indian Ocean, or places affected by war, such as Afghanistan or Iraq, or places affected by economic or social problems such as poverty or HIV/AIDS in Africa, are likely to benefit quickly and reliably from such self-learning methods.
While this paper is about the acquisition of computer literacy, there are indications that MIE learning stations produce other changes in children's social and educational achievements. Such changes (Inamdar, 2004; and others to be published) are described elsewhere.
We wish to acknowledge the contributions of Marmar Mukhopadhyay (NIEPA), Kenneth Kenniston (MIT), Bruno Bondet de la Bernardie (IFC), Anil Inamdar (The Aquarians), Manas Chakrabarti and Suneeta Kulkarni (Soham), in advising the research effort.
Financial assistance from the International Finance Corporation, the Government of Delhi, the ICICI bank and NIIT Limited is gratefully acknowledged.
Clement, D.H. (1999). Young children and technology. In Dialogue on early childhood science, mathematics and technology education. Washington DC: American Association for the Advancement of Science Project 2061. http://www.project2061.org/publications/earlychild/online/experience/clements.htm
Inamdar, P. (2004). Computer skills development by children using 'hole in the wall' facilities in rural India. Australasian Journal of Educational Technology, 20(3), 337-350. http://www.ascilite.org.au/ajet/ajet20/inamdar.html
Mitra, S. (1988). A computer assisted learning strategy for computer literacy programmes. Presented at the Annual Convention of the All-India Association for Educational Technology, December 1988, Goa, India.
Mitra, S. (2000). Minimally invasive education for mass computer literacy. Paper presented at CRIDALA 2000 Conference, Hong Kong, 21-25.
Mitra, S. & Rana, V. (2001). Children and the Internet: Experiments with minimally invasive education in India. British Journal of Educational Technology, 32(2), 221-232. http://www.hole-in-the-wall.com/docs/Paper02.pdf
Mitra, S. (2003). Minimally Invasive Education: A progress report on the "Hole-in-the-wall" experiments. British Journal of Educational Technology, 34(3), 367-371.
Mitra, S. (2004). The hole in the wall. Dataquest, 23 September. http://www.dqindia.com/content/industrymarket/2004/104092301.asp#interact
MIE Users Manual (2003). Hole-in-the-wall Education Limited (HiWEL), NIIT Ltd. New Delhi, India. Contact http://www.niitholeinthewall.com/
Papert, S. (1980). Mindstorms: Children, computers and powerful ideas. New York: Basic Books.
van Cappelle, F., Evers, V. & Mitra, S. (2004). Investigating the effects of unsupervised computer use on educationally disadvantaged children's knowledge and understanding of computers. Proceedings of CATaC 2004, Karlstad, Sweden. (pp528-542)
Wullenweber, W. (2001). Das Loch in der Wand. Stern Magazine, No. 42, October 11, pp 97-102. http://www.indien-netzwerk.de/navigation/kulturgesellschaft/gesellschaft/artikel/computer_slumkids.htm
|Authors: Sugata Mitra, Ritu Dangwal, Shiffon Chatterjee, Swati Jha, Ravinder S. Bisht and Preeti Kapur.|
Centre for Research in Cognitive Systems, NIIT Limited, Synergy Building, IIT Campus, Haus Khas, New Delhi 110016, India. Email: firstname.lastname@example.org Web: http://www.niitholeinthewall.com/
Please cite as: Mitra, S., Dangwal, R., Chatterjee, S., Jha, S., Bisht, R. S. and Kapur, P. (2005). Acquisition of computing literacy on shared public computers: Children and the "hole in the wall". Australasian Journal of Educational Technology, 21(3), 407-426. http://www.ascilite.org.au/ajet/ajet21/mitra.html