Australasian Journal of Educational Technology
2012, 28(4), 684-702.
Evaluating quality in online asynchronous interactions between students and discussion facilitators
Dip Nandi, Margaret Hamilton
RMIT University
Shanton Chang
The University of Melbourne
Sandrine Balbo
Nuance Australia
Online discussion forums have become an essential part of university courses, whether a course is conducted online, face to face, or in mixed or blended mode. Discussion forums are considered to engage students better with the course content and to encourage them to share and gain knowledge from each other. However, online engagement between students does not always happen automatically, so grading of discussion forum participation has been recommended to ensure quality student participation. Much attention has been given to making better use of discussion forums, but the way in which the quality of participation can be evaluated has yet to be adequately investigated. Furthermore, evaluation of instructor participation in a discussion forum, and of its impact on students and their contributions, is lacking. In this paper, we report on our research into online discussion forum quality through analysis of discussion forum activities, along with student focus group meetings and instructor interviews. We have devised a set of criteria for evaluating discussion forum activities. Our results show that students depend highly on instructor feedback, and that student participation can only be evaluated with reference to the moderation provided by the instructors.
How can assessors evaluate quality in online asynchronous interactions between students, and between students and their facilitators?
"Interaction" has been recognised as the most significant attribute in any online system or course. The importance of interactivity is highlighted by several researchers who have conducted research in online learning systems (Maor & Volet, 2007; Al-Mahmood & McLoughlin, 2004; Sharples, 2000). Without interactivity, a discussion forum simply becomes a bulletin board for posting messages and information.
On this point, a few authors, including Berner (2003) and Laurillard (2002), note that participation is more active if some form of assessment is linked to it. Klisc, McGill and Hobbs (2009) suggested that incorporating assessment of participation has a positive impact on learning outcomes. Indeed, whether courses are completely or partially online, Burkett, Leard and Spector (2004), Leh (2002) and Seo (2007) all indicated how grade points might be used as an incentive to enhance participation amongst learners.
However, assessing student participation in asynchronous discussions is a major challenge for instructors (Liu, 2007). The main concerns are how to assess it and which guidelines to use for assessment. The assessment of student participation in online discussion has been a heated topic among educators and researchers in online education (Bonk & Dennen, 2003). While there is some literature in this regard, there is a lack of empirical studies (Ho, 2002).
For assessment of discussion forum participation to work effectively, there needs to be a comprehensively defined framework that can clearly assist evaluators and students, and that can also act as a guideline for participants and educators. Brannon and Essex (2001) stated the need for clear communication protocols and requirements for posting, and suggested that the continued development of an innovative evaluation framework is necessary to improve the quality of contributions to an online discussion. A rubric that explicitly describes levels of responses will stimulate learning by challenging students to reflect and think critically, rather than post basic statements of understanding and mere opinion (Anderson & Krathwohl, 2001).
We selected a postgraduate subject for this study because it provided a mostly online learning environment for the instructors and students, with only four face to face classes. The face to face classes were spread across the semester, with one class in the first week, two in the middle of the semester and one in the last week of the semester. All other communication between the students and instructors was online, through the learning management system provided by the university. Students were encouraged by the instructors to participate in asynchronous discussion forums and were assessed on their participation, according to whether they understood the concepts themselves and contributed towards enhancing the understanding of the other students. These initiatives for engaging students in online asynchronous discussion forums were taken by the instructors to encourage the students to participate actively in productive interactions with other students and instructors by sharing views, ideas and knowledge.
Specific themes were uploaded by the instructors and students were expected to work through the readings on that theme and post their comments. Comments could be in the form of questions, opinions, analyses, etc. Students were expected and encouraged to work on each specific theme (with its related topics) for the duration of 3-4 weeks (for example, the first theme was studied for a period of four weeks).
Each week, the instructors provided a brief overview of the topic in the Content area. Then, to initiate discussion, the instructors posted questions in the discussion forum for the students to consider, which they were expected to respond to and discuss regularly by exploring the readings and concepts. An illustration of how the course and discussion forum activities were conducted is given in Table 1.
Students were expected to study the readings and answer those broad questions, post their views, and agree or disagree with others' posts, whilst periodic formative feedback was provided by the instructors. New questions arose and were discussed through the week as the discussion progressed and students and instructors engaged with the theme in more depth. Strategies for encouraging students to participate included prompting them with new and increasingly complex questions once previous ones had been discussed. These were complemented by questions raised by the students themselves. In addition, students were encouraged by the instructors to answer each other's questions and to provide additional resources that might promote understanding. The instructors also prompted the few students who may not have contributed as much each week to participate more. Finally, 10% of the total mark for the subject was allocated to discussion forum participation, which acted as an incentive to participate.
Theme 1: Business and Information Systems Fundamentals (Weeks 1 - 4)
Good management requires a balanced approach to decision making that acknowledges that both the external environment (e.g. regulation, competition, customers) and the internal practices (e.g. what to produce, how to produce, how to market) are important to the success of the organisation. In the management of IT, similar rules apply. The first theme of the subject exposes you to three key concepts of social informatics, competitive forces and competitive advantage.
In Week 1, we will explore Kling's review of the concept of social informatics, which highlights a fundamental approach to managing and understanding IT within organisations.
In Week 2, we will explore Porter's frameworks for understanding the external environment, which are greatly relied upon by IS professionals and academics.
In Week 3, we will explore the issues around competitive advantage and what it actually means for businesses. How can IS be used to create and sustain competitive advantage for businesses?
To kick the discussion off in Week 1, the instructors posted the following questions.
What do you think are some of the main principles behind social informatics?
Additionally, what did you take away from reading the PWC case?
From your own experiences, have you come across similar situations and could social informatics have assisted in understanding your own experience?
The students were assigned marks based on the consistency of their participation across the weeks (not just its quantity), and also on the quality of their participation, judged by the development of ideas over the weeks, the ability to apply concepts learned to real world cases, helping other students understand complex concepts, and providing new insights into concepts broadly. There were 12 students enrolled in the course and all but one contributed consistently to the discussion forum.
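Purely as an illustration of how such a marking scheme might be operationalised, the following sketch combines a consistency component (weeks with at least one post) with an averaged four-level quality rating. The weighting, the 1-4 quality scale and the sample data are our own assumptions, not taken from the course described above.

```python
# Hypothetical sketch of a participation marking scheme combining
# consistency of posting with an averaged quality rating.
# Weights, the 1-4 quality scale and the sample data are illustrative
# assumptions only, not the scheme used in the study.

def participation_mark(weekly_posts, quality_ratings, total_weeks=12, max_mark=10):
    """weekly_posts: posts made each week; quality_ratings: 1-4 rubric levels per post."""
    # Fraction of weeks in which the student posted at least once.
    consistency = sum(1 for n in weekly_posts if n > 0) / total_weeks
    # Mean quality rating, normalised to the 0-1 range.
    quality = (sum(quality_ratings) / len(quality_ratings) / 4) if quality_ratings else 0
    # Equal weighting of consistency and quality (an assumption).
    return round(max_mark * (0.5 * consistency + 0.5 * quality), 1)

# A student who posted in 10 of 12 weeks at an average quality level of 3:
mark = participation_mark([1] * 10 + [0] * 2, [3] * 10)  # 7.9 out of 10
```

The equal weighting above is one design choice among many; a scheme that rewards quality more heavily would simply shift the two coefficients.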
In order to carry out the research, we conducted a focus group meeting with the students from the chosen postgraduate course and separately interviewed the two instructors of the course. The posts by the students and the instructors on the course discussion board were analysed. This data was processed using a grounded theory approach (Strauss & Corbin, 1998), i.e. open, axial and selective coding (Neuman, 2006; Strauss & Corbin, 1990), so that information relevant to the research could be extracted.
We systematically analysed the discussion board data to identify the themes related to participation. These themes provided a clear representation of the qualitative discussion between students and the instructors. We then analysed the data from the focus group and the interviews with the students and instructors to find out which aspects of the discussion they placed the most emphasis on and regarded as productive. By analysing and combining the data from the discussion board, the focus group and the interviews, we prepared the set of themes that the students and instructors exercised during their discussion and valued highly. As mentioned above, this data analysis was performed using the open, axial and selective coding method. A similar three stage data analysis technique was used by Vlachopoulos and Cowan (2010a, 2010b) while exploring different styles and practices of e-moderation; they report that this method is useful in gaining a deep understanding of a phenomenon or theme from raw data.
The purpose of open coding was to identify the themes or concepts within the discussion board, focus group and interview transcripts. Each separate concept in the data was labelled, and similar ideas were grouped and labelled together. Following open coding, the next step was axial coding, where the aim was to assemble coding categories into larger conceptual groupings. This process was repeated until no additional categories were identified and all the data had been analysed. The third and final coding step was selective coding. Again, the data were re-examined and the prior coding and grouping was revisited and verified or changed as required, so that all the data were accounted for under a theme or sub-theme (Glaser & Strauss, 1967). This set of themes is presented in the "Findings" section.
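The progression through the three coding stages can be pictured as successive grouping of labelled excerpts. The toy sketch below illustrates the shape of that process only: the excerpts, code labels and category mappings are invented for illustration and are not the study's actual codes, and real grounded-theory coding is an iterative, interpretive activity rather than a mechanical lookup.

```python
# Toy illustration of the open -> axial -> selective coding sequence.
# All excerpts, labels and groupings below are hypothetical examples;
# actual grounded-theory coding is iterative and interpretive.
from collections import defaultdict

# Open coding: each data excerpt receives a concept label.
open_codes = [
    ("I'd like to cite one concept from M to support my argument.", "referencing_peers"),
    ("What do you mean by decentralized production?", "clarifying_question"),
    ("From my own work experience, I have seen similar situations.", "outside_experience"),
    ("I agree with your point, and would add another example.", "building_on_ideas"),
]

# Axial coding: related concepts are assembled into broader categories
# (the mapping here is chosen purely for illustration).
axial_map = {
    "referencing_peers": "interaction",
    "clarifying_question": "interaction",
    "outside_experience": "content",
    "building_on_ideas": "content",
}

# Selective coding: categories are re-examined and organised under a
# core theme so that every excerpt is accounted for.
categories = defaultdict(list)
for excerpt, code in open_codes:
    categories[axial_map[code]].append(code)

core_theme = {"quality_of_posts": dict(categories)}
```

In practice each pass would be repeated and revised against the transcripts until no new categories emerged, as described above.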
We compared the themes arising from this data analysis with the set of criteria presented in the conceptual framework (Nandi et al., 2009). The framework was extended according to the findings, and the details are discussed in the "Discussion" section.
The data analysis enabled the extraction of key information relevant to the research; the research question was then explored based on the results obtained through these methods.
(a) Cognitive skills presented by the students in discussion forum participation
All the students presented a wide range of cognitive skills while participating in online interaction. Seven key themes emerged under this category; they are stated below, along with quotations and discussion forum posts from the participants:
About wiki, I think it is a good resource. but i always remember one thing "start your research with wiki, but not end with wiki". [Student E, Forum]
I've been working since before computers were commonplace in offices, (I remember 'typing pools'!) And it seems to me that the way people work, and what they do have changed so much that there's not much that comparable between the eras. [Student G, Forum]
However, one question came up when you told that "it depends on an individual and on his "needs" as to how much he wants to get influenced by technology and that is where the "multiple effects" comes in from. [Student A, Forum]
What do you mean when you are talking about "decentralized production"? [Student C, Forum]
I like your idea about combining frameworks and models. When I read the article "Principles and modules for organizing the IT function" (2002), I found that every model has its specific strengths. [Student A, Forum]
Students can get time to understand the concept, help each other and contribute enough. [Instructor, Interview]
Here, I'd like to cite one concept from M [pseudonym] and F [pseudonym] to support my argument. [Student F, Forum]
When I was talking about centralization in every country, I meant that shops in every country will be interconnected (what I also mentioned in my other posts). [Student H, Forum]
For instance, as you said, if it can establish a decentralized IS to all the stores and allow them to share the information (i.e., share the inventory information and do speeding coordination between stores in same regions), the company could even do response much quicker than it does ever before. [Student A, Forum]
Whether students understand concepts, applying their knowledge or not. [Instructor, Interview]
It would be great to have a discussion which is consistent and not, like it doesn't stay standard for a time and you keep waiting for the post, it should all be flowing. [Student B, Focus Group]
(c) Use of formal/informal language
All but one has participated consistently. [Instructor, Interview]
Is this a matter of business strategy? I think I'm a little bit out of track on this topic... [Student E, Forum]
Should be formal, not everything informal, I would never say hi how you doing like that, I would never say it on the discussion forum, if I want to communicate with my friends I will invite them to Facebook. [Student A, Focus Group]
Academic English and use of formal language is preferred because of the mix of culture. As there is a cultural difference between students so the meaning of something may not be the same for everyone. [Instructor, Interview]
I am not worried about informal language, in discussion forum, it is to discuss. For discussion informal language can be used. [Instructor, Interview]
Four key themes emerged from the data analysis regarding the type of moderation activities carried out by the instructors; they are stated below with appropriate quotations:
To kick it off, what do you think are some of the main principles behind social informatics? [Instructor, Forum]
In doing the reading which is relatively long and detailed, you will find that this is how many organizations are having to assess IT Alignment at various different levels of their business and IT. [Instructor, Forum]
As a guide, like answer our questions and tell us whether we are on the right track, how you going, how you can go to find books on the topic or this article. [Student A, Focus Group]
One drawback/lack is after a few round of discussion students tend to wander and move a bit away from the initial question. [Instructor, Interview]
Good point Student E, now, you might want to look at alignment maturity which extends and expands on these ideas about 'fit between IT and business'. [Instructor, Forum]
As mentioned earlier, we compared the findings from the data collection with the set of criteria from the conceptual framework (Nandi et al., 2009). The results of the comparison show that almost all the themes extracted from the data analysis are consistent with the criteria from the conceptual framework, which can therefore be used as an assessment framework. A few extra criteria for qualitative discussion also came out of the data analysis and are discussed below in detail.
From Table 2 we see that not all the criteria for quality derived from the literature review were exercised and valued by the students. The criteria of clarification, justification or judgment, application of strategies, breadth of understanding, relevance, participation rate and consistency of participation were highly consistent with the initial criteria from the framework. The data analysis showed that students practised these criteria consistently throughout, and the instructors also emphasised these issues.
| Category | Criterion from the framework | Is the criterion consistent with the result from data analysis? |
|---|---|---|
| Content | Clarification | Generally consistent with the criterion, as students clarified their positions while discussing online. |
| | Justification or judgment | Consistent; the students tried to justify their opinions. |
| | Inferencing or interpretation | Semi-consistent with the criterion from the framework. |
| | Application of knowledge (relevance) | Very consistent, as almost all the posts made by the students were about the topic of discussion. |
| | Prioritisation | Initially, most importance was given to answering the questions from the instructors. Students often did not try to identify important issues in the readings and discuss them themselves, instead waiting for input and feedback from the instructors. |
| | Breadth of knowledge | Consistent, as students showed their understanding while responding to the questions posted by the instructors. |
| | Critical discussion of contributions | Consistent with the criterion, as this quality was evident in the posts. |
| | New ideas/solutions from interactions | Very consistent with the initial criterion. |
| | Sharing outside knowledge or expertise | Generally consistent with the criterion, as personal outside experience was used to support opinions. |
| | Use of social cues or emotions to engage with participants | Not totally consistent with the criterion, as participants disagreed about the use of informal language in the discussion forum. |
| Objective measures | Participation rate | Consistent with the criterion, as found from the data. |
| | Consistency of participation | Reasonable consistency in participation. |
Evidence of some new themes for quality arose, along with a few extensions of the themes found in the literature review. These issues are discussed below, as they could be considered controversial and not agreed upon by everyone.
Consequently, it can be said that the quality of student participation cannot be fully evaluated unless the moderation of the instructors is also evaluated. The type of moderation carried out by the instructors has an impact on the students' participation, and so on the quality of the discussion. The instructor's role is significant in providing encouragement, feedback and direct instruction in specific cases (Maor, 2008). Hence there should be a set of criteria for evaluating instructor participation. This can also provide valuable guidelines to instructors on how the discussion forum can be moderated so that students can maintain the quality of discussion and, in doing so, gain and share knowledge.
A number of criteria came up from the data which revealed how moderation by instructors can be evaluated.
We include the following quotes as examples of what we understand to be good quality productive contributions to the online discussion forum under the two themes of application of knowledge and new ideas from interactions.
Whether all the companies should be aiming for level 5 depends on several factors, in my opinion. Firstly, how company ambitious is. Secondly, how high the information content in the industry where the company operates is. Different companies have different levels of information content. Thirdly, how big the organization is. For example, small stores in countryside. These organizations even do not have all the attributes in the six SAM criteria. Whether to pay more attention to some criteria or not, I believe that governance and partnership require special attention because:
1) In governance criteria business and IT strategies are identified. On these strategies mostly depends competitive abilities of the company. 2) In partnership criteria business perception of IT value and role of IT in business strategic planning are identified. If IT and business are not interconnected well, IT alignment will be unlikely. Your questions were very interesting. What do you think about them? [Student C, Forum]
I like your idea about combining frameworks and models. When I read the article "Principles and modules for organized the IT function" (2002), I found that every model has its specific strengths. The first model enables IT to be the active partner in business innovation by having strong position of IT in top management team, whereas the second model allows business to provide services and infrastructure for business innovation, and to build enterprise-wide platforms and capabilities. Why can't we combine all the specific features and to create powerful IT function model? Maybe, it sounds unrealistic? Why not? [Student F, Forum]
Our framework has proved to be very useful for analysing discussion forums. Due to the exploratory nature of our research, the major focus was to identify key themes and sub-themes which apply to online forums. A number of issues relating to effective online participation and engagement were raised by the participants, and discovered through the analysis of the information they provided.
In order to gain a better understanding of what is meant by the "quality" of posts, two major areas were considered: the type of participation by the students, and the moderation activities of the instructors. These were examined under the umbrella concept of "quality of posts", with the results of the research giving an indication of a set of criteria for quality interaction in discussion forums.
We have applied the framework in a blended learning environment and found it to be useful for evaluating quality in online interaction. The results showed that students were actively participating in the discussion forum activities by providing new information about the topics, justifying their positions in a discussion with references from the literature, and giving examples from real world situations. They also show that the students were thinking critically and responding to each other's posts by agreeing or disagreeing with them. This productivity in participation can prove very helpful for students in gaining valuable knowledge from each other's experiences.
The results also showed that the instructors were setting up the discussions with engaging initial questions and providing feedback on a regular basis. This helped students to broaden their focus and inspired them to continue their discussions. As mentioned earlier, the implementation of each and every criterion is influenced by perceptions of the role of the instructors.
Identifying and evaluating the quality of interaction has shown our framework to be unique and robust for the purpose of measuring the different dimensions of qualitative interaction between students and instructors.
There are several key areas that would benefit from future detailed research:
Anderson, L. & Krathwohl, D. (Eds.) (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman Publishers.
Berner, R. T. (2003). The benefits of discussion board discussion in a literature of journalism course. The Technology Source, Sep/Oct. http://technologysource.org/article/benefits_of_bulletin_board_discussion_in_a_literature_of_journalism_course/
Bonk, C. & Dennen, V. (2003). The 3 T's of online assessment: Technology, tools, and (saving) time. Workshop given at the 19th Annual Conference on Distance Teaching & Learning. University of Wisconsin, Madison. [verified 29 Apr 2012] http://www.trainingshare.com/download/wisc2/ts_part1.ppt
Bradshaw, J. & Hinton, L. (2004). Benefits of an online discussion list in a traditional distance education course. Turkish Online Journal of Distance Education, 5(3). http://tojde.anadolu.edu.tr/tojde15/articles/hinton.htm
Brannon, R. & Essex, C. (2001). Synchronous and asynchronous communication tools in distance education. TechTrends, 45(1), 36-42. http://dx.doi.org/10.1007/BF02763377
Burkett, R., Leard, C. & Spector, B. (2004). Using an electronic bulletin board in science teacher education: Issues and trade-offs. The Journal of Interactive Online Learning, 3(1), 1-9. http://www.ncolr.org/jiol/issues/pdf/3.1.1.pdf
Cheung, W. S. & Hew, K. F. (2010). Examining facilitators' habits of mind in an asynchronous online discussion environment: A two cases study. Australasian Journal of Educational Technology, 26(1), 123-132. http://www.ascilite.org.au/ajet/ajet26/cheung.html
Garrison, D. R., Anderson, T. & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. The American Journal of Distance Education, 15(1), 7-23. http://dx.doi.org/10.1080/08923640109527071
Garrison, D. R. & Anderson, T. (2003). E-learning in the 21st Century: A framework for research and practice. New York: Routledge Falmer.
Gerbic, P. (2006). To post or not to post: Undergraduate student perceptions about participating in online discussions. In Who's learning? Whose technology? Proceedings ascilite Sydney 2006. http://www.ascilite.org.au/conferences/sydney06/proceeding/pdf_papers/p124.pdf
Glaser, B. G. & Strauss, A. L. (1967). The discovery of grounded theory. New York: Aldine.
Goold, A., Coldwell, J. & Craig, A. (2010). An examination of the role of the e-tutor. Australasian Journal of Educational Technology, 26(5), 704-716. http://www.ascilite.org.au/ajet/ajet26/goold.html
Guzdial, M. & Carroll, K. (2002). Explaining the lack of dialogue in computer-supported collaborative learning. Paper presented at the Computer Supported Collaborative Learning Conference, CSCL 2002. http://www.umsl.edu/~wilmarthp/mrpc-web-resources/Explaining-the-Lack-of-Dialogue-in-Computer....pdf
Hawkes, M. & Dennis, T. (2003). Supporting and assessing online interaction. Educational Technology, 43(3), 52-56.
Henri, F. (1992). Computer conferencing and content analysis. In A. R. Kaye (Ed.), Collaborative learning through computer conferencing: The Najaden papers (pp. 115-136). New York: Springer.
Ho, S. (2002). Evaluating students' participation in on-line discussions. In Proceedings Australian World Wide Web Conference (AUSWEB), Sunshine Coast, Queensland, Australia. http://ausweb.scu.edu.au/aw02/papers/refereed/ho/paper.html
Jackson, K. (2010). What value assessment rubrics in shaping students' engagement in asynchronous online discussions? In Curriculum, technology & transformation for an unknown future. Proceedings ascilite Sydney 2010 (pp. 454-458). http://www.ascilite.org.au/conferences/sydney10/procs/Jackson-concise.pdf
Klisc, C., McGill, T. & Hobbs, V. (2009). The effect of assessment on the outcomes of asynchronous online discussion as perceived by instructors. Australasian Journal of Educational Technology, 25(5), 666-682. http://www.ascilite.org.au/ajet/ajet25/klisc.html
Laurillard, D. (2002). Rethinking university teaching: A framework for the effective use of learning technologies (2nd Ed.). London and New York: Routledge.
Leh, A. (2002). Action research on hybrid courses and their online communities. Educational Media International, 39(1), 31-38. http://dx.doi.org/10.1080/09523980210131204
Liu, S. (2007). Assessing online asynchronous discussion in online courses: An empirical study. TCC 2007 Proceedings. http://tcc.kcc.hawaii.edu/previous/TCC%202007/liu.pdf
Maor, D. (2008). Changing relationship: Who is the learner and who is the teacher in the online educational landscape? Australasian Journal of Educational Technology, 24(5), 627-638. http://www.ascilite.org.au/ajet/ajet24/maor.html
Maor, D. & Volet, S. (2007). Interactivity in professional learning: A review of research based studies. Australasian Journal of Educational Technology, 23(2), 227-247. http://www.ascilite.org.au/ajet/ajet23/maor.html
Mazzolini, M. & Maddison, S. (2007). When to jump in: The role of the instructor in online discussion forums. Computers & Education, 49(2), 193-213. http://dx.doi.org/10.1016/j.compedu.2005.06.011
Mazzolini, M. & Maddison, S. (2003). Sage, guide or ghost? The effect of instructor intervention on student participation in online discussion forums. Computers & Education, 40(3), 237-253. http://dx.doi.org/10.1016/S0360-1315(02)00129-X
Meyer, K. (2002). Quality in distance education: Focus on online learning. ASHE-ERIC Higher Education Report, 29(4), i-vii. San Francisco: Jossey-Bass.
Nandi, D., Chang, S. & Balbo, S. (2009). A conceptual framework for assessing interaction quality in online discussion forums. In Same places, different spaces: Proceedings ascilite Auckland 2009. http://www.ascilite.org.au/conferences/auckland09/procs/Nandi.pdf
Neuman, W. L. (2006). Social research methods: Qualitative and quantitative approaches (6th ed.).
Newman, D. R., Webb, B. & Cochrane, C. (1996). A content analysis method to measure critical thinking in face-to-face and computer supported group learning. [verified 29 Apr 2012]. http://www.qub.ac.uk/mgt/papers/methods/contpap.html
Piguet, A. & Peraya, D. (2000). Creating web-integrated learning environments: An analysis of WebCT authoring tools in respect to usability. Australian Journal of Educational Technology, 16(3), 302-314. http://www.ascilite.org.au/ajet/ajet16/piguet.html
Rovai, P. A. (2000). Online and traditional assessments: What is the difference? The Internet and Higher Education, 3(3), 141-151. http://dx.doi.org/10.1016/S1096-7516(01)00028-8
Rovai, P. A. (2002). Building sense of community at a distance. The International Review of Research in Open and Distance Learning, 3(1). http://www.irrodl.org/index.php/irrodl/article/view/79/152
Salmon, G. (2003). E-tivities: The key to active online learning. London: Kogan Page.
Seo, K. (2007). Utilizing peer moderating in online discussions: Addressing the controversy between teacher moderation and nonmoderation. The American Journal of Distance Education, 21(1), 21-36. http://dx.doi.org/10.1080/08923640701298688
Sharples, M. (2000). The design of personal mobile technologies for lifelong learning. Computers & Education, 34(3-4), 177-193. http://dx.doi.org/10.1016/S0360-1315(99)00044-5
Sheard, J., Ramakrishnan, S. & Miller J. (2003). Modeling learner and educator interactions in an electronic learning community. Australian Journal of Educational Technology, 19(2), 211-226. http://www.ascilite.org.au/ajet/ajet19/sheard.html
Steel, C. (2009). Reconciling university teacher beliefs to create learning designs for LMS environments. Australasian Journal of Educational Technology, 25(3), 399-420. http://www.ascilite.org.au/ajet/ajet25/steel.html
Strauss, A. L. & Corbin, J. M. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: SAGE Publications.
Strauss, A. L. & Corbin, J. M. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park, CA: SAGE Publications.
Vlachopoulos, P. & Cowan, J. (2010a). Reconceptualising moderation in asynchronous online discussions using grounded theory. Distance Education, 31(1), 23-36. http://dx.doi.org/10.1080/01587911003724611
Vlachopoulos, P. & Cowan, J. (2010b). Choices of approaches in e-moderation: Conclusions from a grounded theory study. Active Learning in Higher Education, 11(3), 213-224. http://dx.doi.org/10.1177/1469787410379684
Weaver, C. M. (2005). What encourages student participation in online discussions? Unpublished PhD thesis, University of Southern Queensland. http://eprints.usq.edu.au/1523/
Yin, R. K. (1989). Case study research: Design and methods. Thousand Oaks, CA: Sage.
| Category | Criterion | Level 1 | Level 2 | Level 3 | Level 4 |
|---|---|---|---|---|---|
| Content | Clarification | Regurgitation of information | A clear explanation of available information | Explaining available information using relevant examples | Articulating available information to expand on ideas presented, including the use of examples |
| | Justification | No justification of points | Justification based on personal opinion | Justification using existing cases, concepts or theories | Justification using existing cases, concepts or theories and providing clear discussion of implications |
| | Interpretation | Misrepresentation of information | Basic paraphrasing of available information | Clear interpretation of available information | Critical discussion of available information |
| | Application of knowledge (relevance) | No application or discussion of relevance to questions asked | Application of knowledge to questions asked | Application of knowledge including discussion using relevant examples | Knowledge is critically applied and may include discussion of limitations |
| | Prioritisation | No prioritisation of information or knowledge | Some basic comparison of information | Ability to prioritise information and knowledge | Ability to prioritise information and knowledge based on criteria that the learner has established |
| | Breadth of knowledge | Narrow and limited knowledge | Some indication of a wider view of the topics discussed | Presenting a wider view of the topics discussed by showing a good breadth of knowledge | Ability to point out other perspectives, including drawing from other fields of study |
| | Critical discussion of contributions | No engagement with other learners' contributions | Some basic discussion about other learners' contributions | Consistent engagement with other learners' contributions and acknowledgement of their comments | Contributing to a community of learners, with consistent engagement and advancement of each other's ideas |
| | New ideas from interactions | No evidence of new ideas and thoughts from interaction | Some new ideas developed as a result of interaction | Some solutions and new ideas as a result of interactions | Collaborative approach to solution seeking and new ideas developed |
| | Sharing outside knowledge | No sharing of outside knowledge | Sharing generic information that is easily available from outside sources | Sharing real world examples that may not be immediately obvious to other learners | Sharing real life knowledge, personal experience and examples of similar problems/solutions |
| | Using social cues to engage other participants | No engagement with others in the discussion forum | Answering some basic questions posed by the facilitator or other learners | Engaging with the work and discussion of other learners | Engaging and encouraging participation with fellow discussants in the forum |
| Objective measures (subject to facilitators' expectations) | Participation rates | None, or fewer than 2 posts per week | 2 to 5 posts per week | 5 to 10 good quality posts per week | More than 10 good quality posts per week |
| | Consistency of participation | Rarely posts, with occasional activity | Occasional activity | Consistent activity | Consistent and productive activity |
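A rubric of this shape lends itself to a simple data representation for assessors. The sketch below encodes two of the criteria with abbreviated level descriptors and averages the levels awarded to a post; the structure, the abbreviated wording and the helper names are our own illustrative choices, not part of the published framework.

```python
# Minimal sketch of the four-level rubric as a lookup structure, using
# two criteria with abbreviated descriptors. Names and abbreviations
# are illustrative assumptions, not the framework's exact wording.

RUBRIC = {
    "clarification": [
        "regurgitation of information",
        "clear explanation",
        "explanation with relevant examples",
        "expands on ideas, including examples",
    ],
    "justification": [
        "no justification",
        "personal opinion",
        "cases, concepts or theories",
        "theories plus discussion of implications",
    ],
}

def describe(criterion, level):
    """Return the abbreviated descriptor for a level (1-4) on a criterion."""
    return RUBRIC[criterion][level - 1]

def overall_level(scores):
    """Average the per-criterion levels (1-4) awarded to a single post."""
    return sum(scores.values()) / len(scores)

# A post rated level 3 on clarification and level 4 on justification:
post_scores = {"clarification": 3, "justification": 4}
average = overall_level(post_scores)  # 3.5
```

Whether levels should be averaged, summed or reported per criterion is an assessor's design decision; the framework itself prescribes the levels, not the aggregation.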
Authors: Mr Dip Nandi, School of Computer Science and Information Technology, RMIT University, Melbourne, Victoria 3001, Australia.
Dr Margaret Hamilton, School of Computer Science and Information Technology, RMIT University, Melbourne, Victoria 3001, Australia.
Dr Shanton Chang, Department of Information Systems, The University of Melbourne, Victoria 3010, Australia.
Dr Sandrine Balbo, Nuance Australia. Email: email@example.com
Please cite as: Nandi, D., Hamilton, M., Chang, S. & Balbo, S. (2012). Evaluating quality in online asynchronous interactions between students and discussion facilitators. Australasian Journal of Educational Technology, 28(4), 684-702. http://www.ascilite.org.au/ajet/ajet28/nandi.html