Australian Journal of Educational Technology
1990, 6(2), 92-98.
The purpose of any training system or organisation is to produce a trained person who can successfully perform specific tasks in the workplace. If these tasks cannot be performed to a required standard, then the training has failed. In fact, the quality control process of ensuring that a particular course meets the job requirements, by investigating the trained person on the job, is arguably the most important phase in a training system. After all, if a person performs well in the workplace there is probably nothing significantly wrong with the training course, and any necessary corrective measures will be concerned more with the efficiency of the training.
Despite the importance of this process, external evaluation (or 'validation') has been neglected in the past and only now is gaining the recognition it deserves. Current training legislation (NSW, 1989) attempts to ensure quality training in a cooperative industrial context. For the future, much will depend on the emphasis policy makers place on quality control, and in particular on validation (Bright, 1990). This paper describes the validation process and demonstrates that a relatively simple and cost-effective validation unit can be of considerable benefit to any organisation involved in training.
Validation of a particular course is the external evaluation process which concentrates on information concerning the trained person in the job. A number of terms have been used to describe this process, including 'external evaluation', 'external validation', 'summative evaluation' and 'long-term follow-up evaluation'.
Figure 1: The Royal Australian Navy Training System Model.
The differences between 'evaluation' and 'validation' are illustrated in Figure 2: evaluation is concerned with the efficiency of a course or of on-the-job training, whereas validation investigates the effectiveness of these processes by looking at the trained person (either fully or partly trained) on the job.
Figure 2: Quality control of the training process.
The use of assistance from other departments should be considered; this may include occupational analysts, training developers and training consultants. These personnel may already exist within an organisation, with part-time assistance readily available. The required manpower can then be determined. It may be possible to allocate validation responsibilities to existing staff, although it would obviously be preferable to have staff solely dedicated to the validation function. Costs may involve travel (the validation team will need to visit graduates in the workplace), duplicating and typing (production of questionnaires and reports) and analysis instruments (computers and optical mark readers). These costs, however, may be minimised if the validation team is co-located with the workforce and if there are only a few course graduates to interview, and therefore only a few questionnaires to analyse. The number of course graduates should not influence the amount of validation effort. Only a handful of astronauts are trained each year, but who would question the necessity for careful validation of the associated training course?
Planning at the micro level is relatively straightforward. Most courses can be validated over a four month period, although allowances should be made for particularly complicated courses, remote and varied work locations and the availability of interviewees. The four stages of a validation study are illustrated in Figure 3, and each stage takes approximately one month to complete, assuming three studies are being undertaken at any one time.
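This staffing constraint can be sketched in code. The following greedy scheduler is purely illustrative (the stage names and the capacity of three concurrent studies come from the text; the scheduling rule itself is an assumption): each study passes through four one-month stages, and a new study starts in the first month where fewer than three studies are active.

```python
# Illustrative sketch only: stagger validation study start dates so
# that no more than max_concurrent studies are active in any month.
# Stage names follow the four stages described in the text.

STAGES = ["Planning", "Data collection", "Analysis", "Reporting"]

def schedule_studies(n_studies, max_concurrent=3):
    """Return a dict mapping each study index to its start month (0-based).

    A study occupies len(STAGES) consecutive months; a study starts in
    the earliest month where capacity allows.
    """
    starts = {}
    for study in range(n_studies):
        month = 0
        # Count studies active in this month; advance until there is room.
        while sum(1 for t in starts.values()
                  if t <= month < t + len(STAGES)) >= max_concurrent:
            month += 1
        starts[study] = month
    return starts
```

Under these assumptions, a workload of six courses splits into two waves of three, each wave taking four months: the unit validates three courses concurrently and clears six in eight months.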
The instruments used to collect information include questionnaires, interviews, expert panel input, observations on the job, course results, evaluation reports and company reports (including safety reports). Data is collected, collated and analysed, and the draft report is checked with interested parties before the final report is written. The report recommendations are followed up after a pre-determined time.
A combination of informal and structured interviews produces the best results despite the difficulties in recording responses, and is particularly useful in discovering attitudes associated with the course. The use of audio tape, overt or covert, should be avoided. Telephone interviews may be used when personal interviews are not possible (Crockett, 1989).
Areas of concern are then identified. These would include those tasks that are not considered to be useful by more than half the respondents and those that have never been used by more than eighty per cent of the job holders. Interview data is analysed to form additional conclusions.
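The two thresholds above (not useful to more than half the respondents; never used by more than eighty per cent of job holders) can be expressed as a simple screening routine. The sketch below is illustrative, not from the paper, and the task names used in the example are hypothetical.

```python
# Illustrative sketch of the threshold screening described above.
# For each course task, respondents report whether they consider it
# useful and whether they have ever used it on the job.

def flag_areas_of_concern(task_responses):
    """Flag tasks failing the usefulness or usage thresholds.

    task_responses maps a task name to a list of (useful, used)
    boolean pairs, one pair per respondent.
    """
    concerns = []
    for task, responses in task_responses.items():
        n = len(responses)
        not_useful = sum(1 for useful, _ in responses if not useful)
        never_used = sum(1 for _, used in responses if not used)
        # Not considered useful by more than half the respondents.
        if not_useful > n / 2:
            concerns.append((task, "not considered useful"))
        # Never used by more than eighty per cent of job holders.
        elif never_used > 0.8 * n:
            concerns.append((task, "never used on the job"))
    return concerns
```

A task flagged by either rule becomes a candidate for removal from, or revision of, the course; interview data then supplies the context a raw count cannot.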
The final report is then produced in a high quality format and issued from the highest suitable authority to trainers and managers. The report, however, does not mark the conclusion of the validation study.
Report production is an indicator of performance at the internal office management level and should meet a pre-determined annual output of quality reports. The number of recommendations implemented is the performance indicator at the second level (the training department). Dollar savings resulting from a cost-effective training environment are the third performance indicator, affecting the organisation at the third and highest level.
The systematic approach to training is the most effective method of training for purpose. The validation phase provides the vital feedback from the job to ensure that effective training occurs. If quality control of training is to be given the attention it warrants, then validation will become an increasingly important function.
Bright, J.C. (1990). Pre-conditions for Flexible Training Systems in Education and Industry in Australia in the 1990s. Industry Training for the 1990s Conference Paper 4.
Crockett, R.A. (1989). An Introduction to Sample Surveys - A User's Guide. Australian Bureau of Statistics, Victoria, Australia.
Dick, W. and Carey, L. (1978). The Systematic Design of Instruction. Scott, Foresman and Company, Glenview, Illinois, USA.
Gay, L.R. (1987). Educational Research - Competencies for Analysis and Application. Merrill Publishing Company, Columbus, Ohio, USA.
NSW Parliament Legislative Assembly (1989). Industry and Commercial Training Bill. Government Printer Sydney, NSW, Australia.
Poulter, B. (1982). Training and Development. CCH Australia Ltd, Sydney, NSW, Australia.
Romiszowski, A.J. (1981). Designing Instructional Systems. Kogan Page, London, UK.
Authors: Peter Brown and Michael Hickey are Training Validation Officers with the Department of Defence (Navy), PO Box 706, Darlinghurst NSW 2010.
Please cite as: Brown, P. and Hickey, M. (1990). Validation: Cost effective external evaluation. Australian Journal of Educational Technology, 6(2), 92-98. http://www.ascilite.org.au/ajet/ajet6/brown.html