Improving Teacher Effectiveness and Student Ability through Digitized Assessment Technology for Facilitating OBE

Examining the Impact of Computerized Assessment on Student Motivation and Performance in ICT Courses

by Kamatchi K. S.*, T. Gnana Sambandan

- Published in Journal of Advances and Scholarly Researches in Allied Education, E-ISSN: 2230-7540

Volume 16, Issue No. 7, May 2019, Pages 146 - 150 (5)

Published by: Ignited Minds Journals


ABSTRACT

Earlier research has found that computer-based assessments (CBA) have many valuable effects on students' learning outcomes (LO). This study examines the impact of computerized assessment on the motivation and performance of students in ICT (Information and Communication Technology) courses. Digitized assessment methods provide detailed information to both students and facilitators, helping to improve student learning and teaching effectiveness (Angelo, 1991). In this paper, we compare the MCA curriculum in the state of Tamilnadu, India with the OBE components followed in various developing countries. The results show that students exposed to frequent computer-based assessments gained markedly higher levels of technology use, learning skill, creativity, independence, and overall motivation compared to other students.

KEYWORDS

computer based assessments, learning outcome, ICT courses, digitized method, student learning, effectiveness in teaching, MCA curriculum, OBE components, technology skills, motivation

I. INTRODUCTION

"Outcome-based" means that the focus is on the effect that learning has on the students, not on the curriculum itself; an outcome is determined by what the student can do at the end of any learning event. OBE shifts the focus of educational activity "from teaching to learning; skills to thinking; content to process; and teacher instruction to student demonstration" (Spady, 1994). Assessment is a catalyst for evaluating student learning (for example, Brown et al., 1997), and there is a substantial effort in higher education institutions to measure students' learning outcomes (Farrer, 2002; Laurillard, 2002). The main aim of assessment in OBE, whether summative, diagnostic, or formative, is to validate students' learning outcomes. The performance and actions of students are measured through the ideas they represent (skills), the content they use (domain knowledge), and the tools they apply (problem-solving abilities) (Malan, 2017). Computer-assisted assessment (CAA) is a powerful mode of assessment that provides accurate feedback and allows the teacher to revise instruction (Brown et al., 1997; Bull & McKenna, 2004). The purpose of this study is to analyse the effectiveness of teachers in improving students' creativity skills, and the instruction given by teachers to improve student performance. Expertise (cognitive, skills, and affective) stated in the learning objectives can be assessed against criteria (Mpepo, 1998) under a predetermined external standard. Changes in strategy and the availability of new analytical methods are essential to identify students' learning outcomes and teachers' effectiveness based on their instructional methods. Teachers must assess learning content not only for core knowledge but must also concentrate on student learning outcomes (SLO) for better training. Feedback on student performance can be delivered automatically on a question-by-question basis or for the whole assessment. OBE's emphasis on assessing learning through various learning activities that shift the focus from teaching to learning, such as cognitive tasks (e.g. online tests), group work (e.g. project development), and continuous assessment (e.g. assignments), is a dominant feature.

This paper is based on data collected from 35 trainee participants at the National Institute of Technical Teachers Training and Research (NITTTR), Chennai, through a survey covering 17 different developing countries around the world (Table 1.0); the participants had visited Chennai, India for a training programme during February-March 2019. A questionnaire on the influence of ICT in India was administered to the participants, and the feedback was compared with data from a similar survey of the participants from the developing countries. One of the authors was the training programme's coordinator, and the authors interacted with the participants undergoing an ICT training programme through Outcome Based Education (OBE), sponsored entirely by the Indian Government. The presentation of other findings is beyond the scope of this paper, as the study reported here is only part of a larger research project. Important conclusions have been drawn from the comparative study.

II. BACKGROUND

OBE organizes the educational system around what is essential for all learners to know, value, and be able to do in order to achieve a desired level of competence (Commission on Higher Education, 2014). Assessment is a continuous process that occurs before, during, and after instruction is delivered; learner competence is demonstrated through actions and performances that reflect the use of ideas, tools, content, and information. Effective teacher instruction focuses on providing students with the basic skills and critical thinking skills needed to be successful (Zahorik et al., 2003). Digitized technology has an essential role to play in the efficient and effective assessment of learning: e-assessment can help teachers assess their students' learning as well as their performance in the classroom. Effective teachers spend more time on teaching, maximizing instructional time rather than classroom management (Molnar et al., 1999). ICT in assessment involves the use of digital devices (smartphones, iPads, gaming devices) to assist in the construction, delivery, storage, or reporting of student assessment tasks, responses, feedback, or grades. Feedback is provided to departments and instructors so they can understand the quality of the student work used in portfolios to demonstrate general education competencies. Students made greater performance gains when they had access to technology (Schacter, 1999). Technology also encourages critical thinking in students, improving higher-order thinking skills and overall performance (Cradler, McNabb, Freeman, & Burchett, 2002). Effective teachers check for student understanding throughout the lesson and adjust their instruction based on that feedback (Guskey, 1996). Shamsul Muhamad et al. (2012) evaluate learning domains such as Basic Core Knowledge, Communication, Entrepreneurship, Professionalism, and Humanity under the skill categories Psychomotor, Cognitive, Affective, and Psychomotor, respectively, and document a critical analysis of unit-outcome assessment using either a manual or an online-based system. They argue that direct assessment can be used as an indicator to measure learning outcomes and for mapping long-term Programme Educational Objectives. For the present investigation, assessments were carried out on ICT courses: this research work considered three instruments, namely assignments, project development, and online tests, in a survey administered by the authors in the state of Tamil Nadu in India, while quizzes, group activities, and other instruments beyond these three were considered for the developing countries. The paper considers only the essential assessment portfolios according to OBE, and the results reported are drawn from the survey conducted on the samples.

III. METHODOLOGY

An interview-based survey methodology was adopted. For the proposed comparative study, the purposive sampling technique (Sharma, B.A.V., 1988) was used. The demographic data are presented below.

A. Demography of the Samples:

The hypothesis of this paper is that there is a significant difference in the assessment portfolios of Outcome Based Education between India (delimited to the State of Tamilnadu) and developing countries in the area of Information and Communication Technology, and that this difference reveals the effectiveness of teacher instruction and students' learning ability. Demography of the developing countries: number of samples (respondents): 35, representing 17 different countries as listed in Table 1.0 (participants of ITEC: the Indian Technical and Economic Cooperation programme; TCP: the Technical Cooperation Scheme of the Colombo Plan; and SCAP: the Special Commonwealth Assistance for Africa Programme). These participants had visited Chennai to undergo a two-month training programme on "ICT Applications in Education and Training", sponsored by the Govt. of India (February and March 2019). Demography of Tamilnadu (restricted to the Chennai district): 26 institutes offering MCA programmes (10 from the Central region, 4 from the Northern region, and 12 from the Southern region). Number of samples (respondents): 35 teachers of MCA (11 from Central regional institutions, 9 from Northern regional institutions, and 15 from Southern regional institutions), with a good mix of gender and experience. Purposive sampling was selected for gathering opinions for the feedback analyses, as it is known to represent the total required data for well-matched groups; the sampling is based on "purposive or convenient sampling" (Sharma, B.A.V., 1988). In addition, this selection is influenced by the availability and willingness of respondents, which is sensitive but satisfies the purpose of the research. Averages (excluding lower-level schools): Online Test = 3.075; Project Development = 4.098; Assignments = 2.095.
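As an illustration of how these portfolio averages can be obtained from the questionnaire data, the sketch below (in Python) computes, for each assessment portfolio, the average of the per-country averages and the standard deviation. The country names and response values are hypothetical placeholders standing in for the actual survey responses, which are not reproduced here.

    # Sketch of the portfolio-average computation (hypothetical responses;
    # the actual survey data are not reproduced here).
    from statistics import mean, stdev

    # Each country's entry is the average minimum number of times a portfolio
    # item (online test, assignment, project development) is practiced per course.
    responses = {
        "Mauritius": {"online_test": 6.0, "assignment": 3.0, "project": 5.0},
        "Fiji":      {"online_test": 3.0, "assignment": 4.0, "project": 4.0},
        "Bhutan":    {"online_test": 2.0, "assignment": 4.0, "project": 3.0},
        # ... entries for the remaining countries in Table 1.0
    }

    def portfolio_summary(data, portfolio):
        """Average of the per-country averages and standard deviation for one portfolio."""
        values = [entry[portfolio] for entry in data.values()]
        return mean(values), stdev(values)

    for portfolio in ("online_test", "assignment", "project"):
        avg, sd = portfolio_summary(responses, portfolio)
        print(f"{portfolio}: average = {avg:.3f}, standard deviation = {sd:.3f}")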

IV. RESULTS AND DISCUSSIONS

Figure 1.0 shows the distribution of the average minimum number of online tests reported by the participants from the developing countries. The average of these averages coincides with that of Tamilnadu (restricted to the Chennai district), namely 3.25. Mauritius alone shows a relatively high value, and it appears that its assessment system would match OBE practices.

(Table 1.0) COMPARISON OF CBA CONDUCTED IN DEVELOPING COUNTRIES AND TAMILNADU (RESTRICTED TO CHENNAI DISTRICT) FOR SIMILAR COURSES

FIGURE 1.0 COMPARISON OF THE AVERAGE NUMBER OF ONLINE TESTS PRACTICED IN 17 DEVELOPING COUNTRIES WITH TAMILNADU (Standard deviation: 3.075)

FIGURE 2.0 COMPARISON OF THE AVERAGE NUMBER OF ASSIGNMENTS PRACTICED IN 17 DEVELOPING COUNTRIES WITH TAMILNADU (Standard deviation: 2.095)

FIGURE 3.0 COMPARISON OF THE AVERAGE NUMBER OF PROJECT-DEVELOPMENT WORKS PRACTICED IN 17 DEVELOPING COUNTRIES WITH TAMILNADU (Standard deviation: 4.098)

As far as the average minimum number of assignments is concerned, Figure 2.0 shows that the average of the averages is 2.095 for the developing countries, whereas Tamilnadu shows a meager 2.038. As far as the average minimum number of project-development works is concerned, Figure 3.0 shows that the average of the averages is 4.098 for the developing countries, whereas Tamilnadu shows a meager 2.032. Even allowing for this value, the average number of project-development works practiced in Tamilnadu is found to be grossly inadequate.
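The gap described above can be quantified directly from the average-of-averages values quoted in this section and in Figures 1.0-3.0. The short Python sketch below compares the developing-country values with those of Tamilnadu for each portfolio; the numbers are the ones reported in the text, and reading 3.25 as Tamilnadu's online-test value follows the discussion of Figure 1.0.

    # Comparison of the average-of-averages values reported in this section.
    developing = {"online_test": 3.075, "assignment": 2.095, "project": 4.098}
    tamilnadu  = {"online_test": 3.250, "assignment": 2.038, "project": 2.032}

    for portfolio in developing:
        shortfall = developing[portfolio] - tamilnadu[portfolio]
        share = tamilnadu[portfolio] / developing[portfolio]
        print(f"{portfolio}: developing = {developing[portfolio]:.3f}, "
              f"Tamilnadu = {tamilnadu[portfolio]:.3f}, "
              f"shortfall = {shortfall:+.3f} ({share:.0%} of the developing-country level)")

Run on these figures, the sketch shows Tamilnadu at roughly parity for online tests, slightly below for assignments, and at only about half the developing-country level for project-development works, which is the gap highlighted in the conclusion.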

V. CONCLUSION

Portfolios such as online tests, continuous assignments, and project-development works, used by the teacher as instructional instruments, greatly assist in enhancing students' learning outcomes by improving their learning ability and skills and by helping the teacher understand students' knowledge and overall progress, as per the principles and practices of Outcome Based Education. Many developing countries across the world are attempting to establish Outcome Based Education in problem-based subjects similar to the MCA programmes of Tamilnadu State in India. The study reported in this paper gives alarming results, which are summarized below. In the case of periodical online tests, the average number practiced in Tamilnadu matches well with that of the developing countries. However, in the case of assignments, an important portfolio of computer-based assessment, the average number practiced in Tamilnadu is grossly inadequate when compared with countries such as Fiji and Bhutan, and only partly comparable with some African countries. The average number of project-development works practiced in Tamilnadu, which facilitate computer-based online assessment of students, is also lower than in the developing countries.

REFERENCES

1. Spady, W. G. (1994). Outcome-Based Education: Critical Issues and Answers. Arlington, VA: American Association of School Administrators.
2. Stephens, D. (1994). Computer-assisted assessment: a time saver or flexible distractor! Active Learning, 1, pp. 11–15.
3. Van der Horst & McDonald, R. (1997). OBE: A Teacher's Manual. Pretoria: Kagiso, pp. 10–11.
4. Wyatt-Smith, C., Klenowski, V. & Colbert, P. (Eds.) (2014). Designing Assessment for Quality Learning. Springer.
5. Black, P. & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), pp. 7–74.
6. Aziz, A., Megat Mohd Noor, M., Abang Ali, A. & Jaafar, M. (2005). A Malaysian outcome-based engineering education model. International Journal of Engineering and Technology, 2(1), pp. 14–21.
7. Shekar, C. R., Farook, O. & Bouktache, E. (2008). Continuous improvement process based on outcome based education. Nashville, TN: IAJC-IJME.
8. Davies, A. et al. (2012). Leading the Way to Assessment for Learning: A Practical Guide, 2nd Edition. Connections Publishing / Bloomington, IN: Solution Tree Press.
9. Goss, P., Hunter, J., Parsonage, H. & Romanes, D. (2015). Targeted Teaching: How efficient use of data can improve learning among students. Grattan Institute.
10. Timperley, H. (2015). Leading teaching and learning through professional learning. AEL: Journal of the Australian Council for Educational Leaders, 37(2).
11. Goodman, M., Bennett, R. E., Hessinger, J., Kahn, H., Liggett, J., Marshall, G. & Zack, J. (1999). Using multimedia in large-scale computer-based testing programs. Computers in Human Behaviour, 15(3), pp. 283–294.
12. Stephens, D. & Mascia, J. (1997). Results of a survey into the use of computer-assisted assessment in institutions of higher education. Loughborough: Loughborough University.
13. Malan, S. P. T. (2000). 'The New Paradigm' of Outcome-Based Education in Perspective, pp. 22–28.
14. Shamsul Muhamad, Zarina Tukiran, Rafizah Mohd Hanifa, Afandi Ahmad & Mohamad Md Som (2012). An Evaluation of Assessment Tools in Outcome-based Education: A Way Forward, pp. 336–343.
15. Harmanani, H. M. (2017). An outcome-based assessment process for accrediting computing programmes. European Journal of Engineering Education, 42(6), pp. 844–859.
16. Anderson, L. W. & Krathwohl, D. R. (2001). A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. New York: Longman.
17. Proceedings of the 7th International Computer Assisted Assessment Conference. Loughborough: Loughborough University, pp. 19–29.
18. Bull, J. Computer-Assisted Assessment: Impact on Higher Education Institutions. CAA Centre, Teaching & Learning Directorate, University of Luton, United Kingdom.
19. Sharma, B. A. V. (1988). Research Methods in the Field of Social Sciences. New Delhi: S. Chand & Co.
20. Bloom, B. S. (1956). Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I: Cognitive Domain. New York: Longman.

Corresponding Author Kamatchi K. S.*

Research Scholar, Department of CSE, NITTTR, Chennai, Tamil Nadu, India kampradeep26@gmail.com