E-Learning: an Approach to Evaluate Subjective Questions For Online Examination System Using Data Similarity

Evaluating Descriptive Exam Answers for E-Learning Systems Using Data Similarity

by Manisha Malyal*, Ms. Sudheshna, Dr. Soniya

- Published in International Journal of Information Technology and Management, E-ISSN: 2249-4510

Volume 8, Issue No. 11, Feb 2015

Published by: Ignited Minds Journals


ABSTRACT

In recent years a number of government and semi-government examinations have gone online, for example the IBPS Common Written Examination (CWE). Such systems are advantageous in terms of saving resources. However, we have observed that these systems cater only to multiple-choice questions, and there is no provision to extend them to subjective questions. Our objective is to design an algorithm for the automatic evaluation of single-sentence descriptive answers. The paper presents an approach to check the degree of learning of the student/learner by evaluating their descriptive exam answer sheets. Representing the descriptive answer in the form of a graph and comparing it with the standard answer are the key steps in our approach.

KEYWORD

E-Learning, subjective questions, online examination system, data similarity, algorithm, automatic evaluation, descriptive answer, learning, student/learner, graph

I. INTRODUCTION

For an ever-growing population, governments strive to provide a good education system, proper drinking water and food, and other amenities for the betterment of their people. For countries like India, the ever-growing population and poor infrastructure hamper the quality of the education system. The number of students appearing for the board examination in the state of Uttar Pradesh alone [1] gives an idea of the amount of pressure placed on the education system and on the teachers who must evaluate such a volume of answer copies.

Further, as per the Annual Status of Education Report (ASER) 2012 [2], 96.5% of all rural children between the ages of 6 and 14 were enrolled in school. This is the fourth annual survey to report enrollment above 96%, and 83% of all rural 15-16 year olds were enrolled in school. However, going forward, India will need to focus more on quality. Gross enrollment at the tertiary level has crossed 20% (as per an Ernst & Young report cited in January 2013 in Education News/minglebox.com). As per the latest (2013) report issued by the All India Council for Technical Education (AICTE), there are more than 3,524 diploma and post-diploma institutions in the country with an annual intake capacity of over 1.2 million. The AICTE also reported 3,495 degree-granting engineering colleges in India with an annual student intake capacity of over 1.76 million, with actual enrollment crossing 1.2 million. Capacity for management education crossed 385,000, postgraduate degree slots in computer science crossed 100,000, and pharmacy slots reached over 121,000. The total annual intake capacity for technical diplomas and degrees exceeded 3.4 million in 2012. According to the University Grants Commission (UGC), total enrollment in science, medicine, agriculture and engineering crossed 6.5 million in 2010. Charu Sudan Kasturi reported in the Hindustan Times (New Delhi, January 10, 2011) that the number of women choosing engineering has more than doubled since 2001.

We propose an approach to evaluate subjective questions for an online examination system.

WHAT IS E-LEARNING?

E-learning is education delivered via the Internet, a network, or a standalone computer. E-learning is basically the network-enabled transfer of skills and knowledge; it refers to using electronic applications and processes to learn. E-learning applications and processes include Web-based learning, computer-based learning, virtual classrooms and digital collaboration. EL is when content is delivered via the Internet, intranet/extranet, audio or video tape, satellite TV, or CD-ROM. E-learning was first called "Internet-Based Training", then "Web-Based Training"; today you will still find these terms being used, along with variations of e-learning. EL is not only about training and instruction but also about learning that is supported by:


  • Practitioner confidence and skills
  • Learner access and choice
  • Flexible, customizable systems and tools
  • Enabling, cost-effective technical infrastructures
  • Enabling, responsive e-learning policies and processes
  • Institutions using e-learning to widen participation, deliver flexible opportunities, support work-based learning.

RELATED WORK

In recent years a number of government and semi-government examinations have gone online, for example the IBPS Common Written Examination (CWE). Such systems are advantageous in terms of saving resources. However, we have observed that these systems cater only to multiple-choice questions and there is no provision to extend them to subjective questions. We have studied a number of problems because of which these systems cannot be used in board or university examinations where students write subjective answers. These problems are as follows:

  • Students coming from various educational backgrounds have difficulties such as grammatical mistakes and the formation of correct and complete sentences.
  • Natural language is very rich in expressibility; therefore the same meaning can be conveyed in different forms using different sets of words.
  • Similarly, for evaluating answers to the same question, different teachers may have different views, for example in Biology (biology is always full of exceptions).
  • The use of formulas and mathematical expressions is another class of difficulty for these systems.
  • The multiple-choice online examination systems so far have been designed in such a way that a continuous link to the main server has to be maintained, which is a problem in situations where the link gets disturbed or broken (temporarily).

So far we have reviewed a lot of literature related to the above field and have found that: a) questions like who, why, how and when can be asked by the system; (..) a grading system to award marks/grades for the given answer; d) some literature exists on generating simple questions for mathematical problems and formulas. Many architectures and features have been proposed for descriptive answer evaluation. The approaches are mainly based on keyword matching, sequence matching and quantitative analysis, but semantic analysis of descriptive answers is still an open problem. Considering the general structure of text analysis in natural language processing, most of the work has been done for morphological and syntactic analysis [7], [8], [9], [10], but semantic, pragmatic and discourse analysis are still being explored. Online tools that support the management of online assessments, such as Moodle and Zoho, are based on string-matching techniques for short answers, but long-answer evaluation is still handled manually by most systems [9], [10]. Features which are currently available in online assessment are [9], [10]:

  • Question paper setting
  • Online Evaluation of objective type questions
  • Question bank editor
  • Spell checker
  • Grammar checker
  • Report generation of result
  • Descriptive answer evaluation is still an open problem. [6]

A. Benefits of Online Subjective Examination System

  • Can be adopted and implemented quickly
  • Reduce/eliminate faculty time demands in instrument development and grading (i.e., relatively low “frontloading” and “back loading” effort)
  • Objective scoring
  • Provide for externality of measurement (i.e., external validity is the degree to which the conclusions in your study would hold for other persons in other places and at other times – ability to generalize the results beyond the original test group.)

  • Provide norm reference group(s) comparison often required by mandates.

  • Very valuable for benchmarking and cross-institutional comparison studies.

B. Issues in Online Subjective Examination System

  • The main issue with subjective examination is that the explanation, examples and description given by students may use different words, i.e. synonyms, to frame the sentences, yet they must convey the same meaning and the points necessary for the answer to be correct.
  • The second issue is the size or length of the sentences in an answer. The answers vary from person to person, which requires a huge effort to put them into the appropriate category according to context.

PROPOSED MODEL FOR ONLINE SUBJECTIVE EXAMINATION SYSTEM

In this work we propose a solution to the aforementioned problems. Precisely, our system will solve the problem of deducing knowledge represented by partially or grammatically incorrect sentences, interpret the meaning conveyed by the student in different forms and sentences, propose a normalized strategy for grading the answers, and suggest ways to interpret mathematical formulas and expressions; however, our system will be limited to non-mathematical subjects only. The proposed architecture of the system is shown in Figure 1 (Online Subjective Examination System model). In order to calculate the function points from the UML diagrams, we use the sequence diagrams and class diagrams, because these diagrams include the information about all the functions and data manipulated in the system.
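
To make the flow concrete, the following is a minimal Python sketch of the pipeline described above (input answer, tokenization, stop-word removal, stemming, similarity scoring). The function names and the crude placeholder stemmer are our own illustrative assumptions, not part of the published system.

    # Illustrative end-to-end flow of the proposed model; each stage is a
    # simple placeholder that the following sections describe in more detail.
    import re

    STOP_WORDS = {"the", "is", "at", "which", "on", "a", "an", "and", "of", "to"}

    def tokenize(text):
        # Split the character sequence into lower-case word tokens.
        return re.findall(r"[a-z0-9]+", text.lower())

    def remove_stop_words(tokens):
        # Drop common function words that carry little content.
        return [t for t in tokens if t not in STOP_WORDS]

    def stem(tokens):
        # Crude placeholder stemmer: strip a trailing 's'.
        return [t[:-1] if t.endswith("s") else t for t in tokens]

    def evaluate(student_answer, standard_answer):
        # Run both answers through the pipeline and score their keyword overlap.
        x = set(stem(remove_stop_words(tokenize(student_answer))))
        y = set(stem(remove_stop_words(tokenize(standard_answer))))
        return len(x & y) / max(len(x), len(y)) if x and y else 0.0

    print(evaluate("Plants make glucose using light energy",
                   "Using light energy, a plant makes glucose"))

With this crude normalization the two phrasings above receive a score of 1.0 even though they differ in word order and inflection.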

Input User Data

When a student gives an answer, it is taken as input and checked against the database; using this input, we calculate the grade of the student.

Tokenization

Given a character sequence and a defined document unit, tokenization is the task of chopping it up into pieces, called tokens, perhaps at the same time throwing away certain characters, such as punctuation.

Input: Students lend me your ears.
Output: "Students", "lend", "me", "your", "ears"

Tokenization is the act of breaking up a sequence of strings into pieces such as words, keywords, phrases, symbols and other elements called tokens. Tokens can be individual words, phrases or even whole sentences. In the process of tokenization, some characters like punctuation marks are discarded. The tokens become the input for another process like parsing and text mining.
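
As a small illustration of this step (the paper does not prescribe a particular tokenizer, so a simple regular-expression tokenizer is assumed here):

    # Tokenize the sample sentence from the text, discarding punctuation.
    import re

    def tokenize(text):
        # Keep runs of letters, digits and apostrophes as tokens.
        return re.findall(r"[A-Za-z0-9']+", text)

    print(tokenize("Students lend me your ears."))
    # ['Students', 'lend', 'me', 'your', 'ears']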

Stop Word Removal

Stop words are words which are filtered out before or after processing of natural language data (text). There is no single universal list of stop words used by all natural language processing tools, and indeed not all tools even use such a list; some tools specifically avoid removing stop words in order to support phrase search. Any group of words can be chosen as the stop words for a given purpose. For some search engines, these are some of the most common, short function words, such as "the", "is", "at", "which", and "on". In this case, stop words can cause problems when searching for phrases that include them, particularly in names such as 'The Who', 'The The', or 'Take That'. Other search engines remove some of the most common words, including lexical words such as "want", from a query in order to improve performance.
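
A minimal sketch of this filtering step follows; the stop-word list is only a small sample, since, as noted above, there is no single universal list.

    # Remove common function words before further processing.
    SAMPLE_STOP_WORDS = {"the", "is", "at", "which", "on", "a", "an", "and", "of", "to"}

    def remove_stop_words(tokens):
        # Keep only tokens that are not in the stop-word list (case-insensitive).
        return [t for t in tokens if t.lower() not in SAMPLE_STOP_WORDS]

    print(remove_stop_words(["The", "Who", "played", "at", "the", "stadium"]))
    # ['Who', 'played', 'stadium'] -- the band name 'The Who' loses its article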

Stemming the Data

Stemming is the term used in linguistic morphology and information retrieval to describe the process of reducing inflected words to their word stem, base or root form, generally a written word form.
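
The paper does not commit to a particular stemming algorithm; the sketch below assumes NLTK's Porter stemmer as one common choice.

    # Reduce inflected word forms to their stems (requires `pip install nltk`).
    from nltk.stem import PorterStemmer

    stemmer = PorterStemmer()
    words = ["connection", "connected", "connecting", "studies", "studying"]
    print([stemmer.stem(w) for w in words])
    # ['connect', 'connect', 'connect', 'studi', 'studi']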

Using Similarity Formula

If two queries contain similar terms, they denote the same information need. The formula below is used to measure the similarity between two queries:

SimKeyword(x, y) = KW(x, y) / max(KW(x), KW(y))

where KW(x) and KW(y) are the numbers of keywords in the queries x and y respectively, and KW(x, y) is the number of keywords common to the two queries. It is noted that the longer the query, the more reliable it is. Since most user queries are short, this principle alone is not enough; consequently, a second criterion is also used in combination as a complement.
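
A direct implementation of this formula, assuming the keyword sets have already been produced by the tokenization, stop-word removal and stemming steps above:

    def sim_keyword(kw_x, kw_y):
        # SimKeyword(x, y) = KW(x, y) / max(KW(x), KW(y)), where KW(x, y) is the
        # number of keywords common to both queries.
        kw_x, kw_y = set(kw_x), set(kw_y)
        if not kw_x or not kw_y:
            return 0.0
        return len(kw_x & kw_y) / max(len(kw_x), len(kw_y))

    print(sim_keyword({"photosynthesis", "light", "energy", "glucose"},
                      {"photosynthesis", "light", "chlorophyll", "glucose"}))
    # 0.75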

CONCLUSION

Automatic evaluation of subjective questions for an online examination system would be beneficial for universities, schools and colleges for academic purposes by providing ease to faculties and the examination evaluation cell. The proposed model converts the student's subjective answer and the standard answer into graphical form and then applies similarity measures such as string matching, WordNet and a spreading process for the calculation of a similarity score. This proposed model provides a solution for the automation of the subjective answer evaluation process.
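
As a hedged illustration of the WordNet-based measure mentioned above (the paper does not specify an interface; NLTK's WordNet corpus reader is assumed here):

    # Word-level semantic similarity via WordNet path similarity
    # (requires `pip install nltk` and `nltk.download('wordnet')`).
    from nltk.corpus import wordnet as wn

    def word_similarity(word_a, word_b):
        # Return the best path similarity over all sense pairs of the two words.
        scores = [s1.path_similarity(s2)
                  for s1 in wn.synsets(word_a)
                  for s2 in wn.synsets(word_b)
                  if s1.path_similarity(s2) is not None]
        return max(scores, default=0.0)

    print(word_similarity("teacher", "instructor"))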

FUTURE WORK

More analysis would be required for similarity matching. We intend to derive a method to check the domain ontology of two phrases and to find an appropriate technique to minimize the gap between human and automated evaluation.

REFERENCES:

[1]. International Journal of Inventive Engineering and Sciences (IJIES), ISSN: 2319-9598, Volume 1, Issue 9, August 2013.

[2]. Papri Chakraborty, "Developing an Intelligent Tutoring System for Assessing Students' Cognition and Evaluating Descriptive Type Answer", IJMER, pp. 985-990, 2012.

[3]. Mita K. Dalal, Mukesh A. Zave, "Automatic Text Classification: A Technical Review", International Journal of Computer Applications, pp. 37-40, 2011.

[4]. Meghan Kambli, "Report on Online Examination System". Available: http://www.cdacmumbai.in/design/corporate_site/override/pdfdoc/meghareport.pdf

[5]. Jawaharlal Nehru Technical University, Hyderabad, "Examination and Evaluation System". Available:

[6]. http://articles.timesofindia.indiatimes.co/2011-03-17/allahabad/29138537_1_first-paper-examination-centres-boys

[7]. https://en.wikipedia.org/wiki/Education_in_India

[8]. http://www.cs.umd.edu/~nau/cmsc421/part-of-speech-tagging.pdf

[9]. http://en.wikipedia.org/wiki/Academic_grading_in_India

[10]. http://www.cs.rochester.edu/~nelson/courses/csc_173/grammars/parsetrees.html