Online Subjective Answer Verifying System Using Artificial Intelligence

Authors: Prof. Priyadarshani Doke, Priyanka Gangane, Kesia S Babu, Pratiksha Lagad, Neha Vaidya

DOI Link: https://doi.org/10.22214/ijraset.2022.47543


Abstract

Every year, educational institutes conduct various examinations, including institutional and non-institutional competitive exams. Nowadays, online tests and examinations are becoming popular as a way to reduce the burden of the examination evaluation process. However, these online exams include only objective or multiple-choice questions; subjective questions and answers are excluded because of the complexity and efficiency of the evaluation process. An automatic answer checker application that checks written answers and allocates marks much like a human examiner would be very helpful in the current era. Hence, software applications built to check subjective answers may be useful for allocating marks to the user after verifying the answers in an online examination.

I. INTRODUCTION

Online examinations are beneficial to users: at present, online exams are based on objective questions, and exams are being digitized everywhere. In this scenario, exam questions can even be based on subjective answers, meaning that traditional pen-and-paper tests are replaced by computer-based tests that have proven to be (i) more consistent in allocating marks and (ii) faster than teachers correcting papers. Traditional exams usually consist of subjective answers, which is not the best way of grading a student's understanding of the subject, because examiners sometimes get bored while checking many answer sheets, and false evaluations may increase. Evaluating such questions using computers is a tricky task, mainly because natural language is ambiguous. Several preprocessing steps, such as cleaning the data and tokenization, must be performed before working on it.
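
The preprocessing mentioned above can be as simple as lowercasing, removing punctuation, tokenizing, and dropping stopwords. The following is a minimal, illustrative sketch using only the Python standard library; the stopword list and function name are assumptions for illustration, not taken from the paper.

```python
# Minimal preprocessing sketch (cleaning + tokenization) using only the standard
# library. The stopword list and function name are illustrative assumptions;
# real systems often use NLTK or spaCy instead.
import re

STOPWORDS = {"a", "an", "the", "is", "are", "of", "to", "and", "in", "on", "for"}

def preprocess(text: str) -> list[str]:
    """Lowercase, strip punctuation, tokenize on whitespace, drop stopwords."""
    text = text.lower()
    text = re.sub(r"[^a-z0-9\s]", " ", text)  # remove punctuation and symbols
    tokens = text.split()                     # simple whitespace tokenization
    return [t for t in tokens if t not in STOPWORDS]

print(preprocess("Natural language is ambiguous, so cleaning and tokenization come first."))
# ['natural', 'language', 'ambiguous', 'so', 'cleaning', 'tokenization', 'come', 'first']
```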

II. BACKGROUND AND LITERATURE REVIEW

As mentioned before, the evaluation of subjective answers is not a new idea; it has been worked on for almost two decades. Various techniques have been applied to this problem, such as big-data Natural Language Processing, Latent Semantic Analysis, the Bayes theorem, K-nearest-neighbour classifiers, and even formal techniques such as Formal Concept Analysis. They fall into three main categories: Statistical, Information Extraction, and Full Natural Language Processing.

A. Technical Background

  1. Statistical Technique: It is based on keyword matching and is considered weak, as it cannot handle problems such as synonyms or take context into account. Several works on subjective paper evaluation have used this approach (a minimal keyword-matching sketch follows this list).
  2. Information Extraction (IE) Technique: Information Extraction techniques depend on extracting a structure or a pattern from the text so that it can be broken into concepts and their relationships. The dependencies found play a significant role in producing scores and need to be confirmed by a domain expert.
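
As a concrete illustration of the statistical (keyword-matching) technique and its weakness with synonyms, consider this minimal sketch; the set-overlap scoring rule is an assumption for illustration and not taken from any of the surveyed systems.

```python
# Illustrative keyword-matching scorer: the student's answer is scored by the
# fraction of model-answer words it contains. The set-overlap rule is an
# assumption for illustration, not taken from any surveyed system.

def keyword_score(model_answer: str, student_answer: str) -> float:
    model_keywords = set(model_answer.lower().split())
    student_words = set(student_answer.lower().split())
    if not model_keywords:
        return 0.0
    return len(model_keywords & student_words) / len(model_keywords)

print(keyword_score("a car engine converts fuel into motion",
                    "a vehicle engine converts fuel into motion"))
# ~0.86: "vehicle" earns no credit even though it means "car" -- the synonym problem
```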

B. Literature Survey

  1. Online Subjective Answer Verifying System Using Artificial Intelligence (2021)

Authors: Jagadamba G, Chaya Shree G. Organizations and educational institutes have always depended on grading through examinations; however, most of these examinations are objective. Such systems are advantageous in terms of saving resources but fail to include subjective questions [1, 9, 10]. This paper attempted to evaluate descriptive answers, with the evaluation done through graphical comparison against a standard answer.

2. Subjective Answers Evaluation Using Machine Learning and Natural Language Processing (2021)

Authors: Hamza Arshad, Abdul Rehman Javed. Various methods have been used for subjective answer evaluation in the past, and their shortcomings have been examined. This paper proposes a new approach to the problem: a machine learning classification model is trained with the help of results obtained from a result-prediction module, and the trained model is then used to reinforce results from the prediction module, leading to a fully trained machine learning model.

3. Tool for Evaluating Subjective Answers using AI (TESA) (2021)

Authors: Shreya Singh, Omkar Manchekar, Ambar Patwardhan. All the reviewed studies show that there are various techniques for evaluating subjective answer sheets. The advantage of this system is that it uses a weighted average of the most accurate techniques to provide an optimized result. TESA is a systematic and reliable system that eases the role of evaluators and provides faster and more efficient outputs.

4. ASSESS – Automated subjective answer evaluation using Semantic Learning

Authors: Nidhi Dedhia, Kunal Bohra, Prem Chandak. This automated approach is beneficial when students need to be assessed online for self-improvement. The system gives special emphasis to specially-abled users through speech-based usability features, filling gaps by providing audio facilities such as listening to the questions and answering them verbally. The advantage of this system is that it is near completion, has improved performance, and caters to a very large audience.

5. Automated Answer-Checker

Authors: Vasu Bansal, M.L. Sharma, Krishna Chandra Tripathi. The proposed system could be of great utility to educators whenever they need to conduct a quick test for revision purposes, as it saves time and the trouble of evaluating bundles of papers.

This system would be beneficial for universities, schools, and colleges for academic purposes by easing the work of faculty and the examination evaluation cell.

6. Online Subjective Answer Checker

Authors: Merien Mathew, Ankit Chavan, Siddharth Baikar. The project entitled "Online Subjective Answer Checker" has been developed with care so that it is free of errors while remaining efficient and less time consuming. Importantly, the system is robust, provision is made for future development, and the entire system is secured. This online system will be approved and implemented soon.

III. PROPOSED METHODOLOGY

Each answer coming from a student is evaluated against the answer provided by the teacher, using the same pre-processing steps for both.

  1. Question Paper: When a teacher logs in to their ID, the following three choices are presented (a minimal data-model sketch follows this list):

a. Select from the two sets of papers

b. Select the number of questions

c. Choose the level of complexity

2. Answer sheet

3. Evaluation
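
As referenced above, the question-paper setup can be captured in a small data model. The sketch below is hypothetical: the class name, field names, and complexity labels are assumptions for illustration, not definitions from the paper.

```python
# Hypothetical data model for the question-paper setup above; the class name,
# field names, and complexity labels are assumptions for illustration only.
from dataclasses import dataclass
from typing import Literal

@dataclass
class QuestionPaperConfig:
    paper_set: Literal[1, 2]                        # (a) one of the two sets of papers
    num_questions: int                              # (b) number of questions
    complexity: Literal["easy", "medium", "hard"]   # (c) level of complexity

config = QuestionPaperConfig(paper_set=1, num_questions=5, complexity="medium")
print(config)
```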

Algorithm:

Naive Bayes Classifier: The Naive Bayes classifier is one of the simplest and most effective classification algorithms; it helps in building fast machine learning models that can make quick predictions.
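
Below is a minimal sketch of how a Naive Bayes classifier might be applied in this setting, assuming scikit-learn and a toy labelled set of answers; the data, labels, and the idea of classifying answers into "good"/"poor" bands are illustrative assumptions, not the paper's actual training setup.

```python
# Toy Naive Bayes sketch with scikit-learn: short answers are labelled
# "good"/"poor" and MultinomialNB is trained on bag-of-words counts.
# The data, labels, and labelling scheme are illustrative assumptions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

answers = [
    "photosynthesis converts sunlight into chemical energy in plants",
    "plants use sunlight water and carbon dioxide to make glucose",
    "i do not know the answer",
    "it is about plants maybe",
]
labels = ["good", "good", "poor", "poor"]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(answers)      # bag-of-words feature matrix
model = MultinomialNB().fit(X, labels)

new_answer = ["sunlight is converted into chemical energy by plants"]
print(model.predict(vectorizer.transform(new_answer)))  # expected: ['good']
```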

Natural Language Processing: This is the technology used by machines to understand, analyse, manipulate, and interpret human language. It helps developers organize knowledge for tasks such as translation, automatic summarization, Named Entity Recognition (NER), relationship extraction, and topic segmentation.
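
To connect NLP to the evaluation step described above (comparing a student's answer with the teacher's answer after the same pre-processing), here is a hedged sketch using TF-IDF vectors and cosine similarity via scikit-learn; the proportional scaling to max_marks is an assumption for illustration, not the paper's exact scoring formula.

```python
# Hedged sketch of the comparison step: TF-IDF vectors + cosine similarity
# between the teacher's answer and the student's answer, scaled to max_marks.
# The proportional scaling rule is an assumption, not the paper's exact formula.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def evaluate(teacher_answer: str, student_answer: str, max_marks: int = 10) -> float:
    vectors = TfidfVectorizer().fit_transform([teacher_answer, student_answer])
    similarity = cosine_similarity(vectors[0], vectors[1])[0, 0]  # value in [0, 1]
    return round(similarity * max_marks, 2)

teacher = "A compiler translates source code written in a high level language into machine code."
student = "A compiler converts high level source code into machine code."
print(evaluate(teacher, student))  # marks proportional to textual similarity
```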

A. UML Diagrams of Proposed System

[Figure: UML diagrams of the proposed system]

Conclusion

In this paper, we design an online subjective answer verifying system using artificial intelligence for sectors such as schools, colleges, and universities. Hence, the proposed system could be of great utility to educators whenever they need to conduct a quick test for revision purposes, as it saves time and the trouble of evaluating bundles of papers.

References

[1] G. Jagadamba and C. Shree G., "Online Subjective Answer Verifying System Using Artificial Intelligence," 2020 Fourth International Conference on I-SMAC (IoT in Social, Mobile, Analytics and Cloud) (I-SMAC), 2020, pp. 1023-1027.
[2] M. F. Bashir, H. Arshad, A. R. Javed, N. Kryvinska and S. S. Band, "Subjective Answers Evaluation Using Machine Learning and Natural Language Processing," IEEE Access, vol. 9, pp. 158972-158983, 2021.
[3] S. Singh, O. Manchekar, A. Patwardhan, U. Rote, S. Jagtap and H. Chavan, "Tool for Evaluating Subjective Answers using AI (TESA)," 2021 International Conference on Communication Information and Computing Technology (ICCICT), 2021.
[4] E. Johri, N. Dedhia, K. Bohra, P. Chandak and H. Adhikari, "ASSESS - Automated Subjective Answer Evaluation Using Semantic Learning," Proceedings of the 4th International Conference on Advances in Science & Technology (ICAST2021), May 7, 2021.
[5] V. Bansal, M. L. Sharma and K. C. Tripathi, "Automated Answer-Checker," International Journal for Modern Trends in Science and Technology, vol. 6, no. 12, pp. 152-155, December 2020, DOI: 10.46501/IJMTST061229.

Copyright

Copyright © 2022 Prof. Priyadarshani Doke, Priyanka Gangane, Kesia S Babu, Pratiksha Lagad, Neha Vaidya. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Paper Id : IJRASET47543

Publish Date : 2022-11-19

ISSN : 2321-9653

Publisher Name : IJRASET
