How can you Track Learners’ Responses to Assessments in LMS?

A Learning Management System (LMS) is used to track learners’ training progress. This tracked information is very useful for evaluating both the training and the learners’ performance. However, an LMS does not track everything you may need; the assessment information that is tracked depends on the type of training content. For example, if a document is added to the training activities, the LMS can track whether the learner has opened the document, but not whether he or she has read all of its pages.

If you want to track whether a learner has read all the pages, you may need to convert the document into a SCORM- or AICC-compliant course.
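To illustrate what such a conversion enables, here is a minimal sketch of how a SCORM-compliant course could report page-level reading progress to the LMS. Only the `cmi.*` element names come from the SCORM 1.2 data model; the `MockScormApi` class, `reportPageViewed` helper and ten-page loop are illustrative assumptions (a real course would locate the LMS-provided `window.API` object instead).

```typescript
// Minimal sketch of SCORM 1.2 page-progress reporting.
// MockScormApi stands in for the LMS-provided window.API object.
interface ScormApi {
  LMSInitialize(param: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSGetValue(element: string): string;
  LMSFinish(param: string): string;
}

class MockScormApi implements ScormApi {
  data: Record<string, string> = {};
  LMSInitialize(_param: string): string { return "true"; }
  LMSSetValue(element: string, value: string): string {
    this.data[element] = value;
    return "true";
  }
  LMSGetValue(element: string): string { return this.data[element] ?? ""; }
  LMSFinish(_param: string): string { return "true"; }
}

// Record the last page viewed; mark the course completed once the
// learner has opened every page.
function reportPageViewed(api: ScormApi, page: number, totalPages: number): void {
  api.LMSSetValue("cmi.core.lesson_location", String(page));
  api.LMSSetValue(
    "cmi.core.lesson_status",
    page === totalPages ? "completed" : "incomplete"
  );
}

const api = new MockScormApi();
api.LMSInitialize("");
for (let page = 1; page <= 10; page++) {
  reportPageViewed(api, page, 10);
}
api.LMSFinish("");
console.log(api.LMSGetValue("cmi.core.lesson_status")); // "completed"
```

Because the status only flips to "completed" on the final page, the LMS can now distinguish a learner who merely opened the document from one who reached the end.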


We have good experience in setting up LMSs for our customers and meeting their specific needs using an open-source LMS. Let us discuss some of our customers’ needs regarding assessment tracking.

Requirement 1:

  • The customer has a SCORM-compliant assessment, developed in-house using Lectora.
  • The assessment needs to be given to learners after successful completion of the eLearning course.
  • The customer wants to track individual scores and the number of attempts.
  • The customer wants to track user responses for each question.


Challenges:

  • The assessment is already created in the SCORM format, and the customer does not want to recreate it and incur additional cost.
  • Tracking user responses for each question.


This is a very common requirement. A SCORM-compliant assessment works perfectly on the LMS, as the LMS tracks the score and the number of attempts automatically.

LMS to track assessments

The challenge is to track the user responses for each question without recreating the assessment. As the solution, we asked our customer to republish the course using the SCORM interactions option, which is available in all authoring tools. With this option, the LMS records each question in its database and displays it, as shown in the figure below, under the “Other Tracks” section of the assessment report. Each question has separate tracking information, such as the correct response and the user’s response.
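The republished course reports each question through the SCORM interactions part of the runtime data model. The sketch below shows roughly what those calls look like; only the `cmi.interactions.*` element names are from the SCORM 1.2 data model, while the in-memory store, the `reportInteraction` helper and the sample question ids are illustrative assumptions.

```typescript
// Sketch of the runtime calls a course republished with the SCORM
// interactions option makes for each answered question.
// lmsData is a stand-in for the LMS database.
const lmsData: Record<string, string> = {};

function LMSSetValue(element: string, value: string): string {
  lmsData[element] = value;
  return "true";
}

// Report one answered question: its id, the correct answer,
// the learner's answer, and whether the result was correct or wrong.
function reportInteraction(
  index: number,
  questionId: string,
  correctResponse: string,
  learnerResponse: string
): void {
  const prefix = `cmi.interactions.${index}`;
  LMSSetValue(`${prefix}.id`, questionId);
  LMSSetValue(`${prefix}.type`, "choice");
  LMSSetValue(`${prefix}.correct_responses.0.pattern`, correctResponse);
  LMSSetValue(`${prefix}.student_response`, learnerResponse);
  LMSSetValue(
    `${prefix}.result`,
    correctResponse === learnerResponse ? "correct" : "wrong"
  );
}

reportInteraction(0, "Q1_fire_safety", "b", "b"); // answered correctly
reportInteraction(1, "Q2_first_aid", "c", "a");   // answered incorrectly
console.log(lmsData["cmi.interactions.1.result"]); // "wrong"
```

This is exactly the correct-response/user-response pairing that surfaces in the “Other Tracks” section of the report.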

Separate tracking information

This approach does track the user responses, but the raw data is very technical and difficult to analyze. Instructors have to generate reports from this information manually.

Requirement 2:

  • The customer has questions in an MS Word document, which they use to conduct assessments in classroom training sessions. They now want these questions included in the LMS, so that participants can take the assessment after a classroom training session.
  • The assessment needs to be given to learners after successful completion of an ILT course.
  • The customer wants to track individual scores and the number of attempts.
  • The customer wants to track user responses for each question.
  • The LMS should provide analytical reports on the assessment, and instructors should be able to download the reports as Excel sheets.


Challenges:

  • Creating the paper-based questions in the LMS.
  • Generating analytical reports that help instructors evaluate without much effort.


We created the paper-based questions in the LMS easily, as it supports various question types. For a few questions, we had some difficulty displaying them exactly as they appeared on paper. Our client accepted this limitation of the application, since the on-screen questions cannot look exactly the same as their paper versions, and the questions created in the LMS matched the question objectives exactly.

After the successful completion of the training, they generated the following reports, which gave them the analytical information they required.

In addition to tracking the score and the number of attempts, the LMS records each learner’s response to every question. You can see whether each question was answered correctly and the score secured for it.

Score tracking and number of attempts

In the following report, you can find the number of participants in each grade range.

How many participants are there in each grade

In the following report, you can see more analytical information, generated automatically by the LMS, based on the user responses.

Analytical information

Facility index: the average score on the question, displayed as a percentage. A high facility index means the question is very easy.

Discriminative efficiency: a measure of how well the question separates high- and low-performing learners, based on the covariance between the question score and the whole-test score. A question with a very high facility index is unlikely to achieve a high covariance, because almost everyone answers it correctly.

In addition to all these reports, the LMS also provides details for each question, such as the number of attempts, the standard deviation and the random guess score.
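For readers curious how such per-question statistics are derived, here is a rough sketch of the two simplest ones. The formulas follow the usual definitions stated above; the five sample scores and function names are made up for illustration, and a real LMS computes these from its stored attempt data.

```typescript
// Sketch of two per-question statistics from the assessment report.
function mean(xs: number[]): number {
  return xs.reduce((a, b) => a + b, 0) / xs.length;
}

// Facility index: average score on the question as a percentage of the
// maximum score. High values mean the question is easy.
function facilityIndex(questionScores: number[], maxScore: number): number {
  return (mean(questionScores) / maxScore) * 100;
}

// Standard deviation of the question scores, also shown in the report.
function standardDeviation(xs: number[]): number {
  const m = mean(xs);
  return Math.sqrt(mean(xs.map((x) => Math.pow(x - m, 2))));
}

// Scores of five learners on one 1-mark question: three correct, two wrong.
const scores = [1, 1, 0, 1, 0];
console.log(facilityIndex(scores, 1)); // 60
console.log(standardDeviation(scores).toFixed(2));
```

A facility index of 60% says that, on average, learners secured 60% of the marks on this question; the standard deviation shows how widely their scores varied.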

The only difference between these two types of assessments is that the first was developed using an external authoring tool, while the second was developed using the LMS assessment builder. Both meet the basic training requirement, but the second option gives very detailed information because it is created within the LMS. I hope you can now decide how to track assessment responses in the LMS and what information you can track for further analysis.

View Presentation on Generating Reports in an LMS to Review Training