CATS SCORING AND ITEM ANALYSIS SERVICES

Scantron answer sheets must be scanned by your district prior to sending them to us for item analysis.

Because test development and delivery of selection materials are our highest priorities at CODESP, there may be a wait of up to 5 business days before we are able to scan your answer sheets and return them to your district.

Initial scanning and scoring services with immediate turnaround are not always possible. We will continue to scan for item analysis, and the reports will be sent as quickly as possible.

By doing the initial scoring at your district, you will be able to identify errors before sending the sheets on to us. We have over 740 members, and many are using our scoring services. Many answer sheets are rejected by our scanner due to errors such as incomplete erasures, two answers, incomplete forms, etc. Hand correcting these forms is extremely time consuming and cannot be done by CODESP staff.

Therefore, we will scan your test for item analysis data ONLY after the sheets have been scanned by your district. This allows your district to find and correct errors on the Scantron forms before they reach our office, and to obtain test scores quickly. Scantron forms can be scanned more than once, so you can obtain scoring data prior to sending them to our office for item analysis results. This service can only be provided using Scantron Form F-289-PAR-L or F-289-ERI-L, which can be purchased from Scantron.

To avoid some of the same problems we are having with scoring, make sure that test-takers complete the Scantron sheets correctly, and review the sheets for errors as test-takers turn them in. Requiring test-takers to correct their own answer sheets will save your district time, and fewer answer sheets will be rejected by the scanner.

We hope to expand our services in the future, and we are sorry for any inconvenience this may cause. Item analysis is an important function of testing, and assisting your districts in this area is a priority. We also hope that CODESP members may be able to group together to purchase optical scanners that can be shared by member districts in their area that cannot afford one at this time.

If you have any questions, please do not hesitate to call.

Thank you,

Marianne Tonjes

Executive Director

CODESP

A tutorial follows that explains the reports you will receive:

Item Analysis Tutorial

CODESP now requires that all Scantron test forms be scanned before they are sent to us, to ensure they are completed correctly.

Please follow these steps to have CODESP produce an item analysis for your test:

  1. Before we can provide you with an item analysis, your district must be a member of CODESP and you must already have registered for CATS services on-line.
  2. Confirm that the Scantron form being used is FORM NO. F-289-ERI-L or FORM NO. F-289-PAR-L. This information is located at the bottom center of side 1 of the Scantron form. These are the only forms that CODESP can process.
  3. To order compatible answer sheets so that we can provide your district with item analysis, you can notify SCANTRON at 800-722-6876 ext. 2645 or via the web site at
  4. Look over the forms and make sure that:
     a. The “Test Form” bubbles (a, b, c, d) are not filled in, as our scanner will not read the answer form if these bubbles are filled in.
     b. There is an ID number bubbled in for each candidate, and the length of your candidates’ ID numbers is consistent for the entire batch.
     c. There are no marks near the black bars on the Scantron, as these interfere with reading the forms.
     d. The candidates marked one answer per question and did not skip any questions.
     e. The version is written on a line at the top of the sheet (e.g., Instructional Aide Exam A) and, if there are multiple versions, all like versions are grouped together with a paper clip.
  5. Create a cover letter to send with the Scantron sheets; this should include:
     a. Contact information (name, phone number, and e-mail address)
     b. Date that the test was administered
     c. The name of the exam (e.g., Library Technician)
     d. Number of Scantron sheets that you have sent
     e. Return address
  6. Put the cover letter and all Scantron forms into an envelope and address it to:

CODESP (item analysis)

20422 Beach Blvd, Suite 310

Huntington Beach, CA 92648-4377

  7. Once we receive your information, CODESP will post the item analysis on CATS within 10 business days. The Scantron sheets will then be mailed back to you within three business days of the posting.
  8. To view your item analysis:
     a. Open Internet Explorer and type into the address bar
     b. Click the “Member Login” button
     c. Enter your login and password
     d. Click the “Login” button
     e. This will bring you to your personal welcome screen. Click the “Test Results” hyperlink, which is located on the left-hand side of the page under the heading CATS
     f. Select your test from the drop-down menu labeled “Scored Tests”
     g. Click the “Run Report” button
     h. A new window will open with the item analysis for the selected test.
  9. Interpreting the Standard Item Analysis Report

ITEM ANALYSIS

The Standard Item Analysis Report is a statistical report that provides detailed distractor analysis based on raw scores. The reliability, difficulty, and effectiveness of each question are statistically calculated. If you have a low number of candidates taking the examination, the statistics will NOT be valid. The statistics available will only let you know if your test is too easy or too difficult for that particular group of applicants.

CATS Item Analysis includes: total possible points, applicants in this group, standard deviation, median score, mean score, reliability coefficient (KR20), highest score and lowest score, as well as an individual analysis for each applicant. The Standard Item Analysis Report expands the analysis and provides a point-biserial correlation coefficient (PBCC) as a discrimination value for each distractor. The PBCC is considered to be the single best measure of the effectiveness of a test item. Generally, the higher the PBCC, the better the discrimination, and thus the better the item. Typically, the following criteria may be used:

Point Biserial   Indication

.30 and above    Very good item

.20 to .29       Reasonably good, but subject to improvement

.09 to .19       Marginal item, usually needs improvement

below .09        Poor, to be rejected or improved

A negative PBCC indicates that more candidates in the lower portion of the group responded correctly to the item than in the higher portion. The PBCC is affected by difficulty level, and a small applicant group will distort the number. If you have any questions about your test results, refer to the CODESP Handbook or give CODESP staff a call.
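As an illustration, the PBCC for one item can be computed from candidates' correct/incorrect flags on that item and their total test scores. This is a minimal sketch of the standard point-biserial formula, not CODESP's actual implementation; the function name and input layout are assumptions, and it presumes at least one candidate answered correctly and at least one did not:

```python
import statistics

def point_biserial(item_correct, total_scores):
    """Point-biserial correlation between one item (0/1) and total scores.

    item_correct: list of 0/1 flags, one per candidate.
    total_scores: list of total test scores, same candidate order.
    """
    n = len(item_correct)
    p = sum(item_correct) / n                 # proportion answering correctly
    q = 1 - p
    mean_all = statistics.mean(total_scores)
    sd_all = statistics.pstdev(total_scores)  # population SD of total scores
    mean_correct = statistics.mean(
        s for s, c in zip(total_scores, item_correct) if c == 1
    )
    return (mean_correct - mean_all) / sd_all * (p / q) ** 0.5
```

With this sketch, an item answered correctly only by the higher scorers yields a strongly positive value, matching the interpretation in the table above.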

ITEM ANALYSIS REPORT EXPLANATION

Analysis Available on the Standard Item Analysis Report

Standard Deviation: A measure of variability computed by determining the square root of the variance.

Applicants in Group: The number of Scantrons analyzed.

Median Score: The point at or below which exactly 50% of the scores fall.

Mean Score: The average score.

Reliability Coefficient: The internal consistency of the test items, reported as KR20. A KR20 of 1.00 is perfect reliability.
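For reference, KR20 can be computed from a 0/1 response matrix with one row per candidate and one column per item. This is a hypothetical helper sketching the standard Kuder-Richardson Formula 20, not the routine CATS uses; it assumes the total scores are not all identical:

```python
import statistics

def kr20(matrix):
    """Kuder-Richardson Formula 20 for a 0/1 response matrix.

    matrix: list of candidate rows, each a list of 0/1 item scores.
    """
    k = len(matrix[0])                        # number of items
    totals = [sum(row) for row in matrix]     # total score per candidate
    var_total = statistics.pvariance(totals)  # variance of total scores
    # Sum of p*q over items, where p = proportion correct on the item.
    pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in matrix) / len(matrix)
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_total)
```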

Correct Group Responses

Total: The percentage of the total group answering the item correctly.

Upper 27% of Group: The percentage of the highest-scoring 27% of candidates who answered that question correctly.

Lower 27% of Group: The percentage of the lowest-scoring 27% of candidates who answered that question correctly.
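These two percentages can be illustrated with a short helper that ranks candidates by total score and compares the top and bottom 27% on a single item. The function name, rounding of the group size, and tie handling are assumptions for this sketch:

```python
def group_percentages(item_correct, total_scores, fraction=0.27):
    """Percent correct on one item within the top and bottom scorers.

    item_correct: list of 0/1 flags for the item, one per candidate.
    total_scores: list of total test scores, same candidate order.
    """
    n = len(total_scores)
    g = max(1, round(n * fraction))  # group size, at least one candidate
    # Candidate indices ordered from highest to lowest total score.
    order = sorted(range(n), key=lambda i: total_scores[i], reverse=True)
    upper, lower = order[:g], order[-g:]

    def pct(indices):
        return 100.0 * sum(item_correct[i] for i in indices) / len(indices)

    return pct(upper), pct(lower)
```

An item that discriminates well shows a much higher upper-group percentage than lower-group percentage.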

Point Biserial Coefficient: The point biserial correlation coefficient (PBCC) measures the correlation between the correct answer on an item and the total test score of a candidate.

Difficulty Index: Indicates how the candidate pool as a whole performed on a particular item. The closer the number is to 1.00, the greater the proportion of candidates who answered the question correctly. An item with a lower number (e.g., .20) was more difficult.

Response Frequency: The number of responses to each possible answer: A, B, C, D, and E if used.

Non-distractors: A non-distractor is an answer that zero percent of the group chose. Items with low discrimination will typically have more than one non-distractor, or the response frequencies will be especially low.
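The difficulty index, response frequencies, and non-distractors for one item can be sketched together from the raw answer letters. This hypothetical helper is only an illustration of the definitions above; the name and the fixed A–E choice set are assumptions:

```python
from collections import Counter

def item_summary(responses, key):
    """Difficulty index, response frequencies, and non-distractors.

    responses: list of answer letters chosen by each candidate.
    key: the correct answer letter for the item.
    """
    freq = Counter(responses)                # response frequency per choice
    difficulty = freq[key] / len(responses)  # proportion answering correctly
    # Choices that zero percent of the group selected.
    non_distractors = [c for c in "ABCDE" if freq[c] == 0]
    return difficulty, dict(freq), non_distractors
```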
