Peer Reviewing Checklist for the L.A.P.

for use during in-class peer editing sessions

  1. Are the highlighting and links that were designed to help the proctor still on the document?
  2. Is the student demographic data on the first page completely filled in?
  3. The student description should be factual (avoiding subjective terms such as cute, smart, or charming) and should include data you collect from all of your visits as you discover more about the student. You can use words such as introverted (or shy) and extroverted (or talkative), because these describe behavior you can observe while working with a student and are not as subjective as words like "good" or "sweet." Well behaved, active, engaged, attentive, easily frustrated, etc. are all observable behaviors that do not require a judgment call on your part.
  4. The school and class description should include the school's student population demographic data and be as detailed as possible.
  5. Is the report free of the personal pronouns I, my, me, and we? For example, instead of "I felt the student read the passage with prosody," write, "The student read the passage with prosody"; instead of "The student read his book to me with few errors," write, "The student read the book with few errors."
  6. Are the testing dates listed in the date column?
  7. Are each test's purpose, administration, scoring, and indications fully explained so that a parent could understand the test and the scores given?
  8. Are there indications of what the numerical data in each test mean? Reading levels (frustrational, instructional, independent)? For example, explain what the Garfield test data indicated so parents know what those numbers mean; the raw scores alone will tell them nothing. Tell how the attitude survey and the interest inventory either backed up or contradicted one another.
  9. Are there anecdotal notes (in the right-hand column) that indicate student reactions, struggles, testing conditions, and other facts pertinent to the test given?
  10. Has the document been thoroughly proofread for punctuation? Spelling? Grammar?
  11. Do the comprehension question scores indicate inferential or explicit questions missed?
  12. Is there a general statement about listening observations from ESRI 5 & 6?
  13. Do the additional tests given (including optional tests) have a justification in the right-hand column for why they were selected?
  14. Do the running records contain all the information needed?
      - Text or test used and grade level?
      - WPM rate? Meets/does not meet grade level?
      - CWPM rate? Meets/does not meet grade level?
      - Comprehension question score? Meets/does not meet grade level?
      - Reading level determined (frustrational, instructional, independent)?
  15. Do the last three boxes contain areas of testing that do and do not match the current grade-level context, and suggestions to help the student improve?