HPS 410 Lecture 1

Measurement- Collection of information on which a decision is based. An objective, nonjudgmental process.

Evaluation- Use of measurement in making decisions. Involves interpretation of a score; places a value judgment on the measurement.

Test- Tool utilized to gather data.

Reliability- Consistency of your measurement: the degree to which an instrument measures the same way each time it is used under the same conditions with the same subjects. Reflected in the stability of performance over several trials.

Validity- The strength of our conclusions, inferences, or propositions: the extent to which a test measures what it is supposed to measure. Truthfulness.
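The "stability of performance over several trials" idea behind reliability can be made concrete with a test-retest check: give the same test twice under the same conditions and correlate the two sets of scores. Below is a minimal sketch in Python; the push-up counts and the two-trial design are hypothetical, invented only for illustration.

```python
# Hypothetical test-retest reliability check: Pearson correlation
# between two trials of the same test on the same subjects.
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

# Hypothetical push-up counts for five students on two trials
trial1 = [20, 35, 28, 15, 40]
trial2 = [22, 33, 30, 14, 41]
r = pearson_r(trial1, trial2)
print(round(r, 3))  # near 1.0 -> scores are stable across trials (reliable)
```

A coefficient near 1.0 indicates a stable (reliable) measurement; a reliable test can still be invalid if it consistently measures the wrong thing.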

Purposes of Physical Education/Activity Assessment

Screening and referral decisions

Eligibility and programming decisions

Day-to-day teaching decisions

Student progress and feedback decisions

Sport classification decisions

Day-to-Day Teaching Decisions

• Assist in determining individual adaptations

• Acquaintance with student’s needs and interests to plan best lessons and units

• Continuous assessment throughout the year to monitor progress

Student Progress and Feedback Decisions

• Grading related to goals and objectives

• Large-scale assessment does not exist in most states for physical education

• Grading - often related to participation, dressing, and effort as opposed to content-oriented goals and objectives

Norm-, Criterion-, and Content-Referenced Tests

• Validated for the specific purpose for which they are used

• Administrator must document training and skill in protocol used

• Administered in native language

• More than one procedure is used

Norm-Referenced Tests

• Statistics describe group performance and enable comparisons by age and gender

• Various kinds of norms

– Percentiles

– Standard scores

– Age equivalents
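One of the norm types above, the percentile, can be computed directly from a norm group's raw scores: a student's percentile rank is the percentage of the norm group scoring at or below the student. A minimal sketch, using hypothetical sit-up counts for a made-up norm group:

```python
# Hypothetical sketch: percentile rank of a raw score within a norm group,
# defined here as the percentage of norm scores at or below that score.
def percentile_rank(score, norm_scores):
    at_or_below = sum(1 for s in norm_scores if s <= score)
    return 100.0 * at_or_below / len(norm_scores)

# Hypothetical sit-up counts from a norm group of ten students
norms = [12, 18, 20, 22, 25, 27, 30, 33, 35, 40]
print(percentile_rank(27, norms))  # 60.0 -> at or above 60% of the norm group
```

Published norm-referenced tests tabulate these ranks separately by age and gender so that comparisons are made against similar peers.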

Criterion-Referenced Tests

• Measure mastery learning and/or assess achievement of developmental milestones, mature movement patterns, and minimal fitness levels

• Data are pass/fail rather than numerical scores

• Emphasis on process rather than product

• Criteria can be written as task analysis

Content-Referenced Tests

• Teacher-made tests designed to measure what is being taught

• Assess where a student falls on a continuum

• Change to criterion-referenced when a designated score is required to pass

• Used in curriculum-embedded instruction

Authentic Assessment

This is an assessment done in a "real-life" setting, as opposed to a more "sterile" testing situation.

A rubric is a rating scale and list of criteria by which student knowledge, skills, and/or performance can be assessed.

Assignment for Thurs., 9/1/11:
Do a web search and find an example of norm, criterion, authentic (tests and/or assessments) and rubrics and bring to class to discuss.
*Quiz next Thurs 9/8/11

Formal Assessment Instruments Most Commonly Used

• Test of Gross Motor Development-2

• Bruininks-Oseretsky Test of Motor Proficiency, Short Form

• FitnessGram

• Brigance Diagnostic Inventory of Early Development

• Brockport Physical Fitness Test

Planning Assessment

• Establish specific purpose

– Screening and referral

– Diagnosis and placement

– Instruction and student progress

• Relating assessment to goals and variables

– Relate to goals of the school system and/or teacher

Planning Assessment

• Using criteria to select instruments

– Validity - extent to which a test measures what it is supposed to measure

• Content, criterion, construct validity

– Reliability

• Stability of performance over several trials

• Internal consistency - consistent responses

– Objectivity - similar ratings across observers

Planning Assessment

• Reviewing available instruments

– Purpose, age range, validity, reliability, and objectivity of a variety of instruments

– Mental Measurements Yearbook

• Selecting instruments

– High validity and reliability

– Mixing items from different sources changes validity and reliability

Planning Assessment

• Determining the setting

– Depends on purpose

– Which setting will elicit the best performance

• Determining environmental factors

– Factors that affect student

– Factors that affect test administrator

Formal Test Administration Procedures

• Document formal training and competence

• Focus on functional competence

• Demonstrate competence in several contexts

• Frequent administration

• Videotape test performance

• Record keeping

• Avoid bias

• Develop rapport with test administrators

• Minimize test anxiety

Interpreting Data and Recommending Services

• Interpretation and writing results

• Statistics and computer competencies

• Understanding normal curve theory

Normal Curve Theory

• Mean, median, and mode

• Standard deviations

• Applications

• Standard scores (z and T conversions)

• Conversions

• Norms
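The z and T conversions listed above follow directly from the mean and standard deviation: z = (score - mean) / SD, and T = 50 + 10z, which rescales to a mean of 50 and SD of 10. A minimal sketch with hypothetical raw scores:

```python
# Sketch of the standard-score conversions: z = (score - mean) / sd,
# and T = 50 + 10z (T scale has mean 50, SD 10).
from math import sqrt

def mean_sd(scores):
    """Mean and (population) standard deviation of a score list."""
    m = sum(scores) / len(scores)
    sd = sqrt(sum((s - m) ** 2 for s in scores) / len(scores))
    return m, sd

def z_score(score, m, sd):
    return (score - m) / sd

def t_score(score, m, sd):
    return 50 + 10 * z_score(score, m, sd)

scores = [40, 50, 60, 70, 80]  # hypothetical raw scores
m, sd = mean_sd(scores)        # m = 60, sd ≈ 14.14
print(round(z_score(80, m, sd), 2))  # 1.41 -> about 1.4 SD above the mean
print(round(t_score(80, m, sd), 1))  # 64.1 -> same score on the T scale
```

Because z and T are linear conversions of the same raw score, they rank students identically; T scores are often preferred for reporting because they avoid negative values and decimals.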

Rubrics

• Criteria used to specify the elements of performance that can be used to determine strengths and weaknesses

• Various uses

– Evaluate performance

– Checklist for screening

– Informal assessment

– Programming
