DIRECT MEASUREMENT OF STUDENT OUTCOMES

WITH A PIPE NETWORK DESIGN PROGRAM

John Finnie & Neil Fennessey

Department of Civil & Environmental Engineering

University of Massachusetts Dartmouth

INTRODUCTION

Direct measurement has become an essential part of the assessment and evaluation of ABET Student Outcomes. One way to accomplish a direct measurement is to document student performance on specific parts of an assignment. These scores can then be compared over time and conclusions drawn regarding attainment of various student outcomes. The objective of this paper is to show how a Communication Rubric for grading student design projects can be used to provide direct measurements of attainment of student outcomes.

DIRECT MEASUREMENT OF STUDENT OUTCOMES

The ABET web site (WWW.ABET.ORG) provides information about accreditation, including the documents entitled “Criteria for Accrediting Engineering Programs” for specific Accreditation Cycles (school years). A comparison of these criteria reveals that “direct measures” for assessment are not specifically mentioned until 2011-2012. The definition section for the 2010-2011 criteria defines Assessment as follows [1].

“Assessment is one or more processes that identify, collect, and prepare data to evaluate the achievement of program outcomes and program educational objectives.”

However, the 2011-2012 criteria document provides the following definition [2].

“Assessment is one or more processes that identify, collect, and prepare data to evaluate the attainment of student outcomes and program educational objectives. Effective assessment uses relevant direct, indirect, quantitative and qualitative measures as appropriate to the objective or outcome being measured. Appropriate sampling methods may be used as part of an assessment process.”

This latter definition was proposed for the “EAC Harmonized General Criteria” on November 1, 2008. It was formally adopted for the 2011-2012 accreditation cycle at the ABET meeting October 30, 2010 [3].

ABET does not provide a specific definition of a direct measure, nor does it publish procedures for the use of direct measures as assessment tools. However, ABET does provide a number of venues for learning about the assessment process and preparing for an accreditation visit. These include an ongoing series of webinars, workshops, conferences, and newsletters, as well as its website. Some of the webinars are available at no cost, as are the newsletters and website.

One definition of a direct measure was provided by Community Matters, ABET's monthly newsletter, which began a column about assessment in August 2006. This series addressed a number of assessment topics, including direct measures of learning. In a column in Community Matters, Gloria Rogers presented a list of direct and indirect assessment techniques [4]. The list of indirect measures included the familiar techniques of written surveys, questionnaires, and interviews. The list of direct measures included portfolios, local exams, oral exams, standardized exams, and external examiners. In this paper, a different measure will be explored. A rubric for grading technical reports will be used to directly measure attainment of a student outcome.

THE COMMUNICATION RUBRIC

A previous accreditation visit criticized our process for evaluating technical communications. In response to this criticism, our Communication Rubric was developed as a systematic way of grading student reports. It has been utilized to grade student reports in three specific undergraduate courses: freshman computer graphics, our second course in water resources engineering, and our second course in environmental engineering. A section of the Communication Rubric is given below. The full text of our Communication Rubric is available on our web site at: http://www.umassd.edu/engineering/cen/undergraduate/programoutcomes/

The Communication Rubric includes sections for Written Content, Technical Content, and Oral Presentation. For example, the section on written content is presented below.

Written Content Grade ____

Content and integration of information from sources (journals, manuals, etc.) ( %)

______1. All ideas presented support and develop the topic.

______2. Project reflects insight into and understanding of the subject matter.

______3. Ideas are stated clearly and are developed fully with specific supporting details from the specifications or technical literature.

______4. Effectively uses examples, paraphrases, or summaries from the literature concerning the subject matter, not just quotations.

______5. Work reflects a sufficient review of the applicable Codes, specifications and/or technical literature.

Structure and Form ( %)

______1. Abstract is succinct and clear.

______2. Table of Contents is correct and logical.

______3. Introduction engages reader, explains project and gives clear sense of direction.

______4. Logical, structured body guides reader through ideas, using topic sentences, etc.

______5. Conclusion gives sense of rounding off and wrapping up without feeling repetitive, rushed, or unfamiliar.

______6. Demonstrates proper and effective paragraphing.

______7. Uses appropriate transitional words and phrases between paragraphs and sentences.

______8. Meets required length, if specified.

Grammar, Usage, and Mechanics ( %)

______1. Contains few or no errors in grammar and usage.

______2. Word choice is appropriate to professional writing.

______3. Contains few or no errors in spelling, capitalization, and punctuation.

______4. Shows clear evidence of proofreading and use of a spellchecker.

Format ( %)

______1. Typed – black ink in 12-point standard font (Times New Roman or similar)

______2. Follows specified line spacing (e.g., single, 1.5 or double-spaced).

______3. Follows specified page margins (e.g., 1-inch margins all around.)

______4. Pages numbered at page bottom, center.

______5. Follows other formatting requirements specific to course/project (i.e., title page, etc.)

______6. Citation of facts, tables, figures, quotations, etc.

Quotations: lengthy quotations block-style indented 1 inch and single-spaced; source and page number provided for quotations.

Source citation in correct format: e.g., Fennessey (2004)

______7. Citation/Reference list is complete, accurate, and in specified format (ASCE, TRB, etc.)

In a similar fashion, the section on technical content addresses Technical Approach, Design Calculations, and Drawings and Supporting Graphics. The section on oral presentation (if applicable) addresses Appearance of Presenters, Oral Presentation, Subject Matter Presentation, and Post-Presentation Questions and Answers. Under each of these categories, a list of positive attributes helps the instructor to decide upon a numerical score for that category.

Our Communication Rubric has been utilized to evaluate technical communication projects since 2004 in at least one of our courses. However, its use as a direct measurement of student outcomes was not considered until recently.

THE DESIGN PROJECT

The team design project used for this example requires students to increase the capacity of the domestic water pipeline system for a fictional city. Students are required to enlarge some of the pipes to increase capacity for fighting fires, to specify dimensions, volume, and elevations for a new city water tank, and to specify horsepower, flow rate, and pressure (i.e., a “pump curve”) for a new pump to supply the new system. A copy of the specific design project is presented in the appendix.

STUDENT OUTCOMES

Our program has adopted the “a through k” student outcomes as presented in the 2010-2011 criteria [1]. We have also added an outcome which addresses application of codes and regulations. For this particular course, we wish to assess attainment of the following student outcome: (g) an ability to communicate effectively.

RESULTS OF THE COMMUNICATION RUBRIC

Table 1 presents an example using scores for thirteen teams on the Written Content section of the Communication Rubric. This section accounts for 40% of the project grade. We have assumed the following weights for each category of this section.

·  Content and integration of information from sources (journals, manuals, etc.) (8 %)

·  Structure and Form (12 %)

·  Grammar, Usage, and Mechanics (12 %)

·  Format (8 %)

Figure 1 presents the results of the Written Content section of the Communication Rubric. A rating of “Exceeds Criteria” requires a score of 90% or above. A score below 90% but at least 80% receives a rating of “Meets Criteria”. A score below 80% but at least 70% receives a rating of “Progressing to Criteria”. Any score below 70% receives a rating of “Below Expectations”. These limits and their descriptions were suggested by Gloria Rogers in a column in Community Matters [5]. The number of these ratings, their limits, and their names are not prescribed by ABET; each program should decide them for itself.

To determine the team ratings in Figure 1, the numerical scores in Table 1 were first divided by the total possible points. For example, in the category “Content & Sources”, Team 1 scored 0.05 out of a possible 0.08, a fraction of 0.625. Since this is below 70%, the team received a rating of “Below Expectations” in this category. Figure 1 summarizes the performance of the teams in each of the four categories. Under “Content & Sources”, one team received a rating of “Exceeds Criteria”, three teams received “Meets Criteria”, five teams received “Progressing to Criteria”, and four teams received “Below Expectations”.

Table 1 Summary of grades by category

Team / Content & Sources / Structure & Form / Grammar, Usage & Mechanics / Format / Sum
Weight / 0.08 / 0.12 / 0.12 / 0.08 / 0.40
1 / 0.05 / 0.09 / 0.12 / 0.07 / 0.33
2 / 0.05 / 0.11 / 0.10 / 0.08 / 0.34
3 / 0.06 / 0.09 / 0.09 / 0.08 / 0.32
4 / 0.07 / 0.10 / 0.12 / 0.07 / 0.36
5 / 0.06 / 0.09 / 0.11 / 0.06 / 0.32
6 / 0.07 / 0.10 / 0.11 / 0.08 / 0.36
7 / 0.05 / 0.11 / 0.11 / 0.07 / 0.34
8 / 0.06 / 0.10 / 0.12 / 0.07 / 0.35
9 / 0.05 / 0.08 / 0.12 / 0.08 / 0.33
10 / 0.06 / 0.11 / 0.10 / 0.06 / 0.33
11 / 0.08 / 0.12 / 0.11 / 0.08 / 0.39
12 / 0.06 / 0.11 / 0.12 / 0.06 / 0.35
13 / 0.07 / 0.08 / 0.11 / 0.08 / 0.34

Figure 1 Results of Communication Rubric for Written Content
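The rating computation described above can be sketched in a short script (a minimal sketch: the category weights and team scores are taken from Table 1, and the rating thresholds follow the 90/80/70 limits given earlier).

```python
# Map each team's category score to a rating band, using the 90/80/70
# thresholds described in the text. Weights and scores are from Table 1.

WEIGHTS = {"Content & Sources": 0.08, "Structure & Form": 0.12,
           "Grammar, Usage & Mechanics": 0.12, "Format": 0.08}

# Team scores by category (Table 1, teams 1 through 13).
SCORES = {
    "Content & Sources":          [0.05, 0.05, 0.06, 0.07, 0.06, 0.07, 0.05,
                                   0.06, 0.05, 0.06, 0.08, 0.06, 0.07],
    "Structure & Form":           [0.09, 0.11, 0.09, 0.10, 0.09, 0.10, 0.11,
                                   0.10, 0.08, 0.11, 0.12, 0.11, 0.08],
    "Grammar, Usage & Mechanics": [0.12, 0.10, 0.09, 0.12, 0.11, 0.11, 0.11,
                                   0.12, 0.12, 0.10, 0.11, 0.12, 0.11],
    "Format":                     [0.07, 0.08, 0.08, 0.07, 0.06, 0.08, 0.07,
                                   0.07, 0.08, 0.06, 0.08, 0.06, 0.08],
}

def rating(fraction):
    """Convert a fractional score to one of the four rating labels."""
    if fraction >= 0.90:
        return "Exceeds Criteria"
    if fraction >= 0.80:
        return "Meets Criteria"
    if fraction >= 0.70:
        return "Progressing to Criteria"
    return "Below Expectations"

def summarize(category):
    """Count how many teams fall in each rating band for one category."""
    counts = {"Exceeds Criteria": 0, "Meets Criteria": 0,
              "Progressing to Criteria": 0, "Below Expectations": 0}
    for score in SCORES[category]:
        counts[rating(score / WEIGHTS[category])] += 1
    return counts

# Team 1 scored 0.05 of 0.08 (62.5%) on Content & Sources:
print(rating(0.05 / 0.08))        # Below Expectations
print(summarize("Content & Sources"))
```

Running the script reproduces the distribution reported for the “Content & Sources” category: one team exceeds, three meet, five are progressing, and four are below expectations.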

It should be noted that the Technical Content section of the Communication Rubric could also be used to assess other Program Outcomes. These outcomes could include

(a) an ability to apply knowledge of mathematics, science, and engineering

(c) an ability to design a system, component, or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability

(e) an ability to identify, formulate, and solve engineering problems

EVALUATION

Figure 1 identifies topic and skill areas where students need extra work. An initial evaluation would be that this class needs to improve its performance in the first two categories (Content & Sources, and Structure & Form).

However, ABET has a specific definition of evaluation which is presented in its criteria [1].

“Evaluation is one or more processes for interpreting the data and evidence accumulated through assessment processes. Evaluation determines the extent to which student outcomes and program educational objectives are being attained. Evaluation results in decisions and actions regarding program improvement.”

Figure 1 can help the program to evaluate the level of attainment of Outcome “g”, (an ability to communicate effectively). To do this, the program must specify how scores on Figure 1 correlate to levels of attainment of an outcome. For example, the program could decide that acceptable attainment for Outcome “g” means that all student teams receive a rating of “Meets Criteria” in at least 3 of the 4 categories.

However, this evaluation must also integrate Figure 1 with other assessment tools and with the results of the Communication Rubric from other courses. In our case, these other assessment tools include embedded exam questions and student presentations in front of our Industrial Advisory Board. Evaluating outcomes with data from multiple assessment tools can be approached in a number of ways. The program could develop a formula to combine all assessments into a single number. This could be accomplished by converting the results of all assessment tools to a number between 1 and 5 and averaging the results. Alternatively, the program can form a qualitative statement based on multiple data sources and dimensions. ABET does not specify a procedure for formulating an evaluation from assessment tools.
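The single-number approach could look like the following hypothetical sketch. The tool names and the 1-to-5 scores are purely illustrative; they are not the program's actual conversion scheme, which each program would define for itself.

```python
# Hypothetical sketch: combine several assessment tools into a single
# number by converting each tool's result to a 1-5 scale, then averaging.
# The tool names and scores below are illustrative only.

def combine(tool_scores):
    """Average a set of assessment-tool scores already on a 1-5 scale."""
    return sum(tool_scores.values()) / len(tool_scores)

tools = {
    "Communication Rubric (design project)":   3.2,  # illustrative
    "Embedded exam questions":                 4.0,  # illustrative
    "Industrial Advisory Board presentation":  3.5,  # illustrative
}

overall = combine(tools)
print(round(overall, 2))  # 3.57
```

A program using such a formula would still need to decide, in advance, what overall value constitutes acceptable attainment of the outcome.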

The data from Figure 1 for future classes could also allow the program to track performance trends.

CONCLUSIONS

A Communication Rubric has been presented and applied to a team design project in a civil engineering course. Scores in specific categories of the Communication Rubric were tabulated. Weights were applied to each category, and a four-part rating system developed. These ratings were summarized and graphed for each category in the Communication Rubric. Procedures were presented for using these ratings to form an evaluation of attainment of a student outcome.

The results and recommendations of this paper are those of the authors, and have not been reviewed or approved by ABET.

REFERENCES

[1] ABET, “Criteria for Accrediting Engineering Programs, Effective for Evaluations during the 2010-2011 Accreditation Cycle”, ABET.

[2] ABET, “Criteria for Accrediting Engineering Programs, Effective for Evaluations during the 2011-2012 Accreditation Cycle”, ABET.

[3] ABET, “Criteria for Accrediting Engineering Programs, Effective for Evaluations during the 2009-2010 Accreditation Cycle”, ABET.

[4] Gloria Rogers, “Assessment 101, Direct and Indirect Assessments: What Are They Good For?”, Community Matters Newsletter, (August 2006), ABET Inc.

[5] Gloria Rogers, “Assessment 101, Rubrics: What Are They Good For?”, Community Matters Newsletter, (September 2006), ABET Inc.

APPENDIX

CEN 325 Water Resource Engineering

City of San Roberto Light Water Distribution System (Version S)

Project 1 of 2 Design Projects for a Team

March 31, 2009

Introduction. The City of San Roberto Light (population 2736) has an existing drinking water distribution system which meets its current minimum needs. It wants to upgrade its existing system to handle fire flows. It has hired you to advise it on alternatives and to design alterations to the pipe system, pump, and storage tank.

Description of the system. Treated water enters the pipe network at node 1 (see Figure 1) and is pumped to an elevated storage tank. The storage tank has a volume of 20,000 gallons, with a diameter of 16 feet and a depth of 13 feet. The ground elevation at the storage tank is 112 feet, and the bottom of the tank is at elevation 170 feet. The pumps at the water treatment plant can deliver 500 gallons per minute to the storage tank when its water surface is at elevation 190 feet.
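The stated tank volume can be checked against the given dimensions (a quick check, assuming a cylindrical tank and 7.48 gallons per cubic foot):

```python
import math

# Check the stated 20,000-gallon tank volume against its dimensions:
# a 16-ft-diameter, 13-ft-deep cylinder, at 7.48 gallons per cubic foot.
diameter_ft = 16.0
depth_ft = 13.0
GAL_PER_CUFT = 7.48

volume_cuft = math.pi * (diameter_ft / 2) ** 2 * depth_ft
volume_gal = volume_cuft * GAL_PER_CUFT
print(round(volume_gal))  # roughly 19,550 gal -- consistent with 20,000
```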