Report on Pilot Offering of

CS1315 Introduction to Media Computation

Mark Guzdial

Rachel Fithian

Andrea Forte

Lauren Rich

Executive Summary

<Mark writes at the end>

Table of Contents

Executive Summary

1. Introduction: What we collected and why

2. What happened in the class?

2.1. Who took the class?

2.2. What content was covered?

2.3. How were students assessed?

2.4. What support for collaborative learning was used in the class?

2.5. What was the WFD rate?

2.6. Future work on the curriculum and course materials

3. Did the students learn computing?

3.1. Did the students think that they learned computing?

3.2. How do CS1315 students compare to known problems in computing? To CS1321 or COE1361 students?

3.3. Future work on the learning of computation

4. What did students think of the class?

4.1. How hard was it?

4.2. Was the course relevant, interesting, motivating?

4.3. What role did collaboration play?

4.4. What role did the technology we built play?

4.5. How did these attitudes differ by College and ethnicity?

5. Did the class appeal to women?

5.1. Did female students find it relevant?

5.2. Did female students find it creative?

5.3. Were female students motivated to want to continue in CS?

5.4. Did it matter that the course was non-CS and non-Engineering?

5.5. Future work on studying female attitudes toward CS

6. How do CS1315, CS1321, and COE1361 compare?

6.1. In students’ impressions entering the course

6.2. In how hard students perceive the course

6.3. In how students are using what they are learning

6.4. In how motivated students are to continue in CS

6.5. Future work on comparing the CS courses

Appendix A: Homework Assignments

Homework 1:

Homework 2:

Homework 3:

Homework 4:

Homework 5:

Homework 6:

Appendix B: Take-Home Exams

Take Home Exam #1:

Take Home Exam #2:

Appendix C: Midterm Exam #1

(25) 5. Generalized changeVolume

Appendix D: Midterm Exam #2

Appendix E: Final Exam

1. Introduction: What we collected and why

During the Spring 2003 semester, Georgia Tech offered four introductory computing courses.

  • CS1321 Introduction to Computing is the traditional CS course at Georgia Tech. It’s taught in sections of 200 or more, using the programming language Scheme.
  • COE1361, sections A and B, Introduction to Computing for Engineers was an introductory computing course aimed at engineers, taught in sections of 20-25 students in the programming language MATLAB.
  • COE1361, section C, was a course aimed at covering the same concepts as CS1321, but in the programming languages MATLAB and Java, in a 75-person lecture.
  • CS1315 Introduction to Media Computation was piloted this semester. It’s a course aimed at non-CS and non-Engineering majors, taught in a single section of 120 students using the programming language Python.

The main focus of our assessment effort during the Spring 2003 semester was to understand CS1315, but we felt that we couldn’t really do that without comparing it to the other three CS courses. The result was that we ended up gathering information not only about our new non-CS majors course, but also about the concept of specialized computing courses overall.

We were interested in five main questions:

  • What happened in the class? That is, what content was covered, how were students assessed, and what was the overall success rate? In particular, we were interested in the overall WFD (withdraw, F, or D grades) rate. Our “best-practice” comparison is the work by Laurie Williams and colleagues at North Carolina State University [1], where they found that specialized collaboration strategies led to improved retention. For non-CS majors, they were able to achieve a 66.4% success rate (33.6% WFD) compared to a 55.9% success rate in a traditional CS1.
  • Did the students learn computing? A great retention rate does not necessarily imply good learning.
  • What did students think of the class? One of our goals was to improve students’ attitudes toward computer science. We were interested both in attitudes toward the class overall and in the role of specific design features: the motivation of using a relevant domain like media computation[AF1], support for collaboration, and the value of the technologies we implemented.
  • Did the class appeal to women? We designed the course explicitly to appeal to women with features that addressed the concerns expressed in studies of the gender bias in Computer Science courses.
  • How do CS1315, CS1321, and COE1361 compare?

To address these questions, we gathered the following data:

  • Initial, midterm, and final surveys were given to consenting students in CS1315, one section of CS1321, and COE1361 section B. We were also able to use the midterm survey in COE1361 section A.
  • For consenting students, we studied their homework assignments.
  • There were a set of problems in common between CS1315, CS1321, and COE1361 on some of the exams and quizzes to give us some points of comparison. In particular, [AF2]

2. What happened in the class?

Overall, the class went very well. Students did well on the homework and exams but, more importantly, took advantage of the creative aspects of the course and did interesting things. For example, the third homework asked students to create a collage in which the same image appeared at least three times, with different visual manipulations each time—but that was a minimum. More images were acceptable, and more manipulations were also acceptable. As can be seen in the examples in Figure 1 (all of which were posted for sharing on the CoWeb collaborative website), students went above and beyond. Some of these programs were over 100 lines in length!
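
To give a sense of the programs involved, a minimal collage program in the style of the course might look like the sketch below. This is a sketch under assumptions: the media functions (makeEmptyPicture, makePicture, copyInto) and the per-image manipulations (decreaseRed, mirrorVertical) approximate the vocabulary of the course’s media library and are not actual student code.

    def makeCollage(filename):
        # Build a blank canvas and copy the same source image into it
        # three times, applying a different manipulation before each copy.
        canvas = makeEmptyPicture(600, 200)
        source = makePicture(filename)
        copyInto(source, canvas, 0, 0)        # first appearance: unmodified
        decreaseRed(source)                   # second appearance: less red
        copyInto(source, canvas, 200, 0)
        mirrorVertical(source)                # third appearance: mirrored
        copyInto(source, canvas, 400, 0)
        return canvas

A student meeting only the minimum requirement could stop there; the collages in Figure 1 went well beyond this scale.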

2.1. Who took the class?

<Lauren, can you fill in this section, please?>

120 students enrolled in the course. From the class roll, we know that the major distribution was…

Of the 120 students in the course, X% consented to be part of our study of the courses. From consent forms, we know that the gender and ethnicity distributions were…

2.2. What content was covered?

The syllabus for the course walks through each media type, with some repetition of concepts so that conditionals and loops can be revisited in different contexts. Here's a rough description of the syllabus; a brief code sketch illustrating the media-manipulation programming style follows the outline.

•Week 1: Introduction to the course and the argument for why media computation. Introduction to variables and functions, in the context of playing sounds and showing pictures.

•Weeks 2–3: Pictures as a media type, including psychophysics (why don’t we see 1024x768 dots on the screen?), looping to change colors with a simplified for loop, conditionals to replace specific colors, then indexing by index numbers to implement mirroring, rotating, cropping, and scaling.

Figure 1: Student collages for Homework 3

•Weeks 4–6: Sound as a media type, including psychophysics (how human hearing limitations make MP3 compression possible), looping to manipulate volume, then indexing by index numbers to do splicing and reversing of sounds. Includes discussion of how to debug and how to design a program, as those issues arise. One lecture on additive and FM sound synthesis.

•Week 7: Text as a media type: Searching for text, composing text, reading text from a file and writing it to a file. An example program parses out the temperature from a downloaded weather page.

•Week 8: Manipulating directories. Manipulating networks, including making the temperature-finding program work from the “live” Web page. Introduction to HTML.

•Week 9: Discuss media transitions. Moving from sound to text and back to sound again. Using Excel to manipulate media after converting it to text.

•Week 10: Introduction to databases: Storing media in databases, using databases in generating HTML.

•Week 11: Movies: How persistence of vision makes animations and movies possible, generating frames using the various techniques described earlier in the semester.

•Week 12: “Can’t we do this any faster? Why is Photoshop faster than Python?” Introduction to how a computer works (e.g., machine language), and the difference between an interpreter and a compiler. Algorithmic complexity and the limits of computation.

•Week 13: “Can we do this any easier?” Decomposing functions, modularity, and functional programming (map, reduce, filter, and simple recursion).

•Week 14: “Can’t we do this any easier?” Introduction to objects and classes.

•Week 15: “What do other programming languages look like?” Brief overview of JavaScript and Squeak.
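
As a concrete illustration of the programming style in Weeks 2 through 6, the sketch below shows the kind of loop students wrote over pixels and sound samples. It is a minimal sketch: the function names (getPixels, setRed, getSamples, and so on) approximate the vocabulary of the course’s media library rather than quoting actual course code.

    def increaseRed(picture):
        # Loop over every pixel in the picture and scale up its
        # red channel by 20%.
        for pixel in getPixels(picture):
            setRed(pixel, int(getRed(pixel) * 1.2))

    def decreaseVolume(sound):
        # Loop over every sample in the sound and halve its amplitude.
        for sample in getSamples(sound):
            setSampleValue(sample, getSampleValue(sample) / 2)

The same loop-and-conditional pattern recurs across media types, which is what allows conditionals and loops to be revisited in different contexts.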

2.3. How were students assessed?

Student work consisted of both activities whose goal was student learning and activities whose goal was assessment of learning. The student learning activities were graded and counted toward the final grade, but were weighted less heavily than the assessment activities (a small sketch after the list below shows how the weights combine). Students were encouraged to collaborate on the [AF3]

  • There were five take-home lab-like activities to help students learn basic productivity software and skills such as email, Web browsing, Word, Excel, PowerPoint, and HTML. These were worth 15% of the total grade.
  • There were three quizzes, each of which was preceded by a pre-quiz. The pre-quiz was distributed on Monday and was open to collaboration. The quiz was given in-class for 20 minutes on Wednesday and looked very similar to the pre-quiz. The pre-quiz was worth 20 points and the quiz was worth 80 points. Overall, the pre-quiz/quiz grade was worth 15% of the total grade.
  • There were six homework assignments, all involving programming, all of which could be worked on collaboratively. These appear in Appendix A, and the grades for these are summarized in Table 1. They were worth 15% of the total grade.
  • There were two take-home exams, which were programming assignments like the homework, but on which students were asked not to collaborate at all. They were asked to enter a statement that they neither gave nor received aid on the exam. These appear in Appendix B, and the grades for these are summarized in Table 2. These were worth 20% of the total grade.
  • There were two in-class exams and one final exam, which made up 35% of the total grade. These appear in Appendices C, D, and E, and the average grades and distributions for these appear in Tables 3, 4, and 5[2].
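
For clarity, here is how the stated weights combine into a final grade. This is a sketch of the arithmetic only; the component names are ours, and each component is assumed to be an average on a 0-100 scale.

    def finalGrade(labs, quizzes, homework, takeHomeExams, exams):
        # Weights from the list above: 15% + 15% + 15% + 20% + 35% = 100%.
        return (0.15 * labs + 0.15 * quizzes + 0.15 * homework
                + 0.20 * takeHomeExams + 0.35 * exams)

For example, finalGrade(90, 85, 95, 88, 80) yields 86.1.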

While the numbers are impressive, the distributions are even more striking. Figure 2 is the distribution for Homework #3 (the collage assignment) described earlier. The student performance on this homework was overwhelmingly positive.

Figure 2: Grade statistics and distribution for Homework #3

Obviously, given the collaboration policy described above, students might have achieved much of this performance through collaboration with one another. Literally, every one of the 63 students who got 100% could have handed in the same assignment.[AF4] While that is possible, the shared collages suggest students working individually and taking pride in their individual accomplishments.

We also have the take-home exam performance to support the conjecture that student performance wasn’t merely a matter of collaboration. Figure 3 shows the grade statistics and distribution for Take-Home Exam #2. While the grades aren’t quite as high as on Homework #3, over half the class earned an A (90% or better) on this assignment.

Figure 3: Grade statistics and distribution for Take-Home Exam #2

<Insert Table 1 here: A table for each of the six homeworks with average and standard deviation>

<Insert Table 2 here: A table for each of the two take-home exams with average and standard deviation>

<Insert Tables 3, 4, and 5 here: For each problem on each of the exams, put a heading summarizing the topic of the problem, then provide the average and standard deviation for each problem, and then for the exam overall.>

2.4. What support for collaborative learning was used in the class?

The students made extensive use of a collaborative website, CoWeb (at [AF5]

  • Each of the assignments (homework, labs, and take-home exams) had Q&A pages associated with them where extensive discussions took place.
  • For each exam, a review page was posted where students could answer sample questions, ask questions about the questions or concepts, and critique each other’s questions.
  • Each week, a new general “Comment on the week” page was put up for general feedback and discussion.
  • A number of special topics pages were provided to encourage discussion (favorite movies, music, Atlanta-area restaurants).
  • Pages encouraging anonymous feedback were provided.

2.5. What was the WFD rate?

By drop day, only two students had dropped the course. At the end of the course, 3 students were given D’s, 7 were given F’s, and 3 were given I’s (two for pending academic misconduct cases and one for a hardship supported by the Dean of Students’ office). W’s, F’s, and D’s accounted for 12 of the 120 originally enrolled students, for a 10% WFD rate.[AF6]

2.6. Future work on the curriculum and course materials

The course notes developed for the course covered only about the first half of the semester. The rest of the course notes are being developed during Summer 2003. We are also improving the labs during the summer—they were only modified slightly from CS1321’s labs, and we’d now like to make more substantial revisions to fit them into the media computation context.

We also plan to revise the technology considerably during Summer 2003. The technology developed for the course (programming environment and media manipulation tools) performed admirably—e.g., there was not a single report of the environment crashing during the entire semester. Nonetheless, several bugs were noted, and two significant enhancements are desired:

  • First, there is no support for debugging in the programming environment. We need to figure out what support is useful and understandable to non-CS majors who are creating loops that iterate tens of thousands of times (e.g., processing samples or pixels). Simple breakpoints won’t work—no one wants to click Continue 80,000 times. (A sketch after this list illustrates the scale problem.)
  • Second, two separate environments exist for viewing media and for writing programs; this separation between the media tools and the programming environment was a problem. Students who have just created a sound want to visualize it immediately, not after saving it to a WAV file, opening the MediaTools, and then opening the WAV file.
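
To make the scale problem concrete, the sketch below shows a plain-Python workaround for tracing a loop over tens of thousands of samples: print a progress line only every 10,000th iteration rather than stopping at each one. This is an illustrative sketch of the problem, not a proposed design for the environment’s debugging support.

    def halveAllSamples(samples):
        # Manipulate a long list of sample values, printing a progress
        # line only every 10,000th iteration; stopping at each iteration
        # (as a simple breakpoint would) is unusable at this scale.
        for i in range(len(samples)):
            samples[i] = samples[i] / 2
            if i % 10000 == 0:
                print("sample %d is now %d" % (i, samples[i]))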

There is considerable interest in the course external to Georgia Tech. A contract for a book based on these course notes has been signed with Prentice-Hall. Kennesaw State University is teaching a summer camp for high school students based on the course during Summer 2003, while Gainesville College is actually offering the course to a small number of students during Summer 2003. With Marion Usselman of CEISMC, we are exploring the creation of a high school version of the course.

3. Did the students learn computing?

3.1. Did the students think that they learned computing?

<Andrea, can you tune this one?>

We asked students whether they felt they were learning to program on both the midterm and final surveys. In general, students reported that they felt they were learning to program.

At the midterm survey, the majority of students in all four CS1 sections reported that they felt they were learning to program (Table 6). Surprisingly, between the midterm and final surveys, CS1315 students’ confidence in their learning dropped. This may be because the material at the beginning of the semester was less challenging than that presented at the end, or because, as students learn more, they begin to realize how little they know. CS1321 students, tending to be more advanced than the others, may have already realized that by the midterm survey. [AF7]

<Insert Table 6 here: Midterm survey responses for the four courses on “Are you learning to program?”>

Midterm Survey Responses: Are you learning to program?

Course                             Yes     No      Total
CS 1321     Raw count              29      4       33
            Percentage of class    88%     12%     100%
CS 1315     Raw count              84      3       87
            Percentage of class    97%     3%      100%
COE 1361b   Raw count              26      5       31
            Percentage of class    84%     16%     100%
COE 1361a   Raw count              20      1       21
            Percentage of class    95%     5%      100%

Final Survey Responses: Did you learn to program?

Course                             Yes     No      Blank   Total
CS 1321     Raw count              23      3       0       26
            Percentage of class    88.5%   11.5%   0.0%    100.0%
CS 1315     Raw count              42      7       5       54
            Percentage of class    77.8%   13.0%   9.3%    100.0%
COE 1361b   Raw count              19      2       0       21
            Percentage of class    90.5%   9.5%    0.0%    100.0%

<Insert Table 7 here: Final survey responses for the three courses on are you learning to program, and the rating of skill before/after>


3.2. How do CS1315 students compare to known problems in computing? To CS1321 or COE1361 students?

In order to compare CS1315 students’ achievement to that of CS1321 and COE1361, we developed several problems that we attempted to put on all three courses’ exams and quizzes. Unfortunately, logistical considerations such as different rates of development in each course and different times and numbers of exams and quizzes prevented us from distributing the problems as uniformly as we would have liked. In addition, even those questions that did make it onto all three exams were modified to such an extent that it is difficult to compare them. In general, we found that the different programming languages used, the differences in directions given on exams, and the inclusion of model code on CS1315 exams created unique conditions for each course and rendered the results fundamentally incomparable. <ANDREA, CAN WE SAY SOMETHING MORE HERE ABOUT THE RESULTS? PERHAPS INSERT YOUR GENERAL SUMMARY?>