16.01 | Fall 2005 | Undergraduate

Unified Engineering I, II, III, & IV

Thermodynamics and Propulsion

Reflective Memo from Fall 2002

Summary

I completed most of the actions outlined in my reflective memo from the last time I taught this material (two years ago). I continued to use various active learning techniques (PRS and concept tests) and forms of regular feedback (e.g., mud cards, concept test results). Based on student performance and subject evaluations, I believe that most of the students met most of the learning objectives. Specific exceptions include their ability to describe the assumptions that underlie some of the engineering models applied, and some lack of ability in applying the concepts to situations they haven't seen before. The first deficiency I can address by stressing the assumptions more when presenting the material to the class (ACTION: Emphasize this more the next time I teach the material). The second deficiency is a result of the small amount of practice (one hour of homework for each hour of lecture) afforded the students in Unified. I think some shift in the balance between contact hours and homework hours in Unified would be beneficial (ACTION: Propose such a shift to the Unified faculty).

Learning Objectives

1. What are the learning objectives (expressed as measurable outcomes) for this subject?

The subject learning objectives are contained on the course web page. I do not have any changes to recommend to these.

2. To what extent were you able to integrate the CDIO skills specified for this subject in the Curriculum Plan of 2002?

I did not specifically implement any CDIO syllabus items since these are largely covered in the systems portion of Unified. However, several of the in-class experiments I did required 2.1.2 Modeling and 2.1.3 Estimation and Qualitative Analysis (see item 3(e) below). I would say both skills were taught at the "Introduce" level.

Teaching Methods

3. What teaching methods did you use and what evidence indicates these methods were successful or not?

a) Prepared lecture notes were available on the web for all of the material. These notes have evolved over several years starting with a set of handwritten lecture notes. Each year I try to augment them when I find specific areas of difficulty from mud responses, etc. I am quite happy with them at this point. In the end-of-term evaluations 90% of the respondents rated the web page (for all of Unified) Somewhat Effective or Very Effective. 97% of the respondents rated the prepared lecture notes (for all of Unified) Somewhat Effective or Very Effective.

b) I used 27 concept questions over the 13 lectures with responses taken on the PRS system. The performance and answers to these were provided on the web page. I continue to find these to be very useful for engaging the class in the material while I am lecturing. 95% of the respondents on the SEF rated the in-class exercises as Very Effective or Somewhat Effective. Also several positive comments were made about the PRS/concept questions in the written comments from the end-of-term evaluations. In general my teaching reviews were good, so I think the students found my lectures to be helpful to them.

c) I used mud cards for each lecture, responded to them the evening the lecture was delivered, and put the responses up on the web. These responses were linked to the relevant areas of the online notes. See for example T1 mud responses. 81% of the respondents on the end-of-term evaluations said the mud cards were Very Effective or Somewhat Effective; however, the majority found them only Somewhat Effective. Nonetheless, I still found the mud cards valuable for providing feedback on each lecture. Even in cases where there were very few respondents they were helpful, so I will continue to use them.

d) I wrote short assessments of each lecture (how they went). See for example T3 mud responses. This was mostly helpful for me, although I did use it to stress the important points from the lecture. I am not sure how many students read these responses. In general, I think that we have saturated the students in terms of available material on the web. Further information goes unread.

e) I did 4 small demonstrations/in-class experiments. The students seemed to like these activities since they allowed them to apply the subject material to a real problem. They all were of the form where I asked the students to estimate something (using the concepts from class), then did the experiment, and then discussed the result in light of their estimates. The activities thus had three primary objectives: to engage the students in the material we were working on and show them how to apply it, to highlight the various simplifications and assumptions in the engineering models we use, and to give the students practice in estimating various parameters required for input to the models (e.g., the volume of the room, or the weight of something).

f) I had a formal self-assessment activity during the last recitation. I asked the students to grade themselves on the subject learning objectives. I did this largely as an effort to get them to think more deeply about the success criteria in advance of the exam, since based on past history their responses don’t correlate well with their exam performance.


Assessment Methods

4. How was each subject learning objective assessed and what evidence indicates students achieved these outcomes?

I used homeworks and exams to assess student learning. Each of the homework problems and exams was coded to specific subject learning outcomes. The overall weighting was 10% class participation, 30% homework, and 60% quizzes. Also, the use of the PRS system in class and for the self-assessment activity, and the collection of, and response to, all of the mud cards, gave me a large amount of data on the class performance (formative and summative). In addition, as in the past in Unified, we collected data on time spent on various activities (its primary use is to make sure the class stays within bounds).

DATA SOURCE                   FORMATIVE   SUMMATIVE   FEEDBACK FOR ME
PRS System                    X                       X
Time spent                    X                       X
Self-assessment               X                       X
Muddiest Part of the Lecture  X                       X
Homework                      X           X           X
Quizzes                                   X           X
End-of-term SEF data                                  X
Class attendance                                      X

The performance on the homework and the time spent were both good, with students on average performing well. The performance on the quiz is shown below, with each question labeled in terms of the learning objectives it addressed. Middle-B performance for the quiz was 79%, suggesting the class on average was about half a grade below this standard. The lowest performance was on the second question. This was a conceptual question that tested their understanding of the assumptions underlying the engineering models they were applying. I mentioned the specific concept tested in this question in only one class, and then rather briefly. Note, however, that it was discussed at some length in the mud responses (I am not sure how many students read these). ACTION: Next time I teach this, I need to give this and other modeling assumptions more time in lecture. The second lowest score was on the fourth problem, which required application of the steady flow energy equation. The students are historically weak in applying some of the concepts to situations they haven't seen before (such as this one, whereas problem 3 was basically a cookie-cutter repeat of a problem they had seen in various forms before). I attribute this in part to the small amount of homework time allowed in Unified for each hour of lecture material (1:1). See ACTION below.

                     PROBLEM 1   PROBLEM 2   PROBLEM 3   PROBLEM 4   QUIZ TOTAL
Learning objectives  #1          #4,#5,#6    #4,#6       #4,#5,#2
Mean                 88%         58%         88%         63%         74%
Standard deviation   18%         35%         13%         27%         16%
Weighting            10%         10%         36%         44%         100%
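As a sanity check, the quiz total is the weighted average of the per-problem means; a minimal sketch, using only the numbers reported above:

```python
# Per-problem quiz means (percent) and weights, from the table above.
means = [88, 58, 88, 63]            # problems 1 through 4
weights = [0.10, 0.10, 0.36, 0.44]  # must sum to 1.00

# Weighted quiz total: sum over problems of mean_i * weight_i.
total = sum(m * w for m, w in zip(means, weights))
print(round(total))  # prints 74, matching the QUIZ TOTAL column
```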

The learning objectives that were not addressed on the quiz (e.g. learning objective 3) were addressed in the homework. The performance on the homeworks was quite acceptable and consistent with historical performance.

Continuous Improvement

5. What did you learn about your teaching and assessment methods this semester?

The most important thing I learned is that teaching with all of these techniques does not take me any more time than the old-fashioned chalk-and-talk ONCE THE MATERIALS ARE DEVELOPED. The first time through took a tremendous amount of time, but this time wasn't that bad. It was enjoyable. I made a decision this year to use old homework problems rather than develop them from scratch. This was also beneficial because I was able to pick some of the best problems. Previously, when I forced myself to generate new problems, I produced many that had errors, were difficult to understand, didn't stress the key concepts, or were too long. Based on my interactions with the students during office hours, most of them appeared to be doing the homeworks without reference to the old bibles, so I will continue to do this. Unfortunately, I will have to continue to generate new exam questions, which is always a challenge.

6. How do you use feedback from students and colleagues to improve your subject?

The most valuable feedback I get comes in the form of class performance on the PRS questions. I use this in real time to modify my lectures. Second most valuable is the feedback on the mud cards, which I use to modify the next lecture and as a way of updating the course notes (I typically add sections in areas where I get a lot of mud). Reference to the last reflective memo I wrote will show that I followed through on most of the items that I listed as areas for improvement. The muddiest parts of the subject were similar to past years (unsteady versus quasi-static, and static versus stagnation quantities) despite the additions to the notes and modifications to my lecture strategy; however, I think these topics went a little better this year (maybe).

7. What will you continue or change?

I will continue to apply the suite of teaching techniques I am using now.

Summary of recommended ACTIONS

a) Improve my presentation of quasi-static versus unsteady processes and my presentation of static and stagnation quantities (metrics: mud responses and performance on quiz)

b) Underscore many of the assumptions inherent in the models applied (metrics: additions to online notes and lectures, improved student performance on quiz)

c) Propose that the Unified faculty shift the course structure to enable more homework hours for each hour of lecture (metric: change in course structure)

Information Sharing

8. To whom have you forwarded this reflective memo?

a) Zolti Spakovszky (16.050 instructor)

b) Students from this class

c) Students who will take the class next year

d) Other members of the Unified staff
