
Unified Engineering I, II, III, & IV

Thermodynamics and Propulsion

Reflective Memo from Fall 2000

Summary

I completed most of the actions outlined in my reflective memo from the last time I taught this material, implemented significant changes to my teaching methods, and made small changes to the content. All of this allowed me to collect what was, for me, an unprecedented amount of data. There are some overarching messages in this data. The most important is that I believe the new teaching methods I am trying, and/or the increased energy I am putting into teaching, are improving the students’ achievement of the subject learning objectives.

Subject Learning Objectives

The subject learning objectives are contained on the course web page. I do not have any changes to recommend to these.

Teaching Methods and the Learning Environment

I took several actions intended to improve learning. These are discussed in greater detail below. Implementation of all of these new elements produced an unsustainable workload for me. I am hopeful that the next time around things will be easier.

a) I put all my lecture notes on the web. Reviewing these notes will provide a good overview of both the subject content and the teaching methods.

b) I added sections to the notes for areas of difficulty identified in last year’s mud cards (enthalpy, shaft work vs. flow work, and frame dependence of stagnation quantities).

c) I developed 60 concept questions and active learning exercises for use in class (sprinkled throughout the notes; click on the “?” buttons).

d) I used 41 concept questions and active learning exercises during the 12 formal lectures (the 13th lecture was devoted to review and student self-assessment).

e) I used the PRS system for 39 of the 41 concept questions and active learning exercises and thus have student data for each of these.

f) I implemented a formal self-assessment activity during the 13th lecture (I asked about 15 questions using the PRS system).

g) I responded to the mud cards the evening the lecture was delivered and put the responses up on the web. These responses were hyperlinked to the relevant areas of the online notes. See for example T6 mud responses.

h) I wrote short assessments of each lecture (how they went). See for example T3 mud responses.

i) I changed the grading criteria to provide for 10% credit for the in-class activities.

j) All of this material was integrated into a web page that includes the course notes, the PRS questions asked in class, the answers to the PRS questions asked in class, the homework questions, the homework solutions, the quizzes and solutions, and the muddiest part of the lecture questions and answers. These are all organized in a chronological order aligned with the content in the course.

k) I posted last year’s reflective memo for this year’s and last year’s students to review online.

l) I led the implementation of a web-based subject evaluation process; we used the same process to deliver a web-based math and physics skills diagnostic early in the semester.

m) I did one new demonstration/in-class experiment. For a more detailed discussion, follow the links underneath the “A” answer button.

Basis of Overall Subject Grade

10% Class participation as described in the course notes, 30% homework, 60% quizzes.

Assessment Methods

The use of the PRS system in-class and for the self-assessment activity and the collection of, and response to, all of the mud cards gave me a large amount of data on the class performance (formative and summative). Also, as in the past in Unified, we collected data on time spent on various activities (its primary use is to make sure the class stays within bounds). Links to each of the following data sources as well as to data plots and correlations are contained in the appendices at the end of this memo.

DATA SOURCE                    FORMATIVE   SUMMATIVE   FEEDBACK FOR ME
PRS System                         X                          X
Time spent                         X                          X
Self-assessment                    X                          X
Muddiest Part of the Lecture       X                          X
Web-usage data                                                X
Homework                           X           X              X
Quizzes                                        X              X
Mid-term evaluation data                                      X
End-of-term SEF data                                          X
Class attendance                                              X

Some observations and lessons

a) Sometimes having data can be a little depressing, but it is always valuable. For example, the first PRS question I asked had to do with identifying kW-hr as a unit of energy and only 30% of the class did this correctly (correct student responses to PRS). But if I hadn’t known this, I would have launched off without mentioning units.
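(For reference, and as my own arithmetic rather than a quotation from the lecture: 1 kW-hr = (1000 W)(3600 s) = 3.6 x 10^6 J = 3.6 MJ, so a kilowatt-hour is power multiplied by time and is therefore a unit of energy, not of power.)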

b) I observed that I did not teach the material in a simple pro-rated manner in terms of time devoted to different subjects. In the past I have chugged through the notes at roughly a constant pace, independent of the level of conceptual difficulty. This year for example I spent three lectures on the concept of frame dependence of stagnation quantities alone, but very few lectures on the various forms of the First Law. Thus the teaching was tailored to the needs of the class, rather than based on how many words are devoted to the topics in my notes.
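(For readers outside the class, the “forms” referred to here are the standard textbook statements, written from memory rather than quoted from my notes: for a closed system, per unit mass, Δu = q − w; for a steady-flow control volume, per unit mass, q − w_s = Δh + Δ(V²/2) + gΔz, where the shaft work w_s is distinct from the flow work, which is already folded into the enthalpy h = u + pv.)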

c) My personal assessment is that this class learned more (and more deeply) than previous classes I have taught. While this is not based on a lot of hard data, there is some supporting evidence.

First, relative to last year, the class felt more confident in their ability to demonstrate the subject learning objectives (based on a comparison of this year’s self-assessment results (six pages) with last year’s). One caveat to using this information to support my sense that student achievement increased is that there is no strong correlation between self-assessment and homework (several pages), quiz (several pages), or in-class question (several pages) scores. So even if the students feel more confident, that does not necessarily imply they learned more. Potential reasons for the lack of correlation are i) the quizzes are not particularly good tests of their understanding, ii) the students are not good at assessing their own understanding, or iii) the sample size is too small. Relative to the latter, note also that there is no correlation between time spent and performance (several pages), or between class participation and scores on homework or quizzes, when considering the thermodynamics scores alone. However, positive correlations are apparent between class participation and quiz scores, and between homework and quiz scores, when the data are compiled for all of the subjects in Unified (other correlation plots can be accessed below).

Second, one of the shortcomings apparent in last year’s summative quiz assessment was that the students had difficulty solving open-ended problems. They improved in this regard this year even though I gave them a harder problem on the quiz. (Last year’s question was “Describe energy exchange processes for an object entering the atmosphere from space”; the class average was 67%. This year I asked a problem about energy exchange during the turbine failure that the ABB power plant experienced; the class average was 74%.) Again, due to the small sample size (a small number of quiz questions, and the fact that the question was not the same from year to year), it is not possible to be definitive.

d) Considering the measurable outcomes independently (see data plots of quiz, PRS, self-assessment, and homework scores organized by measurable outcome), the students performed well on all outcomes except #4, application of the First Law. Understanding and using the First Law is the core of the material I teach in this subject, so this is a little disappointing. Note that both the average quiz performance and the self-assessment results for measurable outcome #4 suggest weakness in this area, but on a student-by-student basis no such correlation exists (see page 3 of the quiz scores vs. self-assessment correlations). There is no deficiency apparent in the homework and PRS question scores for this learning objective. Again, given the small amount of data (two quiz questions), it is hard to be definitive. However, it does support my concern that students in Unified get relatively little time to practice the concepts (one hour of homework for each lecture hour, whereas in other MIT subjects students get as many as three hours of homework for each hour of lecture). This underscores the importance of making the Unified lectures themselves a strong learning experience (the active learning exercises and the PRS system are valuable in this regard).

e) The muddiest parts are tabulated in the web page index. From among these the worst areas of confusion were:

thermodynamic equilibrium, unsteady versus quasi-static, for example

applicability of cv and cp, for example

static and stagnation quantities, see here and here for example
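(As a reminder for readers outside the class, the standard ideal-gas relations behind the last two items are, written from memory rather than quoted from the notes: h = u + pv, so enthalpy includes the flow work per unit mass; du = c_v dT and dh = c_p dT for an ideal gas regardless of the process, which is the usual source of the confusion about where c_v and c_p apply; and T_t = T + V²/(2 c_p), so the stagnation temperature depends on the flow speed V, and hence on the observer’s reference frame, while the static temperature T does not.)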

f) The students still have difficulty explaining concepts in words, but this ability improved slightly relative to last year: performance on problem 13 this year (class average = 83%) can be compared with the identical question, problem 12, last year (class average = 66%). Also, after they did a poor job explaining in words what the First Law means on quiz 1 (class average = 76% for this question), I asked them the same question again on the second quiz and they improved significantly (class average = 92%).

g) Because of the enhanced feedback from the class, I was able to complete the material in about 8% less time than it took last year, leaving the entire last lecture open for review and self-assessment activities.

h) The students were terrific about participating in the experiments with the PRS system, and they believe it was very effective in improving their achievement of the class learning objectives (see mid-term and end-of-term (histograms, text) evaluation data). A large fraction of the students also felt the web-based mud responses were valuable (see midterm evaluation data and student input on web usage (multiple pages)).

i) I did not do reading quizzes or assignments (I just couldn’t get it done in terms of time available). I think that forcing the students to come to class prepared would improve their learning. They feel differently (see page 3 of student input file).

j) I had more fun teaching this way (i.e. with the PRS system) than I have ever had before.

k) The time required to implement all of these changes to my teaching style was excessive. Steve and I regularly found ourselves dueling between 12am and 2am to see who could finish his mud responses first. I am hopeful that technology can come to the rescue (see below) and make responding to the questions easier. If not, I will probably respond only to the top few questions rather than to all of them.

Planned actions for next time

a) Try some of the PRS question techniques that Steve used (such as asking the students how confident they are in a calculation or problem solution, and then asking them to self-report whether they got it correct).

b) Organize the mud responses into a keyword searchable database and work with CAES and EMCC to develop a self-organizing FAQ (frequently asked question) database generation process for the future.

c) Two muddiest-of-the-mud topics showed up again this year after being similarly identified last year: thermodynamic equilibrium and static vs. stagnation quantities. On the first (equilibrium), I did not take any action last time around, but in responding to all of the mud cards this year I think I developed some better explanations (actually, the best one I stole from Ed G.), which should be put into the notes. On the second (stagnation quantities), I added examples and more discussion to my notes, used several PRS questions, and spent 2.5 lectures on the topic, yet it was still rough for the students. As a result of this attention I think they ended up understanding it better, but it took a lot of class time. Perhaps in the future this would be a good place for a physical demonstration.

d) Formulate a “mud taxonomy” to better understand and categorize the types of questions students ask. Then use this to better target teaching techniques.

e) Implement reading assignments to improve student preparation for the lectures.

f) Ask Greitzer to have the TA who grades his diagnostic (given on the first day of 16.050) also grade samples from the previous year, to remove year-to-year grading biases and thus give me better input on how well the students are doing relative to the Unified learning objectives.

g) Try it all again and see if the workload goes down.

h) Develop a similar set of materials and apply similar techniques for the spring term propulsion lectures in Unified.

People who should see this reflective memo

a) Greitzer or equivalent 16.050 instructor

b) Students from this class

c) Students who will take the class next year

d) Other members of the propulsion division

e) Other members of the Unified staff

f) Brodeur, Soderholm and others working to improve the Department learning environment

Appendices

A. Data Sources

  • Homeworks and solutions are interspersed through the notes (click on the “HW” buttons).
  • Quiz 1 and Quiz 2
  • Muddiest part of the lecture responses, tabulated in the web page index (the worst areas of confusion are listed in observation e) above)
  • Student self-assessment
  • Midterm Review Data: Histograms, PRS text comments, mud cards text comments, Best parts text comments, Needs improvement text comments, Other text comments
  • End-of-term Review Data: Histograms, text comments
  • Web usage input from students

B. Data Plots for Thermodynamics (note some have multiple pages)

  • PRS scores (organized by problem)
  • PRS scores (organized by measurable outcome)
  • Homework Scores (organized by problem)
  • Homework Scores (organized by measurable outcome)
  • Quiz Scores (organized by problem)
  • Quiz Scores (organized by measurable outcome)
  • Time Spent (organized by homework problem)
  • Self-assessment (organized by measurable outcome), alternate

C. Correlations for Thermodynamics (note some have multiple pages)

  • Quiz scores versus homework scores
  • Quiz scores versus PRS scores
  • Quiz scores versus self-assessment
  • PRS scores versus self-assessment
  • Homework scores versus self-assessment
  • Homework scores versus PRS scores
  • Performance versus attendance
  • Performance versus time spent

D. Correlations for Unified as a whole

  • Overall quiz average versus study time
  • Overall quiz average versus homework average
  • Overall quiz average versus class participation
  • Overall class grade (minus participation grade) versus participation