16.01 | Fall 2005 | Undergraduate

Unified Engineering I, II, III, & IV

Thermodynamics and Propulsion

Instructor

Prof. Ian Waitz

Learning Objectives

16.01-16.02 Thermodynamics (PDF)

16.03-16.04 Propulsion (PDF)

Featured Video Lecture

Additional Resources

Table Organization

Sample Lecture

Sample Lecture P9: Energy Exchange with Moving Blades

Thermodynamics, 16.01-16.02

LEC # | TOPICS | CONCEPT QUESTIONS | MUDDY POINTS | READINGS | ASSIGNMENTS / SOLUTIONS
T1 | Course Introduction and Thermodynamic Concepts | Q1 (PDF), Q2 (PDF) | 2003, 2002, 2000 | Course Introduction and Thermodynamic Concepts | Problem T1 (PDF), Solution T1 (PDF)
T2 | Changing the State of a System with Heat and Work | Q1 (PDF), Q2 (PDF) | 2003, 2002, 2000 | Changing the State of a System with Heat and Work | Problem T2 (PDF), Solution T2 (PDF)
T3 | State Changes, First Law | Q1 (PDF), Q2 (PDF) | 2003, 2002, 2000 | First Law of Thermodynamics, Sections I and II | Problem T3 (PDF), Solution T3 (PDF)
T4 | First Law (cont.) | Q1 (PDF), Q2 (PDF) | 2003, 2002, 2000 | First Law of Thermodynamics, Sections I and II | Problem T4 (PDF), Solution T4 (PDF)
T5 | First Law, Enthalpy, Specific Heats | Q1 (PDF), Q2 (PDF), Q3 (PDF) | 2003, 2002, 2000 | First Law of Thermodynamics, Sections III, IV, V | Problem T5 (PDF), Solution T5 (PDF)
T6 | First Law, Enthalpy, Specific Heats, Introduction to Heat Engines | Q1 (PDF) | 2003, 2002, 2000 | Applications of the First Law to Heat Engines | Problem T6 (PDF), Solution T6 (PDF)
T7 | Heat Engines (cont.) | Q1 (PDF) | 2003, 2002, 2000 | Applications of the First Law to Heat Engines | Problem T7 (PDF), Solution T7 (PDF)
T8 | Steady Flow Energy Equation | Q1 (PDF), Q2 (PDF) | 2003, 2002, 2000 | Steady Flow Energy Equation, Section I.A | Problem T8 (PDF), Solution T8 (PDF)
T9 | Shaft Work and Flow Work | Q1 (PDF), Q2 (PDF), Q3 (PDF) | 2003, 2002, 2000 | Steady Flow Energy Equation, Section I.A | Problem T9 (PDF), Solution T9 (PDF)
T10 | Stagnation Quantities | Q1 (PDF), Q2 (PDF), Q3 (PDF) | 2003, 2002, 2000 | Steady Flow Energy Equation, Section I.B-I.C | Problem T10 (PDF), Solution T10 (PDF)
T11 | Reversible and Irreversible Processes | Q1 (PDF), Q2 (PDF), Q3 (PDF), Q4 (PDF) | 2003, 2002, 2000 | Reversible and Irreversible Processes, Entropy, and Introduction to the Second Law, Section I | Problem T11 (PDF), Solution T11 (PDF)
T12 | Entropy | | 2003, 2002, 2000 | Reversible and Irreversible Processes, Entropy, and Introduction to the Second Law, Sections II and III | Problem T12 (PDF), Solution T12 (PDF)

 

Propulsion, 16.03-16.04

LEC # | TOPICS | CONCEPT QUESTIONS | MUDDY POINTS | READINGS | ASSIGNMENTS / SOLUTIONS
P1 | Introduction to Propulsion and the Integral Momentum Equation | Q1 (PDF), Q2 (PDF) | 2004, 2003, 2002 | Introduction to Propulsion and Integral Momentum Theorem; CFM56-3 Diagrams (PDF) | Problem P1 (PDF), Solution P1 (PDF)
P2 | Integral Momentum Equation | Q1 (PDF) | 2004, 2003, 2002 | Integral Momentum Theorem | Problem P2 (PDF), Solution P2 (PDF)
P3 | Applications of the Momentum Theorem, Definition of Efficiencies | Q1 (PDF), Q2 (PDF), P3 Lecture Slides (PDF) | 2004, 2003, 2002 | Integral Momentum Theorem and Efficiencies of A/C Engines | Problem P3 (PDF), Solution P3 (PDF)
P4 | Aircraft Performance and Mission Requirements, Relationship with Propulsion | Q1 (PDF), Q2 (PDF) | 2004, 2003, 2002 | Aircraft Performance | Problem P4 (PDF), Solution P4 (PDF)
P5 | Rocket Performance | Q1 (PDF), Q2 (PDF) | 2004, 2003, 2002 | Rocket Performance | Problem P5 (PDF), Solution P5 (PDF)
P6 | Rocket Performance and Connection to Engine Design Parameters | Q1 (PDF) | 2003, 2002 | Rocket Nozzles: Connection of Flow to Geometry | Problem P6 (PDF), Solution P6 (PDF)
P7 | Cycle Analysis of a Gas Turbine Engine | Q1 (PDF), Q2 (PDF) | 2003, 2002 | Ideal Cycle Analysis of Aircraft Gas Turbine Engines | Problem P7 (PDF), Solution P7 (PDF)
P8 | Energy Exchange with Moving Blades | Q1 (PDF), Q2 (PDF) | 2003, 2002 | Energy Exchange with Moving Blades | Problem P8 (PDF), Solution P8 (PDF)
P9 | Energy Exchange with Moving Blades | Q1 (PDF), Q2 (PDF) | 2003 | Energy Exchange with Moving Blades; Velocity Triangle Handouts (PDF) | Problem P9 (PDF), Solution P9 (PDF)

 

Additional Resources for Thermodynamics (16.01-02)

Pedagogical Resource: Reflective Memo from Fall 2003

Compendium of Equations used in the Course (PDF)

All Concept Questions

Fall Term 1999: 62 students in the class, 13 lectures, plus 5 recitation sections that I taught.

I. Top-Level Learning Objective

To be able to use the First Law of Thermodynamics to estimate the potential for thermo-mechanical energy conversion in aerospace power and propulsion systems.

Measurable Outcomes (Assessment Method)

  1. To be able to state the First Law and to define heat, work, thermal efficiency and the difference between various forms of energy. (quiz, self-assessment)
  2. To be able to explain at a level understandable by a high school senior or non-technical person how various heat engines work (e.g. a refrigerator, an IC engine, a jet engine). (quiz, self-assessment)
  3. To be able to identify and describe energy exchange processes (in terms of various forms of energy, heat and work) in aerospace systems.
  4. To be able to apply the steady-flow energy equation or the First Law of Thermodynamics to a system of thermodynamic components (heaters, coolers, pumps, turbines, pistons, etc.) to estimate required balances of heat, work and energy flow (sketched below). (homework, quiz, self-assessment)
  5. To be able to explain at a level understandable by a high school senior or non-technical person the concepts of path dependence/independence and reversibility/irreversibility of various thermodynamic processes, to represent these in terms of changes in thermodynamic state, and to cite examples of how these would impact the performance of aerospace power and propulsion systems. (homework, quiz, self-assessment)
  6. To be able to apply ideal cycle analysis to simple heat engine cycles to estimate thermal efficiency and work as a function of pressures and temperatures at various points in the cycle. (homework, self-assessment)
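
For reference, a minimal sketch of the relations these outcomes revolve around, in standard textbook notation (this is my summary, not an excerpt from the course notes; q and w are heat and work per unit mass, with work done by the system taken as positive):

```latex
% First Law for a closed system, per unit mass (outcomes 1-5):
\[ \Delta u = q - w \]
% Steady flow energy equation (outcome 4), with shaft work w_s,
% enthalpy h, velocity c, and elevation z:
\[ q - w_s = \Bigl(h_2 + \tfrac{c_2^2}{2} + g z_2\Bigr) - \Bigl(h_1 + \tfrac{c_1^2}{2} + g z_1\Bigr) \]
% Thermal efficiency of a heat engine cycle (outcomes 2 and 6):
\[ \eta = \frac{w_{\mathrm{net}}}{q_{\mathrm{in}}} = 1 - \frac{q_{\mathrm{out}}}{q_{\mathrm{in}}} \]
```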

II. General Comments on the Delivery of the Course Material

This is the fourth time that I have taught the thermodynamics section of Unified. I hand out lecture notes that contain all the information I would like the students to learn. The students appreciate these notes (see student feedback data). I also require that they purchase Van Wylen, Sonntag, and Borgnakke, which is the same book required in the follow-on course (16.050).

This semester I applied active learning in all of the lectures (muddiest-part-of-the-lecture cards, unannounced reading quizzes, and turn-to-your-partner activities in every class). I also asked the students to self-assess their learning at the end of the module. I found that teaching the recitation sections myself (instead of relying on a TA as I have in the past) gave me much more time to cover the material the way I want to cover it. I believe the overall product was better as a result.

The active learning questions tended to take about 10 minutes each. These are sketched into the margins of the prepared notes and written down on the hand-written lecture notes that I prepare for each class (Action: I need to organize these in a better fashion.).

III. Assessment Data

ASSESSMENT METHOD | FORMATIVE | SUMMATIVE
Muddiest Part of the Lecture | X |
Quiz | | X
Homework | X | X
Self-assessment | | X
Reading Quizzes | X | X
CEG and SGT Forms | other | other
Mid-term Student Feedback | other | other
Time Spent on Homework | other | other

A. Key Points from Muddiest Part of Lecture Cards

  1. Why p_external vs. p_system in the definition of work. There is a good explanation of this in Levenspiel. Action: Put the explanation into the prepared lecture notes.
  2. Physical meaning of enthalpy. Action: Clarify this in the notes.
  3. Isothermal vs. quasi-static, adiabatic processes. The turn-to-partner exercise on this is good. Action: Include this example in the notes next year.
  4. Shaft work vs. flow work. The sketch and explanation I do in class seem good. Action: Put this in the prepared lecture notes.
  5. Frame dependence of stagnation quantities. This isn’t even discussed in the prepared notes, and it is always a hurdle. Action: Put some examples and a discussion into the notes. (The relevant relations are sketched after this list.)
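
The standard relations behind mud items 1, 2, 4, and 5, sketched briefly (these are textbook definitions; the notation is mine rather than that of the prepared notes):

```latex
% Item 1: work is done against the external pressure; p_ext = p_system
% holds only for a quasi-static process:
\[ w = \int_{v_1}^{v_2} p_{\mathrm{ext}} \, dv \]
% Item 2: enthalpy is internal energy plus the flow work needed to push
% a unit mass across the system boundary:
\[ h = u + p v \]
% Item 4: in the SFEE the shaft work w_s appears explicitly, while the
% flow work pv is absorbed into the enthalpy:
\[ q - w_s = \Delta\Bigl(h + \tfrac{c^2}{2}\Bigr) \]
% Item 5: the stagnation temperature depends on the flow velocity c,
% and hence on the observer's reference frame:
\[ T_t = T + \frac{c^2}{2 c_p} \]
```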

B. Reading Quiz Performance

The reading quizzes were problematic for all of the Unified instructors who used them. We told the students that we would not test comprehension so much as whether or not they had read the material. The students, of course, optimized for grades, reading for memory instead of understanding. On the positive side, we did have strong attendance, and many in the class found the quizzes helpful (see midterm teaching technique survey data). On each of these quizzes 3/4 of the class scored above 50%, and the average for the reading quizzes was 6.8/10, demonstrating that for the most part they were reading and coming to lectures prepared. We ended up dropping these quizzes from Unified, in part to build good will with the students (we were subjecting them to many new teaching techniques and they were pretty good sports about it), and in part because we were having trouble writing questions that were understanding-neutral, since this is what we had promised (in the end I think this was the primary mistake).

C. Homework Performance

(Homework questions are appended at the back of the document.)
Means: 1 = 8.6, 2 = 9.2, 3 = 8.2, 4 = 6.6, 5 = 9.0, 6 = 7.5, 7 = 6.9, 8 = 7.5, 9 = 7.7, 10 = 8.3, 11 = 7.1, 12 = 6.6, 13 = 8.0. The average reported time spent on these was about an hour each (as designed), with the exception of problem 4, which averaged 1.5 hours (see the data below).

  1. Performance on Problem 4 suggests confusion between quasi-static expansion and free expansion.
  2. Performance on Problems 6 and 7 suggests that the students are having some difficulty with work and applications of the First Law.
  3. Performance on Problem 8 is a good measure of Measurable Outcome #6 (class average = 7.5).
  4. The performance on Problem 12 is disappointing; it asked students to explain reversibility and irreversibility, and they did very poorly. I think they had trouble with these concepts.

D. Student Self-assessment Results

Note: I added measurable outcome #3 after the class was over, so they didn’t get a chance to self-assess on it. By their own reckoning, they are poorest at applying the First Law. This, of course, is perhaps the most important outcome of the class. I am not sure how valuable this self-assessment activity was for me; I think it gave them an opportunity, however, to think about what they know and don’t know prior to taking the quiz. This is good.

E. Quiz Performance

Middle B was approximately 70/100. I collected A-, B-, and C-level examples of their work for reference.

  1. The students did not perform well on Q2, particularly parts c and d, which required applying the material to a more open-ended question. This was an SFEE question that I expected them to do better on.
  2. The students also did not perform particularly well on Q3a, where they were asked to represent a cycle on a thermodynamic diagram.

Below the data is sorted in terms of the measurable outcomes:

[Figure: quiz scores sorted by measurable outcome]

They did well on Bloom level #1 (MO#1) and on cycle analysis (MO#6). They did not do well with the steady flow energy equation (MO#4); the question required some modeling. They also did not do well modeling an energy exchange process (MO#3). They were unable to explain path dependence (MO#5); this was also true of irreversibility and reversibility on the homework (P12).

I think that these data suggest that their understanding of the concepts is not as deep as I would like. They have difficulty a) describing concepts in words, and b) applying the concepts to anything other than plug-and-chug problems.

F. CEG, SGT, and Mid-term Survey Results

[Figure: CEG and SGT survey results]

Generally favorable reviews from the students. They like the prepared lecture notes. The response to the various new teaching techniques that we introduced was also very favorable (see below).

[Figures 1 and 2: mid-term survey results on the new teaching techniques]

IV. Items to Improve for Next Time the Material is Taught

  1. Share this data with the students.
  2. Give them models for explaining concepts in words and force them to practice in class; they are not good at this.
  3. Continue to give them more open-ended problems so that they don’t fool themselves into thinking they understand the concepts just because they can work the equations.
  4. Make the modifications to the notes described in the discussion of the muddiest part of the lecture cards.
  5. Continue to teach all my own recitations.
  6. Move all of this onto a web-based platform like Steve is using.
  7. Continue to seek good demos (these are hard to find).
  8. Continue to really push on the active learning.

V. Who should see this Information

  1. Greitzer or equivalent 16.050 instructor
  2. Students from class (when they start 16.050 next year)
  3. Students in next Unified class
  4. Other folks in the propulsion division
  5. Other Unified instructors

Histogram for total grade

BIN | FREQUENCY
46 to 55 | 2
56 to 65 | 10
66 to 75 | 20
76 to 85 | 23
86 to 95 | 5
More | 0

Summary

I completed most of the actions outlined in my reflective memo from the last time I taught this material, implemented significant changes to my methods of teaching and made small changes to the content. All of this allowed me to collect what was for me an unprecedented amount of data. There are some overarching messages from this data. The most important message is that I believe that the new teaching methods I am trying and/or the increased energy I am putting into teaching are improving the students’ achievement of the subject learning objectives.

Subject Learning Objectives

The subject learning objectives are contained on the course web page. I do not have any changes to recommend to these.

Teaching Methods and the Learning Environment

I took several actions intended to improve learning. These are discussed in greater detail below. Implementation of all of these new elements produced an unsustainable workload for me. I am hopeful that the next time around things will be easier.

a) I put all my lecture notes on the web. Reviewing these notes will provide a good overview of both the subject content and the teaching methods.

b) I added sections to the notes for areas of difficulty identified in last year’s mud cards (enthalpy, shaft work vs. flow work, and frame dependence of stagnation quantities).

c) I developed 60 concept questions and active learning exercises for use in class (sprinkled throughout the notes; click on the “?” buttons).

d) I applied 41 concept questions and active learning exercises during the 12 formal lectures (I used the 13th lecture for review and student self-assessment).

e) I used the PRS system for 39 of the 41 concept questions and active learning exercises and thus have student data for each of these.

f) I implemented a formal self-assessment activity during the 13th lecture (I asked about 15 questions using the PRS system).

g) I responded to the mud cards the evening the lecture was delivered and put the responses up on the web. These responses were hyperlinked to the relevant areas of the online notes. See for example T6 mud responses.

h) I wrote short assessments of each lecture (how they went). See for example T3 mud responses.

i) I changed the grading criteria to provide for 10% credit for the in-class activities.

j) All of this material was integrated into a web page that includes the course notes, the PRS questions asked in class, the answers to the PRS questions asked in class, the homework questions, the homework solutions, the quizzes and solutions, and the muddiest part of the lecture questions and answers. These are all organized in a chronological order aligned with the content in the course.

k) I posted last year’s reflective memo for this year’s and last year’s students to review online.

l) I led the implementation of a web-based subject evaluation process; we used the same process to deliver a web-based math and physics skills diagnostic early in the semester.

m) I did one new demonstration/in-class experiment. For a more detailed discussion, follow the links underneath the “A” answer button.

Basis of Overall Subject Grade

10% Class participation as described in the course notes, 30% homework, 60% quizzes.

Assessment Methods

The use of the PRS system in-class and for the self-assessment activity and the collection of, and response to, all of the mud cards gave me a large amount of data on the class performance (formative and summative). Also, as in the past in Unified, we collected data on time spent on various activities (its primary use is to make sure the class stays within bounds). Links to each of the following data sources as well as to data plots and correlations are contained in the appendices at the end of this memo.

DATA SOURCE | FORMATIVE | SUMMATIVE | FEEDBACK FOR ME
PRS System | X | | X
Time spent | X | | X
Self-assessment | X | | X
Muddiest Part of the Lecture | X | | X
Web-usage data | | | X
Homework | X | X | X
Quizzes | | X | X
Mid-term evaluation data | | | X
End-of-term SEF data | | | X
Class attendance | | | X

Some observations and lessons

a) Sometimes having data can be a little depressing, but it is always valuable. For example, the first PRS question I asked had to do with identifying kW-hr as a unit of energy, and only 30% of the class answered it correctly (correct student responses to PRS). But if I hadn’t known this, I would have launched off without mentioning units. (The unit check itself is sketched at the end of this list.)

b) I observed that I did not teach the material in a simple pro-rated manner in terms of time devoted to different subjects. In the past I have chugged through the notes at a roughly constant pace, independent of the level of conceptual difficulty. This year, for example, I spent three lectures on the concept of frame dependence of stagnation quantities alone, but very few lectures on the various forms of the First Law. Thus the teaching was tailored to the needs of the class, rather than based on how many words are devoted to each topic in my notes.

c) My personal assessment is that this class learned more (and deeper) than previous classes I have taught. While this is not based on a lot of hard data, there is some supporting evidence.

First, relative to last year, the class felt more confident in their ability to demonstrate the subject learning objectives (based on a comparison of this year’s self-assessment results (six pages) to last year’s). One caveat to using this information to support my feeling that student achievement increased is that there is no strong correlation between self-assessment and homework (several pages), quiz (several pages), or in-class question (several pages) scores. So even if they feel more confident, it doesn’t necessarily imply they learned more. Potential reasons for the lack of correlation are i) the quizzes aren’t particularly good tests of their understanding, ii) the students are not good at assessing their understanding, or iii) the sample size is too small. Relative to the latter, note also that there is no correlation between time spent and performance (several pages), or between class participation and scores on homework or quizzes, when considering the thermodynamics scores alone. However, positive correlations are apparent between class participation and quiz scores, and between homework and quiz scores, when the data are compiled for all of the subjects in Unified (other correlation plots can be accessed below).

Second, one of the shortcomings apparent in the summative quiz assessment from last year was that the students had difficulty solving open-ended problems. They improved in this regard this year even though I gave them a harder problem on the quiz. (Last year’s question was “Describe energy exchange processes for an object entering the atmosphere from space”; the class average was 67%. This year I asked a problem about energy exchange during the turbine failure that the ABB power plant experienced; the class average was 74%.) Again, due to the small sample size (a small number of quiz questions, and the fact that the question was not the same year to year) it is not possible to be definitive.

d) Considering the measurable outcomes independently (see data plots of quiz, PRS, self-assessment, and homework scores organized by measurable outcome), the students performed well on all outcomes except for #4, application of the First Law. Understanding and using the First Law is the core of the material I teach in this subject so this is a little disappointing. Note that both the average quiz performance and the self-assessment results for measurable outcome #4 suggest weakness in this area, but on a student-by-student basis, no such correlation exists (see page 3 of the quiz scores vs. self-assessment correlations). There is no deficiency apparent in the homework and PRS question scores for this learning objective. Again, given the small amount of data (2 quiz questions), it is hard to be definitive. However, it does support my concern that the students in Unified get relatively less time to practice the concepts (one hour of homework for each lecture hour, whereas in other MIT subjects students get as many as three hours for each hour of lecture). This underscores the importance of making the lectures in Unified an important learning experience (the active learning and the PRS system are valuable in this regard).

e) The muddiest parts are tabulated in the web page index. From among these the worst areas of confusion were:

thermodynamic equilibrium, unsteady versus quasi-static, for example

applicability of cv and cp, for example

static and stagnation quantities, see here and here for example

f) The students still have difficulty explaining concepts in words, but this ability improved relative to last year, comparing performance on problem 13 (class average = 83%), which was the same as last year’s problem 12 (class average = 66%). Also, after they did a poor job explaining in words what the First Law means on quiz 1 (class average = 76% for this question), I asked the same question again on the second quiz and they improved significantly (class average = 92%).

g) Because of the enhanced feedback with the class, I was able to complete the material in about 8% less time than it took last year, leaving the entire last lecture open for review and self-assessment activities.

h) The students were terrific in participating in the experiments with the PRS system, and they believe it was very effective in improving their achievement of the class learning objectives (see mid-term and end-of-term (histograms, text) evaluation data). A large fraction of the students also felt the web-based mud responses were valuable (see midterm evaluation data and student input on web-usage (multiple pages)).

i) I did not do reading quizzes or assignments (I just couldn’t get it done in the time available). I think that forcing the students to come to class prepared would improve their learning. They feel differently (see page 3 of the student input file).

j) I had more fun teaching this way (i.e. with the PRS system) than I have ever had before.

k) The time required to implement all of these changes to my teaching style was excessive. Steve and I regularly found ourselves dueling between 12am and 2am to see who could finish their mud responses first. I am hopeful that technology can come to the rescue (see below) and make responding to the questions easier. If not, I will probably only respond to the top few questions rather than responding to them all.
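
Returning to item a): the unit check that question called for is a single line, assuming the standard definition of the kilowatt-hour:

```latex
% Power times time is energy:
\[ 1\ \mathrm{kW{\cdot}hr} = 10^3\ \tfrac{\mathrm{J}}{\mathrm{s}} \times 3600\ \mathrm{s} = 3.6 \times 10^6\ \mathrm{J} \]
```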

Items to improve for next time

a) Try some of the PRS question techniques that Steve used (like asking the students how confident they are in a calculation/problem solution and then asking them to self-report whether they got it correct).

b) Organize the mud responses into a keyword-searchable database and work with CAES and EMCC to develop a self-organizing FAQ (frequently asked questions) database generation process for the future.

c) Two muddiest-of-the-mud topics showed up again this year after being similarly identified last year: thermodynamic equilibrium and static vs. stagnation quantities. On the first (equilibrium), I didn’t take any action last time around, but in responding to all of the mud cards this year I think I developed some better explanations (actually, the best one I stole from Ed G.) which should be put into the notes. On the second (stagnation quantities), I added examples and more discussion to my notes, used several PRS questions, and spent 2.5 lectures on it, and it was still rough for the students. As a result of this attention I think they ended up understanding it better, but it took a lot of class time. Perhaps in the future this would be a good place for a physical demonstration (a worked example of the frame dependence is sketched at the end of this list).

d) Formulate a “mud taxonomy” to better understand and categorize the types of questions students ask. Then use this to better target teaching techniques.

e) Implement reading assignments to improve student preparation for the lectures.

f) Ask Greitzer to have the TA who grades his diagnostic (given the first day in 16.050) also grade samples from the previous year, to remove year-to-year grading biases and thus give me better input on how well the students are doing relative to the Unified learning objectives.

g) Try it all again and see if the workload goes down.

h) Develop a similar set of materials and apply similar techniques for the spring term propulsion lectures in Unified.
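
As a worked instance of the frame dependence flagged in item c), here is a standard illustration (not necessarily the example used in the notes): the static temperature is the same for all observers, but the stagnation temperature is not.

```latex
% Air at T = 300 K moving at c = 200 m/s relative to a ground observer,
% taking c_p = 1004 J/(kg K):
\[ T_t = T + \frac{c^2}{2 c_p} = 300 + \frac{200^2}{2 \times 1004} \approx 320\ \mathrm{K} \]
% For an observer moving with the flow, c = 0 and T_t = T = 300 K:
% the same gas in the same state has a different stagnation temperature.
```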

People who should see this reflective memo

a) Greitzer or equivalent 16.050 instructor

b) Students from this class

c) Students who will take the class next year

d) Other members of the propulsion division

e) Other members of the Unified staff

f) Brodeur, Soderholm and others working to improve the Department learning environment

Appendices

A. Data Sources

  • Homeworks and solutions are interspersed through the notes (click on the “HW” buttons).
  • Quiz 1 and Quiz 2
  • The muddiest part of the lecture responses, tabulated in the web page index (the worst areas of confusion are listed in the observations above)
  • Student self-assessment
  • Midterm Review Data: Histograms, PRS text comments, mud cards text comments, Best parts text comments, Needs improvement text comments, Other text comments
  • End-of-term Review Data: Histograms, text comments
  • Web usage input from students

B. Data Plots for Thermodynamics (note some have multiple pages)

  • PRS scores (organized by problem)
  • PRS scores (organized by measurable outcome)
  • Homework Scores (organized by problem)
  • Homework Scores (organized by measurable outcome)
  • Quiz Scores (organized by problem)
  • Quiz Scores (organized by measurable outcome)
  • Time Spent (organized by homework problem)
  • Self-assessment (organized by measurable outcome), alternate

C. Correlations for Thermodynamics (note some have multiple pages)

  • Quiz scores versus homework scores
  • Quiz scores versus PRS scores
  • Quiz scores versus self-assessment
  • PRS scores versus self-assessment
  • Homework scores versus self-assessment
  • Homework scores versus PRS scores
  • Performance versus attendance
  • Performance versus time spent

D. Correlations for Unified as a whole

  • Overall quiz average versus study time
  • Overall quiz average versus homework average
  • Overall quiz average versus class participation
  • Overall class grade (minus participation grade) versus participation

Summary

I completed most of the actions outlined in my reflective memo from the last time I taught this material (two years ago). I continued to use various active learning techniques (PRS + concept tests) and forms of regular feedback (e.g. mud cards, concept test results, etc.). Based on student performance and subject evaluations, I believe that most of the students met most of the learning objectives. Specific exceptions include their ability to describe the assumptions that underlie some of the engineering models applied, and some lack of ability in applying the concepts to situations they haven’t seen before. The first deficiency I can address by stressing the assumptions more when presenting the material to the class (ACTION: Emphasize this more the next time I teach the material). The second deficiency is a result of the small amount of practice (one hour of homework for each hour of lecture) afforded the students in Unified. I think some shift in the balance between contact hours and homework hours in Unified would be beneficial (ACTION: Propose such a shift to the Unified faculty).

Learning Objectives

1. What are the learning objectives (expressed as measurable outcomes) for this subject?

The subject learning objectives are contained on the course web page. I do not have any changes to recommend to these.

2. To what extent were you able to integrate the CDIO skills specified for this subject in the Curriculum Plan of 2002?

I did not specifically implement any CDIO syllabus items since these are largely covered in the systems portion of Unified. However, several of the in-class experiments I did required 2.1.2 Modeling and 2.1.3 Estimation and Qualitative Analysis; see item 3(e) below. I would say both skills were taught at the “Introduce” level.

Teaching Methods

3. What teaching methods did you use and what evidence indicates these methods were successful or not?

a) Prepared lecture notes were available on the web for all of the material. These notes have evolved over several years starting with a set of handwritten lecture notes. Each year I try to augment them when I find specific areas of difficulty from mud responses, etc. I am quite happy with them at this point. In the end-of-term evaluations 90% of the respondents rated the web page (for all of Unified) Somewhat Effective or Very Effective. 97% of the respondents rated the prepared lecture notes (for all of Unified) Somewhat Effective or Very Effective.

b) I used 27 concept questions over the 13 lectures with responses taken on the PRS system. The performance and answers to these were provided on the web page. I continue to find these to be very useful for engaging the class in the material while I am lecturing. 95% of the respondents on the SEF rated the in-class exercises as Very Effective or Somewhat Effective. Also several positive comments were made about the PRS/concept questions in the written comments from the end-of-term evaluations. In general my teaching reviews were good, so I think the students found my lectures to be helpful to them.

c) I used mud cards for each lecture, responded to them the evening the lecture was delivered, and put the responses up on the web. These responses were linked to the relevant areas of the online notes. See for example T1 mud responses. 81% of the respondents on the end-of-term evaluations said the mud cards were Very Effective or Somewhat Effective, though the majority found them only Somewhat Effective. Nonetheless, I still found the mud cards valuable for providing feedback on each lecture. Even in cases where there were very few respondents it was helpful, therefore I will continue to use these.

d) I wrote short assessments of each lecture (how they went). See for example T3 mud responses. This was mostly helpful for me, although I did use it to stress the important points from the lecture. I am not sure how many students read these responses. In general, I think that we have saturated the students in terms of available material on the web. Further information goes un-read.

e) I did 4 small demonstrations/in-class experiments. The students seemed to like these activities since they allowed them to apply the subject material to a real problem. They all were of the form where I asked the students to estimate something (using the concepts in class), then did the experiment and then discussed the result in light of their estimates. The activities thus had three primary objectives: to engage the students in the material we were working on and show them how to apply it, to highlight the various simplifications and assumptions in the engineering models we use, and to give the students practice in estimating various parameters required for input to the models (e.g. the volume of the room, or the weight of something, etc.)

f) I had a formal self-assessment activity during the last recitation. I asked the students to grade themselves on the subject learning objectives. I did this largely as an effort to get them to think more deeply about the success criteria in advance of the exam, since based on past history their responses don’t correlate well with their exam performance.


Assessment Methods

4. How was each subject learning objective assessed and what evidence indicates students achieved these outcomes?

I used homeworks and exams to assess student learning. Each of the homework problems and exams was coded to specific subject learning outcomes. The overall weighting was 10% class participation, 30% homework, 60% quizzes. Also, the use of the PRS system in class and for the self-assessment activity, and the collection of, and response to, all of the mud cards gave me a large amount of data on the class performance (formative and summative). Also, as in the past in Unified, we collected data on time spent on various activities (its primary use is to make sure the class stays within bounds).

DATA SOURCE | FORMATIVE | SUMMATIVE | FEEDBACK FOR ME
PRS System | X | | X
Time spent | X | | X
Self-assessment | X | | X
Muddiest Part of the Lecture | X | | X
Homework | X | X | X
Quizzes | | X | X
End-of-term SEF data | | | X
Class attendance | | | X

The performance on the homework and the time spent were both good, with students on average performing well. The performance on the quiz is shown below with each question labeled in terms of the learning objectives that were addressed. Middle-B performance for the quiz was 79%, suggesting the class on average was about half a grade below this standard. The lowest performance was on the second question. This was a conceptual question that tested their understanding of the assumptions underlying the engineering models they were applying. I mentioned the specific concept tested in this question in only one class, and then rather briefly. Note, however, that it was discussed at some length in the mud responses (I am not sure how many students read these). ACTION: Next time I teach this, I need to give this and other modeling assumptions more time in lecture. The second-lowest score was on the fourth problem, which required application of the steady flow energy equation. The students are historically weak in applying the concepts to situations they haven’t seen before (such as this one, whereas problem 3 was basically a cookie-cutter repeat of a problem they had seen in various forms before). I attribute this in part to the small amount of homework time allowed in Unified for each hour of lecture material (1:1). See the ACTION below.

| PROBLEM 1 | PROBLEM 2 | PROBLEM 3 | PROBLEM 4 | QUIZ TOTAL
LEARNING OBJECTIVES | #1 | #4, #5, #6 | #4, #6 | #4, #5, #2 |
Mean | 88% | 58% | 88% | 63% | 74%
Standard Deviation | 18% | 35% | 13% | 27% | 16%
Weighting | 10% | 10% | 36% | 44% | 100%

The learning objectives that were not addressed on the quiz (e.g. learning objective 3) were addressed in the homework. The performance on the homeworks was quite acceptable and consistent with historical performance.

Continuous Improvement

5. What did you learn about your teaching and assessment methods this semester?

The most important thing I learned is that teaching with all of these techniques does not take me any more time than the old-fashioned chalk-and-talk ONCE THE MATERIALS ARE DEVELOPED. The first time through took a tremendous amount of time, but this time wasn’t that bad. It was enjoyable. I made a decision this year to use old homework problems rather than develop them from scratch. This was also beneficial because I was able to pick some of the best problems. Previously when I forced myself to generate new problems I produced many that either had errors, were difficult to understand, didn’t stress the key concepts, were too long, etc. Based on my interactions with the students during office hours, most of them appeared to be doing the homeworks without reference to the old bibles, so I will continue to do this. Unfortunately, I will have to continue to generate new exam questions which is always a challenge.

6. How do you use feedback from students and colleagues to improve your subject?

The most valuable feedback I get comes in the form of class performance on the PRS questions. I use this in real time to modify my lectures. Second most valuable is the feedback on the mud cards which I use to modify the next lecture and as a way of updating the course notes (I typically add sections in areas where I get a lot of mud). Reference to the last reflective memo I wrote will show that I followed through on most of the items that I listed as areas for improvement. The muddiest parts of the subject were similar to past years (unsteady versus quasi-static, and static and stagnation quantities) despite the additions to the notes and modification to my lecture strategy, however I think these topics went a little better this year (maybe).

7. What will you continue or change?

I will continue to apply the suite of teaching techniques I am using now.

Summary of recommended ACTIONS

a) Improve my presentation of quasi-static versus unsteady processes and my presentation of static and stagnation quantities (metrics: mud responses and performance on quiz)

b) Underscore many of the assumptions inherent in the models applied (metrics: additions to online notes and lectures, improved student performance on quiz)

c) Propose that the Unified faculty shift the course structure to enable more homework hours for each hour of lecture (metric: change in course structure)

Information Sharing

8. To whom have you forwarded this reflective memo?

a) Zolti Spakovszky (16.050 instructor)

b) Students from this class

c) Students who will take the class next year

d) Other members of the Unified staff

Summary

I completed the actions outlined in my reflective memo from last year. I continued to use various active learning techniques (PRS + concept tests) and forms of regular feedback (e.g. mud cards, concept test results, etc.). One new addition was a quiz recap directly following the completion of the quiz. Based on student performance and subject evaluations, I believe that most of the students met most of the learning objectives. One exception is their understanding of the frame dependence of stagnation quantities, which remains inadequate. This is a long-standing challenge.

Learning Objectives

1. What are the learning objectives (expressed as measurable outcomes) for this subject?

The subject learning objectives are contained on the course Web page. I do not have any changes to recommend to these.

2. To what extent were you able to integrate the CDIO skills specified for this subject in the Curriculum Plan of 2002?

I did not specifically implement any CDIO syllabus items since these are largely covered in the systems portion of Unified. However, several of the in-class experiments I did required 2.1.2 Modeling and 2.1.3 Estimation and Qualitative Analysis; see item 3(e) below. I would say both skills were taught at the “Introduce” level.

Teaching and Assessment Methods

3. What teaching and assessment methods did you use and what evidence indicates these methods were successful or not?

  • Prepared lecture notes were available on the Web for all of the material. These notes have evolved over several years, starting with a set of handwritten lecture notes. Each year I augment them when I find specific areas of difficulty from mud responses, etc. This year, I worked with Robin to increase the font size for the embedded equations so they are more readable. I also added to the discussions regarding cp & cv and frame dependence of stagnation quantities. In general I am quite happy with the notes. In the end-of-term evaluations (PDF) 97% of the respondents rated the Web page (for all of Unified) Very Effective (70%) or Somewhat Effective (27%). 100% of the respondents rated the prepared lecture notes (for all of Unified) Very Effective (88%) or Somewhat Effective (12%).

  • I used 25 concept questions over the 12 lectures with responses taken on the PRS system. The performance and answers to these were provided on the Web page. I continue to find these to be very useful for engaging the class in the material while I am lecturing. 100% of the respondents on the SEF rated the lectures Very Effective (64%) or Somewhat Effective (36%). 95% of the respondents on the SEF rated the in-class exercises as Very Effective (53%) or Somewhat Effective (43%). Also several positive comments were made about the PRS/concept questions in the written comments (PDF) from the end-of-term evaluations. In general my teaching reviews were good, so I think the students found my lectures to be helpful to them.

  • I used mud cards for each lecture, responded to them the evening the lecture was delivered, and put the responses up on the Web. These responses were linked to the relevant areas of the online notes. See for example T1 mud responses. 82% of the respondents on the end-of-term evaluations said the mud cards were Very Effective (30%) or Somewhat Effective (53%), though the majority found them only Somewhat Effective. Nonetheless, I still found the mud cards valuable for providing feedback on each lecture. Even in cases where there were very few respondents it was helpful, therefore I will continue to use these.

  • I wrote short assessments of each lecture (how they went). See for example T3 mud responses. This was mostly helpful for me, although I did use it to stress the important points from the lecture. I am not sure how many students read these responses. In general, I think that we have saturated the students in terms of available material on the Web. Further information goes un-read.

  • I did 4 small demonstrations/in-class experiments. The students seemed to like these activities since they allowed them to apply the subject material to a real problem. They all were of the form where I asked the students to estimate something (using the concepts in class), then did the experiment and then discussed the result in light of their estimates. The activities thus had three primary objectives: to engage the students in the material we were working on and show them how to apply it, to highlight the various simplifications and assumptions in the engineering models we use, and to give the students practice in estimating various parameters required for input to the models (e.g. the volume of the room, or the weight of something, etc.)

  • I had a formal self-assessment activity during the last recitation. I asked the students to grade themselves on the subject learning objectives. I did this largely as an effort to get them to think more deeply about the success criteria in advance of the exam, since based on past history their responses don’t correlate well with their exam performance.

  • I wheeled the CFM56 into class for one of the lectures. The students very much enjoyed this. I find that it prompts a degree of interest and a depth of questioning that I do not get otherwise.

  • I used homeworks and exams to assess student learning. Each of the homework problems and exams was coded to specific subject learning outcomes. The overall weighting was 10% class participation, 30% homework, 60% quizzes. Also, the use of the PRS system in-class and for the self-assessment activity and the collection of, and response to, all of the mud cards gave me a large amount of data on the class performance. Also, as in the past in Unified, we collected data on time spent on various activities (its primary use is to make sure the class stays within bounds).

Data Source | Formative | Summative | Feedback for Me
PRS System | X | | X
Time spent | X | | X
Self-assessment | X | | X
Muddiest Part of the Lecture | X | | X
Homework | X | X | X
Quizzes | | X | X
End-of-term SEF data | | | X
Class attendance | | | X

Student Learning

4. How well did the students perform on each subject learning objective? (Where possible, make reference to specific data to support your conclusion.)

The performance on the homework (see plot below) and the time spent (see plot below) were both good with students on average performing well. The performance on the quiz is shown in the table below with each question labeled in terms of the learning objectives that were addressed. Middle-B performance for the quiz was 78%, suggesting the class on average (75.4%) was close to this standard. More than half of the points on the quiz were devoted to assessing conceptual understanding (versus the mechanics of solving a problem), consistent with the learning objectives.

  1. To be able to state the First Law and to define heat, work, thermal efficiency and the difference between various forms of energy.
    As evidenced by the performance on questions 1e, 1f, and 1h, most of the students achieved this learning objective. One deficiency surfaced with flow work and shaft work (1d), but I think if I had directly asked what these forms of work were, the students would have done well. This particular question asked for an extension of these ideas and application to a cycle.
  2. To be able to identify and describe energy exchange processes (in terms of various forms of energy, heat and work) in aerospace systems.
    To be able to explain at a level understandable by a high school senior or non-technical person how various heat engines work (e.g. a refrigerator, an IC engine, a jet engine).
    Both of these learning objectives were well-addressed in homework #1 and on quiz question #2. I was pleased with the student performance in this area. In particular, they took home the message (delivered in lecture and recitations) that they were required to describe the workings of heat engines in terms of energy. Reading the answers on the quiz (from most but not all students) was a pleasant surprise.
  3. To be able to apply the steady-flow energy equation or the First Law of Thermodynamics to a system of thermodynamic components (heaters, coolers, pumps, turbines, pistons, etc.) to estimate required balances of heat, work and energy flow.
    Historically the students do very well on this learning objective since it stresses the mechanical elements of solving a thermodynamics problem, versus conceptual understanding. The very good performance on quiz problem 3 is evidence of this. There was, however, a weakness with the frame dependence of stagnation quantities as seen on problem 4. Due to having one less lecture this year (it was replaced with the review session after the quiz), I had to spend a little less time on this topic. It is always the most difficult topic for the students conceptually anyway, and their performance suffered.
  4. To be able to explain at a level understandable by a high school senior or non-technical person the concepts of path dependence/independence and reversibility/irreversibility of various thermodynamic processes, to represent these in terms of changes in thermodynamic state, and to cite examples of how these would impact the performance of aerospace power and propulsion systems.
    The students performed well in this area as evidenced by quiz questions 1a, 1b, 1c, and 1g. Indeed, the biggest change from last year was the performance in understanding the requirements for and implications of quasi-equilibrium. This was the weakest part of the quiz performance last year, so I focused on it more in lecture, to good effect I think.
  5. To be able to apply ideal cycle analysis to simple heat engine cycles to estimate thermal efficiency and work as a function of pressures and temperatures at various points in the cycle.
    Their homework performance and quiz performance (3a, 3b, 3c) on this learning objective were very good.

Quiz | 1a | 1b | 1c | 1d | 1e | 1f | 1g | 1h | 2 | 3a | 3b | 3c | 4a | 4b | Total
LO# | 4,5 | 4,5 | 4,5 | 1,4 | 1,4 | 1,4 | 5 | 1 | 2,3 | 4,6 | 4,6 | 4,6 | 4 | 4 | na
Mean | 95% | 82% | 78% | 41% | 84% | 85% | 85% | 93% | 75% | 98% | 94% | 80% | 63% | 35% | 75.4%
Weight | 5% | 5% | 5% | 5% | 5% | 5% | 5% | 5% | 16% | 6% | 12% | 6% | 8% | 12% | 100%

Continuous Improvement

5. What actions did you take this semester to improve the subject as a result of previous reflections or input from students or colleagues?

  • The major action was introducing a quiz recap/discussion directly following completion of the quiz, under the presumption that this was a “teachable moment”. Most of the students in the class found this helpful, as noted in the responses to the PRS question “I found the quiz review after the exams helpful, and think they are a better use of time than adding another lecture; 1=strongly agree, 2=agree, 3=neutral, 4=disagree, 5=strongly disagree” shown below. I too found this to be a good addition, worth the trade of an additional lecture hour. It enabled me to explain how the quiz was designed, to address questions, and to discuss scoring rubrics.
  • I improved my delivery of quasi-equilibrium processes (evidenced on the exam performance and mud responses) by focusing on the physical implications and physical examples. The students had a much easier time understanding the time to reach equilibrium when applied to a coffee cup.
  • I worked with Robin to increase the font size for the embedded equations in the Web notes so they are more readable.
  • I also added new discussions regarding cp & cv and frame dependence of stagnation quantities based on previous years’ mud responses.
  • I improved my presentation of cv & cp, in particular the implications of their being functions of temperature only for ideal gases. I did this with reference to doing an experiment; the physical example seemed easier for the students to grasp. I have added this example to the notes.

6. How do you use feedback from students and colleagues to improve your subject?

The most valuable feedback I get comes in the form of class performance on the PRS questions. I use this in real time to modify my lectures. Second most valuable is the feedback on the mud cards which I use to modify the next lecture and as a way of updating the course notes.

7. What will you continue or change?

  • I will continue to use the suite of teaching and assessment methods.
  • I will continue to schedule a quiz recap directly following the quiz.
  • I will add some additional discussions to the notes focusing on the physical implications of quasi-equilibrium processes. This went well this year and I need to capture it in the notes.
  • I will continue to seek a new strategy for discussing the frame dependence of stagnation quantities. I will schedule a meeting with Darmofal and Greitzer to see if we can brainstorm some ideas.

Information Sharing

8. To whom have you forwarded this reflective memo?

  • Prof. Zoltan Spakovszky (16.050 instructor)
  • Students from this class
  • Students who will take the class next year
  • Other members of the Unified staff

The following tables contain thermodynamics and propulsion concept questions from current and previous semesters.

16.01-16.02

LEC # | TOPICS | CONCEPT QUESTIONS
T1 | Course Introduction and Thermodynamic Concepts | Q2.1 (PDF), Q2.4 (PDF)
T2 | Changing the State of a System with Heat and Work | Q2.2 (PDF), Q2.3 (PDF), Q2.5 (PDF), Q2.6 (PDF), Q3.3 (PDF)
T3 | Finishing State Changes, Starting First Law | Q3.2 (PDF), Q3.5 (PDF), Q4.1 (PDF)
T4 | First Law | Q4.2 (PDF), Q4.4 (PDF), Q4.5 (PDF), Q4.7 (PDF), Q4.12 (PDF), Q4.13 (PDF), Q4.15 (PDF)
T5 | First Law, Enthalpy, Specific Heats | Q4.11 (PDF), Q4.14 (PDF), Q4.8 (PDF)
T6 | First Law, Enthalpy, Specific Heats, Introduction to Heat Engines | Q5.3 (PDF), Q5.5 (PDF), Q5.7 (PDF)
T7 | Heat Engines (cont.) | Q5.4 (PDF)
T8 | Steady Flow Energy Equation | Q6.1 (PDF), Q6.2 (PDF), Q6.3 (PDF), Q6.4 (PDF)
T9 | Shaft Work and Flow Work | Q6.9 (PDF), Q6.6 (PDF), Q6.7 (PDF), Q6.12 (PDF)
T10 | Stagnation Quantities | Q6.11 (PDF), Q6.19 (PDF), Q6.14 (PDF), Q6.18 (PDF)
T11 | Reversible and Irreversible Processes | Q7.2 (PDF), Q7.4 (PDF), Q7.9 (PDF), Q7.10 (PDF)
T12 | Entropy | Q7.1 (PDF), Q7.3 (PDF)

 

16.03-16.04

LEC # | TOPICS | CONCEPT QUESTIONS
P1 | Introduction to Propulsion and the Integral Momentum Equation | Q1 (PDF), Q2 (PDF)
P2 | Integral Momentum Equation | Q3 (PDF), Q6 (PDF), Q9 (PDF)
P3 | Applications of the Momentum Theorem, Definition of Efficiencies | Q8 (PDF), Q10 (PDF), P3 Lecture Slides (PDF)
P4 | Aircraft Performance and Mission Requirements, Relationship with Propulsion | Q13 (PDF), Q14 (PDF), Wing loading slide (PDF)
P5 | Rocket Performance | Q15 (PDF), Q16 (PDF), Q17 (PDF)
P6 | Rocket Performance and Connection to Engine Design Parameters | Q19 (PDF), Q21 (PDF)
P7 | Cycle Analysis of a Gas Turbine Engine | Q22 (PDF), Q18 (PDF)
P8 | Energy Exchange with Moving Blades | Q31 (PDF), Q24 (PDF)
P9 | Energy Exchange with Moving Blades (cont.) | Q25 (PDF), Q30 (PDF)