I teach physics and grade with a standards-based grading system. We’ve been formatively assessing (and reassessing) all semester. In my view, the purpose of a summative assessment like the semester exam is to show (a) sustained mastery of the skills taught in the class and (b) synthesis and depth of understanding (that is, knowing both when and how to use each skill in a larger context).
My flavor of SBG
We (the physics teachers here) split the skills that we teach into two categories. “A objectives” represent the most basic skills of the course; they often involve modeling a situation using diagrams. In order to pass the class (have a grade >= 70), students must demonstrate mastery on every A objective. “B objectives” are more complex skills and often involve solving problems. In order to earn a grade above 90, students must demonstrate mastery on every B objective.
Our system is essentially all-or-nothing. Throughout the semester, students received feedback on tests by getting a 0, 1, or 2 on each objective. 0 and 1 both mean “no” (but a 1 indicates developing mastery while a 0 indicates no mastery). 2 means yes. In the first semester, most reassessments were student-initiated, and most students took advantage of the breakfast and dinner time (boarding school) to do them. There were some in-class reassessments (both teacher and student initiated) as the semester drew to a close.
Format of the exam
My honors and “normal” classes had fairly similar formats, so I’ll mainly talk about the normal classes’ exam here. The exam was split into two sections. The first section (about 45 minutes’ worth) consisted of short, structured questions that aimed to isolate and assess all 15 of the first-semester B objectives. The second section was a selection of five goal-less problems, from which students chose two. Students had up to three hours to complete the test (I meant for them to finish in about 1.5 to 2 hours).
Goal-less problems were practiced throughout the year. They give the set-up of a situation, but they don’t ask any specific question. The general approach to these problems is supposed to be:
- Say which models apply and why
- Start to model the situation by drawing as many of the appropriate diagrams/graphs as you can
- Use your diagrams/graphs to do calculations and find as many quantities as possible
Here’s an example of a goal-less problem from our conservation of energy unit:
Mapping objectives to a final numerical grade
Of course, at the end of all of that demonstrating, I need to come up with a number on a scale (0 to 100) that is more precise than the instrument with which I’m taking the measurement. So. How do you go from scores on objectives to a number grade?
We decided that it wasn’t fair to “lose” an A objective on the exam (that is, to think you are passing the class before walking into the test and leave failing it). We also didn’t want to use one data point to completely determine the final grade. Any grade above 85 is supposed to be considered an “honors” grade at our school, and we decided that you couldn’t have done honors work without being able to show that level of mastery on the exam. Considering all of that, we came up with a pre-exam grade range for each kid that told them the minimum grade (something generally between 70 and 84) and the maximum grade (100 for everyone) that they would be getting for the semester. The only way they might get a grade below 70 was if they hadn’t shown mastery on every A objective before the exam (only one student had any A’s left, and he did show it on the exam, so everyone passed).
For each exam, I took a list of the B objectives and marked it up. I highlighted the ones they hadn’t shown before the exam, and I made + or – marks next to each based on their work on the exam. For any B that had any – marks, I looked back at their work and decided whether the + work outweighed the – work (for example: if the – had come from only a unit or calculation error rather than a major conceptual error). For any that they were still missing, I went back and looked at their history on that objective (I’ve been using https://activegrade.appspot.com/ since the start of the 2nd quarter) and decided whether this new evidence was enough to outweigh their history (assuming they had shown fairly consistent mastery before the exam). In the end, I circled any B objectives that they still didn’t have.
I used a simple linear interpolation to determine a grade between 70 and 90 based on the number of B objectives they had shown. If that number was less than their minimum grade (this applied to only 3 of my 42 students), then their grade for the semester was simply that minimum number.
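To make the arithmetic concrete, here’s a minimal sketch of that mapping in Python. It assumes the interpolation runs linearly from 70 (no B objectives shown) to 90 (all 15 shown), with the pre-exam minimum acting as a floor; the function name and the rounding are just illustrative choices, not anything we actually used in class.

```python
def semester_grade(b_objectives_shown, total_b_objectives=15, pre_exam_minimum=70):
    """Sketch of the 70-to-90 mapping: linear in the number of B objectives
    shown, floored at the student's pre-exam minimum grade."""
    interpolated = 70 + 20 * (b_objectives_shown / total_b_objectives)
    return max(round(interpolated), pre_exam_minimum)

# Example: 12 of 15 B objectives with a pre-exam minimum of 78 gives 86,
# since the interpolated value beats the floor.
print(semester_grade(12, pre_exam_minimum=78))  # 86
```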
The 90 to 100 range was determined through their work on the goal-less problems. We had a point system (not grade points, simply a way of quantifying what they had done on the goal-less problems without respect to any grade value) to judge the work that they had done.
- Stating a model that applies AND WHY IT APPLIES = 1 pt
- Drawing a diagram = 3 pts
  - annotating the diagram = +1 pt
  - using the diagram to calculate something = +2 pts
- Stating a fundamental principle before you use it = 1 pt
- Doing a calculation = 1 pt
  - an especially clever or complicated calculation = +1 pt
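To see how those points add up in practice, here’s a quick sketch that tallies the rubric for a single solution. The counts and the function itself are just an illustration (I’m reading the bonuses as stacking on top of the base points), not a tool we actually used.

```python
def score_goalless_solution(models_stated=0, diagrams=0, annotated_diagrams=0,
                            diagrams_used_for_calcs=0, principles_stated=0,
                            calculations=0, clever_calculations=0):
    """Tally rubric points for one goal-less problem solution.
    Each argument counts how many times that rubric item was earned."""
    return (1 * models_stated              # stating a model AND why it applies
            + 3 * diagrams                 # each diagram drawn
            + 1 * annotated_diagrams       # bonus for annotating a diagram
            + 2 * diagrams_used_for_calcs  # bonus for using a diagram in a calculation
            + 1 * principles_stated        # stating a fundamental principle before use
            + 1 * calculations             # each calculation
            + 1 * clever_calculations)     # bonus for an especially clever calculation

# Example: 2 models, 4 diagrams (3 annotated, 2 used in calculations),
# 2 principles stated, 5 calculations (1 of them clever) -> 29 points,
# right around the "decent" 25-to-30 range mentioned below.
print(score_goalless_solution(2, 4, 3, 2, 2, 5, 1))  # 29
```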
A decent score on a single problem is usually around 25 to 30 points, and an exceptional solution usually earns at least 50 or 60. So a sum over the two problems of 50 to 60 would be good, and a sum over 100 would be excellent. One of my non-honors classes averaged a sum (over two problems) of 55 points, and the other averaged 40 points (an average of 47 points across both classes). My honors classes averaged a sum of 57 points. (More data: the high/low sums for regular physics were 96/19; for honors, 128/15.)
While I didn’t have a precise way to translate that score into a concrete part of their semester grade, I did make a cut-off: unless a student had a sum of at least 40 points, their goal-less problems did not improve their grade (outside of being able to gain a B objective from them).
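There wasn’t a formula for the 90-to-100 range, so the only piece of this that translates cleanly into code is the 40-point gate itself. A purely illustrative check (the name and structure are mine):

```python
def goalless_work_counts(goalless_point_sum, cutoff=40):
    """Below the cutoff, goal-less work doesn't raise the grade (though it can
    still demonstrate a B objective); above it, the 90-to-100 call was a
    judgment about the quality of the work, not a calculation."""
    return goalless_point_sum >= cutoff
```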
Where do we go from here?
First, I should say that many students (almost all in Honors) wrote the best exams I’ve ever seen in my 4 years of teaching physics. I think a good portion of the credit goes to the grading system. My students have been calm and interested all semester. Their course evaluations (a good subject for a future post) indicate that they almost universally LOVE this system and recognize its benefits (being allowed to make mistakes, learning at their own pace, working on a skill until it is mastered). They also indicate that they were not completely happy with taking their progress on objectives and turning it into a number grade. I’m not completely happy with that either, and I think we need to keep playing with how to do it.
One amazing moment was handing back the exams at the end of class early last week. In one of my sections the students were just stunned: “Woah… there’s a number on this. What does that mean?”
Thanks for this. I’ve been struggling with the concept of a summative assessment as I’ve switched to SBG this semester for one of my classes. For now I’ve decided to do away with summative tests and just do the standards (re)assessment, giving the students as many opportunities as they’d like. I am giving a final, however, worth the last 5% of the class, in which they have to write a sentence or two about every standard.
What you’re doing seems very reasonable, especially how you take into account the student’s track record when deciding on the grade. My classes are certainly small enough that I could do that very thoroughly.
I’d also like to incorporate some way for the kids to have more input on their final grade. I’m just not sure how to fit that in, yet.
Welcome to the blogosphere! I’m still new to the whole thing as well, albeit as an Aussie chemistry teacher. I look forward to reading your blog!
http://chemistrychris.wordpress.com
Thanks, Chris. I’ll add your blog to my reader. 🙂
Thanks for sharing your system. I’m working on modifying my objectives for next year to include two or three levels.
We tried to do three levels, but we found that it was too difficult to reassess the third level since ours mostly had to do with knowing when to use which model. You can’t isolate those skills on student-initiated reassessments! So that has to come from the exam (we figured in-depth responses to goal-less problems basically demonstrated the skills we meant by our C category). The problem is going to be finding a way to come up with a quarter grade above 90 since we don’t have an exam at the quarter.
Welcome to blogging. +1 subscriber now. And goal-less problems? Fantastic.
Thanks, Jason. Like I said, they were inspired by Matt Greenwolfe. I’ll make a future post about them and include some sample student work so you can see how the kids approach them in practice.
I’ve used something similar (though not as fun as I’d imagine they are in physics) called “Creative Exercises.” I’d write a formula or statement (like “8.2 mg of NaCl”), and the students would write down as many things about it as they knew or could calculate.
Hi Kelly. I’ve been thinking a lot about these goal-less problems. Elsewhere (http://educating-grace.blogspot.com/2011/02/like-box-of-chocolates.html) I wondered if these were just exercises in identifying problem archetypes. I was hoping you could respond to that since you have had actual experience with them. If you respond in the comments, please let me know via twitter (@jybuell) but if it’s a blogpost it’ll show up in my reader. Thanks!
Hey, Jason. By problem archetypes, do you mean that it is about the kids deciding whether a problem is a “momentum problem” or a “kinematics problem” etc?
Anyway, I am starting to write a post specifically about goal-less problems. I need to get back to my office and scan some of the exams so that you can see what students actually do with them. (I’d like to scan some student work from the packets, too, but the kids are gone for ~3 weeks now, so that will have to wait.) I’ll give you some quick thoughts about them here, then expand on them later this week in a whole post. Oops… warning: I’ve been advised to work on being more “pithy”, but haven’t gotten around to practicing a whole lot yet.
1) Goal-less problems actually aren’t problems, but just situations. There isn’t anything specific that I’m expecting them to find; instead, they are supposed to model the situation and then calculate anything that is interesting. In my honors classes, the kids almost always calculate things that I didn’t even think about in my own (admittedly quick rather than thoughtful) solutions. The more important aspect, though, is the modeling. And for the modeling, the most important part is drawing diagrams. In honors, they currently have about 11 that they know how to draw so far (x-t graph, v-t graph, a-t graph, motion map, free body diagram, system schema, IF chart [for cons. of momentum], F-t graph, F-x graph, LOL chart [cons. of energy], energy pie charts). It might even be possible (or really helpful) to draw more than one of a type of diagram (for example: drawing a Vx-t and a Vy-t graph for a projectile). The next step is to use the graphs or diagrams to write relationships. For graphs, they can write relationships for the slope or the area (if either has a physical meaning). For diagrams (like the FBD, IF, or LOL), they can usually write equations that have to do with a quantity being balanced or unbalanced. Finally, they can see if they have enough information to use those relationships to get quantitative answers.
2) One thing I really like about the goal-less problems is that they encourage the kids to take risks with their solutions. When there is a known goal, students usually seem to only want to write things down that are direct steps along the path toward that goal. Many seem unwilling to write anything down at all until they see every step they will have to take in order to get to the goal. With the goal-less problems, everything they do is the right thing to do (as long as they are applying the models appropriately). No graph or diagram is a “waste of time” in their quest to finish the worksheet. They discover a lot of things that they wouldn’t have if I had asked only one question. And they themselves get to decide what is interesting to ask about the situation. Remarkably, even problems that seem leading to me (usually I’ve simply removed the final sentence from a book problem, so I know exactly where the author was going with it) don’t seem to have a simple end to the kids. They use more than one fundamental principle. They draw energy or velocity graphs for a problem that, were the question intact, would not suggest such practices at all.
3) They allow for instant, no-effort differentiation. In my intro classes, where the math levels are seriously varied (junior boy in Algebra 2 alongside senior girl in Calc BC…??), different groups can reach different depths with the same problem without one group feeling behind or slow (as they would if there were parts a through g and they were still on part b when the other people were finished). Some are still working on objectives from earlier in the year and slow down when trying to apply those to a new problem. Others have long since mastered those early skills and are playing with something else on the new problem.
4) “Old” skills are constantly being used in units where I wouldn’t have touched them before. Since the kids worked so hard to master those v-t graphs, of COURSE they are going to draw them when working on an energy problem. They are also going to see pretty quickly that it is a lot faster and easier to use the LOL to calculate the final velocity than to use the v-t and a-t graphs for certain problems.
5) The cool thing in Honors Physics right now is that we are doing a conservation of momentum unit, part 2 (the first time was before we invented energy), and I start them out with some goal-less problems. I suggest that they draw, for example, LOLLL’s, or otherwise consider more than 2 states in a situation. They are figuring out when to pick states so that it is easy to use cons. of momentum and when to pick states so that it is easy to use cons. of energy, and they are figuring it out by simply playing. I just put a situation in front of them (for example: a ballistic pendulum) and say “go.” So cool!
Hmm.. I think I should stop the stream-of-consciousness rambling here. I’m interested in any questions you have, though. And I promise the eventual blog post will be more structured.
Hi, I am new to your site, but I can’t stop reading. You mentioned that you would post example goal-less problems. Did you do this, and where could I find it? Also, I have my kids do motion diagrams and x-vs-t, v-vs-t, and a-vs-t graphs, but I was unfamiliar with LOL charts, IF charts, and system schemas. Is there somewhere you blog about these more?
Thanks
Hi Ryan,
Sorry for the slow response (still digging out from the start-of-school emails). I think you’ve found the goal-less problem examples, but here are the links to the other diagrams (if you haven’t found them already). Posts of that sort are generally collected under the Model Building tab on this page. 🙂
https://kellyoshea.wordpress.com/2011/10/15/building-the-balanced-force-particle-model/ (System Schemas, among other things)
https://kellyoshea.wordpress.com/2012/03/05/energy-bar-charts-lol-diagrams/ (LOL diagrams)
https://kellyoshea.wordpress.com/2011/12/17/momentum-bar-charts-if-charts-iff-charts/ (IF charts)
Hope that helps! 🙂
Kelly
Kelly,
No need to apologize. I completely understand the grind of the first month and getting your kids/parents into a routine. Thank you for responding to my comments! Keep Calm and Physics On! 🙂
Thanks so much for sharing this. I have been wanting to take a shot at this. Would you mind sharing your list of mastery skills or objectives? It seems like so much work has to go into decomposing the discipline (or standards) into isolated skills that are “easily” assessable.
I just made a new post that is a long list of my objectives so far this year (3/4 of the way through). Here you go!
OK. Question about the assessment system. Are the goal-less problems on the final exam meaningless for students with less than 90? Or did I miss something?
Students can demonstrate mastery on an objective by doing it on a goal-less problem. For the Honors kids, not all of the B objectives were covered in the first half, so that was potentially very relevant. Really excellent work on the goal-less problems was enough to bring someone up to a 90 (or even 91, 92) even if they were missing one or two B objectives.