Sunday, April 17, 2011

Migration to WordPress

I've decided to move on over to WordPress for my blogging. The main reason is easy equation editing, though there are some other functionality issues as well. I've ported all the material from this blog over there. The new site is:

http://arundquist.wordpress.com

and the new title is "SuperFly Physics". Hope to see you there. -Andy

Thursday, March 31, 2011

Shared labs

Last year I tackled a common problem in our Modern Physics lab: too many students and not enough copies of the equipment for them all to do the famous labs of the early 20th century. My solution was to have everyone get a little time with the labs and equipment but not let anyone do a whole lab outright.

Here's how it works. In the first week they break into four groups, one for each experiment (Franck-Hertz, Millikan oil drop, e/m measurement, and the angular momentum study (the TeachSpin one with the magnet in the cue ball)). In that week they are tasked with writing only the theory and setup/procedure sections of the eventual write-up. That's it. They are given manuals for all the equipment and, of course, they have their textbooks from class to help with the theory section.

In the second week they come in and are expected to do only data collection. On a different apparatus. For a different experiment. The only thing they are allowed to use is the previous group's write-up from the first week.

In the third week they analyze data. For a different experiment. From a different lab group.

Then the Frankenstein lab reports are mashed together and turned in. It worked pretty well last year and I'm between weeks 1 and 2 this year.

At first I told them that the last group would get the full grade for that lab report. That didn't go over very well. We ended up deciding that I would grade each group on its own contributions, trying to be careful not to penalize a group for another group's poor work.

I spread it over several weeks because some of this equipment is finicky and the analysis can be tricky. Typically I don't like students to have to do too much outside of our three hours per week together, so three weeks is about the right amount of time if you consider that all the analysis and all the writing is done in class. What was cool, though, about the three-week spread was how the groups pushed each other. If a group knew that it was going to have to take Millikan data, they would pressure the setup group to make sure to give them all the details they would need. Last year Millikan gave us some fits and it was cool how the data collection group worked with the setup group to figure out the glitches.

I see a lot of value to the way this is set up. In three weeks they get exposed to different aspects of three different labs. They understand the value of a carefully written setup section and the value of carefully organized data. I do it because I don't have enough equipment to do it other ways but I have to say I like this solution.

Wednesday, March 23, 2011

Global Physics Department

Announcement: Next week John Burk (@occam98) will show us how he uses Tracker in his teaching. Wednesday 3/30/2011

Tonight was the second installment of the physics educators' Elluminate chat. This time we had 15 different people log in and join us. Here's the recording of the bulk of the session.

Tonight we tried having a mini-presentation at the beginning (me talking about "momentum is king") and then we chatted for about 20 minutes about issues around teaching momentum, force, and energy. The one-sentence version of my take is that momentum is king, force keeps track of momentum swaps, and energy is a handy way of figuring out how much momentum an interaction can swap in total. We then group-edited a Google Doc with our names and interests to try to facilitate the community building that we're clearly all craving (why else would we be online late on a Wednesday night when, say, some of us are on spring break enjoying a fun stay at a hotel with 5 pools, ping pong, a basketball court, and where kids eat free!). Then we broke into a spontaneous discussion of Standards Based Grading (since, as one participant emailed me later, there were some serious heavy hitters in that department present).
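In rough equation form (just my shorthand for that one-sentence take, not anything we derived in the session):

\[ \vec{F} = \frac{d\vec{p}}{dt}, \qquad E_k = \frac{p^2}{2m} \;\Rightarrow\; p = \sqrt{2 m E_k} \]

Force tracks the rate of the momentum swap, and the kinetic energy on hand sets how much momentum there is to swap.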

Please think about joining in next week. We'll stick with the same time (9:30pm EDT/8:30pm CDT) on Wednesday. We don't have a theme for next week yet but I hope one will grow organically in the next few days (along with a possible presenter if that fits).

Here are the useful links again:

What's in a name?
I've coined this phrase "Global Physics Department Meeting" and I'm not sure if it has legs. I guess I'm trying to capture the spirit of the meetings: people enjoying sharing with each other.

Saturday, March 19, 2011

If it spins, it doesn't flop

Me: "IF IT SPINS . . ." Audience: "IT DOESN'T FLOP!" This is a shot from my Piper Physics Patrol showing the concept of angular momentum. I hold the wheel right over the student's head and let go with my left hand (the one without the rope). It doesn't flop and so doesn't hit his head. Fun times.

Wednesday, March 16, 2011

Wednesday physics chats

Tonight my twitter physics buds and I tried an Elluminate chat. Here's the recording of some of it. It was a lot of fun as we talked about standards in physics courses, a database of (re)assessments, Momentum as King, and more.

We thought we'd try to make it a weekly thing so here are the details:
Wednesdays
8:30PM central time
http://tinyurl.com/RundquistOfficeHours (that's the link you use to get into the Elluminate session)

We thought we'd use Elluminate instead of a twitter chat because you gain a lot more functionality. Tonight we had eight participants but the system can handle up to 100 (I think) so please think about joining us.

We talked about leveraging some of the functionality and asking people to occasionally give a mini-presentation to set some context. Next week we'll start that off with a presentation from me on my thoughts on Momentum as King. This will involve talking about both force and energy as accounting tricks, which should be lots of fun.

As just a teaser for how cool these chats can be, at one point tonight I made an offhand remark about how we could all share the first three things we teach in an intro physics course. Wow, with just eight people I think we got ten opinions! It led to some great conversation.

Thanks to all who joined me tonight. Please comment below about your impressions of tonight when you get a chance.

Friday, March 11, 2011

More on collaborative oral assessments

Today was the last of four consecutive class periods dedicated to oral assessments in my Theoretical Mechanics course. I've written about these assessments before. After that last post some of my tweeps said they'd love to see some of the assessments in action. I asked the students if it would be ok if I videotaped (that's such an anachronistic phrase, what's the modern version?) the class and they gave me the thumbs up. Below are a couple videos from today.

In both I start the clip after the student has finished working the problem. This is when the whole class has a conversation about how to assess the work.



In the one above the student was asked to show the steps necessary to determine the motion of a double pendulum when the first rod is replaced by a spring. He got a little hung up on the potential energy but eventually got it all down. The students and I decided it was a 3 (meets expectations) but it was interesting to hear the debate about whether it was a 4 (exceeds expectations). That's when I introduced the idea that I brag to my colleagues about any 4's that ever happen. I said pretty bluntly that I wouldn't brag about that performance.



In this one the student had to solve the following problem:



Here it's neat to hear the students making the case that this is a 4. And, true to my word, I bragged about this one to my colleagues later in the day.

I really like how open the class is about these assessments. They're honest with each other and they're very thoughtful about their comments. This class has sure been a lot of fun so far.

Tuesday, March 8, 2011

Draft grading

First, sorry if you came here under false pretenses. This is about doing multiple drafts of a grading run, not grading drafts of student papers. Stick around, though! Shoot, lost 'em. Oh well, here are my thoughts about this anyway.

I like to grade on the bus since that's time when I can't do much else. I don't have internet on the bus and I don't own a smartphone. When I started grading papers with my voice I thought I'd have to give up the bus part, but I found the bus time could still be useful if I printed out the paper to read on the bus and then screencast my comments later.

Today on the bus I realized that I really like this new setup. Have you ever marked up a paper, putting tons of effort into a comment on a particular page, and then later realized that they fixed the problem on a later page? I used to hate that because I'd have to go back to the earlier comment and cross it out or add "nevermind" or something. With this new system of mine they never see the paper copy so I can mark it up any way I like. I don't have to worry about mistakes I make because the student will never see them.

All I have to do is make sure there's enough there for me to know what I was thinking when I get around to screencasting my comments to the student. This is usually the next day but sometimes a few days go by before I remember to do it. When I see a crossed out comment or "nevermind" I know to just skip that, or, possibly, comment to the student that the order of the paper was a little different than I thought it should have been.

The other thing I like about my bus system is that I go through the whole stack of papers before doing my final "draft" of grading with the screencasts. This helps me normalize my grading.

It's funny how I encourage my students all the time to use a drafting process on things and now I'm seeing a similar benefit in my own grading. I'd love to hear how others have benefited. Use the comments or let me know on Twitter @arundquist

Saturday, March 5, 2011

Collaborative oral assessments

Yesterday was a great day in my Standards-Based Theoretical Mechanics course. It was the first of four consecutive days of scheduled oral assessments, a time to pause the flow of new material and let the students have a chance to catch up on the standards that we've covered so far.

Every fifteen minutes I randomly selected a student and a standard. That student went up to the board and I gave them a situation off the top of my head to deal with around that standard. Four students went, which filled the hour, and lots of really cool things happened.

First off, no one got a 4 (exceeds expectations) and no one clamored for one. In fact, one of the coolest things that happened was the class-wide discussion about what score to give each student. They all know I get the final say but the sbg-inspired way that no grade is truly final loosens things up a bit. One student was stuck on something and wanted to approach a problem from a different direction. I said "sure" and let him have at it. When he was done I asked the class what other standard he had just done. It took a while but they realized it was a standard from a completely different chapter than the one he was at the board to do. He said "oh, if I had known it was that one I would have done it much better!" I asked the class if he should get an assessment for both standards and it was great to hear them debate the value of that. Eventually we decided that, yes, he should get two scores. We focused for a little while on the notion that he would have done better if he had named the standard and one student said "well, we need to know these things and know the connections among them." Awesome.

At the end I asked some questions about how to improve the process since we have three more days of it next week. One thing we discussed was that I talked too much. When students got stuck I would give them some hints, or, more often, ask them a question to get them thinking about it from a different direction. What they wanted was the ability to offer those hints themselves instead of me. But one student pointed out that it had happened a couple of times already and sometimes the students in the audience are too quick to offer help. It was decided that I would be the gatekeeper, deciding when help is needed, but that they would provide the help. I'm really excited to see that in action on Monday.

At the end I also asked the five students who weren't picked whether this was a good learning experience. They commented that it was neat to see how other students make similar mistakes. It was also good for them to see slightly different approaches to things. And of course they liked to see what follow up questions I and others had.

Overall I was really pleased with how it went. I'm glad we're taking this extended break from new material to give them a chance to really show me what they've learned. They've also really picked up the pace in asking to check out my LiveScribe pens so I'm looking forward to lots of pencast assessments this week as well.

Friday, March 4, 2011

Online pseudoteaching

Another in a series of pseudoteaching blog posts organized by John Burk and Frank Noschese. They coined the term with the following definition:
Pseudoteaching is something you realize you’re doing after you’ve attempted a lesson which from the outset looks like it should result in student learning, but upon further reflection, you realize that the very lesson itself was flawed and involved minimal learning.
They further refined it by pointing out that not only did you think the lesson was "good" but so did the students at the time, and perhaps even an outside observer if there had been one.

I'd like to talk about some pseudoteaching of my own, specifically in the fully-online courses I've taught in a program for an alternative physics teaching license.  This program takes licensed science teachers and gives them a path to an additional physics license in the state of Minnesota.  Each cohort of roughly 20 teachers takes an online course with me in the fall (thermodynamics) and another in the spring (modern physics).

I wanted to build opportunities for community building and group learning into the courses. I had found (though I wasn't really surprised) that forced discussion board posts weren't really doing that, so I came up with a grand plan. The students were stressed out about the homework and wanted further help. I proposed that we try to mimic a technique we had used in our in-person classes (held in the summer) where we worked to produce what I called "road maps" for a problem. These are not solutions but rather statements about what physics is necessary to get to a solution. Here's an example for a falling object problem:

  1. Potential energy is converted to kinetic energy
  2. Height is needed for potential energy
  3. Speed can be found from kinetic energy
Of course the problems were typically much harder but hopefully you get the gist of how minimalist the road maps are while still being quite useful to students. We had found in the summer that, since they were all teachers, it was great practice to develop these road maps to think about how to guide students without giving everything away.
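For that little falling-object road map, the physics it's pointing at is just

\[ mgh = \tfrac{1}{2} m v^2 \;\Rightarrow\; v = \sqrt{2 g h} \]

but the road map deliberately stops short of writing that out.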

Ok, this sounded great and we talked about how to pull it off online. I used Blackboard back then and so I randomly put the students into six groups because every week there were six homework problems to be done. The homework was due on Sunday so I required the groups to work collaboratively on their assigned problem and to post a road map for it by Wednesday night. Everyone would still turn in all six problems on Sunday but now they'd have a road map for each.

The first time I did it I worked very hard to keep up with what each group was doing.  The issue was that they had to find the solution to the problem first and then figure out the road map, all by Wednesday. This often proved too difficult as they would start to just post more of a bare-bones solution.

To combat that I had the next great brainstorm: provide full screencast solutions to each group! This way they had the solution right away and could use their teacher talents to really come up with a road map rather than expending energy on the solution first. Cool, right? That's what I thought, at least, and I think the students were jazzed about it too, not least because now they really only had to do 5 problems per week.

Why is this my example of pseudoteaching? Because the students didn't do well on the homework. Of course they'd nail the one problem but in our interactions in my online office hours (held on Thursday nights) and in grading their homework I'd see that they weren't synthesizing the material. In fact, often they'd do the steps of the road map, but not see the overall picture of the chapter or problem. Even worse was the lack of retention as we'd move through the class.

It became too difficult to ensure everyone was contributing ("I agree" became a common post) and I was frustrated with the lack of learning. In hindsight, it seems that they were simply trying to do what they could to help each other complete the homework set.  What it evolved to was a near step-by-step method for doing the problems (use equation X, then divide by the rest mass, then plug in to equation Y, . . .). I would comment to a group with phrases like "that's too much detail" but I found it difficult to get the students to engage with the material as a whole and the learning/synthesizing that I was looking for wasn't happening to the degree I wanted.

So what now? These days my online course is run in a much more individual way. I regret the loss of community and I'm still looking for ways to get that back in. But as far as learning is concerned, my most recent class was a big leap forward compared to the past. Now I give every student full access to all my screencast solutions on Sunday. On Thursday I still have online office hours to talk about any issues they're having with either the homework problems or the chapter concepts. Then on Friday morning I post a new problem that is similar to one or more of the six screencast problems. The students then have until Sunday to provide a screencast of their own of the solution.

What I like about this new method is that they get the full benefit of seeing how typical problems are done but they still need to synthesize the material as best they can before our Thursday sessions. And of course, I love hearing their voices.

Sunday, February 20, 2011

Flipped SBG with voice so far

I've been teaching with Standards Based Grading (sbg or #sbar on twitter) in my advanced-level Theoretical Mechanics course (sometimes called Classical Mechanics) for a few weeks now and I wanted to post some of my impressions early on in this pedagogy experiment.

A quick SBG primer first: I determine a set of standards that I want students to know.  For this class you can see them the same way my students do.  Each standard is assessed multiple times for each student, with the last assessment acting as the current score for the grade book.  Students can initiate reassessment at (nearly) any time and I can also initiate (re)assessments.  That's the nuts and bolts.  The philosophy is much broader and I'm greatly indebted to a passionate online community for helping me wrap my brain around it.  A recent collection of blog posts called the SBG Gala #5 is a great place to get a flavor of that great community.
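If you like thinking in code, here's a minimal Python sketch of that "last assessment counts" bookkeeping (the student, standard, and scores below are made up for illustration; the scores use my 1-4 rubric):

from collections import defaultdict

# Minimal sketch of the grade book logic: every assessment is kept,
# but only the most recent one counts as the current score.
assessments = defaultdict(list)  # (student, standard) -> scores in time order

def record(student, standard, score):
    assessments[(student, standard)].append(score)

def current_score(student, standard):
    history = assessments[(student, standard)]
    return history[-1] if history else None  # the latest assessment wins

record("Alice", "Set up Lagrange's equations", 2)
record("Alice", "Set up Lagrange's equations", 3)  # a reassessment replaces the old score
print(current_score("Alice", "Set up Lagrange's equations"))  # prints 3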

My implementation: I put my content online ahead of time (sometimes called a flipped classroom or teaching naked) and use class time for problem-solving practice and occasionally assessment.  I've been teaching "flipped" like that for a while and you can check out a lot of my older posts to see some of how I do that.  For assessment I've decided to focus on students' voices and so I've required all assessments to include some version of their voice. That means students are doing pencasts (using LiveScribe smartpens), screencasts (typically using Jing on their own computers), and in-person office visits.  I'd like to also see online office visits and oral "exams" in class. I've changed my grade book so that students can see their progress with a Google Annotated Timeline Flash plot of every assessment of every standard on one graph (note: it seems not to work in Safari for some reason).

How it's going: So far I've had one student (of only 9 total - the reason I was willing to do this experiment on such short notice) who has really embraced the system. He's turned in at least one assessment for almost every standard (that is available - bold on the standards list) and many standards have seen multiple reassessments.  With one exception his reassessments have done as well or better on the rubric than the last and he has told me privately that he really likes the system.  A few middle-of-the-roaders have turned in a few assessments and are promising more and a few not-quite-sure-what-to-make-of-it-ers haven't really turned in anything yet.  I required them all to turn in one screencast early on to make sure that they can get Jing and screencast.com to work so the technical details have been mostly solved.  With the flipped classroom there is time for discussing how it's going with them and it seems that they're still giving it the benefit of the doubt. When I asked if anyone had any successful strategies for approaching the class a couple talked about when they watch my screencasts and when they do theirs.  When I asked about not-so-successful strategies mostly I just heard the usual "I haven't gotten around to it yet" type comments.

Changes we had to make already: On my syllabus I had said that we'd also do group quizzes where the answer to a question had to be on the board after 10 or 15 minutes and then I'd ask follow-up questions to random students.  Then we'd all work together to determine the rubric assessment for the relevant standards for each individual.  I've always loved those sorts of group quizzes in the past and was happy to include them in this system. We did one on the second day and it was fun to see the students interact with each other in working to solve a problem. At the end of the day, though, they didn't feel that they should each be assessed individually because they didn't get to show their own understanding on every step; rather, they were just asked a few questions about a random step. So we've gone away from those for the moment.

I've also had to change my approach to my screencasts. I'm trying much harder now to make sure that everything I do is important for them to learn and will help them do well on standards assessments. A recent example is some tangential info that I would have covered in the past, only to say come test time that it wouldn't be on the test. I got away with it in the past because I typically wouldn't say specifically what was on a particular test until close to test day. Now the standards ARE the test and the students know all about them all the time. That doesn't mean I skip all tangential material, but I think about whether it's a fair follow-up question for the in-person assessments.  If it is, I leave it in.

What I like: I love hearing their voices!  I started having students turn in screencasts/pencasts last semester in general physics and I love how you can judge their confidence and understanding when you hear them talk, especially when they're simultaneously pointing out the relevant equation or part of a figure. A lot of my standards require them to do a calculation in Mathematica and it's quite telling to watch them have to type their code in character by character while discussing the meaning/usefulness/physics of a particular portion. Having to watch them type code from scratch is a little tedious, but I've told them that eventually I'll approve their Mathematica skills and they can do screencasts of finished code, just walking me through it.  I'm starting out forcing them to begin with a blank canvas, so to speak, because I need to make sure they're not just cutting and pasting code from someone else.

For non-Mathematica standards I like having them write it on a sheet of paper, scan it, and then do a screencast walking me through it.  I also let them check out one of my LiveScribe smartpens and tablets to do it.  To see and hear a student walk through the derivation of the Euler formula for the calculus of variations is just fabulous, especially compared to just looking at the written work, since often that's just copying from the book.

I'm really happy, too, about using screencasting to show them how to do Mathematica calculations. In the past I would just give them the .nb files and let them build on them but now all they have is my video so they have to type it themselves. That is a huge deal!

What I don't like (so far?): I don't like giving the students so much rope to hang themselves with. By requiring their voice I can't do assessments in class without canceling the plan for that day. We've set aside days in a couple of weeks to do just that, but by then they'll be pretty far behind. We'll see.

I also don't like not being able to grade on the bus. I ride my bike in to work and then take it on the bus home. In the past the bus was where I got almost all my grading done. Now I need a computer with internet to do it (although apparently you can watch screencast.com material on an Android phone so who knows.)

What I'm hopeful for: I really hope the students learn the material better this way. I've never been overly proud of my students' retention of material and I'm hoping that'll get much better this semester. I also hope they can realize that the practice we do in class and the reassessments are all about learning and that as long as they eventually "get it" the grade will come.

What I'd love: I'd love to get some more great feedback from you. This is a grand experiment of mine and, while it feels like it's going well now, it's really too early to tell if it's worth the effort.

Wednesday, February 2, 2011

Oral exams, writing rubrics, and SBG

This semester I've switched my class around to Standards Based Grading.  I was strongly influenced by my growing PLN on Twitter, which is populated by a lot of people who use it and sing its praises.  I decided, with only four days to go before the semester, to make the switch on my syllabus, and today was my first class.  I can tell it's going to be quite a ride but I'm still pretty excited about it.

What I want to talk about in this post are the connections I see between standards-based grading (hashtag #sbar, for standards-based assessment and reporting, on Twitter) and some of the assessment techniques I've used in the past, specifically writing rubrics and oral exams.

Rubrics
Last semester I taught a writing-intensive first-year seminar.  The first-year seminar is a cornerstone of the undergraduate curriculum at Hamline University, but they're not all writing intensive.  The topic of the seminar was "Hamline Mythbusters" but that's a whole different post.  Passing the class did not automatically get a student out of our first-year writing class.  Essentially I was able to post a grade and check off that requirement separately for each student.  For each writing assignment they had, I used an in-house writing rubric that I found very useful and flexible.  I told the students that to get the "Expository Writing" requirement checked off they had to turn in one of two major writing assignments and receive a "meets" or "exceeds expectations" on the rubric in all seven categories.  I told them they could turn in as many drafts as they'd like.  One student turned in 13 drafts of one paper until she finally had "exceeds" in every category.

I make the connection with SBG because the students knew they could always be reassessed.  Essentially I had seven writing standards I was using that were mostly separate from the rest of the class.  One thing that surprised me was the relatively low number of complaints I got about their scores on the rubric.  I have been assuming that this was because 1) it was a very well-thought-out rubric with good descriptors about the levels, and 2) they knew they could always work on it some more and turn it back in.  I think it also helped that I gave them a ton of feedback on every draft.

Oral exams
In my upper-division courses (read: small class size) I use oral exams all the time.  What I really like about them is that I can pick the problems that I think are crucial to the class and have the students spend roughly a week honing them.  They come in for the exam (all together, because there are points for audience questions) and randomly pick one of the assigned problems.  They work it on the board and lose points any time I have to unstick them.  They sometimes heave a sigh of relief when they find out which one they have to do, but what I like is that they spend a week studying all of them.  SBG feels like the oral exam cycle all semester long!  They continually work on the standards, having them reassessed when they're ready, and I know that they're working on exactly the things that I think are the most important.

I want to say a special thanks to Frank Noschese for helping me out over the last week.  He's a high school physics teacher who is passionate about teaching and learning and routinely gives me things to think about for my own work.

Sunday, January 23, 2011

Fleece Sheets

I had the idea for this post as I went to bed last night.  My wife and I sleep in the basement where it's really cold (especially these days in MN).  We've tried multiple comforters, etc., to try to stay warm while we sleep, but the best investment we've made is fleece sheets.  They're awesome!  With normal sheets I get into bed and freeze because the bed is so cold.  With fleece sheets, they feel warm right away!  What got me thinking was: how do they do that?

What I'd really like to talk about is how we present thermodynamics concepts to students.  If you look at a typical intro physics text, the thermo section will start with a definition of temperature and go from there.  You'll likely find phrases like "what do cold and hot mean" and "what does it mean to be warmer than something else."  My sheets helped me crystallize something about why this has bothered me when teaching it.  Our sense of hot and cold is much more a measure of heat flow than of "average energy per mode".  In other words, my fleece sheets are much worse heat conductors than normal sheets, so I don't feel that heat flowing out of my body, a sensation we call "cold".

Here's another common example that texts often get around to but not until later.  In the morning your bathroom tiles are cold but the carpet isn't.  That's certainly the way my kids would say it, at least.  My snarky response is to say "no, they're both the same temperature."  That doesn't seem to help my kids very much.  What's really happening is that tiles conduct heat much better and you get that "cold" sensation as heat leaves your body much faster than on carpet.
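To put a formula on the "conducts heat much better" part: the rate of heat flow through a layer of thermal conductivity k, area A, and thickness d, across a temperature difference ΔT, is

\[ \frac{Q}{\Delta t} = \frac{k A \, \Delta T}{d}, \]

so a high-conductivity tile floor pulls heat out of your foot much faster than carpet at the exact same temperature.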

There are a couple of interesting points here.  If you focus on conductivity as being central to how humans perceive temperature, you can then start to ask questions about why heat flows.  You can also address issues of equilibrium (why I'm so confident that the carpet and the tiles are the same temp, for example).  Heat flows **not** to conserve energy but rather to maximize entropy.  Entropy is a measure of statistics: nature moves toward distributions that are more common, and a cold thing gaining energy increases its entropy way more than a hot thing losing that energy decreases its own.  This then leads to heat engines and on and on.
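In symbols, when a bit of heat dQ moves from a hot object at temperature T_h to a cold one at T_c, the total entropy change is

\[ dS_\text{total} = \frac{dQ}{T_c} - \frac{dQ}{T_h} > 0 \quad \text{whenever } T_h > T_c, \]

which is why the flow goes that way: it's the statistically overwhelming direction.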

So how do you teach temperature?  When does conductivity come in?  What would happen if you did it first? Does anyone do that?

Friday, January 14, 2011

Fake data labs

One thing I've noticed over the years of teaching upper-level physics labs is that the students see the experiment "working" as the finish line.  They seem to plan their semesters with that as a goal, determining when to order equipment, when to test it, when to set it up etc.  And often it works out, the experiment "works" right at the end of the semester and they're happy.  Of course, I'm usually not because there's not enough time for them to do a decent amount of data analysis and write-up.  Don't get me wrong, I think the experience they get by having to plan and execute their own experiments is good, I just needed to find a way for them to experience the other stuff too.

My solution, which I've used for the last several years, is to have them do a fake experiment.  This is in our Modern Physics lab, so they're typically sophomores.  I have them choose from a list of seminal modern physics experiments, things like the photoelectric effect, Compton scattering, Michelson-Morley, etc.  I then ask them to plan a 21st-century version of the experiment and to assume that money is no object, since they're not going to do the experiment in real life anyway.  I have them plan a fully computer-controlled experiment (we teach LabVIEW in this course) and ultimately they have to turn in a functioning LabVIEW program, a formal lab report, and a Mathematica document showing their data analysis.

Now this last part is where I come in.  About halfway through the semester they have to turn in to me what their raw data would look like.  Typically this is a description of a data file or files explaining what all the columns and rows would be, as saved by their LabVIEW program.  They have to figure out things like the motor step size they'll use, what wavelengths of light are necessary, etc. I then spend spring break creating fake data for them.  I do this in Mathematica with a random number generator providing appropriate levels of noise.  I also choose my own values of things like Planck's constant and the speed of light so that I can know whether they've done their data analysis correctly.  I especially like choosing a direction for and relative speed of the luminiferous ether.
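I do the actual generation in Mathematica, but the idea is easy to sketch in a few lines of any language. Here's a minimal Python version for a fake photoelectric-effect run (the file layout, sweep range, noise level, and my "wrong" value of Planck's constant are all made up for illustration):

import csv
import random

# I pick my own "constants" (deliberately not the accepted values) so I can
# check the students' analysis against a known answer.
h_fake = 6.8e-34       # my chosen "Planck's constant", J*s
phi = 3.2e-19          # work function of the fake photocathode, J
e_charge = 1.602e-19   # electron charge, C
noise_sigma = 0.02     # Gaussian noise on the stopping voltage, V

with open("photoelectric_fake.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frequency_Hz", "stopping_voltage_V"])
    for i in range(40):
        nu = 5.5e14 + i * 0.2e14                  # sweep the light frequency
        v_stop = (h_fake * nu - phi) / e_charge   # ideal stopping voltage
        writer.writerow([nu, v_stop + random.gauss(0.0, noise_sigma)])

Their analysis then has to recover the slope (h/e) and intercept from stopping voltage vs. frequency and report the value I baked in, with an uncertainty.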

What's great about this project is that the students always have time to do the data analysis and write-up because they don't spend any time on the actual experiment.  Of course if this were all we did it would be a problem, but they get a lot of experience with real experiments both in and out of this class.  Being somewhat early in the program, students get a sense of the type of planning they'll have to do for future long-term projects and they get a real sense of the difference between raw data and a result.

I always tell the students that their last sentence in the report should be something like "Planck's constant is ___ +/- ____".  I tell them that's their goal, not getting the experiment to work.

Saturday, January 1, 2011

Flipped classroom evaluations

I just read my student evaluations for the fall 2010 classes.  This semester I taught General Physics II in the "flipped classroom" or "naked teaching" philosophy for the first time.  I felt the class went pretty well and the student ratings and comments mostly bear that out.  I thought I'd make them available (unedited) to anyone who's interested in flipping their classroom.  If you want a pdf copy, email me (andy.rundquist@gmail.com), twitter me (@arundquist), or let me know in the comments below.

I made a few changes in the course as the semester went along.  By the end, we'd gotten into our groove pretty good.  We had daily quizzes on the material from the last class (a 4-sided die randomly deciding which problem to choose), 10-15 minutes of me answering questions they'd submitted online about the new material (which included ~20 minutes of screencasts along with reading the book), and then 30 minutes of the students working in small groups with a smartpen recording what we called roadmaps for how to do the problems. Here's a typical daily outline.

There are a lot of comments praising the flipped classroom but there are definitely a few people who strongly favored traditional lectures.  Check out the spread of scores on some of the questions if you get the pdf.