
Goodbye Paperless Classroom, Hello Collaboration

After a three-week experiment, I’ve decided that I’m going to downgrade my classroom from “paperless” to “minimal paper usage.” Every student in our school is given a tablet computer, which is awesome, yet very few teachers take advantage of the tablet capabilities of these computers. I thought I would try this year and had every student bring their tablet to every class. We have been taking notes and doing homework with Microsoft OneNote. The cool part about this is that I have their notebooks shared with me on our school network, so I can check out their notes if I want, AND no one has to rip out paper from their notebook to hand in, only to lose it when I hand it back (or, on the rare occasion, have me lose it). OneNote automatically syncs with their shared notebook on the network, so I can grade their notebooks on my computer without them sending anything to me, and then they can automatically see my marks on their homework notebook the next time they open up their computers and sync – without me sending anything to them. This, I have found, is absolutely awesome. It’s more efficient and more organized, and I think it saves a considerable amount of valuable class time.

But the in-class tablet usage is another story. Here are the downsides:

  • It has taken them something like an average of 6.8 minutes* to start up their computers and get going every class, which is possibly the most annoying thing in the world. My hairline recedes a little bit at the beginning of each class from the stress of the wasted time – I couldn’t have kept that up for very long.
  • Our tablets are not reliable. Every class, someone’s computer won’t start, the pen won’t work, the screen won’t flip to the right orientation, a computer will spontaneously light itself on fire, etc. etc. I’ll get a hand raised every lesson with someone asking me an annoying tech question that I can’t answer, and I have to make the choice of whether to help them out or to keep going with class and let them fend for themselves. As an efficient/impatient person, I choose the latter, sorry kid!
  • Most importantly, there’s a weird invisible barrier between everyone with computers in front of them. I have found that the level of collaboration in the class went way down. I could have predicted this with regular laptops, but I would have thought that the tablet would feel just like a regular notebook. But something about that extra weight over a notebook keeps people from scooting over to see what a neighbor is doing, keeps the computer flat on the table instead of being picked up to show someone else, and keeps students’ eyes down more often than up.

Final decision: I’m keeping the tablets for the homework (where efficiency is paramount and collaboration doesn’t really matter), but ditching them for in-class stuff. Though I’m sad it failed, it helped me really realize what I value in my classroom. Frank’s famous $2 Interactive Whiteboard post captures really well how technology can actually restrict modalities of learning even as it expands our abilities to visualize and manipulate things. And even though I read that a few times this summer, it took the experience of the tablets to make me really understand what he was talking about.

So what did I do? I had the school order whiteboards for the math department. And we used them today to play the Mistake Game, where students present the solution to problems and purposely hide a mistake in their solution. And it was awesome. My favorite math class of the year by far.

I guess I’m willing to sacrifice a tree or two for mathematics’ sake.

[See the mistake(s)?? Kind of a silly one.]

*Educated guess. I’m not ridiculous enough to actually time that and record the data.

Students Identifying Misconceptions (Instead of Me)

At the Klingenstein Summer Institute this past summer, a really transformative experience that I think is going to help me bridge the gap from rookie teacher to low-level intermediate teacher (I set ambitious goals), we talked a lot about identifying MISCONCEPTIONS as one of the roots of good teaching. In the math teacher group, one of the challenges we decided to set for ourselves was to design an activity/lesson where the students would realize on their own that they had a misconception, without me explicitly telling them. Here was my inadvertent attempt…

THE ISSUE: In an extraordinarily weak non-AP Calculus class (for now! and only compared to last year’s group – growth mindset!!), we just spent about a week and a half talking about limits. Nothing fancy, basically just graphical and numerical limits. I had a really hard time getting going this year and do not think I did a good job at all of creating a student-centered, constructivist classroom… and it showed. We had a quiz on numerical and graphical limits, and… wow. I was genuinely surprised at how low the conceptual understanding was of the main idea of limits.

THE MISCONCEPTION: Among many misconceptions, the biggest one seemed to be that the limit as x approaches a certain value is affected by the value of the function there. This mainly manifested itself with tabular limits (where students saw that the function is undefined at a point and concluded that the limit does not exist), and with open and closed endpoints on piecewise functions (closed meaning – in the teenage mind – that the limit does exist, open meaning it does not).
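
To put the misconception in symbols (this is my own minimal made-up example, not one of the quiz problems): the value of the function at the point has no say in the limit there.

\[
f(x) =
\begin{cases}
x + 1, & x \neq 2 \\
7, & x = 2
\end{cases}
\qquad
\lim_{x \to 2^-} f(x) = \lim_{x \to 2^+} f(x) = 3, \quad \text{while } f(2) = 7.
\]

Cover up the x = 2 entry in the table (or the point on the graph) and nothing about the limit changes, which is exactly what the activity below was built around.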

THE ACTIVITY: I set up four stations around the classroom, and each one had 3 or 4 different functions represented either graphically or algebraically. The goal was to determine the left, right and overall limits of each function. For the tabular limits, I put a green piece of paper over the approached x-value of the function in the tables. I asked them to first determine the limits before lifting up the green piece of paper. Then lift it up, check out what the behavior of the function is like at the approached point, and see how their answers change.

The catch was that all the limits were the same. This was patently obvious with the green strip over the function values – I could see the students at each station talking and being like “What? They’re all the same…” Then, they lifted them and were like “What?!? How can that be?”  The other stations were similar, except that I represented the functions graphically instead, covering up the whole x-value on the graph:

After everyone passed through all the stations, we debriefed. We talked about what changed when you lifted up the paper and what didn’t (okay, maybe this is explicitly pointing out the misconception, but only after most students realized it on their own). I asked what the point of the activity was and one student eagerly raised his hand to say, “I think the point was to show that when you lifted up the green strip, EVERYTHING about the limit changed because how the function is defined underneath.” Right, except… wait no, exactly wrong. But from that we ended up having a good discussion, and I really saw the right idea finally click in a few students’ heads. I really wish I had done this in the beginning instead of defining a limit in a more mathy, formal way.

So the reason I decided to post about this? I got this encouraging bit of metacognition as part of a Reassessment Request:

on the table, in our last quiz i assumed that since there was an error at the limit the limit did not exist. we did the activity in class and it helped me learn that even though there might be an error or a weird number, there is still a limit if it is going to the same number from both sides

Success! Misconception realized. Exactly right young mathling. Even with the rough start, maybe this will be a great year after all.

Growth Mindset – Normalizing Mistakes

My first year teaching, I remember one of the older, wiser, more experienced teachers at my school looking at my first week plan and telling me that I should think more deeply about setting routines in the class and creating a good class atmosphere. I kind of brushed this off as sort of silly – I was there to teach Physics, and that’s what I would do. The other stuff would happen automatically. Well, luckily, I wasn’t totally wrong – I think that I inadvertently did a decent job of setting good routines, though I don’t think I did a great job of creating an atmosphere where mistakes were not only encouraged but celebrated. I realized by the end of the year that the hardest part of teaching Physics was not Physics at all, and tried to focus a bit more on all the “other stuff.”

This year, my third year teaching, one of my main goals is to really get my students to buy into the idea of a “Growth Mindset,” especially in my non-AP Calculus class. I started well with an awesome discussion based on Dweck’s original mindset survey (which John Burk over at Quantum Progress turned into a cool data-driven exploration of his students’ mindsets, which he then turned into a collaborative mindset data collection experiment in which you can participate). As my beginning-of-the-year review rolled on, though, I kind of ruined what I had started through my frustration with my non-AP Calc students. For some reason, they are incredibly weak, far weaker than the students I had going into the same class last year. Many don’t know the basic shapes of parent function graphs, don’t know how to correctly simplify rational or radical expressions
(\sqrt{x^2+y^2}=x+y, right??), have never seen a piecewise function, can’t find the domain of a rational function, can’t recognize a basic vertical shift, etc. Sigh. I guess my surprise and confusion that they were at this level was pretty obvious, and both of my classes seemed embarrassed by not knowing things that I thought they “should have” and, yeah, worried that they were “dumb.” I sort of realized that I hadn’t bought into the growth mindset as much as I had thought – They’re weak? No. I was comparing them to the students from last year instead of just assessing their level of math and working from there.
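
(Side note: one quick numerical check shows why that \sqrt{x^2+y^2}=x+y “simplification” from above can’t be right; the numbers here are just ones I picked for illustration.)

\[
\sqrt{3^2 + 4^2} = \sqrt{25} = 5, \qquad \text{but } 3 + 4 = 7.
\]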

The worst side effect of our really rough week of review is that the class started to get really, really quiet. I could only get a few students to respond to questions and take risks. I couldn’t tell when they didn’t understand something, even instructions, because they would just be silent – I have never had that happen in the classroom before. I decided to take some action and remind them (and remind myself) that we are a classroom committed to the Growth Mindset. Using PollEverywhere.com, a wonderful interactive polling website where students can vote and immediately see the results at the front of the classroom, I carved out 10 minutes from mathematics and took them through a series of questions that I designed to help normalize mistakes. We looked at the results of each question before moving on to the next. The results…

Observations: Though some of the questions were certainly leading, the students seemed to really buy into the ideas and remember our growth mindset conversation. The questions were ordered perfectly, because after everyone realized that no one else judges other people for making mistakes, they were forced to think about why they were really having a hard time participating in class. We went through each of the statements for the last question and talked about whether we believed that statement and how the results from the previous two polls might help us participate more. It was a really nice conversation and seemed effective. I saw that look that the students get when their gears are turning and stuff is clicking. Side note: I was a little surprised that students voted for the “Mr. B, you are intimidating” option, but I used that as a springboard to remind them that I buy into the growth mindset idea too. (Also, sra7a means “honestly” in Arabic).

We wrapped up with a PollEverywhere open-ended question, where they type things into the poll and they show up on the screen. I thought this might be a nice, low pressure way to share some thoughts with the class so that we could all be supportive of each other:

How do I know this was a wonderful use of 10 minutes? The first response to the question above was “Thank You” which was surprising and actually pretty touching. And theeeen, it quickly devolved into things like “Bring lasagna to class” and “apple juice breaks.” Really senior-in-high-school? Apple juice? Thanks for ruining a rare sentimental math moment.

Next step: Now that I have them a bit more prepped to be okay with mistakes, I want to find ways to go one step further and celebrate mistakes. I really love the Mistake Game, from Kelly over at Physics! Blog!, to use with Whiteboarding in Physics. Basically, students work in groups and present the solution to a problem that contains a mistake hidden in it. Students are encouraged to find the mistake through asking thoughtful questions instead of just saying “HA! I FOUND THE MISTAKE.” I love this because it is not only instructional, but teaches students how to constructively criticize each other’s work. The math department plans on getting mini-whiteboards any day now, so I am excited to experiment with this. Also, Kate over at f(t) has some great tips from the Virtual Conference on Core Values from this past summer, where she describes the center of her classroom as being “We Make Mistakes.”

Moral of the Story? Growth Mindset takes more than a description and a survey to create buy-in. I will remember that teachers can unintentionally send subtle signals through their behavior. I’ve learned from my mistakes with this, which, paradoxically, will lead me to encourage lots more mistakes. I’ll certainly be coming back to this throughout the rest of the year.

September Review: Is it Actually a Necessary Evil?

It seems like math is the only subject in which teachers feel like they need to review for the first week or two of the school year. Teachers of other subjects seem to review as they go along, going back to old skills and ideas as needed, as motivated by their curriculum. To me, this makes much more sense… and I know because I say this a bit disheartened, having plowed my students through a review of algebra for the first week and a half of school. After starting off my class with a bang, with some great metacognition and a good introduction to Standards Based Grading, I had a lot of trouble getting into the groove, mostly because I knew that I wanted to push through the review to get to the good stuff, which meant I ended up having an uninspiring week of a hugely teacher-centric classroom.

So that of course brings the question to my mind: “Why am I wasting time on not-good stuff?” I know some review is unavoidable (especially when a vast majority of my non-AP Calc students claim they have never seen or heard of piecewise-defined functions before), but I really believe, like a bunch of other math teachers I have talked to, that most of the review could come as needed as the curriculum develops. And our book reviews all these crazy topics that won’t ever have much bearing on the future curriculum. Symmetry? Modeling** (the mathematical kind)? Animal Husbandry?

I got frustrated halfway through the week and decided that instead of just hammering out more material, I would do a problem solving activity with my AP class that would remind my students of many of the things they needed to know while engaging them in deep problem solving at the same time. The sad part about teaching an AP class is that it totally felt like I was “losing” a day (my yearly schedule is nagging me), but it was totally worth it. This is something I am going to struggle with all year, as I have taught a very application-based Calculus course for a year and this is my first shot at the AP. Here is a mini-unit I organized about piecewise functions.

MOTIVATING PIECEWISE FUNCTIONS: I tried to get them to see why piecewise functions are necessary by giving them data on tourism arrivals and departures in Jordan and the US over the past 15 years and asking them to tell me the story that the numbers are telling them (thanks John for giving me the idea with your post about Telling the Story of a Number). Each group came up with 4-5 bullet points and wrote them on the board for the others to see. Quite unsurprisingly, there are major drops in both US arrivals and departures around 2001 and 2008, which most groups mentioned in their story, but others came up with some great explanations that I didn’t expect. For example, one group mentioned that Petra was named one of the new Seven Wonders of the World somewhere around 2008, so that might explain an uptick in arrivals (cool!). Another attributed the rise in US tourism in the mid-1990s to a Deep Purple tour… We talked about how it would be hard to fit ONE function to the data because it’s kind of all over the place, but we could fit a bunch of chopped-up functions. The cool part about framing it like this is that the points where the function changes correspond to major world events, which is because those events changed the relationship between the variables. I saw a bunch of light bulbs go off on that one.

ACTIVATING THEIR PREVIOUS KNOWLEDGE: Then, I played for them DJ Earworm’s 2009 United State of Pop mashup (I blogged about this on Sam Shah’s blog this summer). We made the metaphor between a mashup and a piecewise function and used that to give ourselves a quick reminder of how the notation works. This led into a few examples as a reminder, but none of the drill and kill – I just wanted them to remember that they knew how this stuff worked.

FINALLY, THE PROBLEM SOLVING ACTIVITY: With more-than-inspiration from Mimi’s Income Tax Unit, I presented them with how income tax works here in Jordan. They were really surprised – most thought it was some sort of flat tax. They were also confused. Why is it so complicated? So I presented them with the goal of the task, which was to make the income tax more easily understandable for the average person. I found the income tax brackets for five different countries, and each group was tasked with graphing Tax Owed vs. Money Earned and then writing a piecewise-defined function that will give you your tax owed if you plug in your income. That way, an average person could either just find their income on the graph, or plug their income into the function. Trying to be “Less Helpful” à la Dan Meyer, I tried to provide scaffolding only where needed. This was so hard for me! I just wanted to give them little hints. I gave in to these urges every once in a while, but this was the most I have ever let my students really struggle. Most tried to start directly with the equation and had a lot of trouble abstracting the situation, but over the course of a period and a half (everything takes roughly 24 times longer than I think it will), pretty much every group had a graph drawn and had pretty much finished the equations.
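
To give a concrete flavor of what the groups were building (the brackets here are made up for illustration, not any of the five countries’ actual rates): say a country taxes the first 10,000 of income at 5% and everything above that at 20%. Then Tax Owed as a function of income x is piecewise linear,

\[
T(x) =
\begin{cases}
0.05\,x, & 0 \le x \le 10{,}000 \\
500 + 0.20\,(x - 10{,}000), & x > 10{,}000
\end{cases}
\]

where the 500 is the tax owed on the whole first bracket. The slope of each piece is that bracket’s rate, and the pieces have to meet at the bracket boundary, which is exactly the structure the groups had to dig out of the tax tables.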

(that’s Spain’s Income Tax)

REASONS THIS WAS NOT “LOSING A CLASS”

  • They realized that the keys to piecewise functions are the points on the boundaries of the intervals. This will really help when we talk about continuity and differentiability.
  • After some experimenting, most groups realized that the slope of each segment was the same as the percent of taxed money on that segment. Any sort of concrete exploration of the idea of slope is alright by me.
  • Many students became much more comfortable with point-slope form (or, more importantly, realized that this form of the line is sometimes much easier to use than slope-intercept), which will help when we talk about tangent lines.
  • One group made connections between all the ways they could have solved the problem. They actually determined their equations analytically – CATEGORY BASE TAX + (TOTAL INCOME – CATEGORY BASE INCOME) * TAX PERCENTAGE – which was impressive to me, as I always do things other ways first (graphically, numerically, etc.). But they came to the realization that this form is pretty much exactly point-slope form if you rearrange it a bit, and then got the added conceptual understanding that arises from point-slope form here (see the little write-up just after this list). I thought this was great.
  • This was more genuine, engaging and thought-provoking than the rest of the week combined (and it’s not even a particularly rich problem).
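
Here is the rearrangement that group noticed, written with the made-up bracket numbers from the example above (the numbers are mine; the algebraic point is theirs): the analytic tax formula is point-slope form in disguise, with the “point” being (category base income, category base tax) and the “slope” being the bracket’s rate.

\[
T(x) = \underbrace{500}_{\text{base tax}} + \underbrace{0.20}_{\text{rate}}\,\bigl(x - \underbrace{10{,}000}_{\text{base income}}\bigr)
\quad\longleftrightarrow\quad
y = y_1 + m\,(x - x_1)
\]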
I was feeling really frustrated and down from limping through review, so this class was exactly what I needed. My goal this year for my math classes is pretty simple: I want to really motivate everything we study. This was a great start and I hope to come up with more rich (but not necessarily “real world”) tasks like this. NEXT YEAR: NO MORE DRILL AND KILL REVIEW.

(PS One of the sweet things about working here is that when I leave I will get most of the tax that I paid the Jordanian government right back. And I don’t pay US taxes because you have to make a boatload abroad to have to pay. Buhahahah.)

**I actually love love love modeling, but doing textbook problems about modeling is like that first chapter in a science textbook that “teaches” the scientific method.

What If Angry Birds Didn’t Grade With SBG?

Last year I tried out Standards Based Grading for the first time and really thought it was a game changer for my classroom. Though I haven’t worked out many of the tweaks yet, and some departmental pressure is conflicting a bit with my ideal way of running things, I am still very excited about using SBG this year in class. One of the mistakes last year was that I did a terrible job explaining the whole system in the first few days of school – the whole thing was far too abstract and different from what they were used to, so the first presentation went over their heads and it took some students a while to actually figure it out. One of my goals this year was to sell/explain SBG much better so that I could have everyone on board, and I figured that this would be a worthy use of about a day total of class (I ended up integrating it with problem solving and review).

It’s easy to get caught up in trying to explain all the details of SBG, but of course making a simple analogy to scaffold off their existing knowledge is far more powerful. I realized that they already know Standards Based Grading from playing games like Angry Birds. Here is how Angry Birds grades with SBG:

Right? Levels graded separately that you can play over and over until you gain mastery? I’m sure others have thought of this analogy, but it seems pretty solid to me. So now contrast this to what the Angry Birds score screen would look like if it “graded” in the traditional manner:

This would suck because I never get 3 stars the first time around. I’m really hoping that these pictures can do almost all of the explaining for me, especially when we compare them to the way I graded their diagnostic tests from the first day. I have never done a diagnostic test in the beginning of the year like this, but I wanted to do it this year for both its diagnostic purposes and to have them learn how SBG works experientially. I graded it today two ways for them – in the traditional points manner and with SBG (and I will give them back tomorrow with my 4 point rubric and a full description of the standards):

I hope to have a discussion about what SBG tells you that traditional points based scores do not, and talk about the very different reaction you would have to quizzes graded in the two ways. I hope that with sample grades in front of them that mean something to them and a fitting metaphor, they will be totally sold on SBG before the second full day of school finishes.

Other Materials I Used to Introduce SBG…

1. Getting them in a Growth Mindset

I have decided that metacognition is going to be a big goal of mine this year, and one of the linchpins (especially while grading with SBG) will be getting students to realize the difference between a fixed mindset and a growth mindset. This boils down to the idea that those who believe they can always grow and always get smarter will end up growing far more easily than those who believe that intelligence is fixed. I gave them a little math learning questionnaire adapted from Math Hombre (gracias!), who mathified Carol Dweck’s original research questionnaire for use with his math students:

I had them first fill it out silently for a few minutes. Then, in partners, they found and discussed a statement about which they had differing opinions and a statement about which they had similar opinions. Then each pair found a new pair and shared with them the two statements that they had discussed previously. This really helped pave the way for an awesome class discussion. My favorite comments were when one student said that intelligence has to change because he is a lot smarter than he was in 9th grade, and then when others came to the consensus that in a fixed mindset you are comparing yourself to other people whereas in  a growth mindset you are comparing yourself to yourself (beautiful!). Though some students were really resistant to the idea of not thinking in terms of “smart” and “dumb” anymore, I think many students really bought into the idea of a growth mindset and will hopefully be able to connect that idea to SBG in general…

2. The Nitty Gritty Details of my Hybrid SBG System

And theeeeennnnn, finally, after getting into a growth mindset and experiencing SBG through a diagnostic test, I am going to give them all the details of the grading system – percentages, processes, resources, philosophy etc. This is basically what I did last year without all the prep. I made a pretty awesome Prezi to do all of this, which I am really excited to show tomorrow (not in small part because it includes a hand drawn picture of an angry octopus).

I hope this will really stick because then it’s onward and upward to the magical land of Calculus!

My Dog Could Do Math (or perhaps just perform algorithms)

I totally understand the place of the algorithm in mathematics. But the argument about the use of algorithms reminds me of a deeper issue: how math education is currently trying to figure out how to adapt to major technological advances in computing that allow us to have computers perform these algorithms for us. I’ve seen a lot of arguments that the math curriculum needs to change drastically to take advantage of new tools like graphing calculators and Wolfram|Alpha (which, of course, both rock) – basically, that any sort of direct computation needs to be phased out of a 21st century curriculum and that we should teach students how to problem solve using these technological tools. Others argue that deep understanding is enhanced by knowing how these processes work. To which others counter that people drive cars all the time to get from place to place without ever having built a car or really knowing how it fully works.

Somehow I was reminded of this when I saw the family dog, Whiskey (who is absolutely hilarious), performing some of his tricks when I was visiting home this summer. Our family’s favorite is the “Bang, you’re dead” trick.

First, my mom puts her gun out, to which Whiskey responds by sticking his paws in the air innocently. Then, my mom yells “bang!” and Whiskey awkwardly flops to the floor, flips over and plays dead. As you can probably tell, it’s a pretty amusing trick, and pretty complicated for such a puny little brain. But… here’s the whole video from which I got these screen shots (no idea why it ended up so stretched out):

Notice that he tries just about everything before he gets it right. He has sort of a general idea of what he is doing, but he has no idea why he is doing each of the steps. When he accidentally does one too quickly, or jumps up instead of putting his paws up, he can’t diagnose his misconception thoughtfully and fix his mistake. He just blunders through trial and error until he figures out something.

The sad part was that this totally reminded me of a few students whom I taught last year (seniors taking Calculus) who had no idea how to do basic algebra because they had no deeper understanding of what was going on, and somehow had no idea how to check their answers to see if they were on the right track. It felt like their previous math teachers had taught them how to do tricks, and perhaps I wasn’t doing any better. If they managed to plow through the right steps and stumble on the right order, I would reward them with the treat of a good grade – and that reward exists in both traditional grading methods AND in Standards Based Grading. But the problem for me wasn’t in the algorithms. It was that they had no deeper understanding of mathematics to accompany those algorithms. Teaching just computation, and not teaching it well.

So for me, when math computation technology proponents argue that you can drive plenty of places without ever knowing how a car works, I always think about the one time when the car breaks down. What do you do then? You are stuck. You have to call someone for help. You can’t thoughtfully work out the problem on your own. I agree with the (hopefully obvious) general opinion that learning how to do computation is not the goal of a mathematics education. And I agree about the danger of accidentally teaching meaningless algorithms, which can easily happen unless you conscientiously dig deeper in checking for understanding. But the idea of throwing out the deep conceptual understanding of mathematical structure that goes along with learning some of these processes, in favor of using computers for computing, just doesn’t really sit well with me. The important, deep conceptual understanding of mathematics certainly doesn’t come from just learning algorithms, but it is also not helped by never learning why algorithms work. Technology, though a great tool, is not a replacement for the human mind.

Why I am thinking about things like this and not the hectic as-of-yet-unplanned first week of school is beyond me…

UPDATE: If you haven’t checked out Matthew Brenner’s “The Four Pillars Upon Which the Failure of Math Education Rests,” go read it – reading the whole thing is on my to-do list, but everything I have read from it so far is wonderful. It was pointed out to me after I wrote this that he wrote something very similar (though about ten times more eloquently) on page 55. Agree to agree, I guess! Now I must read the whole thing.