mathlovergrowsup

Closing the Teach For America Blogging Gap
Jan 29 2012

Pelham Math Committee Misrepresents Statistics in “Math War” over Curriculum

My awesome office mate, who helped me pass statistics last semester, got engaged in the Pelham math debates and wrote a Rubinstein-esque statistical analysis of the studies used to justify removing a particular curriculum from the schools.

Here are his thoughts:

So I have several issues with this article that was posted by the PMC. However, I do want to qualify my following statements. First, I do not know enough about the TERC Investigations curriculum to make an evaluative judgment regarding that curriculum. Second, I do not know the teachers in the district, nor do I know the inner workings of the district. By this I mean I do not know how the teachers teach (saying that a teacher uses such-and-such curriculum does not tell me how he/she teaches) and I do not know the extent to which the school provides professional development or training opportunities in using a specified curriculum. A final disclosure: I am a mathematics educator, a mathematician/statistician, and a mathematics/statistics education researcher.

My first issue with this article is the third sentence. “But there is compelling research that disproves it.” As any good statistician worth his/her salt will tell you, statistical analysis NEVER proves or disproves anything. In fact, as the joke goes, statistics is the best field to be in…you always get to be wrong! In other words, the PMC is now (un)intentionally twisting the study’s findings to promote its agenda.

While we’re on the topic of the study’s findings, let’s get a little factual. The first “finding” reported in the article isn’t a finding of the study at all. Oh, and that graph, yeah…it’s not in the study at all. In fact, considering I intimately know the program that the study’s authors used (SAS) and I know Excel very well (which the study doesn’t use at all), I can only conclude that the PMC made that graphic and is attributing it to the study. Now, I looked quite extensively and I couldn’t find numbers that would correlate well with the bar chart. However, if you can give me a page number inside the study to go examine again, I would be more than happy to re-evaluate this point.

The article goes on to claim that “The study was undertaken with the goal of finding out—with real evidence—whether the type of curriculum used in early elementary math education matters when it comes to achievement and learning.” Here’s the actual goal: “The goal of this study is to examine the relative effects of widely used curricula that draw on different instructional approaches and that hold promise for improving student math achievement” (p. 4). Hmm, anyone else seeing a disconnect? Let’s check the research questions.
• “What are the relative effects of the study’s four math curricula on math achievement of first- and second-graders in disadvantaged schools?”
• “Are the relative curriculum effects influenced by school and classroom characteristics, including teacher knowledge of math content and pedagogy?”
• “What accounts for curriculum differentials that are statistically significant?” (pp. 4-5, my emphasis)
So, in summary, the PMC is now claiming that the study is focused on finding out something that the study’s own authors aren’t claiming to study. So one of two things must be true here: 1) the PMC knows what was being studied better than the study’s own authors, or 2) the PMC is reading what they want to believe and then writing propaganda.

Wondering about that emphasis I added? In case you haven’t read the study, allow me to describe the sample population. The study was on disadvantaged students and disadvantaged schools. The study defines these schools as “those that have a relatively high schoolwide Title I eligibility rate” (p. xxi). This means that the schools in the study have high numbers (percentages) of children from low-income families. This is traditionally measured by the number of children eligible for free or reduced-price lunch programs. According to GreatSchools.org, Pelham Union Free School District has, wait for it, 5% of students qualifying for these programs. Sounds disadvantaged to me. (I’ll admit, I’m making an assumption that this is the correct school district; please correct me if I am wrong.) By the way, for the casual reader, New York state’s average is 44% for this measure.

Now you might wonder why it is important that I bring this up. It has to do with the PMC’s intent in publishing this article alongside a study they read. The idea that I gleaned was that the PMC is attempting to use a scientific study to give its cause momentum and to get well-meaning parents to join in. However, as any statistician will tell you, you can’t take the findings from one study and apply them as you see fit without paying attention to differences between groups. The study’s authors mention several times in the report that one should not attempt to draw any conclusions beyond the scope of the study, i.e., non-disadvantaged schools.

The second bulleted “finding” of the article is that the average achievement of Math Expressions students was higher than that of Investigations students. Now, this was an actual finding from the study, but there is some information-hiding going on. This result was only significant when you don’t adjust the statistical tests to account for the multiple pairwise comparisons that were being made. When you do make the adjustment, the finding is no longer significant.

So what’s the difference between unadjusted and adjusted? Quite a bit, actually. I encourage you to expand your horizons and read up on it. Essentially, the difference boils down to the number of tests being conducted: comparing four curricula pairwise requires six separate tests. If one does not adjust, then at the 5% level (this study’s level) you have a 1 − (1 − 0.05)^6 ≈ 26.5% chance of at least one false positive…saying that a difference is significant when it really isn’t. It is my professional opinion that the difference of 0.11 standard deviations is NOT significant at the 5% level.
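To make the multiple-comparisons point concrete, here is a minimal Python sketch of the arithmetic. The 5% level and the four-curricula setup come from the study as described above; the helper function is my own, and the Bonferroni correction shown is one standard adjustment method, not necessarily the one the study’s authors used.

```python
from math import comb

# With four curricula, there are C(4, 2) = 6 pairwise comparisons.
n_curricula = 4
k = comb(n_curricula, 2)  # 6 tests

def familywise_error_rate(alpha: float, k: int) -> float:
    """Chance of at least one false positive across k independent
    tests, each run at significance level alpha."""
    return 1 - (1 - alpha) ** k

alpha = 0.05
unadjusted = familywise_error_rate(alpha, k)  # about 26.5%

# A Bonferroni adjustment divides alpha by the number of tests,
# pulling the family-wise rate back down near 5%.
bonferroni_alpha = alpha / k
adjusted = familywise_error_rate(bonferroni_alpha, k)

print(f"pairwise tests: {k}")
print(f"unadjusted family-wise error rate: {unadjusted:.1%}")
print(f"Bonferroni per-test alpha: {bonferroni_alpha:.4f}")
print(f"adjusted family-wise error rate: {adjusted:.1%}")
```

The point of the sketch: a “significant at 5%” result from unadjusted pairwise tests actually carries roughly a one-in-four chance of being a fluke, which is exactly why the study’s adjusted results deserve more weight than the unadjusted ones.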

Interestingly enough, the PMC seems very adamant about pointing out that TERC Investigations was the only “pure constructivist program” and had a constructivist pedagogy. Recall that I am an educator, so I do happen to know quite a bit about pedagogy. Sorry to burst your bubble, PMC, but there is NO constructivist pedagogy. Constructivism is a theory of learning; pedagogy is about instructional theory. Recall that the study’s authors wanted to look at instructional approaches. This means the study was looking at teacher-centered and student-centered instruction…not the driving cognitive theory behind the curriculum. It is EXTREMELY inappropriate to equate student-centered learning with constructivism. I have watched a hard-core constructivist use teacher-centered instruction to help students understand mathematics. I have also watched hard-core behaviorists use student-centered instruction. Therefore, student-centered instruction does not imply constructivism.

To briefly mention your other article with the three mathematicians, I have worked with students who learned mathematics through the Saxon program. Guess what, I had to spend a significant portion of time getting the students to “unlearn” the conceptions that they developed through that program about mathematics. In my opinion, it is not an effective curriculum. (Yes, I have looked at this one.)

An independent research group conducted a rather in-depth study of the relative effects of four elementary mathematics curricula in disadvantaged schools. The study’s authors were very thorough and clear with their findings. However, NOT ONCE in the entire 250-page document do the study’s authors ever make the claim “Investigations students are at the back of the achievement pack by a meaningful amount in first grade and by a significant margin in second grade.” This is a pure invention of the PMC. Recall the goal and the research questions of the study. This statement is not a logical outcome in any statistical sense.

Now, I do want to say something in praise of the PMC. I applaud the very fact that you are taking an active role in your children’s education. I firmly believe that curriculum development requires the interaction of many different types of stakeholders, from school administrators and teachers to parents and education researchers. There have been a number of times when I have wished that some of my students’ parents would be more involved. However, there is a time and place for all stakeholders in the process. I would never presume myself knowledgeable enough to dictate to my doctor what she/he should be doing while I’m in surgery. There does come a time when you need to listen to individuals who have more experience and more knowledge about a topic than you. Yes, it is good to get mathematicians involved; however, mathematicians know mathematics, not how students learn it. You also need to invite people who research how students learn mathematics to be part of your dialogue.

You obviously want the highest-quality math for the children of Pelham. (I’ll assume you mean math education.) However, what are you doing to assess the quality of instruction? Or the meanings of the mathematical topics your child is learning? What are you doing to actually experience the TERC Investigations curriculum yourself? From my vantage point it seems that you have particular ideas about how math should be taught and have found people who do research in other subjects who agree with you. Yes, TERC Investigations has a different approach to mathematics education than you might have learned. However, there is more to mathematics than fluency in standard algorithms. Teaching standard algorithms alone teaches a student that mathematics is devoid of meaning and leads to a belief that mathematics does not involve the real world. I’ve served on hiring committees before, and I can tell you that not once have I ever supported the hiring of someone who believed that mathematics was all about standard algorithms. Yes, students should have computational fluency, but they also must be able to make sense of new situations. What do you think is needed more in the STEM fields: creative thinkers/problem solvers, or skill-and-drill, computationally fluent robots?

It is a sad day indeed when people believe that mathematics achievement should only be measured by how well a student can follow a standard algorithm.

21 Responses

  1. Ms. Math

    Because the article mentioned using What Works Clearinghouse as a way to judge math curriculum I thought it was worth noting Schoenfeld’s article about the methodological issues in studies What Works Clearinghouse used to evaluate math curriculum. Schoenfeld was a research mathematician but has spent the bulk of his career as a well-respected math education researcher at Berkeley.

    http://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=3&sqi=2&ved=0CDQQFjAC&url=http%3A%2F%2Fgse.berkeley.edu%2Ffaculty%2Fahschoenfeld%2FSchoenfeld_WhatDoesntWork1.pdf&ei=tWklT7TDDqHJsQKN5vGMAg&usg=AFQjCNGEv7HLm8turF9Vyl8XDZkG4KbmMA&sig2=lt1BDceo34S0S5M7_UVJrA

  2. T

    Yes! Yes yes yes yes yes!

    I am somewhat familiar with the Pelham schools, and your information regarding the community is correct. As a teacher who has studied both statistics and cognition before entering the field of education (although not to the same extent that you have), I applaud you for writing this article.

  3. T- you’ve been a pretty awesome poster on my blog lately :)

    And not just because we have similar opinions!

    I’d love to know who you are, even if it was just an internet persona-do you have a blog?
    Cameron

  4. Eric Weber

    Let’s assume he’s on point in his criticisms, and having read the report myself, I tend to agree with his analyses. However, then the analysis ends and the preaching begins.
    A reasoned analysis that does not vilify a group of people is much more effective. The emotion detracts from the message. People don’t listen to each other in the first place, and calling shame on people who disagree with you makes it that much harder for constructive criticism to take place (see: US politics). It makes him seem like a scorned educator who can’t stand that someone thinks about mathematics differently than he does.

    • Eric-thanks for your two cents on the issue-I didn’t know you were engaged with the debate as well-Neil and I did get a little riled up-your point about constructive criticism is on the mark.

  5. Ricardo

    So easy to debate when it is not your kid… here is what I fear for my kids:

    http://www.youtube.com/watch?v=1YLlX61o8fg

    Nowhere close to the depth of your academic experience, just concerns from a real father in the real world. I really hope you are right.

  6. Ms. Math

    If the girl knew what she was doing with the cubes, sheets, sticks and blocks and could relate that to the standard algorithm, I think that would be awesome: it would show understanding of addition and give her the tools to be able to add complicated things (such as polynomials) later in life. I hope my kids never view manipulatives as meaningless and excessively complicated means for addition.

    I’m not sure that I hope I’m right about math education! My views are often pessimistic. I know I won’t really know what it means to care about a kid until I’m a parent, but I do really care about the state of kids’ education in general.

  7. Dominic Charles

    This article strikes me as partisan and pedantic in its analysis.
    The broad question that we want answered is whether Investigations is the right program for our kids.
    Common sense suggests that a math program that shuns standard algorithms might not perform as well as a program that teaches the basic math that most adults would recognize as standard.
    It seems that however you spin the statistics common sense is proved right. Students who study a traditional math program do better.

  8. Ricardo

    Dominic Charles – thanks for your posting. Just one more stat: 57% of the districts that adopted Investigations have dropped it.

    http://mysite.verizon.net/resu7qv4/sitebuildercontent/sitebuilderfiles/misuccessjul09.pdf

    For those studying the topic, it would be great to learn what happened to scores before/during/after investigations.

  9. Ms. Math

    It’s not a surprise to me that most districts drop the curriculum. Just because its goals are based on research doesn’t mean it works in practice.

    Most elementary school teachers probably do not have strong enough understandings of the mathematics to talk meaningfully about operations like division (Ball, 1990) (Simon, 1993). If they don’t understand division and multiplication quantitatively, then it is unlikely they will be able to teach this way well. It doesn’t mean that having meanings for operations is not useful-I’m sure that it is. It doesn’t mean that traditional math is better. It does probably mean that if you ask people to teach things they don’t understand, it might not end well. The right thing for your kids might be spending more money on professional development of elementary school teachers in mathematics. It’s unlikely that any curriculum will be right for them if issues of teaching are not addressed. Curriculum is just a piece of the puzzle-it doesn’t solve any problems by itself, even if it can keep people doing ineffective, yet traditional, things.

    Additionally, since most parents in the country don’t understand the aims of “reform” curriculum (at least I didn’t really get it without long-term study), it makes sense to me that they would react similarly and ban it in all 57% of those districts. If they don’t understand math beyond standard algorithms, then how can they see something else as useful? If they do understand math more meaningfully, just because they get it, how will they see that some kids need to be taught the meanings that they picked up on their own? There are websites of teachers saying that they loved Investigations-but I bet those teachers had an understanding of math that allowed them to see why it was helpful for their students.

    I don’t see how going with tradition is common sense. We’ve been teaching about the same way for decades. (See the Teaching Gap for results of a large national study) Our math scores are poor. Hundreds of articles document that college students can’t use math they have learned in science and engineering classes. Why does it make sense to keep doing what we’ve been doing even though the premises are contradicted by the last 30 years of research by people who devote their lives to understanding how kids learn math? That doesn’t make sense to me.

  10. Ricardo

    Ms Math – I see your point. But what method are the top ranked countries using? Is it Investigations-like? Is there any real success case to validate the method?

    • Ms. Math

      The Teaching Gap (Stigler and Hiebert) has the best answer to what methods the top countries are using.
      I’m not sure how Investigations is being implemented, so I can’t tell if what they are doing is similar or not. I could speculate, but it would be that no, whatever is happening is not what top countries are doing. Based on parent complaints, it doesn’t seem that the curriculum is being presented by the teachers in a way that is very effective. The teachers seem to be presenting the idea that algorithms are wrong. All I’m saying is that meaningless algorithms are wrong, and traditional math fosters these extensively.

      I do know that the top countries like Japan spend substantial time reflecting as a team on what their students are learning. Investigations also intends that teachers reflect on what their students have learned, because this has been shown to help make progress. They solve more complicated problems in Japan, and spend more time focused on understanding and meanings as opposed to disconnected memorization. There are studies based on large random samples confirming that high-performing countries are doing something more effective than traditional instruction in the US.
      My guess is that success stories for curricula like Investigations are not too common. It teaches concepts not often measured by standardized tests. There are success stories about the depth of understanding kids gain from reform curricula, but those involve extensive professional development as well as new measures that test understanding and not just procedural fluency. Conceptual programs (as far as I can tell) don’t make kids better at procedures. Connected Math, for example, has kids doing better on concepts and about the same on procedures in large studies.

      I do have amazing success stories from teaching differently in my Calculus class-one kid told us that he always felt dumb because math seemed like random rules and now he finally feels smart because we showed him that it made sense and supported his progress. I’ve seen what kids can do when you give them strong meanings.

  11. Jonny Swift

    I totally agree with Ms Math on this one.

    My three children have been in the Investigation program since it’s inception in my school district and both have really flourished. I don’t think that they have suffered at all from not wrote learning the addition times tables drummed into them or from not doing long subtraction.

    The unit on focusing on how students felt about triangles was especially useful. My kids have always love playing with felt and really missed it once they left kindergarten so was really great for them to play with felt again instead of doing doing difficult math problems which – let’s face it – no one ever likes. That’s not all though, geometry is something that is has great practical value and learning about triangles can be difficult for some students. I certainly remember struggling with the topic – especially when dealing with the bigger more complex triangles that have more than three sides.

    The parents who object to Investigations are just haters who don’t know what they are talking about.

    • Ms. Math

      It’s so great to hear from a parent with positive experiences- I’ve never used Investigations so my main point is we need some experts to weigh in on the topic!

      • I know you’re Ms Math, not Ms English, but did you notice the username?

        • Ms. Math

          umm…. sad to say I have read Swift’s work and totally missed that one. Oops!

    • T

      “Especially when dealing with the bigger more complex triangles that have more than three sides.”

      Sweet sarcasm, bro.



About this Blog

Learning more about life than math…

Region
Las Vegas Valley
Grade
High School
Subject
Math
