*My awesome office mate who helped me pass statistics last semester got engaged in the Pelham Math debates and wrote a Rubenstein-esque statistical analysis of the studies used to justify removing a particular curriculum from the schools.*

*Here are his thoughts:*

So I have several issues with this article that was posted by the PMC. However, I do want to qualify my following statements. First, I do not know enough about the TERC Investigations curriculum to make an evaluative judgment regarding that curriculum. Second, I do not know the teachers in the district, nor do I know the inner workings of the district. By this I mean I do not know how the teachers teach (saying that a teacher uses such-and-such curriculum does not tell me how he/she teaches), and I do not know the extent to which the school provides professional development or training opportunities in using a specified curriculum. A final disclosure: I am a mathematics educator, a mathematician/statistician, and a mathematics/statistics education researcher.

My first issue with this article is the third sentence: “But there is compelling research that disproves it.” As any good statistician worth his/her salt will tell you, statistical analysis NEVER proves or disproves anything. In fact, as the joke goes, statistics is the best field to be in…you always get to be wrong! In other words, the PMC is now (un)intentionally twisting the study’s findings to promote its agenda.

While we’re on the topic of the study’s findings, let’s get a little factual. The first “finding” reported in the article isn’t a finding of the study at all. Oh, and that graph, yeah…it’s not in the study at all. In fact, considering I intimately know the program that the study’s authors used (SAS) and I know Excel very well (which the study doesn’t use at all), I can only conclude that the PMC made that graphic and is attributing it to the study. Now, I looked quite extensively and I couldn’t find numbers that correspond to the bar chart. However, if you can give me a page number inside the study to go examine again, I would be more than happy to re-evaluate this point.

The article goes on to claim that “The study was undertaken with the goal of finding out—with real evidence—whether the type of curriculum used in early elementary math education matters when it comes to achievement and learning.” Here’s the actual goal: “The goal of this study is to examine the relative effects of widely used curricula that draw on different instructional approaches and that hold promise for improving student math achievement” (p. 4). Hmm, anyone else seeing a disconnect? Let’s check the research questions.

• “What are the relative effects of the study’s four math curricula on math achievement of first- and second-graders in disadvantaged schools?”

• “Are the relative curriculum effects influenced by school and classroom characteristics, including teacher knowledge of math content and pedagogy?”

• “What accounts for curriculum differentials that are statistically significant?” (pp. 4-5, my emphasis)

So, in summary, the PMC is now claiming that the study is focused on finding out something that the study’s own authors aren’t claiming to study. So one of two things must be true here: 1) the PMC knows what was being studied better than the study’s own authors, or 2) the PMC is reading what they want to believe and then writing propaganda.

Wondering about that emphasis I added? In case you haven’t read the study, allow me to describe the sample population. The study was on disadvantaged students and disadvantaged schools. The study defines these schools as “those that have a relatively high schoolwide Title I eligibility rate” (p. xxi). This means that the schools in the study have high numbers (percentages) of children from low-income families. This is traditionally measured by the number of children eligible for free or reduced-price lunch programs. According to GreatSchools.org, Pelham Union Free School District has, wait for it, 5% of students qualifying for these programs. Sounds disadvantaged to me. (I’ll admit, I’m making an assumption that this is the correct school district; please correct me if I am wrong.) By the way, for the casual reader, New York state’s average is 44% for this measure.

Now you might wonder why it is important that I bring this up. It has to do with the PMC’s intent in publishing this article alongside the study. The impression I got is that the PMC is attempting to use a scientific study to give their cause momentum and to get well-meaning parents to join the PMC cause. However, as any statistician will tell you, you can’t take the findings from one study and apply them as you see fit without paying attention to differences between populations. The study’s authors mention several times in the report that one should not attempt to draw any conclusions beyond the scope of the study, i.e. to non-disadvantaged schools.

The second bulleted “finding” in the article is that Math Expressions students outperformed Investigations students in average achievement. Now, this was an actual finding from the study, but there is some information-hiding going on. This result was only significant when you don’t adjust the statistical tests to account for the multiple pair-wise comparisons that were being made. When you do make the adjustment, the finding is no longer significant.

So what’s the difference between non-adjusted and adjusted? Quite a bit, actually. I encourage you to expand your horizons and research it. Essentially, the difference boils down to the number of tests being conducted. With four curricula there are six pairwise comparisons, and if one does not adjust, at the 5% level (this study’s level), you have a 26.5% chance of at least one false positive…saying that a difference is significant when it really isn’t. It is my professional opinion that the difference of 0.11 standard deviations is NOT significant at the 5% level.
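For readers who want to see where a figure like 26.5% comes from, here is a quick sketch of the arithmetic, assuming six independent pairwise comparisons (four curricula compared two at a time) each tested at the 5% level:

```python
# Family-wise error rate: the chance of at least one false positive
# when running several tests, each at significance level alpha.
# Four curricula give 4 * 3 / 2 = 6 pairwise comparisons.
alpha = 0.05
n_tests = 6

# Probability that at least one of the six tests fires by chance alone,
# assuming the tests are independent.
fwer = 1 - (1 - alpha) ** n_tests
print(f"Unadjusted chance of a false positive: {fwer:.1%}")  # -> 26.5%

# A simple Bonferroni adjustment judges each comparison at a stricter
# per-test level so the overall error rate stays near 5%.
bonferroni_level = alpha / n_tests
print(f"Bonferroni per-test level: {bonferroni_level:.4f}")  # -> 0.0083
```

This is only a back-of-the-envelope illustration (the study’s actual adjustment procedure may differ), but it shows why a result that clears an unadjusted 5% threshold can fail to be significant once the number of comparisons is accounted for.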

Interestingly enough, the PMC seems very adamant about pointing out that TERC Investigations was the only “pure constructivist program” and had a constructivist pedagogy. Recall that I am an educator, so I do happen to know quite a bit about pedagogy. Sorry to burst your bubble, PMC, but there is NO constructivist pedagogy. Constructivism is a theory of learning, and pedagogy is about instructional theory. Recall that the study’s authors wanted to look at instructional approaches. This means the study was looking at teacher-centered and student-centered instruction…not the driving cognitive theory behind the curriculum. It is EXTREMELY inappropriate to equate student-centered learning with constructivism. I have watched a hard-core constructivist use teacher-centered instruction to help students understand mathematics. I have also watched hard-core behaviorists use student-centered instruction. Therefore, student-centered instruction does not imply constructivism.

To briefly mention your other article with the three mathematicians, I have worked with students who learned mathematics through the Saxon program. Guess what, I had to spend a significant portion of time getting the students to “unlearn” the conceptions that they developed through that program about mathematics. In my opinion, it is not an effective curriculum. (Yes, I have looked at this one.)

An independent research group conducted a rather in-depth study of the relative effects of four elementary mathematics curricula in disadvantaged schools. The study’s authors were very thorough and clear with their findings. However, NOT ONCE in the entire 250-page document do the study’s authors ever make the claim “Investigations students are at the back of the achievement pack by a meaningful amount in first grade and by a significant margin in second grade.” This is a pure invention of the PMC. Recall the goal and the research questions of the study. This statement is not a logical outcome in any statistical sense.

Now, I do want to say something in praise of the PMC. I applaud the very fact that you are taking an active role in your children’s education. I firmly believe that curriculum development requires the interaction of many different types of stakeholders, from school administrators and teachers to parents and education researchers. There have been a number of times when I have wished that some of my students’ parents would be more involved. However, there is a time and place for all stakeholders in the process. I would never presume myself knowledgeable enough to dictate to my doctor what she/he should be doing while I’m in surgery. There does come a time when you need to listen to individuals who have more experience and more knowledge about a topic than you. Yes, it is good to get mathematicians involved; however, mathematicians know mathematics, not how students learn math. You also need to invite people who research how students learn mathematics to be part of your dialogue.

You obviously want the highest-quality math for the children of Pelham. (I’ll assume you mean math education.) However, what are you doing to learn about the quality of instruction? About the meanings of the mathematical topics your child is learning? What are you doing to actually experience the TERC Investigations curriculum yourself? From my vantage point, it seems that you have particular ideas about how math should be taught and have found people who do research in other subjects who agree with you. Yes, TERC Investigations has a different approach to mathematics education than you might have learned. However, there is more to mathematics than fluency in standard algorithms. Standard algorithms, taught in isolation, teach a student that mathematics is devoid of meaning and lead to a belief that mathematics does not involve the real world. I’ve served on hiring committees before, and I can tell you that not once have I ever supported the hiring of someone who believed that mathematics was all about standard algorithms. Yes, students should have computational fluency, but they also must be able to make sense of new situations. What do you think is needed more in the STEM fields, creative thinkers/problem solvers or skill-and-drill, computationally fluent robots?

It is a sad day indeed when people believe that mathematics achievement should only be measured by how well a student can follow a standard algorithm.

Because the article mentioned using the What Works Clearinghouse as a way to judge math curricula, I thought it was worth noting Schoenfeld’s article about the methodological issues in the studies the What Works Clearinghouse used to evaluate math curricula. Schoenfeld was a research mathematician but has spent the bulk of his career as a well-respected math education researcher at Berkeley.

http://gse.berkeley.edu/faculty/ahschoenfeld/Schoenfeld_WhatDoesntWork1.pdf