Why we're failing students with the 'new' math
I would like to thank David Wees for taking the time to critique my report on math education. It's good to see this issue is getting the attention it deserves, and these issues should be vigorously debated. However, there are a number of points in Wees' article to which I would like to respond.
The CBC just ran an article on the problems in our current math education system, which was terribly one-sided and an example of the worst kind of fear-mongering in journalism. It quotes an article by Michael Zwaagstra, an "educational expert" writing on behalf of the Frontier Centre for Public Policy.
Actually the CBC coverage about math education was quite thorough. It did not simply summarize my study as the reporter in question also interviewed other educators, visited public school classrooms, gave Department of Education officials an opportunity to respond, and also spoke with several math professors. Wees has every right to disagree with the tone of the news story, but he can hardly call the coverage one-sided when it included opposing views within the same story.
Now Zwaagstra points out that remedial math courses are on the rise in universities, but he doesn't mention a couple of key facts. First, under the old system of mathematics instruction, around 50 per cent of students failed first-year math courses, which were often included in programs as a tool with which to weed people out of university. Could it be that this issue has always been around, and universities are simply now doing something about the problem? What about the increase in students seeking a university education? Could these two issues be connected? Zwaagstra has assumed a correlation between the number of remedial math courses and the effectiveness of K-12 math education, without citing any research that supports his conclusion.
As I pointed out in my report, math professors from across the country have clearly stated that they have noticed a decline in math skills of students in their first-year university classes. Since the vast majority of university students received their math instruction in public schools, it's not unreasonable to ask whether there is something wrong with K-12 math instruction.
It is also important to point out that the "new" math education techniques are themselves not very old, and are not used by all teachers equally. The most recent iteration of the elementary school math curriculum in British Columbia is only four years old, and the secondary school curriculum is only five years old, neither of which is a long enough period of time to make the kind of determinations of effectiveness that Zwaagstra is making.
Wees is partly right and partly wrong in his statement. He is correct that not all teachers use these new math education techniques equally. However, it is becoming increasingly difficult for teachers to use traditional methods when they are regularly discouraged from doing so by school administrators. In addition, while the most recent iteration of the "new" math is only a few years old (depending on the province in question), these concepts have been around much longer than that. For example, the widely used Quest 2000 series of math textbooks from Addison Wesley was published in the 1990s and already placed very little emphasis on math facts or traditional algorithms. So this is not really a new problem. It has simply become more apparent since provincial curricula were re-written to fully reflect this methodology.
Further, he talks about parents enrolling their kids in after-school tutoring programs without discussing the reasons why parents are doing this. Are parents increasingly enrolling their kids for extra tutoring because they are dissatisfied with their children's current educational attainment? Or do they have other reasons for paying for these tutoring services? We don't know, and Zwaagstra doesn't provide any evidence about why parents choose tutoring. He just cherry-picks this fact because it seems to support his argument.
Parents who enrol their kids in after-school tutoring programs almost always do so because there is something lacking in the education their children are receiving in school. Tutoring is expensive and parents wouldn't pay for it if they were already satisfied with the quality of education provided.
There is also solid evidence showing that the longer people are out of school, the less likely they are to use the algorithms they learned in school, but the more successful they are at solving the mathematical problems they encounter, as Keith Devlin points out in his book, The Math Instinct. In other words, traditional school math seems to be a hindrance to people being able to actually solve real-world mathematical problems. It's worth pointing out that Devlin's research is relatively old, and most of the participants in that research learned mathematics by the traditional method. Is it even worth pointing out that Zwaagstra doesn't actually include any of his "solid evidence" in his paper, and the footnote here (see the original article) leads to a definition of the word algorithm?
Perhaps Wees should clarify his point because it looks like he's saying that people get better at math the further removed they are from school. By that argument, people who never attended school at all should have the best math skills since they were never corrupted by traditional school math in the first place. I doubt that is what he meant to say, but it comes across that way.
As for Wees' claim that I didn't include any solid evidence in my paper, he obviously did not check the rest of my footnotes. Here are some of the references I cited.
- K. Anders Ericsson, Ralf Th. Krampe, and Clemens Tesch-Romer, "The Role of Deliberate Practice in the Acquisition of Expert Performance," Psychological Review, Vol 100, No. 3, 1993, pp. 363-406.
- Hung-Hsi Wu, "Basic Skills Versus Conceptual Understanding: A Bogus Dichotomy in Mathematics Education," American Educator, Fall 1999, pp. 1-7.
- W. Stephen Wilson, "In Defense of Mathematical Foundations," Educational Leadership, March 2011, pp. 70-73.
In my report, I also made reference to research showing the effectiveness of the JUMP program for teaching math. The JUMP website has more detailed information about these studies at: http://jumpmath1.org/jump_research
Wees may also want to check out Dr. John Hattie's 2009 comprehensive book, Visible Learning, in which he systematically examined hundreds of research studies in education to find the most effective teaching methods. Here is what Hattie says about constructivism (the philosophy upon which new math is based).
"The role of the constructivist teacher is claimed to be more of facilitation to provide opportunities for individual students to acquire knowledge and construct meaning through their own activities, and through discussion, reflection and the sharing of ideas with other learners with minimal corrective intervention. These kinds of statements are almost directly opposite to the successful recipe for teaching and learning as will be developed in the following chapters," page 26 (emphasis added).
Wees should also examine the following peer-reviewed research study that recently appeared in the Journal of Educational Psychology. I didn't reference it in my report since it is very recent and I did not see it until after my report was released.
- Louis Alfieri, et al., "Does Discovery-Based Instruction Enhance Learning?" Journal of Educational Psychology, Vol. 103, No. 1, 2011, pp. 1-18.
This article is a meta-analysis of 164 studies of discovery-based learning, and it concluded that unassisted discovery does not benefit learners. Rather, students need to have explicit instruction, scaffolding and worked examples.
More evidence supporting the traditional approach to math instruction can be found at ahypatia.wordpress.com, a website written by a Canadian professor of mathematics.
Zwaagstra then goes on to bash the results of the PISA examinations, citing an article (claiming it is research) that suggests that Finnish students are not as good at math as the PISA results would claim, and that by extension, neither are Canadian students.
I did not "bash" the results of the PISA examinations. I simply pointed out the problem with using our relatively high PISA results to argue that the math skills of our students are just fine. PISA evaluates "everyday math" rather than the type of math students need to be successful at the university level.
It's pretty important to note that when teachers are given proper training in effective pedagogy, their students' understanding improves. To say that the problems in our math education system are entirely due to the introduction of the new math curriculum is pretty irresponsible, given that any number of other factors could be contributing to the problem.
I never said the problems in math education were entirely due to the introduction of the new math curriculum. Of course, there are other factors that need to be considered. What I did say is that the current curriculum and recommended math textbooks are not providing students with the foundational skills they need to be successful at math.
Zwaagstra continues by bemoaning the lack of standards and emphasis on accurate calculations by the National Council of Teachers of Mathematics (NCTM), and the Western and Northern Canadian Protocol (WNCP). Clearly, the research these two organizations have done for decades is not sufficient for Zwaagstra.
The research studies cited by these organizations are sadly deficient. I've read many of these research studies, and they usually have extremely small sample sizes, lack proper control groups, and contain a significant amount of speculation. If Wees disputes this statement, then I challenge him to list the top three peer-reviewed research studies that back up his claim that the discovery-based method to teaching math is superior to the traditional emphasis on math facts and standard algorithms. I will read them and provide my analysis.
Before I go on to Wees' next point, it should be noted that he doesn't even touch the part of my report where I directly compared the recommended math techniques in the new textbooks with those in the old textbooks. It's interesting that he omitted from his analysis the most direct and hands-on comparison in my paper between the old and new methods.
Zwaagstra's solution to improving math education is to move "back to basics," which is as unoriginal an idea as I've heard. It is arrogant of Zwaagstra to assume that this approach hasn't been tried before. Perhaps he could instead address the issue of elementary school teachers often lacking support and training in how to teach math? Zwaagstra points out (correctly) that having mastered one computation, students are then better able to learn another computation, but this leaves students learning a series of computations, and not spending any time actually using them.
Obviously, we can't go "back" to the basics if we've never tried them before. So I'm not sure why Wees thinks I believe that I've come up with a new form of teaching math. It's called standard algorithms and traditional math education exactly because it has been tried before and it is actually quite effective. In addition, Wees sets up a false dichotomy when he says that when students learn a series of computations they don't spend any time actually using them. There's no reason why students can't learn a new computation, gain experience using it, and then move on to the next level.
JUMP math is mentioned in Zwaagstra's article as an antidote to the problem, but he doesn't talk about the issue of the associated training, or the lack of diverse assessment used in the JUMP math system. I think that the training manuals that go along with the JUMP math curriculum actually address the misconceptions of the people teaching the math (mostly elementary school teachers), rather than JUMP math itself being a significantly better system. As one educator has told me, JUMP math is pretty useless without the training materials for teachers.
The system in JUMP math is significantly different from the discovery-based methods currently prescribed in the math curriculum and the recommended textbooks. Considering that Wees obviously disagrees with the methodology of JUMP math, I'm not sure how he can claim that its training manuals help address the misconceptions of people teaching math.
Someone who has never had the chance to play with a piano, to improvise, and to perform music for others will never develop an appreciation for the instrument. Zwaagstra is suggesting that we should discard the extra parts of math education, like problem solving, and focus on computations, which is the musical equivalent of only learning scales, and never getting to perform music.
Nowhere in my report did I say that we should discard problem solving from the math curriculum. Once again, Wees has set up a false dichotomy. By all means allow students to discover new ways of doing math. Just make sure they also master their math facts and learn the standard algorithms.
Peer-reviewed research (which shows that the techniques advocated by the NCTM and WNCP are effective) carries with it a heavier weight of authority, and is a more reliable instrument with which to craft public policy.
Ironically, the link provided by Wees in this section is to an opinion piece published by the International Academy of Education, not to a peer-reviewed study. The challenge still remains for Wees to list the top three peer-reviewed studies that back up his claim that the discovery-based method to teaching math is superior to the traditional emphasis on math facts and standard algorithms.
In conclusion, I submit that there are serious problems with the objections Wees has raised to my report.