Minor revelation

Posted on December 24, 2011

Well, not minor to me 😉
I’ve struggled with why folks who tout “throw away traditional math instruction! We need more problem-solving!” just don’t make me want to stand up and join them. For that matter, the “we need to bring back real teaching and actually have kids learn the times tables!” people don’t either.
I am thinking that maybe it’s all about the concepts. John Van de Walle got me thinking about it (when I sent my resume to Dreambox, from whom I haven’t heard back) with this little gem: “It is not a matter of discovery but rather the personal development of the expected curriculum, based on the students’ existing ideas.” (Sadly, a whole lot of the rest of that article at http://mathematicallysane.com/reform-mathematics-vs-the-basics/ was spewing mild vitriol at the evils of direct instruction.)
Depending on the student and the instruction, either a procedure-based or a “problem-solving”-based curriculum could have that result. Perhaps the most critical factor is actually whether or not the teacher understands the “expected” conceptual curriculum. Given the assorted research on student teachers’ consistent lack of confidence in math, that would explain a lot, especially why neither approach seems to work very well.
I’m going to post an article from the Spring 2011 issue of the International Dyslexia Association’s Perspectives bulletin about why students don’t get math concepts, and that led me to _Key Misconceptions in Algebraic Problem Solving_. (The whole bloomin’ conference is online. WOOT.)
Now, it seems that the folks at Carnegie-Mellon were playing around with some clever “intelligent tutor” software. (Yes, it’s procedural. A quote from the article: “Current instructional methods, including those used in the Cognitive Tutor, are not typically focused on helping students gain conceptual understanding even though the need for greater emphasis on conceptual understanding has been acknowledged (NCTM, 2000, National Mathematics Advisory Panel, 2007).”) Students were asked conceptual questions, put through the software, and then their concepts were tested again. The concepts were understanding what the equals sign means, what a negative sign does, and whether or not things are like terms. Seems the students did improve “significantly.” However, the cited “success” story was of a student who went from getting 2/8 conceptual questions right to getting 4/8 of them right. Now, if the questions kept getting harder, maybe he learned something; if not, I am not so sure the gains were “significant.” The score increases may be statistically so, but he may just have done a better job of matching the procedures he knew to the questions.
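Just to put a number on my skepticism (this is my own back-of-the-envelope check, not anything from the paper): if you pretend the eight questions are independent yes/no items and run a Fisher’s exact test on one student going from 2/8 to 4/8, the jump is nowhere near significant on its own. A quick sketch in Python, under those assumptions:

```python
# My own back-of-the-envelope check, not anything from the paper: treat one
# student's pre-test (2/8 right) and post-test (4/8 right) as two sets of
# independent yes/no outcomes and ask how easily the jump could happen by
# chance (a two-sided Fisher's exact test, computed by hand). This ignores
# that the same items are being reused, so it's only a rough illustration.
from math import comb

n = 8                                  # questions on each test
pre_right, post_right = 2, 4           # 2/8 before, 4/8 after
total_right = pre_right + post_right   # 6 correct answers across both tests

def table_prob(k):
    """Chance that k of the 6 correct answers land on the pre-test,
    given the fixed row and column totals (hypergeometric distribution)."""
    return comb(total_right, k) * comb(2 * n - total_right, n - k) / comb(2 * n, n)

observed = table_prob(pre_right)
# Two-sided p-value: add up every arrangement at least as unlikely as the one we saw.
p_value = sum(table_prob(k) for k in range(total_right + 1)
              if table_prob(k) <= observed + 1e-12)

print(f"p-value for a single student's 2/8 -> 4/8 jump: {p_value:.2f}")  # about 0.61
```

(Any real significance has to come from the whole group of students, not from one kid’s two extra right answers.)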