Mathia X Review

Posted on June 8, 2016

I was out marking roads for our upcoming bike ride during yesterday’s webinar on the latest THING THAT IS GOING TO SAVE ALL MATH INSTRUCTION BECAUSE IT IS SO GOOD AND AMAZING.

I think I'm glad I got to watch it after the fact, because I don't really want to be commenting "live" straight through the initial cynic filter…

… okay, I’ve seen it.

Executive short summary (current jargon: tl;dr): Same stuff, different day, and with a mess of errors.  Procedure, procedure, procedure.   It seems to have **some** visual and concrete elements, but I can't tell how much or where they are… the "instruction" in the software is a math textbook on the pages.   Basically: they have to do a lot of marketing because the product isn't good.
It's got more visuals than ALEKS (a little) and that's a good thing.  Still, if the student is getting something wrong, that means: do more problems!

If you’re stuck — the  teacher needs to work with you.

"Adaptive" means it will walk. you. through. the procedures.  It doesn't actually "adapt."   It can't even pretend to discern your misconception.

It's got some conceptual questions, but if you don't get one right it will walk you through that problem's… procedure.   It does not adapt and do anything whatsoever to actually work with the concept. The teacher says that most students get it after 3 times or so, which is about how long it would take to memorize the arbitrary pattern asked of a particular problem.

It’s got a few creative ways of using data to “adapt” the path along the procedural instruction, which doesn’t have all the wrinkles ironed out.
They're really excited that there's no Java.  It's all HTML5.   It's standards-aligned.   Those are the selling points (and they *did* talk most excitedly and confidently about that stuff).

That's what they should focus on, as opposed to the actual "instruction."  Please, leave that word out, dudes.  It's not instruction; you get directions.   It's math practice software.
And the word problems are … the usual force-fit “real world” that isn’t actually anything like the real world — even when it could have been, but they couldn’t be bothered to check on what bicycle racing is like…

Screen-by-screen:

The first many minutes were the "we're so great and this is why" part, but… lots and lots of programs can make charts that look like that while their program is still a mess of procedures for manipulating symbols. Of course, their instructional design makes it good for… EVERYTHING!!  Blended learning, supplemental learning, and Response to Intervention.  JUST BUY IT NOW!!!

They had a cute graph of teacher-paced vs. student-paced instruction and the number (or percent) of errors over time, and it revealed that with teacher-paced instruction, the number of errors went up over time.   They presented this as a bad thing, showing that students were understanding less.

I am pretty sure they're right, but what about that whole New Thing that "Mistakes Grow Your Brain!!!"?   Or, as a real question: don't forget the emotional part of wrestling with mistakes.   I don't think they're going to talk about that.

That said, yup, I agree — y’all go too fast!   Even for the smart ones!  Teachers should slow down to student pace…   How are you going to sell that, though…

Real question:   I wonder whether the mistake counts (in "teacher-paced" vs. "student-paced") were an average, too.   Did even the smart kiddos make more mistakes?   Or did some students make fewer, while the poor unwashed made so many more that the average still skewed toward more mistakes?   (Say five strong students drop from two errors to one while five strugglers jump from four to nine: the class average still climbs.)   I know from research that when you add challenge, the good kiddos feel smarter and the strugglers feel worse.

The first part of the actual program's "Varied Instructional Strategies" is "Problem Solving."

The first one had that really annoying use of the word "prism" for a rectangular solid, which, I'm sorry, most people associate with triangular prisms.   Still, it's arguable if you teach it, because… the shape certainly isn't a "cube."  "Rectangular solid" is better IMHO because it's specific.

"Explore Tools."   This is conceptual, and visual.   Pretty abstract, though. (Just a few seconds on this slide.)

"Animations": it showed dividing by fractions, but it doesn't deal with the struggle students have over what the "whole" is.   Dividing by fractions is pretty much *always* done too quickly.
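To make that concrete (my example, not theirs): take

3/4 ÷ 1/2 = 1 1/2

The procedure is trivial. The conceptual trap is that the 1 1/2 counts *halves* ("how many halves fit into three fourths?"), while the 3/4 was measured against the original whole. A student who thinks the answer still refers to that original whole is lost, and an animation that only shows the steps never touches that.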

"Worked Examples" is pretty much exactly like Connect's "step by step" walkthrough.   It's still highly abstract.   It shows a 100-square grid with 3/4 of one square filled in, which is a cool problem.

“Classification Tools” — also totally abstract though it is conceptual.

So… they’ve said  “Rich, Adaptive Mathematics Instruction” again… and I am waiting on the instruction part.

OKAY, we're in the actual demo.   We're going to go to THE LESSON!

[screenshot: MathiaX lesson]

Gee.

I don’t know.
What do you think?

This is instruction?

Okay, it's not horrible, but… it's a math textbook, shoved onto a screen.   It's a TON of words.   No visuals to go with these descriptions of the bars.   If a human were teaching it, would they recite all those words and only then pull up a picture?

Nope, this is just another “put a mess of stuff up there and hope the student guesses what to look at.”

That’s not rich.

That's not "adaptive."  It hasn't adapted one whit to students for whom this barrage of words and symbols is terrifying. Sorry, RTI, not *really* for y'all.

[screenshot: mathiaX2.PNG]

Okay, she just scrolls on down… yeah, it goes further… yup, SURE!!!   My guys are going to pore over that text to figure that out.   (Update:   they acknowledge that students who read it do better. Obvious correlation: students who understand math text do better… but gosh, their support for the questions means students will progress anyway.   Bottom line:  they can make it look like students are learning math.)

Dratfooey.

Yeah, they do have the beginnings of some visuals, but are they actually integrated into the *instruction*?   Nope.   Students are given that textbook excerpt to dissect (hint:   they won't).   Then the *next* step is the stuff that makes it work for them:
Walking You Through a Step-by-Step Example.

No, of *course* there are no visual or conceptual explorations here.   It’s just the main part of the program, after all.

The skillometer is nice: data, data, data :)

The other thing that is kinda cool is that the program collects that data to find the “next best step in the path” for this student.   However, since the “instruction” really isn’t even beginning to pretend to work with different paths to understanding, that’s of dubious value.  Why would a student be on one path vs. the other?  Erm, probably they’ll get the first thing in the “walk-through” first.
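(Aside for the curious: "find the next best step" systems in this family typically run some flavor of Bayesian Knowledge Tracing: keep a running probability that each skill is mastered, update it on every answer, and drill whatever is weakest. A toy sketch of the general idea follows; it is emphatically not their code, and every parameter value is invented.)

```python
# Toy Bayesian Knowledge Tracing (BKT) update -- the standard model behind
# "skillometer"-style mastery bars. A sketch of the general technique, not
# MathiaX's actual code; all parameter values here are made up.

def bkt_update(p_known, correct, p_transit=0.1, p_guess=0.2, p_slip=0.1):
    """Return updated P(skill is known) after one observed answer."""
    if correct:
        evidence = p_known * (1 - p_slip)
        posterior = evidence / (evidence + (1 - p_known) * p_guess)
    else:
        evidence = p_known * p_slip
        posterior = evidence / (evidence + (1 - p_known) * (1 - p_guess))
    # The student may also have just acquired the skill on this opportunity.
    return posterior + (1 - posterior) * p_transit

def next_step(skills, mastery=0.95):
    """Pick the weakest unmastered skill as the 'next best step in the path'."""
    unmastered = {s: p for s, p in skills.items() if p < mastery}
    return min(unmastered, key=unmastered.get) if unmastered else None

skills = {"define_units": 0.30, "write_expression": 0.50}
skills["write_expression"] = bkt_update(skills["write_expression"], correct=True)
print(next_step(skills))  # -> define_units
```

Notice what a model like that tracks: *which procedure you can execute*, not *which concept you hold*. Which is exactly my complaint.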

Still, new stuff has to start somewhere and grow, and it’s a good place to start!

Still, I cannot get all that excited about their statement that the skillometer  will “allow the student to know what they need to learn, see their progress in that learning, and then know what skills they have yet to master.”

My guys would not understand what these things are describing.

[screenshot: mathiaX3]

Okay, and another absurd thing.   Their example problem has "Carlota" (pronounced "car-lotta") riding in the Tour de France.   "So far, Carlota has ridden 12 miles of the Tour de France. She rides 12 miles per hour."

Real-world problem, right?   Except that the TdF is men-only, 12 mph for 2 hours is not what a TdF rider does, and she'd better not go 10 hours between water bottles.   As in, **all** the details fail in the real world.   Oh, I'm being "picky" again…

Sorry.   If you want to claim to be “real,” then you need to do it.   Google, for crying out loud.  Just say she’s ridden 12 miles of the TdF *course.*   It almost looks as if they worked to make sure *every* part of the problem that could be bogus was bogus.

The “try and find the good things” angel is even rolling her eyes at this point.

Okay, I’ve found a good thing!   They have different kinds of hints for different kinds of errors.

Do the hints have any visuals?  Anything besides math-textbook text??? Erm, no.

No visuals whatsoever in the problem.

Three "hint" levels… "Strike three, we give you the answer."    It says "hint:  enter the word miles for measuring the distance…"   I guess that's better than saying "answer: miles," since you have to do some reading, but… of course the answer is *highlighted*, so as soon as you figure that out, you can click through to it.   Is there a setup to see whether you're clicking through?   (LOL, like Reading Plus and its determination that I was going "too fast" and warning me, tho' I wasn't, or like the Khan Academy's removal of people who went too fast from their research. Because we don't care whether we actually reach the frustrated ones, after all…)
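(While I'm asking: the click-through check I want would be trivial to build. A hypothetical sketch of the three-strikes ladder they demoed, plus that timing check. Hints one and two and the 3-second threshold are my inventions; strike three quotes the demo.)

```python
# Hypothetical sketch of the three-strikes hint ladder from the demo, plus
# a crude click-through detector. Not MathiaX's code: hints 1 and 2 and the
# 3-second threshold are invented; the strike-three hint quotes the demo.

HINTS = [
    "What kind of quantity is being measured here?",             # strike one
    "Distance is measured in units of length.",                  # strike two
    "hint: enter the word miles for measuring the distance...",  # strike three
]

def hint_for(wrong_attempts):
    """Return the hint for this many wrong attempts, capped at the last one."""
    return HINTS[min(wrong_attempts, len(HINTS) - 1)]

def clicked_through(request_times):
    """Flag a student who grabs consecutive hints under 3 seconds apart."""
    return any(b - a < 3.0 for a, b in zip(request_times, request_times[1:]))

print(hint_for(5))                       # -> the strike-three "hint," i.e., the answer
print(clicked_through([0.0, 1.2, 1.9]))  # -> True: racing straight to the answer
```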

And I’m going to stay “picky,” dear ones.   Because you know what?   Our students don’t deserve beta.  They deserve stuff that **already works.**

I like that when the student has "question 1" highlighted and ready for an answer in the answer block, the pertinent text is underlined in yellow.   (Note, though:   they know this is necessary when students are actually answering questions.   Still, the "opening of the lesson" is to plop screen-down-keep-scrolling massive text at them.)

[screenshot: mathiaX4.PNG]

Now I wish I were at the webinar, because I would ask: "How does the student know not to just answer the *question* in that cell?"   How does the student know to put an expression in the "miles" cell rather than figure out the answer?   I don't know… are they "walked through" this process?   (It's also unclear to me whether the "after ten hours" miles are additional… additional to what?   The original 12?  The 12 and then the next 24?)

The *directions* don’t say at all.   You’re to define the units, then enter a variable and write an expression.

Except, you actually have to do more.  So… the directions don’t actually… match the task that the students are supposed to do.
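For the record, my best guess at what that cell wants, assuming (the demo never says) that t counts *additional* hours of riding:

unit: miles
variable: t = additional hours of riding
expression: 12 + 12t
after ten more hours: 12 + 12(10) = 132 miles

If "after ten hours" instead means ten hours total, the expression is just 12t and the opening 12 miles is decoration. That's the ambiguity.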

And we wonder why  students don’t read directions?

Well, it's new, right?  They need to work a few glitches out.    Sorry, not on my students… I don't want to be your beta tester unless you're paying *me*. And yes, this is why I'm not publishing anything I make on my own.

And the big, big question:   was that mess of instruction at the beginning actually supposed to get the student to know how to do that problem?  It goes through *one example* and students don’t get to practice that at all.

Okay, I *do* like "Exploring Symmetry on the Coordinate Plane."   It's an "explore" tool.

Except:   no clue when students would "explore."   How is that integrated into the instruction?  Oh, it's not.   It's there so it's a feature you have.

Of *course* the reminder email is a "bounce@" address.  However, I found a real one, because I did get a nice, real answer from these guys in a previous conversation, claiming that yes, they had applied the research on conceptual knowledge to this product.

Welp, they haven’t really.

Gee, no kidding… students like seeing the bars grow.   Sorry, but I want the connection between the Pavlovian stuff and actual learning to be a little stronger.

This is strongly reinforced by her explanation of how students would be confused by the concept-based fraction-division problems even when they could do the procedure.
How does she know they have the concept?   Well, because they were walked through the step-by-step process of solving the problems that were presented that way.

Guess what?   Being walked through a procedure several times and then being able to replicate that procedure does not indicate conceptual knowledge.   They may well have it!   If they do, kindly give me evidence of that.   The fact that your "evidence" is that, gosh, they mastered a different procedure (by being walked through it, step by step) tells me that you probably don't have stronger evidence. ("By the third time walking through it, they could do it step by step."  Therefore, they were thinking more deeply?????)

Oh, she saw **every** student benefiting… because of the instant feedback.   This is what she said right after saying the program was curing their need for instant gratification… it's not curing it.   There's no evidence that they'll be able to engage in something w/o it for longer.

"What if they're stuck?"  Answer:   it's because (a) they're not trying or (b) they're not getting it.   The teacher's supposed to look at the subskills and… the teacher should work 1:1 with the student.  Gosh, that's so adaptive!!!!

From another question ("do you really expect students to read the lesson pages," or words to that effect):  we know students do better when they read the lesson page.  They're robust, with good worked examples.    They have, he says, a "check for understanding." He doesn't say whether students have to do it.

Oh, well.   At least ALEKS has some competition so maybe they’ll each work out their tech glitches.    Math pedagogy?   Some other venue…
