Not for the faint of art.
Thank you to everyone who commented here on Saturday about my travel plans. It clarified a few things for me. Today, I'm going to discuss an article about numbers: this 2018 Ars Technica discussion of statistics. Trigger warning: math ahead.

"Fixed mindsets" might be why we don't understand statistics
Study finds people prefer complicated methods because that's what they're used to.

Just reading the subhead there triggered my skepticism. People prefer complicated methods? Not in my experience, piped up my inner skeptic. People like things simple.

And then I remembered that we're reading and writing in English, with all of its confusing and complicated grammar and spelling rules, and how every attempt to simplify the language has met with indignant resistance. Sure, I'm used to it. I've spent a lifetime trying to improve my command of the language. If we suddenly decided that everything should be spelled consistently, I'd have to learn a whole new set of rules, and even if those rules are simpler, it's more work. Besides, shouldn't everyone else have to put in the same work I did to learn something?

Anyway, that has nothing to do with statistics (well, not directly). It was the first analogy that popped into my head.

A new study in Frontiers in Psychology examined why people struggle so much to solve statistical problems, particularly why we show a marked preference for complicated solutions over simpler, more intuitive ones. Chalk it up to our resistance to change. The study concluded that fixed mindsets are to blame: we tend to stick with the familiar methods we learned in school, blinding us to the existence of a simpler solution.

As I noted above, it wouldn't be a "new study" anymore. I mostly mention the time frame because the article was published before the entire world got hit by a working example of statistics.

Now, I know I keep harping on this, but one study isn't sufficient to establish fact. I have no idea if any follow-ups have been done in the 5+ years since this article was published. Fortunately, this is handled by weasel words like "might be."

Roughly 96 percent of the general population struggles with solving problems relating to statistics and probability.

67.56% of statistics are made up on the spot. Okay, no, I just made up that number, but if you're going to assert something like "96% of the general population" does anything, shouldn't you back that up with some sort of citation?

Assuming it's true, though, would it surprise you to know that I'm one of the 96%? My education necessarily included statistics and probability, but that doesn't mean I don't "struggle." I'm not an expert, and it doesn't come naturally to me. It's one reason I like blackjack at casinos: it exercises that part of my brain. Sure, it can be an expensive education, but so is college. I think the bigger problem is that many people don't even try.

Recent studies have shown that performance rates on many statistical tasks increased from four percent to 24 percent when the problems were presented using the natural frequency format.

The "natural frequency" thing is explained fully in the article (basically, saying something like "1 in 4" instead of giving it a 25% probability, the two of which are mathematically equivalent). But I had to chuckle at the way they phrased this... using percentages. What I finally figured out they meant by "four percent to 24 percent" was: the 4% is the complement of the 96% above, i.e., the share of people who could actually solve the problems, and the "natural frequency format" raised that to 24%, lowering the "struggle" rate from 96% to 76%.
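Just to make that concrete, here's the arithmetic spelled out in a few lines of Python (my own illustration; nothing like this appears in the article):

    from fractions import Fraction

    # The same chance, stated two ways: the "natural frequency" phrasing
    # and the probability phrasing are mathematically identical.
    as_probability = 0.25               # "a 25% probability"
    as_frequency = Fraction(1, 4)       # "1 in 4"
    assert as_probability == float(as_frequency)

    # The article's numbers: solvers went from 4% to 24% of participants,
    # so the "struggle" rate (the complement) drops from 96% to 76%.
    solved_before, solved_after = 0.04, 0.24
    print(f"struggle rate: {1 - solved_before:.0%} -> {1 - solved_after:.0%}")
    # struggle rate: 96% -> 76%

Same numbers either way; the natural frequency phrasing is just easier for human brains to hold onto.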
But the way the article presented it was somewhat ambiguous. I encounter this sort of thing sometimes in video games. Say you have a 30% chance of scoring a critical hit on an opponent. You find some item whose description includes "raises the crit hit chance by 10%." Now: does that mean it's additive, raising the crit hit chance to 40%? Or does it increase your chance of scoring a critical hit by 10% of the current probability, raising it to (30 + 0.1*30 =) 33%? This matters because the latter might not be enough of an edge to justify the opportunity cost of using that item rather than, say, one that makes you prettier. And these games are generally designed, at least in part, by nerds who oughta know better than to be that ambiguous about numbers.

The article goes on to describe a Bayesian probability problem (in brief, Bayes' Theorem takes prior knowledge into account), and the thing that amused me there is that at the end of the section, which involves a bit of math, there's an editorial comment in italics and brackets: [corrected]. The implication is that even the author got it wrong. Which I'm not going to snark on them for, because I get it wrong sometimes, too (there's one particularly egregious math error in an old blog entry that I'm really hoping doesn't come up in one of my Revisited posts, because I'd then be honor-bound to publicly kick myself for getting it wrong). But if you're specifically writing an article about how people get statistics wrong, and in it you get the statistics wrong, well, that's funny.

The students had to show their work, so it would be easier to follow their thought processes. Weber and his colleagues were surprised to find that even when presented with problems in the natural frequency format, half the participants didn't use the simpler method to solve them. Rather, they "translated" the problem into the more challenging probability format with all the extra steps, because it was the more familiar approach.

A while back, I encountered an article about second-order polynomials. The quadratic equation is traditionally taught as a rigorous method for finding their solutions. I had to memorize the quadratic equation long before anyone really explained to me why it works, and that kind of explanation is essential for me to truly understand something. The QE is somewhat complicated, involving square roots, addition, subtraction, multiplication, and division; not to mention that it's got that pesky "plus or minus" sign, because every positive number has two square roots, one positive and one negative (both 2 and -2, for example, square to 4). Anyway, that article presented a faster, simpler method for solving second-order polynomials. As I recall, it is indeed simpler. And yet, on the rare occasion these days when I want to think about this sort of thing, I still go to my hard-won memorization of the QE.

So I can believe the conclusion reached in the study. But that just means I have to be more skeptical of it.
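For the record, here's the formula I keep falling back on: for ax^2 + bx + c = 0, the quadratic equation gives x = (-b ± sqrt(b^2 - 4ac)) / (2a). And a minimal Python sketch of it, again purely my own illustration (it assumes a isn't zero and the roots are real, just to keep it short):

    import math

    def quadratic_roots(a, b, c):
        """Both solutions of a*x**2 + b*x + c = 0, via the quadratic equation."""
        root = math.sqrt(b * b - 4 * a * c)   # the part under the "plus or minus"
        return (-b + root) / (2 * a), (-b - root) / (2 * a)

    # x**2 - 5x + 6 factors as (x - 2)(x - 3), so the roots should be 3 and 2.
    print(quadratic_roots(1, -5, 6))          # (3.0, 2.0)

The example at the bottom is just a sanity check that the "plus or minus" really does hand back both roots.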