Writing.Com
Printed from https://www.writing.com/main/profile/blog/cathartes02
Rated: 18+ · Book · Opinion · #2336646
Items to fit into your overhead compartment

Carrion Luggage

Blog header image

Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.

This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it locates rising columns of air and spirals within them to soar to greater heights. This behavior has been mistaken for opportunism, interpreted as if the bird were circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.

It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.

It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."

I rarely know where the winds will take me next, or what I might find there. The journey is the destination.
April 12, 2025 at 3:22am
#1087073
This Wired article is fairly old, and published on my birthday, but neither of those tidbits of trivia is relevant.

    Why a Grape Turns Into a Fireball in a Microwave
Nuking a grape produces sparks of plasma, as plenty of YouTube videos document. Now physicists think they can explain how that energy builds up.


No, what's relevant is that fire is fun.

The internet is full of videos of thoughtful people setting things on fire.

See?

Here’s a perennial favorite: Cleave a grape in half, leaving a little skin connecting the two hemispheres. Blitz it in the microwave for five seconds. For one glorious moment, the grape halves will produce a fireball unfit for domestic life.

Unfortunately, you can only see it through the appliance's screen door (that screen serves the important function of keeping most of the microwaves inside the microwave), and I don't know what it might do to the unit, so don't try this with your only microwave. Or at least, don't blame me if you have to buy a new one. I'm not going to pay for it.

Physicist Stephen Bosi tried the experiment back in 2011 for the YouTube channel Veritasium, in the physics department’s break room at the University of Sydney.

What's truly impressive is that Bosi, the grape, and the microwave oven were all upside-down.

Off-camera, they discovered they had burned the interior of the physics department microwave.

What'd I tell you? I'm not responsible if you blow up the one at work, either. Still, if the last person to use it committed the grave sin of microwaving fish, this might be an improvement.

I should also note that the article contains moving pictures of the effect. These are cool, but you might hit a subscription screen. With my script blocker, I could see the text, but not the pictures.

But it turns out, even after millions of YouTube views and probably tens of scorched microwaves, no one knew exactly why the fireball forms.

As regular readers already know, this is the purpose of science.

After several summers of microwaving grape-shaped objects and simulating the microwaving of those objects, a trio of physicists in Canada may have finally figured it out.

At least they weren't upside-down. Sucks if they wanted to nuke some poutine, though.

The fireball is merely a beautiful, hot blob of loose electrons and ions known as a plasma. The most interesting science is contained in the steps leading up to the plasma, they say. The real question is how the grape got hot enough to produce the plasma in the first place.

And this is why some people think science sucks the joy out of everything. No, nerds: the fireball is the cool part. The science is merely interesting.

Their conclusions: The grape is less like an antenna and more like a trombone, though for microwaves instead of sound.

Huh. Never heard of a trombone exploding into a blaze of glorious fire, but I suppose it could happen. Better to save that fate for instruments that deserve it, like bagpipes, accordions, and mizmars.

I joke, yes, but the article explains it rather well. If you have a subscription. Or can cleverly bypass that annoying restriction.

The grape, incidentally, is the perfect size for amplifying the microwaves that your kitchen machine radiates. The appliance pushes microwaves into the two grape halves, where the waves bounce around and add constructively to focus the energy to a spot on the skin.

Not explained: if the grape is "the perfect size," how come it works for grapes of different sizes?
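A rough answer hides in the numbers, so here's a back-of-envelope sketch. These figures are my own assumptions, not from the article: household magnetrons run at about 2.45 GHz, and water's high microwave permittivity (relative permittivity around 80, so a refractive index around 9) shrinks the wavelength inside a watery grape to roughly grape diameter. Since the match only needs to be approximate, a range of grape sizes can still focus the energy.

```python
# Back-of-envelope resonance estimate for the grape trick.
# Assumed round numbers: 2.45 GHz magnetron, microwave refractive
# index of water ~9 (square root of relative permittivity ~80).
C = 299_792_458   # speed of light, m/s
FREQ = 2.45e9     # typical household magnetron frequency, Hz
N_WATER = 9       # approximate refractive index of water at 2.45 GHz

wavelength_air = C / FREQ                     # ~12.2 cm in air
wavelength_grape = wavelength_air / N_WATER   # ~1.4 cm inside the grape

print(f"in air: {wavelength_air * 100:.1f} cm")
print(f"inside the grape: {wavelength_grape * 100:.1f} cm")
```

About 1.4 cm, which is grape-sized, give or take; hence the trick tolerating grapes of somewhat different diameters.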

A common misconception is that the microwave acts on the grape from the outside in, like frozen meat defrosting, says physicist Pablo Bianucci of Concordia University, who worked on grape simulations included in the paper.

I don't know where Concordia University is, so I can't make jokes about its location. Oh, wait, I could look it up.

...

Oh, it's in Montreal. Which explains the poutine.

Anyway, I didn't know people still thought microwaves heated from the outside in. We can't all be physicists, but I was under the impression that it's fairly common knowledge that the wavy EM thingies work by exciting the water molecules throughout the... whatever you put in there. That's why it's usually faster to nuke a cup of water than it is to boil it on the stove.
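As a sanity check on that "faster to nuke a cup" claim, the arithmetic is just energy equals mass times specific heat times temperature rise. The numbers below are my own round figures, not anything from the article:

```python
# Time to bring a mug of water near boiling in a microwave,
# via E = m * c * dT. All inputs are assumed round figures.
mass = 0.25      # kg, about one cup of water
c_water = 4186   # J/(kg*K), specific heat of liquid water
delta_t = 80     # K, from roughly 20 C up to 100 C
power = 800      # W, typical oven, assuming all power is absorbed

energy = mass * c_water * delta_t   # ~84 kJ
seconds = energy / power
print(f"about {seconds:.0f} s")     # just under two minutes
```

A stove burner puts out more raw watts, but much of that heat goes into the pot and the air instead of the water, which is why the microwave usually wins for a single cup.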

The work has more serious applications too, Bosi says.

Look, not everything needs to be useful for something. But when it is, that's pretty cool.

His experiments with grape balls of fire...

And there we have it, folks: the real reason I saved this article to share with all of you.

...began and ended with the 2011 YouTube video, but his curiosity did not. “I’m impressed with the scientific depth of the paper,” wrote Bosi in an email. In particular, he notes that authors came up with mathematical rules for describing the grape hotspot. They could conceivably shrink these rules to a smaller scale, to create similar hotspots in nanoparticles, for example. Scientists use heated nanoparticles to make very precise sensors or to facilitate chemical reactions, says Bianucci.

I'll take their words for it.

During all their microwaving, they noticed that two grapes placed side by side repeatedly bump into each other, back and forth. They don’t know why that happens, and they’ll be studying that next, says Bianucci.

Always something else to study. This is a good thing.

Not mentioned in the article: how in the hot hell did anyone figure out that putting a grape, cut mostly in half but still connected by a tiny thread of grape skin, into a microwave would produce a "grape ball of fire?" It's not like we eat warm grapes. Even if we did, that's still a very specific configuration.

Some mysteries, I suppose, will never be solved. And that's also a good thing.
April 11, 2025 at 9:13am
#1087018
I'm more than a little pissed at Time right now because they reported the "dire wolf de-extinction" story as if it were true and not a steaming pile of bullshit. Don't know what I'm talking about? Use a search engine; I'll be damned if I'm going to give that crap any more boost by linking it.

But I'm really hoping they got the science right on this article:



"Surprising," I guess, if you're a prude. It makes me feel better to cuss, so I've always known it had health benefits (for me, not the people I'm cussing at). Still, it's good to have science backing me up. If it's true. After the "dire wolf" bullshit, I can't be sure.

Many of us try to suppress the urge to blurt out an expletive when something goes wrong.

And many of us try to hold sneezes in. That doesn't mean it's healthy.

Research has found that using profanity can have beneficial effects on people’s stress, anxiety, and depression. In fact, there are numerous potential physical, psychological, and social perks related to the power of a well-timed F-bomb.

"Social?" I guess it depends on the society.

Cursing induces what’s called hypoalgesia, or decreased sensitivity to pain. Researchers have shown that after uttering a curse word, people can keep their hands submerged in ice water for longer than if they say a more neutral word.

I get why they do the submerged-in-ice-water thing. It's a low-risk means of inducing some level of pain in a test subject; deliberately inflicting other kinds of pain would be unethical. But I wonder about the efficacy of low-risk pain inducement in a study such as this. For one thing, a big part of pain is the surprise. If you know you're going to get stuck with a needle at the dentist, you can control your reaction somewhat (though it's quite difficult to swear with your mouth wide open and the dentist's fingers in there).

But here’s an interesting twist: “People who swear less often get more benefit from swearing when they need it,” he says. In other words, cursing all the time zaps the words of their potency.

That's not surprising to me. I prefer to hold back the important words for when they can provide better emphasis.

Swearing aloud is associated with improvements in exercise performance, including cycling power and hand-grip strength.

This wouldn't surprise me either. I glanced at the study. Decent sample size, but restricted demographics (i.e. one of those studies that used students as swearing guinea pigs), and the control group used neutral language, presumably words such as "hit," "truck," or "bunt."

A study in the European Journal of Social Psychology found that when people wrote about a time they felt socially excluded, then repeated a swear word for two minutes, their hurt feelings and social distress were significantly lower than for people who used a neutral word.

Taken together with the findings about physical pain, this might lend more credence to the idea that physical pain and emotional pain are related in more ways than just being described with the same word.

In another study, researchers found that when drivers cursed after being refused the right of way by another driver, or when they encountered a traffic jam caused by cars that were stopped illegally, cursing helped them tamp down their anger and return to a more balanced emotional state.

I didn't look at that study. I've experienced this myself. And "cursing" in this context includes showing the offender my middle finger.

There appear to be surprising social benefits associated with the well-timed use of profanity. “Some people believe that profanity can break social taboos in a generally non-harmful way, [which] can create an informal environment in which people feel like insiders together,” says Ben Bergen, a professor of cognitive science at the University of California, San Diego, and author of...

This isn't on the same level as those other assertions. "Some people believe" is a classic weasel phrase, which is why I'm not including the name of his book. I don't doubt that it does these things, but, as anyone who's been on WDC for a while can attest, cussing can also alienate some people.

Of course, it is possible to overdo it. People who swear frequently are sometimes perceived as angry, hostile, or aggressive, so there’s a potential tipping point to using profanity.

Again, I'm pretty sure that's true, but: what's the tipping point? I suspect it's different for different groups. Baptist church vs. biker bar, e.g.

The article does address this qualitatively:

It’s also important to know your audience.

Swearing etiquette may depend on the social hierarchy and power dynamics in certain situations, such as the workplace, says Jay. Just because the boss uses curse words doesn’t necessarily mean you can get away with it. (You’ll also want to modify your language around young children.)

Nah. I want young children to stay as far away from me as possible. If I cuss in public, their parents herd them away. I win. They win, too, because I have furthered their education.

Not addressed in the article: whether writing "fuck" has similar benefits to saying it. I suspect not. Clearly, further study is needed. Can I get money for being a guinea pig in that study?
April 10, 2025 at 12:42am
#1086956
I'm posting early today because I have a dentist thing that will a) take all morning and b) leave me in no shape to form coherent sentences (worse than usual, I mean) in the afternoon. Speaking of posting schedule, I'll be going on a little trip next week, so blog posts will be erratically timed.

For today, though, I'll try not to make any tired old "place is in the kitchen" jokes about today's article from Gastro Obscura. No promises.

    Meet the Feminist Resistance Fighter Who Created the Modern Kitchen
Margarete Schütte-Lihotzky left an indelible mark on Austria, architecture, and how we cook.


Sexist jokes notwithstanding, this scene is set in Austria in the 1940s, when it was a central plank of a certain political party, led by a certain Austrian, that women belonged to children, kitchen, and church. Which should be enough right there to rebel against the entire idea of rigid gender roles.

Schütte-Lihotzky had been imprisoned since 1941 for her work as a courier for the Communist Party of Austria (KPÖ), which led the resistance against the Nazi regime in her home country. While she managed to narrowly avoid a death sentence, Schütte-Lihotzky remained in jail until the end of World War II in 1945. The incarceration would forever split her life in two. On the one side were her beginnings as a precocious and successful architect spurred on by the desire to create a better life for working-class women. On the other, what she would refer to as her “second life,” as an active communist, political activist, and memoirist who was professionally shunned in Austria for her political beliefs and received her much-deserved accolades only in the final decades of her life.

I suppose it could have been worse. Some people don't get recognized until after they croak.

Schütte-Lihotzky led a remarkably long and full life, dying a few days short of her 103rd birthday in 2000. But her name remains forever connected to a space she designed when only 29 years old: the Frankfurt Kitchen, the prototype of the modern fitted kitchen.

Which is so ubiquitous in developed countries now that it's hard to imagine a time when it didn't exist.

Designed in 1926 as part of a large-scale social housing project in Frankfurt, Germany, the “Frankfurt Kitchen” introduced many of the elements we now take for granted...

So the concept of a kitchen as we know it today is just under 100 years old. That's not too surprising; 100 years ago, we were still arguing over things like the size of the Universe and what powers the Sun. Still, I'd have said "take for granite," because of the proliferation of granite countertops in kitchens and because I can't resist a gneiss play on words.

...a continuous countertop with a tiled backsplash, built-in cabinets, and drawers optimized for storage—all laid out with comfort and efficiency in mind.

Whoever put my kitchen together must have forgotten about the "optimized for storage" bit.

“She didn’t just develop a kitchen,” says Austrian architect Renate Allmayer-Beck. “It was a concept to make women’s lives easier by giving them a kitchen where they could manage more easily and have more time for themselves.”

Thus leading inexorably to women joining the workforce, which, if you think that's a bad thing, boy are you reading the wrong blog.

The article even addresses the obvious:

While the Frankfurt Kitchen was marketed as a kitchen designed for women by a woman, Schütte-Lihotzky resented the implication that her gender automatically endowed her with secret domestic knowledge, writing in her memoir that “it fed into the notions among the bourgeoisie and petite bourgeoisie at the time that women essentially work in the home at the kitchen stove.”

I vaguely remember featuring a bit back in the old blog about the invention of the automatic dishwasher, which predated the Frankfurt kitchen (I suppose that rolls off the tongue and keyboard more easily than "Schütte-Lihotzky Kitchen") by a few decades. That, too, was a woman's work. And that's the closest I'm going to get to making a "women's work" joke; you're welcome.

The Frankfurt Kitchen was efficiently laid out and compact, to save both on costs and the physical effort required to use it. Here, a woman could move from sink to stove without taking a single step. This quest for efficiency also led Schütte-Lihotzky to move the kitchen from a corner of the family room into its own space—a choice that baffled contemporary homemakers.

And then, decades later, they'd take away the wall separating the kitchen from the family room, putting it back into one big open space. I spent my childhood in a house with an open-concept kitchen/living area, and I have nothing inherently against it. What I have a problem with is all the remodeling shows that insist on that kind of layout. Not because they insist on it, but because they're thinly-veiled ads for home improvement stores, and they enable that bane of the housing market in the US: house flippers.

The article even addresses the open-concept change, if obliquely:

When the Frankfurt Kitchen came under fire from second-wave feminists in the 1970s for isolating women in the kitchens and making domestic labor invisible, the critique hit her hard.

She defended her design in her memoir. “The kitchen made people’s lives easier and contributed to women being able to work and become more economically independent from men,” she wrote. Still, she conceded, “it would be a sad state of affairs if what was progressive back then were still a paragon of progress today.”


I feel like a lot of people would defend their life's work to the last, but that quote demonstrates a willingness to keep an open mind, even later in life, and to acknowledge that nothing is ever truly completed. As they used to say, "a woman's work is never done."

There's a lot more at the link, which I found interesting because I was only vaguely aware that today's kitchen designs owed a debt to something called a "Frankfurt Kitchen," but I didn't know anything about how it came to be. I figured maybe someone else might want to know, too.
April 9, 2025 at 11:19am
#1086899
I sure talk about the Moon a lot. We're coming up on another Full Moon, by some reckonings the Pink Moon: the first Full Moon after the Northern Hemisphere spring equinox. It's also a culturally significant Full Moon because it marks the start of Pesach, or Passover, and helps to define the timing of Easter. This will occur on Saturday, based on Eastern Time.
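A side note on that Easter connection: the church doesn't consult the actual sky, but a tabular "ecclesiastical" moon, and the whole rule compresses into the anonymous Gregorian computus (often credited as the Meeus/Jones/Butcher algorithm). Here's a sketch of it; the algorithm itself is standard, taken on trust rather than derived here:

```python
def easter(year: int) -> tuple[int, int]:
    """Gregorian Easter as (month, day), via the anonymous computus
    (Meeus/Jones/Butcher). It encodes 'first Sunday after the first
    ecclesiastical full moon on or after March 21' using only
    integer arithmetic, never the astronomical moon."""
    a = year % 19                        # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30   # epact: age of the tabular moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

print(easter(2025))  # -> (4, 20), i.e. April 20, 2025
```

Because the tabular moon can drift a day or so from the real one, the computed Easter occasionally lands a week away from what the sky alone would suggest.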

But this article, from aeon, isn't about Moon lore or cultural observances; quite the opposite.

    How the Moon became a place
For most of history, the Moon was regarded as a mysterious and powerful object. Then scientists made it into a destination


On 25 May 1961, the US president John F Kennedy announced the Apollo programme: a mission to send humans to the Moon and return them safely to Earth within the decade.

Specifically, white American male humans, but hey, one small step and all that.

The next year, the American geologist Eugene M Shoemaker published an article on what it would take to accomplish the goal in American Scientist. It is an extraordinary document in many ways, but one part of his assessment stands out. ‘None of the detailed information necessary for the selection of sites for manned landing or bases is now available,’ Shoemaker wrote, because there were ‘less than a dozen scientists in the United States’ working on lunar mapping and geology.

I had to look it up to be sure, but yeah, this was the same guy who co-discovered Comet Shoemaker-Levy 9, the one that impacted Jupiter back in the 1990s, right around the time we coincidentally started confirming the existence of exoplanets. That's a lot of astronomy wins for a geologist, especially considering that, technically, "geology" only applies to Earth.

I think that's a word it's safe to expand the definition of, though; otherwise, we'll have selenology, areology, and any number of other Greek-rooted world names attached to -ology. The problem becomes especially apparent when you consider we also have geography, geometry, and geophysics. Some sources refer to him as an astrogeologist; I'm not really picky about the wording in this case, as long as we all understand what's meant, though technically "astro-" refers to stars, not moons or planets. Being picky about that would cast doubt on "astronaut" as a concept.

Incidentally, he apparently died in a car crash in 1997, and some of his ashes got sent to the Moon with a probe that crashed into its south pole region. A fitting memorial, if you ask me.

But I digress.

The Moon is a place and a destination – but this was not always the case.

Well, it was certainly a destination for Eugene M. Shoemaker. Or part of him, anyway.

To geographers and anthropologists, ‘place’ is a useful concept. A place is a collision between human culture and physical space. People transform their physical environment, and it transforms them. People tell stories about physical spaces that make people feel a certain way about that space. And people build, adding to a space and transforming it even further.

So, this is a situation where science, technology, anthropology, folklore, mythology, linguistics, engineering, and psychology (and probably a few other ologies) meet. In other words, candy for Waltz.

Now, you might be thinking, as I did, "But science fiction treated other worlds as 'places' long before we sent white male American humans to the Moon." And you'd be right (because, of course, I was). The key is in the definition of 'place' I just quoted from the article: the Moon became a real place, as opposed to the speculative place it had in science fiction and fantasy:

Centuries ago, a major reconceptualisation took place that made it possible for many to imagine the Moon as a world in the first place. New technologies enabled early scientists to slowly begin the process of mapping the lunar surface, and to eventually weave narratives about its history. Their observations and theories laid the groundwork for others to imagine the Moon as a rich world and a possible destination.

Then, in the 1960s, the place-making practices of these scientists suddenly became practical knowledge, enabling the first visitors to arrive safely on the lunar surface.


One might argue that we lost something with that, like the folklore and mythology bits. But we gained something, too, and didn't really lose the folklore (though some of it, as folklore is wont to do, changed).

For much of history, the Moon was a mythological and mathematical object. People regarded the Moon as a deity or an abstract power and, at the same time, precisely charted its movement. It seemed to influence events around us, and it behaved in mysterious ways.

The connection between the Moon and tides was clear long before Newton explained gravity enough to demonstrate a causal relationship.

There were some who thought about trips to the Moon. Stories in religious traditions across the world tell of people travelling to the Moon. There were some thinkers before and after Aristotle who imagined that there were more worlds than just Earth. The ancient atomists discussed the possibility of worlds other than Earth, while other Greeks discussed the possibility of life on the Moon. This included Plutarch, who wrote about the Moon as both mythical and a physical object. But, to the extent that the Moon was thought about as a place, the notion was largely speculative or religious.

I sometimes wonder if, had we not had the big shiny phasey thing in the sky, our perception of space travel might have been different. The only other big thing in the sky is the Sun; all the other relatively nearby objects resolve to little more than dots: Venus, Mars, etc. I suspect that the presence of a visible disc, with discernible features even, might have served as a stepping-stone to imagining those other dots as worlds, once the telescope could start us seeing them as discs, too.

It would certainly have made mythology and folklore a lot different, not having a Moon.

The rest of the article is basically a brief (well, not so brief because it's aeon, but brief in comparison to human history) recap of our cultural relationship with the Moon. I don't really have much else to comment on, but I found it an interesting read, especially to see how our understanding has changed over time.
April 8, 2025 at 10:11am
#1086823
Got this one from Time, and now it's Time to take a look at it.



"Has become?" Always has been.

Imagine walking through New York City, invisible.

I don't have to. I've done it. People bumped into me (and didn't even pick my pocket), cars didn't stop at crosswalks, and taxicabs just zoomed on by when I hailed them.

This is also known as "being in New York City."

Marilyn Monroe, one of the most recognizable women in the world, once did exactly that.

The article describes how no one recognized her until she started acting Marilyny. There's some irony (or whatever) there, because it wasn't Marilyn Monroe who (if the story is true) walked through NYC invisibly; that was Norma Jeane Mortenson. So who was being herself? Marilyn or Norma Jeane? Who is real and authentic: Superman or Clark Kent? (Yes, I know, trick question; they're both fictional.)

Her story is extreme, but her struggle is not unique. Like Marilyn, many of us learn to shape ourselves into what the world expects. Refining, editing, and performing until the act feels like the only version of us that belongs.

Well, yeah. And then you become the act. And that becomes your authentic, real, true self. This isn't news or something to be ashamed of; it's the essential process of life as a human.

Today, even authenticity is something we curate, measured not by honesty but by how well it aligns with what’s acceptable. The pressure to perform the right kind of realness has seeped into every aspect of modern life.

Oh, boo hoo hoo. "Today," my ass. We've been doing this since we figured out this newfangled "fire" shit, if not before then. I might even postulate that the pressure to fit in, to conform, to not act like but be the person your society expects was even stronger in pre-industrial times.

Authenticity was supposed to set us free. Instead, it has become something we must constantly prove. In a culture obsessed with being “real,” we curate our imperfections, filter our vulnerabilities, and even stage our most spontaneous moments online.

Who's this "we" person?

I figured out a long time ago that I needed to be someone different at work than I was for, say, my role-playing game group. The latter helped with the former.

Those who should know these things told me that people responded well to honesty and authenticity, so I learned to fake those qualities.

Instead of naturally shifting between different social roles, we now manage a single, optimized identity across multiple audiences—our family, coworkers, old friends, and strangers online.

Again, who the fuck is "we?" Not me.

The bigger, paradoxical problem is, however, that the more we strive to be real, the more we perform; and in proving our authenticity, we lose sight of who we truly are.

To me, this is like saying "No one sees how we truly look; they only see the wardrobe and hairstyle we choose." Hell, even nudists get to choose their hairdos. Who "we" are is always a performance. Eventually, the performance becomes who we are. Fake it 'til you make it, and all that.

Think back to childhood. At some point, you probably realized that certain behaviors made people like you more. Maybe you got extra praise for being responsible, so you leaned into that. Maybe you learned that cracking jokes made you popular, so you became the funny one.

Okay, now you're attacking me directly.

Psychologists call this the “False Self”—a version of you that develops to meet external expectations.

Well, far be it from me to dispute what professional psychologists say, but again, that's like saying "society expects us to wear clothing to cover one's genitals, so the only way to be authentic is to be naked."

And even then, which is more authentic: pre-shower, or post-shower? And do you comb/brush your hair? Then you're not being authentic; you're conforming to society's norms.

My point here is that despite what the article says, authenticity isn't always a good thing. Maybe your "authentic" self is a thief, and you don't want to face society's punishment for that, so you choose not to steal stuff. You're tempted, sure, but you just walk past the shinies instead of pocketing them, or restrain yourself from picking an NYC pedestrian's pocket or running off with her purse. You become not-a-thief, and that eventually becomes your true self.

Some of us are just naturally funny, but others have to work at it. The desire to work at it is just as authentic as the not-being-funny part.

What's the point of trying to improve yourself if you then get slammed for being "unauthentic?" A violent person may want to do the work to stop being violent. A pedophile may choose to deliberately avoid being around children. Is that not a good thing for everyone?

As for code-switching, are we supposed to wear the same clothes for lounging around the house, going to a gym, working, and attending a formal dinner? This is the same thing, but with personality.

Authenticity isn’t something you achieve. It’s what’s left when you stop trying. Yet, the more we chase it, the more elusive it becomes.

Well gosh, you know what that sounds exactly like, which I've harped on numerous times? That's right: happiness.

Culture shifts when enough people decide to show up as they are.

Naked with uncombed hair?

Hard pass.
April 7, 2025 at 9:25am
April 7, 2025 at 9:25am
#1086745
It's nice to be able to see through optical illusions, as this article from The Conversation describes. It would be even nicer to be able to see through lies and bullshit, but that's probably harder.



And I did find possible bullshit in this article, in addition to the slightly click-baity headline.

Optical illusions are great fun, and they fool virtually everyone. But have you ever wondered if you could train yourself to unsee these illusions?

I can usually see past the optical illusion once it's pointed out to me, or if I figure it out, but not always.

Now, it should be obvious that there are pictures in the article. They'd be a pain to reproduce here, and why bother, when I already have the link up there in the headline?

We use context to figure out what we are seeing. Something surrounded by smaller things is often quite big.

Which is why it's important to hang out with people smaller than you are. Or bigger, depending on the effect you're looking for.

How much you are affected by illusions like these depends on who you are. For example, women are more affected by the illusion than men – they see things more in context.

The article includes a link to, presumably, a study that supports this statement. I say 'presumably,' because when I checked this morning, the link wasn't working. So I can't really validate or contradict that assertion, but I do question the validity of the "they see things more in context" statement.

Young children do not see illusions at all.

The link to that study did work for me, and from what I can tell, it was about a particular subset of illusions, not "all."

The culture you grew up in also affects how much you attend to context. Research has found that east Asian perception is more holistic, taking everything into account. Western perception is more analytic, focusing on central objects.

None of which fulfills the promise of the headline.

This may also depend on environment. Japanese people typically live in urban environments. In crowded urban scenes, being able to keep track of objects relative to other objects is important.

Okay, this shit is starting to border on racism and overgeneralization. Also, the glib explanation is the sort of thing I usually find associated with evolutionary psychology, which reeks of bullshit.

However, what scientists did not know until now is whether people can learn to see illusions less intensely.

A hint came from our previous work comparing mathematical and social scientists’ judgements of illusions (we work in universities, so we sometimes study our colleagues). Social scientists, such as psychologists, see illusions more strongly.


And this is starting to sound like the same old "logical / creative" divide that people used to associate with left brain / right brain.

Despite all these individual differences, researchers have always thought that you have no choice over whether you see the illusion. Our recent research challenges this idea.

Whatever generalization they make, I can accept that there are individual differences in how strongly we see optical illusions. So this result, at least, is promising.

Radiologists train extensively, so does this make them better at seeing through illusions? We found it does. We studied 44 radiologists, compared to over 100 psychology and medical students.

And we finally get to the headline's subject, and I'm severely disappointed. 44? Seriously?

There is plenty left to find out.

I'll say.

Despite my misgivings about some of the details described, I feel like the key takeaway here is that it may be possible to train people away from seeing a particular kind of optical illusion. But it may be a better use of resources to train them to smell bullshit.
April 6, 2025 at 7:50am
#1086678
Once again, Mental Floss tackles the world's most pressing questions.

    Why Do So Many Maple Syrup Bottles Have a Tiny Little Handle?
It’s not for holding, that’s for sure.


Well, this one would be pressing if anyone in the US could still afford maple syrup.

Ideally, you’d be able to hold the handle of a maple syrup container while you carry it and also while you pour the syrup onto pancakes, waffles, or whatever other foodstuff calls for it.

Good gods, how big is your maple syrup container? I usually get the ones about the size of a beer bottle, which doesn't even require a handle. Or, you know, I used to, when we were still getting stuff from Canada.

But the typical handle on a glass bottle of maple syrup is way too small and positioned too far up the bottleneck to be functional in either respect.

So, why is it there?


Why is anything nonfunctional anywhere? Decoration, tradition, or for easy identification, perhaps.

The most widely accepted explanation is that the tiny handle is a skeuomorph, meaning “an ornament or design representing a utensil or implement,” per Merriam-Webster.

I'm actually sharing this article not to complain about trade wars, but because I don't think I'd seen 'skeuomorph' before, and it's a great word.

As the article goes on to note, it's apparently pretty common in software design. They use other examples, but here on WDC, we have a bunch of them. The magnifying glass for Search, the shopping cart (or trolley) for Shop, glasses for Read & Review, the gear icon for settings, and so on. I don't do website or graphic design, so I didn't know the word.

But there are plenty of skeuomorphs that don’t involve the transition from analog to digital life, and the useless handle of a maple syrup bottle is one of them.

I'd hesitate to call it "useless," myself. Obviously, it's not useful as a handle for carrying or pouring, but, clearly, it does have a purpose: marketing.

Here’s one popular version of the origin story: The little handle harks back to the days of storing liquids in salt-glazed stoneware that often featured handles large enough to actually hold.

Moonshine distillers, take note. (And yet, the article mostly debunks that origin story, as one might expect.)

Maple syrup manufacturers had started to add little handles to their glass bottles by the early 1930s. This, apparently, was a bit of a marketing tactic. “Maple syrup companies weren’t so much retaining an old pattern of a jug as reinventing it and wanting to market their product as something nostalgic,” Canada Museum of History curator Jean-François Lozier told Reader’s Digest Canada.

Like I said.

Perhaps one day, I will again have the opportunity to purchase delicious maple syrup. When I do, I'll be looking for the skeuomorph.
April 5, 2025 at 9:46am
#1086609
While Cracked ain't what it used to be (what is, though?), here, have a bite of this:

    5 Foods That Mutated Within Your Lifetime
We finally figure out what happened to jalapeños


It should go without saying that "mutated" is a bit misleading, but here I am, saying it anyway.

We know that companies keep tinkering with the recipes behind processed foods, changing nitrates or benzoates so you’ll become as addicted as possible.

Wow, that would suck, becoming addicted to food.

More basic foods, however, are more dependable.

And, of course, here's the countdown list to contradict that.

5 Brussels Sprouts

A couple decades ago, jokes on kids’ shows would keep saying something or another about a character hating Brussels sprouts.


Pretty sure it was more than a couple of decades ago. But the Brussels sprouts thing didn't stick in my memory. Broccoli did. Of course, as I got older and didn't have to eat them the way my mom overcooked them, I learned to like both. And when I got even older, I had my mind blown with the fact that they are the same species.

If you were around back then, you probably learned that Brussels sprouts tasted gross before you’d ever heard of the city of Brussels.

Having been to Brussels, I still don't know what they call them there. Sprouts, probably, or whatever the French or Dutch word for sprouts is, like how Canadian bacon is called bacon (or back bacon) in Canada, or French fries are called frites in Brussels because they're a Belgian invention, not French.

Unlike French fries, Brussels sprouts actually have a connection to Brussels. Well, not the city. It's hard to find extensive vegetable gardens in most major cities. But they were grown extensively in the surrounding countryside, or so I've heard.

Brussels sprouts used to taste bitter, but during the 1990s, we started crossbreeding them with variants that didn’t. When we were done, we’d bred the bitterness out.

There's an incident stuck in my head from several years ago, back when I did my own grocery shopping, so at least six years ago and probably more, when I sauntered up to a supermarket checkout counter with a big bag of Brussels sprouts. The cashier started to ring me up, but then she looked me in the eye and said, "Can I ask you a question?"

"Sure."

She held up the bag o' sprouts. "How can you eat these things?"

I was rendered speechless for a moment, but retained enough presence of mind to say "With butter and garlic." Or maybe I just sputtered, and then a week or so later, lying awake at night, I finally came up with a good comeback, and edited my memory to make me look better.

Turns out there’s no moral law saying healthy stuff must taste bad.

Shhh, you can't say that in the US.

4 Pistachios

Pistachio nuts in stores used to always be red.


I don't think I ever noticed that.

Today, we instead associate pistachios with the color green, due to the light green color of the nuts and the deeper green color of the unripe shells.

I associate them with a lot of work and messy cleanup, but damn, they taste good.

3 Jalapeños

In the 1990s, the word “jalapeño” was synonymous with spicy.


Again, this is a US-oriented site. For many Americans, mayonnaise is too spicy, and anything else is way too spicy.

Today? Not so much. Maybe you’d call a habanero spicy, but jalapeños are so mild, you can eat a pile of them.

That's... not entirely true. It's actually worse than that; jalapeños have wildly varying levels of capsaicin, making it difficult to control the flavor of one's concoction when using that particular pepper.

Today, you might find yourself with one of the other many hotter jalapeño varieties, but there’s a good chance you’ll find yourself with TAM II or something similarly watery.

Which is why, when I want spicy peppers, I go with habanero or serrano. No, I don't use whole ghost peppers, but I do use ghost pepper sauce sometimes.

2 Sriracha Sauce

You know Sriracha sauce? Its label says that the primary ingredient is “chili,” and the chili pepper they use happens to be a type of jalapeño. At least it used to be, until some recent shenanigans.


I know it, and I sometimes use it, but my tongue refuses to pronounce it. It has no problem tasting it, though.

1 Apples

I don't think it would surprise many people to know that this iconic fruit has been selectively bred into hundreds of different varieties.

The most extreme victim of this aesthetics supremacy may be the Red Delicious apple. Today, it’s perhaps the most perfect-looking apple. It looks like it’s made of wax, and many say it tastes like it’s made of wax, too.

Nah, more like cardboard. I know what cardboard tastes like because I ate a pizza from Domino's once.

Buyers have started rebelling. If you aren’t satisfied with Red Delicious, you can try the increasingly popular Gala or Fuji apples.

On the rare occasions that I actually buy apples for eating, those are my choices, because they're tasty and they're usually available.

In summary, yeah, lots of foods have changed, and sometimes for the worse. What's remarkable isn't the change itself, but our ability to tinker with the genetics of what we eat. And we've been doing it for as long as we've been cultivating food. We can be quite clever, sometimes. But I question our collective taste.
April 4, 2025 at 10:04am
#1086555
I wanted to share this article because a) it's interesting and I have stuff to say about it and b) I wanted to show that even the most serious science communicators, like Quanta, sometimes can't help using a pun in the headline.

    The Physics of Glass Opens a Window Into Biology
The physicist Lisa Manning studies the dynamics of glassy materials to understand embryonic development and disease.


If you're anything like me, you're wondering what the hell glass and biology could possibly have in common. Well, that's what the article's for.

The ebb and flow of vehicles along congested highways was what first drew Lisa Manning to her preferred corner of physics...

I can relate. I still remember the epiphany I got back in engineering school when I realized that the equations of traffic flow are the discrete-math versions of the equations of fluid flow.
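For anyone curious, the parallel goes something like this. It's a sketch in my own notation of the standard conservation-of-vehicles idea, not anything from the article: fluid flow conserves mass, and traffic flow conserves cars.

```latex
% Continuum (fluid-like) model: density \rho(x,t), speed v, flow q = \rho v.
% Cars are conserved, just like mass in a fluid:
\frac{\partial \rho}{\partial t} + \frac{\partial (\rho v)}{\partial x} = 0

% Discrete analogue: count cars in road segments of length \Delta x
% over time steps \Delta t; what leaves one segment enters the next:
\frac{\rho_i^{\,t+1} - \rho_i^{\,t}}{\Delta t}
  + \frac{q_i^{\,t} - q_{i-1}^{\,t}}{\Delta x} = 0
```

Same bookkeeping, different granularity; that's the whole epiphany.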

But it wasn’t until after she had earned her doctorate in physics in 2008 that Manning started applying that enthusiasm to problems in biology.

I've noted before that, sometimes, an interdisciplinary approach can solve problems that a focus on one field cannot. Perhaps I'm biased because I'd rather know a little bit about a lot of things than a whole lot about one thing and nothing about anything else.

...she learned about what’s known as the differential adhesion hypothesis, an idea developed in the 1960s to explain how groups of cells in embryos move and sort themselves out from one another in response to considerations like surface tension. “It was amazing that such a simple physical idea could explain so much biological data, given how complicated biology is,” said Manning, who is now an associate professor of physics at Syracuse University. “That work really convinced me that there could be a place for this kind of [physics-based] thinking in biology.”

"Amazing," sure, but to me, it's not surprising. Complexity emerges from simplicity, not the other way around. And biology is basically chemistry which is basically physics, so even there, it should be no surprise that one field can inform the other.

She took inspiration from the dynamics of glasses, those disordered solid materials that resemble fluids in their behavior.

I'm going to digress for a moment, here. Glass has been described as a "solid liquid." When touring some historical site lo these many years ago—hell, it might have been Monticello—I heard a tour guide assert that being a solid liquid, glass flows very, very slowly under the influence of gravity, and that's why all these old windows are wavy and thicker at the bottom. This didn't sit right with me then, so I looked into it (this was pre-internet, so it involved an actual trip to an actual library). Turns out that no, glass is solid, period. It doesn't flow any more than rocks do, assuming ordinary temperatures (of course it flows when heated enough to change phase). The waviness of pre-industrial glass is a result of its manufacturing process, and apparently, they'd often install the panes with the thicker bits at the bottom, for whatever reason.

Point is, people confuse "glass resembles a fluid" with "glass flows, albeit very slowly." Which is understandable, though really, tour guides should know better. The reason we say glass is fluid-like is that most solids have a crystalline structure of some sort, at the atomic level. But glass does not; its atomic structure is disordered.

I mention all this in case someone's still got that idea in their head that glass is a slow-moving liquid; the article doesn't make it clear (see, I can pun, too) until it gets into the interview portion.

Manning found that the tissues in our bodies behave in many of the same ways. As a result, with insights gleaned from the physics of glasses, she has been able to model the mechanics of cellular interactions in tissues, and uncover their relevance to development and disease.

Unlike the relatively simple ideas about the atomic structure, or lack thereof, of various solids, the connection to biology is beyond me. The rest of the article is, as I said, an interview, which I'm not quoting here. While I can't pretend to understand a lot of it, I can appreciate her multidisciplinary approach and how insights from one branch of science can illuminate problems from another branch.

Incidentally, I find it helps to use a similar approach to writing. Because as much as we like to categorize things, the boundaries tend to blur and become fluid. Like the view through an 18th century window pane.
April 3, 2025 at 10:20am
#1086501
Almost everyone I know, when starting to read the headline from this Guardian article, would blurt out "forty-two!"



They'd be wrong, though. Forty-two is the "Answer to the Ultimate Question of Life, the Universe, and Everything," as revealed by the great prophet, Douglas Adams. Says nothing about "meaning."

As this article is an ad for a book, I conclude that the author's actual Meaning of Life is to sell as many books as possible. But in doing so, at least he includes others' points of view, opinions from those who probably aren't trying to sell a book.

In September 2015, I was unemployed, heartbroken and living alone in my dead grandad’s caravan, wondering what the meaning of life was.

And it never occurred to you that being broke, depressed, and trapped in a tin can with your dead grandpa might actually be the meaning of life? See, this sort of thing is why we push people to have jobs. Not so they'll have money, but so they'll be too busy to contemplate philosophical questions.

What was the point to all of this?

Apparently, selling books.

Like any millennial, I turned to Google for the answers.

Aw, this was too early. Try that now, 10 years later, and an AI bot will confidently and definitively answer your question. Or so I assume. I'm not going to try it, because I might not like the response. Or, worse, I might like it.

I trawled through essays, newspaper articles, countless YouTube videos, various dictionary definitions and numerous references to the number 42...

I told you 42 would be involved. It's a red herring. To be fair, so is everything else.

...before I discovered an intriguing project carried out by the philosopher Will Durant during the 1930s.

The problem with letting philosophers have a go at this question is that none of them, not a single one, has a sense of humor (or humour, as this is The Guardian). And any answer to "What is the meaning of life?" that doesn't involve humor in some way is categorically and demonstrably false. We have a different name for philosophers with a sense of humor: comedians.

Durant had written to Ivy League presidents, Nobel prize winners, psychologists, novelists, professors, poets, scientists, artists and athletes to ask for their take on the meaning of life.

See? Not a single comedian in the bunch. In the 1930s, there were any number of humorists he could have polled, many of whom are still revered. The Marxists, er, I mean, the Marx Brothers had been active for at least a decade. There was a Laurel and a Hardy. The Three Stooges got their start in the late 1920s. I'd want to hear their answers. Nyuk nyuk.

I decided that I should recreate Durant’s experiment and seek my own answers. I scoured websites searching for contact details, and spent hours carefully writing the letters, neatly sealing them inside envelopes and licking the stamps.

I can almost forgive the low-tech throwback of writing letters, folding them into envelopes, and sending them through the post. What I don't get is stamp-licking. Here in the US, stamps have been peel-and-stick for decades; is it that different in the UK?

What follows is a small selection of the responses, from philosophers to politicians, prisoners to playwrights. Some were handwritten, some typed, some emailed. Some were scrawled on scrap paper, some on parchment. Some are pithy one-liners, some are lengthy memoirs.

When I saved this article (not that long ago), I had in mind to quote at least some of the responses. But upon reflection, I'm not going to do that. The answers are as varied as the people giving them. Some are non-answers. Some contain the barest glimmers of a sense of humor. Some are highly specific; as a trained engineer, I could very easily assert that designing systems that work to make people's lives better is the ultimate meaning of life, or, as an untrained comedian, I could just as easily state that the meaning of life is to laugh and to make others laugh. Or I could just say "cats."

The point is, the answer is different for everybody, and even for one individual at different points in life. For some, perhaps even this author, the meaning is in the search. For others, there is no meaning; this can be horrifying or liberating, depending on one's point of view. In my more literal moments, I assert that the meaning, or at least the purpose, of life is to generate additional entropy, thus accelerating the inevitable heat death of the Universe.

Mostly, though, I don't concern myself with meaning or purpose. A Jedi craves not these things. It's enough for me to occasionally sit outside on a nice day, listening to music and drinking a beer.
April 2, 2025 at 8:54am
#1086432
In my ongoing quest to look at word/phrase origins, I found this explanation from Mental Floss, though I felt no urgency to share it.

    Why Does ‘Stat’ Mean “Immediately”?
It was originally a medical thing—here’s why.


Well, I thought it was pretty common knowledge that it came from the medical field, but I've been surprised many times by what I thought was common knowledge that turned out to not be.

The reason stat is short for statistics needs no explanation.

Yeah, it kind of does. Because 'stat' is short for 'statistic,' and 'stats' is short for 'statistics,' at least in my country. The one thing about British English that I actually find superior is that they shorten 'mathematics' to 'maths,' while we use 'math.' If stats are statistics, why is math mathematics? Many things in language make little sense, and this is one of them.

But that's not the 'stat' we're talking about.

Stat simply means “immediately.”

And has the advantage of one short, sharp syllable instead of an unwieldy and tongue-time-consuming five.

You sometimes see it written in all caps, STAT, which could either be to add extra emphasis or because people assume it’s an acronym.

Amusing thing: like many people, I have an ongoing prescription for a cholesterol-controlling medicine. My doctor's office, affiliated with the university here, has a computer system that always capitalizes STAT. Consequently, the prescription is for AtorvaSTATin.

It’s possible that the all-caps custom is influenced by the fact that ASAP basically means the same thing and is an acronym (for as soon as possible).

It's also possible that they just want it to stand out on reports for other medical professionals. "We need an X-ray of this leg stat" might be overlooked, but "We need an X-ray of this leg STAT" adds emphasis to the urgency.

But stat is not an acronym: It’s an abbreviation for the Latin word statim, also meaning “immediately.”

Oddly enough, 'immediately' is also a Latin derivative, but it appears to share its Latin root with 'mediate' and 'medium.' I don't have a good source for this, but I suspect the 'im-' prefix negates the 'mediate' root, conveying a sense of urgency as opposed to moderation. Like with 'immobile' or 'imprecise.'

When stat first entered the English lexicon in the early 19th century, it was used by physicians clarifying that a drug or procedure should be administered immediately.

Early 19th century? "Give me that jar of leeches, stat!" "Trepanning drill, stat!"

Medical professionals still use stat today, sometimes to differentiate a medication that must be administered immediately from two other types of medication orders. There are scheduled ones, which “are typically utilized for medications that are designed to give a continuous effect over a certain period of time (e.g. antibiotics),” per a 2016 article in Pharmacy Practice; and PRN orders “for medications that are to be given in the event of specific signs or symptoms (e.g. analgesics and antipyretics for pain and fever, respectively).” PRN is Latin, too: It stands for pro re nata (literally, “for the affair born”), meaning “as needed.”

There's a brewery near me called Pro Re Nata, and the R in their sign has the little x cross on the tail that signifies 'prescription.' I find this amusing. Their beer isn't bad, either.

Next time I go there, I'll be like, "Pint of brown, STAT!" Though I'll have to pronounce it carefully, or they might think I'm ordering stout. Not that there's anything wrong with that.
April 1, 2025 at 10:18am
#1086338
I know what day it is, but I'm just going about my business, here. This bit is from HuffPo, which I don't usually read, but this one caught my attention.

    I Moved Abroad For A Better Life. Here’s What I Found Disturbing During My First Trip Back To America.
“The hardest part wasn’t seeing these differences – it was realizing I could never unsee them.”


Well. Okay. I guess some people need to push outside their envelope to see what's inside it.

When I left America last spring for a safer home for my family and a better quality of life, I thought the hardest part would be adapting to life in the Netherlands.

It's nice to have the privilege to just up and emigrate somewhere, isn't it? Like, if you don't like your life in whatever country you're in, boy it sure would be nice to have another country you can go to where you're not treated like something lesser or illegal.

“We just hired Riley’s college consultant,” my friend Jackie mentioned casually, sipping her drink. “Five thousand for the basic package, but you know how it is these days. Everyone needs an edge.”

"Everyone needs an edge." Yeah. Think about that for a moment. When everyone gets an edge, nobody gets an edge. Or, perhaps, people able to drop five grand on the edge end up winning, which perpetuates the whole cycle of economic disparity.

How could I explain that everything — from the massive portions before us to the casual acceptance of paying thousands to game the education system — suddenly felt alien? That I’d spent the past eight months in a place where success wasn’t measured by the size of your house or the prestige of your child’s college acceptance letters?

Congratulations; you've achieved an outsider's perspective.

The Dutch principle of “niksen” ― the art of doing nothing ― replaced our American addiction to busyness.

We had him once, but he was forced to resign.

Okay, bad Nixon pun. Seriously, though, how could you not see the problem when you were living here? Too busy, I guess.

Living abroad hadn’t just changed my zip code — it had fundamentally altered how I viewed success, relationships and the American Dream itself. In the Netherlands, I’d learned that a society could prioritize collective well-being over individual achievement.

But that's... that's... soshulizm!

What I’ve learned is that feeling like a stranger in your own country doesn’t have to be purely painful — it can be illuminating. It shows us that another way of life isn’t just possible, it’s already happening elsewhere.

I don't mean to be mean, but I've spent comparatively little time abroad and didn't need to spend any to figure out that what passes for culture in the US is fucked.

Some people really do thrive on it, though, and it's good to have choices.

Maybe we need more people willing to step outside the fishbowl and then return with fresh eyes. Maybe we need more voices saying, “This isn’t normal, and it doesn’t have to be this way.”

And maybe some people can figure it out without having to spend a year living in another country. Because not everyone can do that.

So, I hope you haven't spent this entire entry looking for an April Fools' prank. If you did, now is when I reveal that the only prank is that there was no prank. April Fools!
March 31, 2025 at 10:55am
#1086267
Someone, a week or two ago, asked me something related to the perpetual belief that people do more crazy shit during a Full Moon than at other times during the lunar cycle. This has been a belief for a very long time, and we even have the word 'lunatic' to describe the phenomenon, so there must be something special about the Full Moon, right?

    Does the Moon Affect Humans?
Yes, the moon and its lunar cycles can impact you — but for other reasons than you may think


Well, the only thing I can think of that's special about the Full Moon is the amount of light we see. In the time before electric lights, this would have effectively extended the time when one could see well outdoors. More activity can lead to more perceived instances of people acting weird, because people have acted weird since there were people.

Others, however, have, both historically and well into the current era, ascribed a more mystical connection to it. This is, I think, akin to the Bermuda Triangle mythology: that particular stretch of ocean has been perceived to be especially mysterious and prone to make ships and planes disappear; but, as it turns out, when you compare that area to other places with similar traffic, there are no more or fewer disappearances in the BT than elsewhere.

And this is why we use science and statistics.

For centuries, the moon and how it affects human behavior has been at the center of mythology and folklore around the world. The very word “lunacy” dates back to the 15th century when it was believed the moon and its phases could make people become more or less aggressive, depending on its place in the lunar cycle.

So, I see four different possibilities:

1. The Full Moon causes people to do crazy shit, for some mystical reason;
2. People do more crazy shit during a Full Moon for some rational reason;
3. People don't do more crazy shit during a Full Moon; observation bias (as with the BT) makes people think it happens more then;
4. People think there's a link between Full Moon and Crazy, so they let their inhibitions loose, and it becomes self-fulfilling.

Okay, 4 may be a subset of 2. I'm pretty sure regular readers already know I've ruled out #1. But I'm willing to keep an open mind. That's the only way we learn stuff.

But then, of course, there are lesser stories that hold a darker tone — haunting tales of werewolves whose transformation is dependent on the full moon.

It occurred to me the other day that, canonically, vampires shun sunlight and only come out at night. But moonlight is reflected sunlight, so maybe, just maybe, a Full Moon doesn't provide enough sunlight to fry a vampire, but just enough to turn them into a werewolf.

I don't think anyone's written about that yet, so don't steal my idea; I may use it.

When you set aside superstitions and longstanding myths, is there any scientific truth behind the way the moon bewitches us? Psychologist Susan Albers, PsyD, walks us through some of the research that’s been done on lunar cycles — and why we may just be changing our behaviors based on independent psychological reasons, instead.

All organisms conduct natural biological cycles for survival. When we talk about biological cycles, we probably most often think of our circadian rhythm — our bodies’ internal 24-hour sleep-wake cycle — and infradian rhythms (cycles that last longer than 24 hours) like the 28-day menstrual cycle or seasonal affective disorder (SAD).

Couple of things here. First of all, this is my introduction to the term "infradian rhythm," and I'm both happy to learn a new word and angry that it's taken me this long to discover it.

Second, it's long been noted that the menstrual cycle is similar to the lunar cycle. It's right there in the name; 'menses,' 'moon,' and 'month' share a PIE root. Whether this is coincidence or causation is outside the scope of this entry, but if there were a causal link, you'd think everyone who menstruates would do so at the same time, but, as far as I know, they don't.

And since our human bodies are made up of 55% to about 78% of water, there’s some reason to believe we, too, might be impacted by the moon, its light and its 27-day lunar cycle — especially when you consider the moon’s gravitational pull on the earth is powerful enough to affect the ocean tides.

Here's where I start to really question the source material. For starters, the lunar cycle, from Full to Full or New to New, is about 29.5 days, not 27. I think the 27 comes from the Moon's orbital period, which is shorter because the Earth is simultaneously orbiting the Sun. But we're talking about Full Moons here, not the Moon's orbital return to a certain location against the stellar background, so 29.5 should be the operating number.
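For the record, the two numbers are related by a simple bit of arithmetic, using standard orbital values (my aside, not anything from the article): subtract the Earth's angular progress around the Sun from the Moon's angular rate, and the sidereal month stretches into the synodic one.

```latex
% T_{sid} \approx 27.32 days: one orbit relative to the stars.
% T_{syn}: Full Moon to Full Moon, longer because Earth keeps moving
% around the Sun in the meantime.
\frac{1}{T_{syn}} = \frac{1}{T_{sid}} - \frac{1}{T_{yr}}
                  = \frac{1}{27.32} - \frac{1}{365.25}
\quad\Rightarrow\quad T_{syn} \approx 29.53 \text{ days}
```

So 27-ish days is a real number; it's just the wrong one for talking about Full Moons.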

Also, I've seen this comparison to tides before. Oh, we're mostly water, so we're also affected by tides? I call bullshit. There's nothing magical about water that makes it special for tides. The ground is subject to tidal forces, too, though with a much smaller effect. If this weren't the case, the Moon wouldn't be tidally locked to Earth, showing us the same side at all times. Point is, though, tides are basically caused by a different size gravity vector on one side of an object than another. Humans are quite small compared to a planet (or moon); I find it extraordinarily unlikely that gravity is involved, especially when there's an even bigger difference in gravity when the Moon is closer or further away in its elliptical orbit. I mean, do puddles experience tides?

And let me digress on that "elliptical orbit" point for a moment: every time the Full Moon occurs near lunar perigee, my news feed gets inundated with articles about the impending Supermoon. I don't mind much; at least it gets people looking at the sky. But perigee changes from lunar month to lunar month; it doesn't always occur at a Full Moon. If there's an effect based on proximity, some force that makes people do crazy shit when the Moon is closer, it should happen every lunar month at a different phase, not be associated with the Full Moon. The Sun might present a confounding factor, but even there, the effect should be similar at a New Moon and a Full Moon, and less at the quarter-phases.
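And the numbers back this up. Here's a rough back-of-the-envelope sketch in Python, using standard values for the Moon's mass and orbital distances (nothing here is precise; it's orders of magnitude that matter):

```python
# Tidal (differential) acceleration across an object of size L at distance d
# from a mass M scales as 2*G*M*L/d**3. Compare the pull across a 2-meter
# human with the swing in the Moon's distance between perigee and apogee.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_moon = 7.35e22       # kg, mass of the Moon
d_mean = 3.84e8        # m, mean Earth-Moon distance
d_perigee = 3.63e8     # m, roughly
d_apogee = 4.06e8      # m, roughly

tide_on_human = 2 * G * M_moon * 2.0 / d_mean**3   # across a 2 m human
print(f"{tide_on_human:.1e} m/s^2")                # ~3.5e-13 m/s^2: utterly negligible

# Since tidal force scales as 1/d^3, perigee vs. apogee changes the effect by:
print(f"{(d_apogee / d_perigee)**3:.2f}x")         # ~1.40x, regardless of phase
```

A differential acceleration of a few times 10^-13 m/s² is trillions of times weaker than the jostling you get from standing up, and the perigee-apogee swing dwarfs any phase-related difference.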

Okay, back to the article.

“Any research that’s been done has been considered controversial, in part, because studies on humans are conflicting,” says Dr. Albers. “In most cases, when there’s been discussion of the moon’s effect on humans, it’s been anecdotal.”

And it's also controversial, I'd hazard to guess, because anyone who tries to study it immediately gets labeled a lunatic.

Ask anyone about how a full moon affects our lives and you’ve probably heard stories about birth rates climbing, an increase in emergency room visits and an uptick in crime. As this review points out, there seems to be no correlation between the lunar cycle and those things.

Well, there you go. The answer.

Okay, no, I'm kidding.

But, nothing happens in a vacuum.

The Moon's orbit does!

Some studies have shown a possible correlation between the moon and human activity.

At least they were careful to use "correlation."

The rest of the article is the "More research needs to be done" section, including a lot of discussion about what I hinted at up above: the self-fulfilling aspect of all this. (I'm going to utterly ignore the final section, which is about maintaining a positive, upbeat, and optimistic outlook, at which point I said "bite my ass" out loud.) When I was a kid, and way further into adulthood than I care to admit (okay, right up until the present moment), every time I'd spot a Full Moon, I'd howl like a wolf. The Moon doesn't make me howl; that's something I decided to do after reading too many werewolf stories.

There is one thing I feel certain about, though: life is a lot more interesting with a Moon in the sky. And I don't need to be a mystic or a poet to see the beauty or insanity of it all.
March 30, 2025 at 9:54am
#1086212
When I was a kid, I mean a really young kid, after the dinosaur hunt but before the dinosaur meat feast, long before there was a Google or a Wikipedia, I had to rely on my parents for answers to life's important questions. Like, "Why's it called a vanilla envelope? Is it because you lick it?"

Well, my parents set me straight on the spelling, and even told me what Manila was (Dad, as a sailor, had been there). But it took until I found this article from Mental Floss to help me finally get closure on this subject. (Closure? Because it's an envelope? No? Yes? Tough crowd.)

    Why Is It Called a “Manila” Envelope?
Manila envelopes carry a few secrets


The days of getting important documents in the mail instead of a PDF may be waning, but there’s still plenty of mileage left in the Manila envelope. The oversized, heavy-duty enclosures can send and store everything from contracts to insurance policies to incriminating blackmail.

Ah, yes, the Official Packaging of Compromising Photos.

But why are they called “Manila” envelopes? Does the name refer to the Philippines? And if so, how did that come about?

Betting colonialism was involved.

American stationery companies were experiencing supply shortages in the 1830s. Cotton and linen rags, which were used to produce paper pulp, were growing scarce. To keep production up, papermakers turned to the Manila rope typically found on ships.

Oh, thanks. That's helpful. Manila envelopes from Manila rope.

In contrast to cotton and linen, Manila rope was derived from Manila hemp—an extremely strong and durable material sourced from Manila, or abacá, plants native to the Philippines (hence it being named after the country’s capital, Manila).

Now, see? That's helpful, and I'm not being sarcastic this time.

Manila rope that was too frayed to remain in use could be recycled rather than discarded, making it a thrifty resource.

This rope walks into a bar and sits down. Bartender goes, "Sorry, we don't serve ropes here." So the rope sighs and walks out. He ties himself into an overhand knot and musses up one of his ends, then heads back into the bar. "Say," says the bartender, "Aren't you that rope who I kicked out a few minutes ago?" "No, I'm a frayed knot."

Despite Manila fibers being their main component, it took a while for the term Manila envelope to catch on. The Oxford English Dictionary cites the first use of the phrase in print in 1889, when printer Barnum and Co. professed to “make a specialty of large Manilla [sic] … envelopes.”

One wonders what they were called before then.

Exporting Manila hemp should have been lucrative for the Philippines. Instead, colonialism got in the way.

Hey, look, I was right. Okay, I cheated. But I was still right.

Manila was phased out of most paper manufacturing over time, with wood pulp growing both more readily available and far less expensive.

No. Let's not gloss over the demonization of the hemp plant in general, regardless of THC content. If we'd kept on making paper from weeds instead of trees, maybe we'd be in less of a mess right now. But no, timber companies had a better lobby, and hemp became an early casualty of the War on Drugs. Or at least the run-up to it.

So that's the origin story, signed, sealed, and delivered in a golden-brown package.
March 29, 2025 at 10:25am
#1086164
I have another 2019 article that came up today, this one from MEL. While I only ran across this particular piece recently, it supports what I've been saying (without much to back me up) for years:

    If Happiness Weren’t the Goal, We’d Have Much Better Mental Health
‘It’s not that you’re too sad. It’s that you’re trying too hard to be happy.’


I think I phrased it differently, though, like "happiness is what happens when you're pursuing other goals." Or maybe I just came up with that. You know, 'happy' and 'happen' share a linguistic connection: they derive from Old Norse 'happ,' which meant luck or chance; we still use it that way in words like 'mishap' and 'happenstance,' and even in 'perhaps.' Etymology doesn't dictate definition, but in this case, it might give us a clue to the cognitive problem: we can do things to shape our own luck, but there's always that random element (like in yesterday's entry).

Eric G. Wilson woke up one morning in 2002 and realized he wanted to die. Though he had what seemed on paper to be the perfect life — beautiful wife, newborn baby girl and professional success as a writer and academic — he was deeply depressed and debilitatingly so, a feeling that was worsened by the fact that nothing he did seemed to help.

If you pursue what is, "on paper," the perfect life, that may not be aligned with your own goals. For example, having a newborn would be my personal Hell. And I thought I wanted professional success as a writer, but upon reflection, that might have constrained me in ways that would feel stifling.

On the other hand, sometimes depression just hits. It 'happens,' as it were. I'm no mental health professional, so don't just take my word for it, but from personal experience, it can occur regardless of whether your life seems objectively good or bad.

Luckily for him, help came in the form of a good psychotherapist whose surprisingly simple advice would alter the course of his life. “You know what your problem is?” he asked Wilson during one of their earlier sessions. “It’s not that you’re too sad. It’s that you’re trying too hard to be happy.”

Still, I wouldn't interpret that as general advice for everyone with depression. If the shrink was as good as the author says, then presumably, they reached that conclusion only after some sessions.

Yeah, this is what me being skeptical about a statement I generally agree with looks like.

Happiness is nice, his psychotherapist told him, but when it’s viewed as constant pleasure and fulfillment, it can be an unrealistic standard to live by. For many people, it invites failure by setting the bar at an impossible height, one that can rarely be reached for any appreciable amount of time through expected avenues like marriage, kids, professional success or material gain.

And, like I said, what makes someone else happy won't necessarily make you happy. Like, I know a lot of people derive a good bit of pleasure from keeping dogs around, but for me, they're too much work, too demanding, and interrupt me with their barking. This does not mean I dislike dogs. I like them, but I know I'd be miserable taking care of them all the time.

In other words, sometimes trying to be happy can make you pretty damn depressed. But if you lean into your dark parts and let go of happiness as your ultimate goal, you might actually get somewhere. And get somewhere Wilson did — this “extremely liberating” realization led him to write Against Happiness, a best-selling book which presents the unpopular opinion that striving too hard for joy and contentment can do a lot more harm than good.

You know, I'm starting to be disappointed whenever one of these articles doesn't plug a book.

“A lot of people don’t feel good about themselves because they have unrealistic expectations about what the good life should be,” Wilson says. “There’s this [well-documented] message that if you’re not happy, there’s something seriously wrong.”

Actually, for people striving to meet the manufactured levels of glee spewed at us by social media and commercial advertising, that might be the case.


Advertising is part of the problem, as I see it. I'm not averse to ads for books in here, but the whole thrust of advertising is to convince us that we're missing something, and that the only way to gain some measure of satisfaction is to Buy This Product. And yes, that includes books, especially self-help books.

The irony isn't lost on me. But irony often makes me happy.

And although Americans spend more time and money chasing after happiness than any other country in the world, we’re still one of the most anxious and least happy of all the developed nations.

And that was before, you know, *gestures at everything in general*

Given that knowledge, it’s not surprising that we medicate at unprecedented rates and spend tens of billions of dollars per year buying into the products and services the commercialized self-help industry promises will make the boo-boo of sadness go away. Pretty ironic for a country that has the “pursuit of happiness” forever emblazoned into its Constitution, no?

I'm not perfect by any means. I make mistakes. Demanding perfection of myself is a good way to get me to freeze up, do nothing, and end up unhappier because I missed a deadline or broke a promise. Consequently, I don't demand perfection from others, either. But, come on. That quote is from the Declaration of Independence, not the Constitution. It's not like it's an integral part of the law of the land: "You must try to be happy!" No, it was just rhetoric aimed at pissing off the King and rallying colonists.

There has to be something else to live for other than the impossible dream of 24/7 glee, which Simon-Thomas stresses, is an entirely different thing than genuine happiness. After all, most of us need something to chase in order to get out of bed in the morning — otherwise, we’d just stay there and eat toast for 20 to 40 years until we die.

Okay, now they're attacking me personally.

Oscar Wilde, for one, believed humanity’s highest purpose had nothing to do with joy, but should instead be focused on self-expression and a gradual “intensification of personality.” Meanwhile, Buddha taught that the goal of life should be to have no goal at all; that is, you don’t necessarily need to pursue anything, you should just “be.” And if you’re Freud — who believed the world was imperfect and attaining happiness was impossible despite it being something “all humans strive for” — the highest goal you should reach for is none other than, drumroll please, sexual pleasure (for men, at least … bastard.)

Really? Those are the philosophers you're going with here? Wilde, the Buddha, and Freud? Well, at the very least, we can safely dismiss anything Freud had to say.

If none of those happiness alternatives work for you, there’s always Wilson’s trusty companion: melancholy. “Melancholy means acknowledging that life is mostly suffering and that perpetual happiness is rare, if not impossible,” he explains. “Accepting that helps us see that there’s no one way life ‘ought’ to be, and that we’re all just doing the best we can.” In other words, when we realize we can exist as imperfect beings whose daily lives don’t always reflect the prefabricated glitz of Instagram filters or the antiseptic cheerfulness and quick fixes marketed to us, it’s much easier to embrace the same flaws that give us so much grief.

Or, and hear me out here, how about the joys of pessimism and schadenfreude? A pessimist can only be pleasantly surprised; if they're not surprised, then they were right, and that feels good, too. As for schadenfreude, no, I don't take pleasure in others' misfortune—unless I feel like they deserved it.

But, you know. Melancholy works too, and has the advantage of being able to be turned into comedy.

Anyway, you're better off reading the article (glaring mistakes about US founding documents aside), not paying attention to me. Like I said, not a professional. If you're seriously depressed, see one. If you're just freaked out because you think you should be happier than you are, well, maybe the article will help with that. But look out for the marketing.
March 28, 2025 at 10:39am
#1086117
Yes, today's article is from Quanta. Yes, it talks about mathematics. You didn't really expect that to stop when I changed blog themes, did you?

    Proof Finds That All Change Is a Mix of Order and Randomness
All descriptions of change are a unique blend of chance and determinism, according to the sweeping mathematical proof of the “weak Pinsker conjecture.”


I've accepted the "mix of order and randomness" thing for a long time (my method for selecting articles to feature here uses just such a combination), but it's nice to know there's a formal proof—even though I don't understand it.

This particular article is from 2019, so something might have superseded it by now. I don't know.

Imagine a garden filled with every variety of flower in the world — delicate orchids, towering sunflowers, the waxy blossoms of the saguaro cactus and the corpse flower’s putrid eruptions. Now imagine that all that floral diversity reduced to just two varieties, and that by crossbreeding those two you could produce all the rest.

I'm sure there's a better metaphor for this idea.

That is the nature of one of the most sweeping results in mathematics in recent years. It’s a proof by Tim Austin, a mathematician at the University of California, Los Angeles. Instead of flowers, Austin’s work has to do with some of the most-studied objects in mathematics: the mathematical descriptions of change.

And, again, so far beyond my own knowledge that it might as well be orbiting Betelgeuse. That's why I read stuff like this. Though it's hard to be skeptical when you don't have the necessary background to ask the right questions.

These descriptions, known as dynamical systems, apply to everything from the motion of the planets to fluctuations of the stock market.

I can understand being skeptical about this sentence, though. Planets are predictable, right? Like, we know when the next transit of Venus will occur, and when Jupiter aligns with Mars. The stock market is the antithesis of predictable; even weather forecasts are more accurate than stock market speculations.

And yet, both are chaotic systems (so is weather). It's just that the planets' orbits become unpredictable only over a much longer time frame.

Wherever dynamical systems occur, mathematicians want to understand basic facts about them. And one of the most basic facts of all is whether dynamical systems, no matter how complex, can be broken up into random and deterministic elements.

I'm not entirely sure, but I think this means that true randomness really does exist. I'd been contemplating that question for a long time. Even dice rolls can be seen as deterministic, relying on initial state, the configuration of the hand that rolls them, and the surfaces they roll against and upon. Also, I have to remember, "deterministic" doesn't necessarily mean "predictable."

The article, of course, delves into more detail and contains helpful illustrations. There's nothing else in there that I really want to comment on. Again, I can't say I understood it all. But I think the reason I saved this article, lo these many months ago, is because it's not some high-flying proof unrelated to anything in the real world. As the article notes, it can apply to planetary motion and to stock market fluctuations. I added weather up there. But there's another real-world system that people get wrong all the time (though I can't claim that I get it right all the time), and that's evolution.

Evolution deniers have been known to look at organisms (such as humans) or organs (such as the eye) and assert "that couldn't have happened at random!" Their alternative, of course, is that some supernatural intelligence designed everything. But I'm not here to argue that, at least not today. In fact, I'd say that assertion, as far as it goes, is correct—ignoring extremely tiny quantum probabilities, anyway. Because it didn't happen at random. I've been saying for years that it's not random, but there are random elements (such as gene modifications) operating against a backdrop of physical constraints.

What I'm still unclear about, and this may be more in the realm of philosophy than math or science, is just how much randomness is actually in play in a given system. But, mostly, it gives me another chance to crow that John Calvin was Wrong. Some years ago, and I had to look it up, I wrote: "To me, as an outsider, 'divine will' is indistinguishable from random chance working through the laws of physics." And behold, here we have a supposed mathematical proof that random chance works through the laws of physics.

Look, I try to be skeptical about these things, to the extent that I question even those things with which I agree. But right now, at least on this subject, I'll just assume I'm right and there's science backing me up.
March 27, 2025 at 9:04am
#1086072
Well, let's see if the random numbers give us something other than food today. Ah, here we go, from The Guardian—and it's not about food, unless you mean "for the eyes."

    ‘We won’t come again’: dazed visitors fed up with overcrowded Louvre
Paris attraction in need of overhaul amid complaints of leaks, long waits, lack of signage – and too many people


Regular readers might remember that, as detailed in my previous blog, I spent a few days in Paris in a hotel just a few blocks from the Louvre. I walked through its gardens and listened to buskers, unironically complained about tourists (in my head, anyway), and decided not to go inside because I had a bum knee and heard it was crowded.

It was only after I'd been back for a while that this article came out.

As the crowds poured out of the Louvre, the look of dazed exhaustion on many faces confirmed what the museum’s director had warned last week: a trip to Paris’s biggest cultural attraction has become a “physical ordeal”.

A part of me has been kicking myself ever since (with the foot attached to the non-shaky knee). I mean, it's the Louvre, the most famous art museum in the fucking world. I'm not a huge art snob or anything, but I know the difference between a painting and a sculpture. I had a chance! I blew it! Then this came along and made me feel better about my decision.

Myriam, 65, a former secondary school science teacher had driven from Belgium with her husband to show their 12-year-old granddaughter the Mona Lisa. They left disappointed.

From what I hear, everyone's disappointed by that puppy. Smaller than you expect, and you gotta fight crowds. Pretty sure I mentioned that at the time.

They had squeezed through huge crowds on Monday to try to catch a glimpse of Leonardo da Vinci’s masterpiece, but found the room badly designed and with no proper flow of people.

So I understand where they're coming from, but it's the Louvre, not Eurodisney.

“There are so many people. Lots of rooms aren’t numbered. The staff are very friendly, but you feel they’re more there to show people the way than to protect the paintings,” said Myriam.

Friendly staff? In a Paris museum? Now I want to go back just to see the unicorn.

As the article notes, they're from Belgium, so it's likely they speak fluent French. I do not, though I can read the written language fairly well. It may just be Anglophone tourists that get the grumpy-Parisian treatment.

On Tuesday, the French president, Emmanuel Macron, will deliver a speech at the Louvre in which he is expected to unveil details of new investment, which could involve major overhaul – even a potential additional entrance.

This article is, as I noted, from a couple of months ago. Since then, other issues may have become more pressing priorities for the French government.

Another visitor, this one from actual Paris, gets quoted:

“The noise is so unbearable under the glass pyramid; it’s like a public swimming pool. Even with a timed ticket, there’s an hour to wait outside. I can’t do it anymore. Museums are supposed to be fun, but it’s no fun anymore. There’s no pleasure in coming here anymore. And to get out you’re made to walk the length of a shopping arcade to force people to buy things – commercial interests have taken over everything.”

From what I've seen, that's pretty standard these days. Hell, when I was leaving the airport there, the signage, in several languages, directed exiting passengers along a winding path through the shinies shop. There was, at least for arriving passengers, no other way out. American consumerism is bad enough, but I think we learned it from France.

The Louvre’s director, Laurence des Cars, warned in a damning note to the culture minister this month that the facilities were below international standards, the visits were not easy and involved long waits, and the building was in poor repair, including leaks and poor temperature controls.

Annoying as "leaks and poor temperature controls" are for human visitors and tourists, they're even worse for the art.

I did find this piece from CBS dated after Macron's speech. It looks like France can still walk and chew gum at the same time, as it seems they're planning some renovations. And yet, as the CBS article notes, they're funding this by raising admission prices for tourists.

Well, shit, all they had to do was raise prices to begin with, and then they'd have smaller crowds. Duh.
March 26, 2025 at 9:37am
#1086014
Back when I was in Belgium, at one point, I was on a tour where a guide pointed out one of the local landmarks: an "authentic" Philly Cheesesteak restaurant, complete with bronze statue of Rocky Balboa.

Don't get me wrong. I like a Philly cheesesteak as much as someone not from Philadelphia is allowed to. But American food in Belgium is like copper in a gold mine.

Except for the frites (fries), of course. We stole those from Belgium (not France) fair and square, so it's only right if they steal them back.

This brings me to the article that popped up today, something dated 2018 from Afar:

    The True Tale of the Philadelphia Cheesesteak
It all started with a hot dog.


I'm not going to take the article's word that it's a "true tale." Food history is notoriously complicated and mythologized, as with yesterday's bit about Worcestershire sauce. Still, the subhead about the hot dog effectively baited me in.

The cheesesteak I smell frying is nearly the same as the original born here 85 years ago: An Italian hoagie roll packed with thinly sliced rib eye and Spanish onions, both sautéed on a flat-top grill, sometimes with peppers and mushrooms. Cheese, whether you opt for provolone or the iconic Cheez Whiz—just “Whiz” in local parlance—holds the whole thing together.

I might have mentioned before that I tried to explain Cheez Whiz to someone in France, and while I expect looks of pity and contempt from the French, the utter disbelief and disdain radiating off of his face at what America did to cheese was hot enough to melt Cheez Whiz.

But ask anyone what really makes a cheesesteak and they’ll tell you it’s the roll.

I'm always happy to see someone share my point of view that bread is the only food and everything else is a condiment.

“You’re always hearing about how one particular cheesesteak place is the best, even though I probably make more in one day than they sell in a year,” says Olivieri. “But cheesesteak joints are like opinions. They’re everywhere, and all valid.”

If we didn't all have different tastes, there wouldn't be such an enormous selection of food and beverage to choose from. Pizza places would only serve one pie. Breweries would only make one beer. Boring. I'm sure it's fun to argue about which cheesesteak place is the best, though.

And opinions are like assholes: everyone has one, and most of them stink.

Only Pat’s can claim to be the original. The cheesesteak, Olivieri tells me, was invented in 1930 on this very corner by his grandfather Harry and his great-uncle Pat Olivieri.

Maybe. Maybe not. It's good marketing, though.

The duo worked as hot dog vendors in an open-air Italian market, and when times were good they would buy beef and fry it up with onions for their lunch. One day, a taxi driver asked if he could buy the sandwich instead of a hot dog. Pat offered to split it. The driver, smitten, advised the pair to sell them.

Hence the hot dog connection. One might ask, "but why didn't they just eat a hot dog?" Well, I know if I sold hot dogs all day, the last thing I'd want for a snack is a hot dog. This doesn't mean I believe the story; it just makes it more plausible.

It wasn’t until World War II, however, that the cheesesteak became a Philly emblem. A true showman, Pat started a rumor in the days of WWII rationing that his sandwiches contained horse meat, then, in mock outrage, offered a $10,000 reward for someone to prove it.

Now, that? That is brilliant marketing. Probably couldn't be done today, though, not with DNA sequencing as ubiquitous as it is. And sequencing produces false positives: "See? This report shows it truly is horse meat! Gimme my money." The marketing campaign would backfire, and you'd go bankrupt and get beaten to death by horse lovers.

While places like Pat’s continue to churn out the classics, the current cheesesteak scene reflects the city’s changing dining landscape. HipCityVeg makes a respectable vegan version. The cheesesteak even gets an haute touch at Barclay Prime, a swanky steak house, where Wagyu beef is tucked into a sesame roll and piled with foie gras and truffled cheese.

There is no food so iconic that someone hasn't come up with a gourmet version designed to better separate you from your money.

I should note that I never did go to the Philly cheesesteak place in Belgium. Not because I didn't trust it, but because I wanted to experience things we don't get in the US. If anything, I had some fear that the cheesesteak there would be so good that I could no longer eat the ones here, and that would be a real shame.
March 25, 2025 at 9:51am
#1085973
They say ignorance is bliss. I tend to disagree, but there are some things I feel like I was better off not knowing. Like the subject of this Epicurious article:

    The Murky, Salty Mystery of Worcestershire Sauce
The peppery sauce may be wildly popular, but its ingredient list and origin story are shrouded in secrecy.


It's also one of those things where people who know how to pronounce it inevitably look down their noses with disdain on those who don't. Kind of like quinoa or pho.

Culinarily ubiquitous and a perpetual tongue-twister, Worcestershire sauce is one of the great food enigmas of the past two centuries.

Yeah, I know I just did a food article a couple of days ago. Random is random.

Inky brown, sweet and salty, funky and fishy, peppery and piquant, the sauce’s exact ingredient list was kept secret ever since it was first sold in Worcester, England, in the mid-19th century.

There are at least two reasons to keep a recipe secret: to protect the business of making it, or because if you revealed it, people would be disgusted.

This might be a case of "both."

Nowadays, an ingredient list is mandated on most edible items, but one can hide a lot of nastiness under the cover of "natural and/or artificial flavors."

The mystery that originally shrouded Worcestershire sauce has continued to propel its popularity around the world.

Is it really the mystery, though? Or is it that it simply tastes good and helps bring out other flavors in food? Or, ooh, I know: it was the marketing.

In fact, I’m willing to bet you’ve got a bottle of the perennially popular stuff tucked into a corner of your kitchen cupboards at this very moment—perhaps purchased for a platter of deviled eggs, a weeknight meatloaf, or a classic Bloody Mary.

It's in the fridge, but yeah. Also, classic Bloody Marys are lame. Yes, I put Worcestershire sauce in mine. But the base is V-8, not tomato juice or Bloody Mary mix, which is apparently tomato juice with a few grains of seasoning.

The recipe for the original version, developed and sold by Lea & Perrins in the 1830s, remained a closely guarded secret until 2009, when the daughter of Brian Keough, a former Lea & Perrins accountant, disclosed that her father had purportedly discovered an ingredient list in a factory trash pile. That recipe called for water, cloves, salt, sugar, soy, fish sauce, vinegar, tamarind, and pickles, among other ingredients.

It's the "among other ingredients" that still worries me.

Incidentally, I don't believe the "found in a trash pile" story for one second. I could change my mind on that, but it smells like corporate myth-making to me.

Incidentally, "fish sauce" has always confused me. Is it sauce that's made from fish, or is it sauce for fish? Or both? I think one is supposed to just know these things, like how we know that olive oil is made from olives, but baby oil is made for babies.

But Worcestershire’s closest condiment cousin is probably garum, a fish sauce that was integral to the kitchens of antiquity. Made from the fermented and salted innards of oily fish like anchovies and mackerel, this umami-rich potion was used on its own as a table sauce and blended with other ingredients—such as wine, black pepper, honey—to create various dressings for meat, fish, and vegetables.

This is why my answer to the perpetual ice-breaker question of "if you could travel to the past or the future, which would you choose?" is "neither, but we know the past was disgusting, so if I had to choose, it'd be the future."

Anyway, the article goes into the sauce's history for a while, then:

Many of these references, and countless others, were compiled by William Shurtleff and Akiko Aoyagi in their History of Worcestershire Sauce from 2012.

Aha! This is a book ad, after all!

Now owned by Kraft Heinz, Lea & Perrins Worcestershire sauce still dominates American supermarket shelves—but just as in the mid-19th century, alternative versions proliferate.

Of course a giant conglomerate produces it now. One wonders what cost-cutting measures they inevitably took to make Worcestershire sauce as bland and uniform as pretty much everything else is, these days.

They probably kept the disgusting parts, though. Those tend to be cheap.
March 24, 2025 at 11:17am
#1085928
Today's article is old, ancient, even decrepit by internet standards. Nautilus dates it as 2013. But human nature hasn't changed much in 12 years, so here it is.

     Why We Keep Playing the Lottery
Blind to the mathematical odds, we fall to the marketing gods.


To grasp how unlikely it was for Gloria C. MacKenzie, an 84-year-old Florida widow, to have won the $590 million Powerball lottery in May...

I know you're thinking it. Okay, I was thinking it, and now you will be too: is she still kicking? The answer is no; she died four years ago, at the age of 92. That's a decent run; my dad made it that far. Well, there was the lawsuit she filed against her son for mismanaging the finances, but family issues are gonna issue whether you've got money or not; it's just that, with money, you get to hire the best lawyers and make your family dispute public.

The news likes to latch on to reasons why winning the lottery won't make your life better, maybe because it fits the whole Puritan "you gotta work for it" mentality, but the truth is more complicated: some jackpot winners do indeed crash and burn, and some go on to lives of happiness and contentment, but most people just go on being people, and people's lives have their ups and downs.

To continue from the article:

...Robert Williams, a professor of health sciences at the University of Lethbridge in Alberta, offers this scenario: head down to your local convenience store, slap $2 on the counter, and fill out a six-numbered Powerball ticket. It will take you about 10 seconds. To get your chance of winning down to a coin toss, or 50 percent, you will need to spend 12 hours a day, every day, filling out tickets for the next 55 years.

Honestly, I didn't check the math, but that sounds about right. There was a thing in, I think, Texas recently, where a group managed to spend considerably less than 10 seconds on each combination, and basically brute-force hacked the lottery. That can work, mathematically, when the jackpot is high enough and the lottery commission, or whatever, doesn't have safeguards in place. I think they do, now.
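For the curious, the professor's figure is easy to sanity-check. A minimal sketch (my own back-of-envelope, not from the article): if the odds are 1 in 175 million, filling out half of all possible distinct combinations gives you a coin-flip chance, and at 10 seconds per ticket, 12 hours a day, that works out to roughly the 55 years he quotes.

```python
# Sanity check on the claim: a 50% chance at 1-in-175-million odds
# means filling out half of all possible distinct tickets.
odds = 175_000_000             # 1-in-175-million jackpot odds
tickets_needed = odds // 2     # distinct tickets for a coin-flip chance
seconds = tickets_needed * 10  # 10 seconds per ticket
days = seconds / (12 * 3600)   # filling out tickets 12 hours a day
years = days / 365.25
print(f"about {years:.0f} years")  # → about 55 years
```

So the math checks out, at least under the assumption that every ticket you fill out is a different combination.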

Williams, who studies lotteries, could have simply said the odds of winning the $590 million jackpot were 1 in 175 million. But that wouldn’t register. “People just aren’t able to grasp 1 in 175 million,” Williams says.

This is not a slight on "people," any more than if you said "people just aren't able to fly without mechanical assistance." He offers up an evolutionary psychology explanation, which always turns me right off, but whatever the reason, very big or very small numbers just don't register with us emotionally. Also, very, very few people really understand probability or statistics, even when the odds can be counted on one's fingers and toes.

It may seem easy to understand why we keep playing. As one trademarked lottery slogan goes, “Hey, you never know.” Somebody has to win.

That's not strictly true when you're talking about lotteries with progressive jackpots. The whole reason they're "progressive" is that no one won the first round.

But to really understand why hundreds of millions of people play a game they will never win, a game with serious social consequences, you have to suspend logic and consider it through an alternate set of rules—rules written by neuroscientists, social psychologists, and economists.

Are we really suspending logic, though? Or are we looking at it logically from different perspectives?

The bulk of the article indeed looks at it from a different perspective: that of marketing, which combines psychology, economics, probability, intuition, aesthetics, and probably a few other disciplines I'm forgetting. I don't need to quote much else, but to summarize, the gist of it is that the lottery sells a dream. What you're paying for is intangible: hope, wishes, what-ifs. That's not necessarily bad, even though one could argue that we should be experiencing these intangibles without having to exchange something tangible (money) for them. For instance, it can provide some clarity on who and what really matters to us in life.

To be clear, though I know I've written similar things before, I'm not ragging on anyone who plays a lottery. That would be hypocritical of me; I haven't messed with a lottery in many years, but I do occasionally gamble. I do it for entertainment, and to me, a lottery is just not nearly as fun as blackjack or even shiny blinky noisy slot machines.

Other people do other things for entertainment. Someone might spend hundreds of dollars on Taylor Swift tickets or whatever, but they don't get judged as much as a gambler who spends the same amount in Vegas; they're just doing what they like and, at the end of the night, all they have is memories and maybe a T-shirt.

It's when you spend more than you can afford that the problem comes in, but that's a problem no matter what you're spending on.

Still, it's worth reading articles like the one here, I think, because it delves into that psychology, and maybe helps understand people's reasons. Those reasons are, as I noted, not entirely logical.

As remote as the odds of winning a lottery can be, there are odds even more remote. Practically no one blasts Rowling for being a billionaire (they blast her for other stuff, but that's irrelevant here), probably because the perception is that she "earned" it. She wrote some of the most popular books in history and, more importantly for the money angle, had them made into popular movies. But she happened to have the right idea at the right time and the capacity to execute it, the chances of which are roughly the same as those of winning a lottery. The only difference is she did more work than just filling out ovals on a slip of paper.

I know, intellectually and emotionally, that I have a greater chance of winning a lottery jackpot than of hitting it that big as a writer. And yet, I still write, and I don't play the lottery. Other people have different talents and opportunities, and some have none, so the lottery may be their only recourse.

It's just important, I think, to be aware of how we can be manipulated by marketing tactics. And by our own internalized thoughts about who deserves what.


© Copyright 2025 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
