Rated: 18+ · Book · Opinion · #2336646

Items to fit into your overhead compartment


Carrion Luggage


Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.

This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it instead locates uplifting columns of air, and spirals within them in order to glide to greater heights. This behavior has been mistaken for opportunism, interpreted as if it were circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.

It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.

It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."

I rarely know where the winds will take me next, or what I might find there. The journey is the destination.
March 25, 2026 at 10:49am
#1111491
Something a bit more serious today, from Nautilus.
     Why Middle-Aged Americans Can’t Find Happiness  
Welcome to the real midlife crisis in the US

I've said this before, I know, but I believe that a) happiness is overrated and b) you can't find happiness by looking for it.

That doesn't make the subject of this article (from January) any less real, though.

Middle-aged adults are... more lonely and depressed than previous generations, and their cognitive and physical health are suffering—at least this is the picture you get from studies done in the United States.

Much of the article is devoted to supporting this statement.

They found that U.S. residents are in a category of their own, as the midlife slide is mostly confined to the U.S. Middle-aged adults in many parts of Europe, Mexico, South Korea, and China aren’t suffering the same kinds of decline, showing either mixed patterns or the reverse.

This is what I found interesting about the study. All those regions have a similar standard of living to the U.S., and people everywhere are people, so that tells me there's probably something societal going on.

These patterns seem to be linked to a disintegrating safety net, rising stress levels, higher healthcare costs, and inequality in the U.S.

Something like those things.

I spoke with study author Frank Infurna, a psychologist at Arizona State University, about why middle-aged Americans are suffering so much, what we can do about it, and what gives him hope.

I don't remember what actor, character, or movie this is from, but the quote comes to mind: "Don't do that. Don't give me hope."

What I can't help noticing is how closely "Frank Infurna" resembles "Frank N. Furter," and I absolutely remember what movie he's from.

The rest of the article is in interview format, with the interviewee being, of course, Infurna.

If you look at the U.S. compared to Germany or Sweden, when it comes to family benefits, there’s parental leave in Germany up to 14 months, shared between both partners and at 70 percent pay. Childcare is subsidized. The education system is very well funded. And then it also expands to the workplace. In the U.S., our healthcare is tied to our employment, but if we lose our job, then chances are we lose our healthcare as well. So there’s nothing to really fall back on for those situations when you find yourself out of a job or trying to navigate being a parent.

I, of course, smugly sidestepped some of this by choosing not to be a parent. But I recognize that many people really want kids, and some of them seem happy about them. To each their own, of course, and while I wouldn't want to live in a world where no one wanted to have children, I also don't particularly want to live in a world where people have kids they didn't want.

People want to leave the nest, but they have no choice but to come back because housing prices have far outpaced wages. Also, many middle-aged adults, at least this generation, had kids later on. So you’re balancing the aging parents with kids who aren’t out of the house. And the parenting stressors are just incredible.

That sort of thing has long struck me as a stricture imposed by society. Multi-generational homes used to be the norm, not the outlier. It's only fairly recently that it's become society's expectation that adult children hie off on their own, and I suspect part of that is from propaganda. That is, marketing. Builders don't sell more houses, and banks don't initiate more mortgages, if one family stays in the same house.

There's a lot more at the article, of course. I suspect there's way more going on than merely economic stressors. It does, however, seem to contradict the myth that money doesn't buy happiness.

Still. I'd rather have money than happiness. Happiness doesn't pay the bills.

I suspect a lot of it is about expectations versus reality. If you don't expect much, your barrier for being happy about it is a lot lower. If you expect to be able to pay all your bills, take care of aging parents, raise kids, and have a fulfilling career, well, sure, some people can manage that. But not everyone can, and I imagine resentment starts to set in. Resentment makes no one happy.
March 24, 2026 at 10:43am
#1111423
Here's another writing-related one from Mental Floss. Look, I know they're listicle clickbait, but they're often amusing listicle clickbait. And sometimes even educational, provided you fact-check them.
     7 Weird Old Punctuation Marks We Should Bring Back  
Have you seen all seven before?

"Weird" is subjective.

We’re living in a time where we have more technology than ever for sharing our words, but we still constantly get misunderstood.

Hm. What do you mean by that?

One cause of this is that we simply don’t use enough punctuation in our text.

No.

In our "texts," maybe (though I make an effort to spell and punctuate correctly in texts. It's good practice).

People could stick exclamation marks at the end of every funny sentence, but they don’t, perhaps for fear of looking like they’re laughing at their own joke.

Something I learned yesterday: exclamation marks are British; exclamation points are USer. (This is like how another punctuation symbol is a period in the US and a full stop in the UK, but don't ask me about other Anglophone countries).

I read so much Brit-lit that I just figured those two were interchangeable, like gray and grey. Turns out those are regional variants, too.

Also, don't stick exclamation marks at the end of every funny sentence. It's tiring, and it's like writing in a laugh track. I can put up with the occasional emoji / emoticon, though.

Maybe we should fight this by using as much punctuation as possible. In fact, you can go even further and bring back some archaic punctuation marks that haven’t been used in decades.

Okay, except that if I gotta look up the ASCII code for it, that, too, gets tiresome. I've memorized most of the French accented letters, but I have to look up the degree symbol, for example, every damn time.

Except, of course, today, when I discovered I do have it memorized now: Alt+0176 yields °. Maybe it's easier on a phone, but I'm old-fashioned and prefer a laptop keyboard.

As before, the ALL CAPS headers are their fault; I'm just too lazy to change it when I paste them. Also, I'm pasting a few of the characters from the source; I have no idea how this text editor will handle them.

VINCULUM

You know, of course, about underlining. It means sticking a line under a word for emphasis, and the concept is so famous that “underlining” became a word that refers to emphasizing stuff even when we aren’t talking about literal lines.


On typewriters (that's how old I am: I learned to type on a manual typewriter, moving to electric, moving to proto-word-processors, and thence to computers), italics weren't an option. There was lower case and upper case, and some punctuation. Hell, on my first typewriter (actually my dad's), you couldn't even do an exclamation point. Mark. Whatever. You had to do a period, then backspace, then pop in a straight apostrophe. Or eschew bangs (which I call them to avoid the point/mark debate and save on typing) altogether.

So, to underline something, you'd type it, backspace, and spam the _ key.

But there has also existed a concept called overlining. You draw the line on top of the text (a line like this: ‾ ) instead of at the bottom.

Uh huh. Now show me how to do this with a standard US keyboard and/or existing word processing software.

The horizontal line used in overlining is called a vinculum...

No idea what they call it in the UK.

THE DOG’S BOLLOCKS

This one, however, has UK written all over it.

When your sentence says, “I’m about to say something,” and then you say that thing, we have a way of separating the two halves of the sentence: a colon. We also have an alternative way of separating the two—a dash. In previous times, people would sometimes combine the two :— they’d use a colon and a dash together.

Easy enough to do on a standard keyboard. But why a dog, specifically? And why limit the name to the colon part? I can only think it's because we see more dog wangs than any other type of schlong, even if we personally possess one. Unless, of course, you watch a lot of porn. Not kink-shaming here.

...while also urging the reader to pause, which is something colons don’t always do.

Huh? Comma: short pause. Period/full stop: longer pause. Semicolon: even longer pause. Colon: noticeably age before you get to the next part of the sentence. Come on.

The combination of a colon and a dash was referred to as the dog’s bollocks. This is because it resembles genitalia.

Only if you're looking for genitalia. Which, again: not kink-shaming. But if you're looking for it, we could probably find a use for pipe-in-parentheses, or (|).

DIPLE OBELISMENE

OH COME ON.

This interesting mark consists of a trio of dots, arranged like this: ⸫

Literally nowhere else that I've found on the internet calls it that. It's a "therefore" symbol, as the article correctly notes (albeit obliquely). I think there's a way to make it on a keyboard, but it's taking me too long to figure out.

DIAERESIS

I'm pretty sure I've talked about this one in here before, probably when mocking the style of The New Yorker. (The article mentions them specifically). I don't remember if I made "diarrhea" puns when I did, though.

If you go through old books, you’ll sometimes see two dots on top of a vowel, such as over the second “o” in “coöperation.”

I don't mock it anymore, though, because the diaeresis is standard in French.

PERCONTATION MARK

I swear they're just making stuff up now.

The most common misunderstanding in written language today is someone taking you seriously when you were really joking. For a little while, forums tried a standard whereby people wrote in italics when they were being sarcastic, but this didn’t catch on. A special font that displayed text in reverse italics to show sarcasm was even less easy to adopt.

I've been saying for years that Comic Sans should be adopted as the Official Sarcasm Font.

In the 16th century, printers tried a new symbol called the percontation mark. It looked like a reverse question mark (؟), and it primarily served to mark questions as rhetorical, letting it be known that the author was not looking for an answer. Later, the French used this mark and called it a point d'ironie, to mark irony.

Again, no idea how to make it, and no desire to look it up.
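For anyone who does want to look them up: several of the marks in this article have ordinary Unicode code points, so no Alt-code memorization is required. A minimal Python sketch (the code points below are standard Unicode; the descriptions are mine, not the article's):

```python
import unicodedata

# Some of the marks discussed above, by Unicode code point.
marks = {
    "\u203E": "overline, the vinculum-style line",
    "\u2234": "therefore sign, three dots",
    "\u061F": "reversed question mark (Arabic question mark)",
    "o\u0308": "o with a combining diaeresis",
}

for ch, desc in marks.items():
    # unicodedata.name() takes a single character, so use the last
    # code point of each entry (the combining mark, for "o" + diaeresis).
    name = unicodedata.name(ch[-1])
    print(f"{ch}\tU+{ord(ch[-1]):04X}\t{name}\t({desc})")
```

On Windows, any of these can also be entered by code point; on most phones, long-pressing a related key surfaces some of them.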

MANICULE

I've gotten those done at spas. Oh, wait, that's a manicure.

The manicule was an actual drawing of a hand that pointed at the desired text.

You still see this from time to time. Pretty sure there's even an emoji for it.

FLEURON

Now that just sounds like an insult. "You're such a fleuron."

A text might not necessarily have any sort of strong conclusion, but then the reader sees a centered fleuron right after it and feels a sensation of release.

And here I thought that was what the :— was for.
March 23, 2026 at 9:49am
#1111315
Presumably, everyone's familiar with the usual optical illusions: a particular arrangement of shapes or lines that trick the brain into seeing something that's not there, or not seeing something that is there. Rarer are the three-dimensional illusions, but that's the subject of this piece from The Guardian:
     Can you solve it? You won’t believe these optical illusions!  
The magical art of Olivier Redon

First of all, yes, I know that the headline is profoundly clickbaity. But this is one of those rare cases where I think it's almost warranted.

Look at the Coca-Cola can in the main image.

It is not a can, but an optical illusion – a trick of perspective. Can you work out what is going on?


Obviously, this is also one of those cases where you should actually click on the link. Well, you always should, but it's truly necessary this time.

Here's the thing about that first image: it's a solid illusion. I don't mean "solid" literally here, just that it's so good that, unless you're told it's an illusion (which you are), you'd never know it. So it's not enough to show the image; you also have to say "this is an illusion."

The illusion is the latest masterpiece from Olivier Redon, a French-American inventor, who has had his creations used in museums and on TV programmes around the world.

And yet, this is the first I've heard of him.

For today’s puzzles, I present five of Redon’s most brilliant images. The challenge is to figure out how he managed to create them.

Okay, again... click on the link in the headline to follow along. Image embedding here is too much of a chore (which is fine; this is writing.com, not images.com).

1. Ceci n’est pas une canette

I was going to use that reference joke, but the article beat me to it.

2. The Oh La La Box

This is not a red box! What is Redon actually holding in his hand?

AKA Ceci n'est pas une boîte rouge. Yes, yes, I know, we've already used that reference. But what's the point of knowing some French if I can't show it off sometimes?

If you're not getting the joke, here you go.

There are three others at the article. I have to admit, they pretty much stumped me; he's that good.

And oh, yeah, at the bottom of the article, there's a link to how the magic is made. I needed every spoiler there.
March 22, 2026 at 8:32am
#1111230
Mentioning the American chestnut yesterday basically dared the random number generator to hand me this ars technica article today:
     Bringing the “functionally extinct” American chestnut back from the dead  
Wiped out in its native range by invasive pathogens, the trees may make a comeback.

We lost those trees before I was born, but I've stayed in cabins here that were made from their wood. As far as I know, the cabins are still there.

So, today's theme is similar to yesterday's: saving a plant species from blight.

Very few people alive today have seen the Appalachian forests as they existed a century ago.

Well... duh.

Even as state and national parks preserved ever more of the ecosystem, fungal pathogens from Asia nearly wiped out one of the dominant species of these forests, the American chestnut, killing an estimated 3 billion trees.

Biological warfare?

But thanks in part to trees planted in areas where the two fungi don’t grow well, the American chestnut isn’t extinct. And efforts to revive it in its native range have continued, despite the long generation times needed to breed resistant trees.

I want to contrast this with some hyped-up efforts to "de-extinct" certain animal species. I use quotes there, because unlike the chestnut, the species that are being hyped as de-extincted are genuinely extinct.

Whether the introduction of genetic modifications to a species makes it a different species or not is a categorization issue I leave up to biologists. But inserting a couple of genes into a wolf doesn't make it a dire wolf, and splicing hairiness into an elephant doesn't make it a woolly mammoth.

While the American chestnut is functionally extinct—it’s no longer a participant in the ecosystems it once dominated—it’s most certainly not extinct.

Unlike woolly mammoths.

Finally, a handful of trees have grown to maturity in the American chestnut’s original range. These trees, which the paper refers to as LSACs (large surviving American chestnuts), suggest that there might have been some low level of natural resistance within the now-vanished population.

This is one of the drivers of evolution, and it's why I recoil in horror when I see a cleaning product that claims to destroy 99% of germs: because the surviving 1% could breed true and make more of the little buggers resistant to the product.

This is, as I understand it, already happening in response to antibiotics.

Unlike "germs," though, tree growth is comparatively slow, so any such comeback would take longer
at least without meddling scientists.

A related approach took advantage of the fact that the American chestnut can produce fertile hybrids with the Chinese chestnut, which had co-evolved with the introduced fungi and were thus resistant to lethal infections.

Again pushing the boundaries of what we categorize as a "species." But that's a problem of definition, not one of science itself.

Both efforts suffered from the same problem that faces any biologist working on trees: They are slow-growing and can take years to reach a size at which they produce seeds.

Like I said.

Concerned about what this might mean for the potential reintroduction of the chestnut into the Appalachians, a third project turned to biotechnology.

As usual in these cases, one wonders if we should do so. It's been a hundred years. While that's an eyeblink compared to, say, the woolly mammoth or dire wolf extinctions, and is even shorter than the dodo's demise, those ecosystems have moved on. What would need to be lost to make way for the chestnuts to again dominate the mountains here?

I'm not answering that question. Just asking it.

So researchers from the American Chestnut Foundation assembled a massive collaboration to examine all these options and determine what would be needed to reintroduce blight-resistant chestnuts into the wild.

Ethical questions aside, this (unlike other de-extinction efforts) is good science. Hell, even those other efforts improve scientific knowledge, just not in the way they're hyped.

There's a brief summary of what they've done so far, but then:

Root causes

A terrible pun? In an ars technica article? Who do they think they are? The BBC?

This is, of course, a primary reason I saved this article.

But even after the exhaustive exploration of resistance traits, the researchers seem to believe that all three approaches—selecting resistant American chestnuts, breeding hybrids derived from Chinese chestnuts, and directed genetic modification—can help bring the American chestnut back.

On another personal note: the chestnut dominated the nearby mountains, but right in my front yard, I have a big old elm tree. Elms, too, were largely wiped out by disease (Dutch elm disease, so I guess we can't blame Asia for this one). There used to be three in my neighborhood, separated widely enough that disease didn't take hold, but mine is the only surviving example that I know of in the area.

Unlike chestnuts, though, elms don't produce tasty seeds, so there might not be as much interest in reviving them. My tree, however, is remarkable. It's also taller than most of the other trees around here, so I'm dreading the inevitable lightning strike.

The researchers warn, though, that as environmental disturbances and invasive species continue to push some key species to the brink of extinction, we need to get better at this kind of species rescue operation.

With the proper ethical considerations in place, right?

...Right?
March 21, 2026 at 11:20am
#1111177
Nautilus is here with an article that should ap-peel to some of us:
     New Gene Discovery Could Postpone the Bananapocalypse  
Bananas could get wiped out, it’s happened before

Have you ever wondered why you don’t have to spit out seeds after snacking on a banana?

No, because I learned the basics of this stuff early on. Not bragging, just fact.

It’s because the Cavendish, the most widely used commercial cultivar, has three copies of chromosomes and can’t produce fertile seeds.

Teenage mutant ninja 'nanas?

Instead, the Cavendish is propagated by cloning, which is convenient for maintaining consistent banana quality, but leaves the plant vulnerable to disease.

Oooh, cloning. Ooga-booga. No. Cloning has been common in botany and agriculture for a long time.

It’s a fate that befell the Cavendish’s predecessor, the Gros Michel (French for “Big Mike”).

What I have always wondered is who Michel was, and if his you-know-what resembled a banana. If so, J'espère qu'il a consulté un médecin.

This more flavorful cultivar was the most widely available banana for decades until it fell victim to the wilting fungus Fusarium...

Bananas don't grow in my climate. You know what does grow in my climate? American chestnut trees. Or it would if something like 99% of them hadn't been wiped out by a disease. Similar story there, which is why you don't see American chestnuts.

With bananas under threat worldwide, the race to protect the Cavendish from suffering the Gros Michel’s fate began.

We could, in theory anyway, live in a world without widely-available and cheap bananas. But they're just so convenient. And cheap. They're one of my favorite fruits (technically, berries) not just because they're delicious, but because they don't splooge juice everywhere, are easy to peel, and cleanup's a breeze.

“We’ve located the source of STR4 resistance in Calcutta 4 which is a highly fertile wild diploid banana by crossing it with susceptible bananas from a different subspecies of the diploid banana group,” Chen explained in a statement.

"Explained."

Going forward, the researchers are developing molecular markers for the gene so banana producers can more efficiently identify and plant resistant seedlings. “This will speed up selection, reduce costs and hopefully ultimately lead to a banana that is good to eat, easy to farm, and naturally protected from Fusarium wilt through its genetics,” Chen said.

Science caused the problem. Science can fix the problem.

Maybe.
March 20, 2026 at 10:41am
#1111098
 Spring is Sprung  (ASR)
The downside of the vernal season
#2186132 by Robert Waltz


The Equinox occurs at 10:46 am WDC time today. This would be spring in the most important hemisphere.

Well. Astronomical spring, anyway, which goes from equinox to solstice. There's also meteorological spring, which goes from March 1 to May 31. And some cultures make the equinox the midpoint of the season, so it goes from early February to early May. And, of course, in that other, lesser hemisphere, it's autumn.

Other misconceptions about the equinox:

No, there's nothing special about the gravitational alignments, or whatever, that allows you to balance an egg on its end on the equinox and only on the equinox. Half a second's thought should be enough to debunk this, and yet, I keep seeing people swearing up and down that it "works."

This, in my view, encapsulates everything that is wrong with people: Hear something, believe it, fail to test it fully under controlled conditions, and hold on to the belief with a white-knuckled grip even if someone presents overwhelming evidence to the contrary.

Another thing is that the name "equinox" is misleading. The amount of daylight isn't precisely equal to the amount of darkness. Because of refraction in the Earth's atmosphere, the Sun appears to be above the horizon when it's actually slightly below. And that's refraction at work, not Earth's gravity bending the light noticeably.

If we didn't have an atmosphere, then yeah, day would equal night. But we'd have bigger problems than worrying about the timing of astronomical events, if that were the case.

And, finally, the Earth is round, dammit. See the above about holding on to beliefs in the face of overwhelming evidence.

So, yeah, that's all I have for today. Looks like I'm posting this before the actual moment of the equinox, but you're probably reading it after that. So... bon printemps, mes amis.
March 19, 2026 at 10:40am
#1111016
Here's an animalistic one from Mental Floss.
     6 Misconceptions About What Animals Eat  
If you believe that mice love cheese and milk is great for cats...you're wrong.

Well, no, I knew those things (though I wonder if mice simply don't like what we laughingly call "cheese" in the US). But they're iconic. Perhaps even clichéd. So, a joke like "The early bird may catch the worm, but the second mouse gets the cheese" is instantly understandable, even if both the teller and listener know that cheese isn't actually mouse Kryptonite.

Even my use of "Kryptonite" there is wrong. Superman never, as far as I know (he's been around a long time), deliberately sought out Kryptonite to eat. It would be more accurate for me to say that cheese isn't actually mouse Lois Lane.

When we’re babies, many of our parents teach us what sounds animals make, then give us plush animal toys, then show us cartoons starring animals who talk.

For some reason, they never told me what the fox says.

Many of us, therefore, grow up convinced we know all about animals, even if our knowledge consists entirely of nonsensical stereotypes.

It is, for example, required for city people, while driving through the country and seeing cattle, to go "MOOO!" in the car.

Apologies for the all-caps headers. I copy/paste these things, and I can't be arsed to convert to Upper Lower Case.

MICE DON'T LOVE CHEESE

We all picture mice nibbling at cheese, and if you buy a mousetrap, it might feature a little drawing of a wedge of cheese on it.


Yes, and it's always, always Swiss cheese. I've covered this in here before.

In fact, mice will choose most foods in your kitchen over cheese.

Especially the plastic crap that marketing has convinced us is cheese.

The idea may have also spread because artists began to commonly draw mice next to big wedges of Swiss cheese full of holes, perhaps since mice are so associated with holes.

I've also covered the concept of "holes" in here.

CATS CAN’T DIGEST MILK

As for the mouse’s enemy, the cat, we all think we know what food to give it: a saucer of milk. Or, in practice, we know enough to give cats actual cat food, but we still have the image in our minds of kindly giving a cat a saucer of milk. Don’t ever do that.


As a nearly lifelong cat person, of course I knew that.

I am, however, way more concerned about people who think their cats can survive on a vegan diet. They're obligate carnivores with the occasional craving for whatever houseplant you love the most.

PEANUTS AREN'T GOOD FOR ELEPHANTS

If you've ever imagined yourself feeding an elephant a handful of peanuts, consider that this is one snack they are not terribly likely to encounter in the scrublands and savannas.


Peanuts originated in South America. Ever seen an elephant roaming wild in South America? I haven't, but perhaps that's because I've never actually been to South America.

Which, of course, doesn't explain the popularity of cat food made from beef. I haven't seen a housecat take down a cow, either. Try, yes. Succeed, no.

RABBITS AREN’T MADE FOR CARROTS

Rabbits are one more animal that would be better off eating leafy greens than eating anything else humans eat, but people still associate them with one specific food. Rabbits are commonly linked to carrots, but this is only because Bugs Bunny nibbles on carrots in cartoons.


They also don't say "What's up, doc?" I know, I know: your entire worldview has just been shattered. Sorry.

I've kept rabbits. You know what they do like? Carrot tops. Not the terrible comedian, but the actual green stuff that sticks out of the ground above carrots.

This worked out for me because I love carrots (they are, in fact, the only vegetable I would associate with the word "love" rather than "like," "tolerate," or "despise"), and we could get whole carrots with the tops on and not waste the green parts.

I did see some article recently that was something about "why you should be eating carrot tops," but I didn't even bother to save that one to rag on.

Incidentally, carrots are in the same family as parsley. You know, in case you were wondering why carrot leaves so closely resemble parsley in morphology, if not in taste.

DON’T FEED DUCKS BREAD

It’s unclear where the idea that you should feed bread to ducks originated. Maybe it came from the fact that people often picnicked next to water, had bread with them for sandwiches, and found bread to be the most convenient food to tear apart and throw to the ducks.


Again: ever seen a wild duck foraging for wild bread? No, because bread doesn't exist in the wild.

The article says "don't feed ducks anything," but I wouldn't go that far, personally. I have it on good authority that mealworms are preferred if you're going to feed ducks. Since you'd have to go out of your way to get mealworms, then yeah, don't feed the ducks.

PIRANHAS DON’T EAT HUMANS

Dangit. Now I have to find something else to populate the moat around my lair.
March 18, 2026 at 9:17am
#1110941
Fairly short thing today, and a link I won't have nearly as much to say about as I did yesterday. Yeah, I get passionate about music. Deal with it.

I signed up here in 2004. So, sometimes, when I'm doing these riffs, I think, "How would this have played in 2004?" In this case, I think the headline would have been viewed as utter nonsense.

Summary: Jason Mangone, our Executive Director, writes about his reflections on our research with Movember, a charity focusing on men’s health.

The irony of someone named Mangone writing about men's issues isn't lost on me. Though I imagine it's not pronounced like that.

Online discourse has been statistically proven to be weird. This is especially true when talking about issues related to men and masculinity.

That's because we're supposed to sit down, shut up, and let the women talk.

In my effort to highlight the maw of the engagement machine, I went to Twitter and typed “masculinity” into the search bar. I discovered that this week’s online debate centers on a VICE article about “Mankeeping,” or “the emotional labor women end up doing in heterosexual relationships.”

This article is from late 2025. Twitter no longer existed then.

As for "Mankeeping," do a little thought experiment here: suppose some man came up with a word like "womankeeping," describing how exhausting it is to have to listen to women complain about every little detail of their day. Now, what label would you slap on such a man?

We asked Americans to choose, from a list of seventeen traits, “which do you think are most important that men try to exemplify nowadays?” The top answer, chosen by 36 percent of all respondents, is “Providing for your family.”

36 percent is hardly a resounding majority. There's a chart with a bunch of the answers at the link.

Even more boring: most Americans don’t view discussions about men’s issues to be zero sum. An overwhelming majority – 77 percent – of Americans agree that “Addressing men’s issues and women’s issues are both important and talking about one doesn’t reduce support for the other,” compared to only 12 percent who agree that “Discussions about men’s issues take attention away from addressing the challenges women face today.”

While that's heartening (and there's another chart for it), I feel like 12 percent is still way too high, and reflects a common attitude in society which is like "Why are we even thinking about Y when X is a problem?"

As if we can't walk and chew gum at the same time.

If you want to parse the headline, though, I'll have to leave that up to the linked article. I'm done for today.
March 17, 2026 at 10:43am
March 17, 2026 at 10:43am
#1110864
Here's a musical one from Mental Floss:
     6 Hit Songs Believed to Have Hidden Meanings, From “Stairway to Heaven” to “Bohemian Rhapsody”  
These iconic songs have sparked a variety of rumors, some more far-fetched than others.

People "believe" lots of things; that doesn't make them true. Also, I question the phrasing "from 'Stairway to Heaven' to 'Bohemian Rhapsody.'" There is other music besides epic anthems from British bands in the 1970s. Usually, phrasing something like that implies some sort of diverse spectrum, not just Led Zeppelin and Queen.

I will acknowledge, however, that British bands in the 1970s put out the second-best music ever recorded, exceeded only by the stuff Springsteen released in that decade.

As for "meaning," well...

Musicians have also always been the subject of wild rumors and conspiracy theories, and their songs are no different.

My favorite conspiracy theory of that kind involves another British artist, Phil Collins, then a freshly-minted solo artist. He missed the 70s by a year or so, depending on how you calculate it, but the song "In The Air Tonight" generated an urban legend that circulated by word of mouth long before the internet was a thing. The rumor was that Collins had witnessed a bunch of kids drowning another kid, and when he played the song in concert, the kids felt guilty enough to confess.

It was a semi-plausible "theory." It even fit the lyrics, sort of. Problem is, it's not true. Collins almost always wrote the music first, fitting the lyrics in afterward (which seems backwards to me, but he has musical talent and I don't, so whatever). Collins himself denies it, which of course only means he was part of the conspiracy. That's how this stuff works, right?

Humans seek patterns: faces in natural rock formations, the future in tea leaves, recognizable shapes in inkblots and paint splatters, invocations of Satan in backwards-played music, and, more relevant to this discussion, unintended meanings in poetry and song lyrics. As another example, I spent decades trying to find meaning in Springsteen's "Blinded by the Light," until I saw a video where he said he'd been sitting on his bed with a guitar and a rhyming dictionary, just trying to make words fit.

Anyway, point is, it's natural for humans to find meaning where none (or a different meaning) exists.

Each one of the following songs has sparked rumors, conspiracies, and enduring speculation about possible deeper meanings buried within their iconic chord changes and vocal riffs.

And let me just point out that all but one of the artists featured here are British. The odd one out? German.

“Stairway to Heaven” - Led Zeppelin

Few songs have been the subject of more conspiracy theories than Led Zeppelin’s epic masterpiece, “Stairway to Heaven.” This 1971 song is known for its incredible guitar solo as well as its mystical, enigmatic lyrics, which have given rise to a great deal of speculation about their meaning.


The song became something of a cliché, but it's popular for good reason, and I think part of that popularity is people projecting their own meaning onto the lyrics. Me? I once wrote a whole lyrics analysis of it, laying out the reasons why it's a song about abandoning organized religion for a more personal spiritual connection to nature.

Countless other analyses exist about the song’s lyrics, but one particularly extreme conspiracy theory about this song centers around the idea that if played backwards, the track holds Satanic messages.

You know the worst idiocy of that "theory?" It's not the Satanic bit; it's that there is not, and has never been, any evidence whatsoever that we can understand back-masked recordings, even on a subliminal, subconscious level. In other words, I could say "kill your parents, eat your dog, rape a child" and play it backwards, and it would sound exactly like nonsense and not somehow hypnotize the listener into becoming a psychopath. To hear the message, you'd have to reverse it again, and then you'd be like "what the hell, Waltz?"

Don't believe me? Think about it. If back-masking subliminally affected behavior, every single advertisement would use it.

What is true is what I said above: we seek patterns. So it's entirely possible that there's a specific combination of sounds in a backwards-played song that serves as a kind of auditory illusion.
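If you ever want to check for back-masked "messages" yourself, the reversal part is trivial. Here's a minimal sketch in Python; the `reverse_pcm` function name and the toy sample data are my own invention, and it assumes raw 16-bit PCM audio like what you'd pull out of a WAV file's frames:

```python
# Hypothetical sketch: "playing a record backwards" is just reversing
# the audio sample-by-sample.

def reverse_pcm(data: bytes, sample_width: int = 2) -> bytes:
    """Reverse raw PCM audio by whole samples, not individual bytes,
    so each sample's internal byte order stays intact."""
    samples = [data[i:i + sample_width] for i in range(0, len(data), sample_width)]
    return b"".join(reversed(samples))

# Three little-endian 16-bit samples: 1, 2, 3
forward = bytes([1, 0, 2, 0, 3, 0])
backward = reverse_pcm(forward)          # samples now read 3, 2, 1
assert reverse_pcm(backward) == forward  # reversing twice restores the original
```

Which is the point: reverse it twice and you're back where you started, with no hypnotic side effects either way.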

And one good thing came out of it: we got the wonderful joke, "What happens if you play country music backwards? You get your truck, your wife, your job, and your dog back." Or, as I riffed on it, "What do you get when you play jazz backwards? Music."

“Bohemian Rhapsody” - Queen

One major theory about this song that might actually have some substance to it is the idea that it might have been Mercury’s way of coming out as queer.

Or, you know, maybe he was just expressing his internal frustration with a world that adored his work yet wasn't ready to accept his sexuality. Hell if I know. I do remember reading about his struggles with getting the perfect sounds for the song.

“Strawberry Fields Forever” - The Beatles

The belief that Paul McCartney is dead and has been for a long time is one of music’s most common conspiracy theories. While this theory is widely unfounded, as McCartney continues to release music and make public appearances today...


Obviously, that's the original Paul's doppelgänger. Obviously.

Which would make Ringo Starr the only surviving Beatle, thus proving that the Universe is fundamentally flawed.

Look, the "Paul is dead" thing sold records and got people talking, but there's absolutely no evidence for it beyond a few trolling lyrics.

On the other hand, I've always had a hard time believing that the same guy who co-wrote "A Day in the Life" also came up with the vapid twattery that is "Simply Having a Wonderful Christmastime." Which, by the way, is another McCartney song that's generated conspiracy interpretations.

“Wind of Change” - Scorpions

While they may be best known for their song “Rock You Like a Hurricane,” the band Scorpions also had a hit with the 1990 ballad “Wind of Change,” which describes the fall of the Soviet Union and became popular in Eastern Europe as the iron curtain fell. However, the song’s success and political message also sparked a conspiracy theory that the CIA may have written it to further push a dissolving Soviet Union over the edge.


Oh, come on. The CIA has even less musical talent than Scorpions. They're German, and the Berlin Wall had just come down. Seriously, folks...

“Five Years” - David Bowie

I'm not even going to dignify this one by quoting it. I'll just say: Bowie (who, by the way, was born David Jones, but there was already a Monkee named Davy Jones, so he had to use a different name, and that's not a conspiracy theory) was brilliant and talented, but he wasn't a time traveler.

Space traveler, maybe.

“Empty Spaces” - Pink Floyd

Pink Floyd’s “Empty Spaces” is one rare case wherein a song, when played backwards, actually does contain a hidden message. When you play the song’s audio in reverse, you can indeed hear the words, “Congratulations. You've just discovered the secret message. Please send your answer to Old Pink, care of the funny farm, Chalfont…”


None of which, I'll reiterate, made any sense until someone played it backwards. Pretty sure that wasn't the only back-masking Pink Floyd ever used, either; but they used it for its sound effect, not for nonexistent subliminal messaging.

If you don't recognize the song, by the way, don't feel bad. I don't think it ever became a "single." You'd have to be completely obsessed with Pink Floyd to even recognize the title. You know, like how I used to play the album it came from over and over, or how I've seen the movie based on it more often than I've seen any other movie.

Anyway. Point is: just because you think something, or see a pattern, that don't make it so. All in all, it's just another brick in the wall.
March 16, 2026 at 9:16am
March 16, 2026 at 9:16am
#1110778
Sometimes, I hold on to a link just because I find it interesting. Here's one such, from ars technica:
     Why are vertebrate eyes so different from those of other animals?  
A new hypothesis proposes that our ancestors lost their eyes, then rebuilt them.

Okay, well, they didn't "rebuild" them, but I'll allow for some poetic license. Still, lots of people misunderstand evolution: there's no agent behind it, and "rebuild" smacks of agency.

After losing its original eyes, one of our distant ancestors may have done what evolution does best: tinkered with what was available, reshaping a single central visual organ into two new eyes.

I'll also emphasize that this is a hypothesis. It's not a certainty. That means there may be some evidence to support it, but the evidence could mean something else entirely. So, you know, if you're the kind of person who gets invited to cocktail parties (I am not), it's best not to present this as settled science.

According to the data considered by its authors—a team from the University of Sussex (UK) and Lund University (Sweden)—vertebrate eyes, ours included, may not descend directly from the paired eyes of early bilaterian animals. Instead, they may have been “reinvented” from what was once a single light-sensitive organ that survived an evolutionary detour.

I think most people have at least a vague idea of what a vertebrate is: an animal with a body plan that features a spine, which includes true fish, amphibians, reptiles, birds, marsupials, and mammals; and excludes ones like insects, octopuses, jellyfish, and politicians.

“Vertebrate eyes are so fundamentally different from the lateral eyes of other animal groups,” explains Dan-Eric Nilsson, senior author of the study from Lund University and a leading expert in eye evolution.

We think spider eyes are weird and scary. But to the spider, we're the weird, scary ones.

It always amused me as a kid to see the "elephant is scared of mouse" trope in cartoons, when humans being scared shitless of spiders is about the same thing.

“The key difference is the identity of the main photoreceptor, which is of ciliary nature in the vertebrate eye but rhabdomeric in other animal groups, such as arthropods and cephalopods,” he adds.

Oh, yeah, that clears things right up, like Visine after a good night. Look, I don't claim to be a genius or anything (except when I'm being smug), but I'm not completely stupid. I just don't know everything, and a lot of that is specialized biology lingo.

To understand what Nilsson is getting at, we need to unpack a few key concepts.

Fortunately, the article recognizes that and is about to 'splain things.

There are two major classes of light-sensitive photoreceptor cells—rhabdomeric and ciliary—that differ in shape, in the visual pigments (opsins) they contain, and in their electrical responses to light.

Somehow, I don't remember that being covered in the few biology classes I took. But it tracks with what I've learned since.

Most invertebrates rely on rhabdomeric photoreceptor cells for vision, while ciliary cells mediate light sensing but not vision—they generally help regulate internal biological clocks. Vertebrates, however, brought both types of photoreceptors into the same organ.

So, even the expert's explanation with all the lingo was incomplete.

The authors argue that the invertebrate, rhabdomeric-based arrangement represents the ancestral state of eyes, inherited from the common bilaterian ancestor and shared by present invertebrates.

I don't think the article explains "bilaterian." By the word, I'd think it would mean something like "midline symmetrical;" in the case of vertebrates, the spine runs along the symmetry line. And I'd be mostly right.

After the bilaterian lineage split—one branch giving rise to insects, crustaceans, and mollusks, the other leading to a group called deuterostomes that includes chordates and vertebrates—one of our distant ancestors appears to have become more sedentary.

That's right, that's why I really saved the article: so I could joke about how I'm an evolutionary throwback with my sedentary nature.

“The ancestral deuterostome adopted a burrowing lifestyle, either living sessile on the seafloor or partially burrowed, with only parts of its body protruding,” says George Kafetzis, research fellow at the University of Sussex. Under those conditions, two lateral eyes may have become more of a liability than an advantage. “Neural tissue in general is very expensive to maintain and function,” Kafetzis explains.

As a result—an idea already proposed in the literature—the lineage may have gradually lost its paired eyes.


I'm not 100% sure, but I think this is analogous to how certain species who adapted to caves lost their vision.

“We think that in this early deuterostome, the median eye contained both ciliary and rhabdomeric cells,” Kafetzis explains. As a result, both cellular lineages were incorporated into a single, ancient, cyclopean eye, which later evolved into the vertebrate eyes.

I mean, okay, it's a hypothesis. What nonscientists might call a "theory," but it's not a theory in the scientific sense.

A trace of this transformation may still survive in the pineal complex at the base of the brain—often referred to as a vertebrate “third eye.”

Mystics like to make a big deal out of this "third eye" thing. Unfortunately, the band name is already claimed.

Scientists have long recognized striking similarities between the retina and the pineal organ, leading many to suspect that the two evolved from a single ancestral structure, with the pineal representing a more rudimentary version.

Kafetzis and his colleagues see it differently.


Oh-HO! I "see" what you did there.

The article explains things further; then:

Though grounded in existing ideas and data, the new proposal offers a potentially far-reaching synthesis. Several aspects still require firmer evidence. The idea that the ancestral chordate adopted a burrowing lifestyle remains debated, and the claim that early bilaterians already possessed paired lateral eyes is still speculative.

Additionally, I didn't see any proposals for how the hypothesized single cyclopean eye might have evolved into the kind of bilaterally symmetric eyes most of us see, and see with, every day.

The article does emphasize the hypothetical nature of this claim, and it's not something that we can "do our own research" on. We'll just have to let the slowly grinding gears of science do their thing: either falsify it, or support it enough to turn it into a theory.

In other words, we just have to wait... and see.
March 15, 2026 at 8:55am
March 15, 2026 at 8:55am
#1110694
Been a while since I've talked about cheese. Here's a cheesy article from Good Housekeeping.
     I'm a Cheese Expert and There’s One Cheese I Always Have in My Fridge  
It’s punchy, savory, and often cheaper than Parm—here’s why I always stock it.

With a clickbait headline like that, I was almost hoping it was an ad for Cheez Wiz. Alas, it seems to be a serious article (though it might still be an ad).

I used to be a cheesemonger and sold artisan cheese to chefs in Chicago for almost ten years, and when I think of pasta, (and I often do), I dream of cacio e pepe.

I'm not sure how I feel about that lede. It's a sentence that makes sense grammatically, sure (though I'm not sure it needs the comma after "pasta"). It also works if one already knows what the author is about to talk about, or in hindsight, once the connection between artisan cheese and cacio e pepe is more clear. But as a lede? It's functionally equivalent to "I used to be a plumber, and when I think of cars, I dream of Toyotas."

The sharp, salty flavor of a hunk of Pecorino Romano cheese, plus freshly ground black pepper, is crucial to the rustic flavor of this simple dish.

At least it's explained there in the first paragraph.

I buy it in small wedges, preferably cut fresh from the 55-pound wheel, and whiz it in the food processor for a rough, toothy texture that is fine enough to melt instantly.

Look at Ms. Fancy Chef over here with her fancy-cheese-pulverizing Food Processor.

Don’t get me wrong, I love Parmigiano-Reggiano, the so-called King of Cheeses, for its savory, caramelly, nutty flavor that brings umami to every plate.

Just so everyone's aware: there are other countries besides Italy that make cheese. Just saying.

Dishes with olives, roasted peppers, anything grilled with a bit of a char pair beautifully with the slightly pungent, briny, rustic taste of a true Pecorino Romano.

We're edging dangerously close to the No True Pecorino Romano fallacy.

Pecorino Romano and Parmesan are both aged Italian (or Italian-style) cheeses made in large wheels. Pecorino Romano, however, is made from sheep milk (“sheep” is “pecora” in Italian) while Parmigiano-Reggiano is made from cow’s milk and is a protected designation of origin (PDO) cheese produced in specific regions of Italy.

You know one thing I've never fully understood, though it's something I probably should: how protected labels are enforced outside of the country they're from. Perhaps the most famous is champagne, which, as everyone knows, can only be called champagne if it's from the Champagne region. Tequila can only be called tequila if it comes from a small region in Mexico.

But the US, for example, isn't France or Mexico, so what's stopping someone in, say, Oregon, from importing a bunch of agave and making a fine distilled beverage and calling it tequila? I mean, sure, Mexico could declare war on us over the violation (and right now, might just win), but other than war, what? Treaties? The UN? Pinky promises? Trade embargo? Strongly worded letter?

I'll have to look into that. Anyway...

Pecorino Romano cheese possesses a PDO (protected designation of origin) and thus can only be made in particular regions of Italy—primarily Sardinia, Lazio, and the province of Grosseto in Tuscany...

I'm embarrassed that I had to look up Lazio. It's the region containing Rome. For some reason, I'd never been aware of that name. In my defense, I've never been to Italy. Still... four years of Latin in high school and a love of Italian cuisine here. Just saying.

Cheese just called “Romano,” on the other hand, generally refers to an American-made version of a hard, salty cheese that can be made from cow’s milk, sheep’s milk, goat’s milk (or a blend) without the nuances in flavor and texture that a heritage Pecorino Romano from Italy might possess.

Now, just because it's made in the US doesn't make it automatically bad. Sure, the mass-produced, pre-grated Kraft version is basically sawdust, but we do make some decent cheese here. Just mostly smaller-batch stuff, like the difference between a delicious craft (not to ever be confused with Kraft) beer and the mass-produced swill that gets heavily advertised.

I can accept, however, that Romano from the US is, at the very least, different from Pecorino Romano from Italy.

Speaking of advertising, the article ends by mentioning a couple of brand names. Sneaky, sneaky ads. Oh, well, at least it doesn't come with a cheesy jingle.
March 14, 2026 at 8:42am
March 14, 2026 at 8:42am
#1110618
Here's one from The Conversation that I forgot was in the pile:

25? Try 50.

While driving recently, a long-forgotten song came on the radio. I found myself singing along; not only did I know all the lyrics to a song I hadn’t heard in 25 years or more, but I also managed to rap along.

If you suspect you're being spied upon (which you are), sing as much as you can as loud as you can, as off-key as you can (for me, that last one's a given).

How is it that I could give this rendition, but often cannot remember what I came into the room for?

I know this one. Because memory is tied to the senses, and as you move from room to room, you see, hear, and smell different things.

It is tempting to treat these moments as evidence of cognitive decline. A quiet, creeping sense that something is slipping.

Nah, that's when you forget what a room is.

We tend to talk about “memory” as if it were a single thing. It isn’t.

Yeah, there's ROM, RAM, disk drives, flash drives... oh, meat memory. Never mind.

Remembering song lyrics relies on long-term memory – networks distributed across the brain that store information consolidated over years.

Hence why I can't get advertising jingles from 1975 out of my head.

Each time you repeated those lyrics – in your bedroom, in a car, at a party – you reinforced the synaptic connections involved.

What I'm not sure of is if personal repetition is necessary. Do I remember songs that I didn't sing, only heard? I have no idea.

By contrast, remembering why you walked into the kitchen relies on working memory – the brain’s temporary holding space.

If I've walked into the kitchen, there are only three possibilities: Find something to eat, find something to drink, or feed the cats.

Working memory is fragile. It can hold only a small amount of information for a short period, and it is highly sensitive to distraction. A single competing thought is enough to overwrite it.

Yeah, that's why my memory sucks: because I think too much. Yeah, that's gotta be it.

Psychologists have described what is sometimes called the “doorway effect”. When you move from one physical space to another, the brain updates context. It segments experience into discrete episodes.

That's kind of what I meant above, with the "senses" thing.

The intention formed in the previous room – “get my glasses”, “find my charger” – was encoded in that earlier context.

So, make all the rooms in your house look the same. Got it.

Strikingly, even in neurodegenerative conditions such as Alzheimer’s disease, musical memory can remain relatively preserved long after other forms of recall deteriorate.

Great. I'll be in some dementia ward and the only thing I'll ever remember is "Plop plop, fizz fizz, oh what a relief it is" from the 1970s Alka-Seltzer commercials.

A lyric repeated hundreds of times in adolescence may be neurologically “stronger” than a single fleeting intention formed five seconds ago.

Baby, we were born to run.

What feels like memory loss is frequently attentional overload. Modern environments are saturated with interruptions: notifications, internal thoughts, competing demands. Working memory was never designed to withstand this level of interference.

Working memory was never designed, period. Okay, I know, that's a quibble; she probably meant it as a metaphor.

The issue is not that your brain can no longer store information, it’s that it is selective about what it stabilises. Small adjustments can reduce those frustrating “roomnesia” moments.

I really should be angry at that silly portmanteau, but I'm not.

One of the simplest is to say the task out loud before you move. Verbalising an intention – “I’m going upstairs to get my charger” – strengthens its encoding by engaging additional language networks.

"I'm going to the bathroom to take a good long dump."

There's more in the article, including other tips for remembering between rooms, but I don't remember them now.

Seriously, though, I can't vouch for the article's scientific accuracy. But, at the very least, these are decent working hypotheses for that particular memory issue. We still don't know enough about the brain to know what's really going on in there, but, despite my joke above, it doesn't work the same way as computer memory.

Now if you'll excuse me, I need to defrag my hard drive.
March 13, 2026 at 9:57am
March 13, 2026 at 9:57am
#1110550
Here's one from the end of last year, from Science-Based Medicine. I don't know much about the source, except what they put on their website; they seem rational, like their name isn't misleading.
     Well Dr. Stephanie Seneff, 2025 is Over. Did Glyphosate Turn Half of All Children Autistic?  
Failed predictions are a key feature of pseudoscience, and much of my writing has documented instances of credentialed academies making bold, confident declarations, only to act like they never happened when reality intruded into their fantasy.

I'll just butt in here: yes, sometimes science makes failed predictions, too. That's part of the process. What they don't normally do is sweep it under the rug.

A big date has arrived. Those of us who have been following pseudoscience for some time will remember that way back in 2014, MIT computer scientist and American Loon #2234 Dr. Stephanie Seneff, predicted that “half of all children will be autistic by 2025“.

Well... no. I don't remember that. I don't remember almost anything about 2014. I do remember RFK Jr. saying shit about finding the cause of autism in 2025 and, when his deadline passed, mumbling some obvious bullshit about acetaminophen / paracetamol and calling it a day.

The culprit, in her opinion, would be glyphosate, an herbicide initially manufactured by Monsanto for genetically engineered crops. As you can imagine, an MIT scientist using the buzzwords autism, Monsanto, and GMOs made quite the splash in the wooisphere back then. It was very big deal at the time.

Oh, yeah. That would do it. Not sure why I don't remember it. I wasn't blogging at the time, so maybe I wasn't as tuned in to certain news reports.

Brace yourself. It turns out that no, glyphosate did not turn half of all children autistic.

"Well, then, her warning must have worked!" < this may or may not be satire.

Predictably, instead of reflecting, apologizing, and retreating from public commentary as someone with integrity would do, Dr. Seneff has seamlessly moved on to claiming that vaccines will cause 50% of children and 80% of boys to be autistic by 2032.

Vaccines prevent or reduce deaths. Autism isn't usually fatal. So the anti-vaccine crowd is either saying "I'd rather have a dead child than an autistic one" or trying to demonize those on the spectrum. Maybe both. I don't know. Either way, the overwhelming evidence, ignoring for the moment the efficacy of particular vaccines, is that vaccines have nothing whatsoever to do with autism. And it's gotta be tough for people on the spectrum to get bombarded with that kind of messaging.

Failed predictions- both of catastrophes and better days ahead- are a key feature of misinformation, and much of my writing has documented instances of similarly credentialed academies making bold, confident declarations, only to act like they never happened when reality intruded into their fantasy.

So, this is the real reason I hung onto this article, not the particular debunking in the headline: to emphasize that these crank predictions need to be remembered (though apparently not by me) and scrutinized, like when the whole "the world will end in 2012 according to the Mayan calendar" thing took hold of popular imagination enough for someone to make a truly epic shitty movie about it. That movie remains probably the only John Cusack movie I've seen that I don't like. Well, okay, I wasn't a huge fan of Say Anything, but that was a really long time ago.

Those of us with working minds, meanwhile, noted that the Mayan calendar is cyclic (like, you know, every other calendar that we know of) and it was basically just the end of a counting cycle. Claiming the world will end then is like claiming that it's going to end because a year does.

That, of course, has nothing to do with medicine. But it's still scaremongering.

Anyway, the article goes on to help with the remembering by listing particular predictions. Most of them had to do with COVID.

I don’t know what 2026 will bring, but I predict that most of it will be very bad, and unlike the doctors I write about, I will admit it if I am wrong.

I, too, will admit it if I am wrong. That is... if I can remember what I said in the first place. I trust y'all to remind me and tell me "I told you so."

It just occurred to me that we have prediction betting markets now. I've never messed with them, but I say if you make a prediction like that, put some real money behind it. I might have to jump in and bet against you.
March 12, 2026 at 8:12am
March 12, 2026 at 8:12am
#1110473
Yesterday was a friend's birthday, so we went to dinner at a sushi place. A sushi place with a bar. So I'm not feeling much like doing anything today, including writing my normal blog entry.

Worth it.

Anyway, I'd halfway planned on taking a blog break today (not to skip a day, of course, which would be unthinkable, but just to do something like this). As I noted yesterday, this entry marks one year of blogging here: 365 entries, because 2026 isn't a leap year.

Thing is, my life is boring enough that there's not much to say here about it, which is why I rely on outside sources to riff off of. I avoid drama, but I don't mind reading about it.

The "boring enough" thing is by design, and it's a good thing. While shit happens to me just like it happens to everyone, I've got things under control. Well, mostly. Well, partly. Can't say the same for the rest of the world right now, but I have no control over that. So I'm not actually bored; it's just that talking about it would bore everyone else.

That's Year 1, folks. Tomorrow (if there is a tomorrow): same old boring stuff.
March 11, 2026 at 9:46am
March 11, 2026 at 9:46am
#1110382
This NPR article is a few months old, so don't panic.

Oh. Apparently, we weren't supposed to panic when it came out, either.

A Consumer Reports investigation has found what it calls "concerning" levels of lead in roughly two dozen popular protein powder brands — but says that's not necessarily cause for tossing them.

Of course not. After all, lead is known to cause cognitive decline, and cognitive decline in consumers is great for producers.

The nonprofit organization tested multiple samples of 23 protein powders and ready-to-drink shakes from a range of stores and online retailers over a three-month period beginning last November.

Also, it was a little late, even then.

The results, published on Tuesday, show that more than two-thirds of the products contain more lead in a single serving than Consumer Reports' experts say is safe to consume in an entire day.

"Tuesday" was last October.

Also, insofar as I understand these things, there's no safe level of lead.

The Council for Responsible Nutrition, a trade group representing the dietary supplement industry, released a statement on Wednesday urging caution in interpreting the study's results. It says that modern testing methods are sensitive enough to identify trace amounts of naturally occurring heavy metals, and that alone does not equate to a health hazard.

But lead is natural.

Consumer Report's study adds to a growing body of research into heavy metals in a variety of everyday products, from cinnamon to tampons.

Two items that probably should not be combined, regardless of lead levels.

The nonprofit Clean Label Project tested 160 products from 70 brands earlier this year and found that 47% of them exceeded California Proposition 65 safety thresholds for toxic metals.

Yeah, well, it doesn't matter there because in California, everything gives you cancer.

There is no known safe level of exposure to lead, which is present in many of the environments in which food is grown, raised and processed.

That's what I thought. Still, I imagine it's not possible to eliminate it entirely. If nothing else, decades of leaded gasoline spread the stuff all over everything via the atmosphere.

There's a lot more at the link, including questioning why they're pushing protein powders so hard in the first place. I'm not discussing this because of the specific product--I don't use protein powders, so it doesn't directly affect me--but for the larger insight into how they handle product contamination issues.

Unrelated: Tomorrow will mark one year of daily entries in this blog. My daily blogging streak is a lot longer than that, but I switched books last month, on the 13th. Will I look at another article, or do a personal update? I'll decide tomorrow.
March 10, 2026 at 10:31am
#1110309
This one, from SciAm, is the polar opposite of the last bit I did on the anthropology of human relationships, just a couple of days ago. That one talked about single people. This one...
     The truth about polyamory  
An anthropologist’s detailed research shows polyamorists focus on intimacy and honesty, not sleeping around

Intimacy? Honesty? A Jedi craves not these things.

Kelly and Tim practice polyamory: they form deep, meaningful, romantic relationships with more than one person at a time, with the full knowledge and consent of everyone involved.

You know what grinds my ass? In most cases, what consenting and/or eager adults do with each other is absolutely no one else's business. And yet other people feel the need to make it their business.

In popular media, though, it is usually ridiculed and dismissed.

Part of that is ignorance, but it may also be propaganda.

Critics deride polyamorists as decadent liberal hedonists looking for ethical cover for their desire to sleep with lots of people.

As a decadent liberal hedonist, I resent that characterization. I don't sleep with anyone.

An Atlantic article says polyamory is emblematic of the “banal pleasure-seeking of wealthy, elite culture in the 2020s,” allowing people to justify indiscriminate sex and avoid the hard work of commitment.

I avoid hard work, period. Trying to find someone to have sex with is itself hard work.

“No one can truly feel safe inside a marriage whose vows have an asterisk,” claim the authors of a piece distributed by the Institute for Family Studies.

No one? Bullshit. But that tracks with my rule: Any group or organization with the word "Family" in it spouts bullshit.

I am an anthropologist and licensed therapist, and I have spent the past seven years researching polyamory the way anthropologists do: by spending a lot of time with a lot of people who engage in it.

This is where, normally, I'd quip something like "Oh, I'll just bet you have." But that would undermine the point I'm trying to make, so I'll just pretend I didn't even think of the joke.

Politically, polyamory is a rare place where the left and right meet: you might encounter a libertarian or a Donald Trump supporter or a Bernie Sanders bro. The philosophy and practice of polyamory resonate with people across political divides and are not simply liberal indulgences—in fact, they tie into a libertarian and conservative ethos with deep roots in U.S. society, where people rebel against the powers that be telling them what to do.

That's an interesting observation, certainly. From what I'd gathered so far in the article, any moral panic about polyamory comes mainly from the conservative side.

Where popular portrayals of polyamory most miss the mark, though, is in the idea that the practice is primarily about having sex with multiple partners. Polyamory is mostly about intimacy, not sex, say the people involved in it, and it has ethics at its core.

I'm not exactly arguing the point, but I'd expect there to be a bit more variation, because, surprise, people are different. Just like not all singles are alike, and not all couples are alike.

Respect, consent, trust, communication, flexibility and honesty are fundamental to these unconventional dynamics, according to a large review by researchers at Virginia Tech published in 2023.

Fake those qualities, and you're golden.

Psychologist Justin Lehmiller, a senior research fellow at the Kinsey Institute, reported in the Journal of Sexual Medicine that polyamorists engage in safer sexual practices than the people who say they are monogamous—a quarter of whom reported having sexual relationships unknown to their partner—and this caution may reduce rates of sexually transmitted infections.

You remember up there when I said, "In most cases, what consenting and/or eager adults do with each other is absolutely no one else's business?" I admit to being judgmental about people who "cheat." If you're going to cheat on your partner, what else are you going to be dishonest about? But for me, that's not about the sex bit. As I said, that's none of my fucking business (goddamned right that pun was intended). It's about the ethics bit.

Also, I find it difficult to believe that only "a quarter" of self-described monogamists wandered off on their nominal partner. In my experience, it should be more like 90%. Like I said: dishonesty.

In short, polyamory is radically different from what many people may envision. Its current flourishing is not just a curiosity or random event: it indexes something important about this cultural moment and how people experience and value intimacy and relationships.

My own attitude toward it was shaped by reading science fiction.

No, really.

It's not something I've ever been interested in as a lifestyle. Single, or paired up: that's me. Maybe the occasional bit of group fun when I was much younger, though there were no commitments involved there. But what science fiction made me realize is that I can accept that other people want different things, and, I must reiterate, it's none of my business. Two of my best friends are in poly relationships, both of which have lasted way longer than my supposedly monogamous relationships have, so, great, works for them. I have another friend who is completely asexual. That works for her. And I'm in a purely platonic living situation, which I know a lot of people can't comprehend, either.

I am not an apologist for polyamory. I have been in such relationships in the past and had positive experiences, but I ultimately decided polyamory wasn’t for me. It activated some insecurities that I have spent years of my life working to heal, and I never felt that polyamory resonated deeply with my sense of who I am. For me, participating in polyamory successfully would take continual, deep work around old and familiar emotional wounds, and I simply wasn’t all in.

So, a lot like my own attitude, which probably has a bit of fear of abandonment thrown into the mix. But it seems to me that the author is being an apologist for polyamory--not for herself, perhaps, but as a general idea.

Polyamory holds that what’s wrong is the very premise of monogamy in the first place. One person cannot possibly meet all our needs. “It’s like this,” Kris, a 37-year-old real estate agent, says. “We have groups of friends, right? Maybe one you go out dancing with on the weekends, another one is the person you call when you’ve had a horrible day; maybe someone else is a sports fan, so you go to ball games together. Totally normal, right? We don’t expect one friend to be our only friend, because we have different kinds of relationships with different people. It’s unrealistic to expect one person to do it all.”

I can relate to that. Only for me, it doesn't mean I need to have intimate relationships to fulfill those "needs." But I do insist on being able to have friends outside a committed relationship.

Love, polyamory practitioners say, is similar. Like friendship, it is not a limited resource—it is additive. More love begets more love. “When you have multiple kids, you don’t love one of them less just because another one is born,” John, a 36-year-old business analyst, explains. “There’s enough love for all of them. You love them each for who they are uniquely.”

Which brings me back to science fiction.

No, really.

There's a quote from Robert A. Heinlein that I memorized at an early age:
The more you love, the more you can love--and the more intensely you love. Nor is there any limit on how many you can love. If a person had time enough, he could love all of that majority who are decent and just.

Now, I'm fully aware that Heinlein had some, shall we say, regressive attitudes about sex and gender. But that quote always resonated with me, and this article reminded me of it. (Time Enough for Love, 1973)

There's a lot more at the article. Even though it is, and I cannot emphasize this enough, none of our business, I think it does help to learn more about these things.

None of this is about telling other people how to live. Quite the contrary. It's about accepting people's differences. Do poly relationships sometimes not work out? As the article notes, absolutely, yes. But so does every other kind of relationship. Including no relationship at all. Nominally monogamous couples divorce. Friends drift apart. People have falling-outs. That's life. Life is also, in my view, knowing that different people have different desires for intimacy and sex, and knowing when something's none of our business.

The sooner people realize this, I think, the better off we'll be as a species.
March 9, 2026 at 9:26am
#1110224
Okay, kids, here's one for the etymology nerds, from NPR:

Well, they're both annoying, messy, noisy, stubborn and smelly, so it shouldn't be a surprise.

I did, however, briefly have nightmares as a child when I came across the term "kid gloves."

When Deborah Niemann tells you about her kids, ask for clarification: "When people hear me ... talk about my kids, it's not always obvious … are you talking about the two-legged kind, or the ones in the barn?" she admits.

I know a couple who used to keep goats and other farm animals (they lived on a farm, go figure). They didn't have kids, but they had kids.

Where did the word "kid" come from, and how did it become a synonym for children?

This is another of those things that I'd always been mildly curious about, but never enough to go look it up. I had this idea in my head, though, that it was probably related to "kit" and "kitten," other names for young animals.

Kid entered the English language as a term for the offspring of a goat some 1,000 years ago as Vikings from Scandinavia (mainly modern-day Denmark and Norway) increasingly chose permanent settlement over raiding in northern and eastern England...

All words are made up. Some are made up, and then stolen.

Large-scale Viking settlement in England was established from about the mid-800s to mid-900s A.D., a time known as the Danelaw, or "law of the Danes." It was during this time that "kid" supplanted the earlier English word for a young goat, "ticcen."

Making me wonder: what was the old English word for an adult goat?

Apparently, it was gāt. Boring.

Around the turn of the 17th century, in Shakespeare's time, "kid" was beginning to be used interchangeably to mean either a young goat, a child or young adult. "It must have been something about the goaty vibes—sprightly, energetic, curious, bouncy," Watts says. "That metaphor just caught people's imagination."

Well, it seems my mind was on the right track with that. Just not as positive a spin.

Not so much with the etymology, though. I can't find anything that relates kid to kit, which, at least in the sense of young animal, seems to have come from French, not Scandinavian. It's possible that they share a PIE origin, but a quick glance at online sources doesn't point in that direction.

The word was even used for boxers and thieves. "Billy the Kid comes out of that," he notes.

Now that? That, I didn't know. I always figured he had a youthful appearance.

The word "kidnap" combines the modern sense of kid with the English verb nab or nap, meaning "to seize."

Not to be confused with "catnap." Or "catnip."

The use of kid as a verb also crops up in the 1800s, says Watts. It originates from the idea of playing someone for a kid, which "comes out of the criminal underworld… fooling them while someone steals their money off them while they're not looking," he says. Over time, it "morphed into a word meaning to hoodwink someone or, more playfully, just to joke with them."

Another mystery solved.

But "kidding" is also the season when baby goats are born, a fact that provides Niemann and her fellow goat enthusiasts an occasional bit of mirth.

Ugh. Other people's puns aren't funny. Only my own puns are funny.

As for the "kid gloves" I mentioned above?

"It's not about children at all. It's gloves made of kid skin (goat skin) thought to be particularly soft and delicate," according to Watts.

In fairness, child skin would also make soft, delicate gloves. Or so I would think. Not that I'd ever try. But hey, I've been known to write horror stories.
March 8, 2026 at 10:12am
#1110134
I held on to this article from The Independent for reasons anthropological, not because it has anything to do with me.
     Men told me why they really hate singles nights – and it was heartbreaking  
When Olivia Petter wrote a piece on why men aren’t signing up to singles nights, she couldn’t have anticipated the outpouring that came from the men who read it. And what they told her really resonated

Yes, the headline's a bit clickbaity. Might need some translating, too: "singles night" seems to be what we in the US call "speed dating," which is kind of a get-to-know-you musical chairs game.

Before we get into it, yes, regular readers have run into Olivia Petter before here: "Friend Zone". No, I'm not stalking her.

Occasionally, you write something that strikes a nerve. A recent one of mine about men not attending singles nights was one of them. Since the piece was published – you can read it here if you missed it – I’ve received hordes of emails from men, eager to share their thoughts with me.

Knowing how some men are, I don't think "thoughts" were the main thing they were eager to share.

But the men writing to me this time weren’t like that. They were intentional, heartfelt, and honest.

So, they were actually women, pretending to be men on the internet.

Yes, I'm joking. We can be those things, or at least fake them. It's easier when you're anonymous.

And they were interesting, too, offering up a wide range of insights into why men might be more reluctant going to a singles night than women.

The important part here is, I think, the "wide range" bit. Men aren't all the same, despite what androphobes will tell you.

One of the common themes was vulnerability, which my article touched on. “Men are used to being rejected; women are often the ones rejecting,” one person wrote. “Experiencing this again, but with an audience, can’t be that tempting.”

Bit of a stereotype there, too. But there's probably a bit of truth to it.

Yes, it’s a bruise to the ego if someone you’re attracted to doesn’t reciprocate your feelings. But it’s not like that happens on stage in front of a crowd that will jeer and throw tomatoes at you.

Are you sure about that?

There were a few helpful pointers, with some men saying that the alcohol element made it tricky for those who don’t drink, while others added that the noise of these events can be overwhelming – honestly, I agree, and I often lose my voice at my own singles nights.

On the flip side, if there's no alcohol, I absolutely ain't going. And the noise thing sounds downright inhospitable.

Some men argued that the psychology of modern dating favours women more than men, potentially because women can be more emotionally fluent, a skill that the men writing to me often revealed can make them feel inadequate and even more awkward. I think that’s a shame and a view that reinforces harmful stereotypes that will only divide us further in the long run.

Gotta agree with the author here, even if she is a chick.

That said, some people clearly enjoy gender roles and feel that singles nights harmfully undermine them. One man wrote that the format itself goes “against the grain of how many men are wired to court”. “Being lined up for inspection, filling in forms, rotating on a timer – not just uncomfortable, but actively undermines the qualities that tend to make men attractive in the first place: spontaneity, confidence, a bit of mystery. Hard to be mysterious when you’re wearing a name badge. It doesn’t feel particularly ‘blokey’ to offer yourself out for selection.”

Counterpoint: I play video games and appreciate it when the other characters' names are floating above their heads.

Lots of men suggested integrating activities into dating nights to give them a more competitive edge – “Add some sort of competition with built-in conversation starters. A quiz? Cooking? Cocktail-making competition? Why not a go-kart event?” – and one rather boldly advised archery, as he’d been to a singles event like this recently.

And that would shut me right out entirely. It's already a competition. I despise competition. Why would I want to manufacture more of it?

But I remain somewhat unconvinced that the way to help men meet women in person is to give them weapons.

And this is why I bothered to save the article: there used to be a trope in comics where a primitive man would beat a primitive woman over the head and drag her back to his cave. Obviously, it's a good thing that this trope has died out (though I think an echo remains in the Star Trek universe, with Klingon culture), but that's absolutely what it reminded me of.

One gentleman got in touch via email with an even more unconventional suggestion. “May I suggest you interview multiple Pokemon Go players and set up your girls’ dating trips on a weekend at a park with Pokemon Go being the focus?” he wrote. “You could bring cases of wine from Costco and have your membership still [valid] for your side gig dating programme, but trade dresses and high heels for comfortable walking shoes and sneakers.” I’m sure there’s a market for this somewhere, though I can’t say it’s something I’ve got planned in the pipeline.

I'm a huge nerd. I know I'm a huge nerd. But listen, if your entire personality is Pokemon, then hang out with other Pokemoners, or whatever they're called. I'm not judging, mind you; I know it's very popular, but it's still going to leave out people who aren't into Pokemon.

Overall, I’m flattered that so many men got in touch with such a range of responses. Evidently, many of us are feeling fatigued and confused by modern dating, particularly within the heterosexual demographic.

And I'm just glad I'm out of that game for good. It seems wearying and degrading for everyone involved, not just some of the men. I'm also curious--again, just from an anthropology perspective--if there might be a cultural component to it, if it applies outside the UK as well. People are people everywhere, but there are different cultural norms and expectations for gender roles.

Still, I can't help but think the problem is a symptom, not the disease. It seems to me, though I'm far from an expert, that such things as singles nights (or speed dating on this side of the pond) just encourage people to think of relationships as fungible, and to keep looking even after you've found a match, because you never know: There might be someone better just around the corner.

Maybe that's a good thing, though. Maybe it helps people be better people, so they can keep up. Or, like me, you just give up entirely.
March 7, 2026 at 9:27am
#1110035
Last year, I did an entry about the Rubin telescope: "Hey Rubin". Today, a followup, this one featuring an article from space.com:

In June of 2025, we were greeted with a set of space images so special that one scientist even deemed them worthy of the title "astro-cinematography." Indeed, they were unbelievable, dotted with TV-static-like dots representing millions of galaxies, printed with nebulas resembling watercolor canvases, and bursting with data about some of the farthest cliffs in our observable universe.

"Unbelievable" is here being used figuratively. The cool thing about it is that it's totally believable--if astounding, amazing, superlative, etc.

Rubin has the ability to thoroughly image the night sky over and over again from its vantage point atop Cerro Pachón in Chile, and with unprecedented efficiency at that.

It's only natural to wonder why, if we can do such great astronomy here on Earth, we need to also spend billions on space telescopes. I'm not an expert, but space-based observatories still have major advantages, including being able to see in wavelengths that even our thinnest atmosphere blocks.

"We're going to actually create more data than all optical astronomy has ever had in the first year of our decade of operations, which absolutely blows my mind," Meredith Rawls, an astronomer working on the observatory, said during January's American Astronomical Society meeting.

If true, and I'm not doubting it, that really is unbe- er, I mean, astounding.

An Earth-based telescope approaching the limits of modern technological power is unfortunately forced to contend with another kind of scientific advancement happening in space: the exponential rise of satellites in Earth orbit.

I'm not the only one who sees the irony here, right? We finally have the technology to make ground-based optical astronomy better, but that same level of technology allows us to loft satellites into orbit fairly cheaply, thus detracting from the awesomeness of the astronomy.

As of writing this article, there are about 14,000 satellites orbiting our planet — nearly 10,000 of which belong to SpaceX — and the number is going to increase aggressively as commercial interests in this realm continue to grow.

Some years back, I spent a week in the way too high and cold mountains in Colorado with a bunch of other astronomy nerds. Even with our commercially available telescopes, we couldn't observe a single star or planet without seeing at least one flash of a satellite cross the field of view.

SpaceX has actually recently floated the idea of a data center in our planet's orbit, which would involve putting something like a million more satellites up there.

Heh. "Floated." I see what you did there.

Seriously, though, Space-sex's head honcho has floated a lot of ideas, the vast majority of which suck, and most of which never come to fruition anyway.

Priceless Rubin images could therefore be tainted by commercial satellite interference, or "streaks," as astronomers say.

Which, I suppose, brings us back to needing to loft more space-based telescopes, which adds to the number of human artifacts in space.

Just this month, physicians and scientists from Northwestern University announced they're worried about satellites in Earth orbit disrupting our sleep patterns.

I went to the link to that, because it seemed farfetched to me, but it seems like it's a warning against further light pollution from orbit, not saying that it's already causing sleep problems.

"They change the night sky," Rawls said. "Turns out, telescopes are not the only things that look up."

"We are all in the gutter, but some of us are looking at the stars." -Oscar Wilde
March 6, 2026 at 10:35am
#1109947
Something else a little different today, from Inverse:
     70 Years Ago, Forbidden Planet, Flaws and All, Changed Sci-Fi Forever  
Return to Forbidden Planet. If you dare.

Full disclosure up front here: I've never actually seen the whole movie. So I'm not here to discuss the movie; I'm here to discuss the article, which discusses the movie.

On March 3 and 4, 1956, at a humble science fiction convention in Charlotte, North Carolina, called SECon II (Southeastern Science Fiction Convention), roughly 30 people got an early screening of what one hardcore enthusiast, at the time, called “the first real s-f film, as fans know science fiction.”

Yes, cons have been around for a while. So have huge nerds. And gatekeepers.

By which I mean, calling it "the first real s-f film" is rather a matter of opinion. I think it's generally accepted that Le Voyage dans la Lune holds that honor, and that one was made before two brothers from Ohio paid a visit to North Carolina, when even airplanes were the stuff of science fiction.

It could be argued, of course, that the Méliès film is more fantasy than science fiction, but sometimes the boundaries blur into insignificance. Consider Star Wars, for example, which is fantasy with SF tropes.

He also noted that the people in the audience (again, very small, made of hardcore fans) were “sitting on the edge of their seats,” and “comments following the showing were enthusiastic.”

It's easy to sit here in 2026 and scoff at the primitive films of the 20th century. But I believe in taking things in historical context. So, while I dispute the claim of "first real s-f film," I don't deny its impact within its own time period. Again, for context, this was the year before Sputnik turned another SF speculation into reality.

Today, this might seem like an understatement, considering the degree to which Forbidden Planet changed pop culture, or at least pioneered a certain kind of mainstream space-oriented science fiction which would dominate mainstream TV and film sci-fi for decades to come. (For what it’s worth, they didn’t call it sci-fi back then, by the way, hence s-f.)

I still refuse to call it "sci-fi." Yes, I know that's the official genre label here on WDC, but as a huge nerd and gatekeeper, I hate that particular shortcut. If you're going to shorten something, have the common decency to keep the vowel sounds intact.

Forbidden Planet is a beautiful film, way ahead of its time visually and sonically, that now feels slow, poorly paced, and full of concepts that the 1960s Star Trek did much better, and with more joy.

Yes, okay, but Star Trek wouldn't ever have existed without three major pillars: Roddenberry (obviously), Lucille Ball (yes, really), and Forbidden Planet. So, I feel like claiming it's a low-class version of Trek is disingenuous; it's an important part of Trek background.

In short, in 2026, 70 years after its release, Forbidden Planet isn’t greater than the sum of its robot parts, but some of its parts are not only great, but now woven into the basic fabric of science fiction in general.

And FP, in turn, built on SF concepts that preceded it.

Mild spoilers ahead.

For fuck's sake, the movie is 70 years old. Hey guys, spoiler alert: Rosebud was his sled!

Hume’s rewrite of the movie injected a more intellectual angle, which, today, scans as almost a rough draft for the original Star Trek. Gene Roddenberry screened Forbidden Planet to his Star Trek collaborators in 1964 to get a sense for the vibe he was going for.

"More intellectual" should not be parsed as "highbrow art."

Not that I care about brows. Just managing expectations, here.

Like Star Trek — at least early 1960s Star Trek — Forbidden Planet presents a story about a spacecraft crewed by people who behave in a roughly navalist way, assigned to check on the status of an older Earth spaceship, the Bellerophon, which was lost on the planet Altair IV years prior. (Both Star Trek pilot episodes in 1964 and 1954 find the crew searching for clues about a lost Earth mission, too.)

So, fact check here: 1) There was no early 1960s Star Trek; the best one can say is that it began in 1964, which I'd call mid-sixties, when the first pilot episode (The Cage) was made, and even then, it wasn't ready until early '65. 2) "1954?" Gotta be a mistype. The second Trek pilot was in 1966, though it was the third episode aired: Where No Man Has Gone Before.

You don't have to be a hardcore Trek fan to know that The Cage eventually got folded into the series, with a framing story involving Kirk and Spock, a two-parter called The Menagerie.

Why does this detail matter? Well, at the time, having a science fiction movie that presented interstellar space travel as an established fact, rather than a gee-whiz new invention, was somewhat novel.

And this is why context matters.

I would be remiss if I didn't point out that in literature, such stories already existed, but SF had a really bad reputation (partially deserved) at the time, so the stories didn't reach a mass audience the way movies did.

There are probably more words written about Robby the Robot — the most expensive movie prop ever built up until that time — than there are about any other aspect of Forbidden Planet. But what makes the movie worth watching, or, perhaps, worth studying, isn’t the robot. It’s the tone.

The whole trope of the robot companion, brilliantly parodied by Douglas Adams and turned on its head by Battlestar Galactica, may be the most lingering echo of FP. Consider the droids in Star Wars, the computer HAL 9000 in 2001, the entirety of Lost in Space, the freakin' Jetsons, etc. Oddly, it was the one thing Trek never really dabbled in: there was The Computer, which probably wasn't sentient like HAL, and of course the character of Data in TNG, but he was presented as a fully sentient being, not a robot pal.

In short, what makes Forbidden Planet less than brilliant today is threefold: The prevalence of sexism in its first act is extremely distracting, by both 1956 standards and today. The plotting is poorly paced, with everything great crammed into the last 15 minutes. And, finally, let’s face it, Star Trek did it better a decade later.

Okay, well, I'm going to leave it to the article to make these cases. I'll present a different point of view here.

Sexism: Look, pretty much every movie from the 1950s is cringeworthy on this front today. As I have not seen FP, I don't have a personal opinion about it. But having read a great deal of SF from that era, it doesn't sound out of line with what one expected from SF in the 1950s. There was no secret that the principal audience of SF at the time was young men, and the writers wrote what they thought young men at the time wanted, which included manly men who are also huge nerds blasting at space aliens and getting the girl in the end.

I'm not saying it was right, mind you. Just that I have my doubts about it being distracting "by 1956 standards."

One of the more brilliant things George Lucas ever did was making Luke and Leia (SPOILER ALERT) siblings, which neatly sidestepped that trope. And then leaned into it again with Han Solo, but that's beside my point.

As for the plotting, again, I haven't seen it, but if what the article's author wrote is true, that is indeed a damning indictment. At least if you care about the writing. I'm assuming everyone here would, because, well.

The third point there, the one about Trek, may also be true. But I think it's irrelevant, because, as I noted (and the article seems to agree), Trek wouldn't exist without FP.

Where Forbidden Planet introduces these themes with Shakespeare-esque gravitas, Star Trek smartly always made those kinds of conflicts deeply personal as well as philosophical, especially in its first two pilot episodes.

There is one other major difference: Star Trek has moments of real comedy. Comedy was even a plot point in the aired pilot. The scene where Scotty defeats a far superior alien by getting him completely and totally schloshed is one of the greatest TV show moments of all time.

Comedy is, in fact, baked into Trek's DNA. But that should come as no surprise, considering who finally greenlit the show.

Thanks, Lucy.


© Copyright 2026 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.