\"Writing.Com
*Magnify*
◄     March     ►
SMTWTFS
      
1
2
3
4
5
6
7
8
9
10
11
12
29
30
31
Archive RSS
SPONSORED LINKS
Printed from https://writing.com/main/profile/blog/cathartes02
Rated: 18+ · Book · Opinion · #2336646
Items to fit into your overhead compartment

Carrion Luggage

Blog header image

Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.

This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it instead locates uplifting columns of air, and spirals within them in order to glide to greater heights. This behavior has been mistaken for opportunism, interpreted as if it is circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.

It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.

It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."

I rarely know where the winds will take me next, or what I might find there. The journey is the destination.
March 28, 2025 at 10:39am
#1086117
Yes, today's article is from Quanta. Yes, it talks about mathematics. You didn't really expect that to stop when I changed blog themes, did you?

    Proof Finds That All Change Is a Mix of Order and Randomness
All descriptions of change are a unique blend of chance and determinism, according to the sweeping mathematical proof of the “weak Pinsker conjecture.”


I've accepted the "mix of order and randomness" thing for a long time (my method for selecting articles to feature here uses just such a combination), but it's nice to know there's formal proof—even though I don't understand it.

This particular article is from 2019, so something might have superseded it by now. I don't know.

Imagine a garden filled with every variety of flower in the world — delicate orchids, towering sunflowers, the waxy blossoms of the saguaro cactus and the corpse flower’s putrid eruptions. Now imagine that all that floral diversity reduced to just two varieties, and that by crossbreeding those two you could produce all the rest.

I'm sure there's a better metaphor for this idea.

That is the nature of one of the most sweeping results in mathematics in recent years. It’s a proof by Tim Austin, a mathematician at the University of California, Los Angeles. Instead of flowers, Austin’s work has to do with some of the most-studied objects in mathematics: the mathematical descriptions of change.

And, again, so far beyond my own knowledge that it might as well be orbiting Betelgeuse. That's why I read stuff like this. Though it's hard to be skeptical when you don't have the necessary background to ask the right questions.

These descriptions, known as dynamical systems, apply to everything from the motion of the planets to fluctuations of the stock market.

I can understand being skeptical about this sentence, though. Planets are predictable, right? Like, we know when the next transit of Venus will occur, and when Jupiter aligns with Mars. The stock market is the antithesis of predictable; even weather forecasts are more accurate than stock market speculations.

And yet, both are chaotic systems (so is weather). It's just that the planets' orbits are indeterminate after a much longer time frame.

Wherever dynamical systems occur, mathematicians want to understand basic facts about them. And one of the most basic facts of all is whether dynamical systems, no matter how complex, can be broken up into random and deterministic elements.

I'm not entirely sure, but I think this means that true randomness really does exist. I'd been contemplating that question for a long time. Even dice rolls can be seen as deterministic, relying on initial state, the configuration of the hand that rolls them, and the surfaces they roll against and upon. Also, I have to remember, "deterministic" doesn't necessarily mean "predictable."
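
For a toy picture of what "deterministic elements plus random elements" can look like, here's a minimal sketch (Python, purely illustrative): a composite process whose first coordinate is a completely predictable rotation and whose second is a completely unpredictable coin flip. To be clear, this only shows the flavor of the decomposition the article describes; it is not Austin's construction, and everything in it is made up for the example.

# Toy illustration of "order plus randomness": a product system made of a
# deterministic irrational rotation on the circle and an i.i.d. coin flip.
# This only illustrates the flavor of the decomposition; it is not the
# construction from the weak Pinsker proof.

import math
import random

ALPHA = math.sqrt(2) % 1.0   # irrational rotation step (the deterministic part)

def step(state: float) -> tuple[float, int]:
    """Advance the deterministic angle and emit one random bit."""
    new_state = (state + ALPHA) % 1.0      # fully predictable given the state
    coin = random.randint(0, 1)            # fully unpredictable
    return new_state, coin

state = 0.0
trajectory = []
for _ in range(10):
    state, coin = step(state)
    trajectory.append((round(state, 3), coin))

print(trajectory)   # orderly drift in the first coordinate, noise in the second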

The article, of course, delves into more detail and contains helpful illustrations. There's nothing else in there that I really want to comment on. Again, I can't say I understood it all. But I think the reason I saved this article, lo these many months ago, is because it's not some high-flying proof unrelated to anything in the real world. As the article notes, it can apply to planetary motion and to stock market fluctuations. I added weather up there. But there's another real-world system that people get wrong all the time (though I can't claim that I get it right all the time), and that's evolution.

Evolution deniers have been known to look at organisms (such as humans) or organs (such as the eye) and assert "that couldn't have happened at random!" Their alternative, of course, is that some supernatural intelligence designed everything. But I'm not here to argue that, at least not today. In fact, I'd say that assertion, as far as it goes, is correct—ignoring extremely tiny quantum probabilities, anyway. Because it didn't happen at random. I've been saying for years that it's not random, but there are random elements (such as gene modifications) operating against a backdrop of physical constraints.

What I'm still unclear about, and this may be more in the realm of philosophy than math or science, is just how much randomness is actually in play in a given system. But, mostly, it gives me another chance to crow that John Calvin was Wrong. Some years ago, and I had to look it up, I wrote: "To me, as an outsider, 'divine will' is indistinguishable from random chance working through the laws of physics." And behold, here we have a supposed mathematical proof that random chance works through the laws of physics.

Look, I try to be skeptical about these things, to the extent that I question even those things with which I agree. But right now, at least on this subject, I'll just assume I'm right and there's science backing me up.
March 27, 2025 at 9:04am
#1086072
Well, let's see if the random numbers give us something other than food today. Ah, here we go, from The Guardian—and it's not about food, unless you mean "for the eyes."

    ‘We won’t come again’: dazed visitors fed up with overcrowded Louvre
Paris attraction in need of overhaul amid complaints of leaks, long waits, lack of signage – and too many people


Regular readers might remember that, as detailed in my previous blog, I spent a few days in Paris in a hotel just a few blocks from the Louvre. I walked through its gardens and listened to buskers, unironically complained about tourists (in my head, anyway), and decided not to go inside because I had a bum knee and heard it was crowded.

It was only after I'd been back for a while that this article came out.

As the crowds poured out of the Louvre, the look of dazed exhaustion on many faces confirmed what the museum’s director had warned last week: a trip to Paris’s biggest cultural attraction has become a “physical ordeal”.

A part of me has been kicking myself ever since (with the foot attached to the non-shaky knee). I mean, it's the Louvre, the most famous art museum in the fucking world. I'm not a huge art snob or anything, but I know the difference between a painting and a sculpture. I had a chance! I blew it! Then this came along and made me feel better about my decision.

Myriam, 65, a former secondary school science teacher, had driven from Belgium with her husband to show their 12-year-old granddaughter the Mona Lisa. They left disappointed.

From what I hear, everyone's disappointed by that puppy. Smaller than you expect, and you gotta fight crowds. Pretty sure I mentioned that at the time.

They had squeezed through huge crowds on Monday to try to catch a glimpse of Leonardo da Vinci’s masterpiece, but found the room badly designed and with no proper flow of people.

So I understand where they're coming from, but it's the Louvre, not Eurodisney.

“There are so many people. Lots of rooms aren’t numbered. The staff are very friendly, but you feel they’re more there to show people the way than to protect the paintings,” said Myriam.

Friendly staff? In a Paris museum? Now I want to go back just to see the unicorn.

As the article notes, they're from Belgium, so it's likely they speak fluent French. I do not, though I can read the written language fairly well. It may just be Anglophone tourists that get the grumpy-Parisian treatment.

On Tuesday, the French president, Emmanuel Macron, will deliver a speech at the Louvre in which he is expected to unveil details of new investment, which could involve major overhaul – even a potential additional entrance.

This article is, as I noted, from a couple of months ago. Since then, other issues may have become more pressing priorities for the French government.

Another visitor, this one from actual Paris, gets quoted:

“The noise is so unbearable under the glass pyramid; it’s like a public swimming pool. Even with a timed ticket, there’s an hour to wait outside. I can’t do it anymore. Museums are supposed to be fun, but it’s no fun anymore. There’s no pleasure in coming here anymore. And to get out you’re made to walk the length of a shopping arcade to force people to buy things – commercial interests have taken over everything.”

From what I've seen, that's pretty standard these days. Hell, when I was leaving the airport there, the signage, in several languages, directed exiting passengers along a winding path through the shinies shop. There was, at least for arriving passengers, no other way out. American consumerism is bad enough, but I think we learned it from France.

The Louvre’s director, Laurence des Cars, warned in a damning note to the culture minister this month that the facilities were below international standards, the visits were not easy and involved long waits, and the building was in poor repair, including leaks and poor temperature controls.

Annoying as "leaks and poor temperature controls" are for human visitors and tourists, they're even worse for the art.

I did find this piece from CBS, dated after Macron's speech. It looks like France can still walk and chew gum at the same time, as it seems they're planning some renovations. And yet, as the CBS article notes, they're funding this by raising admission prices for tourists.

Well, shit, all they had to do was raise prices to begin with, and then they'd have smaller crowds. Duh.
March 26, 2025 at 9:37am
#1086014
Back when I was in Belgium, at one point, I was on a tour where a guide pointed out one of the local landmarks: an "authentic" Philly Cheesesteak restaurant, complete with bronze statue of Rocky Balboa.

Don't get me wrong. I like a Philly cheesesteak as much as someone not from Philadelphia is allowed to. But American food in Belgium is like copper in a gold mine.

Except for the frites (fries), of course. We stole those from Belgium (not France) fair and square, so it's only right if they steal them back.

This brings me to the article that popped up today, something dated 2018 from Afar:

    The True Tale of the Philadelphia Cheesesteak
It all started with a hot dog.


I'm not going to take the article's word that it's a "true tale." Food history is notoriously complicated and mythologized, as with yesterday's bit about Worcestershire sauce. Still, the subhead about the hot dog effectively baited me in.

The cheesesteak I smell frying is nearly the same as the original born here 85 years ago: An Italian hoagie roll packed with thinly sliced rib eye and Spanish onions, both sautéed on a flat-top grill, sometimes with peppers and mushrooms. Cheese, whether you opt for provolone or the iconic Cheez Whiz—just “Whiz” in local parlance—holds the whole thing together.

I might have mentioned before that I tried to explain Cheez Whiz to someone in France, and while I expect looks of pity and contempt from the French, the utter disbelief and disdain radiating off of his face at what America did to cheese was hot enough to melt Cheez Whiz.

But ask anyone what really makes a cheesesteak and they’ll tell you it’s the roll.

I'm always happy to see someone share my point of view that bread is the only food and everything else is a condiment.

“You’re always hearing about how one particular cheesesteak place is the best, even though I probably make more in one day than they sell in a year,” says Olivieri. “But cheesesteak joints are like opinions. They’re everywhere, and all valid.”

If we didn't all have different tastes, there wouldn't be such an enormous selection of food and beverage to choose from. Pizza places would only serve one pie. Breweries would only make one beer. Boring. I'm sure it's fun to argue about which cheesesteak place is the best, though.

And opinions are like assholes: everyone has one, and most of them stink.

Only Pat’s can claim to be the original. The cheesesteak, Olivieri tells me, was invented in 1930 on this very corner by his grandfather Harry and his great-uncle Pat Olivieri.

Maybe. Maybe not. It's good marketing, though.

The duo worked as hot dog vendors in an open-air Italian market, and when times were good they would buy beef and fry it up with onions for their lunch. One day, a taxi driver asked if he could buy the sandwich instead of a hot dog. Pat offered to split it. The driver, smitten, advised the pair to sell them.

Hence the hot dog connection. One might ask, "but why didn't they just eat a hot dog?" Well, I know if I sold hot dogs all day, the last thing I'd want for a snack is a hot dog. This doesn't mean I believe the story, only that it makes it more plausible.

It wasn’t until World War II, however, that the cheesesteak became a Philly emblem. A true showman, Pat started a rumor in the days of WWII rationing that his sandwiches contained horse meat, then, in mock outrage, offered a $10,000 reward for someone to prove it.

Now, that? That is brilliant marketing. Probably couldn't be done today, though, not with DNA sequencing as ubiquitous as it is. And DNA tests sometimes produce false positives: "See? This report shows it truly is horse meat! Gimme my money." The marketing campaign would backfire, and you'd go bankrupt and get beaten to death by horse lovers.

While places like Pat’s continue to churn out the classics, the current cheesesteak scene reflects the city’s changing dining landscape. HipCityVeg makes a respectable vegan version. The cheesesteak even gets an haute touch at Barclay Prime, a swanky steak house, where Wagyu beef is tucked into a sesame roll and piled with foie gras and truffled cheese.

There is no food so iconic that someone hasn't come up with a gourmet version designed to better separate you from your money.

I should note that I never did go to the Philly cheesesteak place in Belgium. Not because I didn't trust it, but because I wanted to experience things we don't get in the US. If anything, I had some fear that the cheesesteak there would be so good that I could no longer eat the ones here, and that would be a real shame.
March 25, 2025 at 9:51am
#1085973
They say ignorance is bliss. I tend to disagree, but there are some things I feel like I was better off not knowing. Like the subject of this Epicurious article:

    The Murky, Salty Mystery of Worcestershire Sauce
The peppery sauce may be wildly popular, but its ingredient list and origin story are shrouded in secrecy.


It's also one of those things where people who know how to pronounce it inevitably look down their noses with disdain on those who don't. Kind of like quinoa or pho.

Culinarily ubiquitous and a perpetual tongue-twister, Worcestershire sauce is one of the great food enigmas of the past two centuries.

Yeah, I know I just did a food article a couple of days ago. Random is random.

Inky brown, sweet and salty, funky and fishy, peppery and piquant, the sauce’s exact ingredient list was kept secret ever since it was first sold in Worcester, England, in the mid-19th century.

There are at least two reasons to keep a recipe secret: to protect the business of making it, or because if you revealed it, people would be disgusted.

This might be a case of "both."

Nowadays, an ingredient list is mandated on most edible items, but one can hide a lot of nastiness under the cover of "natural and/or artificial flavors."

The mystery that originally shrouded Worcestershire sauce has continued to propel its popularity around the world.

Is it really the mystery, though? Or is it that it simply tastes good and helps bring out other flavors in food? Or, ooh, I know: it was the marketing.

In fact, I’m willing to bet you’ve got a bottle of the perennially popular stuff tucked into a corner of your kitchen cupboards at this very moment—perhaps purchased for a platter of deviled eggs, a weeknight meatloaf, or a classic Bloody Mary.

It's in the fridge, but yeah. Also, classic Bloody Marys are lame. Yes, I put Worcestershire sauce in mine. But the base is V-8, not tomato juice or Bloody Mary mix, which is apparently tomato juice with a few grains of seasoning.

The recipe for the original version, developed and sold by Lea & Perrins in the 1830s, remained a closely guarded secret until 2009, when the daughter of Brian Keough, a former Lea & Perrins accountant, disclosed that her father had purportedly discovered an ingredient list in a factory trash pile. That recipe called for water, cloves, salt, sugar, soy, fish sauce, vinegar, tamarind, and pickles, among other ingredients.

It's the "among other ingredients" that still worries me.

Incidentally, I don't believe the "found in a trash pile" story for one second. I could change my mind on that, but it smells like corporate myth-making to me.

Incidentally, "fish sauce" has always confused me. Is it sauce that's made from fish, or is it sauce for fish? Or both? I think one is supposed to just know these things, like how we know that olive oil is made from olives, but baby oil is made for babies.

But Worcestershire’s closest condiment cousin is probably garum, a fish sauce that was integral to the kitchens of antiquity. Made from the fermented and salted innards of oily fish like anchovies and mackerel, this umami-rich potion was used on its own as a table sauce and blended with other ingredients—such as wine, black pepper, honey—to create various dressings for meat, fish, and vegetables.

This is why my answer to the perpetual ice-breaker question of "if you could travel to the past or the future, which would you choose?" is "neither, but we know the past was disgusting, so if I had to choose, it'd be the future."

Anyway, the article goes into the sauce's history for a while, then:

Many of these references, and countless others, were compiled by William Shurtleff and Akiko Aoyagi in their History of Worcestershire Sauce from 2012.

Aha! This is a book ad, after all!

Now owned by Kraft Heinz, Lea & Perrins Worcestershire sauce still dominates American supermarket shelves—but just as in the mid-19th century, alternative versions proliferate.

Of course a giant conglomerate produces it now. One wonders what cost-cutting measures they inevitably took to make Worcestershire sauce as bland and uniform as pretty much everything else these days.

They probably kept the disgusting parts, though. Those tend to be cheap.
March 24, 2025 at 11:17am
#1085928
Today's article is old, ancient, even decrepit by internet standards. Nautilus dates it as 2013. But human nature hasn't changed much in 12 years, so here it is.

    Why We Keep Playing the Lottery
Blind to the mathematical odds, we fall to the marketing gods.


To grasp how unlikely it was for Gloria C. MacKenzie, an 84-year-old Florida widow, to have won the $590 million Powerball lottery in May...

I know you're thinking it. Okay, I was thinking it, and now you will be too: is she still kicking? The answer is no; she died four years ago, at the age of 92. That's a decent run; my dad made it that far. Well, there was the lawsuit she filed against her son for mismanaging the finances, but family issues are gonna issue whether you've got money or not; it's just that, with money, you get to hire the best lawyers and make your family dispute public.

The news likes to latch on to reasons why winning the lottery won't make your life better, maybe because it fits the whole Puritan "you gotta work for it" mentality, but the truth is more complicated: some jackpot winners do indeed crash and burn, and some go on to lives of happiness and contentment, but most people just go on being people, and people's lives have their ups and downs.

To continue from the article:

...Robert Williams, a professor of health sciences at the University of Lethbridge in Alberta, offers this scenario: head down to your local convenience store, slap $2 on the counter, and fill out a six-numbered Powerball ticket. It will take you about 10 seconds. To get your chance of winning down to a coin toss, or 50 percent, you will need to spend 12 hours a day, every day, filling out tickets for the next 55 years.

Honestly, I didn't check the math, but that sounds about right. There was a thing in, I think, Texas recently, where a group managed to spend considerably less than 10 seconds on each combination, and basically brute-force hacked the lottery. That can work, mathematically, when the jackpot is high enough and the lottery commission, or whatever, doesn't have safeguards in place. I think they do, now.
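
For what it's worth, a quick back-of-the-envelope check supports the figure. Here's a minimal sketch (in Python, just for illustration), assuming the article's rough odds of 1 in 175 million, 10 seconds per ticket, a 12-hour day, and that every ticket you fill out covers a different combination, so a 50 percent chance means covering half of them:

# Quick sanity check of the "55 years" claim, under the assumptions above.

ODDS = 175_000_000               # approximate number of combinations, per the article
SECONDS_PER_TICKET = 10
SECONDS_PER_DAY = 12 * 60 * 60   # a 12-hour shift of filling out tickets

tickets_needed = ODDS / 2                            # half of all combinations
total_seconds = tickets_needed * SECONDS_PER_TICKET
years = total_seconds / SECONDS_PER_DAY / 365.25

print(f"About {years:.0f} years")   # prints: About 55 years

(If you instead assume random, independent picks that can repeat combinations, you need closer to 121 million tickets, and the figure stretches to roughly 77 years. Either way, don't quit your day job.)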

Williams, who studies lotteries, could have simply said the odds of winning the $590 million jackpot were 1 in 175 million. But that wouldn’t register. “People just aren’t able to grasp 1 in 175 million,” Williams says.

This is not a slight on "people," any more than if you said "people just aren't able to fly without mechanical assistance." He offers up an evolutionary psychology explanation, which always turns me right off, but whatever the reason, very big or very small numbers just don't register on us emotionally. Also, very, very few people really understand probability or statistics, even when the odds can be counted on one's fingers and toes.

It may seem easy to understand why we keep playing. As one trademarked lottery slogan goes, “Hey, you never know.” Somebody has to win.

That's not strictly true when you're talking about lotteries with progressive jackpots. The whole reason they're "progressive" is that no one won the first round.

But to really understand why hundreds of millions of people play a game they will never win, a game with serious social consequences, you have to suspend logic and consider it through an alternate set of rules—rules written by neuroscientists, social psychologists, and economists.

Are we really suspending logic, though? Or are we looking at it logically from different perspectives?

The bulk of the article indeed looks at it from a different perspective: that of marketing, which combines psychology, economics, probability, intuition, aesthetics, and probably a few other disciplines I'm forgetting. I don't need to quote much else, but to summarize, the gist of it is that the lottery sells a dream. What you're paying for is intangible: hope, wishes, what-ifs. That's not necessarily bad, even though one could argue that we should be experiencing these intangibles without having to exchange something tangible (money) for them. For instance, it can provide some clarity on who and what really matters to us in life.

To be clear, though I know I've written similar things before, I'm not ragging on anyone who plays a lottery. That would be hypocritical of me; I haven't messed with a lottery in many years, but I do occasionally gamble. I do it for entertainment, and to me, a lottery is just not nearly as fun as blackjack or even shiny blinky noisy slot machines.

Other people do other things for entertainment. Someone might spend hundreds of dollars on Taylor Swift tickets or whatever, but they don't get judged as much as a gambler who spends the same amount in Vegas; they're just doing what they like and, at the end of the night, all they have is memories and maybe a T-shirt.

It's when you spend more than you can afford that the problem comes in, but that's a problem no matter what you're spending on.

Still, it's worth reading articles like the one here, I think, because it delves into that psychology, and maybe helps understand people's reasons. Those reasons are, as I noted, not entirely logical.

As remote as the odds of winning a lottery can be, there are odds even more remote. Practically no one blasts Rowling for being a billionaire (they blast her for other stuff, but that's irrelevant here), probably because the perception is that she "earned" it. She wrote some of the most popular books in history and, more importantly for the money angle, had them made into popular movies. But she happened to have the right idea at the right time and the capacity to execute it, the chances of which are roughly the same as those of winning a lottery. The only difference is she did more work than just filling out ovals on a slip of paper.

I know, intellectually and emotionally, that I have a greater chance of winning a lottery jackpot than of hitting it that big as a writer. And yet, I still write, and I don't play the lottery. Other people have different talents and opportunities, and some have none, so the lottery may be their only recourse.

It's just important, I think, to be aware of how we can be manipulated by marketing tactics. And by our own internalized thoughts about who deserves what.
March 23, 2025 at 10:01am
#1085877
Yesterday, we had multiple scientists with the same name. Today, we have multiple plants with the same name. Coincidence? Well, yes. From Atlas Obscura:

    A Guide to the Peppers of the World
Which came first: the pepper, or the pepper?


You know, in case you were wondering, as I did for most of my life, how the ubiquitous ground black pepper got the same name as the bell and chili peppers. And why Dr Pepper seemed to contain none of the above, but that's outside the scope of today's article.

As an undergrad Classics major, I first heard of long pepper as something the Ancient Romans ate.

I wasn't a Classics major, but I took Latin classes in high school, whereupon I found out that the Ancient Romans ate all kinds of weird (to us) stuff.

Scientifically known as Piper longum, this elongated cousin of black pepper tastes more complex, but carries a similar zing thanks to piperine, a different compound from the capsaicin that gives chilies their heat.

It's generally possible to know which of the peppers someone's talking about by context, so it's not all that confusing, at least to me. What is sometimes confusing is using the word "heat" to describe the spiciness of chili peppers and their relatives.

The English word pepper traces back through Latin to the Sanskrit pippali, which specifically meant “long pepper” (and still does in Hindi and Urdu).

I do like etymology, but I didn't double-check this assertion.

Europeans once loved long pepper so much that they called all “hot” spices by its name: First its relatives, black and cubeb peppers, then unrelated plants like Mexican chili.

Two other mysteries haunted me for a very long time: why (ground black) pepper is such a staple on American tables, where it shares pride of place with that other seasoning that Romans were obsessed with (salt). And why it's so goddamned hard to get any pepper out of most of its shakers.

The article goes on to list many varieties of pepper, though not peppers, and the answer to that first mystery is at least partially solved (it is, of course, linked to colonialism).

As for the other mystery, well, probably, it's because the holes in a pepper shaker are too small compared to the flakes of ground pepper inside. But that's not really an answer; it just kicks the question down the road: Why are the holes in a pepper shaker too small compared to the flakes inside?

The only thing I can come up with is aesthetics: you want the salt and pepper shakers to look similar. Making matters worse, a salt shaker has historically had more holes than a pepper shaker, which sometimes has only one. I can only conclude that the people who come up with these table etiquette things didn't really expect anyone to use the pepper shaker; it's just there to provide balance for the salt shaker.

Incidentally, thanks to budget constraints, the original Star Trek series used what were then futuristic, sleek-looking salt and pepper shakers for some of the medical tools. That's funny enough, but what was the very first Star Trek episode launched upon an unsuspecting 1966 public? The Man Trap, which featured an alien who craved salt.

Still missing from that show after nearly 60 years and hundreds of stories told: aliens who crave pepper. Probably they all died off-screen trying to get it out of the goddamned shaker.
March 22, 2025 at 10:28am
#1085834
I have to admit, I was confused as hell until this article, from Nautilus, made me realize that there are two of them.

    The Sean Carrolls Explain the Universe
Why are we here? Is there life on other planets? The renowned scientists who share a name share their answers to life’s big questions.


Why are we here? Because we're here. -Rush

This is the tale of two Sean Carrolls. Nautilus brought the two scientists together for the fun reason that they share a name.

It's a pretty good reason, from an entertainment standpoint.

The Sean Carrolls bring their perspectives from physics and evolutionary biology to bear on timeless questions about the origin of life, the possibilities of life on other planets, the tension between science and religion, the fate of Earth, and how they first got enchanted by science as kids.

And this is the answer to the question you've been dying to ask me: who the hell are Sean Carroll? You might not have heard of them, but I read about physics and evolutionary biology for fun. The physicist/cosmologist one does videos that I've seen, which is why seeing the biologist one's name in print confused me.

The article does a brief bio of each of them, including plugs for their books. Then, they yap.

Physicist: So, what happened to make you do biology? It’s so messy and hard.

Hey, at least biologists study things on Earth. Physicists look at shit way far away, or way smaller than biologists do.

Biologist: Catching snakes and salamanders and frogs was something I did nearly every day. I thought “maybe I’ll be a herpetologist.” How did you stay on your path? How old were you when you knew you were interested?

Physicist: I was 10. I was reading books about black holes and the Big Bang. I was never a go-out-there-and-touch-things-in-nature kind of guy.

For contrast, I was both, and I didn't go into pure science. It was only later that I developed a strong aversion to the outdoors, after enough things out there tried to munch, poop, and/or slime on me.

Physicist: I remember very vividly my high school teacher asking us all what we wanted to do, and I said I wanted to be a theoretical astrophysicist. He was so aghast that he wrote the words “theoretical astrophysicist” on the board, just to show everyone how weird that was.

Ah, yes, this was back when high school teachers could spell "theoretical astrophysicist" without a spell-checker.

There follows a stretch where the two Carrolls discuss religion and its relationship to science. This is, to me, an interesting bit, but it doesn't lend itself well to quote-response here. It's almost inevitable that religion should come up in discussions like this, because of some of the fraught history that scientific inquiry has had with dogmatic belief structures.

Fortunately, the discussion stays civil, not just to each other, but to those readers who might hold different views.

Then they start talking about life: here on Earth and the possibility of its existence elsewhere, and, considering the number of times I've ranted on the topic, I figured one more won't hurt.

Physicist: If you do replay it from 5 billion years ago, where are the bottlenecks? Do you think the initial existence of life was difficult? Do you think that multicellular life was difficult?

Those are some of the base questions to ask when estimating the possibility of ET life, which is something people love to speculate about. I have my own ideas on the subject, but they could be wrong. Even Biologist Carroll could be wrong, but he at least has the background to make more informed guesses.

Biologist: My sense, and I think it’s shared by a pretty good part of the biological community, is that simple, unicellular, microbial life might be fairly prevalent in the universe.

Oh, but he's probably right, because that agrees with my preconceptions. (That was meant as humor.)

Physicist: That’s telling the story from a slightly parochial perspective. We know that Earth did it in the last 4 billion years of evolution. A lot of contingent events needed for it to happen that way. Are there completely different ways to get big animals with big brains?

Speaking of parochial, there is no direct correlation between brain size and smarts; that's a human perspective. I think he's obliquely asking, here, about the possibility of the kind of life that sends out deliberate EM signals and builds spaceships, the way humans do. As I've noted before, repeatedly and ad nauseam, the existence of life is no guarantee that a technologically capable species will eventually arise—though we know it can happen because, well, here we are, communicating via technology.

Biologist: If you gave me 100 planets with life, I would love that sample. I’m going to think large life like ours might be relatively scarce.

And that's the kind of data we need to answer that question with any higher degree of certainty. Right now, we only have one planet known to harbor life.

Physicist: Well, I do know enough to say that there’s a lot of planets out there. Back when you and I were graduate students, we had the solar system. Now we have thousands of exoplanet systems. I’m not at all surprised. I think there was some kind of weird PR thing where people were acting surprised that we saw all these planets. I was just expecting most stars to have planets around them.

If there was a "weird PR thing," it was probably aimed at the general public, who are only a few generations away from "stars are holes in the sky that let Heaven's light in," or whatever guesses their ancestors came up with.

Finding exoplanets, then, was no surprise to me, either—but it's one thing to say "other stars probably have planetary systems" and quite another to say "other stars definitely have planetary systems." And we can say the latter now, which is a testament to how clever we can be when we try.

Physicist: Many of them seem to be perfectly habitable by the little information we have.

And that's where you lose me, Sean. Sure, some of them are in the theoretical habitable zone of their respective stars, where stellar radiation is hot enough for liquid water and cold enough to not melt rocks. This does not mean that they are habitable. Mars and Venus are in the Sun's habitable zone. Europa and Enceladus (moons of the outer solar system) are not, and yet other factors contribute to them being possible Petri dishes.
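
To put a number on what "habitable zone" even means: at its crudest, it's just a flux calculation, which is part of why it says so little about actual habitability. Here's a minimal sketch (Python, illustrative only; the star labels and luminosities are made-up round numbers, and real habitable-zone models factor in atmosphere, albedo, and more):

# Rough illustration: a planet at distance d from a star of luminosity L
# receives flux L / (4 * pi * d^2), so the distance at which it gets the same
# flux Earth gets from the Sun scales as the square root of the luminosity.

import math

def earth_equivalent_distance_au(luminosity_in_suns: float) -> float:
    """Distance (in AU) at which a planet receives Earth's insolation."""
    return math.sqrt(luminosity_in_suns)

for label, lum in [("dim red dwarf", 0.02), ("Sun-like star", 1.0), ("brighter star", 5.0)]:
    print(f"{label}: Earth-equivalent flux at ~{earth_equivalent_distance_au(lum):.2f} AU")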

I say this not to assert that I know more than either Carroll when it comes to science. I do not. I learn from them. It's a communication thing. You tell the average person that a planet is "habitable," and they jump to green trees and an atmosphere and cute, half-dressed, blue-skinned Zoe-Saldana-looking aliens (okay, maybe that last part's just me), but that's misleading.

Physicist: I think the simplest thing is that there’s lots of life in the universe, and it’s all monocellular, unicellular. We’re weird in that we’re not.

I tend to agree with that sentiment, for whatever it's worth. Occam's Razor and all that. Doesn't mean it's right.

Biologist: Visit the Earth anytime in the first 4 billion years or so and everything’s small. Everything’s essentially microbial. That was the state for the longest time. Animals and redwoods are weird. They’re the unusual things.

I'm a bit disappointed that Biologist Carroll didn't point out the role of eukaryotes, which, as far as I've been led to believe, make up all macroscopic life (and some very important microscopic life, like beer yeast) on our planet. That kind of organization, and I mean the word in its most literal sense, is what might eventually lead to rockets and radios and maybe antigravity or warp drives. Other readings have led me to the conclusion that this is probably an even less likely occurrence than that of life starting from chemicals in the first place.

But, you know. Whatever. It's possible to imagine lots of stuff, like simple life organizing into multicellular aliens and developing space travel. That doesn't mean it happens. Want to know the possible? Imagine the impossible.

I'm not sure, but I think it was Physicist Carroll who convinced me of that, many years ago. What we have to avoid is the trap of believing everything we think.

I'll wrap this up with something relevant to writing. Physicist had just spun a yarn, probably mostly true, from the dawn of modern science. Then:

Biologist: You told a story right there—that is the tool of engagement.

Physicist: It’s the single most effective one, if you had to pick one.

People respond to stories more than to dry facts. It's just part of who we are. The difficulty is that stories can fit almost any narrative, both factual and false. I lean to the belief that we have a responsibility to truth, even when we're writing fiction.

Well, except for jokes. I never let the facts get in the way of a joke.
March 21, 2025 at 11:05am
#1085787
The moon's close to last quarter now (which really looks like a half, not a quarter), but my random number generator doesn't always produce coincidences. From Mental Floss:

    The History of How Each Month’s Full Moon Got its Name
Each month’s full moon marks the changing seasons.


I've harped on this sort of thing before, I know. It's the closest thing I have to a crusade, apart from visiting all the breweries. But hey... new blog, new rants, right? I've half a moonmind to consolidate all my arguments on this subject into one item, including counter-counterarguments.

My basic premise can be summed up as such: We need to stop associating full moon names with Gregorian calendar months, and return them to a system based on other, verifiable astronomical events such as equinoxes and solstices.

The moon is an integral part of the sky above us. Over time, each month’s full moon has acquired a unique name of its own:

(There follows a table, which is too hard to reproduce here, listing full moon names and their Gregorian calendar dates for this year.)

I get that definitions change over time. But some changes, I think, need to be walked back, and this full-moon-to-calendar-month definition is one of them.

Many of the names we use today come from Native American traditions, though some originate in Europe as well.

While this is true enough, what's missing is that these naming traditions preceded the adoption of the Julian/Gregorian calendar. They might not have held the English names we use now, but I'll accept that those are close translations. Translating the full moon name does not, however, require changing the seasonal definitions to the arbitrary calendar month definitions.

The monikers correspond with the seasons in the Northern Hemisphere.

And there's a hint that it wasn't always J/G calendar based.

Read on for more information about the history behind the name of each month’s full moon.

Now, I'm not going to take the time to verify all the history behind the names they present. It's Mental Floss, so based on other articles from that source which I've picked apart, I assume they got some stuff right and some stuff wrong. I feel like that's important overall, but not very relevant to the point I'm trying to make today; one can change the name of a particular full moon, or confirm or dispute the history behind it, without changing the basic premise.

So instead, I'll just list the same full moon names they do, with my preferred definition of when it occurs. Calendars can start at any point, but I'll stick as close as I can to the familiar Gregorian, just so we have reference points. So we'll start with the northern hemisphere winter solstice, closest to January 1.

Wolf Moon: First full moon after winter solstice. This has an approximately 2/3 chance of occurring in January, as described, but it could happen in late December.

Snow Moon: Second full moon after winter solstice. Weather tends to lag astronomical seasons, and the period of late January sees the lowest average high and low temperatures in the northern hemisphere. February sucks, too. The Snow Moon can occur in late January to just past mid-February.

Worm Moon: Last full moon before spring equinox. Could happen late February through around March 20/21 (today is the 21st, but yesterday was the equinox).

Pink Moon: First full moon after spring equinox. March or April.

Flower Moon: Second full moon after spring equinox. April or May.

Strawberry Moon: Last full moon before summer solstice. May or June.

Buck Moon: First full moon after summer solstice. (You get the pattern now, I hope.)

Sturgeon Moon: Second full moon after summer solstice.

Corn Moon / Harvest Moon: Last full moon before autumn equinox.

Hunter's Moon: First full moon after autumn equinox.

Beaver Moon: Second full moon after autumn equinox. (This is the one that can, rarely, fall on or close to Halloween.)

Cold Moon: Last full moon before winter solstice.

And that brings us full circle.

Since the lunar month doesn't divide evenly into the span between solstice and equinox (which I refer to as an "astronomical season"), on rare occasions (every few years) a single astronomical season contains four full moons. That's where the Blue Moon comes in: it's like a leap moon, the third full moon in an astronomical season that has four. Under this definition, it will always fall in February, May, August, or November, never in any other calendar month.
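
To make the bookkeeping concrete, here's a minimal sketch of how the scheme above could be mechanized (Python, illustrative only). It assumes you already have the full-moon and solstice/equinox instants from some ephemeris source; the function and data structures are hypothetical, not any existing library's API:

# Seasonal full-moon naming, per the scheme described above.

from datetime import datetime

# (first, middle, last) full-moon names for each astronomical season, starting
# with the season that begins at the northern-hemisphere winter solstice.
SEASON_NAMES = [
    ("Wolf", "Snow", "Worm"),              # winter solstice -> spring equinox
    ("Pink", "Flower", "Strawberry"),      # spring equinox -> summer solstice
    ("Buck", "Sturgeon", "Corn/Harvest"),  # summer solstice -> autumn equinox
    ("Hunter's", "Beaver", "Cold"),        # autumn equinox -> winter solstice
]

def name_full_moons(full_moons: list[datetime],
                    season_starts: list[datetime]) -> dict[datetime, str]:
    """Name each full moon by its position within an astronomical season.

    season_starts must be sorted and begin with a winter solstice, so that
    season_starts[i] to season_starts[i+1] spans one astronomical season.
    """
    names = {}
    for i in range(len(season_starts) - 1):
        start, end = season_starts[i], season_starts[i + 1]
        in_season = [fm for fm in full_moons if start <= fm < end]
        first, middle, last = SEASON_NAMES[i % 4]
        for j, fm in enumerate(in_season):
            if j == 0:
                names[fm] = f"{first} Moon"
            elif j == len(in_season) - 1:
                names[fm] = f"{last} Moon"
            elif j == 2 and len(in_season) == 4:
                names[fm] = "Blue Moon"    # third of four full moons in a season
            else:
                names[fm] = f"{middle} Moon"
    return names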

Is this system harder to follow than the simplified full-moon-to-calendar-month system? Yeah, probably. Is it worth it? I think so. As I said, these cultures who initiated the tradition didn't have the Roman calendars. Many cultures these days use lunar or lunisolar calendars, notably the religious calendars for Jews and Muslims, and some Eastern timekeeping systems. These weren't just full moons; these were months, one way they told time. While the Gregorian calendar is a useful tool for coordination, much as English is a useful language for international communication, it's decidedly Eurocentric and tailored for one specific religion. It is also pretty damn good at calculating solar returns, and it's baked into the fabric of the internet. But it's not culturally universal; solstices and equinoxes are, even if they mean different things in the Northern and Southern hemispheres.

I should note, as an aside, that the full-moon naming conventions are not just Northern Hemisphere in origin, but generally from the upper latitudes of said hemisphere. Seasons have somewhat different effects in Norway than in, say, Libya. The names themselves reflect those environmental conditions that prevail in northern latitudes: long, snowy winters and mild-to-moderate summers. But as I said, it doesn't matter what name you use, as long as we know we're talking about the same full moon.

The system I'm describing here works irrespective of the calendar used to track it. A Beaver Moon (or whatever you want to call it) will always happen at (close enough to) the same moment for everyone, though local times may vary due to time zones. Same with a Blue Moon; under the mistaken "calendar month" definition of that, sometimes, you get a full moon that occurs at the end of one month for some areas, and the beginning of the next month for others. I have a rather long explanation for that, but I'll have to save it for my future dissertation.

Now, one might say that none of this has any real import. I can understand that. It doesn't affect science, technology, or, in an industrialized world, even its traditional purpose of agricultural planning. But I feel that returning to the seasonal definitions might connect us better to ancestral folklore that comes from sources other than the ubiquitous Mediterranean ones (Egyptian, Levantine, Turkish, Greek, Roman, etc.).

The stories we tell each other have value, and I think the calendar is one such story. Let's take this one back to its roots.
March 20, 2025 at 7:52am
#1085726
“Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should.”
         -Ian Malcolm

"Your philosophers were so preoccupied with whether or not they should, they didn’t even stop to see if they could."
         -Nick Pether

It usually amuses me, but sometimes enrages me, when I find people claiming to uncover ethical issues that science fiction authors have been tackling for decades, or even centuries. Like with this one from MIT Press Reader, which does both:

    The Thorny Ethics of Planetary Engineering
Whenever someone waxes poetic about terraforming alien worlds, it’s worth taking a moment to consider the ethical implications of the proposal.


To be clear, I find these discussions valid and valuable. What enrages me is the implicit snubbing of the most important literary genre, science fiction. It's the most important because it does tackle those ethical issues, but in a way that includes things like holodecks or death rays, so it's less boring.

Exploration, habitation, and resource extraction all carry a risk of inflicting environmental damage in space, just as they do here on Earth.

And right away, I run into an issue: when the definition of "environmental" expands to incorporate all of space, and the goal seems to be to stop environmental damage altogether, we're left unable to do anything, even to asteroids. Would it suck to fill a moon crater with a giant factory, or fill in the Valles Marineris? From a sentimental point of view, yes. But it's not like we'd be affecting living creatures or their habitats. Well, probably. Almost certainly in the former case; research still needs to be done for the latter.

From a certain point of view, we've already done irreparable damage to space just by lofting shit-tons of junk into Low Earth Orbit and just leaving it there to whirl around and cause hazards to astrogation. Not to mention the annoyance to Earthbound astronomers trying to see through all the shiny debris.

So it's not "just as they do here on Earth." Here, people understandably protest things like strip-mining, but not necessarily because there was a mountain that's not there anymore, but because of damage to the ecosystem. Having no ecosystem simplifies the ethical debate.

But some futurists and space settlement enthusiasts have proposed an even more drastic alteration of the space environment: the transformation of the surface of a planet or moon into a more Earth-like environment via a process known as terraforming.

"Proposed" is a strong word. I might have chosen "envisioned," because "proposed" carries the implication that we have the technology and the will to do something. At this point, we can't even terraform Earth, let alone other planets.

But, nitpicking aside, okay. The time to discuss ethical issues is now, before we have the ability; not afterwards, when it's already been done. Still, like I said, science fiction authors have tackled this issue for a very long time.

For example, in 1961, Carl Sagan speculated on the possibility of the “microbiological re-engineering” of Venus by introducing blue-green algae into its atmosphere.

Sagan was, technically, a science fiction author, though he's better known for his factual communications. So my point stands. Incidentally, I don't think that particular method would work; we know a hell of a lot more about Venus now than we did in 1961.

Sagan later turned his attention to the potential for “re-engineering” Mars, a planet now considered to be one of our best candidates for successful terraformation.

I don't think any world has received more attention in SF than Mars, except for Earth itself. The idea of terraforming Mars goes way back; one could argue that Edgar Rice Burroughs' Mars novels, published well over 100 years ago, featured a kind of planetary engineering (though instituted by the native Martians, not Terrans; plus, the series was more fantasy than SF).

Terraformation is the ultimate example of long-term planning, as even optimistic estimates predict that it would take centuries of effort and patience before a human could walk unprotected on the surface of Mars.

Which is one reason, apart from technological limitations, that this remains in the realm of science fiction and philosophical debate: humans generally don't think past the next rent due date, fiscal quarter, or election, let alone multiple lifetimes ahead. See also: the SF concept of generational spaceships.

Whenever someone waxes poetic about humankind bending the universe to our will, it’s worth taking a moment to consider the ethical implications of the proposal.

I cannot, however, argue with that.

One major consideration about terraforming is that the process could damage or even wipe out any existing life on the planet being terraformed.

While I agree, it goes further than that: it could destroy evidence that life once existed there (this is especially a concern for Mars right now). On the flip side, though, some major archaeological discoveries here on Earth might never have been made if it hadn't been for land development and construction unearthing them.

If we allow planetary engineering to race ahead of astrobiological research, we could miss our opportunity to make what would be the most important scientific discovery in human history: the discovery of life that evolved beyond our planet.

Again, I agree with the sentiment, but from what I've learned of those fields, there's no race: we don't do planetary engineering, yet. If we did, we wouldn't be arguing about climate change here, but doing something about it. Meanwhile, probes have been working on detecting current or former life, on Mars and within some of the icy moons in the outer solar system.

But suppose we do discover evidence of existing microbial life on a planet like Mars. Should this disqualify Mars as a target for terraforming? Should we avoid settling on Mars at all?

I'm not going to weigh in on that. But it should certainly be discussed. Wait, what's that? Science fiction authors had already started the conversation before philosophers ever thought about it, just like with AI and cloning? You don't say.

It may seem premature to debate the ethics of using a technology that does not yet exist to indirectly destroy an ecosystem that may not exist at all.

Or, for that matter, it may seem premature to debate the ethics of doing something that we might not even be able to do because we failed at terraforming our own planet.

Nevertheless, I find questions like these valuable—with or without definitive answers.
March 19, 2025 at 8:18am
#1085682
This will be my last entry. Well, the last entry of the current astronomical winter, anyway; the equinox will occur at 5:01am EDT tomorrow. Okay, sorry for the misleading first sentence; we're still a couple of weeks away from April Fools' Day.

Speaking of fools, this article from The Conversation was published in late 2022, but I still see the subject discussed occasionally.



What I haven't seen is anyone actually walking backwards. Thinking backwards, sure. Walking? Not so much. Then again, I don't get out much.

Walking doesn’t require any special equipment or gym memberships, and best of all, it’s completely free.

That's not entirely true. Decent walking shoes aren't cheap.

But what happens if we stop walking on auto-pilot and start challenging our brains and bodies by walking backwards? Not only does this change of direction demand more of our attention, but it may also bring additional health benefits.

What happens is: you've become a victim of social engineering, like a while back when people tried to convince us that we're supposed to open bananas from the other end. "Let's see if we can get people to do something dumb-looking. Then we can monetize the resulting videos!"

Remaining upright requires coordination between our visual, vestibular (sensations linked to movements such as twisting, spinning or moving fast) and proprioceptive (awareness of where our bodies are in space) systems.

Yes, that's why lying down is far superior.

When we walk backwards, it takes longer for our brains to process the extra demands of coordinating these systems.

That sounds plausible, and I did glance at the studies linked in the article. That doesn't change the fact that it makes you look like an idiot.

Barely touched on in the article: how to cross busy streets, deal with curbs, avoid other people, and not trip over sidewalk cracks. (I have this visual in my head of two backwards-walkers colliding, sending them both off in their respective forwards directions.) To me, the hazards outweigh any potential health benefits, which can probably be approximated by walking more intentionally, carrying a heavy backpack, or wearing ankle weights. And I'm not afraid of looking like a fool in public (as anyone who has seen my wardrobe can attest), but this is a bridge too far.

In other words, sometimes, the science can be sound, as far as it goes, but it doesn't always take every factor into account.
March 18, 2025 at 8:23am
#1085624
Did you know that no Ouija board ever spelled out the word "gullible?" A rather long treatise on the soi-disant spirit-communicator from Smithsonian:

    The Ouija Board Can’t Connect Us to Paranormal Forces—but It Can Tell Us a Lot About Psychology, Grief and Uncertainty
The game was born from Americans’ obsession with Spiritualism in the 19th century. Since then, it’s functioned as a reflection of their deep-seated beliefs and anxieties for more than a century


You know how I knew Ouija boards weren't what they were advertised to be, when I was a kid? Two things: One, it uses standard English, when everyone knows that spiritualist devices have to be in intrinsically arcane languages like Hebrew, Sanskrit or Latin. Two, it was available in places like Toys R Us, alongside Monopoly and Risk; if kids could actually speak with spirits from beyond, it would have been locked away in some secret vault, only to be discovered later by some plucky young archaeologist who then had to spend the rest of the movie containing the horrors she had released, at great personal sacrifice to her wardrobe.

And yet, there was something there.

In the late 1800s, advertisements for a new paranormal product started appearing in papers: “Ouija, or, the Wonderful Talking Board,” boomed a Pittsburgh toy and novelty shop, describing a magical device that answered questions “concerning the past, present and future with marvelous accuracy” and provided a “link which unites the known with the unknown, the material with the immaterial.”

Also, if it did work as advertised, it would have made detectives' jobs that much easier. "Who murdered you?" "J-O-H-N-S-M-I-T-H." It would at the very least narrow down the list of possible suspects.

Not to mention, with the "future" bit, do you really want to know? "When am I going to die?" "T-O-N-I-G-H-T."

The idea was that two or more people would sit around the board, place their fingertips on the planchette, pose a question, and watch, dumbfounded, as the planchette moved from letter to letter, spelling out the answers seemingly of its own accord.

You want to impress me? Have it move on its own, without fingers, batteries, magnets, or stray gusts of wind.

Are Ouija boards real?

Well, yes, in a sense, they are, in the same way that a porn star's breasts are real: they exist, but they're also a misleading illusion.

Ouija historian Robert Murch has been researching the story of the board since 1992, when he first purchased a copy. At that time, he says, no one really knew anything about its origins, which struck him as odd: “For such an iconic thing that strikes both fear and wonder in American culture, how can no one know where it came from?”

I suppose asking the Ouija board never occurred to him.

As I said, the article is rather long, so I'm skipping over bits like the background of the spiritualist craze in the US in the 19th century, which supposedly helped to birth the board.

When a few men in Baltimore started the Kennard Novelty Company, the first producers of the Ouija board, in the late 19th century, opening the gates of hell was the last thing on their minds. Instead, they were mostly interested in opening Americans’ wallets.

It really doesn't get more American than that: see a market, take advantage of it, rake in the dough.

Contrary to popular belief, “Ouija” is not a combination of the French word for “yes,” oui, and the German equivalent ja. According to Murch, it was Bond’s sister-in-law, Helen Peters (who was, Bond said, a “strong medium”), who supplied the now instantly recognizable handle. When she asked the board what they should call it, the name “Ouija” came through.

I honestly hadn't heard that fauxtymology, but I absolutely get how people would believe it (the board itself is evidence that people will believe anything, given the right circumstances). After all, English is basically a French/German creole that somehow (coughcolonialismcough) spread across the entire world.

Again, skipping over quite a bit here.

Parker Brothers (and later, Hasbro, after acquiring Parker Brothers in 1991) still sold thousands of them, but the reasons people were buying them had changed significantly: Ouija boards were spooky rather than spiritual, with a distinct frisson of danger.

I just quote this bit to point out that it's still being made by a game company. Hasbro also publishes Dungeons and Dragons, whose character arc is pretty much the exact opposite of that of the Ouija board, though much shorter: D&D went from being feared and accused of demonic associations to being acknowledged as a fun pastime and font of creativity; Ouija went the other way.

For whatever it's worth, Hasbro also controls My Little Pony.

As interesting as the history is, I was looking for non-paranormal explanations. As I said above, there's something there; I just figured it had something to do with the subconscious, which can be scary enough without needlessly adding in entities from the Great Beyond.

The boards are not, scientists say, powered by spirits or demons. But they’re still equally fascinating—because they’re powered by us, even when we protest that we’re not doing it, we swear.

This is where I admit that no, I've never actually played with a Ouija board. This is not out of fear or skepticism, but largely disinterest.

Ouija boards work on a principle known to those studying the mind for more than a century: the ideomotor effect. In 1852, physician and physiologist William Carpenter published a report for the Royal Institution of Great Britain examining automatic muscular movements that take place without the conscious will or volition of the individual (think crying in reaction to a sad film, for example).

Astute readers of the article, or even of those few excerpts I provide here, will note that the ideomotor effect report was published decades before the first Ouija board was produced.

Around the same time, chemist and physicist Michael Faraday, intrigued by table-turning, conducted a series of experiments that proved to him (though not to most Spiritualists) that the table’s motion was due to the ideomotor actions of the participants.

Yes, that Faraday. ("Table-turning" was a common spiritualist practice when Faraday was alive, and had nothing directly to do with Ouija. The former might have influenced the invention of the latter, however.)

While Ouija boards can’t give us answers from beyond the veil, we can learn quite a lot from them. Researchers think the board may be a good way to examine how the mind processes information differently on different levels.

And that is why this sort of thing shouldn't be dismissed entirely from a scientific perspective, in my opinion. Same for astrology, tarot, sympathetic magick, cryptid sightings, hauntings, alien abduction, etc.: there's something going on there that might help us understand ourselves, or even the world around us. Research is thin on the ground, though, because almost everyone attracted to it is either a Believer or a Skeptic, neither of which is ideal when doing real science; and also because other scientists tend to mock those attracted to what's called "paranormal."

I'm not saying that Ouija, or any of those other things, is actually doing what it's advertised to be doing; just that it would be a mistake for us to think we know everything.

The article goes into some actual experiments, and then ends with a statement that echoes my own thoughts:

The team has managed to make good on one of the claims of the early Ouija advertisements: The board does offer a link between the known and the unknown. The unknown just happened to be different from what many wanted to believe.
March 17, 2025 at 9:17am
#1085558
Yes, it's St. Patrick's Day. No, I won't be doing anything special. It, like New Year's Eve, Cinco de Mayo, and other drinking holidays, is Amateur Day.

Okay, I might make my green Star Trek-inspired cocktail later, at home, but that's about it. Here it is: "It's Green."

You know what else is green? Most salads. From Atlas Obscura's Gastro Obscura:

    Midcentury America’s Most Scandalous Salad
According to Betty Crocker, Candle Salad was even “better than a real candle.”


Now, look. It's a huge pain in the ass to embed pics from other webpages here. It's far, far easier to implore you to click on the link in the above headline. Because that's the only way you can see a picture (actually several pictures) of this "Candle Salad" in all its proud glory. So, do that. Seriously, go click on it right now. You can even read the article; I'll only comment on a little bit of it here. But definitely look at the pictures.

In 2014, around Thanksgiving, talk-show host Ellen Degeneres showed her audience a photo of a mid-century American dish. “There’s something called a ‘Candle Salad.’ This is real,” she said, while the studio audience howled with laughter. “It is made with banana and pineapple … and mistakes. I tried it once. It was not my thing.”

Say what you will about DeGeneres, but that's comedy gold, right there.

It consists of a lone banana held upright with either a pineapple base or a ring of Jell-O.

Jell-O recipes were everywhere in that era. Very successful marketing.

Personally, I think it could benefit from added kiwi fruits. Or maybe one, split in half lengthwise, and nestled at the base.

There’s a maraschino cherry on top, along with a dribble of whipped cream or mayonnaise down the side. If you use your imagination, it could be said to resemble a candle—but I bet that’s not where your brain went first.

Humans are rather predictable.

According to Aldrich, there was a pragmatic reason why this snack showed up in kid’s cookbooks. “It was a very simple recipe. Children didn’t have to worry about using a knife or burning their hands on the stove,” she says.

Yeah... I'm going to call bullshit on that. There's gotta be a huge number of "very simple recipes" that are suitable for kids of varying ages, and the overwhelming majority of those recipes aren't hentai.

Well, like I said, I'm not going to comment on the whole article, which gets into the history of thing-shaped foods (but not always that "thing"), and even provides a handy recipe so you can troll your family and/or friends yourself.
March 16, 2025 at 8:39am
#1085499
I held on to this article from Polygon, not just because it's about Star Trek specifically, but because it has some insights into writing in general.



Not too long ago, I went on a personal quest to watch every single Star Trek episode and movie, in release order. There's a lot to watch, and too much for me to remember. Lower Decks was full of inside jokes, so, as someone who has lived and breathed Trek my entire life, I've sometimes wondered whether someone without much Trek background would appreciate the show. The article suggests that they might.

Most of the article is in the form of an interview with Mike McMahan, the creator mentioned there in the headline.

The show McMahan was working on was Rick and Morty, which went on to be a massive pop culture sensation. More confident than ever in McMahan’s instincts, Secret Hideout reached out again in 2018, this time to ask him what he wanted to do. McMahan answered with a pitch for an animated sitcom based in the Star Trek universe, a truly wild swing for the typically reverent and cerebral sci-fi franchise.

On the flip side, I have never seen even one single episode of Rick and Morty.

I should emphasize that Star Trek has been no stranger to comedy, however serious and dramatic some of the stories and situations have been. From the beginning, many episodes of TOS included humorous moments. Hell, we wouldn't have the show at all were it not for the vision of funny-lady Lucille Ball, who by then had bought out Arnaz and was running Desilu studios herself, before the studio got bought out by Paramount.

But Lower Decks dialed the comedy to eleven. Somehow, though, it not only maintained some continuity with live-action Trek, but also kept the setup-conflict-resolution style of its more serious cousins.

It could have devolved into a self-parody. But it didn't. And I want to know how to write like that.

As the series comes to a close after five seasons, Polygon caught up with McMahan about how his wacky passion project made its mark on one of American pop culture’s most cherished legacies.

The article is from last year; the series has since wrapped up, by design. Everything ends; it's only a question of whether they leave us hanging or not.

Yeah. It was cool because when I was becoming a writer in TV and writing my own stuff all the time, I was watching Star Trek with my wife, being like, “Man, I wish Star Trek was still around,” because it was in the in-between phase. And I remember being like, “I’m just gonna write Star Trek whether somebody pays me to or not.”

There have been a few in-between phases in Trek history. McMahan was certainly not alone in writing Trek fanfiction. Some of it even became official, if non-canonical, novels. I've read many of them. Some of them suck. Most of them are passable. Some are even excellent. Some SF/Fantasy authors got their start writing Trek fanfiction.

I say this because fanfiction has a crap reputation, but it really shouldn't. Not all fanfiction involves crap writers writing crap porn.

And this part of the interview ties in to my entry from the day before yesterday:

Sure. I mean, luck is usually something that only works in your favor if you’ve done a lot of hard work first, right?

Yes.


I'm still fuzzy on the definition of "hard work," because I don't think most day laborers would consider what writers do to be hard work. Whatever you want to call it, sometimes you have to put in the effort, physical and/or mental, to take advantage of opportunities when they arise. Though I think even possessing the ability to put in that effort is also a matter of luck.

The remainder of the interview, which I won't reproduce here, is relevant to writers, no matter what the genre. Well, maybe not the literary genre, but stories that people actually pay attention to.

Whether the article is any more or less accessible to non-Trek watchers than Lower Decks, I have no idea.

Since Lower Decks ended, there has only been one other Trek installment: Section 31, which was originally slated to be a series but became a TV movie (which, nowadays, is basically just a movie that never made it to a theater but went directly to streaming). Despite the presence of always-awesome Michelle Yeoh as one of the primary characters, the movie was (for me at least) a huge letdown.

But Trek has always had its ups and downs, and I look forward to what's next.
March 15, 2025 at 8:42am
#1085451
All words are made up. Some were made up more recently than others. And some are more made up than others. Here's Mental Floss to not help:

    What’s the Longest Word in English?
Spoiler alert: Despite what you might have heard, it’s not ‘antidisestablishmentarianism.’


I memorized that one long ago, as well as another contender.

If you can find a way to work pneumonoultramicroscopicsilicovolcanoconiosis into a conversation, congratulations!

But not that one.

You’ve just managed to use the longest defined word you’ll find in any dictionary in everyday chatter.

Being in a dictionary just means that someone put it in a dictionary.

The word was coined in the 1930s, probably by the president of the National Puzzlers’ League, “in imitation of polysyllabic medical terms,” according to the Oxford English Dictionary, “but occurring only as an instance of a very long word.”

Like I said, made up. In this case, made up less than 100 years ago. Somehow I doubt it ever entered public use the way 'antidisestablishmentarianism' once did. If anyone ever tried to say it, it would have most likely been in connection with longest-word contests.

Another pretty long word, floccinaucinihilipilification—meaning “the action or habit of estimating as worthless”—was created by mashing together four words in a Latin grammar book that all meant something with little value and adding -fication at the end.

And who doesn't need a -fication? That was the other long word I had memorized, incidentally. It seems appropriate in this context, though, as I consider the competition for longest word to be of little value.

There are even longer words than pneumonoultramicroscopicsilicovolcanoconiosis out there—you just won’t necessarily find them in a dictionary.

I'm reminded of the Blackadder the Third dictionary scene, which is second only to the "Scottish Play" scene from the same serial in its capacity to send me into paroxysms of cachinnation:



As the article points out, words of even greater length are possible. They already exist in technical fields such as chemistry, so it's questionable if the matter can ever be settled with any definitization, as words can be crafted at any time by nearly anyone.

The thing that's important is the usefulness of the word. Useful ones enter the lexicon with disturbing regularity. One might even say that what matters most isn't length, but girth.
March 14, 2025 at 10:44am
#1085390
I managed to see the lunar eclipse last night. Exactly as expected, the Earth's shadow started munching on the lunar orb a bit after midnight, and it was almost fully dark (matter of fact it's all dark) at around 2:30. And it was dark indeed, darker than previous ones I've seen; I suppose there was more dust than usual in the Earth's atmosphere, dimming the refracted, then reflected, red glow.

But the important thing is: for once, I managed to see a celestial event without clouds interfering. Later this morning, I woke up to thick cloud cover, so it seems the clouds held off just long enough for me to see the eclipse. One might say I got lucky.

Which segues smack into this article from Wired.

    The Secret to Being Lucky
Everything happens for no reason.


And here I thought the secret to being lucky was to be lucky.

Alexa’s approach to prediction is a revelation: “Today you can look for sunny weather, with highs in the mid-70s.”

I was wondering why they'd open the article with a quote from a spying device.

Really, what more can or should be said about the future? Look around and see what happens. You can look for your crypto windfall. You can look for the love of your life. You can look for the queen of hearts. Seek and ye might find. You can even look for a four-leaf clover, though the chances are about 1 in 10,000. But if you find one, the shamrock is no less lucky because you looked for it.

Oh, I get it. It's going to be about looking.

But wait. A shamrock is a clover, but not all clovers are shamrocks. The whole point of shamrocks, at least since Christianity subjugated Ireland, is that they've got three leaves and somehow symbolize the Trinity. I can't be arsed to look up whether four-leaf shamrocks are a thing, but if they are, I don't know why it would be lucky to break out of the Trinity metaphor. And every four-leaf clover I've ever seen has been non-shamrock in origin, perhaps because shamrocks aren't exactly native to Virginia.

I do know that their name has nothing to do with being a fake stone.

Anyway.

“Diligence is the mother of good luck” and “The harder I work the luckier I get”—these brisk aphorisms get pinned on Ben Franklin and Thomas Jefferson, lest we earnest Americans forget that salvation comes only to individuals who work themselves to dust.

I consider this trolling (Franklin) and propaganda (Jefferson). If hard work led to good luck, there would be some damn lucky sharecroppers out there.

In truth, the luck = work axiom does nothing but serve the regime and the bosses, by kindling credulity in a phantom meritocracy instead of admitting that virtually every single advantage we get in the world is one we lucked into—by being born to the right parents who speak the right language in the right zip code.

Like I said. Trolling and propaganda.

Even possessing the capacity for "hard work" (whatever that really means) is a matter of luck.

How about we invert the meritocratic fallacy in those aphorisms and create a new aphorism that makes “work” the delusion and “luck” the reality? “The luckier I get, the harder I pretend I’ve worked.”

I'm not big on jumping on board bandwagons, but I'll give this one a climb.

After all, the chances of the precise sperm colliding with the exact egg in the right fallopian tube and convening to make you—or me—are so low as to be undetectable with human mathematics.

And then the article undermines a good point with a spurious example. And not even an accurate one. "Undetectable," my ass.

What I mean is this: so what if the odds were really, really low? It obviously happened, so the prior odds are irrelevant, except as an intellectual exercise. Some sperm collides with some egg and whips up a unique combination of genes every fucking minute. Pun absolutely intended.

If there’s any method of prediction that never fails, it’s luck. You look for your horse—or your candidate—to win, and she wins? What luck. What if she loses? Better luck next time. If Alexa says you can look for rain, and you look and find it—lucky you, you brought an umbrella! Luck is fate and fate is what happens and a prediction of what happens is a perfect prediction.

Is it just me or is that argument as circular as something drawn with a compass? If so, we were lucky to be treated to this article (selected at random from around 60 possibilities) on Pi Day.

But work and diligence can never be the parents of luck, because luck has no mother, no father, no precedent or context. Luck is a spontaneous mutation, signaling improbability; it shows up randomly, hangs around according to whim, and—as every gambler knows—makes an Irish goodbye.

I can't let this slide by without cringing at the more-than-slightly racist final words of the quote. But apart from that, I got to thinking about the possibility that evolution is really selecting for luck. I can't take full credit for this idea; Larry Niven proposed something similar in Ringworld. But his version was narrower: a character was lucky because she was the result of several generations of ancestors winning a lottery that allowed them to reproduce. I'm taking a broader view: that species survival depends not on the strongest or the fastest or even the fittest, but on the luckiest (which might include the strong or fast or fit).

As luck is pretty much unquantifiable, it's not a scientific hypothesis. Just something to contemplate for stories and whatnot.

So where does the “looking for” luck come in? Ah—your agency comes in the almost-passive search for luck. The noticing.

Congratulations. You've just rediscovered the power of observation. When opportunity knocks, it helps if you don't have your noise-canceling headphones on.

Einstein didn’t like the idea of God “playing dice” with the world. Lucky for Einstein, dice, in a world determined by luck, are not thrown by anyone, much less a God who is said to have Yahtzee skills. Instead, the chips fall where they may—and really they just fall, unpredictably, spontaneously.

And that's a bit misleading, too. It's like, okay, let's limit ourselves to the standard cubic six-sided dice for the sake of discussion. It's true that we don't know what number will come up on the roll of a pair of dice, absent some cheating-type intervention. But there are boundary conditions. Most obvious is that the result will be between 2 and 12 inclusive. A fair throw of 2 standard dice will never come up with a 1, or any number greater than 12. It certainly won't come up with a noninteger number, a negative number, zero, or an imaginary number. Those are boundaries. Perhaps a bit less obvious is that the chance of rolling a 7 is much greater than the chance of rolling a 2 or a 12 (specifically, 1 in 6 as opposed to 1 in 36). No, you can't predict what the next throw will bring (assuming no cheating, of course). But you can predict, with a very high degree of certainty, the frequency of each result occurring after multiple rolls. If that were not true, casinos couldn't stay in business.
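
Don't take my word for it. Here's a quick sketch (Python, my own illustration, not anything from the article) that rolls a pair of virtual dice a million times and tallies how often each sum shows up:

import random
from collections import Counter

ROLLS = 1_000_000

# Roll two fair six-sided dice and record the sum of each roll.
counts = Counter(
    random.randint(1, 6) + random.randint(1, 6) for _ in range(ROLLS)
)

# Every sum lands between 2 and 12 (the boundaries), and the observed
# frequencies settle near the theoretical ones: about 1/6 for 7, about
# 1/36 for 2 or 12.
for total in range(2, 13):
    print(f"{total:>2}: {counts[total] / ROLLS:.4f}")

Run it and the 7s pile up roughly six times as often as the 2s or 12s, and nothing outside 2 through 12 ever shows up. Constrained randomness in a dozen lines.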

Point being, even randomness has constraints. That's why "everything happens at random" is just as nonsensical as "everything happens on purpose."

We then look for patterns in them.

And humans are, probably for reasons related to the luck of evolutionary development, very, very good at spotting patterns—even when there are none.

The arrangement of tea leaves at the bottom of a tasseomancer's cup is random with constraints. Sometimes, the leaves seem to form patterns. Whole books have been written about interpreting the patterns. Whole books have been written about a lot of bullshit things. You'd be lucky to ignore them.
March 13, 2025 at 9:52am
#1085331
Well, then, let's just jump right in, shall we? I've got a pretty big backlog of interesting articles to tackle.

Today's is from MIT Press Reader, and it makes for an appropriate-enough premier entry here.

    The Greatest Unknown Intellectual of the 19th Century
Emil du Bois-Reymond proclaimed the mystery of consciousness, championed the theory of natural selection, and revolutionized the study of the nervous system. Today, he is all but forgotten.


I suppose he's no longer unknown now, at least to me and to anyone with the talent, taste, good looks, and perspicacity to read my blog. Or to read the linked article. Or to buy the book that the linked article is nakedly promoting.

As much as I hate ads in general, I have no issue with sharing book promotions here, as long as they're transparent about it.

Unlike Charles Darwin and Claude Bernard, who endure as heroes in England and France, Emil du Bois-Reymond is generally forgotten in Germany — no streets bear his name, no stamps portray his image, no celebrations are held in his honor, and no collections of his essays remain in print.

You know who is unknown to me? Claude Bernard. Or, I should say now, was unknown to me. Yeah, that sounds like someone I should have heard of.

But it wasn’t always this way. Du Bois-Reymond was once lauded as “the foremost naturalist of Europe,” “the last of the encyclopedists,” and “one of the greatest scientists Germany ever produced.”

Which is high praise, considering some of the other scientists from Germany. Though Einstein came later and probably overshadowed him.

Their lives did overlap, though only for about 17 years. If Wikipedia can be trusted for that.

Those familiar with du Bois-Reymond generally recall his advocacy of understanding biology in terms of chemistry and physics, but during his lifetime he earned recognition for a host of other achievements.

That alone is pretty significant, though it's almost certainly a case of "if he hadn't done it, someone else would have."

He pioneered the use of instruments in neuroscience, discovered the electrical transmission of nerve signals, linked structure to function in neural tissue, and posited the improvement of neural connections with use.

I'm curious what neuroscience was like before instruments, but not curious enough to make a side trip.

He owed most of his fame, however, to his skill as an orator.

Now this is the most interesting part, at least to me. Doing science is one thing. Being able to communicate it effectively is, I believe, an entirely different skill. Our modern-day science communicators may write books or record YouTube videos in addition to holding in-person lectures, but, well, not all of them are really suited to explaining big new concepts to non-scientists.

In matters of science, he emphasized the unifying principles of energy conservation and natural selection, introduced Darwin’s theory to German students, rejected the inheritance of acquired characters, and fought the specter of vitalism, the doctrine that living things are governed by unique principles.

It turns out that, under certain circumstances, some acquired characteristics can be inherited. (And no, "characters" isn't a proofreading failure in the article; it's the traditional biological term for heritable traits.) That's okay. Scientific theories get modified and revised over time; that means the process is working.

In matters of philosophy, he denounced Romanticism, recovered the teachings of Lucretius, and provoked Nietzsche, Mach, James, Hilbert, and Wittgenstein.

I hold the conviction that science and philosophy are symbiotic: science informs philosophy, while philosophy guides science. Others insist they should remain separate, which is self-contradictory because it is itself a philosophy of science.

In any case, anyone who provoked Nietzsche is okay in my world.

In matters of history, he furthered the growth of historicism, formulated the tenets of history of science, popularized the Enlightenment, promoted the study of nationalism, and predicted wars of genocide.

Funny thing about predictions. Sometimes, they're not predictions but plans (though maybe someone else's plans). I hope that wasn't the case here.

And in matters of letters, he championed realism in literature, described the earliest history of cinema, and criticized the Americanization of culture.

All of which is extra amusing now that some of the world's most popular cinema involves fantasy stories originating in the US.

Today it is hard to comprehend the furor incited by du Bois-Reymond’s speeches. One, delivered on the eve of the Prussian War, asked whether the French had forfeited their right to exist; another, reviewing the career of Darwin, triggered a debate in the Prussian parliament; another, surveying the course of civilization, argued for science as the essential history of humanity; and the most famous, responding to the dispute between science and religion, delimited the frontiers of knowledge.

Oh, I don't know. Some speeches today still incite fury. They're often labeled "controversial." High on that list remain arguments concerning the dispute between science and religion.

The important thing to note, as far as I'm concerned, is that just because you're a great communicator and can give a fiery speech, it doesn't mean you're right.

Du Bois-Reymond supported women, defended minorities, and attacked superstition; he warned against the dangers of power, wealth, and faith; and he stood up to Bismarck in matters of principle.

It also doesn't mean you're wrong.

The rest of the moderately long article goes into more detail about du Bois-Reymond's life and times, and touches on why he might have been all but forgotten despite his celebrity status. Despite the memory hole he seems to have fallen into, I think echoes of his ideas remained; those, after all, are more durable than mere individuals. As support, I'll just provide one more quote, from near the end of the page:

Du Bois-Reymond reminds us that individuals mark their times as much as their times mark them. “If you want to judge the influence that a man has on his contemporaries,” the physiologist Claude Bernard once said, “don’t look at the end of his career, when everyone thinks like him, but at the beginning, when he thinks differently from others.”

I do hope we can forgive his then-contemporary stylistic use of masculine pronouns. I'm pretty sure that idea applies to all people.


© Copyright 2025 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
