Rated: 18+ · Book · Personal · #1196512
Not for the faint of art.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
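
(A minimal illustration, not part of the blog's original description: a Python sketch of the escape-time iteration z → z² + c that produces the Mandelbrot set. The grid bounds, resolution, and iteration cap are arbitrary choices.)

# A minimal sketch of how a simple complex-plane transformation yields a fractal:
# repeatedly apply z -> z*z + c and see how quickly the result escapes toward infinity.
# Grid bounds, resolution, and iteration cap here are arbitrary illustrative choices.

def escape_time(c, max_iter=50):
    z = 0 + 0j
    for n in range(max_iter):
        z = z * z + c          # the "very simple transformation"
        if abs(z) > 2:         # once |z| > 2, the sequence is guaranteed to diverge
            return n
    return max_iter            # never escaped: treated as "inside" the set

for im in range(20, -21, -2):             # imaginary axis, top to bottom
    row = ""
    for re in range(-40, 21, 2):          # real axis, left to right
        c = complex(re / 20, im / 20)     # a + bi sampled on a coarse grid
        row += "#" if escape_time(c) == 50 else " "
    print(row)

Run as-is, it prints a coarse ASCII outline of the familiar cardioid-and-bulbs shape.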





October 19, 2022 at 12:01am
#1039398
Some names just have staying power, I suppose.

The Trouble With “The Big Bang”  
A rash of recent articles illustrates a longstanding confusion over the famous term.


As Calvin noted, it should have been called the Horrendous Space Kablooie. Some cosmologists do call it the HSK. There aren't too many instances of science referencing comic strips; the only other one I can think of is the thagomizer.   (At that link are some others that I hadn't remembered.)

By Sabine Hossenfelder

I'm including the byline here, because I keep seeing her stuff all over the place. She has a whole YouTube video series, and I just finished a book she wrote. She's a very good science communicator.

Unfortunately, knowing she wrote this article means I read it in my mind with a soft German accent.

I can’t blame readers for being confused by recent news stories about the Big Bang. The article that kicked them off, “The Big Bang Didn’t Happen,” is bad enough. But some of the rebuttals also don’t get it right. The problem is that writers conflate ideas in astrophysics and use the term “Big Bang” incorrectly. Let me set the record straight.

Science writers love to do this shit, and it pisses me off. "Darwin overturned!" "Einstein was WRONG!" "Big Bang didn't happen!" "Schroedinger's cat found ALIVE!" Stop it. It's misleading and only adds fuel to the fires started by the willfully ignorant, anti-intellectual "my opinion is better than your book learnin'" crowd.

Let’s call Big Bang #1 the beginning of the universe. It’s what most people think the expression means. This Big Bang is what we find in the mathematics of Einstein’s general relativity if we extrapolate the current expansion of the universe back in time. The equations say that matter and energy in the universe becomes denser and hotter until, eventually, about 13.7 billion years in the past, both density and temperature become infinite.

Hossenfelder certainly knows more about this shit than I do, but I also don't think talking about infinite density or temperature is helpful, because no one can really grok the idea of infinity. It's just not possible. Georg Cantor tried, and had some success, but he died insane.

Besides, there are different kinds of infinity.

This Big Bang is sometimes more specifically called the Big Bang Singularity. This word has somewhat fallen out of favor among physicists, partly because it’s clumsy, but also because I don’t know anyone who thinks this singularity is physically real.

Probably because of the whole "infinity" thing. I mean, to me the whole thing sounds like a limit problem. Like, you know you can't divide by zero because that's mathematically undefined, right? You learn that pretty early on. But if you take a series of fractions where the denominator approaches zero, you can approach infinity. Mathematically, not practically.
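
In symbols (my notation, not Hossenfelder's), that's the one-sided limit

\lim_{x \to 0^{+}} \frac{1}{x} = +\infty

That is, 1/0.1 = 10, 1/0.001 = 1,000, and the quotient grows without bound as the denominator shrinks toward zero, while the value at zero itself stays undefined.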

Since physicists don’t believe the singularity is real, the phrase “Big Bang” has come to refer to whatever event might replace the singularity in the to-be-found theory of quantum gravity in this Planck time. Let’s call it just that—the Big Bang Event.

This is, as far as I understand it, not what would be called the Horrendous Space Kablooie.

The problem has long been that the term Big Bang is used to refer to the expansion of the universe in general, and not to the event of the creation of the universe in particular. These are, however, two separate scientific hypotheses.

That's the HSK.

Historically, the first evidence for the expansion of the universe was Edwin Hubble’s observation that the light from other galaxies is systematically shifted to the red, indicating that they all recede from us.

And I finally got to see the telescope he used for those observations. It's in the mountains near L.A. (Thanks, Annette )

While this may have been the first evidence, the decisive evidence for the expansion of the universe was the discovery of the cosmic microwave background that ruled out the competing hypothesis, the “steady state” universe.

I'm sure you've seen an image of that, but if not, here.  

One of the most mind-blowing things about this, at least in my opinion, is that the (observable) universe can be thought of as being inside-out. That is, the further you look, the more back in time you go, until you can't see any further back; that's the CMB. It's roughly the same in all directions. So, in a sense, I really am the center of the universe. Unfortunately for my ego, so are you. So is Alpha Centauri. So is everything else in the universe.

This confusion between the expansion of the universe and the Big Bang Event becomes apparent, for example, just by looking at the Wikipedia entry for Big Bang. It starts out in the first paragraph referring to something called the “Big Bang theory,” and explains that this is the theory for the expansion of the universe. In the second paragraph, the Big Bang theory is distinguished from its extrapolation to the Big Bang singularity. But by the fourth paragraph the distinction has gotten lost, and we are informed, “A wide range of empirical evidence strongly favors the Big Bang, which is now essentially universally accepted.” The reader is misled to think that evidence for the expansion of the universe is evidence that the universe began with the Big Bang Event, which is incorrect.

I'm aware that Wikipedia has its shortcomings, but if you're a physicist who's also an effective science communicator, like Sabine here, you could, you know... edit that.

I have not seen or heard the term “Big Bang Theory” being used by a physicist in a seminar or paper for the expansion of the universe.

I'm sure that's at least partly because the show of that name has ruined the expression for all time.

Anyway, after a long discourse about these problems in definition, she finally gets to the point made back at the beginning:

In the attention-grabbing article, “The Big Bang Never Happened,” Eric Lerner questions that the universe expands in the first place. His article was published in August by the Institute of Art and Ideas, a British organization that, by my own experience, prioritizes debate over scientific rigor.

Buuuuurrrrrrrnnnnn.

Lerner argues against the “cosmological establishment [that] has circled the wagons to protect this failed [Big Bang] theory with censorship,” presumably because Lerner has faced some difficulties in getting his alternative theory published.

This is directly equivalent to "spherical establishment that has circled the wagons to protect this failed Round Earth theory with censorship."

It becomes clear, later in Lerner’s essay, that he is not attacking the Big Bang Event (which can reasonably be questioned) but the expansion of the universe. And because it is true that the Webb telescope has delivered data in tension with the concordance model, the reader (or editor) who does not know the difference, may get away finding Lerner’s piece reasonable.

And this is why we need more effective science communication.

It is, as Hossenfelder notes, sometimes very difficult to replace precise scientific or mathematical terminology with imprecise words. She goes on about it in the book I just read, too (which contains very nearly no math and lots of words). This problem is both pernicious and persistent, though: even some of the most basic words, like "theory," have a different meaning to scientists and the general public, thus leading to such utterly ignorant dismissals as "evolution is 'only' a theory."

And as much as I'd like to fix it, I'm not actually a scientist. It's like: you're not a football player, right? But you probably watch football on TV. And you probably have some knowledge of players and strategies, at least enough to occasionally shout "OH COME ON!" at the screen. Well, that's like me with science. So I'm not in a position to fix it. I'll just watch people like Hossenfelder do that.
October 18, 2022 at 12:01am
#1039361
In today's "we were wrong all along" news...

Goodbye to the Vikings  
The term ‘Viking’ as it is commonly used is misleading, warping our perception of the Middle Ages. It should be retired.


That yipping noise you just heard was the collective barking of Minnesota sportsball fans.

There was no such thing as a ‘Viking’ in the medieval period. Use of the term emerged in the 19th century. The word wicing occurred in Old English and víkingr in Old Icelandic, but were used very differently, to mean something like ‘pirate’.

Which explains much about a certain cruise line and its pricing structure.

In Old Icelandic víkingr could be applied to any pirate regardless of where they came from or when, or what language they spoke; they might be Estonians or Saracens, for example. It is also noteworthy that it is almost never used to describe the people who we today call ‘Vikings’. Many of the men labelled ‘Vikings’ in textbooks and popular histories were warriors led by kings on military expeditions with clear political objectives, such as the Great Heathen Army that fought Alfred the Great or the Norwegian force that accompanied Harald Hardrada to his death at Stamford Bridge in 1066. Calling such people ‘Vikings’ would be like calling 18th century British, French or Dutch naval officers ‘pirates’ simply because they wore vaguely similar hats and sailed vaguely similar ships to Blackbeard.

Have you seen what 18th century British, French, and Dutch naval officers got up to? That's not far off the mark.

The final development, the ‘ethnicisation’ of the word that allows the use of terms such as ‘Viking farms’, ‘Viking towns’ and ‘Viking women and children’, is much more recent and has gradually crept up since the Second World War. This is insidious; by linking military prowess and savagery to an entire ethnic group, it encourages its appropriation by racial supremacists.

Right, because when I think of military prowess and savagery, I think of Scandinavians and not a certain Mongolian.

On the other hand, my ex-wife is of Norwegian descent, so I get the savagery part.

The issue with the term is not merely semantic. This conception of ‘the Vikings’ seriously distorts our understanding of European history. We have tended to group almost all Scandinavian activity between the 790s and the mid-11th century together under the ‘Viking’ label, creating a distinct ‘Viking Age’ and an imagined ‘Viking’ culture and identity. The evidence, however, does not support this analysis.

The article goes on, of course, to explain said evidence, not all of which I'm going to repeat here.

Irish, English and Frankish chronicles generally refer to Scandinavian aggressors as ‘heathens’ and this, rather than any ethnic identity, seems to have been what struck the victims of these attacks as significant. The 793 raid on Lindisfarne, often said to herald the ‘Viking Age’, is described in the Anglo-Saxon Chronicle thus: ‘The ravages of heathen men miserably destroyed God’s church on Lindisfarne with plunder and slaughter.’

Hey, Lindisfarne again (from a couple of entries ago).

The construct of the ‘Vikings’ conflates and blurs the distinction between eighth- and 12th-century pirates. Tenth-century kings based in Dublin and Christian rulers such as Cnut, all of whom lived in very different societies, had different belief systems and political and economic objectives. Each of these contexts needs to be dealt with on its own terms and not within a 19th-century construct that has more than a hint of racist essentialism to it.

It's also, insofar as I can infer from this article, religious exceptionalism.
October 17, 2022 at 12:02am
#1039313
What the hell is hard work, anyway?

The law of reversed effort: The harder you try, the harder you fall  
There are many things in life that cannot be improved with greater effort. Sometimes, life requires that you step back.


Is it meant to mean great physical exertion, as in the hard work of keeping a farm going? Or something mentally difficult, like proving something in mathematics? Neither is any guarantee of success.

You’re lying in bed, staring at the ceiling, listening to cars go by. You have no idea how long you have been like this, but it must be a few hours, at least. Go to sleep, you tell yourself. Just close your eyes and: Go. To. Sleep. You shut your eyes tight, force your body to relax, and wait for the blissful slumber to come. But, nothing happens. More minutes pass and… nothing happens. It’s 3 a.m., and you’re still staring at the ceiling.

I learned how to deal with that years ago. No, drinking isn't involved.

We have all been in this situation. Try as we might, it is nearly impossible to consciously will yourself to sleep. Sleep comes to those who let their mind wander and focus on anything other than sleep itself. Count sheep, control your breathing, listen to an audiobook, or whatever — so long as it turns your mind from wanting to sleep.

Or I'd get up and do something else, which would make me tired, so I'd go back to bed, which would keep me awake, so I'd get up and do something, which would make me tired, etc. Then I'd fall asleep at my desk all day.

That's my body telling me I'm a night owl.

This is a common and familiar example of the “law of reversed effort.”

Yet, whenever I make an effort, I get tired.

The Law of Reversed Effort was first coined by the author Aldous Huxley, who wrote:

“The harder we try with the conscious will to do something, the less we shall succeed.

“Proficiency and the results of proficiency come only to those who have learned the paradoxical art of doing and not doing, or combining relaxation with activity, of letting go as a person in order that the immanent and transcendent unknown quantity may take hold.”


I guess it worked for him. He sold a few books, as I recall.

But, there’s a spiritual or holistic way of viewing the “law of reversed effort” as well. It’s something that has a much longer history than Aldous Huxley — it’s the Daoist idea of “Wu Wei.”

I was going to say that makes sense and I understand it, but it doesn't and I don't. I'm pretty sure that's what Taoism is all about though, so I guess I'm doing it right. Which means I'm not.

To surrender to a greater power — or a nobler, righteous one — is not an act of cowardice. It is an act of profound wisdom. There is nothing praiseworthy about swimming in a storm or punching a bear in the face. There is wisdom in knowing our limits, in embracing humility, and even in being pushed around.

This is the meaning of Wu Wei. It is not some lazy torpor, or an excuse for a duvet day and Netflix binge.


Sure it is.

That’s nice, you might think, but how does that actually translate to real life? The problem with a lot of philosophy of this kind is that it rather leaves us no better off than before.

And usually worse off.

Anyway, here's why I decided to share this:

Writing: For an author, there is nothing so terrifying as the blank page. If you have been told you have to write something, especially on a deadline, the mind often can go into meltdown grasping for something — anything — to write. It’s much better to let ideas come and write them in a notebook so they don’t get lost.

I'm sure that works for some people. Me, I do freewrites. And since I started doing these entries on a daily basis, I think maybe one in 400 entries gave me writer's block. It's even more trite than Tao, but sometimes you "just do it."

Stress and anxiety: We all get stressed about things. All jobs involve bottlenecks and crunch points. Life has good days and bad days. But when we obsessively run things over in our heads, we actually make anxiety worse. There is a reason why “mindfulness” is such a breakaway phenomenon, and why Headspace is a $250-million business.

Yeah. That reason is marketing.

Conversations: When it comes to how we talk to people, less really is more. A bad conversation involves you talking too much and your “listening” consisting of simply waiting to talk again. Yet research shows that active listening gives more “conversational satisfaction” and leaves the partner feeling more understood.

Oh, sorry, were you saying something?

Perhaps it is time to step away from what you are doing and enjoy Wu Wei or inaction. After all, if I tell you not to think of pink elephants, there’s only one way to do it.

Drink until they go away.
October 16, 2022 at 12:02am
#1039277
I don't really have a lot to say about today's article. It was just interesting to me, and something I'd never heard about before.



I'd heard of one-armed bandits, but not one-legged superspies.

To plow forward through 50 miles of dangerous hiking on foot would be arduous in the extreme. But if she remained, she’d almost certainly be captured by the Nazis, who now considered her their most feared Allied spy.

As much as I hate doing outdoors stuff, if the alternative is to be captured by Nazis, I'd take the hike.

Cuthbert was what Hall had named her wooden leg. It was going to be a long journey.

I was curious why an American would name her wooden leg "Cuthbert." The only Cuthbert I knew of was a monk at Lindisfarne (which, incidentally, if you're ever in England, is absolutely worth the trip—do it before sea level rise cuts off the only land access).

So I looked up St. Cuthbert, and what's the first thing I see on his Wiki page?   An image with the caption "Cuthbert discovers a piece of timber, from a 12th-century manuscript of Bede's Life of St. Cuthbert." Like, seriously, someone in the 1100s found this event so transcendentally important that they painted it.

Why it matters that Cuthbert discovered a piece of timber, an event so thrilling that someone illustrated it, shall remain a mystery. Was England deforested in his era? Was it Norse timber and they were about to be invaded (again)? I can't be arsed to research it any further than Wikipedia. But I found it amusing that Hall named her wooden leg after someone apparently famous for discovering a hunk of wood.

Though it would be decades before the world knew the full extent of Hall’s efforts during the war, it was clear from a young age that she seemed destined for an exceptional life. Born on April 6, 1906, in Baltimore, Maryland, to parents Edwin and Barbara Hall, Virginia enjoyed a comfortable upbringing and could have easily settled into a sedentary existence.

I also found it amusing that a couple of Baltimorons would name their daughter after a neighboring and rival state.

In school, she picked up several languages and appeared disinterested in conforming to the societal expectations of the time. She happily accepted parts in plays intended for boys and enjoyed being slightly provocative, once shocking classmates by showing up at school with a “bracelet” made of live snakes around her wrist.

I think I would have liked her.

During a bird hunting expedition in 1933, she discharged her firearm into her foot while climbing over a fence. The blast from the 12-gauge shotgun caused severe injury, and the resulting gangrene forced doctors to amputate half of her left leg below the knee.

I think for most people, literally shooting yourself in the foot would end an adventuring career.

Hall’s greatest disguise, though, was achieved by taking advantage of chauvinism. Few men believed women could be effective spies—particularly one with a wooden leg.

Nowadays, of course, everyone expects the spy to be female. Due to some unfortunate choices by Hollywood, we also expect them to exclusively run honey traps.

Still, I imagine that spycraft involves doing stuff that the enemy wouldn't expect.

She connected with a brothel in the city of Lyon, France, where she was able to gather intelligence from prostitutes that had met with German troops.

So you don't have to run the honey trap yourself.

Her contributions grew so significant that the Gestapo began searching France for la dame qui boite, or "the lady with a limp."

Seems like they finally figured it out, though. The Nazis were a lot of things, but stupid wasn't one of them.

When Hall arrived in Spain, she was promptly arrested for not having a passport. It was better than facing a horde of angry Nazis.

Also better than hiking one-legged through the Pyrénées.

Despite her reputation in Nazi-occupied France, she insisted on returning, adding grey to her hair, drawing wrinkles on her face, and even having her teeth ground down to alter her appearance, according to author Sonia Purnell, who wrote a book on Hall titled A Woman of No Importance.

"We found a woman with a fake leg, but she has grey hair. Can't be the same spy." Okay, maybe some Nazis were stupid. Or maybe Hall was just that good.

Anyway, like I said, not a lot of snark here; just something I thought I'd share. At least her name wasn't Eileen; then I would have made some more puns like the horrible one in the title.
October 15, 2022 at 12:01am
#1039232
Wrapping up October's "Journalistic Intentions" [18+] with a blind quote.

"It is important to start socializing the idea of reforms now—sometimes they are upon us quicker than we think."


I have no idea what this is in reference to, and I don't want to look it up to do this entry. It's called a blind quote for a reason. I'm going to go ahead and assume it's referring to something in the US, though.

Consequently, no matter to what it refers, half the country stopped reading at "socializing" and immediately dismissed anything else. "Sounds like soshulizm to me."

I considered listing some issues that might need reform here, but there are so many things in that category in the US that it might be easier to list the things that are not in need of reform.

And if I had given myself more time, I might even be able to think of one. Thing is, when something is going right, we tend not to worry about it too much. This was the case with (yes, I'm going here) abortion laws, until suddenly it wasn't going right and it became something in need of reform.

The other problem with listing the things I think do need reform is that, inevitably, listing them creates the implication of some sort of order. Like, even if I say "in no particular order," whatever I list first will stick in a reader's mind as my top priority. If I say "police" first, you'll think that's at the top of my mind, for example. And now you probably already think it's police.

It is not.

And it's not like fixing one thing would make everything else fall into place. Well, with one possible exception: elections.

But that's still not the foremost problem on my mind.

Nor is education, nor space exploration, nor even the imminent threat of nuclear war. Not health care delivery (actual health care, now that I think of it, is one thing we do right—for those who can afford it). Not animal cruelty, not the environment, not climate change, not energy, not terrorism (domestic or otherwise), not gun violence. Even the problem of homelessness isn't the top issue for me; neither is inflation. Or the looming recession. Or living wages, or the existence of tipping, or lack of religious freedom. Racism (both systemic and personal), a general lack of affordable housing, Twitter, regressive drug laws (especially the continuing illegality of cannabis at the federal level), private prisons, infrastructure rot, drought, political corruption, and far more issues are certainly important, but again, I wouldn't put any of them at #1.

No, the biggest issue needing reform right now is the severe lack of beer in my refrigerator.

Fortunately, that one's easily solved, unlike these others, and I will do so later today.

What? With enough beer, I can pretend that these other problems don't exist.

Okay, now that I'm done, I'm going to Google the blind quote.

Huh. Supreme Court term limits and related reforms.

Just goes to show that, even when sober, I can't think of everything.
October 14, 2022 at 12:01am
#1039198
Lies.

Everyone Likes Red and Pink Candies Best  
Sweets manufacturers are finally catching on and selling packages without the lesser colors.


It only takes one data point to falsify the headline. Here it is: Me. I don't like red and pink candies the best.

Article is from 2015, but I doubt time has blurred the edges on the point. To mix a metaphor.

There’s an Internet meme floating around...that implores, “Don’t ever let someone treat you like a yellow Starburst. You are a pink Starburst.”

One of the reasons I live a solitary lifestyle: cherry-pickers. If there's a mixed bag of treats, I take what I get (though I generally prefer the orange ones, or, in the case of chocolate, the dark ones). Kind of like how I enjoy choosing these articles at random. But everyone I know roots through the bag looking for choice morsels, which skews the balance and thus offends my sense of what's right and pure in the world. In a perfect world, my tastes and theirs would be different. But even though my taste isn't aligned with that of the majority, they still narf the ones I like best. Some of that, I suspect, is trolling. More likely, though, is that friends have similar tastes.

Except for the guy I know who mixes Skittles with M&Ms in a bowl. That guy's going to Hell for sure.

The message acknowledges and plays on a widely held belief: that pink and red candies are the best and all the other flavors are also-rans.

Okay, I can accept it's "widely-held." Don't generalize that into "everyone."

This isn’t to say there aren’t outliers, but more often than not, people prefer their fruity candy in shades of red.

So why the extremist, clickbait headline? Oh, right. Because it's clickbait.

It’s reminiscent of the frequency with which people claim seven as their lucky number.

I don't play craps, either.

Last year Popsicle came out with Red Classics, a line of its ice pops that nixes grape and orange from the usual lineup and only contains strawberry, cherry, and raspberry, an assortment of reds. Starburst has its FaveReds, a version of the fruit chews that includes strawberry (aka pink), cherry, and in place of orange and lemon, two rosy flavors: fruit punch and watermelon.

I don't eat sweets all that much, so I wasn't aware of any of these things. I suspect this article came to my attention because it's Halloween candy season. It might be fruitful (pun intended) to see if any of these things are still being marketed, seven years later, but I can't be arsed. In any event, while I don't dislike strawberry or cherry Starburst, I'm not a fan of fruit punch and I actively despise watermelon. I'd rather have yellow Starburst than watermelon. Hell, if there were a shit-flavored Starburst and a watermelon-flavored Starburst, and I had to pick one, I'd have to think about it.

Mogelonsky speculated that red was nonthreatening and lacked the acidic quality that can turn people off lemons and other citruses. But it’s not only that. The importance of the color red, sometimes over or in place of specific flavors, is notable. What is fruit punch, when you think about it, but a generic, noncommittal red flavor that doesn’t even bother to associate itself with a specific fruit?

What is fruit punch? Ass. It's ass. Watermelon is moldy ass.

According to Charles Spence, a University of Oxford psychologist who studies how people perceive flavor and consults for major food and beverage companies, color has a bigger influence on flavor than most people are aware. “There are probably a couple of hundred studies now since the first ones in the 1930s showing that if you change the color of a food or drink it will very often change the taste of the person rating it,” Spence said. “You sort of think intuitively, well … the color isn’t part of the taste. And yet this growing body of research over the decades does show it can influence the taste in quite dramatic ways that can’t necessarily be overwritten.”

That's because taste isn't everything. It's the whole experience that matters. Nowhere was this more evident than during the Great Coke Crisis of 1985, a truly dark time in our history. Blind taste tests apparently indicated that people preferred the disgusting taste of New Coke to the tried-and-true formula. What the marketing gurus didn't seem to understand was that we were used to the taste of actual Coke, and had grown to like it. They learned. Eventually.

Speaking of Coke and red, a couple years ago Coke changed the packaging of their "Zero" product (which is what I drink now when I'm not drinking the harder stuff) from black cans to red. A slightly different shade than Coke/Santa red, but still red. I wonder if that increased sales, given the apparent preference for red, but again, I can't be arsed to research it. I didn't care, because it still tasted the same.

This "whole experience" thing is why you don't do blind taste tests of beer or scotch. I'll be the first to admit that I drink expensive scotch because it's expensive and I can. Are there cheaper whiskeys that are just as good? Probably. I don't care.

Regarding red, Spence said that studies have shown that “it seems like red is a particularly effective cue for sweetness, maybe because there’s a cue in nature, which is fruits going from green and sour and unripe through redder and sweeter and riper.”

A moment's thought should be enough to falsify that hypothesis. First, lots of sweet fruits aren't red. Oranges, e.g., or blueberries (it's right there in the names). Second, lots of red things aren't sweet. Like tomatoes.

Marcia Pelchat, a psychologist at the Monell Chemical Senses Center who studies food preferences, disagreed with Spence’s theory. “I don’t think you can make an evolutionary argument that goes back to our primate ancestors. I think it’s shared cultural experience,” Pelchat said.

Another strike against evo-psych.

Confection and dessert companies are certainly aware of the power of red. “For our brand, red is a magical color,” said Nick Soukas, who oversees ice cream products at Unilever, the owner of Popsicle.

Maybe it's because meat is red, and meat is delicious.

As for me, I associate red with holidays I hate.

Spence agreed that anything novel can grab attention and sales. And in an era where you can custom-order any color of M&M’s online, in colors that used to be strictly unavailable, maybe other candy companies feel they need to flood the market.

Like I said, I don't eat much candy these days. But when I do, color is the least of my concerns.

As long as they're not watermelon flavored.
October 13, 2022 at 12:02am
#1039134
Another entry for "Journalistic Intentions" [18+], this one about a trope.



The Random Number Generator (peace be unto it) is messing with me again. I just did a "woman in a non-traditional role" entry yesterday.

Fortunately, this one's about a trope in fiction. Unfortunately, it's sometimes difficult to talk about the fictional trope without referencing those works of fiction that purport to be biographical.

Let's start with the TVTropes page. Fair warning: the page suffers from grammar issues. But I think it gets the point across.

For most of mankind's history, leadership and authority are associated with men.

We could start by not calling humanity "mankind."

After all, a leader — especially the supreme ruler of a nation — are expected to be strong (to defend their borders), ambitious (to expand and improve their territory) and aloof (so that they won't be swayed from their long-term goals by a moment of impulse), all of which are often considered masculine qualities.

And yet, some of the greatest rulers in history have been women, and some of the worst have been men. To be fair, there are far more examples of the latter. Perhaps the male ones didn't have to do quite as much to prove themselves capable leaders, due to expectations. Again, though, we're talking about fiction here, and we can write our queens however we like.

So when a woman finds herself in a position of power, expect her subjects to be less than enthused by the idea — a volatile, emotionally-driven, Hysterical Woman in charge of other?

You know who lets emotions get in the way of ruling? Men. Well, anyone, really. We're all human. Society expects us to demonstrate our emotions differently, but in all cases they sneak through sometimes.

While it is possible for this trope to apply to male rulers (if the man in question lacks the traditional masculine qualities, or if the work is set in a Lady Land and the man is forced to be more "feminine" to be accepted), what makes this an Always Female trope is the still commonly held view that leadership is an inherently masculine role, and the point of the trope is that the character suffers emotionally because they are forced to divest themselves of their society's gender norms just to be taken seriously as a monarch.

An ardent feminist once asserted to me that the world would be a better place if all its leaders were women. While I don't really have an argument against that, just to be a troll, I said, "Margaret Thatcher." She responded with, "She doesn't count!"

And now I have an even better counterargument.

The UK Prime Minister isn't, however, a royal; women who are elected (even indirectly, in that case) aren't really part of this trope.

Still, I suspect that any hereditary ruler who isn't a sociopath to begin with has to wear a mask of some sort, regardless of sex or gender identity. Even in an absolute monarchy, rulers are expected to abide by certain... well, rules. And if they don't, if they let sentimentality or passing emotions guide them, well, they tend to get stabbed in the back. Sometimes literally.

I don't think that's any easier for men than it is for women, but I have no way of knowing.

One of the purposes of fiction is to be aspirational; it doesn't always reflect reality, but rather displays how the writer thinks reality should be. And we get used to whatever it is they're portraying. The more shows depict a competent female President, for example, the more likely we are to have one. Not only do we get used to it, but maybe it becomes a bit of an ideal to strive toward for the real person who eventually gets that position.
October 12, 2022 at 12:02am
#1039096
It goes without saying that science requires scientists. Today's article features one I hadn't heard of before.

Tenacity, the Art of Integration, and the Key to a Flexible Mind: Wisdom from the Life of Mary Somerville, for Whom the Word “Scientist” Was Coined  
Inside the hallmark of a great scientist and a great human being — the ability to hold one’s opinions with firm but unfisted fingers.


A middle-aged Scottish mathematician rises ahead of the sun to spend a couple of hours with Newton before the day punctuates her thinking with the constant interruptions of mothering four children and managing a bustling household.

Meanwhile, I have trouble finding the time to read any science texts, and I don't have any such distractions (cats don't count).

“A man can always command his time under the plea of business,” Mary Somerville (December 26, 1780–November 28, 1872) would later write in her memoir; “a woman is not allowed any such excuse.”

Well, at least that's improved since then.

When her parents realized that the household candle supply had thinned because Mary had been staying up at night to read Euclid, they promptly confiscated her candles.

But not, apparently, Euclid.

Mary was undeterred. Having already committed the first six books of Euclid to memory, she spent her nights adventuring in mathematics in the bright private chamber of her mind.

"But girls can't do math." Pfft.

When Somerville was forty-six, she published her first scientific paper — a study of the magnetic properties of violet rays — which earned her praise from the inventor of the kaleidoscope, Sir David Brewster, as “the most extraordinary woman in Europe — a mathematician of the very first rank with all the gentleness of a woman.”

Well, at least he included the important part, that she was feminine.

As for the paper about violet rays, that's also something I'd never heard of. Learning two new things in one day nearly broke my brain. If you're in the same situation I was, here.  

Lord Brougham, the influential founder of the newly established Society for the Diffusion of Useful Knowledge — with which Thoreau would take issue thirty-some years later by making a case for “the diffusion of useful ignorance,” comprising “knowledge useful in a higher sense” — was so impressed that he asked Somerville to translate a mathematical treatise by Pierre-Simon Laplace, “the Newton of France.”

Thoreau was a willfully ignorant ass.

As the months unspooled into years, Somerville supported herself as a mathematics tutor to the children of the wealthy. One of her students was a little girl named Ada, daughter of the mathematically inclined baroness Annabella Milbanke and the only legitimate child of the sybarite poet Lord Byron — a little girl who would grow to be, thanks to Somerville’s introduction to Charles Babbage, the world’s first computer programmer.

Now Lovelace, I'd heard of.

In The Mechanism of the Heavens, published in 1831 after years of work, Somerville hadn’t merely translated the math, but had expanded upon it and made it comprehensible to lay readers, popularizing Laplace’s esoteric ideas.

For a scientist, it is often enough to just do science. Being able to make it comprehensible to people almost as ignorant as Thoreau, though, that takes real skill.

Years later, Edgeworth would write admiringly of Somerville that “while her head is up among the stars, her feet are firm upon the earth.”

One of Springsteen's first lyrics went, "My feet they finally took root in the earth / but I got me a nice little place in the stars." I wonder now if that echo was intentional.

Don't laugh; he's much better-read than you think.

In 1834, Somerville published her next major treatise, On the Connexion of the Physical Sciences — an elegant and erudite weaving together of the previously fragmented fields of astronomy, mathematics, physics, geology, and chemistry. It quickly became one of the scientific best sellers of the century and earned Somerville pathbreaking admission into the Royal Astronomical Society the following year, alongside the astronomer Caroline Herschel — the first women admitted as members of the venerable institution.

Before you go looking it up, yes, Caroline Herschel was related to famous astronomer William Herschel; she was his sister. For some reason (I wonder why) you never hear about her discoveries.

To be fair, William discovered an actual planet. I won't name it here because you'll make a pun out of it. But she did make significant astronomical discoveries, mostly nebulae and the like (this was before astronomers figured out that galaxies were galaxies and not nebulae).

But I digress.

When Maria Mitchell — America’s first professional female astronomer and the first woman employed by the U.S. government for a professional task — traveled to Europe to meet the Old World’s greatest scientific luminaries, her Quaker shyness could barely contain the thrill of meeting her great hero. She spent three afternoons with Somerville in Scotland and left feeling that “no one can make the acquaintance of this remarkable woman without increased admiration for her.” In her journal, Mitchell described Somerville as “small, very,” with bright blue eyes and strong features, looking twenty years younger than her seventy-seven years, her diminished hearing the only giveaway of her age. “Mrs. Somerville talks with all the readiness and clearness of a man, but with no other masculine characteristic,” Mitchell wrote. “She is very gentle and womanly… chatty and sociable, without the least pretence, or the least coldness.”

Even other women couldn't help commenting on woman stuff.

Now, here's the bit about how the term "scientist" came to be:

Months after the publication of Somerville’s Connexion, the English polymath William Whewell — then master of Trinity College, where Newton had once been a fellow, and previously pivotal in making Somerville’s Laplace book a requirement of the university’s higher mathematics curriculum — wrote a laudatory review of her work, in which he coined the word scientist to refer to her. The commonly used term up to that point — “man of science” — clearly couldn’t apply to a woman, nor to what Whewell considered “the peculiar illumination” of the female mind: the ability to synthesize ideas and connect seemingly disparate disciplines into a clear lens on reality. Because he couldn’t call her a physicist, a geologist, or a chemist — she had written with deep knowledge of all these disciplines and more — Whewell unified them all into scientist.

Thus presaging the feminist nomenclature of the late 20th century, such as appending "-person" to occupations that previously ended in "-man."

It could have been worse. It could have been "woman of science." I think if that had happened, progress in equality might have been stunted.

That bit about "the peculiar illumination of the female mind" might give you pause, but "peculiar" didn't have the negative connotations then that it does today. We use it to mean strange or weird, but then it was more like "special." But "special" itself was ruined when people stopped using other words to describe the mentally deficient.

Whewell saw the full dimension of Somerville’s singular genius as a connector and cross-pollinator of ideas across disciplines. “Everything is naturally related and interconnected,” Ada Lovelace would write a decade later. Maria Mitchell celebrated Somerville’s book as a masterwork containing “vast collections of facts in all branches of Physical Science, connected together by the delicate web of Mrs. Somerville’s own thought, showing an amount and variety of learning to be compared only to that of Humboldt.”

Our view of science, as our view of many things, tends to be compartmentalized. Biology is distinct from chemistry. Cosmology is a different study from astronomy. That sort of thing. In reality, things aren't so clear-cut. Biology is the result of chemistry, which itself relies on quantum physics; cosmology and astronomy are likewise intertwined. Our neat little packages turn out to have frayed, fractal edges.

If it took a woman to see that, to tear down the boundaries between disciplines, well, that in itself would be an argument for inclusivity, regardless of ideas about equality and gender roles.

Above all, Somerville possessed the defining mark of the great scientist and the great human being — the ability to hold one’s opinions with firm but unfisted fingers, remaining receptive to novel theories and willing to change one’s mind in light of new evidence.

And you don't have to be a scientist to live your life with that philosophy. It's the only reason I hold out any hope that things will improve: that people might change their opinions when presented with new evidence. Some don't, preferring to get stuck in their ruts.

As Wilde noted, "We're all in the gutter, but some of us are looking at the stars."
October 11, 2022 at 12:05am
#1039041
Sadly, "soil degradation" doesn't refer to a BDSM mud fetish. "Journalistic Intentions [18+] #6 of 8:



"...and why we won't."

A third of the world’s soil is moderately to highly degraded, threatening global food supplies, increasing carbon emissions and foreshadowing mass migration. A change in farming practices has never been more urgent.

Lots of things are urgent now. I'm all out of urgent. You might say I'm det-urgent: a detergent useful for removing soil from garments.

The dirt beneath our feet often goes unnoticed but it is key to sustaining all life on Earth.

I'll just point out that the first organisms on earth existed before soil as we know it, so I wouldn't say "all" life. Just the part we care about. Which, if you think about it, is the problem.

Silvia Pressel, a Museum researcher in the Algae, Fungi and Plants Division, says, 'Soil is full of millions of living organisms that interact with one another. These organisms have a major influence on soil, such as its formation, structure and productivity.'

One must be cautious when using the term "soil." Its definition depends on context and audience. To a farmer, soil is topsoil: that bioactive layer that you plant seeds in. To a civil engineer, soil has a broader definition, including (mostly) purely mineral layers; it's more of a structural thing than biological. To other people, it may have a negative connotation, as in "that diaper was soiled." And then there's the Soil Stradivarius,   which isn't even pronounced the same.

I've worked the land, and I've been a civil engineer. I've even seen a Stradivarius up close (though not that one). Fortunately, I've never had to deal with a dirty diaper.

Anyway, it's clear to me that this article is concerned primarily with the agricultural definition.

Soil degradation describes what happens when the quality of soil declines and diminishes its capacity to support animals and plants.

I'm just including this because for those who won't click on the link, I felt a definition was in order.

It's when the topsoil and nutrients are lost either naturally, such as via wind erosion, or due to human actions, such as poor land management.

Or, often, both. Like you might have heard of the Dust Bowl,   which, contrary to popular belief, wasn't a football game played during a drought.

There are many types of soil around the world. The UK alone has over 700 varieties, such as clay, sand, silt, loam and peat.

Oh yeah, this article is from the UK; I forgot to mention. North America isn't that different in terms of soil types. I learned precise definitions for all of those in engineering school, then promptly forgot them. But I think even an amateur could look at a soil sample and take a fair guess at what its general classification is. In the UK, the most important of these soil types is peat, because it's used in the production of the finest beverage in all the world, single-malt scotch from the island of Islay.

Sadly, as noted, soil is largely non-renewable. At some point, Islay scotch will have to change. That will be a tragedy for all the world, but fortunately it'll happen after I'm converted to soil myself.

Healthy soil has a good combination of soil structure, chemistry, organic matter content, biology and water permeation for its type.

Here in Virginia, there are soil scientists associated with that other state university that evaluate these things for agricultural purposes. I'm sure other localities have similar programs. Thinking of buying some farmland? Get the soil tested first. It may be cheap for a reason.

A typically healthy soil will be teeming with biodiversity and may include a variety of earthworms, 20-30 types of small arachnids, 50-100 species of insects, hundreds of different fungi and thousands of bacteria species.

I knew a guy in middle school who ate dirt. I think we all knew that guy. If you didn't, you were that guy. His favorite saying was, "God made dirt, so dirt won't hurt." Occasionally, I wonder what became of him, and whether he currently supports cannabis legalization.

I'll just point out the logical fallacy there: if God made dirt, then God also made death cap mushrooms. And they definitely hurt.

Nowhere else in the world is nature so densely packed. A teaspoon of soil can contain more organisms than there are humans living on Earth.

[Citation needed] Your gut biome contains trillions of organisms.

Soil plays a vital role in cleaning water. Minerals and microbes filter and buffer potential pollutants, some of which are absorbed by soil particles.

This is how septic fields work. We've also harnessed this in parking lots; biofilters can be used to remove some of the most harmful pollutants from surface runoff.

While soil degradation is a natural process, it can also be caused by human activity. In the last few decades, soil degradation has been sped up by intensive farming practices like deforestation, overgrazing, intensive cultivation, forest fires and construction work.

One of the trickiest parts of designing what we glibly call "improvements" (subdivisions, strip malls, industrial parks, etc.) is dealing with the soil. You take a guess at how deep the topsoil is; that's stripped out and sold to farmers and whatnot, at least the part that isn't used to resurface the green spaces. The rest should balance cut and fill volumes, but at that point we're talking about the engineering definition of "soil." This is further complicated by also having to guess at depth to rock.
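
(For the curious, here's a toy version of that balancing act using a simple grid method, in Python. Every number in it — cell size, topsoil strip depth, grades — is made up for illustration, not from any real project.)

# Toy grid-method earthwork check (illustrative numbers only).
# Positive net volume = fill needed (import soil); negative = cut (export soil).

CELL_AREA = 100.0      # square feet per grid cell (10 ft x 10 ft), arbitrary
TOPSOIL_STRIP = 0.5    # assumed topsoil depth stripped before grading, in feet

existing = [101.2, 100.8, 100.1, 99.5]   # existing grades at grid points, feet
proposed = [100.5, 100.5, 100.0, 100.0]  # proposed finished grades, feet

net = 0.0
for ex, prop in zip(existing, proposed):
    # Strip the topsoil first, then compare the proposed grade to the stripped surface.
    stripped = ex - TOPSOIL_STRIP
    net += (prop - stripped) * CELL_AREA  # cubic feet of fill (+) or cut (-)

print(f"Net earthwork: {net:.0f} cubic feet ({net / 27:.1f} cubic yards)")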

These actions disturb soil and leave it vulnerable to wind and water erosion, which damages the complex systems underneath.

We also had to present erosion control plans for approval, and contractors had to follow them. And yet I'm always seeing those measures poorly maintained, with obvious water erosion leaving construction sites.

But it's not just agriculture that is to blame: increasing urbanisation also has a negative impact. The widespread use of tarmac and concrete prevents water from being absorbed into the ground. This results in the death of millions of microorganisms and can lead to water runoff in other areas where it may cause flooding and erosion.

Like that.

Developing a site can vastly increase the runoff from it. While mitigation measures are normally required and implemented, they have their limitations.
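
(To put a rough number on "vastly," here's a sketch using the rational method, Q ≈ C·i·A, with illustrative textbook-style runoff coefficients rather than site-specific values; the ~1.008 unit conversion factor is ignored.)

# Rational method: peak runoff Q (cfs) ~ C * i * A,
# with i in inches/hour and A in acres. Coefficients and storm intensity
# below are illustrative, not from a real design.

def peak_runoff(c, intensity_in_hr, area_acres):
    return c * intensity_in_hr * area_acres

AREA = 10.0        # acres
INTENSITY = 4.0    # inches/hour design storm

pre = peak_runoff(0.25, INTENSITY, AREA)   # wooded/meadow site
post = peak_runoff(0.85, INTENSITY, AREA)  # mostly pavement and rooftops

print(f"Pre-development peak:  {pre:.0f} cfs")
print(f"Post-development peak: {post:.0f} cfs  ({post / pre:.1f}x increase)")

In this toy comparison, a site that goes from meadow to mostly pavement more than triples its peak runoff.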

Areas that are most likely to be affected are developing countries which usually provide services and materials to middle- and high-income countries. Many of the people who live in low-income countries could be forced to leave their homes in search of safety and fertile lands, resulting in the loss of cultural identity as well as possible economic and political instability in other areas.

Newsflash: this is happening now. Not just due to soil degradation, but they're running out of fresh water. Not just in "developing" countries, but right here in the US.

Many practices can be changed to prevent, and in some cases reverse, soil degradation.

This part of the article is wishful thinking. Most mitigation measures cost money, or reduce a farmer's income, which amounts to the same thing. The economic winners will be the ones who ignore these things and pursue short-term profits. At least in the US, most farming is corporate, and they care about nothing but profit. So, in short, we're doomed.

The global population size is projected to increase from seven billion today to more than nine billion by 2050.

Crop production has risen dramatically over the past few decades due to intensive agricultural practices, but this has had a huge negative impact on the environment and cannot be sustained. In fact, agricultural productivity is now declining because of this, posing a major threat to global food security.

Altering our eating habits and moving towards a plant-based diet is something we can all do to help make a difference.


Or we could have fewer children. I can't say that ecology isn't the main reason I'm childfree, but it's certainly a factor.

Also, fuck this "plant-based diet" crap. Sure, I eat a lot of fruits and vegetables these days, but only as an accompaniment to actual food.

But hey, it's not all doom and gloom. At least nuclear winter will offset global warming.
October 10, 2022 at 12:01am
October 10, 2022 at 12:01am
#1038966
Confession: I have never had an actual s'more.



The reason why should be obvious from the headline: an actual s'more requires a campfire, which would imply that I'm camping. Which in turn would imply that I'm *shudder* outside.

Well, okay, fine. I have gone camping. That's how I know I hate it. And sometimes, someone actually brought marshmallows. But never the particular combination of marshmallows, chocolate, and graham crackers with which to form s'mores.

But of course I've consumed that combination on other occasions, sometimes even heated.

This summer, millions of marshmallows will be toasted over fires across America. Many will be used as an ingredient in the quintessential summer snack: the s’more.

This article, of course, is from back when summer was fresh and new, full of promise and the possibility of redemption. That is to say, the beginning of summer in 2018, in the Before Time.

Eating gooey marshmallows and warm chocolate sandwiched between two graham crackers may feel like a primeval tradition.

Unless you actually think about it. Or read this article.

But every part of the process – including the coat hanger we unbend to use as a roasting spit – is a product of the Industrial Revolution.

How long before we make coat hangers illegal?

The oldest ingredient in the s'more’s holy trinity is the marshmallow, a sweet that gets its name from a plant called, appropriately enough, the marsh mallow.

Yeah, not really. As the article goes on to point out:

Today the marshmallow on your s’more contains no marsh mallow sap at all. It’s mostly corn syrup, cornstarch and gelatin.

Another vegetarian trap.

Chocolate is another ancient food. Mesoamericans have been eating or drinking it for 3,000 years.

Yes, from an entirely different continent than the marsh mallow. Which supports once again my claim that international food trade is one of the few actual benefits of endgame capitalism.

Also, what the ancients called "chocolate" (or xocolatl or whatever) bears about as much resemblance to a plastic Hershey's bar (from what I understand, that's the preferred ingredient in a s'more) as the marshmallow does to the marsh mallow.

The chocolate that the Mesoamericans ate was dark, grainy and tended to be somewhat bitter.

Much like me after my divorce.

Finally, the graham cracker was invented by the Presbyterian minister Sylvester Graham, who felt that a vegetarian diet would help suppress carnal urges, especially the scourge of “self-pollution” (read: masturbation).

I have it on good authority that it does no such thing, unless you simply clutch a vegetable or a graham cracker in each hand at all times.

The original graham cracker used unsifted whole-wheat flour. Graham felt that separating out the bran was against the wishes of God, who, according to Graham, must have had a reason for including bran.

Thus proving once more that you can be right for the wrong reasons.

As for how the graham cracker became a part of the s'more, the snack’s true origin remains unclear.

Probably some marketing guru was like, "Hey, here are three things that are approximately 97% sugar. Let's combine them!"

The first mention of this treat is in a 1927 edition of the Girl Scout manual “Tramping and Trailing with the Girl Scouts.” In a nod to the treat’s addictive qualities, it was dubbed “Some More.”

I don't think the GSA could get away with issuing a manual that includes the verb "tramping" these days.

The term s'more is first found in the 1938 guide “Recreational Programs for Summer Camps,” by William Henry Gibson. Some think the s'more may be a homemade version of the Mallomar or the moon pie, two snacks introduced in the 1910s.

That would support my "marketing guru" theory.

Today, the s'more has become so popular that it’s inspired a range of spin-offs. You can eat a s'mores-flavored Pop Tart for breakfast...

Those are actually not bad, but my days of eating sugar bombs for breakfast every day are long behind me.

...munch on a s'mores candy bar for dessert...

One of these days, someone will explain to my satisfaction why it's socially acceptable to eat donuts for breakfast, but not cake. Or vice-versa.

...and even unwind after a long day at work with a s'mores martini.

Oh, hell to the power of no squared.

Okay, no, it's not that I wouldn't drink it. It's just that I'm an absolute purist when it comes to the word "martini." If it's not gin and vermouth with an olive, it's not a martini. I can accept vodka in place of gin, but then it's a vodka martini. I can even accept different garnishes (though some kind of garnish is essential). But in no sane universe does a sweet drink get to be called a "martini," regardless of the shape of the glass it's served in.

As I often tell my students, the health-conscious Sylvester Graham is probably rolling over in his grave after what became of his beloved cracker.

Only if they eat s'mores while masturbating.
October 9, 2022 at 12:01am
October 9, 2022 at 12:01am
#1038889
Entry #5 for the October round of "Journalistic Intentions [18+]:

One reason for environmental justice


I have this conceit, had it for a very long time now, that I can write something about anything. This started when I was quite young, when I used to find random quotes in printed material (pre-internet, even pre-home computers) to expand upon in paper journals that have, to my vast relief, since been lost. In a way, this blog has turned into an extension of that; I guess I really haven't changed much, except now I make fewer fart jokes.

This prompt, though? This one stretches my abilities. I can't think of one reason for environmental justice.

No, I can think of way more than one.

For starters, there's a reason corresponding to every person living unwillingly in an environmentally-compromised location. Not just the people, either, but the wildlife and even the vegetation.

But I got to thinking: what would environmental justice actually look like? There's eight billion of us on the planet, and we all produce waste. I get the impression that some of the bigger producers of waste are the ones least affected by it, but without doing actual research, I can't be sure about that. Either way, though, it's clear that the bulk of the burden falls on people who are already disadvantaged in other ways.

"Justice" implies that, ideally, we should all feel the same burden. Thing is, then that would suck for everyone equally. And rich people would never stand for it; you have to remember the Golden Rule: "Those who have the gold make the rules."

I had a friend who was like, "Why don't we just shoot the waste into the sun?" Leaving aside for the moment the incredible expense of doing so, in order to do it, you'd have to cancel most of the load's orbital velocity around the sun, which starts out at the same roughly 30 km/s as the planet's (give or take a bit, depending on where it's launched from and when). To do that you'd need reaction mass. Where would we get the reaction mass? Why, from the waste itself, of course. Which would mean, basically, burning it all before it even got to the accursed daystar. Along with the space barge carrying it.
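If you want to see roughly how bad the sun-disposal idea is, here's a quick two-body sketch, ignoring Earth's own gravity well (which only makes it worse):

```python
import math

# Crude delta-v comparison for "shoot the garbage into the sun"
# versus "throw it out of the solar system," starting from Earth's
# orbit around the sun. Two-body approximation; Earth's own gravity
# well is ignored, which only makes the real numbers worse.

GM_SUN = 1.327e20         # m^3/s^2
R_EARTH_ORBIT = 1.496e11  # m (1 AU)

v_orbit = math.sqrt(GM_SUN / R_EARTH_ORBIT)   # Earth's orbital speed, ~30 km/s
v_escape = math.sqrt(2) * v_orbit             # solar escape speed at 1 AU

dv_into_sun = v_orbit                  # cancel (nearly) all of the orbital speed
dv_out_of_system = v_escape - v_orbit  # or just speed up until you escape

print(f"Earth's orbital speed:        {v_orbit / 1000:5.1f} km/s")
print(f"delta-v to drop into the sun: {dv_into_sun / 1000:5.1f} km/s")
print(f"delta-v to leave the system:  {dv_out_of_system / 1000:5.1f} km/s")
```

By this crude measure, it takes less than half the delta-v to chuck the garbage out of the solar system entirely than to drop it into the sun. Either way, you're spending absurd amounts of rocket fuel per ton of trash.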

But never mind that; it's just that I'm still laughing at the concept. No, in reality, it has to go somewhere on the Earth. And the further you move it from population centers, the more expensive it is to deal with. And then you have to employ people to take care of it—transport it, then bury or burn it—which would mean people traveling to and living near the waste disposal site, which puts the burden on them. We're not advanced enough to automate the whole schlemiel.

Producing less waste would be good, but you're never going to get it to zero. Basic thermodynamics.

So, what can we do? Hell if I know. Yes, being a civil engineer, this sort of thing is in my wheelhouse, but all that means for me is that I'm maybe more aware of the economies involved. And hell, right now I'm more focused on immediate quandaries. Like, it's getting cold at night now, which means I want to fix myself a nice hot mug of tea. The quandary is: why the hell does my tea kettle have the "Max Fill" line on the outside? It's utterly useless to me there, as the kettle is opaque.

I used to have a tea kettle where that line was an actual dimple in the metal, so you could, if you looked at it from the right angle, see it on the inside and then stop the faucet when the water reached that line. That tea kettle burned (NOT MY FAULT) and it became part of America's waste disposal problem a few years ago. But even then, could I ever be sure that, as I held the thing under the faucet, it was level enough so that the water surface at the line was correct? Even a slight tilt could result in more or less water than the desired maximum volume.

Point is, I can't even address this inequity, let alone the larger one concerning waste disposal.

So I'll just leave you with the video that came with this prompt. Maybe you can think of something where I didn't.


October 8, 2022 at 12:03am
October 8, 2022 at 12:03am
#1038835
Stop liking what I don't like!



FROM THE LOFTY perch of old age, and after a lifetime of thrift, I declare that I am qualified to comment on how not to waste money.

OK Boomer.

We’ve all heard the reports: Most Americans live paycheck to paycheck, a large number can’t come up with $400 for an emergency, and there’s no money to save for retirement and other goals.

"Most" isn't very helpful. It could mean anything between 51 and 99%. Turns out the current number is about 64%   as of last January. But I don't know what the basis of that statistic is. Does it only include wage-earning employees? Contractors (such as Uber drivers)? Whatever; I'll grant it's a lot of people. Also the link provided later in this article is from 2017 and quotes something like 78%, which seems high.

Most of that data comes from surveys where people are, in effect, saying they don’t have enough income. My curmudgeonly reaction: Stores, fitness centers and entertainment venues are packed with shoppers, many of them buying unnecessary goods and services.

Well? They don't have enough income. Also, how do you know that "stores, fitness centers and entertainment venues are packed with shoppers" unless you go there yourself?

It’s a funny thing: I have yet to see Warren or Bill in one of the many local spas.

Oh, do you live in Omaha or Seattle? Warren Buffett is known to live a relatively modest lifestyle for all his wealth, but I don't know (or care) much about what Gates is up to. I do know that when you're that kind of rich, you can have your own spa installed in your supervillain lair.

Most Americans live like no other people on earth. We have more and bigger stuff: Larger houses, bigger vehicles, more shoes. And, in my not so humble opinion, we can’t tell the difference between needs and wants, between necessities and desires—and we sure can’t defer gratification.

Who's this "we" shit?

As for deferring gratification, in my opinion, if you keep doing that, you'll die unfulfilled.

All this leads me to one conclusion: We’re unable to control our spending or manage our money.

This is by design. Even if you don't take into account the systemic problems with employment right now, advertising, which is ubiquitous, ensures that we're always feeling unsatisfied, no matter how much junk we have. Sure, you can lay the blame on the people on the receiving end of that, but that's like blaming someone for being fat when the only food available to them is ultra-processed high-calorie junk food.

Here are 16 things that this 75-year-old considers big money wasters:

And of course this is what the article's all about. I'll skip some in the interest of not writing a thesis here.

1. Tattoos. They’re an admitted obsession of mine. What will they look like when you’re my age? From what I’ve heard, a good tattoo artist charges $200 an hour.

No young person ever considers what they'll look like at 75. Hell, the way things are going, they won't make it that far anyway. I don't have tattoos either, but I don't tell other people what to do with their skin.

2. Vacations. Hey, everyone needs a break. But you don’t need to go into tuition-level debt to have a good time. Your kids will survive if they never visit the Magic Kingdom.

Disney World is undeniably expensive, but calling it "tuition-level debt" is hyperbole beyond even my usual levels.

4. Restaurants. Eating out, or buying $4 designer coffee, is expensive and—wait for it—it’s also a luxury. Skip that daily $4 coffee and after 30 years you’ll have more than $121,000, assuming a 0.5% monthly return.

What you don't seem to understand is that some people work two or three jobs just to make ends meet, which leaves absolutely no time or energy for grocery shopping or cooking for yourself or even making your own coffee. Also, a 0.5% monthly return (which is 6% a year) is a hell of an assumption; it does track with the long-term average post-inflation gains of the stock market, but there are no low-risk investments that can provide that sort of return.
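For what it's worth, the arithmetic behind the $121,000 checks out; it's just the future value of a monthly deposit. Quick sketch, using the article's own figures:

```python
# Future value of banking the daily coffee money, compounded monthly.
# The $4/day and 0.5%-per-month figures come straight from the quoted article.

daily_coffee = 4.00
monthly_deposit = daily_coffee * 365 / 12  # about $121.67 a month
monthly_rate = 0.005                       # 0.5% per month (~6.2% a year compounded)
months = 30 * 12

balance = 0.0
for _ in range(months):
    balance = balance * (1 + monthly_rate) + monthly_deposit

print(f"after 30 years: ${balance:,.0f}")  # lands right around $122,000
```

Of course, that assumes you have the spare four bucks, the thirty years, and somewhere to reliably earn that return, which was kind of my point.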

7. Credit cards. When people say they live paycheck to paycheck, does that include purchases put on credit cards that aren’t paid off that month? In that case, they’re spending more than their paycheck—and what they buy will cost them the purchase price, plus a hefty interest rate.

I gotta agree on this one. But credit cards aren't the problem.

8. Lottery. The lowest-income groups spend the most on lottery tickets, wasting hundreds of dollars a year—about the same as that $400 emergency fund they don’t have.

We've discussed this here before. Yes, the math shows that it's not worthwhile. But life is about more than math; the lottery is the only thing that gives some people hope. Sure, some folks have a gambling problem, but let's address that instead of being like "oooh gambling ooga booga."

10. Shoes. Surveys suggest the average American woman owns more than 25 pairs of shoes, which they admit they don’t need. So why buy so many pairs? It seems shopping and wearing trendy stuff makes us feel good.

"So I want to ensure that you don't feel good."

14. Holidays. Somehow, every December, financial caution goes out the window and we pay for it the following year. But my pet peeve are those inflatable characters on lawns that cost hundreds of dollars. Talk about blowing money.

They annoy me, too, but not in terms of how much money people spend on them. And yet, I understand that they make some people deliriously happy, and who am I to argue with that?

16. Haircuts. The average haircut reportedly costs $28.30 in a barber shop. Many men pay a lot more. Nowadays, nearly a third prefer a “salon.”

Admittedly, I got off that hamster wheel. Long-haired hippie freak for the win.

I kind of agree with some of what he's saying, but here's the issue I have:

Pretty sure I've said this before, but whatever. I once saw a video on YouTube (so it may or may not be true, but that doesn't matter for my point) where a guy talked about spending $22,000 on Superb Owl tickets one year. Now, that's tuition-level spending, unlike Disney World. It's not something I'd want to do. I wouldn't spend $22K on any entertainment, not even Springsteen tickets. But, so what? It's what he wanted. Presumably, it was his dream, his life goal. You know what I would spend $22K on? A beer tour of Europe. Lots of people would think that's a massive waste of money, but it's my dream and one of my life goals, assuming WWIII ends in my lifetime.

The point being that we all have things that we want, over and above needs. Some of them involve spending money, because despite what you're told by people with a vested interest in keeping you poor (companies, churches, etc.), money can buy happiness, at least briefly (all happiness is brief). I won't agree with all your choices; you won't agree with all of mine.

Then there was the time I've also talked about, when, driving through Nevada from Reno to Vegas (one of my all-time favorite drives), I made a pit stop in Beatty. I'm in this little shop waiting for the guy to get off the phone so I can pay him for something I'm buying that I don't really need. Looking around, behind the counter, I see a cat carrier. So of course I'm hoping to see a cat. But this little hand comes out and grips one of the carrier door bars.

Guy gets off the phone. I'm like, "Is that a monkey?"

"Yup. Wanna see?" He proceeds to take out the monkey, a young capuchin named Hannah, who then starts climbing all over the counter, then me, then him, all the while making cute little monkey noises. Meanwhile, he's asking me about what I'm doing in Beattie.

"Oh, I'm on the way to Vegas."

"Not to gamble, I hope. Might as well take a lighter to your cash."

As I was, indeed, on my way to Vegas to gamble, I didn't say anything. But I did note, to myself, that he was dissing me for how I spend my entertainment budget while he had a literal monkey on his back. Not to mention happily taking my money for a T-shirt (which I still have and occasionally wear) and some ghost pepper sauce (which was delicious).

Gambling is, for me, an entertainment expense, like the guy who spent half a year's salary on the Superb Owl. Only not even close to that level. The guy in Beatty blew his money on an exotic pet (legal in Nevada from what I understand). It's easy to rag on someone for spending money on things you consider frivolous; it's also easy to justify spending your own money on something other people consider frivolous.

So I ended up writing a dissertation anyway, I guess. But I can't let this go without noting my own list of things that I, somewhat younger than the self-described curmudgeon above, consider big money wasters.

1. Kids. Utter wastes of money.

2. Sports. By which I mean going to live sporting events, paying for cable, subscribing to ESPN, etc.

3. A boat. Or any other leisure vehicle.

4. Art. Sure, it's nice, but that's what museums are for.

5. Cable TV. Never did have it. I'd be paying $200 a month for something I'd rarely use, and still have to suffer through commercials when I did. No thanks.

Or how about some things that I do spend money on that other people would consider frivolous? Well, no, this has gone on long enough already. If you've been following along, you already know most of it.

Point is, though, sure, people could manage their money better. Or employers could stop being so damn stingy. But when budgeting, it's always important to reserve some discretionary funds, if you can, for things that could bring you joy, or at least relief from the crushing boot of endgame capitalism. Even if those things are items or experiences that some old guy considers a waste.
October 7, 2022 at 12:03am
October 7, 2022 at 12:03am
#1038776
Every time I think I've heard of everything in some category, like clothing or games or food, something comes along to prove me wrong.

This is a good thing.

Entry #4 for "Journalistic Intentions [18+]

Southern Classic: Daube Glacé  
Daube glacé is a labor-intensive dish, best reserved for special occasions


Labor-intensive? Well, that leaves me out. I have a simple rule: if it takes longer to prepare than it does to eat, it's not worth it.

While country ham and salami are hardly foreign to New Orleans these days, they were rarities along the Gulf Coast two centuries ago. “We can’t hang meats outside here. They rot,” says Isaac Toups, who runs the kitchen at Toups’ Meatery. In the years before the advent of refrigeration, locals had to find other ways to keep the pantry stocked.

My father was born in New Orleans before refrigeration was common. Probably explained why he refused to put air conditioning in the house. Doesn't explain why he didn't know any New Orleans recipes.

This no-refrigeration thing plagued brewers, too. They had to get creative to find ways to cool the wort and keep the final product from getting too warm. But that's for another entry.

As Toups tells it, one of the best-known snacks of old New Orleans owes its existence to just those circumstances. “As any chef knows, when you reduce a stock down to a glace, it lasts longer,” he says.

Oh. Yep. Sure. I knew that.

Okay, no I didn't.

Daube is a nourishing beef and vegetable stew with French roots.

Quelle surprise.

When Creole cooks simmered the leftovers into a sliceable, shelf-stable concentrate, they inadvertently created daube glacé, a meaty aspic most often consumed on crackers and mayonnaise-slicked po’boys.

"Inadvertenty?"

Nowadays, many cooks in New Orleans make daube glacé using reliable store-bought gelatin, rather than the gelatins already present in beef stock and in pigs’ feet—an extra ingredient in some vintage recipes.

As an aside, I always feel an unreasonable sense of amusement whenever some vegetarian who didn't know where gelatin comes from finds out where gelatin comes from.

Daube glacé is a labor-intensive dish, best reserved for special occasions. If you’re in New Orleans, you don’t have to make it yourself: Langenstein’s grocery store offers a fine version.

If there's something that takes a long time to do, someone will find a way to commercialize it. Look up the history of ramen sometime. Or maybe I'll do a thing on that at some point.

The rest of the article is the actual recipe. I don't know what it is about online recipes; they always have to write a PhD dissertation before getting to the how-to part. No. Stop that. Post the recipe first and then do the notes. Or at least keep the intro short and to the point. I hate reading a whole book only to find that, as in this case, the recipe is Way Too Much Work to bother with (in this case, though, they did say so up front).

I just have one more thing to say, though: it's fashionable now to make fun of the gelatin mold creations from the middle of the last century. And, to be fair, some of them were absolutely disgusting. But as is often the case, it comes from a place of privilege. Aspics, like the one in the linked article, preserved food in a time before cheap and easy refrigeration. Hell, even the way we make gelatin now is way, way easier. And that's okay. I like easy. Easy is good; work is bad.

My father would probably smack me for saying that, but whatever. I'm still salty that I didn't get to try any Cajun dishes until I left home.
October 6, 2022 at 12:02am
October 6, 2022 at 12:02am
#1038722
As it gets colder here, sometimes I have to be reminded that, objectively, 60F isn't really all that cold.



When I was in college, with access to the proto-internet, I was able to look up temperatures in cities around the world. Whenever it got really cold around here, I used to make myself feel better by looking up the temperature in Nome, Alaska, a town (if you can call it that) barely south of the Arctic Circle.

That stopped when, one day, by some climatic fluke, Nome was actually warmer than Charlottesville.

Even looking up the temperature on Mars didn't always help. Sometimes it gets warm enough there to go out in shorts and a t-shirt. If, you know, it had about 100 times more air pressure and any oxygen.

But of course, some places are even colder. The South Pole in winter, for example. Or most of outer space.

Far outside our solar system and out past the distant reaches of our galaxy—in the vast nothingness of space—the distance between gas and dust particles grows, limiting their ability to transfer heat. Temperatures in these vacuous regions can plummet to about -455 degrees Fahrenheit (2.7 kelvin). Are you shivering yet?

Nope. I'm sitting on my deck on a mid-fifties (F) night, with the infrared heater on, and feeling better about things because at least it's not as cold as intergalactic space.

But understanding how cold is space, and why the vacuum of space is this cold, is complicated.

Because of course it is.

Vacuum doesn't technically have a temperature. There's nothing to have temperature. Vacuum is a remarkably good insulator, which is the science behind advanced Thermos technology. Of course, outer space isn't really a complete vacuum, just close enough for most practical purposes.

For physicists, knowing what the temperature in space is is all about velocity and motion. “When we talk about the temperature in a room, that’s not the way a scientist would talk about it,” Jim Sowell, an astronomer at the Georgia Institute of Technology, tells Popular Mechanics. “We would use the expression ‘heat’ to define the speeds of all the particles in a given volume.”

Yeah, that's a bit misleading. It's more like the average speed of all the particles. Some are zipping right along, while others are in the slow lane getting in other particles' way.
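If "heat equals particle speed" sounds hand-wavy, the standard relation is that temperature tracks the average kinetic energy, so the typical (root-mean-square) speed scales with the square root of temperature. A quick illustration, using hydrogen atoms because that's mostly what's out there; the numbers are purely illustrative:

```python
import math

# Kinetic theory: v_rms = sqrt(3 * k * T / m). Temperature tracks the average
# kinetic energy of the particles, so typical speed goes like sqrt(T).
# Hydrogen atoms used as the example particle; purely illustrative.

K_BOLTZMANN = 1.381e-23  # J/K
M_HYDROGEN = 1.67e-27    # kg, one hydrogen atom

def v_rms(temp_k: float, mass_kg: float = M_HYDROGEN) -> float:
    return math.sqrt(3 * K_BOLTZMANN * temp_k / mass_kg)

for label, temp in [("room temperature", 300.0),
                    ("interstellar cloud", 15.0),
                    ("cosmic background", 2.7)]:
    print(f"{label:>18}: {temp:6.1f} K -> about {v_rms(temp):5.0f} m/s")
```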

Coincidentally, the lowest temperature ever recorded in our solar system was clocked much closer to home. In 2009, scientists measured the depths of a dark crater on the surface of our moon and found that temperatures dropped to about 33 kelvin, according to New Scientist.

Just a bit colder than Nome in winter.

Well, that’s where things get tricky. Within near and distant galaxies, the mesh of dust and clouds that weaves between the stars has been observed at temperatures between 10 and 20 kelvin. The sparse pockets of space that contain little but cosmic background radiation, leftover energy from the formation of the universe, hover at around 2.7 kelvin.

That's where the 2.7K comes from, incidentally: it's the leftover radiation from the Horrendous Space Kablooie.

These temperatures dip perilously close to an elusive measurement: absolute zero. At absolute zero, which equates to -459.67 degrees Fahrenheit, no motion or heat is transferred between particles, even on the quantum level.

Now, I don't claim to be a quantum mechanic. Hell, I can't even change the oil in a neutron. But that wasn't my understanding; even at absolute zero, some quantum fluctuations occur. From Wikipedia: "Absolute zero is the lowest limit of the thermodynamic temperature scale, a state at which the enthalpy and entropy of a cooled ideal gas reach their minimum value, taken as zero kelvin. The fundamental particles of nature have minimum vibrational motion, retaining only quantum mechanical, zero-point energy-induced particle motion."

"Minimum" isn't the same as "none."

Back here on Earth, we have it easy. “You can have high-speed particles zipping by us outside the Earth’s atmosphere, but if you took off your space suit, you would feel cold because there aren't that many particles hitting you,” says Sowell.

"Cold" wouldn't be the only thing to worry about in that situation.

Were you to weave between galaxies in the vacuum of space without a spacesuit, the heat from your body—about 100 watts, according to Space.com—would start to radiate away from you because conduction and convection don't work in space. This would be a slow, frigid way to go, and, eventually, you'd freeze to death. But ... it's likely you'd asphyxiate first.

You know, I'm usually the first one to get annoyed at unscientific scenes in SF movies and shows. Like when a spaceship zooms by and you hear it; you wouldn't, really, because there aren't enough particles to transfer the sound. Or when they portray zero-g doinking; that wouldn't be nearly as much fun, or as easy, as you might think. But one thing they usually get close to right: you can, hypothetically, survive limited exposure to vacuum. I've heard "thirty seconds" bandied about, though I don't think anyone's had the balls to test that. But you can probably hold your breath for more than 30 seconds, so it's not the lack of air. Nor do you get cold enough fast enough for a 30 second exposure to vacuum to kill you. You won't enjoy the experience, sure, but those shows where someone leaps out an airlock and makes it to the next ship over in a few seconds? Sure, probably. The biggest thing you'd have to worry about, I think, is the pressure differential; if you take a deep breath first, your lungs can explode. And the jury's still out on whether your eyeballs would freeze before they exploded, or vice-versa.
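To put some rough numbers on the "slow, frigid way to go" claim: with conduction and convection gone, radiation is all you've got, and a Stefan-Boltzmann estimate (body size, skin temperature, and heat capacity here are my ballpark assumptions, not anything from the article) says you shed heat at under a kilowatt:

```python
# Ballpark radiative heat loss for a person floating in deep space.
# Surface area, skin temperature, emissivity, and heat capacity are
# rough assumptions of mine, not figures from the article.

SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W/(m^2 K^4)
AREA = 1.8         # m^2, rough body surface area
EMISSIVITY = 0.95  # skin is nearly a blackbody in the infrared
T_SKIN = 305.0     # K, roughly 32 C at the surface
T_SPACE = 2.7      # K, cosmic background; its return radiation is negligible

power_w = EMISSIVITY * SIGMA * AREA * (T_SKIN**4 - T_SPACE**4)

heat_capacity_j_per_k = 70 * 3500  # ~70 kg of mostly-water tissue
seconds_per_kelvin = heat_capacity_j_per_k / power_w

print(f"radiated power: about {power_w:.0f} W")
print(f"time to cool by one degree: about {seconds_per_kelvin / 60:.0f} minutes")
```

Call it a few minutes per degree, and that's before your metabolism fights back, so no, thirty seconds in vacuum isn't going to freeze anything but your mood. Evaporation off wet bits like your eyes and mouth is faster, which is where the freeze-and-boil-at-the-same-time fun comes in.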

That's why the common method of execution on a fictional starship is inhumane. You throw someone out an airlock, and they get to flail around in near-total vacuum for maybe up to a minute before they pass out and/or explode and/or freeze and/or boil (at low pressures, you can freeze and boil at the same time, fun fact). Hell, even beheading is more humane than that; the severed head might stay conscious for "only" up to about ten seconds.

No, if I were running the judicial system on a starship, and we had the death penalty, I'd take advantage of there being airtight compartments on the thing, and pump in some knockout gas before evacuating the chamber. Painless, quick, not cold.

Why am I thinking about these things? Well, science fiction writers have to. I may not write much fiction these days, but when I do, I want to get the science (mostly) right.
October 5, 2022 at 12:01am
October 5, 2022 at 12:01am
#1038635
Entry #3 for "Journalistic Intentions [18+]. Blind quote, no link:

"Sure, when teenage girls write self-insert characters that hang out with their faves it's 'cringey' and 'bad fanfiction', but when Dante does it it's a 'literary classic' and 'redefines the way we view Catholicism'."


Uh huh. In 700 years, we'll see how much of today's fanfiction is taught as "the classics."

Look, teenage writing is generally cringey. Mine certainly was, and I wasn't a girl. You know what Dante wrote before his masterpiece? Love poems. LOVE POEMS. Starting when he was in his teens.

I wrote love poems back then, too. Well, not in Dante's day, but almost. I'm glad they've been lost to the mists of time; Dante had no such luck. One line I specifically remember is "Now my broken heart has mended / But the scars won't let it beat."

*Sick* *Sick* *Sick*


You know what else is almost always cringey? Self-insert characters. One of the most successful writers of our time is Stephen King, and you know what it was when he did it, well past his teenage years? Cringey. Like he was writing Stephen King fanfiction.

Oh sure, occasionally you'll get a young writer who's very talented and their early works are things of beauty. Shelley wrote Frankenstein when she was 20, while all of her dudes were stuck in the "love poem" rut (to be fair, those are considered classics too). But not everyone can both create a lasting work of literature and invent a new genre, especially not at that age. No, most of us write cringe in the beginning. And then, if we stick with it, if we really learn our craft well, then maybe—maybe—we write something that stands the test of time. Well, not a "we" that includes me. The general authors' "we."

That said, my advice (for whatever it's worth) to young authors? Do it anyway. Some great authors have built careers starting with fanfiction. Greg Bear comes to mind; he wrote Star Trek novels. So did Diane Duane. Okay, you've probably never heard of either of them, but they both went on to write more than passable SF and/or fantasy.

Doesn't mean I'm going to read it. But just like worldbuilding for a fantasy novel, you're setting up a foundation for the possibility of later greatness.

Speaking of Star Trek, probably the second most famous work of fanfiction (after 50 Shades) in modern history is the story (a work of parody) that coined the term "Mary Sue." I've ranted about that in here in the past, probably for another JI prompt. The canonical "Mary Sue" (insofar as I can use the word "canon" for anything related to fanfiction) is an author self-insert, usually but not always penned by a teenage girl, who's unbearably competent and has no real flaws, and who's universally loved and accepted.

Not fun to read, usually. It's a clear case of wish fulfillment. But in the end, isn't all fiction wish fulfillment? The difference, at least in my mind, between good characters and bad is that the good characters have both external and internal battles to overcome. We're far more satisfied reading about someone who has trouble solving all the puzzles and gets into tough situations—and the Mary Sues or Marty Stus of the writing world are already at the top of their game. It would be like John Wick having the invulnerability, strength, and laser vision of Superman: the stakes would be nonexistent, and the story would be boring.

But the Superman character itself started out as... wish fulfillment. Writing a super-powered character takes some skill; in a compelling story, they have to face challenges that their powers aren't useful for. Like the latest TV incarnation of Superman, who has to deal with having teenage sons, something that, as far as I know, no superhero has ever been able to handle with powers alone.

So, yeah, I don't like teenage fanfiction, and I don't have to or want to read it.

But I'm not going to tell them not to write it. You gotta start somewhere. Artists—by which I mean people who make drawings or paintings—often get their start by copying others; they thereby learn the techniques and only later come into their own style. Or so I heard; I can't draw to save my life. Musicians start out by learning others' chord progressions. And so on. It should be no different with writers.

As a final note, "fanfiction" has somehow acquired the connotation of being sexual in nature, expressing the writer's desire to doink some character someone else created. While it can be erotic (again, 50 Shades), that's not the essential feature of fanfiction. I'm using the term in its broader sense here.

That said, I did once write a scene featuring a liaison between Snape and McGonagall.

"Oh! Yes! Ten points to Slytherin! Twenty points to Slytherin! YES! FIFTY POINTS TO SLYTHERIN!"

...no, I will not be sharing the rest of it. You're welcome.
October 4, 2022 at 12:02am
October 4, 2022 at 12:02am
#1038578
Even as a kid, I had issues with this.

How the Pledge of Allegiance Went From PR Gimmick to Patriotic Vow  
Francis Bellamy had no idea how famous, and controversial, his quick ditty would become


My biggest issue was expecting second-graders to know what words like "allegiance" meant.

On the morning of October 21, 1892, children at schools across the country rose to their feet, faced a newly installed American flag and, for the first time, recited 23 words written by a man that few people today can name. “I pledge allegiance to my Flag and to the Republic for which it stands—one nation indivisible—with liberty and justice for all.”

While this article (actually an ad for a book but it still contains interesting information) is from 2015, I note that this month marks the 130th anniversary of that day. That's a bit more than half of the country's existence.

Francis Bellamy reportedly wrote the Pledge of Allegiance in two hours...

What did he do, laboriously engrave it into granite? Hopefully it was American granite.

In a marketing gimmick, the Companion offered U.S. flags to readers who sold subscriptions, and now, with the looming 400th anniversary of Christopher Columbus’ arrival in the New World, the magazine planned to raise the Stars and Stripes “over every Public School from the Atlantic to the Pacific” and salute it with an oath.

"A marketing gimmick."

I mean, really, can you think of anything more quintessentially American than that? Okay, anything that doesn't involve firearms?

Bellamy, a former Baptist preacher, had irritated his Boston Brahmin flock with his socialist ideas.

I just want to leave this part here for contemplation.

In a series of speeches and editorials that were equal parts marketing, political theory and racism, he argued that Gilded Age capitalism, along with “every alien immigrant of inferior race,” eroded traditional values, and that pledging allegiance would ensure “that the distinctive principles of true Americanism will not perish as long as free, public education endures.”

Okay, maybe rampant capitalism, anti-immigrant sentiment, racism, and jingoism are right up there with marketing gimmicks as true American values. We're trying to destroy free public education, though, so there's that.

In 1954, as the cold war intensified, Congress added the words “under God” to distinguish the United States from “godless Communism.”

Because clearly that's the only difference.

It's rare that I'm in a position to endure the Pledge these days, but when I do, I omit that part on general principle.

That it was a preacher who wrote the thing, and didn't include any references to any deity in it, should be cause for thought. But again, this is America; thought is for commies and pussies.

The snappy oath first printed in a 5-cent children’s magazine is better known than any venerable text committed to parchment in Philadelphia.

Oh, I don't know. The Schoolhouse Rock rendition of the Preamble to the Constitution is an earworm that's firmly etched in my memory.

Yet the pledge continues to have its critics, with some pointing out the irony of requiring citizens to swear fealty to a nation that prizes freedom of thought and speech.

On paper.

The historian Richard J. Ellis, author of the 2005 book To the Flag: The Unlikely History of the Pledge of Allegiance, acknowledges that the oath is “paradoxical and puzzling,” but he also admires the aspirational quality of its spare poetry. “The appeal of Bellamy’s pledge is the statement of universal principles,” he says, “which transcends the particular biases or agendas of the people who created it.”

What universal principles? That a flag exists? "Republic," "indivisible," "liberty and justice" are all abstract concepts and hardly universal.

So is "nation," for that matter.

Bellamy did some transcending of his own. The onetime committed socialist went on to enjoy a lucrative career as a New York City advertising man, penning odes to Westinghouse and Allied Chemical and a book called Effective Magazine Advertising.

Ah, yes, the sure cure for socialism: money.

But his favorite bit of copy remained the pledge—“this little formula,” he wrote in 1923, with an ad man’s faith in sloganeering, which “has been pounding away on the impressionable minds of children for a generation."

Sure, he didn't invent indoctrination. But he nationalized it.

Despite my issues with making impressionable kids recite a loyalty oath, though, I can't fault the ideals of "liberty and justice for all." But I think we have to acknowledge that those ideals are something to work towards, not reflective of the current state of affairs. Especially knowing that the guy who wrote the words wasn't interested in the "for all" part, going by his statements on "inferior races." (Just to be clear, no, I'm not trying to "cancel" him. Sucks that I have to issue this disclaimer, but these days that's what some people will infer.)

I vaguely remember saying a while back that the Coca-Cola Santa Claus was the most successful marketing campaign of all time. I'm big enough to admit that I was wrong: the Pledge holds that title.
October 3, 2022 at 12:03am
October 3, 2022 at 12:03am
#1038519
Entry #2 for "Journalistic Intentions [18+] today.



Having never heard of Justin Sutherland, I was relieved to discover he's not the illegitimate love child of Justin Bieber and Kiefer Sutherland. My relief was short-lived, however, as it became clear that the above linked "article" is a barely-disguised ad. Fortunately, it's mostly an ad for a book, which I've repeatedly stated isn't going to stop me from linking it on a writing site.

Anyway, it turns out that Sutherland is a chef, though someone I'd never heard of. Not that surprising, as I don't follow celebrity chefs and I don't live in Minnesota.

Sutherland was boating with friends on the St. Croix River on July 3 when his hat blew off, causing him to reach for it. At the same moment, the boat hit a wave, sending him into the water near the propeller, which "did a number on his head and left arm,"

Sounds like he was more than "near" the propeller. I guess you could say he got screwed.

Look, I'll never pass up an opportunity for a pun, no matter how tragic the incident that prompts one. I'm not making fun of the accident; it sounds like no fun at all, one of those shitty things that occasionally happens to people.

Here's a snippet of Sutherland's interview. You can watch his full appearance on The Jason Show in the player above.

No.

Jason: What do you remember about that day?

Sutherland: I mean, honestly, I remember everything. It was a beautiful, perfect, you know, day. I remember everything from hitting the water, to getting slammed back to the boat, and to getting to the ambulance.


The key word being "was." One wonders how much booze was involved, though a large amount wouldn't work with remembering "everything."

Jason: I know this seems very simple, but you don't remember any of the pain?

Sutherland: No. Not a single piece.


I ain't no doctor, but that sounds like shock to me.

Jason: You look fantastic!

Sutherland: Regions was amazing … I still haven't eaten solid food in nine weeks now. I have five metal plates in my face. And this is all brand new (pointing to his face), I can't put any pressure on my teeth. I've lost about 14 pounds, trying to drink protein shakes.


That sucks and all, but at least he's a chef: those protein shakes were probably amazing. For protein shakes.

Jason: Let's talk about the positive … the positive is the support. Wow. … What did that feel like?

Sutherland: Incredible. … Nothing but just gratitude. We touch a lot of people in our lives … but sometimes you don't realize the people you cross paths with and the lives that you impacted. The people that came together – still, eternally grateful.


You know, this is kind of refreshing. So much so that I'm willing to overlook the whole "positivity" thing. Why? Because he wasn't all "God healed me." No, he acknowledges the very human, very skilled people involved in his recovery. I don't see a lot of that in the news. I see a lot of "That tornado ripped through and slammed me against a trailer and I couldn't walk for a week, but God healed me," ignoring the neurosurgeon that fixed the walking thing. Or, "I was driving drunk and slammed into a tree, but after a month I could see again, thanks to God," without mentioning the doctors who fixed your unworthy drunk-driving-ass face.

Never mind that God did those things to you in the first place. I mean, if he's in control, then didn't he both send the tornado *and* heal you afterward? If not, then he's not all-powerful, is he? And if he does, why only praise him? It's like, occasionally you'll get a firefighter who lights shit on fire and then swoops in to play the hero. We call them "arsonists" and they generally get arrested for it.

Or maybe the initial incident was some combination of your fault and the random workings of the universe; the medical attention you get afterward, though, that's deliberate. No need to resort to supernatural causes.

Don't get me wrong; I'm not condemning people who have faith that they think helps them get through a tough situation. Whatever brings you comfort. But all the work done to fix your broken ass? That was people. Humans. Sure, some of us suck, but we can also be quite clever, competent, and helpful. Well, not me. But doctors and EMTs and such. Not to mention any of your friends and family who help you through it all.

Point being, it sounds like this guy acknowledges that.

Well, the rest of the article is the ad for his book and his restaurants, and you can look at that part if you want. As for me, next time I'm in the area, maybe I'll remember this and check out one of those restaurants. After all, that's one of the reasons I travel in the first place.

I just hope they're serving solid food.
October 2, 2022 at 12:02am
October 2, 2022 at 12:02am
#1038460
Today's article is from over five years ago, but it should still be relevant.



Why is it relevant? Well, maybe you want to sell something. Maybe that something is a story, to a publisher or to the general public as a self-publisher.

Or maybe, like me, you just want to know what tricks they're using so you can protect yourself.

Sadly, the "four-letter code" turns out not to be one starting with F.

Several decades before he became the father of industrial design, Raymond Loewy boarded the SS France in 1919 to sail across the Atlantic from his devastated continent to the United States. The influenza pandemic had taken his mother and father, and his service in the French army was over. At the age of 25, Loewy was looking to start fresh in New York, perhaps, he thought, as an electrical engineer. When he reached Manhattan, his older brother Maximilian picked him up in a taxi. They drove straight to 120 Broadway, one of New York City’s largest neoclassical skyscrapers, with two connected towers that ascended from a shared base like a giant tuning fork. Loewy rode the elevator to the observatory platform, 40 stories up, and looked out across the island.

Appropriately, the lede comes from the New Yorker School Of Not Getting To The Fucking Point.

That building still exists, by the way. It is, itself, a pretty amazing design; it's no wonder that it apparently influenced a designer.

In France, he had imagined an elegant, stylish place, filled with slender and simple shapes. The city that now unfurled beneath him, however, was a grungy product of the machine age—“bulky, noisy, and complicated. It was a disappointment.”

That's okay. We Americans feel something similar when we first visit Paris.

The world below would soon match his dreamy vision. Loewy would do more than almost any person in the 20th century to shape the aesthetic of American culture. His firm designed mid-century icons like the Exxon logo, the Lucky Strike pack, and the Greyhound bus.

Great. Two industries that deliberately contributed to our impending doom, and a bus.

To sell more stuff, American industrialists needed to work hand in hand with artists to make new products beautiful—even “cool.”

It is true that one of the only practical uses for art (for certain values of "practical") is in marketing.

Loewy had an uncanny sense of how to make things fashionable. He believed that consumers are torn between two opposing forces: neophilia, a curiosity about new things; and neophobia, a fear of anything too new. As a result, they gravitate to products that are bold, but instantly comprehensible.

I once got into an argument with a published author at a convention. She claimed that people wanted new things. I countered with evidence that they just want old things, repackaged—this was at the height of the "vampire" craze in the noughties, and everyone was talking about Twilight. I also noted that Harry Potter wasn't anything new; it just merged the classic British "boarding school" genre with high fantasy.

I lost the argument on the basis of, well, she was published and I'm not, but I still don't think I was wrong. And we may have actually been saying the same thing, which is what Loewy's saying here: that stories need a touch of the familiar and a touch of the novel (pun intended, as usual). So yeah, this was the bit that made me want to link this here, because it's just as relevant to selling a story as it is to selling a pack of chewing gum.

Loewy called his grand theory “Most Advanced Yet Acceptable”—MAYA. He said to sell something surprising, make it familiar; and to sell something familiar, make it surprising.

Sounds easier said than done, but it does ring true for me. And I'm just glad it's not a stupid acronym like ABC - Always Be Closing.

Why do people like what they like? It is one of the oldest questions of philosophy and aesthetics. Ancient thinkers inclined to mysticism proposed that a “golden ratio”—about 1.62 to 1, as in, for instance, the dimensions of a rectangle—could explain the visual perfection of objects like sunflowers and Greek temples.

There's still discussion about the Golden Ratio, but as far as I can tell, it may not be "the most beautiful," but it certainly doesn't suck. And it's nice to know the GR, anyway. If only to remember that it's a quick and easy approximate conversion from km to miles and vice versa.
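Since I brought it up: the golden ratio (about 1.618) sits close enough to the 1.609 kilometers in a mile that the trick genuinely works, and consecutive Fibonacci numbers give you the same shortcut in your head. A quick demonstration:

```python
# The golden ratio (~1.618) is close enough to kilometers-per-mile (1.609)
# that multiplying or dividing by it converts between the two, and
# consecutive Fibonacci numbers give the same shortcut.

PHI = (1 + 5 ** 0.5) / 2  # 1.618...
KM_PER_MILE = 1.609344

for miles in (5, 8, 13, 21, 50, 100):
    exact = miles * KM_PER_MILE
    estimate = miles * PHI
    print(f"{miles:3d} mi = {exact:6.1f} km; golden-ratio estimate {estimate:6.1f} km")

# Mental version: each Fibonacci number in miles is roughly the next one in km
# (5 mi ~ 8 km, 8 mi ~ 13 km, 13 mi ~ 21 km).
```

The error is well under one percent, which is better than my mental arithmetic on a good day.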

Other thinkers were deeply skeptical: David Hume, the 18th-century philosopher, considered the search for formulas to be absurd, because the perception of beauty was purely subjective, residing in individuals, not in the fabric of the universe. “To seek the real beauty, or real deformity,” he said, “is as fruitless an enquiry, as to pretend to ascertain the real sweet or real bitter.”

Hume obviously lived before there was really good beer.

In the 1960s, the psychologist Robert Zajonc conducted a series of experiments where he showed subjects nonsense words, random shapes, and Chinese-like characters and asked them which they preferred. In study after study, people reliably gravitated toward the words and shapes they’d seen the most. Their preference was for familiarity.

I'm sure that, as with most studies of that sort, they completely ignored the minority preference for novelty.

The evolutionary explanation for the mere-exposure effect would be simple: If you recognized an animal or plant, that meant it hadn’t killed you, at least not yet.

Sigh. At the risk of repeating myself to exhaustion, evo-psych explanations are speculative at best.

But the preference for familiarity has clear limits. People get tired of even their favorite songs and movies. They develop deep skepticism about overfamiliar buzzwords. In mere-exposure studies, the preference for familiar stimuli is attenuated or negated entirely when the participants realize they’re being repeatedly exposed to the same thing. For that reason, the power of familiarity seems to be strongest when a person isn’t expecting it.

This, however, also makes sense.

Several years ago, Paul Hekkert, a professor of industrial design and psychology at Delft University of Technology, in the Netherlands, received a grant to develop a theory of aesthetics and taste. On the one hand, Hekkert told me, humans seek familiarity, because it makes them feel safe. On the other hand, people are charged by the thrill of a challenge, powered by a pioneer lust.

Meh. Challenges are exhausting. That's why I prefer being in a relationship to pursuing one.

Raymond Loewy’s aesthetic was proudly populist. “One should design for the advantage of the largest mass of people,” he said.

And once again, that leaves out outliers like me.

Let me give you an example. I used to buy Oreos fairly often. Don't judge. But then Nabisco started putting out different flavor Oreos. Some of them were obvious trolls, like Candy Corn Oreos (abomination) and Swedish Fish Oreos (crime against nature). But they also made Dark Chocolate Oreos. Those were delicious. So delicious, in fact, that when they pulled them because they weren't popular enough, I quit buying any kind of Oreos. So instead of reaching a wider audience, they lost one: me.

Could Loewy’s MAYA theory double as cultural criticism? A common complaint about modern pop culture is that it has devolved into an orgy of familiarity.

Sure, people complain about that. But then when a truly new movie (for example) comes out, no one goes to see it. So there's no incentive to try new stuff there.

The hit-making formula in Hollywood today seems to be built on infinitely recurring, self-sustaining loops of familiarity, like the Marvel comic universe, which thrives by interweaving movie franchises and TV spin-offs.

You shut your whore mouth. Those are (mostly) awesome.

One of Loewy’s final assignments as an industrial designer was to add an element of familiarity to a truly novel invention: NASA’s first space station.

Can't get more novel than the first space station, I suppose. But this led me to wonder who designed the iconic NASA logo.

Huh. It was an artist working for NASA.  

See? There's hope for that liberal arts degree yet.
October 1, 2022 at 12:02am
October 1, 2022 at 12:02am
#1038403
I'm participating in "Journalistic Intentions [18+] again this month, so some of my entries will be prompts from that activity.

This is one of them—as usual, selected at random.



When I was in Minneapolis in the summer of 2021, I got to see some of these locks and dams up close. I wrote, briefly, about it at the time: "Dam It

So apparently, based on the link above, there's talk about removing some of them. Not the ones I toured, but downstream from there.

You might think that, being a civil engineer whose whole thing was hydrology, I'd have an opinion on it. After all, I have opinions on plenty of things that I know far less about. But you'd be wrong. Well... I do actually have an opinion, which is "they should do what's expected to have the best overall outcome." Which is why we have people do studies for this sort of thing.

If I thought about it long enough, I'd probably come to the conclusion that this should probably be my opinion on everything else, too. But no, then life would be no fun, and worse, I'd have nothing to blog about.

Besides, rivers are complicated. They build dams for reasons: flood control, erosion prevention, power, recreational purposes; usually some combination of these things. There's always a trade-off too, like when the dam blocks fish spawning or whatever, or the resulting lake drowns a community.

Left to its own devices, a river will shift over time. Not just geological time, but often in one human lifetime. You can see the effects if you look closely at a map of the rest of the Mississippi: horseshoe bends where the river once flowed, and down the center of which a state line was surveyed, have gotten cut off, leaving, for example, a bit of Tennessee on the Arkansas side of where the river is now. Several bits, actually. Some of that might have been due to Army Corps projects, but probably there was also some non-human-caused shifting over the years. If you wanna see, look up Memphis on Google Maps and then scroll north.

And I'm not saying this is bad, or good; it just is. We modify rivers for our own purposes, and for a long time we didn't think much about the consequences. Now that they're modified, though, there are consequences to unmodifying them, as well.

Even further downstream, in Louisiana, the primary Mississippi discharge has been trying to shift over to the Atchafalaya river (no, I can't pronounce it, either). If it weren't for levees and locks and whatnot—engineering projects—it probably would have switched channels by now. From what I understand, it used to switch back and forth every thousand or two years. As one channel silts up, water flows to the other, and vice-versa.

This, of course, would leave New Orleans dry (if not high), so there are economic reasons not to allow it. But there are probably good solid ecological reasons to do so. But we're a part of the ecology, too, part of nature, and we have to make decisions based on incomplete information.

I'm just glad it's not my job.
September 30, 2022 at 12:07am
September 30, 2022 at 12:07am
#1038358
My main sources for scientific discussions are YouTube (I must have broken their algorithm because everything I see on the sidebar is comedy, booze, science, philosophy, music, math, or some combination thereof) and, of course, Cracked.

You can go down your own YouTube rabbit hole; today I'm linking the latter.



Science is generally self-correcting. Theories are discarded or adjusted when new experimental data comes in, or is interpreted differently. But one thing that remains constant, apart from the speed of light, is that no matter how weird we think the Universe is... it always turns out to be weirder.

After all, we’re only here because of a giant explosion that occurred everywhere yet also nowhere, and for no reason, during a time before time. It may have happened infinite times to create an infinite multiverse. It may be happening “now,” though “now” is relative and doesn’t really mean anything worth jack.

Yeah, it's not really an explosion because there was nothing for it to explode into. If it actually happened. There's still some pushback to the Horrendous Space Kablooie theory.

5. Do We Have Another Planet? It May Be A 13.8-Billion-Year-Old Primordial Grapefruit 10 Times Heavier Than Earth

To quote Douglas Adams, “Space [...] is big. Really big. You just won't believe how vastly hugely mind-bogglingly big it is. I mean, you may think it's a long way down the road to the chemist, but that's just peanuts to space.” Well, compared to the vast distant reaches of space, even the outer reaches of our solar system might as well be a trip to the chemist (that's pharmacy for us Americans). And, as this section indicates, we don't even know how many things orbit our sun.

One thing that probably will be solved in our lifetime, cyborg or not, is among the most mouth-watering space mysteries: the existence and identity of Planet Nine. Is our solar menagerie hiding at least one more big planet? Or just another derelict Kmart? The unequivocal answer is maybe.

Your lifetime, maybe; probably not mine.

If a hidden planet exists, it’s 400-800 times farther away from the Sun than Earth. So it would trace an elongated 20,000-year orbit. Humans were still wiping themselves with their hands last time it swung around our star.

To be clear, we still do; there's just usually paper in said hand. Unless you're fancy and own a bidet.
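
As for that 20,000-year figure, it roughly checks out with Kepler's third law: for things orbiting the Sun, the period in years is about the semi-major axis in astronomical units raised to the 3/2 power. A minimal sketch, treating the quoted 400-800 range as semi-major axes (my assumption, not the article's):

# Rough check of the quoted 20,000-year orbit using Kepler's third law.
# Assumes the 400-800 "times farther than Earth" figures are semi-major axes in AU.
for a_au in (400, 600, 800):
    t_years = a_au ** 1.5   # period in years, for solar orbits
    print(f"a = {a_au} AU  ->  T = {t_years:,.0f} years")
# Prints roughly 8,000, 14,700, and 22,600 years.

So "about 20,000 years" sits comfortably inside that range.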

Being so far away, scientists can’t see P9, but infer it’s there because nearby objects seem to be pulled by the gravity of something hefty.

On the one hand, that's a totally legitimate way to determine if there's another planet involved; it worked to discover Neptune, as I recall. On the other hand, discrepancies between Mercury's observed orbit and its predicted one led to the hypothesis that there was another planet orbiting even closer to the sun. They even named the planet: Vulcan. Plot twist: Vulcan doesn't exist; the perturbations turned out to be due to relativistic effects.

Point being, maybe there's another planet; maybe our theory of gravity needs to be tweaked again; or maybe it's Maybelline.

If they do find that planet, please lobby for it to be named Maybelline.

Let’s not entertain such dismal, unexciting possibilities. A much more excellent idea is that P9 isn’t a planet but a black hole. A primordial black hole, dating to the first second of creation. As in, the first second ever.

No.

But it's not as farfetched as you might think. Beyond a certain distance, a black hole's gravity acts just like that of any other object with the same mass. If our sun were suddenly replaced by a black hole of equal mass, all the planets would continue orbiting sedately just as they do now, not get sucked directly in like pulp SF would have you believe. Which we'd notice just before we froze to death.
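
For scale: the only "size" you can sensibly assign to a black hole from the outside is its Schwarzschild radius, r_s = 2GM/c^2. A quick back-of-the-envelope sketch, using standard constants (the grapefruit comparison is my guess at what the header is getting at):

# Schwarzschild radius: the radius of the event horizon for a mass M.
G = 6.674e-11        # gravitational constant, m^3 / (kg s^2)
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # kg
M_EARTH = 5.972e24   # kg

def schwarzschild_radius_m(mass_kg):
    return 2 * G * mass_kg / c**2

print(schwarzschild_radius_m(M_SUN))        # ~2950 m: a Sun-mass black hole is ~3 km in radius
print(schwarzschild_radius_m(10 * M_EARTH)) # ~0.09 m: ten Earth masses, roughly grapefruit-scale

Same mass, same gravity at a distance; the only thing that changes is how close you can get before things go badly.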

4. We Can Theoretically Wee Actual, Individual Aliens By Making A Solar System-Sized “Virtual Telescope”

Please don't wee aliens. They might get pissed off.

Scientists just achieved the most amazing visual feat ever: capturing an image of Sagittarius A*, the bulldog of a black hole at the center of our Milky Way.

This article came out about the same time as JWST started producing real images, and some of those are spectacular. Still, imaging something that far away is pretty amazing in its own right.

It has the mass of 4 million Suns, packed into an infinitely tiny point.

That's a bit misleading. The dimensions of a black hole are undefined because dimension implies space, and the whole point (see what I did there) of a black hole is that it warps space so completely that the concept of "distance" or "radius" or "diameter" is meaningless. And the event horizon, whose diameter can be inferred, isn't infinitely tiny. Whatever. I'm quibbling about language written on a dick joke site that can't even edit out obvious typos in headers.

Anyway, the image is helpfully included in the article, but it hardly matters; you've seen it. It's become iconic. Even Star Trek started riffing off of it. The real point is HOW they did it: combining widely-spaced Earthbound telescopes into one giant virtual telescope, which is indeed cool. Now imagine putting a bunch of space telescopes in orbit around the sun, say between Earth and Mars, and turning them into a virtual telescope. That's what they mean when they claim we could "wee" aliens. If aliens existed.
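
The back-of-the-envelope version of why baseline matters: angular resolution scales roughly as the observing wavelength divided by the distance between the dishes. A rough sketch, assuming the EHT's 1.3 mm observing wavelength and an Earth-diameter baseline versus a hypothetical ~2 AU virtual telescope (those are my stand-in numbers, not the article's):

# Diffraction-limit estimate of angular resolution: theta ~ wavelength / baseline.
import math

def resolution_microarcsec(wavelength_m, baseline_m):
    theta_rad = wavelength_m / baseline_m
    return math.degrees(theta_rad) * 3600 * 1e6   # radians -> microarcseconds

EARTH_DIAMETER = 1.27e7   # meters, roughly the EHT's longest baseline
AU = 1.496e11             # meters

print(resolution_microarcsec(1.3e-3, EARTH_DIAMETER))  # ~21 microarcseconds
print(resolution_microarcsec(1.3e-3, 2 * AU))          # ~0.0009 microarcseconds

Twenty-ish microarcseconds is enough to resolve the shadow of Sagittarius A*; shrink that by another factor of twenty thousand or so, and use shorter wavelengths to shrink it further still, and you get the (very speculative) see-the-aliens pitch.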

3. Sadly, No Telescope Can See An Anti-Universe With Backward Time Which Could Explain Dark Matter. But Math Can

I mean, okay. This is speculation. Math is very, very good at describing the universe (and even perhaps a multiverse), but it doesn't follow that just because math shows something, it must have a physical interpretation.

To be clear, no one yet knows exactly what dark matter is. If it's even matter. That's cool; it means there's more to discover.

2. Neutrinos Will Blow Your Ass Out Of Its Mind

In addition to the Big Bang and other energetic origins, neutrinos come from nuclear reactions in stars

Neutrinos are another thing we don't fully understand. As with anything else we don't understand, speculation runs rampant. Like in this section. Still, as long as we know it's speculation, it's interesting to read about.

1. An Infinite Universe Guarantees The Occurrence Of Things That Are So Unlikely It’s Literally Impossible To imagine, Comprehend, or Perceive

I mean, sure. Infinity itself is impossible to imagine, comprehend, or perceive. It also might not really exist, being perhaps one of those mathematical concepts that doesn't have a real-universe counterpart.

Infinity means that any non-zero event will occur.

I think they mean "any non-zero probability event."

Infinitely.

Not necessarily.

The simplest conception of infinity is "the set of all positive integers." We have notation that can render any arbitrary integer, like 2, or 42, or 10^100 — which is just as far away from infinity as 2 is. But each of those numbers is unique. Sure, our notation uses some combination of ten arbitrarily-assigned digits, so two numbers may seem to have something in common, like 543 and 1345, but an infinite set doesn't have to repeat itself; the integers go on forever without a single duplicate. So just because you label something "infinite" doesn't mean, as is sometimes popularly reported, that there's an exact copy of you running around on some exact copy of Earth somewhere else in the universe.

And since the universe appears to be infinite, this forces us to ask if infinity exists, or is just a human failing to comprehend the true nature of the cosmos.

Or, you know, I could be wrong about that. It's not like I'm any better at comprehending the nature of the cosmos, or infinity, than any other ape.

And to be strict about it, there are different kinds of infinity.
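
The canonical example of "different kinds": the positive integers are countably infinite, meaning you can list them, while the real numbers are not. Cantor's diagonal argument shows that any attempted list of infinite binary sequences (stand-ins for the reals) necessarily misses one: just build a sequence that differs from the n-th entry at position n. A toy sketch of that construction, purely for illustration:

# Cantor's diagonal trick: given any enumeration of infinite binary sequences,
# construct a sequence that can't appear anywhere on the list.
def diagonal(enumeration):
    # enumeration(n) returns the n-th listed sequence, itself a function: position -> bit
    return lambda k: 1 - enumeration(k)(k)   # differ from sequence k at position k

# A sample "list": sequence n is the constant sequence n % 2 (all zeros, all ones, ...)
listed = lambda n: (lambda _pos: n % 2)

missing = diagonal(listed)
print([missing(i) for i in range(8)])   # [1, 0, 1, 0, 1, 0, 1, 0]; disagrees with sequence i at position i

The constructed sequence disagrees with every listed sequence somewhere, so it isn't on the list; hence, a strictly bigger infinity.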

Hope that clears everything up.


© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
