Rated: 18+ · Book · Personal · #1196512
Not for the faint of art.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
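
For example, the Mandelbrot set falls out of iterating z → z² + c over and over again, about as simple a transformation as complex arithmetic allows. Here's a minimal sketch in Python, whose built-in complex type handles the a + bi arithmetic directly (the grid and cutoff values are my own choices, nothing canonical):

def escape_time(c: complex, max_iter: int = 50) -> int:
    """Count iterations of z -> z*z + c before z escapes |z| > 2."""
    z = 0j
    for n in range(max_iter):
        if abs(z) > 2:
            return n
        z = z * z + c
    return max_iter

# Crude ASCII rendering of the complex plane: real axis horizontal,
# imaginary axis vertical, exactly as described above.
for im in range(12, -13, -2):
    row = ""
    for re in range(-40, 21):
        point = complex(re / 20, im / 10)
        row += "#" if escape_time(point) == 50 else " "
    print(row)

Points that never escape (the "#" characters) belong to the set; zoom in anywhere along its boundary and the intricacy never bottoms out.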




September 9, 2022 at 12:01am
#1037524
I'm going to talk about a British thing today. No, not that British thing. Or that one. Or even that one. And it doesn't even involve York, but New York.

Physical Graffiti Building  
Manhattan, New York
The East Village tenement from the cover of Led Zeppelin’s 1975 album has a tea shop on the ground floor.


Yes, I've been all over Manhattan, but it's a big place and I don't think I ever passed by this building. Hell, I owned a copy of the album as a kid. (Me: "Most expensive album cover ever made." Dad: "And you bought it?") I guess I just assumed the building was in the UK somewhere, because, you know, Led Zeppelin.

Incidentally, expensive to produce, not to buy. And I was just going by some random DJ's assertion; I never bothered to verify it, because this was before the internet and also I was already lazy. Further, for all I know, that was bullshit, or maybe some later album exceeded it. Anyway, as I recall, the consumer price was about the same as any other double LP collection. Pink Floyd's The Wall, e.g., which also had amazing cover art.

Led Zeppelin’s 1975 album Physical Graffiti is considered one of their best, featuring such fan-favorite songs as “Kashmir” and “Trampled Under Foot.” Its release saw a delay due to the complexity of its cover design, which proved difficult to manufacture.

Hence the supposed cost.

Designed by Peter Corriston, the iconic die-cut artwork portrays a symmetrical brownstone tenement block, something that could only hail from New York City, with each letter of the album’s title written on its window.

I always thought the cover was clever. It's one of those things that you can only appreciate as an LP cover. Not nearly as impactful on CD, and certainly not as digitized artwork on the internet.

The building in question is a real tenement in Manhattan, still standing at 96 and 98 St. Mark’s Place. Since similar-looking buildings are not uncommon in the city, more than a few tourists have mistaken other brownstones for the one on the album cover. But once you know the actual location, it’s not hard to find.

Other fun places to visit in Lower Manhattan include the Ghostbusters firehouse  , clear on the other side of the island; and, very close to the PG tenements, The Museum of the American Gangster.  

Today, the East Village building is home to a basement tea shop aptly named Physical GraffiTea, boasting a great selection of organic, fair-trade tea and a reputation for offering a wide selection of loose leaf tea and medicinal blends.

A tea shop with a punny name? Now I definitely have to go.

As for my copy of the album, well, it—along with a couple hundred other LPs—was destroyed in a flood, when I was renting a shitty basement apartment not far from where I live right now. This was around the time that CDs were supplanting LPs and cassettes, 1987 or 88 or so. I lost a great deal that fateful night, but the one that still hurts when I think about it, 35 years later, is my album collection.

I never did replace Physical Graffiti. But another thing I'll always remember about it is my old dad (he wasn't much older than I am right now) coming into my room while I was listening to it and going, "That's good music." Considering he would have turned 105 tomorrow, and he grew up pre-rock, I called that high praise.

Can't let a music entry go without including some.



Oh, let the sun beat down upon my face
And stars fill my dream
I'm a traveler of both time and space
To be where I have been
To sit with elders of the gentle race
This world has seldom seen
They talk of days for which they sit and wait
All will be revealed

September 8, 2022 at 12:01am
#1037485
Today is Star Trek Day, the anniversary of the day in 1966 when Star Trek was first unleashed upon a war-weary world. More about this later. Apropos of nothing, today's random article concerns the avocado.

Chance, Choice, and the Avocado: The Strange Evolutionary and Creative History of Earth’s Most Nutritious Fruit  
How a confused romancer that survived the Ice Age became a tropical sensation and took over the world.


Fun fact: the French word for avocado and the French word for lawyer are the same—avocat. I can only assume it's due to the latter's reputation for having a stony, unbeating heart.

In the last week of April in 1685, in the middle of a raging naval war, the English explorer and naturalist William Dampier arrived on a small island in the Bay of Panama carpeted with claylike yellow soil.

Where he immediately converted the natives to Christianity and enslaved them?

https://en.wikipedia.org/wiki/William_Dampier

Okay, maybe not, but he was a dick.

Dampier described the black bark and smooth oval leaves of the tall “Avogato Pear-tree,” then paused at its unusual fruit — “as big as a large Lemon,” green until ripe and then “a little yellowish,” with green flesh “as soft as Butter” and no distinct flavor of its own, enveloping “a stone as big as a Horse-Plumb.”

It is unclear to me whether he meant the horse plum, an American fruit; or horse doody. Probably the former, because horseshit tends to be larger than a modern avocado stone. Source: me, having shoveled way too much of it in my errant youth.

He described how the fruit are eaten — two or three days after picking, with the rind peeled — and their most common local preparation: with a pinch of salt and a roasted plantain, so that “a Man that’s hungry, may make a good meal of it”; there was also an uncommonly delectable sweet variation: “mixt with Sugar and Lime-juice, and beaten together in a Plate.”

And thus was born that bane of the Millennial generation, avocado toast. What, you didn't think it was cultural appropriation, did you?

"It is reported that this Fruit provokes to Lust, and therefore is said to be much esteemed by the Spaniards."

Huh. Usually a Brit would take this opportunity to rag on the French.

But far more fascinating than the cultural lore of the avocado are its own amorous propensities, uncovered in the centuries since by sciences that would have then seemed like magic, or heresy.

Any sufficiently advanced science is indistinguishable from heresy. (With apologies to Arthur C. Clarke.)

The most nutritious known fruit, the avocado — a mostly evergreen member of the laurel family — is a ghost of evolution that should have grown extinct when the animals that fed on it and disseminated its enormous seeds did.

Basically, from what I understand, megafauna would eat the things whole and then shit out the stone somewhere else, allowing for expansion of the plant's range. This sort of dispersal, scatological or otherwise, of seeds by animals ("biological agents") is an evolutionary adaptation shared by some other plants, in one form or another. Acorns and squirrels, e.g. As we will see here, we're the biological agents for the avocado.

Mercifully, it did not. Ample in Europe and North American[sic] during the Ice Age, it somehow managed to survive in Mexico and spread from there. But even more impressively, it managed to survive its own self-defeating sexual relations...

Which is more than I can say for some humans.

The avocado, however, is far from reproductively self-sufficient due to an astonishing internal clock, which comes in two mirror-image varieties.

And this is where it gets interesting, but I can't do it justice in quotes; check the article for details. Essentially, the plant is a hermaphrodite, and pollination is tricky.

The world’s most beloved avocado — the Hass — is the consequence of human interference consecrated by happenstance in the hands of a California mailman in the 1920s.

It's always the mailman, isn't it? Or the milkman, if there are still any of those around.

Waltz Fact: I only like Hass avocados. None of the others are nearly as tasty, to me.

The year he turned thirty, Rudolph Hass (June 5, 1892–October 24, 1952) was leafing through a magazine when an illustration stopped him up short: a tree growing dollar bills instead of fruit.

Math-inclined readers will note that this was exactly 100 years ago. Or you can just take my word for it.

By the way, that sort of illustration today would scream "SCAM" at me. It'd probably be an ad for NFTs or cryptocurrency.

The article goes on to describe the fortuitous happenstance that resulted in the best avocado.

When Rudolph cut one open for his five young children, they declared those were the most delicious avocados they had ever tasted.

One avocado for five kids? The Great Depression sucked.

This being America and that being the wake of the Great Depression, the Hass family had patented the avocado within a decade.

I think the article glosses over a 10-year growing and experimentation period, hence the sudden jump to the thirties.

After describing his “new and improved variety of avocado which has certain characteristics that are highly desirable” and listing all the ways in which “the present invention” differed from existing avocados — higher oil content, superior flavor, doesn’t drop from the tree or rot inside before ripening...

Except for the ones I get from the grocery store, which have an approximately 15-minute window between "hard as a rock" and "rotten." There's a green one sitting in my fruit bowl right now. If I'm lucky, it'll ripen later today. If not, it'll ripen in a couple of hours while I'm asleep, and then, by the time I'm ready to use it, will be black as a lawyer's soul. (That's a pun; see above.)

Today, every single Hass avocado in every neighborhood market that ever was and ever will be can be traced to a single mother tree grown by a destitute California mailman in 1926 — tender evidence that every tree is in some sense immortal, and a living testament to how chance and choice converge to shape our lives.

Oooh, poetic. No, it's immortal in the same sense we all are: our constituent elements (including the ones derived from avocados) get recycled, and maybe our legacy endures for a few years after we die. Unless you're a complete dick like Dampier, in which case your dickishness endures forever.

*Avocado**Avocado**Avocado*


As this is Star Trek Day, I think it's a good time to unveil the collection of drink recipes, inspired by Star Trek, that I'm working on with PuppyTales . Right now there are only four recipes, but more are in the works; also, it could use more photos, but that would mean me actually making the drinks again and then taking pictures, uploading them into image items here, and then embedding them into the book, all of which is too much work for my hungover ass. But it'll happen, eventually.

 
BOOK
Ten Forward  (18+)
The Bar on the Edge of Forever
#2279836 by Robert Waltz

September 7, 2022 at 12:01am
#1037430
Why do I keep finding articles about the (shudder) outdoors?



1. Because they go into the wild
2. Because they go into the wild
...
10. Because they go into the wild.

Between the years of 1992 and 2007, our national parks were the site of 65,439 search and rescue (SAR) incidents.

Did they get billed? They should get billed.

From these thousands of SAR missions, a large percentage of “call outs” involved outdoor enthusiasts who had become lost in the wild.

...what were the others? Indoor enthusiasts who got lured into the forest, perhaps with the promise of hot nymph sex?

But with all of the technology we have today—such as cell phones, GPS devices, and satellite phones—how does this still happen?

The article specifies 1992-2007. Accurate personal GPS wasn't available until 2000, halfway through that time period. I'm curious if that made a difference, but not curious enough to be arsed to look it up.

So what kind of blunders leave people floundering and lost? Read on to discover the common pitfalls of wilderness navigation, and how to avoid them.

Not to pummel a deceased equine, but the easiest way to avoid them is to NOT GO OUTSIDE.

Misjudging Distances

With a heavily laden backpack, every mile you trudge may feel like two miles.

More like two thousand.

Unless you are skilled at pace counting (likely from military service, which we thank you for), the average person will have to make a wild guess about the distance they have traveled when “mile marker” features are absent.

I just count cross streets.

Once you have a handle on the amount of ground you normally cover in any given situation, you can use the amount of time to help you calculate your distance traveled.

Or you could—just spitballing here—carry a GPS unit (and a backup).
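
To be fair, the time-and-pace arithmetic the article is describing is nothing more than rate times time. A trivial sketch, with a walking pace I made up rather than anything the article specifies:

def estimated_distance_miles(hours_walked: float, pace_mph: float = 3.0) -> float:
    """Dead-reckon distance covered from elapsed time and a known pace."""
    return hours_walked * pace_mph

# e.g., 2.5 hours at an assumed 3 mph:
print(estimated_distance_miles(2.5))  # 7.5 miles, before the pack and terrain slow you down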

Inattention To Surroundings

At this point, the article features a helpful photograph of a rattlesnake. There's a story going around that rattlesnakes are evolving to lose their rattles due to selective extermination of the rattling ones. While it's nice that certain people are finally acknowledging that evolution by natural selection happens, in this particular case, there's no evidence for it.  

In any case, nope ropes are but one of many reasons not to go outside.

Letting Egos Run Wild

We all know “that” guy. He’s arrogant, bombastic, and feels he can do no wrong. He’s also a statistic.

I don't know "that" guy. Wait, does that mean I'm that guy? No... no. Can't be. I know better than to go out into the non-indoor part of the world.

Traveling in Tricky Terrain

Flat and featureless landscapes with no distant landmarks can be trying to travel.

And are generally pointless to explore.

Map Mishaps

Don’t rely on digital devices alone for navigation.

This one, I actually agree with. Not with regard to venturing into the trackless wilderness, of course, but even just on a road trip. I always have two GPS devices and a paper map. Or at least I did when I had a car. I might get one soon; we'll see.

Following a Game Trail

More like "following a bear trail."

Getting Caught in the Dark

Okay. Yeah. This happened to me once. I've written about it before; no need to rehash my idiocy right now.

A Turn in the Weather

Some of my favorite schadenfreude-laden stories are of people going hiking on Mount Washington in New Hampshire, which basically has its own weather system disconnected from the rest of the troposphere. One time, the observatory on that mountain recorded a wind speed of 231 miles an hour.   That's enough to blow you off the mountain and into Canada.

Taking a “Shortcut”

To be fair, this is a Bad Idea in the city, too.

Splitting Up

There’s definitely strength in numbers. We all seem to know this instinctively. Yet, for a wide range of reasons, people will leave their hiking, camping or hunting group and become lost.

Anyone who's ever played D&D, or watched a horror movie, knows damn well that splitting up is always a Bad Idea. We have a saying: "Let's split up. We can cover more ground that way." Because we can individually cover more ground when we walk into an ambush and die, splattered all over said ground.

Now, don't misunderstand me—I love the wilderness. And I'll happily look at pictures of it.
September 6, 2022 at 12:01am
#1037379
It has always perturbed me that they call the illness a "cold" when it sometimes results in an elevated body temperature.



But that's really not the point of today's discussion.

98.6 degrees. That's what we grow up being told the temperature of the human body is—if you use Fahrenheit, anyway; that same temperature is 37 in Celsius. That’s a weirdly specific value.

Not only is it weirdly specific, but different parts of the body necessarily have a different temperature. Balls, for example, are generally cooler. But as far as I know, no one has proposed taking a guy's temperature by sticking a needle into his balls, and if they did, 99% of us would run screaming in terror.

The other 1% would go, "How much do I have to pay?"

98.6 degrees actually represents the average temperature of a whole bunch of people. One scientist named Carl Wunderlich came up with that number in 1851 by taking people's temperatures repeatedly. He studied 25,000 people many times each, taking around a million measurements, and 98.6 was the mean of his many readings.

And here we have another example of the hazards of assuming that the mean is representative of the range. It's kind of like saying "The average debt of an American citizen is $40k" (it's not actually that number, but that's irrelevant), so you assume every American has $40k in debt, when the reality is some have lots more debt, and others actually have money.

Your temperature also fluctuates quite a bit even over the course of each day.

Especially if you venture (shudder) outside.

Though Wunderlich had a lot of data, it wasn't very good data. He took temperatures of the armpit, which isn't the best spot compared to the mouth or anywhere more internal.

To be fair, if he'd wanted to study 25,000 people by sticking a thermometer where the sun don't shine, 24,750 of them would run screaming in terror, and the other 250 would ask that he please use a bigger thermometer.

Other scientists have tried calculating the mean human body temperature since Wunderlich did. Invariably, they come up with an answer lower than 98.6. The mean human body temperature is below 98 degrees. In fact, it seems that every time scientists calculate a new mean, they come up with a slightly lower value than a few decades before.

Oh no, we're turning cold-blooded as a species! Oh, wait...

That's because when you examine thousands of people, a few of them are bound to be a little sick and have higher than normal temperatures. The healthier that people in general get, the fewer of these feverish people slip into the sample, and the lower the mean temperature falls.

And also probably some of the people studied are secretly our alien reptilian overlords in disguise.
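
The quoted effect is easy to see in a toy simulation. These numbers are invented for illustration; nothing below is Wunderlich's actual data:

import random

random.seed(42)  # reproducible fake thermometry

def mean_temp(n: int, feverish_fraction: float) -> float:
    """Mean of n simulated temperatures, some fraction of them mildly feverish."""
    readings = []
    for _ in range(n):
        if random.random() < feverish_fraction:
            readings.append(random.gauss(100.5, 0.5))  # a little sick
        else:
            readings.append(random.gauss(97.9, 0.4))   # healthy baseline
    return sum(readings) / n

print(f"10% feverish: {mean_temp(25_000, 0.10):.1f} F")  # pulled up to about 98.2
print(f" 1% feverish: {mean_temp(25_000, 0.01):.1f} F")  # close to the healthy 97.9

Fewer sick people in the sample, lower mean. No cold-bloodedness required.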

Incidentally, the Fahrenheit scale originally used human body temperature as a benchmark, along with a particular mixture of water, salt, and ammonium chloride for 0. Why that particular mixture? Whatever; it's all arbitrary anyway.

Clearly, the Celsius scale makes more sense; it's much easier to calibrate a thermometer using ice water at 0 and boiling water at 100. But even in places that use Celsius (e.g. pretty much everywhere that's not the US), they sometimes quote temperatures in F just for the shock value. "It's 100 degrees in London today!"
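
The conversion itself is just arithmetic, if you want to check the shock value (a trivial sketch):

def f_to_c(fahrenheit: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (fahrenheit - 32) * 5 / 9

print(f_to_c(100))   # 37.8 or so -- "100 degrees in London" is a genuinely hot day
print(f_to_c(98.6))  # exactly 37.0, the round Celsius number behind the classic figure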

Today, though, the temperature scale is based on—get this, now—the Boltzmann constant, the Planck constant, and the unperturbed ground-state hyperfine transition frequency of the caesium-133 atom.  

So next time you're running a fever, ask a quantum physicist about it.
September 5, 2022 at 12:01am
#1037330
I expect the answer to every headline question with a potential binary result to be "no."



Time travel makes regular appearances in popular culture, with innumerable time travel storylines in movies, television and literature.

Yes, and most of them do something self-inconsistent for the sake of plot.

But it is a surprisingly old idea: one can argue that the Greek tragedy Oedipus Rex, written by Sophocles over 2,500 years ago, is the first time travel story.

Well, we'd have to go back and ask Sophocles, wouldn't we?

But is time travel in fact possible? Given the popularity of the concept, this is a legitimate question. As a theoretical physicist, I find that there are several possible answers to this question, not all of which are contradictory.

Clearly, a theoretical physicist has more knowledge in this area than I do, but that doesn't stop me from making comments.

The simplest answer is that time travel cannot be possible because if it was, we would already be doing it.

Even the simplest answer runs into a complex problem; in this case, tenses. Does he mean that we would have invented it by now, or we will-would invent it in the future, sending schlubs back to our time along with, presumably, the technology?

Of course, future-us (assuming there is one, which is a big assumption right now) might be walking among us unseen, undetected. So might aliens. Or fairies. Maybe the fairies are the time travelers. Maybe the alien fairies are time travelers.

All of which is great fodder for fiction writing, but, without evidence, I can no more accept any of that than I can Russell's Teapot.  

One can argue that it is forbidden by the laws of physics, like the second law of thermodynamics or relativity.

As we do not know all the laws of physics, and probably never will, there's always going to be some room for someone to go "but we don't know everything and we've found loopholes in 'laws' before."

There is also the matter of time-travel paradoxes; we can — hypothetically — resolve these if free will is an illusion, if many worlds exist or if the past can only be witnessed but not experienced.

Free will is probably an illusion, but we can't write time travel stories based on that, because most people need to believe that free will is real. Also they like protagonists with some agency, and this takes away their agency.

Even just witnessing the past presents potential paradoxes. To witness an event, one must intercept some of the light from that event, light that, presumably, would have affected other atoms. Sure, it might be a really, really tiny change, but, well, that's all it takes for chaos to shift. Butterfly effect and all that.

Unless, of course, the interception was always part of the past and the future-us only will-did cause an effect that had already happened.

I told you tenses were insufficient.

We can actually design time machines, but most of these (in principle) successful proposals require negative energy, or negative mass, which does not seem to exist in our universe.

FTL drives suffer from the same problem. But hey, we can't rule out creating exotic matter at some point.

Mathematical physicist Frank Tipler conceptualized a time machine that does not involve negative mass, but requires more energy than exists in the universe.

So we will have to learn to tap the energy of other universes.

Time travel also violates the second law of thermodynamics, which states that entropy or randomness must always increase. Time can only move in one direction — in other words, you cannot unscramble an egg.

Here's the thing, though: our brains are also constrained by the Second Law. Our perception "moves" in the same temporal "direction" as the egg-scrambling.

There is no doubt that if we could time travel freely, we run into the paradoxes. The best known is the “grandfather paradox”: one could hypothetically use a time machine to travel to the past and murder their grandfather before their father’s conception, thereby eliminating the possibility of their own birth. Logically, you cannot both exist and not exist.

Plenty of ways to get around paradoxes. The easiest is to accept that we don't have free will.

Kurt Vonnegut’s anti-war novel Slaughterhouse-Five, published in 1969, describes how to evade the grandfather paradox. If free will simply does not exist, it is not possible to kill one’s grandfather in the past, since he was not killed in the past. The novel’s protagonist, Billy Pilgrim, can only travel to other points on his world line (the timeline he exists in), but not to any other point in space-time, so he could not even contemplate killing his grandfather.

It's been a while since I read that. What I find amusing about Vonnegut is that he was absolutely a science fiction writer, but refused to describe himself as one, because so much of science fiction was space opera, and he was serious.

Could we allow for actual modifications of the past, so that we could go back and murder our grandfather — or Hitler?

The Hitler-killing thing isn't even a trope anymore; it's a cliché. But let's assume for the moment that we will invent time travel, and that time travel will-would allow future-us to alter the timeline. Logically, future-us would have a vested interest in keeping the timeline the way it is, if only to assure they will have come into being in the first place.

Or, just maybe, Hitler was the least of all possible evils. I mean, his regime almost developed an atomic bomb and a means of delivery over long distances. Perhaps future-us will have decided that losing 15 million people in conventional war and holocaust was better than losing a billion in nuclear holocaust. Trolley Problem, writ large.

If we will-would be able to go back and stop all of history's evil actors, then why were there evil actors?

In other words, what if we invent time travel and use it to go back and create the best of all possible worlds? And what if we're living in the result?

Time travel conjectures make one's head hurt. Especially because whenever I bring up that last argument, someone hits me upside the head. Sometimes that "someone" is me.

So is time travel possible? Probably not, but we don’t know for sure!

Yeah. I'm going to go with "no." But that's not going to stop me or anyone else from writing stories about it.
September 4, 2022 at 12:15am
#1037286
Today's article is a few years old now, but I just found it. It is from the Before Time, and since I'm not sure how relevant it still is, well...



Meh. We'd rather stay in bed. Rising is for hustlers.

It’s hard to remember a time when scrolling through Instagram was anything but a thoroughly exhausting experience.

Yeah, it really is hard to remember. Oh, wait, that's because I've never been to Instagram.

Where once the social network was basically lunch and sunsets, it’s now a parade of strategically-crafted life updates, career achievements, and public vows to spend less time online (usually made by people who earn money from social media)—all framed with the carefully selected language of a press release.

You know, one of the great things about this blog, for me anyway, is that I'm not making any money at it. I'm not trying to make money at it. I'm not selling anything; while I am promoting my point of view, my livelihood doesn't depend on convincing you of anything. I'm not constrained by commercial requirements, only by the self-imposed 18+ rating and my own conscience.

I have no problem with people trying to make money, per se. Just when it's hidden behind a few proxies; that is, if you're pretending not to try to make money.

Well, I despise most ads, too. So there's that.

Everyone is striving, so very hard.

No. Everyone you see is striving, so very hard. The whole thing about slacking is... if we're doing it right, you're not seeing us.

Back in the 1990s, our heroes were slackers: the dudes and the clerks, the stick-it-to-the-man, stay-true-to-yourself burnouts we saw in Ferris Bueller’s Day Off, and Slacker, and Reality Bites.

I admit I haven't seen all those movies, but yeah, The Dude abides.

Fun side story here: A friend of mine has a 12-year-old son. He had the kid watch The Big Lebowski to show him the dangers of being a slacker. They watched the movie together, and at the end, the kid was like "I wanna be The Dude."

But somewhere in the early 2000s, the slacker of popular culture lost ground to the striver.

Noughties. Goddammit, the first decade of this century is the Noughties.

I am not immune to this thoroughly aspirational mindset, and you probably aren’t either.

Do I want more money? Yes. Am I willing to be someone I'm not in order to get it? No.

Whether we have side hustles, personal brands, gig economy jobs, or entrepreneurial leanings (I’ve had all four), to survive in the modern economy is to aspire to something much greater than what we are.

I'm not sure exactly what the author intends by "personal brands" there. Is the meaning that of "public persona" or "stuff you're trying to sell?" If it's the former, sure, I have a personal brand, mostly displayed right here on this blog: the alcohol-positive lifestyle, the low tolerance for bullshit, the thinking about science and philosophy, and some things that some people might consider humor.

But all of that is just Me. I mean, aspects of my personality, perhaps magnified or exaggerated, but I'm not trying to be anything I'm not or, as I said above, trying to sell anything.

The internet influencer is the apotheosis of all this striving, this modern set of values taken to its grotesque extreme: Nothing is sacred, art has been replaced by “content,” and everything is for sale. This is true even when the message is swathed in the language of counter-culture: Eco-conscious influencers see no issue in flying long-haul on free trips from brands. Yoga gurus who traffic in anti-consumerist spirituality promote tea brands owned by Unilever.

I've taken to calling them "influenzas."

But as anyone who has lived a few decades knows, youth culture swings like a pendulum. The buttoned-up post-World War II period gave way to the countercultural Free Love generation (arguably the original slackers, as they were the first to have middle class comfort to rebel against). Similarly, the 1980s excess of Gordon Gecko’s Wall Street set the stage for the slackers amid the economic recession of the 1990s, with their flannel shirts, skater culture, Beastie Boys and Nirvana records.

Point of clarification here: There was one (1) economic recession in the 1990s, early on, while Bush was president (not that I'm blaming or absolving him, just setting the stage). This was essentially a product of the eighties. Decades don't always start or end neatly on years ending in 0 or 1. The point I'm making here is that the culture they're describing coincided not with the recession, but with the subsequent economic powerhouse that was fueled by the rise of the internet and other advances in technology.

The 1990 recession was also really mild, as these things go.

But there’s always something to glean from the dominant youth culture of an era. What was cool—what the kids were into—tells us something fundamental about what we valued.

You know why people care about what the kids are doing? Because advertisers want them to. That's it. That's all. (That's also the source of generational labels.) They market mostly to young people because when you get to be my age, you start getting all cynical about advertising. No, kids aren't "cool." They're awkward and trying to figure out what to make of their world, and their place in it. So no, I reject this assertion out of hand.

And, in a modern aspirational marketplace so saturated that fake influencers are now posting advertising-like content that nobody even paid them for, there are signs that our individualist culture of achievement and brand alignment has jumped the shark.

I find this statement hilarious because the term "jumped the shark" came from a sitcom made in the seventies about fifties culture, featuring a character developed as the Avatar of Cool.

To be fair, Fonzie was absolutely the Avatar of Cool. But that's not the point.

If the cycle of history is any guide, once our culture of striving flames out, it may well be time for the slacker to rise again.

And here we have the rare example of a prognostication coming true. Great Resignation? Quiet-Quitting (which should actually be called "doing your job")?

For the internet influencer, everything from their morning sun salutation to their coffee enema (really) is a potential money-making opportunity. Forget paying your dues, or working your way up—in fact, forget jobs. Work is life, and getting paid to live your best life is the ultimate aspiration.

Except are they really living their best life, or just faking it for the clicks?

The phrase “famous for being famous” used to be a derisive slur for socialites; now it’s an entire category of global commerce that has landed the world’s youngest “self made” billionaire on the cover of Forbes: Kylie Jenner.

"Self-made," my fat white pimply ass.

The likes of Kathy Griffin, Ben Stiller, Janeane Garofalo, and Andy Dick typified the ethos of the slacker era.

Also, let's be clear, here: "slacker culture" was also framed, packaged, and sold to us.

What’s more, we’ve never been more skeptical of the very social media platforms that have quietly and powerfully shaped influencer culture.

Yeah, right. Bookface, maybe, but people keep trying to get me to join the Chinese Communist Party. Er, I mean, join TikTok. Same thing.

But as we settle into our roomier trousers and perhaps take a toke of a legally sanctioned weed pen, there’s a less comfortable question to ask: Is it even possible to truly be a slacker anymore?

It's not nearly as slackerish if it's legal.

As Storr writes of our culture’s failed promise: “It wants us to buy the fiction that the self is open, free, nothing but pure, bright possibility … This seduces us into accepting the cultural lie that says we can do anything we set our minds to … This false idea is of immense value to our neoliberal economy.”

I don't have much more to say, but as far as I'm concerned, that right there is the money quote.

I just think it's buried too deep in the article for anyone with a twenties-era attention span to ever see.
September 3, 2022 at 12:18am
#1037249
I just found this one yesterday, added it to the queue as usual, and behold, its number came up. This is either random chance, or the Universe trying to tell me something. (It's the former.)

How to read philosophy  
The first thing to remember is that the great philosophers were only human. Then you can start disagreeing with them


"The great philosophers were only human" except for Nietzsche (of whom there is a sketch at the top of the article). No one with a mustache like that could possibly be only human. You might say he was superhuman. An Übermensch, if you will (please don't).

It might seem daunting to read philosophy.

No more difficult than crossing Antarctica in a sweatshirt.

Giants of thinking with names like Hegel, Plato, Marx, Nietzsche and Kierkegaard loom over us with imperious glares, asking if we are sure we are worthy.

Yeah, no, they all had to take a shit sometimes.

Some of them were real jerks. Here’s Arthur Schopenhauer on his fellow German philosopher Georg Wilhelm Friedrich Hegel, for instance: ‘a flat-headed, insipid, nauseating, illiterate charlatan, who reached the pinnacle of audacity in scribbling together and dishing up the craziest mystifying nonsense.’

Wow, if only they were alive today, I might actually sign up for Twatter just to watch them fight there.

That is, why read philosophy in the first place? The chief goal is, simply, the improvement of your own soul.

I'm going to go ahead and assume that they're using "soul" metaphorically.

No one should read philosophy just to sound smart, or intimidate others, or have impressive books on the shelf.

Not that those aren't fun things to do.

One should read philosophy because one wants a better mind, a better spirit, and a better life. (Or, at least, one wants a better understanding of why none of these things are possible, or why none of them matter; philosophy leaves no possibility unexplored.)

And also so you can fully appreciate things like this.   (That link will take you to the latest installment of Existential Comics. You can peruse earlier comics or do what I do and just hit "Random.")

Reading philosophy gives us richer perspectives, casts us into deep wonder, and helps us grapple with the biggest questions a human can ask. It is a heady call to action – and one that can only be met by diving in to the works themselves. So: how does one read philosophy?

Um... one word at a time?

To this day, there are bookstores with sections titled ‘Philosophy’ that include titles like The Seven Secrets to a Happier Life, or Get Your Sh*t Together, or Living With Your Heart Wide Open. These are self-help books, and some of them may actually prove helpful in giving you a better perspective to overcome or live with the obstacles in your path.

There's a word to describe œuvre like this: copium. Rather than helping with the Big Picture, it zooms in on details, trying to help people cope instead of encouraging thought. And the only person a self-help book really helps is the author, provided they sell enough copies.

Everybody needs a helpful nudge from time to time, and probably every self-help book has helped someone somewhere.

Every newspaper horoscope has probably helped someone somewhere. That doesn't make it worthwhile; the stopped clock analogy applies.

Philosophy typically raises less personal questions, such as whether time is real, or whether humans can exempt themselves from laws of nature, or if a person is just what their brain does, or whether we have moral obligations to strangers.

And sometimes it gazes so deeply into its own navel that the navel starts staring back. (Sorry again, Nietzsche.)

But more generally, philosophical problems are the ones that unavoidably come along with being conscious. If you can think, you have problems: this is why there is philosophy.

No, this is why there is C-4. I don't remember where it was; it was probably on a bumper sticker: "There are very few personal problems that cannot be solved with a suitable application of high explosives."

But how does one get started? What books are good to begin with? Philosophy isn’t like mathematics, where people generally agree that you need to start at one place and take a sequence of steps that steadily build upon one another.

Mathematics has its own branches, and every attempt to unify them thus far has, as far as I know, utterly failed. Including attempts by philosophers. Especially attempts by philosophers. (Lots of mathematicians were also philosophers, like Russell and Descartes.)

...but perhaps the most important point to bear in mind is that there’s no single or best way to begin. Start anywhere, and follow your interest wherever it goes.

Yeah, I don't know (that phrase, incidentally, is what I consider the most important in all of science and philosophy). Personally, I'd start with logic. Without a good background in logic, including a study of the fallacies we're all subject to, it would be more difficult to read other philosophy critically, to identify any flaws in reason or logic as a philosopher lays out an argument.

For example, I remember a graffito I saw on a bathroom stall in the philosophy department of the university I attended:

God is love.
Love is blind.
Ray Charles is blind.
∴ Ray Charles is God.


The logical fallacy of which I leave as an exercise for the reader (I hate when math books do this).

Muddying the waters there (as philosophers love to do), there's Kant's Critique of Pure Reason. So there's that. (Like Existential Comics, I'm strenuously avoiding Kant/Can't puns.)

Dogs must think that we slip into comas when we are reading books because we hardly move at all.

Assumes facts not in evidence: do dogs think, and if so, do they have any concept of "coma?"

It has been said that there are ultimately two replies to any philosophical claim: ‘Oh yeah?’ and ‘So what?!’ That is right.

They forgot "Oh hell no" and "Snort."

You should be thinking of counterexamples to the general claims that are made, or other possible explanations, or asking whether the philosopher is handling similar cases consistently.

You mean like I do here?

Thinking of counterexamples, I mean. I'm anything but consistent. "Do I contradict myself? Very well then I contradict myself. I am large; I contain multitudes."

Whitman, by the way. Poet, not philosopher. Though that line is blurry as hell.

In all likelihood, the philosopher you are reading is not an idiot...

Snort.

Let us not be book snobs: Plato himself thought that real philosophy takes place only in live conversation, and a written text is at best only an imitation of the real thing. (And yes, he wrote that down in a text.)

As you can see, self-contradiction is baked into the very sourdough of philosophy.

One nice thing about a discipline as old as philosophy is that textbooks don’t exactly go out of date, and sometimes some old introductory textbook from 1970 can provide a nice overview of a tangled set of questions.

Unless you're taking philosophy classes at a modern college, where the 18th edition differs from the 17th edition by three punctuation marks in the Foreword, so you have to spend $250 on the 18th instead of buying the 17th used for a buck and a half.

Anyway. I've banged on long enough. There's way more at the article, if you're interested. And if you're not, well, can't say I blame you.

As usual, I can't end an entry about philosophy without embedding this video:


September 2, 2022 at 12:11am
#1037199
I've noted before the sometimes blurry distinction between beer, wine, cider, and other delicious fermented beverages. One recent example: "Going Bananas"

Why Is Wine (Almost) Always Made From Grapes?  
On the merits of blueberry, cherry, and pumpkin wine.


Short version: In general, beer is made from grain; wine is made from fruit. But then you have sake, which is called rice wine, even though rice is a grain. And cider, which is made from fruit (apple or pear) but is more similar to, and marketed like, beer.

I note this because "pumpkin wine" threw me before I remembered that pumpkin is basically winter squash, and squash is technically a fruit (in the same way that tomatoes are technically a fruit, and yes, tomato wine exists but that's another day's blog).

Anyway. The article, which is from Gastro Obscura.

In 1951, Konstantin Frank arrived in the Finger Lakes region of upstate New York. An immigrant from the Soviet Union, he took a job at a local state agricultural station.

Where the CIA undoubtedly had at least three guys watching him.

Once there, he tried to convince his colleagues to grow grapes from Europe—Riesling, Chardonnay, Cabernet—and make fine wine. He was ignored.

"No, teach us how to make vodka!"

So Frank went to herculean lengths to prove them wrong. He drew on techniques he’d developed while producing and researching wine under Stalin in frigid Ukraine—like burying the vines each winter.

Even more stereotypical than vodka: Russians stealing shit from Ukraine.

Frank’s case is both exceptional and typical. (He eventually succeeded in bottling tasty Chardonnays and Rieslings, inspiring wineries to set up shop in Virginia, New York, and New Jersey.)

I'm most familiar with Virginia wines, of course, but I will note that the Niagara Valley produces some excellent wines, including ice wine, which pretty much requires the grapes to freeze on the vine. They lose most of the crop, but what remains concentrates the sugars in the fruit.

Why is this? If wine is simply fermented fruit juice, why didn’t Frank turn to the blueberries that grow easily in the Finger Lakes? Why do Egyptians struggle with grapevines in the desert rather than sell date-palm wine?

I don't think "simply" gets to be used here.

Wine made with other produce does exist: You can buy cherry wine and blueberry wine. But the market for them is tiny. They’re seen as a novelty and not taken seriously within the industry.

I had some homemade blueberry wine a while back. As with homemade mead, the results can vary wildly in quality. Of course, if a friend makes it, you force a smile and go "This is wonderful!" and then find a houseplant to pour it into when the host isn't looking.

But the history of (grape) wine’s ubiquity looks a lot like how Greco-Roman constellations are now almost universally known: We have Plato and Caesar to thank.

Today's Greeks make a distilled beverage from grapes. It's called Raki, and it's disturbingly good.

In BC times, most humans got tipsy from beer and other grain-based beverages. Almost everyone grew grain, whereas grapes came from a limited number of grape-growing areas, such as Georgia.

No, no, Georgia is for peaches.

Yes, that's a joke; I know which Georgia they're talking about. Duh.

Per Tom Standage in A History of the World in 6 Glasses, ancient Greece was the first place where wine-drinking became universal. The vines took well to the Mediterranean climate, producing enough wine for drinkers of all social classes, who often opted for watered-down wine to sterilize pathogens in the water and to stay sober enough for symposiums of (allegedly) high-brow thought and debate.

A couple of notes here:

The amount of alcohol in wine may or may not be enough to sterilize pathogens. My sources differ on that, and it might depend on how strong the wine is. But to make beer, you usually first boil the wort, which definitely removes pathogens; wine, being fermented juice, skips that step. The classical Greeks may not have known about microscopic life, but I'm sure they saw the effects.

And second, I'd have to have something stronger than watered-down wine to participate in a Greek philosophy seminar.

When Rome established its empire, it admired and adopted much of Greek culture.

You spelled "appropriated" wrong.

Grapes dominate the wine industry, but their hegemony is not complete. To learn why grapes are the go-to fruit for winemakers, and why he bottles vintages made from blueberries and cranberries, I spoke with Keith Bodine of Sweetgrass Farm Winery in Union, Maine.

It is true that the very last US state I'd associate with wine would be... okay, Alaska, but Maine would be a close second.

The rest of the article is an interview with Bodine, and I don't have much more to say about it, but I will include the money quote, as it were; the answer to the question in the headline:

Over time, we’ve selected and propagated [grapes] for characteristics that make better wine, or wine as we know it. That’s mainly higher sugars.

And:

Most fruits have half the sugar, or less.

The article helpfully ends with a short list of wineries working with non-grape fruits (including grapefruit). All of them are in eastern North America, except for one outlier in... Israel? Israel, which historically was all about grapes? Okay. Fine. If I ever go back there, I'll try it. Meanwhile, the one in New Jersey (which makes the pumpkin wine) might make for a good overnight trip for me, and I've been getting the travel bug again.

Unfortunately, I still don't have a car...

Oh, one more thing. Speaking of beverages. Between me and PuppyTales , we now have enough drink concoctions to start a collection. I'm going to put them in a book item and link it here when I've added some entries. Most of them are Star Trek themed, and I've included some of the recipes in here already, but it would be nice to have them all in one place. I'm mostly noting it here to remind myself to actually follow through on this, something I don't have a great track record at.
September 1, 2022 at 12:34am
#1037153
You know how sometimes you find something that agrees with your own ideas, but the way they present it leaves something to be desired?

Intelligent Life Really Can't Exist Anywhere Else  
Hell, our own evolution on Earth was pure luck.


Problem #1: using the word "intelligent."

That's just begging some amateur comedian to go "Well, there's no intelligent life here, either ha ha heh ha hurr."

I prefer to use "technological." There's no question that we're a technology-using species. While the vast majority of us couldn't invent a tin can string telephone, we are really, really good at copying what smarter humans do, and smarter humans have built, among other things, airplanes, spaceships (albeit primitive ones), a vast communications network, and refrigerators, to name but a tiny fraction.

The reason this matters is that, in the search for extraterrestrial civilizations, what we're looking for isn't signs of "intelligence" but evidence of alien technology: communications, a Dyson sphere, robot infestations of other worlds (like we've done with Mars), maybe obvious signs of stellar engineering, etc. Those would, in principle at least, be easier to find than signs that some other planet has flowers.

Corvids are "intelligent." So are octopodes, dolphins, cats, one or two breeds of dogs, and the mold that used to be a container of leftovers in my fridge. None of them have developed radio or interplanetary travel.

Okay. Enough about that. I've beaten that deceased equine in here before.

Problem #2: using the word "can't."

Jury's out on whether the Universe is infinite or not. What is known is that the observable universe is finite, but extremely large, larger than our puny, unintelligent minds can comprehend. I find it extremely unlikely that there "can't" be another technology-using species in its vastness.

The problem is, the farther out you look, the farther back in time you go. Technology requires access to relatively heavy elements, and those are only produced over vast cosmic timescales, in supernova explosions and the like. In practice, we can only look at our own neighborhood in our own galaxy, and even there, the distances involved are (currently at least) insurmountable even if we could detect, say, a Ringworld.

And that's just the headline.

In newly published research from Oxford University's Future of Humanity Institute, scientists study the likelihood of key times for evolution of life on Earth and conclude that it would be virtually impossible for that life to evolve the same way somewhere else.

While I tend to accept this as it agrees with other findings in the past...

Problem #3: misconceptions about evolution

Humans are not the inevitable end-product of evolution. It's absolutely not inevitable that evolution will produce a technology-using species. Millions of other species right here on Earth are highly successful without having sent rockets to the Moon or spending time doomscrolling when they should be trying to sleep. Further, we're not the end product, but only one of many ways populations have adapted.

Problem #4: "virtually impossible."

It's "virtually impossible" that Halle Berry will show up at my door, wearing the catsuit and carrying two growlers full of delicious craft beer. But as long as she's alive, there's a chance.

For decades, scientists and even philosophers have chased many explanations for the Fermi paradox.

Problem #5: the Fermi paradox is not a paradox.

I don't have time tonight to go into the various kinds of paradoxes, but I assert that the Fermi paradox doesn't fit into any of those categories. The "paradox," as stated, is (based on Wiki): "The Fermi paradox is the conflict between the lack of clear, obvious evidence for extraterrestrial life and various high estimates for their existence."

First, like I said, we only have a relatively small sample size to find such evidence; and, second, those who make "high estimates" of the chance of it existing could be very, very wrong. Also:

Problem #6: Finding life is not the same thing as finding species we can communicate with.

This is related to #3 above, but we can't effectively communicate with other intelligent species here on Earth, beyond some very basic signals, and they have the same origin as we do; what makes us think we'd be able to communicate with a species that doesn't share our evolutionary history?

Additionally, there's a very good chance that there's life under the ice covering Jupiter's moon Europa. What I mean by that is there might be some bacteria analogues there, not that they've built radio towers.

How, in an infinitely big universe, can we be the only intelligent life we’ve ever encountered?

See above discussion of "infinitely." And "intelligent."

Even on Earth itself, they wonder, how are we the only species that ever has evolved advanced intelligence?

Using the "i" word again. We are not the only intelligent species; we're just the only one stupid enough to dig up billions of years' worth of concentrated energy to power an industrial society.

There are countless naturally occurring, but extremely lucky ways in which Earth is special, sheltered, protected, and encouraged to have evolved life.

Problem #7: Not explaining this very well.

I absolutely agree, but I don't have time to give examples, and apparently neither did this writer. There are hotlinks at the article, though; maybe they shed some additional light. And this also seems to equate "life" with "technological civilization," which I've already complained about. Basically, "life" can exist in far worse conditions than our own, but not necessarily a (mostly) cooperative civilization that can send robots to neighboring planets.

“The fact that eukaryotic life took over a billion years to emerge from prokaryotic precursors suggests it is a far less probable event than the development of multicellular life, which is thought to have originated independently over 40 times,” the researchers explain.

Problem #8: Not explaining this very well, either.

This article is from Popular Mechanics, not some science publication where the audience can be expected to know these terms.

Basically, prokaryotic life is simple cells: no nucleus, just a cell wall around the basic machinery. Like bacteria. Eukaryotic life, like us and most multicellular organisms on Earth, adds a nucleus and organelles such as mitochondria, making cells better able to produce the kind of energy one would need to become a rocket scientist.

But again, agreed. Of course, we won't know until we find some hard evidence of evolved life elsewhere, which we have not.

In this case, they’ve used a Bayesian model of factors related to evolutionary transitions, which are the key points where life on Earth has turned from ooze to eukaryotes, for example, and from fission and other asexual reproduction to sexual reproduction, which greatly accelerates the rate of mutation and development of species by mixing DNA as a matter of course.

Problem #9: "True believers" will argue with this, saying that there might be other biological avenues to producing Chee-Tos-eating apes or their equivalents. They're not necessarily wrong, but their arguments need to be addressed.

And using their model, these scientists say that Earth’s series of Goldilocks lottery tickets are more likely to have taken far longer than they really did on Earth.

Problem #10: "If technological life is so unlikely, how is it we're here?"

I've said this before, but I'll say it again: The chance of winning a lottery is absolutely irrelevant after you've already won it. Once you've won the lottery, the chance of having won it is unity.
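
If you want to see that selection effect in numbers, here's a toy Monte Carlo; to be clear, this is my own sketch with made-up figures (the window length, the prior), not the Oxford team's actual model. The point it illustrates: among worlds that produce observers, plenty had terrible odds, and the observers can't tell.

    import random

    HABITABLE_WINDOW = 5.0   # Gyr a planet stays habitable (assumed round number)
    N_WORLDS = 100_000

    survivor_means = []
    for _ in range(N_WORLDS):
        # Assumed log-uniform prior on the mean waiting time for one
        # key evolutionary transition: anywhere from 0.1 to 10,000 Gyr.
        mean_wait = 10 ** random.uniform(-1, 4)
        wait = random.expovariate(1.0 / mean_wait)  # exponential waiting time
        if wait < HABITABLE_WINDOW:  # observers only arise if it happened in time
            survivor_means.append(mean_wait)

    long_odds = sum(1 for m in survivor_means if m > HABITABLE_WINDOW)
    print(f"Worlds with observers: {len(survivor_means) / N_WORLDS:.2%}")
    print(f"...whose expected wait exceeded the window anyway: "
          f"{long_odds / len(survivor_means):.1%}")

Every one of those surviving worlds "won the lottery," including the ones whose expected wait dwarfed the window. From the inside, they all look equally inevitable.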

Anyway. Enough. None of this is anything I haven't banged on about in here before, but it gave me a chance to look at it from another direction.
August 31, 2022 at 12:04am
#1037122
Did you think Urban Dictionary was something new?



Like most kids, I amused myself in my youth by looking up certain words as soon as I encountered a new dictionary. Kid Me derived great enjoyment from finding Forbidden Words in school library dictionaries.

A “dictionary of the vulgar tongue” may sound like some kind of prank gift, something you pick up as a means of upping the ante on your name-calling or adding some spice to your conversations for all occasions. But you won’t find this dictionary at Spencer’s Gifts. It’s tucked away at the British Library in London, shelved and looking prim and proper in its original 1785 binding.

How much better, then, to find the 18th-century proto-Urban Dictionary in such a staid institution as the British Library? It's like finding fart jokes at the Library of Congress.

[The dictionary] looks a lot like another noted 18th-century dictionary—Samuel Johnson’s Dictionary of the English Language. The only thing differentiating these two is their focus: the English language versus the “vulgar tongue.”

As the article points out, "vulgar" meant a different thing back then. Basically it was peasant-speak.

Today, “it’s one of the more important slang books ever published,” says lexicographer Jesse Sheidlower, an adjunct assistant professor at Columbia University. “Johnson made a specific effort to keep out this kind of language.”

It is not possible for me to think of Samuel Johnson's dictionary without recalling one of the funniest comedy shows of all time, of which the following is but a fragment:



Anyway...

The entries in Grose’s dictionary run the gamut from words and phrases common to laborers, military personnel, and bar frequenters to cant—the jargony language of criminals. Among the pages are such listings such as “cheeser,” another word for a fart; an “Admiral of the narrow seas,” someone who drunkenly vomits into the lap of the person sitting opposite him; and “to dance upon nothing,” meaning “to be hanged.”

One wonders how many words the dictionary contained for private parts. I once participated in an impromptu roundtable discussion where we listed every synonym for "penis" we could come up with, and while I don't remember the exact final tallywhacker, it was well into the hundreds.

But this isn’t just a collection of fun phraseology, explains Sheidlower. It’s a window into a crossroads of language at the heart of 18th-century Britain. Terms used in various underground criminal enterprises—like “bean feakers,” or bill counterfeiters—intermingle with simple words used among commonfolk like “lobkin,” which is just another word for a house or home. (Some of the words and phrases included live on into the modern vernacular with their centuries-old meaning, such as “to screw” and “to kick the bucket.”)

Whereas I'm willing to bet the vast majority of the words and phrases have been lost from common speech, having lost their relevance, or replaced by more modern equivalents. And yet, a quick glance at the contents shows that others have, indeed, persisted; some have even entered formal language. I will point out, for example, that one of the definitions of "punk" therein is "a little whore." I will also note that, since it still amuses me to do so, I looked up some of the naughtier words and was delighted to find that some of them were included—albeit with apparent self-censorship.

Grose and his dictionary gave the world a peek inside various groups in danger of having their cultures steamrolled and made the language of commoners as worthy of study as that of aristocrats.

And the article ends with a link to the digitized version of the dictionary. If you're too lazy to go to the article to find it, here it is.

Incidentally, there's a word in there: Frenchified. (I found it when I was looking for other words starting with F.) It means "Infected with the venereal disease." I can only assume that there exists, or at least existed, a French version of the dictionary, in which the word is "Anglified."
August 30, 2022 at 12:03am
#1037088
Most of you probably know by now that there's a new merit badge available to the community, one I commissioned and, of course, The StoryMistress implemented. Here it is, called "Complexity":

Merit Badge in Complexity

Congratulations on your new merit badge! Thank you for supporting the Writing.Com community with your inspirations, participation and activities. We sincerely appreciate it! -SMs


While obviously named after this blog, it's not restricted; anyone on WDC can send it to someone else on WDC. But the meaning might be obscure to some; therefore, I will explain it. This may be the only time I ever do so.

Fair warning: math discussion ahead. But not a very technical one.

The header for this blog has always had a brief definition of what a complex number is in mathematics. As I've stated before, though, for me the title is a pun of sorts; both "complex" and "numbers" have other definitions. For example, the former can describe a kind of psychological disorder, and the latter also can be a synonym for a musical composition.

Also, all blog entries have an identifying number. Point is, I thought it was appropriate.

Now, there's a deceptively simple iteration you can do on any number. I won't go into the details (you can find all the detail you want by searching for it, or by looking at the Wiki link I provide below); I'll only point out that when you do this iteration, the result either blows up to infinity, or it doesn't. If the number is a complex number, and the iteration doesn't tend to infinity, it's in the Mandelbrot set, usually graphically represented in black. Other colors are assigned to numbers outside the Mandelbrot set, depending on how quickly the iterations tend to infinity.

The cool thing, though, is that no matter how close you zoom in on a point on the boundary between "goes to infinity" and "doesn't go to infinity," you get the same sorts of spirals, whorls, and intricate designs. The boundary is self-similar at all scales, to any number of decimal places. (This is what I mean, in the blog header, by "Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.")
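
For those who do want a taste of the detail: the iteration in question is z → z² + c (that's the standard definition, the one the Wiki page linked below documents), and the escape-time test fits in a few lines of Python. A minimal sketch:

    # Escape-time test for the Mandelbrot set: iterate z -> z*z + c
    # and watch whether |z| blows up. Python's built-in complex type
    # handles the arithmetic.
    def mandelbrot(c: complex, max_iter: int = 100) -> int:
        """Return the iteration at which |z| escapes past 2, or
        max_iter if it never does (i.e., c appears to be in the set)."""
        z = 0j
        for n in range(max_iter):
            if abs(z) > 2:  # once |z| exceeds 2, divergence is guaranteed
                return n
            z = z * z + c
        return max_iter

    print(mandelbrot(0))             # maxes out: 0 is in the set
    print(mandelbrot(1))             # escapes almost immediately
    print(mandelbrot(-0.75 + 0.1j))  # near the boundary: takes longer

Color each point of the complex plane by that return value and you get the familiar picture: black where the count maxes out, bands of color everywhere else.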

In the "real world," this can't happen. Zoom in closer and closer and eventually you get to atoms, and the stuff inside atoms, and there's a smallest size.

To me, this is a metaphor for how imagination can extend reality.

To see the image the MB is based on, go look at this article. There's not a lot of math there, either; it's more of a philosophical essay and a discussion of the life of Benoit Mandelbrot, after whom the set is obviously named.

True to form, I'll quote from the article. (Just one passage, though.)

Mandelbrot needed a word for his discovery — for this staggering new geometry with its dazzling shapes and its dazzling perturbations of the basic intuitions of the human mind, this elegy for order composed in the new mathematical language of chaos. One winter afternoon in his early fifties, leafing through his son’s Latin dictionary, he paused at fractus — the adjective from the verb frangere, “to break.” Having survived his own early life as a Jewish refugee in Europe by metabolizing languages — his native Lithuanian, then French when his family fled to France, then English as he began his life in science — he recognized immediately the word’s echoes in the English fracture and fraction, concepts that resonated with the nature of his jagged self-replicating geometries. Out of the dead language of classical science he sculpted the vocabulary of a new sensemaking model for the living world. The word fractal was born — binominal and bilingual, both adjective and noun, the same in English and in French — and all the universe was new.

Now if you want a more technical discussion and lots more pretty pictures (including an animation of a deep dive into the set boundary, showing its self-similarity), you can always peruse the Wikipedia page.  

Unlike the boundary of the Mandelbrot set, this blog won't go on forever, and neither will I. This Merit Badge will, however, hopefully outlast both of us.

Oh, and if you want one, just comment below. I'm feeling magnanimous. That feeling won't last, so you have until midnight tonight, WDC time.
August 29, 2022 at 12:01am
#1037049
I don't usually link to The Torygraph, but I am today.



The answer to almost every "Why are..." question concerning airlines is either a) It makes or saves them money, or b) some regulation requires it.

From Ryanair and British Airways to American Airlines (the world’s largest carrier), airlines across the board incorporate various shades of blue in their cabin seats, and it’s no coincidence. There does appear to be some psychology behind it.

Because blue is associated with depression, which is one of the outcomes of suffering the indignities of 21st century air travel?

Blue is associated with the positive qualities of “trust, efficiency, serenity, coolness, reflection and calm,” according to Colour Affects, the London-based consultancy run by Angela Wright, author of The Beginner’s Guide to Colour Psychology.

Funny, those are also the qualities associated with [insert your zodiac sign here]. The astrology is probably more accurate.

Nigel Goode, lead aviation designer and co-founder at Priestman Goode, which has been delivering aircraft interiors for 30 years for airlines, including most recently the Airbus Airspace cabins, states: "Our job as designers is to reinforce the airline’s brand and make it more recognisable, but our primary concern is to deliver an interior that maximises comfort to create a pleasant environment.

Well, then, you failed.

“It’s all about making the travelling experience less stressful and blue is said to evoke a feeling of calm. While some of the more budget airlines might use brasher, bolder shades, most others go with muted tones. The overarching aim is to create a home-like relaxing feel, so airlines tend to use muted colours that feel domestic, natural and earthy for that reason."

How is blue "domestic, natural and earthy"? Also, if you want to make "the traveling experience less stressful," there are less questionable ways to do it, like, oh, I don't know, not packing people in like olives in a jar?

Cabin lighting is also geared towards creating a stress-free atmosphere on board, particularly in newer planes which have introduced soft LED lighting to replace the harsher light used in earlier models.

Right. LEDs not having to be replaced nearly as often, and requiring less power, has nothing to do with it.

As a general rule, most long-haul carriers won’t install leather seating because they can get unpleasantly sweaty.

Fixing this for you: because leather's more expensive.

Synthetic fabrics breathe, which makes for a more comfortable experience.

They also absorb the farts of the last slob who sat in your seat.

In fairness, I've sat in leather (I think it was faux-leather but whatever) seats on an airplane before, and the problem isn't sweat, but less friction.

“Lighter-coloured interiors, however, are more commonly found in first and business class seats, given not as many people fly in those cabins as they do in economy, where you’ll see more of the darker shades," Mr Goode explains.

That explanation makes no sense at all. There's a fixed number of each on an airplane, and these days, they're all filled. I'm betting it enforces class distinctions.

Found only on wide-body aircraft like Boeing’s Dreamliner, there is a small room hidden behind what looks like a small cupboard from the exterior.

Known as the 'crew rest', this tiny compartment opens up to a steep staircase that leads down to a few seats and bunk beds where crew members can sleep.


Et alors?

There are large areas on wide-body aircraft that are not used in the lower level, which have not been converted into seating areas because of the low height and lack of windows - both of which make the space too claustrophobic for passengers.

Quoi? That's never stopped them before.

Passenger seats with dummies strapped to them are put through the 16G test, a process which involves hurtling the seat down a 'sledge' ramp at high speeds to simulate a plane crash setting. All seats are required to withstand a 16g dynamic force.

Right, because people can withstand a 16g force.

I'm going to let the "dummies" thing slide. I'm feeling magnanimous.

Seats on many low-cost carriers like Ryanair don’t have a recline mechanism, which also adds to the weight, and others go as far as removing the seatback net for magazines to help reduce weight.

Want to start an argument online, and sick of talking about Chicago "pizza"? Assert that a passenger in a reclining seat has every right to use said reclining mechanism to its fullest extent.

They do, incidentally.

“There’s a big push at the moment for the magazine pouch to be relocated just a tiny bit higher to allow just a little bit more space," notes Mr Goode.

Wow.

As an aside, for a while there, at least one airline (I think it was American) forbade passengers from sticking anything into the pouch other than the stuff that's native to it (safety card, barf bag, overpriced consumer goods ads). This was likely the most idiotic change to airplane rules since the non-smoking section (kind of like a non-pissing section in a pool, and I say that as a cigar smoker). Last time I flew that airline, though, there was no mention of this.

Anyway, snark aside, some of this stuff is interesting, whether you fly or not. Me? I used to love flying, but these days, it's nothing but a hassle. I might change my mind, though, if they relocate the magazine pouch just a tiny bit higher.
August 28, 2022 at 12:03am
#1037010
Everything I need to know about medieval peasants, I learned from Monty Python and the Holy Grail.

And maybe today's article.

What Did Medieval Peasants Know?  
The internet has become strangely nostalgic for life in the Middle Ages.


The period, which spans roughly 500 to 1500, presents some problems for people trying to craft uncomplicated stories. “No age is tidy or made of whole cloth, and none is a more checkered fabric than the Middle Ages,” Tuchman wrote. Historians, she noted, have disagreed mightily on basic facts of the era...

We know enough for comedy movies. You know Holy Grail is almost 50 years old now? Still one of the greatest and most quotable movies of all time.

...the Middle Ages have been a common hobbyhorse for people of all political persuasions who suspect modernity might be leading us down the primrose path, especially as the internet has become a more central and inescapable element of daily life. Our ancestors of the distant past can be invoked in conversations about nearly anything: They supposedly worked less, relaxed more, slept better, had better sex, and enjoyed better diets, among other things.

Sure, you can prop up straw men and just make shit up about the past. But look, Hobbes (the philosopher, not the tiger) described our lives as "nasty, brutish and short" for good reason. Namely, the past sucked. It sucked hard. I can kind of understand the Renaissance Faire people, but even they know they're just role-playing, and if one of them got a foot infection, they'd go to a modern hospital and not have a barber treat them with leeches. Or whatever.

The present is no walk in the park, but the problem isn't the internet; it's endgame capitalism. And at least we have global trade and laser eye surgery.

The problem is that these assertions about our glorious history usually don’t quite check out—they tend to be based on misunderstandings, disputed or outdated scholarship, or outright fabrications long ago passed off as historical record. But that doesn’t stop people from regularly revisiting the idea, counterintuitive though it may be, that some parts of life were meaningfully better for people who didn’t have antibiotics or refrigeration or little iPhone games to play to stave off boredom. What, exactly, is so irresistible about a return to the Middle Ages?

Boredom? I'd bet the peasantry had lots of problems, but boredom wasn't one of them.

But as the article points out, I don't know shit.

He now thinks that English peasants in the late Middle Ages may have worked closer to 300 days a year. He reached that conclusion by inspecting the chemical composition of fossilized human remains, as well as through evidence of the kinds of goods that urban peasants in particular had access to.

This is in reference to the claim that European peasants worked like 150 days out of the year. Which, even on the face of it, has got to be suspect. After all, what was the ruling body of Europe, the one thing that defined and united the continent during what we call the Middle Ages? The Church, obviously. And what's the Official Doctrine of the Church? "Six days shalt thou labor..." It's right there in the beginning. Now, sure, there's some question about how much the lower classes could actually read, but they'd have gotten the message. So 365 minus 52 Sundays leaves 313; take off maybe a dozen more for Suck the Duke's Dick Day or whatever, and 300 sounds reasonable.

Compared to that, our standard two-day weekend is luxury itself.

Clark and his colleagues have revised their estimates upward, but the school of thought his previous numbers belonged to still has many academic supporters, who generally base their estimates of how much peasants worked on records of per-day pay rates and annual incomes. “This other view is that they were quite poor, but they were poor kind of voluntarily, because they didn’t like work and didn’t want to do a lot of work,” he told me.

There were probably signs up at medieval convenience stores: No Onne Wonts Too Werk Anymoor.

“It allows people to make their own medieval mythology and cling to that,” Janega told me. “They’re just kind of navigating on vibes.”

Like I just did. Only I know I did it.

The clear delineations that people assume between work and personal life just aren’t particularly tidy for peasants doing agrarian labor. “They’re thinking of these people as having, like, a 9-to-5 job, like you’re a contracted employee with a salary and you get vacation days,” she told me. “The thing about having a day off is like, well, the cows ain't gonna milk themselves.” So while people are correct that European peasants celebrated many more communal holidays than modern Americans, in many cases, that just meant they weren’t expected to do a particular set of tasks for their lord. Minding the animals, crops, and themselves never really stopped. Their vacations weren’t exactly a long weekend in Miami—after all, they didn’t really have weekends.

I'm imagining a feudal serf clocking in and out of their field. Er, their lord's field. Whatever.

There were, of course, some other obvious downsides to medieval life. A huge chunk of the population died before the age of 5, and for people who made it out of childhood, the odds of seeing your 60th birthday weren’t great. There wasn’t any running water or electricity, and there was a very real possibility that mercenaries might one day show up and kill you because your feudal lord was beefing with some other lord.

At least there was beer. On the other hand, there was a nonzero chance that you'd be burned as a witch for brewing it.

If you’re looking for a vision of history where people were generally peaceful and contented, though, you might want to check in with societies outside of the Middle Ages. Perhaps look for a group of people not perpetually engaged in siege warfare. “Medieval peasants are a weird one to go to, because, you know, they were rebelling constantly,” Janega noted. “Why are they storming London and burning down the Savoy Palace, if this is a group of happy-go-lucky, simple folk who really love the way things are?”

And maybe don't be so hyperfocused on Europe? I get it; the vast majority of Americans have (relatively) recent ancestry there. But other areas have much to teach us as well, as indicated by, for instance, the Chinese invention of the compass, the Indian invention of the concept of zero, and the Arab development of algebra—all of which took place during the European Middle Ages.

The article actually has quite a bit more relevant information than what I've copied here, but the takeaway, at least for me, is: don't look at the past with rose-colored glasses (spectacles were a medieval European invention).
August 27, 2022 at 12:02am
#1036983
Today's your lucky day. Oh, wait... no. Well, maybe it is. But I'm here to talk about luck.

The radical moral implications of luck in human life  
Acknowledging the role of luck is the secular equivalent of religious awakening.


I think the headline goes too far, bordering on clickbait. But that doesn't mean the content is wrong.

In July 2018 (when we first published this piece), there was a minor uproar when Kardashian scion Kylie Jenner, who is all of 22, appeared on the cover of Forbes’s 60 richest self-made women issue. As many people pointed out, Jenner’s success would have been impossible if she hadn’t been born white, healthy, rich, and famous.

People say the Kardashians are famous for being famous. I don't know. I don't keep up with them (pun intended). I don't care about them. Their antics mean exactly nothing to me. I'm rather disappointed that Forbes, a magazine that my very serious father always read very seriously, would fall so low as to cover one of them for any reason, but hey, publications change. But anyway, no, they're not famous for being famous; they've invested a lot of time and money into self-promotion, netting them more money (if not more time). I can't help hearing about them, no matter how strenuously I try not to. I get the feeling that if I moved out to the middle of Nevada and started living under a rock in an abandoned silver mine, within a week someone would come trudging into the mine, lift the rock and go, "Hey, did you hear about the latest thing some Kardashian did?"

All of which is to say that it's not just luck. Or if it is, it's lucky that they're so talented at self-promotion.

She built a successful cosmetics company — now valued at $900 million, according to Forbes — not just with hard work but on a towering foundation of good luck.

"Now" apparently being the date of the updated article, early 2020, still in the Before Time.

And I will reiterate that the idea that "hard work" (whatever that is) alone brings success is easily refuted: if it were the case, migrant laborers would be billionaires.

Around the same time, there was another minor uproar when Refinery29 published “A Week in New York City on $25/Hour,” an online diary by someone whose rent and bills are paid for by her parents.

A publication I've never heard of. But I've seen similar articles, usually with the tone of "This young couple managed to buy a house and pay it off in full by the time they were 30," while if you actually read the article, you find that they were able to do so because their parents paid for most of their crap.

It’s not difficult to see why many people take offense when reminded of their luck, especially those who have received the most. Allowing for luck can dent our self-conception. It can diminish our sense of control. It opens up all kinds of uncomfortable questions about obligations to other, less fortunate people.

Oh, I don't take offense. I just smile (insofar as I can), narrow my eyes, and go, "So?"

Nonetheless, this is a battle that cannot be bypassed. There can be no ceasefire.

And now we're back to the hyperbolic tone of the headline.

Individually, coming to terms with luck is the secular equivalent of religious awakening, the first step in building any coherent universalist moral perspective... Building a more compassionate society means reminding ourselves of luck, and of the gratitude and obligations it entails, against inevitable resistance.

I find that to be somewhat contradictory. If I'm playing craps and I make my point, and I'm an atheist (remember, they're talking about secular morality here), to whom or what do I express gratitude? The dice? That's silly. God? Nonexistent. Lady Luck? Still atheist. Fortuna, Roman goddess of luck? Still atheist. Some nebulous concept of the quantum fluctuations of the universe? See "dice." Sure, if an actual person does something nice for you, you express gratitude. And yes, gratitude is actually an emotional state and is intransitive (meaning it doesn't require an object), but if I win at gambling, I don't say "thank you"; I just bask in the fortune.

All of which is to say I'm not convinced that "gratitude and obligations" are a necessary byproduct of being lucky. But I'm willing to read on to see what the author might have to say about that.

How much moral credit are we due for where we end up in life, and for who we end up? Conversely, how much responsibility or blame do we deserve?...

How you answer these questions reveals a great deal about your moral worldview. To a first approximation, the more credit/responsibility you believe we are due, the more you will be inclined to accept default (often cruel and inequitable) social and economic outcomes. People basically get what they deserve.


I think those are fair questions, and I accept that people will answer them differently. In my worldview, we only have the illusion of being able to make decisions. It's more like we do whatever it is that we do, and then either justify or regret it afterward. It's also very clear to me that people do not, in general, get what they deserve; it's more like they get something, and have to (or get to) live with it.

The idea that people get what they deserve is pernicious. You end up worshiping successful people, and scorning those in poverty, on the basis of "well, they must have done something to deserve their state." (The article does delve into this morass later.)

Of course it is true that you have no choice when it comes to your genes, your hair color, your basic body shape and appearance, your vulnerability to certain diseases. You’re stuck with what nature gives you — and it does not distribute its blessings equitably or according to merit.

But you also have no choice when it comes to the vast bulk of the nurture that matters.


On that point, I can agree. You can no more choose your parents, or your childhood environment, than you can choose your eye color (please don't tell me about colored contact lenses; you know what I mean).

Here, a distinction made famous by psychologist Daniel Kahneman in his seminal Thinking, Fast and Slow is helpful. Kahneman argues that humans have two modes of thinking: “system one,” which is fast, instinctual, automatic, and often unconscious, and “system two,” which is slower, more deliberative, and emotionally “cooler” (generally traced to the prefrontal cortex).

Our system one reactions are largely hardwired by the time we become adults. But what about system two?

We do seem to have some control over it. We can use it, to some extent, to shape, channel, or even change our system one reactions over time — to change ourselves.


The key word there, to me, is "seem." This argument implies that we are somehow separate from our selves, that there's a ghost in the machine, pulling the levers, and we're the ghost. The problem with that implication is that, well, we're not. System one, system two, whatever; they're both products of brain activity—products of a physical process.

We do change, sure. Other environmental inputs give us more information, and the brain itself changes over time.

Everyone is familiar with that struggle; indeed, the battle between systems one and two tends to be the central drama in most human lives. When we step back and reflect, we know we need to exercise more and eat less, to be more generous and less grumpy, to manage time better and be more productive. System two recognizes those as the right decisions; they make sense; the numbers work out.

But then the moment comes and we’re sitting on the couch and system one feels very strongly that it doesn’t want to put on running shoes. It wants greasy takeout food. It wants to snap at the delivery guy for being late. Where is system two when it’s needed? It shows up later, full of regret and self-recrimination. Thanks a lot, system two.


This is usually represented in cartoons with an angel on one shoulder and a devil on the other.

As an aside, I take issue with the idea that we should be more productive. Productivity has led to myriad problems. Maybe we could do with being less productive.

I'm skipping a bunch here, but something else I take issue with:

The promise of great financial reward spurs risk-taking, market competition, and innovation. Markets, properly regulated, are a socially healthy form of gambling.

No. They are not gambling. I mean, sure, if you take a short-term view, they can be. But unlike, say, gambling in a casino, investment in the stock market gives you the house edge. So unless you're also prepared to call running a small business or a casino "gambling," this is another pernicious misconception.
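
To put rough numbers on that house-edge point (my own illustration; the market figure is a commonly cited long-run average, not a promise):

    # Expected value per dollar: an even-money American-roulette bet
    # versus a broad index fund. Illustrative numbers only.
    p_win = 18 / 38   # 18 winning pockets out of 38 on an American wheel
    roulette_ev = p_win * 1 + (1 - p_win) * -1
    print(f"Roulette, per $1 bet:  {roulette_ev:+.4f}")   # about -0.0526

    # Assumed ~7% average annual real return for a diversified index
    # fund, a frequently quoted historical figure.
    market_ev = 0.07
    print(f"Index fund, per $1/yr: {market_ev:+.4f}")

Negative expectation on every spin versus positive expectation on average: that's the difference between playing against the house and being the house.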

And there’s no reason we shouldn’t ask everyone, especially those who have benefited most from luck — from being born a certain place, a certain color, to certain people in a certain economic bracket, sent to certain schools, introduced to certain people — to chip in to help those upon whom life’s lottery bestowed fewer gifts.

Oh, you can ask all you want. You can even enforce the ask with taxes. But those types are lucky enough to be able to afford lawyers and accountants to minimize their tax burden. In the end, it doesn't matter whether someone knows they were lucky or not; what matters is how much they give a shit.
August 26, 2022 at 11:57am
#1036955
As this entry is later than usual—as I vaguely remember mentioning in a note yesterday, I might have consumed an excessive amount of ethanol—I'm not going to comment too much. But it's an interesting bit of history, illustrating some of the best and worst of humanity.

The Story of Charles Willson Peale’s Massive Mastodon  
When a European intellectual snubbed the U.S., the well-known artist excavated the fierce fossil as evidence of the new Republic’s strength and power


In the 18th century, French naturalist George-Louis Leclerc, Comte du Buffon (1706-1778), published a multivolume work on natural history, Histoire naturelle, générale et particulière. This massive treatise, which eventually grew to 44 quarto volumes, became an essential reference work for anyone interested in the study of nature...

The Comte de Buffon advanced a claim in his ninth volume, published in 1797, that greatly irked American naturalists. He argued that America was devoid of large, powerful creatures and that its human inhabitants were “feeble” by comparison to their European counterparts.


Obviously, Buffoon wasn't familiar with Sasquatch. As for "feeble," well, those "European counterparts" were busy systematically destroying the inhabitants by means of more advanced technology.

The claim infuriated Thomas Jefferson, who spent much time and effort trying to refute it—even sending Buffon a large bull moose procured at considerable cost from Vermont.

You know, this is about when, normally, I'd stop reading. Why? Because according to this article, the Comte died in 1778. The Wikipedia page claims he died in 1788 (I know there was a major calendar switch in the 18th century, but not that major). The ninth volume was, again according to this article, published in 1797, either 19 or nine years after his death. And yet Jefferson sent him a moose? To what, his mausoleum?

So, okay, something's really wonky about the dates here, and that definitely needs to be resolved (especially as Leclerc was a noble and the French Revolution was mostly a 1790s thing).

In 1739, a French military expedition found the bones and teeth of an enormous creature along the Ohio River at Big Bone Lick in what would become the Commonwealth of Kentucky.

I'm mostly just including this quote so that those of you unfamiliar with Kentucky can have a sensible 12-year-old chuckle at "Big Bone Lick."

Of course, the local Shawnee people had long known about the presence of large bones and teeth at Big Bone Lick.

It's right there in the name, folks. What? You didn't actually think the other definition applied?

For millennia, bison, deer and elk congregated there to lick up the salt, and the indigenous people collected the salt as well. The Shawnee considered the large bones the remains of mighty great buffalos that had been killed by lightning.

This is completely tangential to the article, but I've had this working hypothesis for a while now that the reason so many cultures have dragon myths is because they'd occasionally find dinosaur bones. Having no concept of deep time, they had to make up stories about how such enormous skeletons got to be part of the landscape, and those stories became dragon legends.

I have no real support for this, but it tracks with what I know about humans.

Anyway. Not much else to say, except that the article calls out my hometown, which I always think is cool (unless it's to recall the events of 2017). The rest of the story details the process of figuring out what those bones were (spoiler: mastodon), and, like I said, is an interesting look into the history of scientific discovery.

Oh, but before I go, don't give much credence to those stories you keep finding about people trying to Jurassic-Park the mastodon back into existence. Most of them are sensationalist.
August 25, 2022 at 12:14am
#1036902
No, I'm not trying to turn this into a cooking blog. Ugh. But sometimes a food article catches my eye, and also sometimes the random number generator spits them out back-to-back. In this case, and the last one, the hook isn't cook, but science.

How to Use Baking Soda Like a Scientist  
This ingredient belongs in both the laboratory and the kitchen.


Next time someone spouts off about not eating stuff with chemicals in it, you can point out that sodium bicarbonate is, by definition, a chemical, and they almost certainly eat delicious pancakes, which are made with baking powder (baking powder is baking soda with other stuff added). If that doesn't work, mention sodium chloride. If they're still being stubborn about it ("But I only eat organic salt"), point out that every ingredient contains chemicals. It still won't stop the ignorance, but at least you've gained the high ground.

Whipping up a recipe can feel awfully similar to conducting a science experiment. Either one could involve adjusting burners or measuring out various powders and liquids, all while carefully watching to make sure your project doesn’t explode, burn, or turn a funny color.

Be fair, now. Sometimes exploding is the point.

Cookbook author and food writer Nik Sharma happens to be both a cook and a scientist.

Oh, that name is so close to being an aptronym. Nik Shawarma would be an awesome name.

And most scientists are also cooks. It's not like they get paid enough to hire a full-time chef. The only question is whether they apply one activity to the other. The important thing is that he's also a writer.

In his writing, especially 2020’s The Flavor Equation, Sharma is educating the food world on the science behind the most common cooking techniques and ingredients.

Yeah, this is kind of a book promo. But it seems like a useful book.

The article then switches to the author/scientist/cook's point of view.

My fascination for cooking and chemistry developed simultaneously. It all started in my high school chemistry lab, during a lesson on the relationship between acids and bases.

When he was bitten by a radioactive papier-maché volcano?

Sodium bicarbonate—or baking soda—was one of the first ingredients that made me realize that a kitchen is, in essence, a laboratory.

Depends on your definition of "laboratory." Most cooks don't approach it from a scientific perspective, instead using recipes or their ancestral knowledge. Nothing wrong with that, as the goal is to provide something appetizing and edible, but it's not science.

The Rise of Baking Soda

Oh, ho ho ho. I see what you did there.

I approve.

Bakers have used carbonates as chemical leaveners since the Middle Ages. These substances release carbon dioxide bubbles when dissolved in water, or mixed with an acid. In a batter, this has a lightening, lifting effect.

It's interesting to me that this postdated the use of yeast, which also has the effect of leavening baked goods (in addition to the magic it works in delicious fermented beverages). In either case, something's producing carbon dioxide. I would have expected it to be the other way around, as people didn't actually know what yeast was until, like, microscopes. Yes, yeast was used for thousands of years before someone said "holy shit, it's alive!"

Sodium bicarbonate, NaHCO₃, is of course not a living organism.

Fun Fact of the Day: NaHCO₃ can be found in the wild, in a mineral called nahcolite. Yes, its name is a pun on the periodic table symbols involved. I find this amusing.

Baking soda is naturally alkaline, raising the pH when added to liquids or foods. Often, to reduce the acid in coffee or a very sour soup, I’ll stir in a tiny pinch of baking soda to neutralize and counteract the acidity.

Someone once told me to add a bit of baking soda to ground beef before frying it, to make it brown better. I figured it couldn't hurt, so I tried it (science!). I was displeased with the results, but it did produce a faster browning action. Not sure of the chemical reason for that, but since I don't plan to do it again, it's not near the top of my curiosity list.

If I’m cooking dried beans, I’ll first soak them overnight in a brine made with baking soda and salt, or cook pre-soaked beans with a smaller quantity of both. If you’ve ever cooked dried beans, only to have them turn out unpleasantly hard, this is the trick for you.

No way. I get my beans from a can, as God intended.

But the reason I'm quoting this line is to point out that "salt" as a culinary ingredient is almost exclusively sodium chloride, but chemists call any ionic compound formed when an acid neutralizes a base a "salt." Here, he's using the culinary definition, as baking soda is itself, chemically speaking, a salt.

Baking soda can also act as a catalyst in two important food reactions. A tiny pinch of baking soda to vegetables or meats while roasting or sautéing accelerates the rate of sugar caramelization, and supercharges the Maillard reaction, the rate at which the amino acids in proteins react with sugar.

Well, that's that low-level curiosity satisfied. You didn't think I'd actually leave you hanging, did you?

Every time I look at the jar of baking soda sitting inside my pantry, I smile.

Well, I hope you change it out every so often. Unlike table salt, it has a relatively short shelf life. No, that's not just Church & Dwight (the makers of Arm & Hammer, which is probably the best-known brand of baking soda, at least in the US) trying to get you to buy more of their product. Using it as a fridge freshener, now, that's them trying to get you to buy more of their product. There is little actual evidence that this works, but damn, they're good at marketing.

By the way, it occurred to me the other day that I never really explained why I bang on about marketing in here sometimes. It's because this is, ultimately, a writer's blog; writers tend to want to publish, and publishing requires effective marketing. At the same time, I despise marketing excesses. There's a balance. Kind of like with adding sodium bicarbonate to food: too much and it leaves a bad taste.

The article ends with a "Tips and Tricks" section, about which I only have a couple of comments:

A pinch of baking soda mixed into a glass of water acts as an antacid to reduce heartburn.

The jury's still out on whether sodium actually affects heart health. But to be on the safe side, I try not to overuse sodium salts, including NaCl and bicarb. No, "heartburn" doesn't really involve the heart; it's just that, having had a heart attack, I'm wary of certain overindulgences (booze doesn't usually contain sodium, except for margaritas). I'd check with an actual doctor before using this "trick" if you've got heart issues.

Outside cooking, baking soda has many uses. It can also be used as a mild soap along with vinegar to clean kitchen counters, and stubborn grease marks.

This, I can vouch for. It's also fun to watch the baking soda/vinegar reaction (the chemical result is sodium acetate, water, and carbon dioxide, but you know it as the middle school science fair "volcano" I referred to above).
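
Written out, that reaction is:

    NaHCO₃ + CH₃COOH → CH₃COONa + H₂O + CO₂

One molecule of each reactant, one of each product; the fizz is the CO₂ leaving.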

I'm also told that it can remove stains from nonstick pan surfaces. I haven't had much luck with that; I have a pan that badly needs a deep cleaning, but nothing has worked yet.

Anyway, the real point here is: chemicals are your friends.
August 24, 2022 at 12:02am
#1036871
Today, some hard-hitting journalism sure to be eligible for a Pullet Surprise.

Is Garlic Getting Easier to Peel?  
To solve the mystery, I had to talk to horticulturalists, farmers, chefs—and also my local grocery store.


Mmmm... garlic chicken...

Where was I? Oh yeah. Most of the time, the answer to any question posed in a headline is "No."

For most of my adult life, I was a garlic-phobe. Not because I didn’t like the flavor! No, like basically every human on earth, I adore garlic...

I've known people who didn't like garlic. I avoid them, figuring they're vampires.

I loved eating garlic. What I hated, for years, was peeling garlic.

If only you could have relaxed your culinary snobbery long enough to purchase pre-peeled garlic. Or use (gasp) garlic powder. It has its place, you know.

What a drag it was! Let’s say your recipe calls for three cloves of garlic. Your onions are fizzing, your pasta is bubbling, and you’re cursing, trying to separate the garlic’s skin from the cloves within.

You know, I don't consider myself a gourmet chef, though I get by. And I've ranted before, in the Comedy newsletter and probably here too, about the frustration of garlic-peeling. But there's a concept called mise-en-place, which I knew about even before I started learning French. Literally "put in place" (the "place" is pronounced differently, of course), it means prepping most of your ingredients before you start cooking, so you don't find yourself struggling with peeling garlic, chopping onion, or whatever at critical stages in the process.

It's not always necessary. If you're broiling chicken or whatever, you can do some chopping while it's cooking. But having the garlic peeled and minced before you even turn the stove on is, in my estimation, pretty basic.

Like juicing a lemon or opening a stuck jar, peeling garlic is one of those kitchen tasks so pesky that gadget companies are forever trying to solve it for you. I’ve tried the shakers. I’ve tried the silicon rollers. I’ve tried cutting off the stem end and rolling the clove between my hands.

Cut off the stem end, twist the cloves a bit to loosen the skin (this also releases some flavor), and rub them together between your hands while standing over a trash can to catch the paper. You do run the risk of dropping the garlic, so don't do that. Notice I said "cloves." No matter what a recipe says, a single clove is never enough.

Annette recommends squeezing them under the flat of your vegetable knife, and that can work too, but it requires more tools than just using your fingers.

Also, juicing a lemon? Come on. The only trick to that is to do it through a sieve so the seeds don't squirt out onto your shrimp or whatever.

But over the past three years or so, something strange has happened to the garlic I buy at the grocery store. It’s become so much easier to peel!

The more you do something, the easier it gets.

How did my garlic transform from sticky nightmare to user-friendly flavor dispenser? Have America’s garlic breeders suddenly focused on peelability as a saleable trait? Has competition from pre-peeled garlic somehow forced a change in the garlic farming world? I had to know the answer, so I called everyone I could think of who might know anything about garlic.

So, not vampires.

“I don’t know that anybody’s measured that, peelability,” said Barbara Hellier, a horticultural crops curator with the United States Department of Agriculture in Pullman, Washington. Like a home chef with a particularly tough clove, she wrestled to unwrap the subject:...

Stretch metaphor.

I asked if anyone was breeding garlic specifically to improve peelability, and she told me something I hadn’t previously known: “There’s hardly anyone breeding garlic at all.” Garlic, it turns out, isn’t like other crops, where you plant seeds, grow a plant, harvest it, and then, next year, plant a new seed. Garlic seeds, from fertilized flowers on garlic plants, look a little like onion seeds, but hardly anyone generates and plants them—because why would you? All you need to grow a new garlic plant is just one garlic clove off a garlic bulb. (When you’ve left your garlic sitting around so long a clove sprouts a green shoot, you’ve begun that process.)

No garlic lasts that long around me.

The issue, Kamenetsky explained, is that unlike other crops, garlic mostly can’t flower and be fertilized. “In garlic, this was damaged in ancient times,” she said. “For 5,000 years, people selected for bigger cloves, and they continually selected against flowering.”

Okay, now, see, that's interesting. It's kind of like with bananas. Doesn't have anything to do with a peel, though. Oh yeah, I made that pun.

Within a year of opening, Serafini found a California farm that sold pre-peeled garlic, which is where the Stinking Rose now sources all its A. sativum. “What they do is they put it in screens that heat the skin to a certain temperature, dry it, and then they put it through a wind treatment, like a wind tunnel almost.” The wind blows the skins off. Serafini now swears by pre-peeled garlic: “It’s really the best way! It’s more consistent.”

See? Stop being a snob.

But I hadn’t solved this mystery! My grocery store wasn’t getting some special easy-peel garlic. No one was really breeding easy-peel garlic. So what had happened? It was time to go to the source: Harris Teeter, my grocery chain.

Ah, the problem begins to resolve itself.

The article goes on to describe how garlic shipped from further away has more time to dry and thus might be easier to peel.

I guess I’m grateful that the impossibly convoluted complexity of the intercontinental produce supply chain—which makes modern life more convenient in the short term but is destroying the planet for the long term—is the likely cause of my garlic’s new peelability.

Yeah, at this point, I'll take convenience. We're doomed anyway, so we might as well enjoy the ride. One of the few actual benefits of living in late-stage capitalism is global trade. And the planet's not getting destroyed; only the biosphere.

The result is a system in which buying local, freshly harvested produce can result, bizarrely, in a worse product.

There's nothing at all bizarre about it. If buying local actually gave us better products, we'd never have switched to a global trade model. The only reasons to source locally are to support local farms and to give yourself something to brag about on social media. The former I normally support, except when I know the farmers are voting for the wrong politicians.

But whatever. So, in general, like I said way back at the beginning, the answer is "no, garlic isn't getting easier to peel." Except in some individual circumstances. I'm going to keep wrestling with those little buggers anyway, because the taste is worth it.
August 23, 2022 at 12:01am
#1036837
To be clear, I started saying "common sense is neither" before I found this article. But then I found this article, so of course I had to put it in my queue. It's from 2011, just to put some of its content into context.

Common Sense Is Neither Common nor Sense  
How often is common sense correct?


I started getting the idea that "common sense" was just another phrase for anti-intellectualism and know-nothingness. Some politicians run on a "common sense" platform, and the ones that do are all a bunch of down-home anti-book-learnin' ideologues, so I don't trust them. "Let's ignore scientific evidence and instead just go on my personal experience."

Common sense, defined as "sound judgment derived from experience rather than study," is one of the most revered qualities in America.

That's a silly damn definition. It's not sound, it's not judgment, and if it's derived from experience, then it's not common, is it? It's personal.

It evokes images of early and simpler times in which industrious men and women built our country into what it is today.

Let's be real here. Most "common-sense" folks are only thinking about the men from history. They only care about women in terms of how many babies they can make.

People with common sense are seen as reasonable, down to earth, reliable, and practical.

Nice passive voice there. Not by me.

But here's the catch. Common sense is neither common nor sense.

Which is what I've been saying.

If common sense was common, then most people wouldn't make the kinds of decisions they do every day. People wouldn't buy stuff they can't afford. They wouldn't smoke cigarettes or eat junk food. They wouldn't gamble. And if you want to get really specific and timely, politicians wouldn't be tweeting pictures of their private parts to strangers. People wouldn't do the multitude of things that are clearly not good for them.

Okay, whoa, hold the fuck up there.

People do all those things for plenty of reasons, and a lack of sense isn't always among them. People buy stuff they can't afford because they're convinced by ad agencies that it will improve their lives more than saving money will. People smoke cigarettes because they're addicted, or because it's genuinely pleasurable. People eat junk food because it's cheap and easy, cheaper than healthy food, which is hard to reconcile, at least for poor people, with the advice to buy only stuff you can afford. Gambling can be a problem, but for some of us it's just another entertainment expense, equivalent to going to a sportsball game or music concert. We all do things that aren't good for us because, on some level, they are good for us. (The "private parts" thing is a reference to a scandal that was current when the article was written.)

This doesn't change my agreement with the general thrust of the article, but that bit is condescending as hell.

This is the important bit:

And common sense isn't real sense if we define sense as being sound judgment because relying on experience alone doesn't usually offer enough information to draw reliable conclusions. Heck, I think common sense is a contradiction in terms. Real sense can rarely be derived from experience alone because most people's experiences are limited.

Our senses, our lived experiences, tell us the Earth is flat. It looks flat, doesn't it? Especially if you live in Kansas. And that we're at the center of the universe. A person who survives a car accident while not wearing their seat belt might believe that it's safer to not wear a seat belt. The news covers plane crashes religiously, but rarely car wrecks, so you'd think flying would be more dangerous than driving (it is not). You took horse dewormer and got better, so obviously the horse dewormer made you better, right? That's just common sense.

It takes science, research, knowledge, book learnin' to extend our senses, to help us see reality beyond our limited individual experiences. No, science doesn't have all the answers. But it has way more than you do alone. Science tells us the planet's roughly spherical, and how gravity affects orbits, and that the universe doesn't have a "center." Research shows that you're safer with a seat belt than without one, but that doesn't mean you're invulnerable. Studies show that dewormer only cures worms; if you got better from something else while taking it, that was a coincidence.

The word common, by definition, suggests that common sense is held by a large number of people. But the idea that if most people think something makes sense then it must be sound judgment has been disproven time and time again. Further, it is often people who might be accused of not having common sense who prove that what is common sense is not only not sense, but also completely wrong. Plus, common sense is often used by people who don't have the real knowledge, expertise, or direct experience to actually make sound judgments.

"Global warming isn't real! Look at this snowball."

I think we need to jettison this notion of the sanctity of common sense and instead embrace "reasoned sense," that is, sound judgment based on rigorous study of an issue (which also includes direct experience).

No, this doesn't mean going through YouTube or social media in search of things that ring your confirmation bias. Yes, I'm aware that this article rings my confirmation bias. No, this is not a contradiction.

A course in scientific thinking and methodology for everyday life should be a requirement for all students. Such proactive education about precise thinking and real sense might reduce the number of truly dunderhead things that subsequent generations will do (the current generations are probably beyond remediation).

Wow, this guy's a tool. Doesn't mean he's wrong, though.

Without being receptive to answers that we may not want to hear, we might as well just ask ourselves what we want to be true and go with that, which is what many people with so-called common sense (most efficient, but often wrong).

Which is what a lot of people seem to do anyway. (This article could have used an editor.)

Let's be realistic. No one likes to see their "theories" disproven.

This is not true. For instance, I have a "theory" (which isn't one in the scientific sense) that technology-using beings, such as us, are extraordinarily rare in the universe, to the point where there's not another one in this galaxy. I'd love for that to be disproven (unless of course it involves their technology blowing up the Earth). And it would be very easy to disprove it.

Anyway. Quibbles aside, the main thrust of the article is something I absolutely agree with.

Which of course makes it suspect. But I can live with that.
August 22, 2022 at 12:01am
I don't have a lot to say about today's article; I just think it's a good, fairly simple, example of how science gets scienced.

Which Weighs More, a Pound of Stone or a Pound of Styrofoam?  
It’s not a trick question: your brain answers differently, depending on whether the materials are part of the same object or not


Ah, but it is kind of a trick question, isn't it? There's objective weight, which can be measured by a scale, and subjective weight, which is what your muscles anticipate and feel. Consider two packages of the same weight, one large, one small (maybe the large one contains nothing but packing pillows). The small one will be easier to pick up and carry if only because of the bulk involved.

That's not what this article is about, though.

For more than a century, scientists thought they knew the answer to a curious question: why does 10 pounds of a low-density substance such as Styrofoam feel heavier than 10 pounds of stone? It isn’t heavier, of course, but repeated experiments have shown that it feels that way.

Science tip #1: Just because you think you know the answer, doesn't mean you do. Always test.

Now psychologists say their initial explanation may have been incomplete, and the new explanation could have far-reaching consequences, including for the way Netflix designs the algorithms that recommend movies to its customers.

Science tip #2: Your experiment doesn't have to have practical, everyday applications. But if you can come up with one, it'll be easier to communicate it to the teeming hordes.

The article goes on to describe the experiment and their results, which, as the headline hints, are that people are ass at guessing weights. Or something.

Knowing how the brain estimates weight isn’t just an interesting experiment—it can actually help scientists develop smarter technologies that we use every day. Now that we know more about how context changes the brain’s decisions, programmers might be able to update technologies such as Netflix to imitate the brain more accurately and provide more fine-tuned recommendations for users.

Article is from 2019. Netflix is still trying to recommend shows and movies to me that there's no way in hell I'd watch.

Anyway, like I said, not much else to say, and you'll have to go to the article to see what the actual experiment was, because it wouldn't be easy to take any of it out of context. And I'm all about easy. Give me the smaller package.
August 21, 2022 at 12:04am
I'm not above making up words if I don't know a good one for the context. Or even just for fun. None of them have caught on, but English has some words that caught on for a while, and then... caught off?



If your dream is to talk like Moira Rose from Schitt’s Creek...

Who? No.

...look no further than Mrs. Byrne’s Dictionary of Unusual, Obscure, and Preposterous Words, one of the dictionaries Catherine O’Hara used to tweak her iconic character's lines.

Still not interested in the show.

The following terms for everyday things are ones you'll want to add to your lexicon ASAP.

People who grew up with the internet seem to think they had the monopoly on turning words into acronyms. They did not; ASAP predated widespread use of computers. Some sources claim it's about a hundred years old. They also didn't invent weed or sex. Just saying.

On the other hand, a bunch of words that people think were acronyms weren't. Those are called backronyms because people love portmanteaux (I, on the other hand, do not). An example of a non-acronym is the ever-useful F word.

Anyway. The article lists too many words for me to copy all of them, and besides, lawyers exist. So I'll just highlight a few.

3. Baragouin

Another word for gibberish that dates back to the early 1600s.


But why bother? Gibberish is easier to spell, pronounce, and remember, and also has the advantage of sounding like what it is.

4. Bumfodder

Why yes, this is a 17th-century word for toilet paper.


Again, easier to say "bumwad," or even "loo roll" if you're of a British bent.

9. Clinchpoop

If you get into a confrontation with a jerk, consider calling them a clinchpoop, which the OED defines as “A term of contempt for one considered wanting in gentlemanly breeding.”


I mean, sure, hit 'em with that word if you want them to hit you with their fists.

13. Eructation

A fancy word for belching...


Everything sounds more proper when using Latin root words. That's one reason we have so many. Defecation. Urination. Flatulence. This one, though, is just showing off.

18. Forjeskit

“Forjesket sair, with weary legs,” Scottish poet Robert Burns wrote in 1785’s “Second Epistle to J. Lapraik.” It was the first use of the word, which means “exhausted from work,” according to Mrs. Byrne’s Dictionary.


Still no definitive word on whether Scots is a dialect of or sister language to English, but I'm pretty sure this word belongs in that language. That's not the only Scots word at the link.

24. Join-hand

Another word for cursive handwriting.


Some words went obsolete for a reason.

31. Maquillage

Another word for makeup that dates back to the late 1800s.


And that one's French. Yes, English stole a bunch of words from French, but most of them came from way before the late 1800s.

32. Matutolypea

According to Mrs. Byrne’s Dictionary, this term means “getting up on the wrong side of the bed.” Macmillan Dictionary notes that the word “is derived from the Latin name Matuta from Matuta Mater, the Roman Goddess of the dawn, and the Greek word lype meaning 'grief or sorrow.’”


Unnecessary. Cumbersome.

40. Ombibulous

According to Mrs. Byrne’s Dictionary, ombibulous describes “someone who drinks everything.” It was coined by H.L. Mencken, who once wrote, “I am ombibulous. I drink every known alcoholic drink and enjoy them all.”


Finally! One I have reason to steal.

47. Scacchic

“of or pertaining to chess,” according to the OED.


As far as I've been able to figure out, the French word for "chess" is the same as the French word for "failure." Unsurprisingly, it seems to be related to this one, which comes from Italian.

51. Tapster

Another word for a bartender.


Hey look, another one I can actually use.

Anyway, like I said, many more at the link. Personally, I think most of these, however unusual, have better words to describe their concepts. But some writers seem to take great delight in vexing their readers with obscure synonyms, so the article might be useful to them.

