Rated: 18+ · Book · Personal · #1196512
Not for the faint of art.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
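
For instance, the famous Mandelbrot set comes from nothing fancier than repeatedly applying z → z² + c and asking whether the result stays bounded. Here's a minimal Python sketch of that test (my own illustration; the function name and the 100-iteration cutoff are arbitrary choices, not anything official):

# Mandelbrot membership test: iterate the "very simple transformation" z -> z*z + c.
def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    z = 0j
    for _ in range(max_iter):
        z = z * z + c        # the entire transformation
        if abs(z) > 2:       # once |z| exceeds 2, the orbit escapes to infinity
            return False
    return True              # still bounded after max_iter steps; treat c as "in"

print(in_mandelbrot(-1 + 0j))  # True: the orbit just bounces between 0 and -1
print(in_mandelbrot(1 + 1j))   # False: escapes almost immediately

Color each point of the plane by how quickly it escapes, and the intricacy shows up on its own.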




November 27, 2022 at 12:01am
#1041062
Today's blast from the past is a short entry I wrote near the end of August, 2008: "I can't get no..."

I've been trying to put words to the malaise that seems to have overtaken my life. It's not that I'm not happy, or I'm severely lacking in anything (except maybe motivation).

I don't think I wallow in self-pity to that degree anymore. If I do, I don't write about it, because that leads to a) people shunning you even more; b) advice that doesn't work for me (as per some of the comments on that entry); or c) people being happy about it because at least they're not you.

Anyway, I did finally figure out what was bugging me, I think: work. The fact that I had to do it. It was interfering with my video game time.

What I want to focus on, though, is the first comment, from someone I sadly haven't seen in a while but used to comment here a lot:

It's "midlife crisis," buddy. I've experienced it and know men who experience it until either they self destruct or they do something to sate it.

Some people treat a midlife crisis, especially in dudes, as a joke. And to be fair, sometimes it really is funny, like when someone goes out and blows all their money and credit on a Porsche and a 21-year-old hooker (neither of which I've ever done). But that shit's real, and as much as society pushes men to squash their emotions, doing so is generally a Bad Idea.

That shit can ruin lives. I had an uncle who had an affair with a grad student, and ended up destroying his family. Fortunately, the kids reconciled with him before he croaked, and he expressed regret at his actions.

Anyway, I don't think that was one; I've always been prone to depressive episodes, regardless of age. And even if it was, I didn't have a family to ruin. Sure, I ended up getting divorced the following year, something which was in no way my fault [Narrator: It was a little bit his fault]. But having glanced at a few of the intervening blog entries, they weren't all gloom. Some of them were about my epidural for back pain, and anyone who's experienced chronic back pain can tell you it definitely affects one's mental health. And then there was a vacation, which apparently helped, too.

I did end up, over the following year, buying a new car, retiring, and traveling (in that general order). But the car was a Subaru, not a Porsche, and traveling is something I'd always wanted to do but was difficult while working full-time.

Whilst out and about yesterday in my new-to-me Subaru—after over a year without a car, I wanted to see what changes happened in my town, and besides, it was sunny and 70 damn degrees outside—I saw that we now have a Porsche dealership in my town.

I wasn't even the slightest bit tempted.
November 26, 2022 at 12:01am
#1041037
I've done entries about avocados before. Here are a couple: "The Devil's Avocat" and "Another Avocado Article". This one's not about the fruit so much as it is about the product.

How Marketing Changed the Way We See Avocados  
Once upon a time, Americans didn't know what to do with "alligator pears." Now we can't get enough


I have a confession to make: When I was a kid, I hated avocados. "Zaboca" as they are called in Trinidad (and maybe elsewhere in the Caribbean) were mushy and gross, in my young, uninformed opinion. They just didn’t taste like anything. Plus, I believed my parents ate some weird things in general.

I'm pretty sure all kids, from all cultures, have different approaches to food than adults. Hell, most of 'em probably don't like beer, even. But we usually grow up to actually like some of those yucky things mom and/or dad tried to shove down our gullets when we were little.

In the beginning of the 20th century, they were called “alligator pears.” Their bumpy, olive skin connected them to those denizens of the swamp, and their shape resembled, well, a pear.

They had a marketing problem.


In addition to already discussing avocados, I've mentioned a few marketing problems in here before. Can't be arsed to find those entries, but I vaguely remember something about orange juice. And maybe chicken wings.

Along these lines, the California Avocado Grower’s Exchange launched a petition to change the name of the fruit, formally. They were pushing to get back to the cultural roots of avocados: The word “avocado” is derived from the Aztec “ahuacacuahatl.” This renaming was meant to further exoticize the product, lending credence to the idea that it was a special treat.

Another source puts it as āhuacatl. I don't know enough about the Aztec language to know if someone made a mistake or if maybe they used both words, or the one was a shortening of the other. Doesn't matter much, I suppose, but I do like to get these things right.

That was all well and good until nutrition experts began to promote a low-fat diet.

Wait, I thought that wasn't a thing until like the 1990s. Is this article skipping whole decades?

The public didn’t differentiate between saturated fats, which were the target of this movement, and monounsaturated fats, which are “good” fats. Avocados came under fire.

Avocados under fire are disgusting. No, seriously, nothing's worse than a heated avocado.

So the avocado growers rallied. They funded research and put out studies meant to extoll the virtues of the fruit.

This. This is why people don't trust nutrition science.

The turning point for avocados was their integration with the [name of copyrighted sportsball game that takes place in February redacted].

With enough money, anyone can advertise anything during that game and people will flock to it. If you put a halftime ad there selling boxes of unprocessed human shit, your shit supply would run out the next day.

The public’s investment and interest in the Su[bleep]wl cemented guacamole as a snack item, giving the avocado a foothold it needed. Access to avocados also increased as the previous ban on the import of this item was lifted in 1997, and fruits from Michoacan began to flow across the border.

Then why do I remember avocados from way back in the 70s? It's possible my memory is faulty, or maybe I shifted timelines.

At the end of the day, avocados have a place on today’s table thanks in part to a tireless campaign to redefine and redraft their identity. Some of it was misguided, some of it was weird and some of it was good. That is the nature of advertising.

I put up marketing articles here from time to time because many writers need to know how to do marketing. I mean, I would, but no matter how much I learn about it, I'm utterly incompetent at marketing (maybe because I won't pay for an ad spot during the game-that-shall-not-be-named). Doesn't stop me from reading about it, though.
November 25, 2022 at 12:01am
#1041003
Wrong time of year for this, but that's never stopped me before.

The Ancient Math That Sets the Date of Easter and Passover  
Why don’t the two holidays always coincide? It is, to some degree, the moon’s fault.


And yes the headline uses the M word.

Passover is a springtime Jewish festival celebrating the early Israelites’ exodus from Egypt and freedom from slavery. Jews observe it by hosting a ritual dinner, called a seder, and then by abstaining from eating all leavened bread for about a week.

Hey, it's the yeast we could do.

Easter is a springtime Christian holiday celebrating the resurrection of Jesus Christ and freedom from sin and death. It is preceded by a series of holidays commemorating Jesus’s path to the cross. One of these holidays is Maundy Thursday, which, aside from being a great name for a holiday, is a remembrance of the Last Supper, which was a seder. In the United States, many Christians observe Easter by attending a ritual meal between breakfast and lunch, called a brunch.

That part cracked me up.

These holidays have a lot in common: They share themes of liberation and triumph; they both involve buying a lot of eggs; they were both a pretty big deal for Jesus.

To acknowledge that I'm writing this in late November, just when the winter holiday marketing season is gearing into overdrive and rolling coal at our collective Priuses, I'll note that there is something of a parallel here with Christmas and Hanukkah. There is, however, one incredibly important difference: while Easter was built off of Passover (and both holidays stole from Pagans), Hanukkah and Christmas (also stolen from Pagans) have fuck-all to do with each other, apart from generally happening when it's way too bloody damn cold in the northern hemisphere.

Without going into detail, Hanukkah isn't "Jewish Christmas" (my friend likes to call it "Blue and Silver Christmas"). But, like the holidays in the article, sometimes they happen to overlap (like this year), and sometimes they don't.

In the Gospels, the existential drama of Easter happens against the backdrop of Passover. Yet about 15 percent of the time, the two holidays actually occur a month apart.

Those are good years for me. See, my cousin usually wants me to travel for Passover. Which is fine. Except when Passover falls on Easter, in which case traveling up the Northeast Corridor is the First Circle of Hell.

Anyway, the rest of the article goes into the differences between the Hebrew lunisolar calendar and the Christian solar calendar.

During the month of Adar (which directly precedes the Passover month of Nisan), the ancient rabbinical court would decide if it was springy enough outside for Passover. If spring seemed to be on track, Nisan could occur. But if it wasn’t warm enough outside yet, the rabbis would tack on another month of Adar. They called this leap month Adar II.

Early Rabbinical Judaism was very creative with names.

Today Roman Catholics and most Protestant traditions now celebrate Easter after March 21 on the Gregorian calendar. But the Eastern Orthodox Church uses the older version of that calendar, known as the Julian, to determine the date of Easter and other festivals.

So it's not just different religions' calendars that cause issues, but that of different sects of the same religion. This in no way surprises me.
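
For the truly curious: the Western rule (first Sunday after the ecclesiastical full moon falling on or after March 21) reduces to plain integer arithmetic. This is the well-known "anonymous Gregorian computus," not anything from the article, and the single-letter variables are the traditional, meaningless names:

def gregorian_easter(year):
    # Anonymous Gregorian computus (often attributed to Meeus/Jones/Butcher).
    a = year % 19                         # position in the 19-year lunar cycle
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30    # approximates the Paschal full moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7  # days to push the date to a Sunday
    m = (a + 11 * h + 22 * l) // 451
    month = (h + l - 7 * m + 114) // 31   # 3 = March, 4 = April
    day = (h + l - 7 * m + 114) % 31 + 1
    return month, day

print(gregorian_easter(2022))  # (4, 17): Easter fell on April 17, 2022

Note that nobody is looking at the actual moon here; the "full moon" in the rule is a purely ecclesiastical approximation.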

Thanks to tiny wobbles in Earth’s orbit, some years are a second or two longer or shorter than others. So every year, the International Earth Rotation and Reference Systems Service announces whether to add a leap second in order to align Earth time with solar time.

Yeah, it looks like they're abandoning, or at least pausing, this practice.

And as it happens, the first night of Passover can never fall on Maundy Thursday, even though that holiday commemorates a seder. That’s because Passover can never begin on Thursday, ever. “The calendar is rigged so that [seder] can fall only on certain days of the week,” Dreyfus told me. “If Passover started Thursday night, it would push Rosh Hashanah the following year to start on Saturday night.” And neither Rosh Hashanah nor Yom Kippur, the two High Holidays of the Jewish year, can fall the day after Shabbat.

Just in case you thought it wasn't complicated enough.

But no, to directly address the article's subhead, it's not the moon's fault at all. Nor is it the sun's. "The fault, dear Brutus, is not in our stars, but in ourselves, that we are underlings. Might you become master of your fate through choice—no matter what the stars say?" When it comes to holidays and observances, we're all servants of the calendar, and the calendar—any calendar—is purely arbitrary. I could argue that the Hebrew calendar, tied as it is to lunar cycles, is somewhat less arbitrary than the Gregorian, which is tied to nothing (well, sort of, mostly). But it's still arbitrary: Hebrew calendar months start on new moons. Why not full? Or half?

Sometimes I think we should just abandon the whole thing and adopt the Tranquility Calendar.   Other times I go in the complete opposite direction and want a calendar more obviously tied to actual astronomical observations: solstices, equinoxes, moon phases.

Hardly matters, though. Inertia favors the Gregorian. We can't even agree on stopping this Daylight Saving Time nonsense; calendars ain't gonna change.

Well. Eventually they will, because even the orbits of the Earth and Moon are, over a long enough timescale, chaotic. Or we could disappear, along with our calendars. But we're stuck with these complicated things for the foreseeable future. Fortunately, you don't have to make any of the observations and computations yourselves; someone else will tell you when it's time to celebrate whatever.

*Movie**Film**Film**Film**Movie*


Because yesterday was Thanksgiving, I had nothing better to do than go to the movies.

One-Sentence Movie Review: Bones and All:

A meaty movie with a side of mashed metaphors and symbolism sauce, this story of two fine young cannibals in the 1980s was filling Thanksgiving Day fare for me.

Rating: 3.5/5
November 24, 2022 at 12:03am
#1040976
You've certainly heard the cliché, "the greatest thing since sliced bread," which implies that sliced bread was the greatest invention. This is obviously false, as beer was invented before sliced bread.

Sometimes it takes a lack of something to truly appreciate it. This was true during Prohibition, and apparently, also true during World War II.

Remembering When America Banned Sliced Bread  
During World War II, the U.S. government turned to drastic rationing measures.


The year was 1943, and Americans were in crisis. Across the Atlantic, war with Germany was raging. On the home front, homemakers were facing a very different sort of challenge: a nationwide ban on sliced bread.

While it's true that there were far worse things in WWII than a lack of sliced bread, I can see how that would be frustrating.

The ban on sliced bread was just one of many resource-conserving campaigns during World War II. In May 1942, Americans received their first ration booklets and, within the year, commodities ranging from rubber tires to sugar were in short supply.

These days, of course, such measures would be "government overreach," "tyranny," "an attack on muh freedumbz," and "cause for riots in the streets."

So by January 18, 1943, when Claude R. Wickard, the secretary of agriculture and head of the War Foods Administration, declared the selling of sliced bread illegal, patience was already running thin. Since sliced bread required thicker wrapping to stay fresh, Wickard reasoned that the move would save wax paper, not to mention tons of alloyed steel used to make bread-slicing machines.

Okay, so I can see the wax paper thing (plastic wasn't much used then), but didn't the slicers already exist? Sliced bread was invented in 1928, and was pretty much everywhere five years later—ten years before the ban.

On July 7, 1928, the Chillicothe Baking Company in Missouri first put his invention to use, saying it was “the greatest forward step in the baking industry since bread was wrapped.”

So, really, the expression should be "the greatest thing since wrapped bread."

As an aside, I'm picky enough to get my bread from a local bakery rather than the supermarket. They'll slice it right there upon request, and slide it into a plastic bag with a twist tie. Since it's fresh bakery bread without preservatives, if it's not wrapped, it becomes a rock within 24 hours.

Sliced bread really took off in 1930, when the Continental Baking Company’s pre-sliced Wonder Bread made its way into American homes.

Ugh. Foul. Disgusting.

After a few years of aggressive marketing, the pillowy, preservative-laced loaves were synonymous with modernity and convenience.

They're synonymous with American lack of taste, along with American cheese and light beer.

On January 24, less than a week after the ban, the whole thing began to unravel. New York Mayor Fiorello LaGuardia made a public announcement that bakeries that already had bread-slicing machines could carry on using them.

See?

No wonder he got an airport named after him.

One baker by the name of Fink, who also happened to be a member of the New York City Bakers Advisory Committee, publicly advocated for the ban, then was fined $1,000 (more than $14,000 today) for sneakily violating it.

Tempting as it may be to claim that his name was the source of the verb "to fink," no, it wasn't.

By March 8, the government decided to abandon the wildly unpopular measure. “Housewives who have risked thumbs and tempers slicing bread at home for nearly two months will find sliced loaves back on the grocery store shelves tomorrow in most places,” noted the Associated Press.

Government overreach, tyranny, muh freedumbz.

In the end, no thumbs were severed and Americans were reunited with the sliced bread they had learned to hold so dear.

Once you get used to an invention, especially one that saves time and work, it's very, very hard to do without it. If necessity is the mother of invention, laziness is the milkman.

Also, fuck Wonder Bread.
November 23, 2022 at 12:02am
#1040944
This one's been hanging out in my queue since October, but whatever. Poe is timeless.



Article is from Cracked, so take it with a grain of gothic black salt.

As the original goth boi, it’s only fitting that Edgar Allan Poe’s death was as mysterious and haunting as one of his stories. Just before he died at age 40, he seemed to drop off the face of the Earth for a week, and his death has been attributed to everything from low blood sugar to murder.

There was a movie called The Raven about 10 years ago, starring John Cusack as Poe. It was critically panned and drew a lukewarm audience response at best, but I felt it was severely underrated. Not that it was a great movie, but it didn't suck, either. I think most people missed the point. The movie makes the most sense, I think, if you remember the Poe quote, "All that we see or seem is but a dream within a dream." Or maybe I'm enough of a fan of both Poe and Cusack to have enjoyed it anyway.

15. Missing Poe-son

Oh, how clever. A Poe pun. Well, I shouldn't complain too much; I have every intention of adopting a tomcat just so I can name him Edgar Allan Purr. Bonus points if I can find a black one.

On September 27, 1849, Poe left Richmond, Virginia, where he’d been busy talking his childhood sweetheart into marrying him, for Philadelphia for a job editing a poetry collection (it needed more symbolism or something), after which he intended to head back to New York, where he lived.

Lots of places claim Poe, and for good reason. He belongs to everywhere, but really, he was a Virginian.

14. October 3, 1849

Four days before his death, Poe resurfaced in Baltimore, if you can call the gutter a surface.


Isn't that just basically Baltimore?

13. Poe’s Death

According to the most likely account, he never got it together long enough to explain how a business trip turned into a disoriented game of dress-up before he died on October 7.


Of course it was October.

For we knew not the month was October,
And we marked not the night of the year


12. Poe’s Cause of Death

Cracked's attempts to fit this story into its usual bite-sized countdown chunks seem forced here, so I'm skipping a few; basically, the next several points involve the various ideas about what might have led to his death. There are a lot of them, and as far as I can tell, none of them really fit.

My personal theory? He fell ill from gothic ennui.

The only person who saw Poe alive after he was brought to the hospital was Dr. John Moran, who kept changing his story.

Keep writing unreliable narrators and you, too, can have your last days confounded by an unreliable narrator.

You know what would be really helpful here? An autopsy. A death certificate. Any records of any kind. None have survived, if they ever existed, and no autopsy was ever performed on a famous writer who died a bizarrely mysterious death. We don’t know, guys. Our money’s on the hospital administration.

Well, this is America we're talking about. Most likely he died of shock after seeing the hospital bill.

To die laughing must be the most glorious of all glorious deaths!
         —E. A. Poe
November 22, 2022 at 12:01am
#1040914
Nope.

I Spent the Winter Solstice in One of the Darkest Places on Earth  
During the phenomenon of polar night, parts of the Arctic don’t see the sun for weeks or months at a time. The darkness drives some people insane, but for others, it opens a gateway into wonder and peace.


Well. A qualified "nope," anyway.

About eight years ago, I stepped through the unlocked door of a 1915 cabin-turned-chapel in Wiseman, Alaska, an Arctic settlement of about a dozen people roughly seven hours north of Fairbanks.

I looked up Wiseman when I found this article. The "dozen people" thing appears to be true, though some sources say less. Oddly, there is a bed and breakfast there, called Arctic Getaway.

It is quite literally in the middle of nowhere.

The pastor, who had lived in Wiseman for decades, described the inexorable march of darkness as a force both terrifying and beautiful. She spoke of chopping wood, preserving berries, and squeezing the joy out of every moment of daylight before a winter in which, for more than a month, the sun never rises above the horizon.

That's the actual definition of "existing north of the Arctic Circle." You also get a month or so of permanent daylight in the summer. Given my complicated relationship with the accursed daystar, I'm not sure which is worse.

The notion of such sustained darkness in a remote corner of the planet unnerved me. Residents of the Arctic tell stories of people losing their minds in the black of polar night. But I also felt strangely curious—and drawn to return one day.

I, too, admit to some curiosity. But not enough for me to actually go haring off to the Arctic. It's cold and there's probably no internet. On the plus side, there's the aurora borealis. I wouldn't mind seeing that once.

It’s not exactly easy to get to at any time of year and services like hotels and transport are few.

Well, there is that B&B. And being Alaska, don't they all get around by small aircraft? Also, Maps shows an actual state road going through it (apparently built to support the Alaska Pipeline), but I can't be arsed to see if it's passable in the winter.

I did, however, note that there's a place just south of Wiseman called Coldfoot. I immediately assumed that this referred to frostbite, but Wikipedia has other ideas:

Coldfoot is a census-designated place in Yukon-Koyukuk Census Area in the U.S. state of Alaska. The population was 34 at the 2020 census. It is said that the name was derived from travelers getting "cold feet" about making the 240-some-mile journey north to Deadhorse.


So apparently there is also a town (or whatever you want to call it) named Deadhorse. Okay, Alaska.

But last summer, a friend forwarded me an email about a tiny off-grid six-person retreat center that had just opened outside of Wiseman. The owners were hosting a week-long trip that included yoga and exploring the Arctic wild with skis, snowshoes, and dogsleds, and the dates fell right on the winter solstice.

Nope, nope, nope, and nope. Also nope. But at least it's (probably) not hot yoga, though I'm not sure if freezing-your-ass-off yoga would be any better.

I’m not exactly a cold-resistant creature: I’ve suffered from hypothermia multiple times and frostbite that turned my feet white and wooden. I’m generally dressed in a sweater and jeans when my friends are wearing shorts and flip flops. Even at much more temperate latitudes, seasonal affective disorder runs in my family.

I consider anything below 70F to be "cold." Anything below about 60F is "too damn cold." Like, tonight, it was around 40F and in order to take my recycling to the curb, I had to put on my battery-powered vest, scarf, heavy coat, and ushanka hat. I experience seasonal depression, too, but it's not the darkness that would stop me; it's the temperature, and, again, I can't emphasize this enough, no fucking internet.

I also contemplated the wisdom of traveling during a pandemic, and the carbon emissions of flying long distances.

Look how virtuous I wanted to be, but I did it anyway, tee hee.

Soon after arriving, I tugged my snowpants over my jeans, donned both my down jacket and an insulated parka, and pulled on my warmest hat for a short walk.

Whyyyyyy?

The cold blew through it all in seconds. My eyelashes froze and my nose hairs crinkled. The liquid on my eyeballs felt like it was turning to slush. Even the slightest breeze lacerated my cheeks, and my mind felt tight with a barely concealed panic.

Look, I'm not going to berate anyone for stepping outside their comfort zone. I need to do it myself, from time to time. But there's leaving your comfort zone, and then there's going to goddamn Alaska in cocksucking December.

Between November 30 and January 9, the residents of Wiseman, Alaska, do not see the sun. They lose about 12 to 15 minutes of light each day until the solstice and then gain it back just as quickly. The future always looks scarier from the confines of imagination, and polar night was not so unnerving once I was in it. It was actually brighter than I anticipated—locals like to say that on the winter solstice, there are still five hours when it’s light enough that you can’t see the stars.

That's the thing a lot of people don't get about the Arctic Circle (or its southern counterpart). Sure, there are stretches of time when the sun is below the horizon, but depending on how far toward the pole you are, this can be basically a really long twilight.

Not that I've ever ventured that close, myself. The furthest north I've ever been was southern Scotland, unless you count the path my plane took to get there (which happened much closer to the summer solstice, so I could see the midnight sun shining through the plane's windows).

Anyway. The rest of the article attempts to wax poetic about the author's experience and, while I can appreciate the language, every other sentence just made me yelp "nope" again.

One night, I wandered out of my cabin, wrapped in a sleeping bag I had brought just in case, and watched slack-jawed as the northern lights whirled across the dome overhead like a luminous river. After many days, the formidable peaks of the Brooks Range finally disrobed from their mantle of clouds and shone resplendent in the moonlight.

That bit, though... that almost makes me want to visit.

Almost.

At least there's pictures and descriptions that I can see and read because I have a freakin' internet connection in a heated house.
November 21, 2022 at 12:02am
#1040879
One of the more interesting branches of folklore is kidlore.

Why Did We All Have the Same Childhood?  
Children have a folklore all their own, and the games, rhymes, trends, and legends that catch on spread to many kids across time and space.


Though I'm pretty sure some of the stuff circulating in school when I was a kid would get someone in huge trouble nowadays.

For example, we had a song we sang to the tune of "Battle Hymn of the Republic:"

Mine eyes have seen the glory of the burning of the school
We have wiped out all the teachers, we have broken every rule
We got sent up to the office and we shot the principal
Our school is burning down
Glory, glory, hallelujah!
Teacher hit us with the ruler
Glory, glory hallelujah
Our school is burning down


...yep, these days that would result in forced therapy at best and juvie at worst. Just for singing it.

You might not think of typing “BOOBS” on a calculator as cultural heritage, but it is.

Definitely. We also would type 7734 which, when turned upside down, said "hell." This was the height of hilarity in like 4th grade. In our defense, this was very early in the history of pocket calculators. They ran on 9V batteries and the screen was red LED, and most of them could do no more than add, subtract, multiply, and divide. And spell hell and boobs.
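
For anyone who never owned one of those calculators: the whole trick is that seven-segment digits, viewed upside down, pass for letters. A throwaway sketch (mine, obviously not from the article), covering just the digits we actually abused:

# Seven-segment digits read upside down: 0~O, 1~I, 3~E, 4~h, 5~S, 7~L, 8~B.
UPSIDE_DOWN = {'0': 'O', '1': 'I', '3': 'E', '4': 'h',
               '5': 'S', '7': 'L', '8': 'B'}

def flip(display: str) -> str:
    # Rotating the calculator 180 degrees reverses the digit order
    # and swaps each digit for its letter lookalike.
    return ''.join(UPSIDE_DOWN[d] for d in reversed(display))

print(flip("7734"))   # hELL
print(flip("58008"))  # BOOBS

(80085, the number from the article, doesn't even need the flip; 8, 0, and 5 look enough like B, O, and S right-side up.)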

This sacred communal knowledge, along with other ephemera of youth—the blueprints for a cootie catcher, the words to a jump-rope rhyme, the rhythm of a clapping game—is central to the experience of being a kid.

We didn't call 'em cootie catchers. In fact, cooties were kind of a foreign concept in my school, something talked about like it was from another culture. No, what they're calling a cootie catcher, we called a fortune teller, and there were intricate ways of labeling and using the thing.

When children are together, they develop their own rituals, traditions, games, and legends—essentially, their own folklore, or, as researchers call it, “childlore.”

I like my "kidlore" better. Rolls off the tongue more easily.

Even seemingly more modern inventions, such as the “cool S”—a blocky, graffiti-ish S that has been etched into countless spiral-bound notebooks—are a shared touchstone for many people who grew up in different times and places in the U.S.

The main graphic at the link displays the S in question. I always somehow associated it with the band Styx, even though their stylized logo S was somewhat different.

Indeed, thinking back to the lore of my own youth, I have no idea how my friends and I thought to give each other “cootie shots” with the lead of a mechanical pencil, or why everyone in my elementary-school art class would smear their hands with Elmer’s glue, wait for it to dry, and then methodically peel it off (other than the fact that it was super fun and I would do it again right now if I had some glue nearby). These things were almost like analog memes, micro-bits of culture that seemed to come from nowhere and everywhere.

I'mma stop you right there: "almost like analog memes," my ass. You know what the actual definition of a meme is? Unlike kidlore, we know exactly where and when the word and concept of "meme" came from: Richard Dawkins, who meant it as "a unit of cultural transmission" and the definition got into a dictionary as "an element of a culture or system of behavior that may be considered to be passed from one individual to another by nongenetic means, especially imitation." By the definition of meme, kidlore is memes, full stop, end of discussion.

Labeling funny images with text "memes" came later. I'm not saying it's wrong; definitions change and in the immortal words of Robert Plant, you know sometimes words have two meanings. But saying that a particular bit of kidlore is "like" a meme is ignorant as shit; it is absolutely a meme.

Anyway...

The main way childlore spreads is, perhaps obviously, by children teaching it to one another. Older kids mentor younger ones both at school and at home, where siblings play a vital role in passing jokes and games down through generations.

And most of this lore is stuff that would horrify parents, who will conveniently forget that they participated in the same rituals as kids.

Parents and teachers share nursery rhymes, folk songs, and games with kids, and adults create the movies, books, and TV shows that kids consume.

Yeah, but those are deliberately sanitized, adult propaganda to convey what adults think children should know. Kidlore is different, and often includes elements that would never make it into a Disney movie. Like the song parody above, or one of the many racist chants I was subjected to as a kid.

The author does point this out later, albeit parenthetically.

Although some elements of childlore last and last, others come and go with the culture of the moment. But even then, Willett told me, kids often build on what came before, whether they realize it or not. For instance, COVID-19 has shown up in many kids’ games, including coronavirus tag— which is, of course, built on one of the most classic kids’ games there is. (Roud suspects that in Victorian times, European children played cholera tag or something similar.)

Interestingly enough, the rhyme "Ring around the rosie" was not, as is commonly assumed, about the Plague.

Another example Willett gave, from one of her studies, was a game based on the Weeping Angels from Doctor Who—monsters that can move only when you’re not looking at them.

I just gotta say, that's awesome.

And so, as we come of age, we may lose an understanding of something we once knew in our very bones: that typing 8-0-0-8-5 on a calculator is not just naughty and fun, but important. The rebellious thrill, the intense comradery, the urge to pass the knowledge along (and pretend you came up with it yourself)—all of these things fade with time.

I'm sure there's a lot of it I've forgotten. But I knew then, and I know now, that these things were important. And they were just as much a part of learning as any formal schooling. Rhyming taught me wordplay. Figuring out how to make a simple pocket calculator say a naughty word led to my interest in computers (now I can make them say ALL the naughty words). Folding and scribbling on a fortune-teller led to an interest in random numbers and topology.

The racist chants, though, those I could have done without... but I guess they taught me that some people are best avoided. By which I mean the ones perpetuating those memes.
November 20, 2022 at 12:01am
#1040853
One of the reasons I've been randomly digging into the archives on occasion is that things change, especially me.

"Rarity is from mid-2009, and it made me wonder just how much some things actually change. The entry was done in what would eventually become my standard format, commenting on an article I found on the internet. Being from 2009, though, the actual article is gone, but hopefully there's enough in my entry to get the general idea: some employers were allegedly having trouble filling positions.

Like I said, made me wonder just how much some things actually change.

In 2009, the US was in the middle of the Great Recession, marked in part by high unemployment rates. It peaked then at about 10%, and that number would steadily decline over the next several years until it shot up again in early 2020 (gosh I wonder what happened then)—but we couldn't have known that at the time. The current value is around 3.5%, close to what it was in the Before Time.

With an unemployment rate that low, I can understand employers not being able to find workers. At 10%, though? Well, I guess the original article was about professionals and trade workers, not unskilled labor.

What stood out for me in my original entry above was this:

Businesses try to create a glut of workers so that they can have more control over the workers. Supply and demand, folks. If there are 30 engineers competing for one position, they'll go with the competent one who can live on the lowest salary. Just business. If, however, there are 30 possible positions for each engineer, the engineer's in the catbird's seat.

I'm not sure why I phrased it that way now, implying that businesses are creating workers somehow. I mean, I guess they do to some extent when they go to the press with "we have a shortage of [nurses|engineers|welders|whatever] so kids should go learn these trades!" But I can't see that making a huge difference.

What I probably should have said was that businesses like it when there's an abundance of workers. And I still think that way. Oversupply of workers leads to management deciding they can make more demands of them, while a scarcity of potential employees means someone looking for a job can be the one doing the demanding.

Embarrassing moment for me in the entry:

Let's play SAT test (don't worry; no math in this one).

Oh, I'll need to pull cash from the ATM machine and use my GPS system to get to the testing facility. Oy. Sorry about the redundancy. I make mistakes from time to time. I know, I know; it was hard to believe when I wrote yesterday that I don't always get everything right. But this is proof.

In any case, I was struck by the similarities between the 2009 article (at least what excerpts survived in my entry) and a lot of the rhetoric businesses are spouting today. I tried to find a similar article from this year or at least last year, to see what "the hardest jobs to fill" are right now, but my search came up with nothing relevant. I did find this, but there's no date on the page (apart from a copyright that probably updates every January 1), and the salaries listed appear to be from 20 years ago. So I have no idea what the hardest jobs to fill right now are.

I suspect "blogger" isn't on the list, though.
November 19, 2022 at 12:02am
#1040820
Today's article is about avid readers, which I'm sure all of my readers are. Else you wouldn't be reading this.

Words we think we know, but can't pronounce: the curse of the avid reader  
Do you know how to say apropos? What about awry? We want to know which words you’ve mispronounced – and how you found out your mistake


This is from a couple of years ago, but that's probably irrelevant. I just found it last month. What is relevant is that it's an article about English pronunciation in the Guardian (British) written by an Australian woman, and it's well-known that the US, the UK, and Oz pronounce certain words different ways. "Privacy," for example. I think only the US pronounces that with a long i. So just keep that in mind.

When I mispronounced tinnitus (ti–nuh–tuhs is correct, ti-nai-tis is not) recently and was kindly corrected, my embarrassment was a fraction of when I said apropos (a–prow–pow instead of a-pruh-pow) to a large table of people in London when I was in my 20s. That day I was not kindly corrected, but only realised my mistake after howls of laughter and a whispered, “Maybe that’s how they say it in Australia?”

Now, see, I always thought it was ti-nai-tis. Some sources say both are correct. Officially, the first syllable should be emphasized. If enough people pronounce it the "wrong" way long enough, though, it becomes an alternative pronunciation.

As for "apropos," well, at least she didn't pronounce the s at the end, right?

Since then, I have learned that mispronunciation is often the downfall of people who read widely as children and form the incorrect pronunciation in their mind before actually hearing the word said aloud.

You know what? That shouldn't be embarrassing. It means you read. What should be embarrassing, but too often isn't, is the polar opposite: when you try to write something as it's pronounced, and you spell it wrong. I worked with a guy who kept writing stuff like "part in parcel" (should be "part and parcel"); "save and accept" (save and except), and "beckon call" for beck and call. Those errors aren't proof of illiteracy per se (he would write "per say"), but they do indicate a lack of interest in reading. And don't get me started on affect/effect.

What's even worse, of course, is mixing up things like it's and its; there, they're, and their; and your and you're.

Now, I'm not saying I always get everything right. Far from it. Only that I have more respect for people who mispronounce things because they read a lot than (not "then") I have for people who misspell things because they hardly read.

My ex-wife, for example, pronounced "picturesque" like "picture-skew." I thought it was adorable and never corrected her. Though in hindsight I should have used it against her in the divorce.

In short, I'd rather deal with someone who mispronounces "apropos" than with someone who writes it "apropoe."

Annals (not ay-nals), Hermione, misled (does not rhyme with thistled) and glower...

Look, it should be shining obvious that annals isn't pronounced the same way as anals. No one in the US knew how to pronounce Hermione until the first Harry Potter movie came out. I never thought misled rhymed with thistled. As for glower, well, honestly, I never was very sure about that one so I avoided saying it (turns out it's pronounced like flower).

A colleague pronounced facade with a k sound, another thought burial rhymed with Muriel and yet another was mortified to discover that segue was not pronounced seeg.

At least two of those are a result of not knowing the French influence on English.

English pronunciation can be tricky like that, anyway. We've borrowed so many words from other languages, words where you have to know a bit about the language to pronounce them correctly. Like, if you see the word "sake," you need to know if it's preceded by "oh for fuck's" or if you're talking about delicious Japanese rice wine.

French words very often leave English-speakers flummoxed. I’ve heard canapés pronounced in quite creative ways, and amuse-bouche, prix fixe and hors d’oeuvre have seen the odd food lover come a cropper.

Before I started learning French, I had a lot of fun deliberately mispronouncing French words. Canapés became can o' peas, for example, and hors d'œuvres became, to my vast personal amusement, horse doovers. And the surest way to annoy a French person is to say "Par-lezz-vouse fran-kais"

What word have you always mispronounced?

The article recommends commenting there with an answer to that. I wouldn't advise it as, again, this is over two years out of date. It also recommends tweeting same, which I definitely don't recommend right now.

But if you want to give me your examples below, feel free. Me? I don't know which words I'm mispronouncing. If I did, I wouldn't mispronounce them anymore. I know I used to think that rappelling (descending a rock face on a rope) was like rapple-ing, but once I was laughed at and corrected I said ra-PELL-ing like you're supposed to. But that was back in high school.

I guess what we need is a verbal version of spell check. Something that makes a red squiggly line appear in your vision when you're about to mangle a word that you've only ever seen in print. Alas, we'll have to wait until cyborg technology is more advanced for that.
November 18, 2022 at 12:02am
#1040776
Hell of a hazard.

A Woman Was Caught Whacking a Golf Ball into the Grand Canyon, and the Feds Aren’t Happy  
The latest story of a tourist behaving badly in a national park is a real head scratcher


Source is (shudder) Outside magazine. I still don't know why I keep reading their stuff.

Somewhere in the dark recesses of my memories lives my long-forgotten teenager sensibilities. This is the version of myself that delighted in immature pranks, like toilet papering a classmate’s cottonwood trees and playing ding-dong ditch.

Both of which are annoying but relatively harmless. If no one was injured, a school wasn't evacuated, and nothing caught on fire, you were a goody-two-shoes.

I'm not admitting to anything, by the way. Just saying.

I’ll admit it: my teenaged self would absolutely understand the allure of whacking a golf ball off of the side of the Grand Canyon and watching it disappear into the chasm below.

Okay, so true story: they taught us the basics of golf in sophomore gym class in high school. As I recall, we split up into pairs and each pair got a golf club (don't ask me what its number was or whether it was iron or wood) and a wiffle ball the size of a regulation golf ball. The idea was to learn our swings and recover the ball easily.

I was paired up with the class stoner, who, with a level of perception and intelligence only displayed by a high school stoner, found the one real ball in the box of wiffles. One of each pair of students teed up, and on the coach's command, wound up and swung.

Everyone else's ball caught a bit of air and then dropped down to bounce sadly on the grass. Ours, however, made a perfect golf-ball arc through the air and ended up 300 yards downrange.

Coach got up in our faces. "WAS THAT A REAL BALL?"

Stoner nodded. (Like anyone could have done that with a wiffle.)

"YOU GO GET THAT RIGHT NOW."

So we trudged through the bushes. As soon as we were out of sight, my teammate produced a joint from his pocket and sparked it.

Like I said, perceptive and intelligent.

As I recall, we did find the ball, and both got lousy grades in that gym class. Which was the only class my parents didn't care what grade I got in, so it all worked out for everyone involved. Coach got to get in someone's face; stoner got to get high in class; and I got a story.

Anyway. The relevant thing is that, while I certainly got up to some shady shit in high school, I don't think I ever considered whacking a golf ball off the rim of the Grand Canyon. For one thing, it's 2100 miles away. For another, I don't play golf (it generally requires being *shudder* outside).

On Thursday, National Park Service officials posted an update on the Grand Canyon’s official Facebook page about a woman who filmed herself hitting a golf ball into the canyon, which she then uploaded to TikTok. In the video, the woman also loses the grip on a golf club and flings it off the cliffside.

If it wasn't recorded and posted on social media, it didn't happen.

Now, look. The problem with lofting a golf ball into the Grand Canyon isn't that it might hit someone. The chance of that is infinitesimally small, though admittedly if it did happen the consequences would be terrible. No, it's that it might inspire other people to do it. TokTik trends are a thing, and I can definitely see the "Grand Canyon Hole In One Challenge" going viral. With that many balls flying through the air, the chance of hitting someone increases significantly... as does the amount of litter, which is the real problem here. (The article does mention all this later).

Officials acted swiftly, and with the help of the general public, were able to track down the woman.

Snitches.

At Outside, we come across a litany of stories of people behaving badly in the outdoors, and this year has been a busy one.

More reasons not to go into the outdoors.

There were the high schoolers who booted a football off of Colorado’s Uncompahgre Peak...

Touchdown!

...the dudes who were photographed scrawling graffiti in chalk on a rock at the Grand Canyon...

At least it was chalk and not spray paint?

...and the never-ending march of tourists getting too close to animals at Yellowstone National Park.

Those, I have mixed feelings about. I mean, it might injure the poor animal if it attacks and kills the stupid human, but other than that, it only harms the stupid human. So while it's true that one shouldn't get cuddly with a Yellowstone grizzly, bison, unicorn, or whatever they have out there in the wilderness, the penalty is built right in.

Article doesn't mention my favorite Yellowstone idiocy, which is people who see the pretty pools of azure water and decide to go off the very clearly marked trail festooned with danger signs and warnings (you can't tell me what to do! mah freedumz! stoopid government!) to take a dip. Some of those pools are near boiling and have a pH of like 1.5. Here's someone of whom nothing was left but a foot.

I mean, that sort of thing is just par for the course.
November 17, 2022 at 12:01am
#1040741
Speaking of food, I continue to see people raving about sourdough, which apparently a lot of people played with when they were stuck at home over the past couple of years. Me, I just continued buying bread from the bakery as nature intended. Every time I've tried to bake bread, bad things happened.

Anyway, this article isn't about pandemic sourdough bakers; it goes back a bit further than that.

San Francisco’s Famous Sourdough Was Once Really Gross  
Gold miners made themselves sick on smelly, hard loaves.


Long before it became a viral food trend or social-media sensation, American sourdough was surprisingly gross.

So, more like the other meaning of "viral."

San Franciscans proudly trace their city’s iconic bread back to the Gold Rush of 1849.

That's fair, but it should be noted that before people knew what yeast actually was, almost all bread was "sourdough." I think some was made with repurposed beer yeast, but the origins of sourdough extend way before San Francisco was a thing.

The men who flocked to Northern California in search of gold made bread in their wilderness camps not with store-bought yeast, but with their own supply of sour, fermented dough.

Yeah, I could be wrong, but I don't think there was a lot of store-bought yeast there at that time.

Letters, diaries, and newspaper articles written by and about the 49ers, lumberjacks, and pioneers of the American West are full of complaints about horrible and inedible sourdough. Could bad bread really have inspired San Francisco’s most beloved loaf?

Or it could be gold miners protecting their hoard. "Don't come out here. The weather is shit, people are camping everywhere, and the bread sucks." You know, a bit like San Francisco today.

In 1849, when gold miners began arriving in San Francisco, most Americans didn’t bake or eat sourdough bread. American bakers typically leavened their bread with “barm” (a yeast derived from beer brewing) or one of several relatively new commercial yeast products.

Yeah, see? Look, I comment on these things as I go, and it's nice to turn out to be mostly right.

Hm, I wonder... why, yes, that is the origin of the mostly British word barmy.  

These commercial yeasts were easy to work with, didn’t require constant maintenance, and produced reliable results. They also produced bread that appealed to American taste buds.

American taste buds suck. They think Budweiser is beer, pasteurized process cheese food is cheese, and Wonder Bread is bread.

Most 19th-century Americans preferred bread that was sweet rather than sour. According to one 1882 advice book for housekeepers, the “ideal loaf” was “light, spongy, with a crispness and sweet pleasant taste.” Sour bread was a sign of failure. As a result, bread recipes from the period used commercial yeasts along with considerable amounts of sugar or other sweeteners to speed up fermentation and avoid an overly sour flavor.

The ideal bread is a French baguette with a crispy crust. Period.

Sourdough required only flour, water, and fresh air. A sourdough “start” needed care, attention, and regular feedings but offered an inexhaustible, self-perpetuating supply of leavening agent, even in the wilderness.

Some brewers make beer with wild yeast, too. The results are all over the place. Some of them are also described as "sour."

Bread was baked under difficult circumstances—outdoors, over a campfire or hot coals, and sometimes in the same flat pan used for panning for gold—leading to inconsistent and unsanitary results.

But they could have sold it as artisanal sourdough with flaked gold.

Sourdough baked by pioneers wasn’t just gross and unappetizing; it could also make you sick.

Well, duh. Not all the microorganisms floating around are beneficial.

Across the American West, sourdough was considered a food for unmarried men who didn’t know how to cook.

As opposed to married men who didn't know how to cook.

So it’s worth asking: If sourdough bread baked by miners was so terrible, how did it become one of San Francisco’s most beloved foods?

It all came down to the success of the city’s French and Italian bakeries.


Yep. That'll fix it, alright.

By the second half of the 20th century, tourism boards in San Francisco were placing the 49ers at the center of the city’s history, idealizing life on the frontier and playing up links between the Gold Rush and the City by the Bay. San Francisco bakeries joined in, crafting stories about partnerships between bakers and miners and attempting to market the bread nationwide.

And once again, we see that no matter how disgusting something may be, if you market it right, it'll become popular.
November 16, 2022 at 12:01am
#1040706
Yeah... I can relate.



Of course, ordering a pizza takes less than 30 minutes. Sure, you usually gotta wait longer than that for it to show up—Domino's nixed the whole "30 minutes or less" thing years ago, and they suck anyway—but at least you can be playing video games while you wait.

We’ve all fallen for the trap before. Wooed by the promise of pan-seared chicken thighs in 30 minutes, only to be defeated and left overanalyzing what went wrong more than an hour later. Or worse, we’ve thrown some onions in a pan to caramelize while we’re searing a batch of burgers, only to find ourselves still stirring the onions dejectedly, 45 minutes later.

It's not just recipes, either. "Minute" rice can take way longer than a minute to cook.

It’s right there, staring at me. Cook time: 30 minutes. But a closer look at the ingredients says otherwise. Five garlic cloves, minced. One stalk of celery, thinly sliced on the bias. Two carrots, peeled and chopped. One yellow onion, finely diced. There go 15 minutes already (on a good day, with a sharp knife, and no distractions), which doesn’t even account for the five minutes needed to compose myself after tearfully hacking at an onion. And that’s only half the battle, if we’re counting the unglamorous process of washing and thoroughly drying all of those vegetables.

Not to mention the half hour or so you spend cleaning up what your roommate left in the sink and on the stovetop.

Look, I'm a big fan of mise-en-place, and was even before I started seriously learning French. Get all that measuring and chopping crap out of the way before you start cooking, and you're not stuck watching your pot burn while chopping onions in the middle of it all. There will be at least one onion that's started to go bad, too, so always buy more onions than you think you'll need.

I have managed to keep onion juice from messing with my eyes, though, so there's that.

But the problem with mise-en-place is that you're not multitasking, so it usually takes more time to cook something if you're careful about getting everything set up before you fire up the stove.

Recently I fell into a similar trap after being convinced by a trusted blog that 35 minutes was all I would need to make mapo tofu in my Brooklyn kitchen.

Gotta get that humblebrag in. At least in Brooklyn, if you suddenly find you're out of ingredients, it's a much quicker trip to get more than in most parts of the world.

After pulverizing Sichuan peppercorns with a mortar and pestle, peeling and mincing a three-inch knob of ginger, finely chopping half a head of garlic, and rummaging through my dish rack to get enough small bowls and spoons to premeasure the rest of the ingredients, I’d already blown past the 20-minute mark, and I hadn’t even turned on the stove.

The peppercorn thing is way too much work. Ginger is way too much work, too, but it's worth it. And don't get me started again on peeling garlic.

Beneath a fish taco recipe advertised as a “fast dinner for hungry, busy people” in the New York Times, a comment reads, “It’s unbelievably condescending to claim this meal takes 30 minutes. It took me 15 minutes just to make the salsa, 7 for the mayo, 10 to warm all the tortillas, and a full 30 to fry all the fish in batches…. Great recipe, horrifically underestimated execution time, especially for those with kids running around.”

If it's taking you 10 minutes to warm up tortillas, either you're feeding the Mexican army, or you're doing something very, very wrong.

The conditions we’re under have their own matrices of variables. “Part of it is that recipes don’t account for skill levels—such as how fast you chop or mince and the equipment you have at your disposal,” says Kelly Chan, a Queens-based nonprofit analyst who’s often folding dumplings or prepping Cantonese-style stir-fries. Recipes are written with the presumption that all home cooks have speedy, chef-like knife skills to whiz through a mountain of shallots and tomatoes, or that they know how to butterfly a chicken without pausing, washing their hands, and looking up a YouTube tutorial. Even the Instant Pot—widely adored among home cooks for its shortcuts to complex 20-minute pho broths or five-minute steel-cut oats—still needs time to preheat and depressurize, effectively tripling the cooking time in some cases. (But of course no one tells you that, because it’s called an Instant Pot for a reason.)

I've never used an Instant Pot, but my gut told me the name was an exaggeration. Not just that, though, but also the work involved in cleaning it keeps me from buying one (my housemate has one, but rarely uses it, and I'm concerned I'll muck it up). Every time I consider a new kitchen gadget, I mentally figure out how much work cleaning it will be, and usually don't bother. One exception is a blender; those are usually worth the work to clean.

Real cooking proficiency isn’t about whipping things up without a recipe—it’s about reading between the lines of that recipe and knowing when an hour means two hours.

I usually mentally double a recipe's stated cooking time, and it still often runs longer than that. One time, I was trying to make latkes. I knew going in that it would be labor-intensive; that's just the way it goes. What I didn't account for, though, was that the damn things took three times as long to cook as I expected. To be fair, this doesn't always happen; one should use russet potatoes for latkes, and I had to get some other kind because the store was out of russets. (This was around Hanukkah a couple of years back; I guess everyone else was making latkes, too. Everyone's Jewish for latkes.)

My biggest gripe about cooking for myself, which is the usual case, is that I generally think that a dish shouldn't take longer to cook than it does to eat. Sometimes I do it anyway, for practice. But after laboring over a hot stove for two hours and finishing the resulting meal in less than five minutes, I'm left with the distinct impression that I've wasted my time.

Maybe I should buy more frozen dinners.
November 15, 2022 at 12:01am
#1040668
Look, sometimes the Random Number Generator (peace be unto it) gives me the same source twice in a row. But I promise you this one's worth it, because it resolves a question I've had since I first learned what a question was.



I know I've commented on this before. For a while, I was methodically going through every single episode of every Star Trek show, plus the movies, in chronological order. I think the first time I noticed any reference to a toilet (or that would be head, since it's a ship) was sometime in the early 90s, some 25 years or so after the show's beginning.

But, apparently, I'd missed some. We'll get to that.

Without its fantastical future technology, Star Trek would just be a series about people who love sitting in comfortable chairs.

People watch shows like that all the time.

What has wowed audiences for decades are inventions such as the faster-than-light warp drive, the matter transportation system, and the Starfleet human resources nightmare that is the Holodeck.

Someone, I think it was Dave Barry, noted that the holodeck would be humankind's last invention. Personally, I don't think we'll ever invent it. Not because we won't be able to, but because, well, look at the Metaverse. Any real-life implementation of a holodeck will be absolutely loaded with safeguards, to the point where no one will be able to do anything fun with it, it will be a joke, and everyone will laugh at the inventor (who will consequently become a supervillain: "Laugh at ME, will you? Let's see who laughs now muhuahahaha!!!")

But anyone who’s ever watched any Star Trek TV shows, movies, or adult movies probably has some serious questions about how this fictional universe really works – perhaps the biggest being: where the hell does everyone go to the bathroom?

This is, indeed, one of the biggest questions in the universe.

I should note that, at about the same time DS9 was airing, and also around the same time I saw someone in the Trek universe finally acknowledge the existence of a toilet, Babylon 5 (another SF show about another space station that stayed in one place) not only acknowledged it, but set scenes in it.

Anyway, the next bit points out that there was a door labeled HEAD on the bridge of the Enterprise-D. That was TNG, before DS9.

And while the production “did not design or build the inside of that bathroom,” it was still there, just in case Number One had to take a … well, you know. Also, in the crew members' “various living quarters,” there is an unopened door that “we assume led to a bathroom.”

I should also note that this was around the time when, in attempting to answer my own version of the question (and the Trek novels were no help in that regard, either), I finally decided the key had to be transporter technology. Think about it. A transporter, under normal use, records the position and connections of every molecule, atom, proton, electron, and so on in your body (they have "Heisenberg compensators" to handle the Uncertainty Principle) and your clothes and equipment, right? And when you arrive at your destination, your clothes aren't somehow bonded to your body, etc.

Well, part of your body is the shit in your intestines and piss in your bladder. Sorry, but it's true: transporting someone necessarily requires transporting whatever waste products are awaiting exit. And with the transporter able to easily distinguish between different molecules, certainly it can tell shit from shinola. So. All you have to do is program the transporter to image you, then pull out the waste. No need to even strain on the throne; just push a button and boom, it's gone.

Where it goes is... well, let's keep reading. My theory turns out to be wrong, anyway. But it could have been right.

So we know that the Enterprise is equipped with toilets, but do we know what happens to the human waste? Do they make Chief O’Brien beam it out? More sensibly, one would think that the Enterprise crew could simply fire their poop into space from time to time, like a smellier version of Spock’s funeral.

Send it all to the Klingons like Scotty did with the tribbles?

But it turns out that this might be a terrible, terrible idea.

It is, for many reasons, not the least of which is the next starship zipping through the area will come into spacedock with skid marks.

Dr. Siegel speculated that, while none of the Trek shows go into much detail about toilet-based issues, “every bit of eliminated human waste is useful matter that you can reconstitute into something else.” Meaning that any waste created could potentially be used to “power the Replicators” – you know, the device that people of the future use to create small objects, cups of piping hot tea, and food.

Now, look. Some people are going to be grossed out by this idea. But come on. How do you think it works right here on Earth, and mostly without technological interference?

Everything you eat contains atoms that were once part of poo. Everything you drink contains atoms that were once pee. Every breath you take inhales atoms that were in a fart. Hell, worse, they've all been part of dead things. I don't mean fresh dead things like the kind you eat (even if you're vegan), but yummy, delicious carrion... oh, sorry, channeling my spirit animal again.

A replicator would just speed up that process.

Part of civil engineering, though not a part I ever participated in directly, is sewage treatment. You take all that waste and process it, and (in theory anyway) the water becomes clean enough to dump into a river. It's then either evaporated, falling back as rain that you might eventually drink; or it's processed by fish that you might eventually eat. Solid waste is sterilized and becomes fertilizer. Trek would just use the technological evolution of the same sort of processes.

According to Dr. Siegel, this is a pretty solid plan, although admittedly, you have to get over the “gross factor.” Like if you were to replicate a clarinet, à la Harry Kim in Voyager, you’d have to look past how “the atoms making up the clarinet that I'm putting in my mouth and playing right now were defecated out by me and a thousand other crew members onboard the ship.”

Again, the only difference between that and the way things occur naturally here on Earth is scale.

Still, seems to me that in Trek it would be simpler to cut out the whole "plumbing" bit and use the transporter like I said. But I guess that might set things up for some really gross practical jokes.
November 14, 2022 at 12:02am
#1040628
Today's article, another one from Cracked, is a fun (and sometimes funny) exercise in speculation.



And I do mean speculation. Which is okay; you gotta start somewhere, and speculating is better than being incurious.

Humanity's most important question, excluding pop media relationship statuses, Incognito rash inquiries, and whether or how various animals would wear pants, is this: are we alone?

Well, some of us definitely are.

Oh, you mean "we" as a species. Or maybe "we" as a biosphere.

Are there any intelligent aliens we could have a brew and watch the game with, or are they all crabs?

There was an idea floating around a while back that came out as "everything eventually evolves into a crab." This is, of course, crabshit, as a moment's thought about how evolution works to create and fill environmental niches will show. The original idea is that many crustaceans, specifically, eventually take on the morphology of crabs, and there may be something to that. But it does represent one important leap in popular speculation about extraterrestrial intelligence: the idea that the human form isn't necessarily what evolution works toward (it actually doesn't "work toward" anything).

Speaking of which, this article uses "intelligence." I've beaten that dead crab a few times; basically, let's not conflate intelligence with technological capability. And please, please, stow the tired old "but we're not intelligent either" jokes. This article has enough of them.

Chillingly, we may be the gleaming example of advanced life in the entire universe.

Maybe. The Universe is a big place, though. So I doubt it.

In its usual style, the list is in reverse numerical order. Look, it's just their brand.

4. It's Not Just What Other Life Looks Like, But How We See It

Plants are green because they reflect green light. But the chlorophyll that powers their photosynthetic planty prowesses is extra reflective in near-infrared. Sadly, we're limited to seeing the visible spectrum of light, which is a tiny portion of the entire spectrum.


This is, essentially, true. But there's a decent reason why we see the sliver of spectrum that we do, and not something way out in other wavelengths: the relative transparency of water (where our distant ancestors evolved) and air (where our more recent ancestors evolved) to those particular frequencies. Now, some species see somewhat longer or shorter wavelengths, but our red-to-violet vision is more than adequate for what evolution produced vision for in the first place: seeing predators coming, and seeing prey.

The rest of this section goes deep into the speculation bit, and it has helpful images designed to be seen by our puny-sliver-of-spectrum-seeing eyes.

3. Aliens Could Look And Maybe Even Communicate Like Us, Dawg

Regarding convergent evolution, maybe nature isn't as creative as we thought and survival problems "only have a few good solutions."


Again, not borne out by evidence right here on our own planet. Every single living thing right now has been subject to evolution just as long as humans (and crabs) have, and that process has produced wildly varying survival strategies: the nonskeletal bodies of molluscs (octopuses), the opposable thumbs of primates, the claws of crabs, the bills of ducks, the mushiness of jellyfish, ants, trees, and many other features.

Photosynthesis necessitated loads of tweaks to many cell types, so plants produce oxygen and not, perhaps, farts. So, such intelligence as ours may occur on only 1 in 100 trillion habitable worlds. But while there may not be any civilizations in our galaxy, it's quite possible that the Milky Way still harbors tens of billions of planets covered in prokaryotic purple slime.

This aligns with what I've been saying all along. But remember, we have a sample size of exactly one when it comes to "examples of worlds with life on them." It could be that technology (again, not using "intelligence") happens on 1 in 10. It could be 1 in a googolplex. Personally, I suspect it's closer to the latter than the former. We don't know.

2. They might be robots, or robotic brains the size of your city

The problem with organic "wet" brains is that they're limited by size and processing power. Similar challenges are faced by the organic “wet” under-parts that get so many of us in trouble today. But inorganic brains theoretically have no limits of perception or conception, and robotification may be the ultimate destiny of all lifeforms that don't nuke themselves into glowing dust.


This is certainly not a new idea. Our own history of space exploration is "send the robot first." It's entirely likely that if another tech civilization exists, we'll meet their robot probes first. Or they'll meet ours; whatever. The logical extension of that would be consciousness transfer to robotic forms, which isn't remotely possible with our current technology (not to mention we don't really know what consciousness is), but hey, we're speculating here.

1. We May Be All Alone

Or maybe advanced aliens don't look like anything because they don't exist. We may be the only intelligent (ish) life in the universe. Based on statistical models, Oxford researchers say "average evolutionary transition times likely exceed the lifetime of Earth." And the universe is only a few Earth-ages old.


Oh, it's worse than that, though. Life as we know it depends on certain heavier elements. Not just the molybdenum mentioned in a recent blog entry, but something as seemingly basic as oxygen, or the iron that makes our blood work. And such elements just weren't around in the early universe; they have to be forged in stars, supernovae, and events like neutron star collisions. That takes time, so it's not like life as we know it could have begun early on. "Sure, but what about life not as we know it?" Sure, we can just make stuff up.

Still, let's not put too much stock in statistical models, even if they do come from a reputable place like Oxford. Again, we have one data point.

My favorite theory? The one with the greatest potential for mindscrewiness: that aliens may look, or may initially have looked, like us.

That's not a theory. That's more speculation. I also want to take this opportunity to reiterate what I've said in the past: there is no universal law of evolution that requires the emergence of a technology-using species. Plenty of species get along just fine without building computers or rockets.

They may even address us in English, which isn't as crazy as it sounds. If they can traverse space, why couldn't they learn our language by jamming to Spotify in Moon orbit?

More likely they'll be speaking Mandarin Chinese or Spanish; more people speak those. As for what they'd listen to, I'll just note that the radio waves with the greatest chance of punching through Earth's atmosphere are in the FM band.

So I really, really hope they're listening to NPR and not some morning show shock jock.

When they do show up, just stay out of range of their pincers and you should be fine.
November 13, 2022 at 12:01am
#1040597
Once again, taking a look at a past entry. This one's from way, way back in October of 2008: "Never ends."

It's hard for me to remember last week, let alone that far back. I used Wikipedia to remind me of stuff that happened around the time of the entry. At that point, George W. Bush was still President (look, I'm not getting political here, just stating facts), and we were still a month away from electing Obama. The Great Recession, as it came to be called, had just begun; Bush had, a few days earlier, bailed out some of the banks involved. I was still working, still married. None of that mattered to me on October 7-8 because, apparently, on those days, I was in severe pain.

This was, obviously, from a very different time in my life, so I'll try to take it step by step.

Up until a week and a half ago, I had that nasty, unrelenting back pain and sciatica, the only relief for which was lying down and taking lots of meds.

Oh, yeah. I do remember how bad my back pain could get back in the noughties. At some point, I got steroid shots for it, and it got better. You know, those great big needles that they insert into your actual spinal nerves? Yeah. They suck. But not as hard as back pain.

I was okay for a week, then. I mean, I still had twinges in my back and leg, but nothing major.

Memory of pain is weird. It all blends together for me. Back and neck pain was just part of my existence back then. But this particular episode stands out as being utterly incapacitating.

Then, as I reported here, I got sick on Sunday, cutting short our anniversary celebration. This continued through Monday.

Hm. Somehow I had it in my head that our anniversary was closer to the end of October. In my defense, once she dumped me, I could release that date from long-term memory storage, so I did. I couldn't find anything in the archives about an anniversary celebration, only about everyone in the house, including the cats, being sick.

Monday night, I slept for a few hours, then was wide awake for a few hours, until maybe 15 minutes before my alarm went off. When I woke up, my neck and shoulders were stiff. No big deal, except my stomach was still upset, so I went to work. I left work early afternoon, figuring what I needed was to lie down with some heat on my neck.

No, I couldn't call in sick. Hard to do that when you own the company and don't have employees (not at that time anyway). The effects of the Great Recession on the business hadn't taken hold yet. Can't recall what projects we were working on that month, only that we were still able to make money.

So I heated up a neck thing and went to lie down, and pain exploded between my shoulder blades such as made the worst pain I experienced with my lower back (not to mention appendicitis) seem like a pleasant day in the Caribbean.

Like I said, memory of pain is weird. If I were to rank my pain as I remember it now, that day would only be about #4, behind the appendicitis and my heart attack (which happened later) and that time I got stung by a whole nest of yellow jackets (which happened in the eighties).

I couldn't move. Oh, I could move my legs and, to some extent, my arms, but I couldn't sit up or roll over. I couldn't even play dead because I kept looking for a position that minimized the pain.

Ha! "play dead." I crack me up.

My mobile phone was not nearby, so I couldn't reach it to call anyone. Every time I tried, I felt like someone was pushing a knife into my upper spine.

I do remember this particular episode of pain. Until I found this blog post at random, though, I couldn't even have guessed at the year, only that it had to be sometime in the noughties because my wife was involved.

I think I dozed off for a while. My phone rang. I had no way to get to it. I could only hope that my wife would come home before she went out to dance practice.

These days, I'd be utterly boned. Someone would find my emaciated, cat-chewed corpse.

Fortunately, she did. Unfortunately, she had no way of moving me. Fortunately, one of our close friends is a chiropractor. Unfortunately, the chiropractor was still at work. Fortunately, we were able to leave a message. Unfortunately, the ditzbrain who took the message didn't give it to her. Fortunately, I called her mobile phone an hour later to see if she got the message. Unfortunately, she hadn't. Fortunately, she was still able to come over and fix it so I could at least stand up - albeit with intense pain.

Remember a week ago I said some of these entries made me cringe? This is one of them. It's a bit embarrassing to me now. The idea of going to a quack to crack my back wouldn't fly with me these days. Sometimes you have to learn these things the hard way, I guess. It's entirely possible that chiropractic was the actual cause of much of my back pain in that era, though obviously there was some short-term relief from it.

Once I stood up, holding my head straight and not twisting or raising an arm, I was okay. We got back to the chiropractic clinic and she worked on me some more on the table. Then she said I couldn't get on the computer, so I sat with ice on my upper back.

Like I said, short-term relief. I haven't been to a chiropractor in well over ten years, and I rarely have these bouts of pain anymore. The one time I remember since then was neck pain coinciding with my month-long trip to Maui in... 2017, maybe? Some February in the teens. Really cramped my style; it's hard to snorkel when you can't move your head around to see where you're going. Bouncing around on the roads wasn't pleasant, either. At least there was copious alcohol.

I can only imagine how antsy I was without being able to compute. I don't think I've gone a day since 1979 (with the possible exception of a couple of vacation trips) without using some computer, somewhere, for work or school, or the internet or gaming. Not even that day; I would have used one at work the day of the incident, and obviously I was using one to make the blog post about it the following day. The thought of going without a computer for so much as a day fills me with the dread of possible boredom.

And look, I'm not trying to come down hard against alternative medicine. And I'm certainly not dissing my friend (I still call her my friend even though we've barely seen each other in the last decade or more. People drift apart; it happens.) It's just that these days, I need more scientific evidence before trying a course of treatment. Chiropractic has been shown to work for several things, but it's also been reported that there's a risk involved with adjustments, especially neck adjustments. To me, right now, the risk isn't worth the benefit.

I might change my mind if I find my neck in that much pain again, though. People in general will go to great lengths to make pain go away, especially hedonists like me. I'd be like "Fine, even having a stroke would be better than enduring this much agony."

But it was around that time that I came up with this:

I used to say "My back is killing me!" Now I say, "My quack is billing me!"
November 12, 2022 at 12:01am
#1040558
Hope you're not hungry.



"Delicious" is, of course, a matter of opinion. The headline would have probably been too long if Cracked had qualified that, though.

As usual with such a long list, I'm only going to share a few of these.

15. Lobster

Lobster was so plentiful in the areas colonized by early Americans that they stopped eating it as soon as they could. Only prisoners, the poor, and livestock -- which were pretty much considered the same things -- would deign to eat it, and it was even used as fish bait.


You know, I've been hearing this for so long and with such certainty of delivery that I started to question it. So I checked a source that's marginally more verifiable than Cracked, and discovered that while this probably is the case, in Europe lobster was often associated with wealth before a bunch of Europeans came over here.

So this is more of a case of changing popularity over time, which happens with many foods.

Also, keep in mind that sometimes the wealthy like to eat expensive things just because they can, and because the poor can't. This has little to do with the actual taste of the food. See also: caviar. That shit's disgusting.

14. Chicken Wings

Though now a staple of [sportsball game whose name is copyrighted] parties and other manly gatherings, chicken wings were considered the grossest part of the chicken, to be either unceremoniously thrown out or used for soup stock at the most.


Ah, one of my favorite things to rag on. "Let's take the chicken part that used to be made into dog food and turn it into sports food." Look, they're still the grossest part of the chicken (except maybe the beak and intestines) and are only popular because they're marketed to be. And because of the hot sauce, of course.

12. Foie Gras

There is so much wrong with this entry that I'm not even going to paste it here. Also, everything about foie gras is foul. Pun intended.

9. Peanuts

Peanuts came to America from Africa, and like most delicious African foods, they were immediately dismissed by colonizers as unfit for humans until three things happened. First, the Civil War reduced people to choking down whatever protein they could get their hands on, and peanuts were definitely preferable to rats.


I'm no fan of peanuts, but yes, if I had a choice between peanuts and rats, I'd eat the peanuts.

Then P.T. Barnum began selling peanuts as circus food.

Having been marketed by the Platonic ideal of "huckster" ("sucker born every minute" etc.) is not something that I'd use to recommend a product.

Finally, peanut butter happened. Even the most frothing bigot can’t resist a spoonful of peanut butter.

Admittedly, I'm okay with peanut butter. I still don't know why I like peanut butter (but only the real kind, not the candy kind like Skippy) and not peanuts, but I never claimed to be entirely consistent.

7. Mushrooms

The Western world shunned mushrooms on account of their tendency to make you see God and/or kill you until the French insisted they were the height of cuisine in the 18th century


I do like (commercially available) mushrooms. I know several people who can't stand them, mostly due to the texture (they say). I can understand that. When you really think about it, eating mushrooms is weird. It's a fungus, neither animal nor plant (but, oddly, closer to the former than to the latter), and thrives in shit. There aren't many fungi that we eat. Yeast, sure, but we were ingesting that (as part of bread or fine fermented beverages) long before we knew what yeast actually was.

On the other hand, when you really think about a lot of things that we eat, you start to question them. Eggs, for example. Or:

3. Oysters

The story of oysters is a very straightforward one of supply and demand. They were once so plentiful that Charles Dickens characters looked down on the patrons of oyster houses that lined the London streets one “to every half-dozen houses.” Then we filled the oceans with so much pollution that it was hard to get a good oyster, prices went up, and the rich just equated “expensive” with “good.”


Like I said, sometimes rich people do rich people things just because the poor can't.

Still, you have to wonder how people figured out oysters in the first place. "Let's crack open this rock and see if there's a tasty treat inside."

1. Burgers

There are people who, not unreasonably, would take a juicy burger over the finest steak any day, but back in the Upton Sinclair days, ground beef was seen as unclean at best and possibly containing dude at worst.


It's actually more complicated than that, and ground beef certainly predates fast food. I'm pretty sure, though, that the popularity of hamburgers didn't take off until there was enough supply to make them cheaply, and that required access to electricity to power the grinders.

When I was a kid it confused the hell out of me that hamburgers didn't contain ham. Just another linguistic weirdness of English, in this case with the word deriving from the city of Hamburg, which may or may not have had anything to do with the invention of the hamburger. Nowadays you can talk about beef burgers, veggie burgers, turkey burgers, even cheeseburgers (which aren't made out of cheese), but never ham burgers. And no one calls it a hamburger anymore, either.

Now you're hungry, aren't you?
November 11, 2022 at 12:01am
#1040530
Just like UFOs are only UFOs until they're identified (then they're just FOs), it's only a cryptid until you know what it is.

Beware Montana’s Shunka Warak’in, the ‘Rocky Mountain Hyena’  
Is one of these crafty cryptids on display in a small museum?


Not to be confused with the Rocky Mountain Oyster.

Something has been preying on domesticated animals across the plains of Montana for centuries.

Yeah, wolves.

It has been given many names over the years, below most of which burn angry red squiggly lines when typed into Microsoft Word: Shunka warak’in. Ringdocus. Guyasticutus.

All of which would be awesome band names. Still, "wolf" doesn't freak out my spell checker.

But it’s also been called the Beast and the Rocky Mountain hyena—in fact, any name but wolf, although the creature could easily be called a wolf.

Which is what I've been saying.

Perhaps that’s because wolves were extinct in the state for about half of the 20th century.

Yeah, sure they were.

Here in Virginia, and down into North Carolina, in the Blue Ridge Mountains, people occasionally claim to see a mountain lion (which also has multiple names: puma, cougar, whatever). Officially, mountain lions are extinct in the eastern part of the US. Unofficially, everyone knows there are mountain lions up there. Lamest cryptid ever: unlike the Jersey Devil or the Mothman, we're pretty sure what a mountain lion is.

I'm not saying cougars aren't cool. Just that we lack imagination when it comes to cryptids here in the Blue Ridge.

If only we had a carcass, we could figure out what this creature is once and for all.

Oh, wait. Turns out, we do. It’s on display in a museum in Montana.


There are museums in Montana?

(I know at least two of my occasional readers are from Montana. Relax. I'm joking.)

The article (which is actually a book excerpt, but whatever) goes on to describe how someone actually killed one, and it ended up stuffed and mounted because that's what we do, apparently. Then:

The ringdocus outlasted Sherwood and was on display at least into the 1980s. And then it disappeared.

Probably stolen by Bigfoot.

Or maybe a wolf.

Meanwhile, Lance Foster, a historic site preservationist, paranormal enthusiast, and member of the Ioway tribe, speculated that the beast could be a shunka warak’in, a canid non-wolf beast from Native American lore that would sneak into camps at night and make off with dogs (the name translates to “carries off dogs”).

Okay, fine. Not a wolf. Maybe Bigfoot's dog.

Apparently, they tracked down the taxidermied whatever-it-was (turns out it wasn't stolen by Bigfoot, but just transferred to a museum in the one state even less likely to have a museum than Montana) and put the thing back on display.

Today, the creature is the museum’s most popular exhibit. They just call it the Beast.

And we're back to no imagination.

Or is it just a bad taxidermy mount? Only a DNA test could tell, and all interested parties have decided not to do that. The mystery of the shunka warak’in has gone on so long that nobody wants to risk solving it.

It may be the case that some mysteries are best left unsolved, but in this case, come ON. It's like when they tested hair that supposedly got rubbed off of a Bigfoot, and it turned out to be bear or cow or whatever. People still believe in Bigfoot after all that, because it's hard to prove a negative. (See my entry from last year, "Tales from the Cryptid.") And that's even though we have hard evidence that all the blurry pictures of that particular cryptid were definitely hoaxes.

It would be like refusing to test the DNA from the multiple taxidermied jackalopes in neighboring Wyoming: they just don't want people to think that jackalopes are completely made up.
November 10, 2022 at 12:02am
#1040465
Well, it missed the full moon by a couple of days, but this one finally came up from my queue for me:

Why the Moon’s two faces are so different  
The far side of the Moon is incredibly different from the Earth-facing side. 63 years later, we know why the Moon's faces are not alike.


Article has a helpful picture of both hemispheres of the moon up at the top, and lots of other pretty pictures scattered throughout.

The Moon, by far, is the brightest object and largest object that’s visible to human eyes in Earth’s night sky.

I was about to object to this until I realized that it means that it appears the largest, not that it's the largest object, in absolute terms, that we can see. That would probably be a star somewhere, depending on your definition of "object."

So to satisfy my urge to be pedantic, I'll point out that one can often see the moon in the daytime sky, as well, during certain phases when it doesn't appear too close to the sun.

With even an off-the-shelf pair of binoculars or the cheapest telescope you can find, there are two main features about the Moon that you can’t miss:

That it's made of cheese and it's round?

In addition, because the Moon’s orbit is elliptical, moving faster when it’s closest to Earth and slower when it’s farthest away, the face of the Moon that’s visible changes ever-so-slightly, a phenomenon known as lunar libration. Even though this means, over the course of many months, we could see up to a total of 59% of the Moon, it wasn’t until 63 years ago, when the Soviet spacecraft Luna 3 swung around to the far side of the Moon, that we got our first pictures of the far side of the Moon.

Because of this, most of the features of the far side are named in Russian.

Although it wasn’t very impressive in terms of image quality, it was remarkable for an unexpected reason: the near side of the Moon appears vastly different, in terms of both cratered features and maria features, from the far side that always faces away from us. This discovery came as quite a shock, and for decades, even as our imaging and understanding of this elusive side of our nearest planetary neighbor improved in quality, we lacked an explanation as to why this difference existed at all.

The rest of the article explains just that, and it's pretty cool, not only for the explanation, but for the observations, deductions, and science that got us there. Which is fascinating, but there's not a lot of point in rehashing it here.

Then:

No matter how wild or unusual your idea may be, if it has sufficiently strong explanatory power to account for what we observe, it just might be the necessary idea to solve whatever puzzle it is that you’re considering.

Until some better observations come along, of course, and change everything we think we know. But that, too, is part of the awesomeness of science.
November 9, 2022 at 12:01am
#1040431
Whenever Cracked claims to be "scientific," I always take it with a huge chunk of pink Himalayan sea salt.

Still funny, though.



Of all the things you want control over in your life, who you have sex with probably ranks pretty high.

Right, because not controlling it means someone's committed a felony.

Fortunately, considering that we have not, in fact, descended into a Handmaid’s Tale dystopian nightmare (yet), it probably feels like you do.

We're getting there.

Well, we’re sorry to tell you that free will is an illusion and you’re as beholden to the tyranny of biology as the grubbiest little worm when it comes to who you rub your genitals on.

Well, duh.

Some of these are not so "secretly." I'm not going to go through all 15 here, just some highlights (lowlights?).

14. Whether You’re Ovulating

Ovulation (that is, the phase of a woman’s menstrual cycle when she’s fertile) lowers women’s standards...


Anything that lowers women's standards can only work in my favor.

12. How Much They Smile

This one depends largely on who you are and who you’re trying to get with: Men prefer women who smile more, while women like men who smile less (or even look vaguely ashamed).


This one didn't sit right with me—it is physically impossible for me to smile in the standard bare-your-teeth fashion, and yet I'm somehow not swimming in sex—so I went to the link they helpfully provided, and oh boy.

*FlagR* A note to single dudes: If you're looking to pick up a woman at a bar, whatever you do -- don't smile at her.

I have a hard and fast firm (dammit, there's not an adjective here that can't be misconstrued, is there?) rule about not picking up women at bars. "But Waltz, didn't you just say that you want women with lower standards? What lowers a person's standards more than alcohol? Don't say 'ovulation.'" Yeah, that was what's known in the rarefied circles of advanced comedy as a "joke."

The obvious difficulty here is that if I'm out in public, I'm probably at a bar.

Full disclosure: I did it a couple of times when I was younger; why else do you think I developed that rule?

*FlagR* Researchers asked more than 1,000 volunteers to rate the sexual attractiveness of hundreds of images of the opposite sex.

Images don't cut it. Smiles, and other expressions of emotion, are generally fleeting, unless you work for a retailer and thus have to have one plastered on your face at all times. Now, maybe an image can give someone a good or bad first impression, but I suspect I'm not alone in wanting to see more body language—even if I'm terrible at reading it.

*FlagR* (All were heterosexual, ages 17 to 49 years, with a median age of 21. Fifty-two percent of participants were Asian, and 48 percent were Caucasian.)

I think a few demographic groups are missing here. While it's irrelevant to me what gays, for example, prefer to see in such a study, I'm sure a lot of people do want to know that. What is relevant to me is if that still holds true at age fifty-something. I suspect that, like most studies of this nature, the majority of the guinea pigs were university students (or possibly teachers/researchers in the case of the older ones) who got enough bread for a couple of pints out of the deal.

*FlagR* They found that women ranked the smiling guys as less attractive -- but they were into the prideful and ashamed men. But the male participants were most attracted to the smiling women, and least attracted to the ones who seemed proud.

Missing some instances of "most" here. I seriously doubt everyone had the exact same reaction. It's like asking people what their favorite candy is. Most people say "Reese's cups." I despise Reese's cups.

Anyway, enough of that. Suffice it to say this is one instance of me needing that huge chunk of pink Himalayan sea salt.

(How sea salt got up into the Himalayas, I leave as an exercise for the reader.)

11. Whether They Touch Your Arm

Lightly touching a woman on the arm makes her more likely to agree to dance or give out her phone number because touchers are considered more attractive and (sigh) more dominant.


Touching a woman on the arm (or anywhere else) if you don't know her that well is a good way to get mace in the face. Or so I'm told. Maybe that's just me.

7. Genetics

You have a secret superpower, and it’s sensing immune profiles. (We never said it was cool.) Women prefer the smells of men whose genetic immune profiles are the most different from their own, which is helpful for your future offspring but also to everyone hoping to avoid a distant cousin.


Or, sometimes, people just stink.

You can't avoid a distant cousin, by the way. Close relatives, maybe, but everyone who isn't a close relative is your distant cousin. And I've seen other "studies" that imply that people too distantly related won't be attractive to a given individual.

Case in point:

4. What Your Parents Look Like

The Freudians weren’t right about much, but we do gravitate toward people who look like our preferred-gender parent. This would seem to counteract the whole “genetic diversity” thing...


I have no idea what my genetic parents looked like (other than that they probably resemble me, if they're still around). Who I find attractive has historically been all over the place, though, and I can't think of anyone I looked at and said, "oh, wow, she's short with black hair just like my mom!"

I mean, ew.

3. Your Parents’ Ages

Similarly, people raised by older parents tend to have older partners...


No idea if this is genetic or environmental. My adoptive parents were a lot older than me, too old at the time I was adopted to have given birth to anyone. And while my first wife was two years my senior, my second was nine years younger (fun fact: they were both 27 when I married them).

1. Who You’ve Had Sex With Before

Think you don’t have a type? Wrong. You might not have started out with one, but one of the biggest factors of our perception of beauty is familiarity. We prefer faces similar to those of our friends, loved ones, and yes, exes, because we associate those people with good times.


Yeah, no, not in my case. Women I've dated tend to be all over the spectrum in terms of height, hair color, body type, etc. The one thing they all have in common, the one characteristic that could be considered "my type" is that they were all batshit crazy.

Which I'm aware says more about me than about them (me being the other thing they all have in common). Which in turn is one reason I'm single.

That and my refusal to pick someone up at a bar.
November 8, 2022 at 12:01am
#1040395
Full moon tonight. Also a lunar eclipse, but one that's too late to stay up for and definitely too early to wake up for.

So instead, we get to read about the Incredible Shrinking Brain.



Given the state of the world right now, I would think "thirty years ago" rather than "three thousand years ago."

Did the 12th century B.C.E.—a time when humans were forging great empires and developing new forms of written text—coincide with an evolutionary reduction in brain size?

Well, it's not in the headline; it's the lede. The answer is still "no."

Think again, says a UNLV-led team of researchers who refute a hypothesis that's growing increasingly popular among the science community.

One thing I should note: brain size isn't strongly correlated with intelligence. If it were, elephants would be ruling the planet. While I'm convinced they're intelligent, it's not like they've invented the internet, sliced bread, or beer.

Last year, a group of scientists made headlines when they concluded that the human brain shrank during the transition to modern urban societies about 3,000 years ago because, they said, our ancestors' ability to store information externally in social groups decreased our need to maintain large brains.

That's... not really how evolution works, anyway.

Their hypothesis, which explored decades-old ideas on the evolutionary reduction of modern human brain size, was based on a comparison to evolutionary patterns seen in ant colonies.

"What is this, a study for ants?"

That by itself would raise a bunch of red flags for me.

"We re-examined the dataset from DeSilva et al. and found that human brain size has not changed in 30,000 years, and probably not in 300,000 years," Villmoare said. "In fact, based on this dataset, we can identify no reduction in brain size in modern humans over any time-period since the origins of our species."

The date for the appearance of what's called "anatomically modern" humans, from what I've read, is about 100,000 years ago.

The UNLV team says the rise of agriculture and complex societies occurred at different times around the globe—meaning there should be variation in timing of skull changes seen in different populations. However, DeSilva's dataset sampled only 23 crania from the timeframe critical to the brain shrinkage hypothesis and lumped together specimens from locations including England, China, Mali, and Algeria.

The only information we have on human brains from that long ago is cranial capacity, which puts an upper limit on brain size. But, again, the relationship between brain size and intelligence isn't absolute.

I'm linking this article because, while I didn't see reports on the original study (the one that claimed brain size reduction 3,000 years ago), apparently, lots of people did. And when a study like that comes out, it's natural for people to want to share it, especially if it fits in with their worldview. In this case, it would be the (demonstrably false) worldview that we're stupid and never should have invented agriculture or civilization. Or possibly that we shouldn't be relying on what I like to call "auxiliary memory" (my smartphone) or we'll become even stupider. (Incidentally, there's a picture making the rounds purporting to show a misshapen human with a cell-phone-holding claw and a crooked neck and calling it "the future evolution of humankind"; that's also utter bullshit.)

It's like when a nutrition scientist says "dark chocolate is good for you," and people crow about it while stuffing their faces with Godiva, and then never hear that the study was suspect because it was funded by Willy Wonka.

So I'm here setting the record straight to the best of my ability. Now you can be That Person at the party who, upon hearing someone confidently proclaim that our ancestors' brains shrank 3,000 years ago, can go, "Well, ackshually..."


© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
