Printed from https://www.writing.com/main/profile/blog/cathartes02/month/12-1-2024
Rated: 18+ · Book · Personal · #1196512
Not for the faint of art.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
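To make that concrete with a sketch of my own (the function name and iteration cap are mine, not anything standard): the transformation behind the most famous of those fractals, the Mandelbrot set, is nothing more than "square the number and add a constant," repeated.

```python
def escape_time(c, max_iter=100):
    """Iterate z -> z*z + c from z = 0 and return the step at which
    |z| first exceeds 2, or None if it stays bounded. Points whose
    orbit never escapes belong to the Mandelbrot set."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return None

print(escape_time(0 + 0j))  # None: 0 stays bounded, so it's in the set
print(escape_time(1 + 1j))  # 1: escapes almost immediately
```

Coloring each point of the complex plane by its escape time is what produces those intricate, astonishing images.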





December 31, 2024 at 9:32am
#1081753
Well, here we are at the end of another trip around the sun (or close enough), at a purely arbitrary date on a purely arbitrary calendar.

I had what I consider to be the beginning of the new cycle ten days ago, on the solstice. If anything should mark a transition on what's really a continuum, let it be something real and measurable. But most of the world uses the Gregorian calendar for recordkeeping and consistency (it is, I'll grant, remarkably good at calculating solar returns), and it's one of the few things most of us share. So if we want to impose meaning on December 31 / January 1, fine. Impose it. At least it's usually celebrated with two of my favorite activities: drinking, and staying up late.
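As an aside on that "remarkably good" claim: the whole Gregorian reform amounts to one small rule, and a quick sketch (my own, in Python) shows how close it lands to the tropical year of about 365.2422 days:

```python
def is_leap(year):
    """Gregorian rule: every 4th year is a leap year, except century
    years, unless the century year is divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# The rule repeats every 400 years, with 97 leap days per cycle.
leap_days = sum(is_leap(y) for y in range(2000, 2400))
print(leap_days)              # 97
print(365 + leap_days / 400)  # average year of 365.2425 days
```

That's an average year only about 27 seconds longer than the tropical one, which is why the calendar drifts so slowly.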

Now, I don't usually make resolutions. I think they're artificial, and set the resolutor (or whatever) up for failure. Besides, it's a bandwagon thing, and I hate bandwagon things.

Another bandwagon thing is the concept of Dry January: the idea of abstaining from ethanol during that calendar month, perhaps in penance for December's overindulgence. This concept legitimately offends me, and I don't get offended easily. In past years, I've simply ignored it and gone on doing what I usually do.

Thing is, contrary to popular belief (that I promote), I don't actually drink every day. Usually once or twice a week. More, perhaps, when I'm on a trip, but only if I'm not subsequently driving. But this New Year's, I've decided to throw personal tradition into the trash and actually make a (gasp) resolution:

Be it hereby resolved that, in protest of the abominable concept of Dry January, Waltz is determined to drink an alcoholic beverage every day during the calendar month of January. This could be a cocktail, a shot of tequila, a dram of scotch, a tot of rum, a bottle of beer, a glass of wine, or the equivalent. More than that is acceptable. Less is not. I do reserve the right to deliberately fail at the resolution in the case of illness or severe injury that requires painkillers, but apart from that, no excuses.

Don't get me wrong: anyone who genuinely wants to stop drinking, temporarily or permanently, as a resolution or otherwise, has my best wishes. This is for me. I just have major issues with following a crowd or participating in what's probably little more than prohibitionist propaganda.

We'll see how it goes. Knowing me, I'll fail at it like most of us fail at resolutions.
December 30, 2024 at 9:40am
#1081711
For my last link of 2024 (I expect to do a personal update tomorrow, New Year's Eve), we have a 2.5-year-old article from a source I don't think I've linked before, Planetocracy. I don't know anything about the site besides this one article.

    Objections to Mars Colonisation
A summary of bad arguments that keep being repeated


Judging by the spelling in the headline, this author isn't from the US. Which is fine; it's good to have non-US perspectives on space.

When people who are either uninterested in space colonisation, or actively opposed to it, comment on the prospects of travelling to Mars they often make the same or very similar arguments, unaware that these arguments were either ill-formed to begin with or have already been convincingly refuted.

On the flip side, Kelly and Zach Weinersmith wrote some compelling arguments against it in a book that, if I recall correctly, came out after this article. I don't usually mention books directly here, but I'll make an exception: A City on Mars (Amazon link)

“Humans can’t live on Mars because it lacks a magnetic field”

Remember, that's an argument the author claims is refuted. But on the basis of this article alone, I'm not convinced.

The lack of such a field on Mars is by no means a showstopper for colonisation, however. The main protection we have from cosmic rays on the surface of Earth is not our magnetic field, it is our atmosphere.

Okay, and Mars has a much thinner atmosphere. On the plus side, it's further away from the source of most radiation, the Sun. But it seems to me that building habs underground (using robot diggers to start with) would mitigate a lot of the radiation hazard.

Replenishing Mars’ atmosphere, if necessary, would thus be fairly trivial.

Even with the math leading up to it, I find that statement questionable.

“Humans can’t live on Mars because terraforming is impossible/impractical/takes too long”

I'd hesitate to agree with "impossible," because engineers can be pretty smart, and technology continues to advance. I don't think we could do it right now, though. Hell, we can't even terraform Terra.

“Humans can’t live on Mars because they have to live underground and would go crazy”

I'd think a lot of that depends on available space and other factors.

“Humans can’t live on Mars because perchlorate in the soil will poison them”

That, however, has an engineering solution, so I don't think it'd be a dealbreaker.

There's much more at the link. Mind you, I'm not advocating for the author's position, or for its opposite; I don't have the information needed to support or refute these arguments by myself. If we really wanted to colonize Mars—as opposed to visiting it or doing short-term stints there—most of these are problems with engineering solutions. The question, then, is going to be: is it worth it?

And I don't just mean monetary return, but also scientific and engineering advances. Unless we find life (by which I mean microbial life or its equivalent) there, only two reasons to do it stand out to me: 1) as a stepping-stone to asteroid mining; and 2) as a hedge for the human race against Earth catastrophe. Right now, I don't think we're anywhere close to making either of those practical.

Even if we did find life there, I don't think it would require a permanent colony to study.

There is, however, one compelling reason to do what we need to do to make it happen, which is: because it's there. The one argument against it that I see most often is some variant of "why spend money on space when we need it to fix things down here," which I find disingenuous. It's not like we're "throwing money into space." It's like the proponents of that argument think we just bundle a bunch of bills into the payload of a rocket and shoot it into the Sun (which, by the way, would take a lot more money than sending it to Mars would). All that money goes back into the economy, creating jobs and helping to develop new technologies.

Besides, we could fix things down here, even with an active space program. But we don't.
December 29, 2024 at 8:07am
#1081680
Today's dip into the murky well of the past takes us back over five years, before the beginning of my current daily blogging streak (but not much before), to an entry for 30DBC: "None More Black".

The prompt was: What is your favorite color? Do you have a favorite color pairing? What’s something in your life that you picture when you think of your favorite color? Do you choose to wear clothing that is your favorite color? Has your favorite color changed over your life?

Now, we all know I have a lousy memory, but I'm pretty sure I remember that when this prompt appeared, lo these many years ago, I read the first question and immediately thought of the "Questions Three" from Monty Python and the Holy Grail.

Quoting certain movies and/or books has that effect on me. One cannot mention the number 42 without evoking my memories of Douglas Adams' Hitch Hiker's Guide, for example. I see a Twinkie, and the Twinkie scene from Ghostbusters plays in my head. Sometimes, I can suppress the urge to blurt it out, especially when I'm writing and can deliberately avoid the subject.

With this one, however, I managed to suppress the Python reference only to get caught up in a Spinal Tap reference. Luckily, the gif I found for "none more black" is still active after all this time. I'll post it again for anyone unwilling to go see the earlier entry:



There followed a short treatise on the beauty of black, and really, not much has changed for me in that regard since then.

And the film This Is Spinal Tap (released 40 years ago) continues to be a cultural touchstone.
December 28, 2024 at 8:53am
#1081623
SciAm takes on a spelling challenge. Or, it did, nine years ago. Well, actually, it was an opinion piece even then.

    The Difference between ExtrAversion and ExtrOversion
What's the correct spelling: ExtrAversion or ExtrOversion?


Let's find out what they say, then, if we can crawl out of our introvert holes long enough to give them a look.

Jung may be rolling in his grave.

Lots of people are. I propose wrapping them in copper wire, installing some magnets, and turning them into power generators.

Folklore has it that when Carl Jung was once asked which was the correct spelling—ExtrAvert or ExtrOvert—Jung's secretary wrote back something like, "Dr. Jung says it's ExtrAverted, because ExtrOverted is just bad latin."

That's rich, claiming folklore in a story about Jung.

The thing about Latin is, as a dead language, all of its rules are set in stone. When people actually spoke it, though, it was widespread enough that it changed over time, and we, at some point, decided that the Latin in administrative use around Julius Caesar's time (if I recall all this correctly) was the Latin, and the usages and spellings were calcified. In reality, people went on speaking and writing it, and it eventually morphed into Italian, French, Spanish, etc.

If it helps, the French translation of the adjective extraverted is extraverti(e).

But. This is English. A very widespread living language, subject to change and regional variations. What's the correct way to spell humour? Gray? Tire? Kerb? Once something gets set loose in the public, at some point, it stops being a mistake and starts being a variant. Yes, sometimes I rail against those variants, but I have to remind myself that I'm witnessing a linguistic shift as it occurs.

It's always a mistake to use it's as a possessive pronoun, though.

One of the first times Carl Jung introduced the term was in 1917, in his book "Die Psychologie der Unbewussten Prozesse", where he spelled it "ExtrAvert". Exhibit A (ha ha):

You'll have to go to the link to see the example, because it's a graphic, but please note that it's in German. English is about as much German as it is French, and neither language controls English spelling.

So why do so many people spell it ExtrOversion today?

At this point in my first reading of the article, I took a wild guess: to conform better with the spelling of its antonym, introvert.

The article then dates the English "o" spelling to one Phyllis Blanchard in 1918, which, as you might note, is but one year after Jung's book above. It also specifies American English, which, as we all know from the above examples I provided, need not conform to other Anglophone countries' spelling.

Not only did she change the spelling of the word, but she also changed the definition!

Definitions, too, change over time and culture. You know what the French translation of the English verb "to request" is? "Demander." This is, of course, cognate with another word in English that has a much stronger connotation than "request."

What I think probably happened is that she was translating Jung and used the "extro" form to imitate the "intro" form for symmetry.

That is, as the author admits, a guess. But it was my guess, too.

We now know that there are five fundamental dimensions of personality (extraversion, neuroticism, conscientiousness, agreeableness, and intellect/imagination), each one on a continuum.

Yeah, as with any other psychological "knowledge," this, too, is subject to alteration over time. But usually, it changes in a somewhat more technical manner than spelling or everyday word usage.

Under this framework, extraversion is defined as being outgoing, sociable, expressive, and assertive. Introversion is defined as the opposite of extraversion (reserved, quiet).

Anyone who's met me can tell you I'm not very reserved or quiet, and yet I don't identify as an extrovert.

Why does this matter? Trust me, I'm not usually this pedantic.

But I am. Being an introvert and all.

Maybe a solution is to just have both spellings in existence, but define the terms differently.

Oh, gods, no, no, no, please don't. I went down a rabbit hole recently on the difference between kluge and kludge (in the process discovering that neither one is actually of Yiddish origin, much to my disappointment), and, well, let's just say I left the rabbit hole even more confused than when I fell into it.

How about instead we bury ExtrOversion once and for all, and all embrace the same spelling to honor Jung.

The author, in passages I didn't quote, appears to be a massive fan of Jung. I can kind of understand this (he was better than Freud, at the very least), but, again, he wrote in German. Which is a fine language, but, as I said, has no direct connection to English (which I've come to understand as a mature creole of earlier forms of French and German).

Fortunately, the author ends with a sentiment I can certainly support:

I do believe it's helpful for scientists to listen to the experiences of individuals, but I also think it could be helpful for individuals to listen to the latest science.

Science, however, does not and should not dictate word spellings. Hell, they can't even dictate word usage; "theory," for example, means something completely different to a scientist than it does to an ordinary person. And yet, I'm going to continue to spell it "extrovert," because it's totally acceptable in English, if not Latin (or German). If you're more familiar than I am with the word in other Anglophone countries, or even in other languages, feel free to chime in.
December 27, 2024 at 8:29am
#1081585
From SciAm, an opinion piece that seems to align with my own opinions:

    A Science Breakthrough Too Good to Be True? It Probably Isn’t
The more exciting, transformative and revolutionary a science result appears, especially if it comes out of nowhere, the more likely it is to be dead wrong. So approach science headlines with a healthy amount of skepticism and patience


Regular readers might have noticed that this is what I try to do.

I'd also add: pay attention to retractions. There are still people stubborn enough to believe vaccines cause autism.

In 2014 astronomers announced a whopper of a discovery: primordial waves from the earliest moments of the big bang.

You know, I have no memory of that one, or its retraction.

Or remember Tabby’s Star? In 2015 astronomers speculated that its strange light pattern might be the product of alien megastructures. Cue media circus, high-profile talks, the works. Further analysis revealed that it was ... dust, again.

That one, I recall fairly well. I wasn't blogging at the time, but I remember reading about it and going, yeah, let's do some more research before jumping to "aliens."

Further analysis revealed that it was ... dust, again.

Or... that's what They want us to think.

More recently, a group of astronomers claimed to find phosphine in abundance in the Venusian atmosphere, proposing that there might be some form of exotic life floating in the cloud tops.

That one's a bit more complicated. In brief, further studies are being done, and some are contradictory. That's okay. That's how we figure things out. Until there's something definitive, though, I think it's completely safe to assume "no life on Venus."

It’s not just astronomy. Neutrinos can travel faster than light. Mozart makes your kids smarter. Dyeing your hair gives you cancer. Smartphones make us stupid.

Very, very stunning. But very, very wrong.


Some of that is wishful thinking or deliberate misinformation. "Mozart makes your kids smarter," for example, sounds like something that music producers might push.

First, much, if not most, scientific research is wrong. That’s why it’s research; if we knew the answers ahead of time, we wouldn’t need to do science.

That's... well, it could be phrased better, I think. Not that scientific research is "wrong," which can easily imply a moral judgement, but that a) some hypotheses turn out to be falsified and b) sometimes scientists reach the wrong conclusion, which is later caught through peer review and replication attempts.

Second, scientists endure perverse incentives to publish as much as possible—to “publish or perish”—and to get their results in top-tier journals as much as possible.

That's a problem, but I wouldn't have the first idea how to fix it.

Lastly, there’s the modern-day hype machine. While many journalists respect scientists and want to faithfully represent the results of scientific research, publishers face their own incentives to capture eyeballs and clicks and downloads. The more sensational the story, the better.

This bit is what I mainly focus on in here, because I'm not involved in science or science publication, but I do try to recognize when a headline or link is deliberately sensationalized.

The more times that the public sees science contradict itself, the less likely people are to believe the next result that makes headlines. And the more times that scientists are loudly, publicly wrong, the more ammunition antiscience groups have in their fight against trusted experts.

This is absolutely a problem. I've used this example before, but it's like how eggs have gone from good to bad to good to bad to maybe okay to maybe not too many to good to bad (and then to way too expensive anyway), all just within my lifetime. Just because nutrition science has major flaws, however, doesn't mean astronomy and physics do.

And let me be clear here. I’m sure you find most, if not all, scientific research absolutely fascinating—as do I. But the more interesting a result is to the wider community, with more headlines, chatter, attention and raised eyebrows, the more likely it is to be worthy of a bit of healthy skepticism.

I think that's a pretty sound generality. Like any generality, it's not always true. One day, perhaps, someone will find unequivocal evidence of life on another world: microbes on Mars or eukaryotes on Europa, or something. That will be a Big Fucking Deal, one of the most important discoveries in human history. However, if the evidence is not unequivocal—as with the Venus phosphine, or, before that, the Antarctic meteorite from Mars—it's all sensation with no confirmation.

The best approach to take with science results, news, and headlines is the same approach scientists use themselves: healthy skepticism.

To be clear, "skepticism" doesn't mean "reflexive disbelief." It's more like looking at it critically while keeping an open mind. Seeing a science headline and immediately disbelieving it on principle is just as bad as immediately believing it because you want it to be true.

Beware big headlines; don’t believe everything you see. But when study after study comes out, building up an interconnected latticework of theory and experiment, allow your beliefs to shift, because that’s when the process of science has likely led to an interesting and useful conclusion.

That, to me, is of great importance. Being stubborn, clinging to what has been disproved or debunked, is not a positive character trait. But then, neither is naïve acceptance of every claim you see.
December 26, 2024 at 9:30am
#1081547
Just about a month ago, I linked an article about going gray, in the entry "A Touch of Gray". The PopSci article proclaimed with great certainty that gray hair can never return to its original color. Well, here's SciAm going "nuh-uh" in 2021:

    Gray Hair Can Return to Its Original Color—and Stress Is Involved, of Course
The universal marker of aging is not always a one-way process


And here I thought the universal marker of aging was yelling at kids to get off your lawn.

As we grow older, black, brown, blonde or red strands lose their youthful hue.

Which is why investments in hair dye companies are likely to be lucrative.

Although this may seem like a permanent change, new research reveals that the graying process can be undone—at least temporarily.

Obviously, everything is temporary, especially if you're old enough to have gray hair in the first place.

Hints that gray hairs could spontaneously regain color have existed as isolated case studies within the scientific literature for decades.

Sometimes, all it takes to disprove a hypothesis is a single counterexample. If you said "all planets have moons," and someone pointed out that Venus is a planet without moons, your hypothesis would be broken. In biology, though, things aren't always that clear-cut, and a single counterexample could just be a fluke or a hoax.

In a study published today in eLife, a group of researchers provide the most robust evidence of this phenomenon to date in hair from around a dozen people of various ages, ethnicities and sexes.

While a sample size of around 12 also doesn't do much to make the findings definitive, this phenomenon is apparently rare enough that to expect a bigger sample would be wishful thinking.

It also aligns patterns of graying and reversal to periods of stress, which implies that this aging-related process is closely associated with our psychological well-being.

On this point, the article is in alignment with the one I posted last month.

Around four years ago Martin Picard, a mitochondrial psychobiologist at Columbia University...

Now, that's someone who either hates Star Trek with a burning passion, or takes every opportunity to tell his underlings to "make it so."

...was pondering the way our cells grow old in a multistep manner in which some of them begin to show signs of aging at much earlier time points than others.

I don't really have anything to say about that process or the science behind it; I just wanted to make a Picard joke in an entry about hair.

These patterns revealed something surprising: In 10 of these participants, who were between age nine and 39, some graying hairs regained color.

So, this is highly unlikely to apply to older folks who go gray. Still, research like this might lead to a way to artificially restore hair melanin without dyes. Not that I care. I think I'd look awesome with a gray mane.

Most people start noticing their first gray hairs in their 30s—although some may find them in their late 20s.

That early? This surprises me, though I've known quite young gray-haired people. But one never knows: those who turn gray then could be dyeing their hair, or shaving it off entirely.

The team also investigated the association between hair graying and psychological stress because prior research hinted that such factors may accelerate the hair’s aging process. Anecdotes of such a connection are also visible throughout history: according to legend, the hair of Marie Antoinette, the 18th-century queen of France, turned white overnight just before her execution at the guillotine.

If true, for a queen, that had to add insult to injury.

Eventually, Picard says, one could envision hair as a powerful tool to assess the effects of earlier life events on aging—because, much like the rings of a tree, hair provides a kind of physical record of elapsed events.

Or, you know, a sorcerer could use it to control you. You never know.
December 25, 2024 at 9:01am
#1081497
I've banged on in here many times over what is and is not an illusion. Here's a bit from Popular Mechanics (way back in 2020) about things we know are illusions.

    Why We've Fallen for Optical Illusions for Thousands of Years
Mammoth or bison? Rabbit or duck? Ambiguous images have tricked our eyes forever.


Forever, huh? Our eyes haven't been around forever.

Okay, okay, no, I'm well aware of the emphatic connotation of "forever," and it's beneath me to rag on that usage. I did it anyway.

So, because pictures in here are hard to do, you'd need to visit the link to see the oldest known optical illusion: a carved mammoth / bison dated to 14,000 years ago, thus showing that humans have been fascinated by optical illusions, if not forever, for at least 14,000 years.

Like many optical illusions, these images play on the human brain’s urge for context and turn our first impressions upside down.

I say it also plays on our predilection for pareidolia. Much of art does, really; it can be deliberately induced pareidolia, which is why we recognize the smiley-face emoji as a smiley face. These illusions deliberately induce pareidolia in more than one way.

The ambiguity itself is the point, and researchers have studied how the regular human experiences and preconceived notions we all carry around influence the way we decide between ambiguous options or fill in missing information.

And there's a metaphor in there, somewhere.

There's not a lot more at the article, which doesn't even acknowledge my hypothesis about pareidolia, preferring instead to talk about brain plasticity and resolution of ambiguity. But it did get me wondering: why don't these illusions elicit the same kind of groaning hatred that their linguistic equivalent, the pun, does?

Maybe because evolution has worked to keep us punsters from contaminating the gene pool.
December 24, 2024 at 2:28am
#1081447
Posting early today because, like many people, I have stuff to do later. In my case, though, the stuff is completely unrelated to tomorrow's holiday.

I've written about Betelgeuse before, most recently here: "Betelgeuse 2". This is, however, a different article, more recent, from Big Think.

    This is what we’ll see when Betelgeuse goes supernova
The closest known star that will soon undergo a core-collapse supernova is Betelgeuse, just 640 light-years away. Here’s what we’ll observe.


And already I have Quibbles.

1. "what we'll see." It's extremely unlikely that anyone alive as I write this will see it happen. I'm a gambling man, and I wouldn't bet on it, not unless some bookie was offering billion-to-one odds and I could bet, like, a dollar. The headline uses the same value of "we" as people do when they talk about when "we" will colonize distant star systems (hopefully not Betelgeuse).

2. "will soon undergo." As with "we," they're using a variant value of "soon." Best estimate I've seen is within 100,000 years. That's soon in cosmic terms. It's not soon in human terms. Hell, 100,000 years ago, we'd (entirely different definition of "we" this time) barely started using fire.

3. "640 light-years away." Yeah... maybe. For whatever technical reason (it's been explained to me, but it's over my head), B's distance has been tricky to pin down. Wiki claims 400-600 light years, and that's a huge margin which doesn't even include 640.

I should reiterate here that even if it's at the low end of that scale, astronomers expect no ill effects for any life remaining on Earth when it happens. Of course, astronomers have been known to be wrong, from time to time.

But, okay. Issues with the headline don't always transfer to the actual text. It's just that it's the first thing we see, so getting it right is kind of a big deal. I'm not saying that it's clickbait, but it is a bit sensationalized.

The stars in the night sky, as we typically perceive them, are normally static and unchanging to our eyes. Sure, there are variable stars that brighten and fainten, but most of those do so periodically and regularly, with only a few exceptions. One of the most prominent exceptions is Betelgeuse, the red supergiant that makes up one of the “shoulders” of the constellation Orion.

Hence the title of today's entry.

Over the past five years, not only has it been fluctuating in brightness, but its dimming in late 2019 and early 2020, followed by a strange brightening in 2023, indicates variation in a fashion never before witnessed by living humans.

It is necessary for a human to be living in order to witness anything (metaphysics and religion aside), but I think they mean it's weirder than it's been for the past 100 years or so.

There’s no scientific reason to believe that Betelgeuse is in any more danger of going supernova today than at any random day over the next ~100,000 years or so, but many of us — including a great many professional and amateur astronomers — are hoping to witness the first naked-eye supernova in our galaxy since 1604.

As unlikely as it might be, I've said before that it would be very, very cool if I got to see it. I'm just not betting on it.

Located approximately 640 light-years away, it’s more than 2,000 °C cooler than our Sun, but also much larger, at approximately 900 times our Sun’s radius and occupying some 700,000,000 times our Sun’s volume. If you were to replace our Sun with Betelgeuse, it would engulf Mercury, Venus, Earth, Mars, the asteroid belt, and even Jupiter!

See, those numbers don't hit very well with people, including me. Even comparing the size to our solar system doesn't give us a visceral idea of just how fucking huge that star is (not to mention I'd question the Jupiter orbit thing, because red giants like that just don't have a well-defined surface in the way that we think of the Sun as having one).
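For what it's worth, the article's numbers are easy to sanity-check with some back-of-envelope arithmetic. A quick sketch (the 900-solar-radius figure is the article's; the solar radius, AU, and Jupiter's distance are just standard textbook values):

```python
# Sanity check of the article's Betelgeuse figures.
R_SUN_KM = 696_000       # solar radius in km (standard value)
AU_KM = 149_600_000      # one astronomical unit in km
JUPITER_AU = 5.2         # Jupiter's average orbital distance in AU

radius_ratio = 900       # the article's "900 times our Sun's radius"

# Volume scales as the cube of the radius.
volume_ratio = radius_ratio ** 3
print(f"Volume ratio: {volume_ratio:,}")  # 729,000,000 -- rounds to the article's 700 million

# How far out would the surface reach if Betelgeuse replaced the Sun?
reach_au = radius_ratio * R_SUN_KM / AU_KM
print(f"Reach: {reach_au:.1f} AU; Jupiter orbits at about {JUPITER_AU} AU")
```

The cube checks out, but at 900 solar radii the star only reaches about 4.2 AU, short of Jupiter's 5.2, which is one more reason to side-eye the "even Jupiter" claim (on top of the fuzzy-surface problem).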

This image might help with the size comparison.

Even when it transitions to the more advanced stages of life within its core, from carbon-burning to then neon and oxygen and eventually silicon fusion, we won’t have any directly observable signatures of those events.

Dude, people are easily confused. I get that stars, like people or cats, have a birth, time of existence, and death. As far as we know, though, stars themselves don't harbor life. Yes, the universe is weird, and it's fun to speculate that maybe they do, but "life" in this case is a metaphor for how stars change over time. Calling it life just begs people to misunderstand, deliberately or not, what's meant.

The article goes on to describe what whoever's on Earth when it happens can expect to experience when the event finally occurs. Not going to quote more, but it's pretty interesting, in my opinion. Pay no mind to the "it really happened 640 [or whatever] years ago" thing, though; it's irrelevant except as a way to acknowledge that information has a maximum speed.

Naturally, being science, everything there is based on our best knowledge at this point in time. I'd also expect surprises. But those surprises will only serve to advance the science.

Unless, of course, they're wrong about the "it won't irradiate and sterilize the Earth" thing.

Sleep tight!
December 23, 2024 at 7:31am
#1081423
In my partial listing of intelligent species yesterday, I left out an important one. But my random number generator reminded me today, by pointing to this article from Atlas Obscura:

    Can You Outsmart a Raccoon?
Recent studies show just how tricky these trash pandas can be, from opening locks to nabbing DoorDash orders.


Well, that last bit won't happen to me because I don't use DoorTrash. But I have caught those masked marauders in the actual trash. I also once caught one that had opened the door to the house, snuck in, and scarfed down the cat food.

While many other species around the world are in decline, raccoons are actually thriving, and do particularly well in urban areas, says lead author Lauren Stanton, a cognitive ecologist at the University of California, Berkeley.

Okay, that tracks, but... "cognitive ecologist?"

Raccoons are strong—they can push a cinder block off a trash can—and tenacious. The more we do to keep them out, the more skills they learn for breaking in, leading to a cognitive arms race between people and raccoons.

You know how people keep saying that if cats ever got opposable thumbs, we'd be in big trouble? Well, raccoons don't have opposable thumbs, either, but their little paws grip stuff just fine without them.

In 2016, for example, the city of Toronto spent 31 million CAD (that’s about $23 million) on raccoon-resistant waste bins. While they deterred most would-be robbers, certain tricksters had no problem solving the new puzzle. The city has continued to release new versions of the bin, trying to outsmart Toronto’s most persistent trash invaders.

All due respect to our Canadian friends, that right there cracked me up.

The word raccoon can be traced back to the Proto-Algonquian word ärähkun, deriving from the phrase “he scratches with his hands.”

The name was more directly from a specific Algonquian language, spoken by the Powhatans here in what would become Virginia.

While it’s hard to compare intelligence across species, says Stanton, she says that some recent studies show the neural density of raccoons is “more similar to primates than other carnivore species.”

Also, from what I've been hearing, neither brain size nor neural density is strongly correlated with those traits we call intelligence. Still, there's no mistaking that at least some raccoons exhibit advanced problem-solving skills.

However, we have learned that raccoons, once thought to be solitary, are in cahoots with each other far more than we knew.

Great. Now we have to deal with raccoon gangs.

Lots more at the article, which, most importantly, features multiple pictures of impossibly cute raccoons.
December 22, 2024 at 6:24am
#1081398
As usual for Sunday entries these days, I selected an older entry at random to take a second look at. This time, I didn't land all that far in the past, just over a year (one year being my minimum for these exercises), and found this: "The Trouble with Quibbles".

Being relatively recent, the linked article from Vox is still up, and I didn't see any indication that it's been revised since then.

I will therefore address, primarily, my own thoughts at the time.

Quibble 1: "Intelligent." I've railed on this before, but, to summarize: What they really mean is "technology-using."

I have, in fact, bitched about this sort of thing on numerous occasions, and for reasons I go over in that entry. But, even apart from the tiresome jokes about humans not being intelligent, we know of other intelligent life on Earth: octopodes, dolphins, cats, crows, etc. It took entirely too long, from the perspective of our time as a species, to recognize these intelligences. Our communication with them is limited in scope; those species are related to us, so you'd think there would be enough common ground to establish a dialogue, but no. How much worse might it be to communicate with someone from an entirely different genetic lineage?

Of course, there's always the most basic form of communication: violence. I know it's fashionable to think that any culture advanced enough to get to the stars will have put all that behind them, but I'm skeptical. We certainly haven't. Humans fear the Other, and there are probably sound evolutionary reasons for that, but nothing would be more Other than space aliens. To them, we're the space aliens.

We're looking for signs that some extraterrestrial species has developed communication or other technology whose effects we can detect. This technology would indicate what we call intelligence, but not all intelligence develops technology. One might even successfully argue that it's kinda dumb to invent technology.

Quibble 2: UAP may be more or less silly than UFO, but I believe it to be a better fit.

UAP may have less baggage than UFO, but we have a history of taking new, neutral terms and giving them the same encrustations of connotation that we give the old terms. Like how "retarded" started out as a value-neutral term to describe humans of lower intelligence (see above), to replace words like idiot, cretin, and moron, which had turned into insults. Then "retarded" turned into an insult, and some say we shouldn't be using the term at all. Well, that's special.

Point is, I give UAP (unidentified anomalous (originally aerial) phenomena) about five more years before they have to come up with something new because UAP studies have taken a turn for the edge of the world.

And I don't doubt that there's something to study. Sure, there are hoaxes; there are always hoaxes, like with Bigfoot, but they're probably not all hoaxes. I just don't jump straight to the conclusion that if there's a sighting of something that can't be immediately identified, it must therefore be aliens. That's just retarded.

Quibble 5: What's the first thing we did when we started exploring space? Sent robots, not people. No reason to assume hypothetical aliens wouldn't do the same.

This can probably be nitpicked because some of our early ventures into space were crewed: first person in space, first human on the moon, etc. Still, Sputnik (not really a robot but not living, either) preceded Gagarin, and lunar probes preceded Tranquillity Base, and since then pretty much everything outside of Low Earth Orbit has been a robot.

Well, that's all I'm going to expand on today. My thoughts haven't changed much in the 14 months since that entry, and we have found no extraterrestrial life, intelligent or otherwise, during that time, so the search continues.
December 21, 2024 at 11:36am
#1081375
A neutron walks into a bar. "What'll it be?" asks the bartender. "How much for a beer?" "For you, no charge!"



While this Quartz article is from the faraway year of 2017, I found enough to snark on to make it worth sharing.

You’ve likely been asked how you see the proverbial glass: half full or half empty? Your answer allegedly reflects your attitude about life—if you see it half full, you’re optimistic, and if you see it half empty, you’re pessimistic.

I'm an engineer. All I see is an overdesigned glass. Or, depending on my mood, an inefficient use of available storage space.

I'm also a pessimist, but at least I'm a pragmatic one.

Implied in this axiom is the superiority of optimism.

Also, I don't know if even the most dedicated pessimist, unless they knew this particular cliché and were deliberately following it, would seriously consider "half-empty" to be a thing. Almost everything is related to full. Like, if your fuel gauge is in the middle, you say "we have half a tank of gas," not "the tank's half-empty."

Thus, the good answer is to see the glass half full. Otherwise, you risk revealing a bad attitude.

Shut the fuck up about my attitude.

Actually, the glass isn’t half full or half empty. It’s both, or neither.

Come on now. No. It's not in a state of quantum indeterminacy. Or, at least, no more than any other object in view.

Things aren’t mutually exclusive, awesome or awful. Mostly they’re both, and if we poke around our thoughts and feelings, we can see multiple angles.

On that bit, though, I'm fully on board. I really hate it when people put things into two boxes: "awesome" and "suck." The moment Netflix went to shit was the moment it switched from star ratings to thumbs up or down. Of course, I'm fully aware that by hating it, I'm putting the idea of the awesome/suck binary into the "suck" box. Everyone is a hypocrite, including me.

Neutrality sets us free. It helps us see something more like the truth, what’s happening, instead of experiencing circumstances in relation to expectations and desires.

Ehhhh... nah. Pessimism, and only pessimism, sets us free. An optimist is doubly disappointed when their imaginings fail to materialize: from the positive outcome having not worked out, as well as the ego hit from being wrong. A neutral person risks never experiencing the joy of anticipation. It is only the pessimist who, if their prediction falls flat, still takes a win: either something good happens, which is good; or they turn out to be right, which is a pleasant feeling.

The article goes on to relate the quality of neutrality to Buddhism, I suppose in an effort to give neutrality some ancient gravitas, but instead, it only makes Buddhism seem even less appealing to me.

But hey, it's not about me. On the subject of whether this applies to anyone else or not, well, I'm neutral.
December 20, 2024 at 10:25am
#1081341
Solstice tomorrow (around 4:20 am here), so this is my last entry of astronomical fall. Today's article, from BBC, has nothing to do with seasons, though, and it's a subject I really shouldn't be weighing in on—but of course, I'm going to do it anyway.



What inspired me to jump in above my head here was the lede:

Far from triumphantly breezing out of Africa, modern humans went extinct many times before going on to populate the world, new studies have revealed.

Now, there's a poorly phrased sentence if I've ever seen one. It should be blindingly obvious to everyone who can read this that modern humans didn't go extinct. This is a fact on par with "Earth is roughly spherical" and "Space is mostly vacuum." Actually, wait, no, I'm even more sure that modern humans didn't go extinct than I am about those other things, because, last I checked, there were about 8 billion modern humans running around. Or sitting around. Whatever. You do you. Point is, we're not extinct yet.

Likely, the author meant "sub-populations of modern humans went extinct many times," which, okay, I guess they have science to back them up on that, and I'm not going to argue about it. But I feel like the way it's phrased would be like if they said "humans went extinct in Pompeii in 79 C.E."

The new DNA research has also shed new light on the role our Neanderthal cousins played in our success.

This is, I think, the interesting bit here. But I'd like to emphasize the "cousins" metaphor there. Sapiens and neandertals (the spelling of the latter appears to have legitimate variants) shared a common ancestor. A certain ape population separated at some point, genetic drift and selection occurred differently in each population, until you got groups with clear physiological differences in the fossil record. But, apparently, the physiological differences weren't enough to prevent interbreeding.

The definition of a species is, to my understanding, a bit of a slippery concept in biology. That is, it's not always obvious what constitutes a separate species. If it were as easy as "a population that can breed to produce fertile offspring," we wouldn't consider sapiens and neandertals separate species because, according to DNA evidence, they produced fertile offspring together.

While these early European humans were long seen as a species which we successfully dominated after leaving Africa, new studies show that only humans who interbred with Neanderthals went on to thrive, while other bloodlines died out.

Again, I feel like this is poorly phrased, and puts too much emphasis on Europe. Apparently, there are populations in sub-Saharan Africa today with no neandertal genes and, again, obviously they didn't die out. And they're the same species as the rest of us mostly-hairless bipeds.

Apart from these nitpicks, I think the new findings are fascinating, delving into how, basically, hybridization led to greater hardiness. As with all science, it may be overturned or refined through later studies, but this article itself describes an overturning of previous hypotheses about early human ancestry. And it has helpful infographics and pictures.

But unless we invent time travel (unlikely), all we can do is make hypotheses and test them. It's really a very human thing to do.
December 19, 2024 at 8:22am
#1081306
Show of hands, please: how many of you are planning on eating over the next week or so?

Huh, that's a lot.

An eating-related article from PopSci:

    Is the five-second rule true? Don’t push your luck.
The scientific research on floor food has a clear answer.


I never heard about this "five-second rule" until I was fully an adult. Now, remember, I spent my childhood out in the country, on a farm, and we had our own vegetable garden. The door to the house opened into the kitchen. Clean floors were a "sometimes" thing. But I honestly can't remember what my mom (it was almost always my mom) did if something dropped onto said floor. Probably wouldn't have mattered because I used to pick veggies straight from the garden and eat them. Yes, even carrots. Especially carrots. I wouldn't eat vegetables that she'd cook, but I ate the hell out of raw, dirty-root carrots.

Nurses are always surprised and distrustful when they see "no known allergies" on my chart, but I credit my finely-tuned immune system (quite unscientifically) to eating dirt as a kid.

Anyway, I never believed the five-second rule, and now there's some science to back me up on this.

According to this popular belief, if you drop a piece of food on the floor and pick it up in less than five seconds, then it’s safe to eat. The presumption is that bacteria on the floor don’t have enough time to hitch a ride on the food.

Right, because bacteria are just little animals that take more than five seconds to realize there's a tasty treat above and jump onto it. Snort. No, any bacteria (or other unwanted contamination) hitches a ride on the floor dirt that the dropped food picks up immediately. And I don't care how clean you think your floor is; if it's just been cleaned, there's cleaning agent, which is also not very good for you; and if it hasn't, there's dirt.

In 2003, Jillian Clarke, a senior at the Chicago High School for Agricultural Sciences in Illinois, put the five-second rule to the test.

I will reiterate here that, as a high-schooler, she was younger than I was when I first heard about the five-second rule. Also, we never got to do cool science projects like that in my high school.

Clarke and her coworkers saw that bacteria transferred to food very quickly, even in just five seconds, thus challenging the popular belief.

While this supports my own non-scientific conclusion, one study, performed by a high-school team, is hardly definitive.

A few years later, food scientist Paul Dawson and his students at Clemson University in South Carolina also tested the five-second rule and published their results in the Journal of Applied Microbiology.

Additional studies and replication, now... that starts to move the needle to "definitive."

When they dropped bologna sausage onto a piece of tile contaminated with Salmonella typhimurium, over 99% of the bacteria transferred from the tile to the bologna after just five seconds. The five-second rule was just baloney, Dawson concluded.

One might think that the main reason I saved this article to share was because of the bologna/baloney pun.

One would be correct.

But in 2014, microbiology professor Anthony Hilton and his students at Aston University in the United Kingdom reignited the debate... According to their results (which were shared in a press release but not published in a peer-reviewed journal), the longer a piece of food was in contact with the floor, the more likely it was to contain bacteria. This could be interpreted as evidence in favor of the five-second rule, Hilton noted, but was not conclusive.

Well, maybe UK bacteria are less aggressive.

This prompted food science professor Donald Schaffner and his master’s thesis student, Robyn C. Miranda, at Rutgers University in New Jersey to conduct a rigorous study on the validity of the five-second rule, which they published in the journal Applied and Environmental Microbiology... By analyzing bacterial transfer at <1, 5, 30, and 300 seconds, they found that longer contact times resulted in more transfer but some transfer took place “instantaneously,” after less than 1 second, thus debunking the five-second rule once and for all.

Now that "definitive" needle has moved substantially. But shame on the source for applying "once and for all" to science.

“Based on our studies, the kitchen floor is one of the germiest spots in the house,” Charles P. Gerba, a microbiologist and professor of virology at the University of Arizona, tells Popular Science. Believe it or not, “the kitchen is actually germier than the restroom in the home,” he added.

I get really tired of the "more germs than a toilet seat" scaremongering.

The next time you’re tempted to eat that cookie you just dropped, remember: bacteria move fast.

Or they're hitching a ride on the larger particles that stick to the toast that you inevitably dropped butter-side-down.

Anyway, I'm not sharing this to shame anyone for eating stuff off the floor. You do you, as they say. Just don't make me eat it. My dirt-eating days are long behind me.
December 18, 2024 at 9:58am
#1081275
Getting back to science today, here's one from Quanta for all the opponents of nihilism out there.

    What Is Entropy? A Measure of Just How Little We Really Know.
Exactly 200 years ago, a French engineer introduced an idea that would quantify the universe’s inexorable slide into decay. But entropy, as it’s currently understood, is less a fact about the world than a reflection of our growing ignorance. Embracing that truth is leading to a rethink of everything from rational decision-making to the limits of machines.


It makes all kinds of sense that it took a French person to figure this out.

Life is an anthology of destruction. Everything you build eventually breaks. Everyone you love will die. Any sense of order or stability inevitably crumbles. The entire universe follows a dismal trek toward a dull state of ultimate turmoil.

That sounds more like a French (or possibly Russian) philosophy book than science, but I assure you, it's science (just without the math). As I've said before, philosophy guides science, while science informs philosophy.

To keep track of this cosmic decay, physicists employ a concept called entropy.

Keeping track of decay may sound like a paradox, and, in a way, it is.

Entropy is a measure of disorderliness, and the declaration that entropy is always on the rise — known as the second law of thermodynamics — is among nature’s most inescapable commandments.

That's slightly simplified. The Second Law of Thermodynamics states that in a closed system, entropy can never decrease. It can remain constant, just never decrease. And it specifies "closed system," which the Earth most definitely is not; we have a massive energy source close by (in cosmic terms), at least for now.

Order is fragile. It takes months of careful planning and artistry to craft a vase but an instant to demolish it with a soccer ball.

I've also noted before that creation and destruction are actually the same thing. What we call it depends on our perspective at the time. Did you create a sheet of paper, or did you destroy a tree? Well, both, really, but maybe you needed the paper more than you needed the tree, so you lean toward the "creation" angle.

We spend our lives struggling to make sense of a chaotic and unpredictable world, where any attempt to establish control seems only to backfire.

Who's this "we?"

We are, despite our best intentions, agents of entropy.

At the risk of repeating myself once more, it could well be that the purpose of life, if such a thing exists at all, is to accelerate entropy.

But despite its fundamental importance, entropy is perhaps the most divisive concept in physics. “Entropy has always been a problem,” Lloyd told me. The confusion stems in part from the way the term gets tossed and twisted between disciplines — it has similar but distinct meanings in everything from physics to information theory to ecology. But it’s also because truly wrapping one’s head around entropy requires taking some deeply uncomfortable philosophical leaps.

Uncomfortable for some, maybe.

As physicists have worked to unite seemingly disparate fields over the past century, they have cast entropy in a new light — turning the microscope back on the seer and shifting the notion of disorder to one of ignorance.

What he's basically saying here, if I understand correctly (always in question), is that they're trying to fit entropy into information theory. Remember a few days ago when I said information theory is a big deal in physics? It was here: "Life Is".

The notion of entropy grew out of an attempt at perfecting machinery during the industrial revolution. A 28-year-old French military engineer named Sadi Carnot set out to calculate the ultimate efficiency of the steam-powered engine.

It's important, I think, to remember that the steam engine was the cutting-edge of technology at the time.

Reading through Carnot’s book a few decades later, in 1865, the German physicist Rudolf Clausius coined a term for the proportion of energy that’s locked up in futility. He called it “entropy,” after the Greek word for transformation.

I find that satisfying, as well, given my philosophical inclination concerning creation and destruction. If they're the same thing, then "transformation" is a better word.

Physicists of the era erroneously believed that heat was a fluid (called “caloric”).

Yes, science is sometimes wrong, and later corrects itself. This should, however, not be justification to assume that the Second Law will somehow also be overturned (though, you know, if you want to do that in a science fiction story, just make it a good story).

This shift in perspective allowed the Austrian physicist Ludwig Boltzmann to reframe and sharpen the idea of entropy using probabilities.

So far, they've talked about a French person, a German, and an Austrian. This doesn't mean thermodynamics is inherently Eurocentric.

The second law becomes an intuitive probabilistic statement: There are more ways for something to look messy than clean, so, as the parts of a system randomly shuffle through different possible configurations, they tend to take on arrangements that appear messier and messier.

The article uses a checkerboard as an example, but as a gambler, I prefer thinking of a deck of cards. The cards come in from the factory all nice and clean and ordered by rank and suit. The chance of that same order being recreated after shuffling is infinitesimal.
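That "infinitesimal" is easy to put a number on: a fresh deck has exactly one factory order out of 52! possible arrangements. A quick sketch:

```python
import math

# Number of distinct orderings of a standard 52-card deck.
orderings = math.factorial(52)

# The factory order is just one of them, so the chance of a fair
# shuffle reproducing it is 1 in 52! -- a 68-digit number.
print(f"52! has {len(str(orderings))} digits")
```

Shuffle every deck on Earth once a second for the age of the universe and you still wouldn't expect to see factory order come back. That's the second law in miniature.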

Entropy experienced a rebirth during World War II.

Now, there's a great double entendre. I wonder if it was intentional.

Claude Shannon, an American mathematician, was working to encrypt communication channels... Shannon sought to measure the amount of information contained in a message. He did so in a roundabout way, by treating knowledge as a reduction in uncertainty.

Sometimes, it really does take a shift in perspective to move things along.
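Shannon's reframing is compact enough to sketch: entropy is the average surprise of a distribution, measured in bits, and learning an outcome removes exactly that much uncertainty. A minimal illustration (the function name is mine, not Shannon's):

```python
import math

def shannon_entropy(probs):
    """Average uncertainty of a probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: learning the flip yields 1 full bit.
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A heavily biased coin holds far less uncertainty, so its
# outcome, once learned, tells you almost nothing new.
print(shannon_entropy([0.99, 0.01]))  # about 0.08
```

It's the same formula as Boltzmann's entropy, up to a constant factor, which is why the two fields keep colliding.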

In two landmark papers in 1957, the American physicist E.T. Jaynes cemented this connection by viewing thermodynamics through the lens of information theory.

Okay, so the connection between entropy and information isn't exactly new.

However, this unified understanding of entropy raises a troubling concern: Whose ignorance are we talking about?

And that's where I stop today. There is, of course, a lot more at the link. Just remember that by increasing your own knowledge, you're accelerating the entropy of the universe by an infinitesimal amount. You're going to do that whether you read the article or not, so you might as well read the article. As it notes, "Knowledge begets power, but acquiring and remembering that knowledge consumes power."
December 17, 2024 at 7:01am
#1081249
Today, from the Land of Party Poopers (actually, from Thrillist):

    No, That Isn't Duct Tape on Your Plane's Wings
An aircraft mechanic explains what the tape you sometimes see on plane wings really is.


Why Party Poopers? Well, because they're taking one of my few precious joys out of life.

See, I don't fly all that often. Once a year, maybe. (Okay, twice, if you count the round trip as two trips.) So I don't get to do this often, but when I see that tape on a plane, I usually wait until the plane starts to taxi away from the gate to loudly exclaim, "Hey, look, the wings are being held together by duct tape!"

I also find the fake outlets at waiting areas incredibly hilarious, though I've never done that prank, myself.

Those are little moments of happiness for me, but this article sucks the joy out of the first one. Well, at least, it would, if people actually read Thrillist. Maybe my faux-freakout over the tape will still have its desired effect.

Anyway, after all that, I'm sure you're dying to know what it really is on the wings.

As a passenger, noticing that your plane's wings are seemingly held together by the same silver duct tape that your dad uses to fix anything around the house is, by all means, a frightening sight.

Or, you know, it would be, if duct tape weren't so damn useful.

"That's not actually duct tape," says an aircraft mechanic in a TikTok video addressing the issue. "That's speed tape, [...] and speed tape is an aluminum-base tape that's designed specifically for aviation due to the large speeds and the large temperature differentials that aircraft are subjected to."

I actually knew that. But knowing that it's called "speed tape" doesn't help for shit. Like, from the sound of it, it should make the airplane go faster, but if that were the case, the whole plane would be covered in it, right? If it has something to do with the "large speeds" (eyetwitch) as well as temperature differentials, why call it speed tape and not cold tape?

Instead, sometimes, it's used as a temporary sealant to prevent moisture from entering specific components.

Uh huh. Okay. Doesn't tell me why it's called speed tape.

"Speed tape, also known as aluminum tape, is a material used for temporary, minor repairs to nonstructural aircraft components," an FAA spokesperson told Thrillist.

And it's called that because...?

Yes, I know I could ask some random AI the question and get some kind of answer, but that's not the point. The point is, why does the article purporting to explain all about speed tape not even bother to explain why it's called speed tape?

You can relax now and enjoy your flight stress-free.

HA! Like there aren't 498 other things about flying that cause stress.

Oh, right: 499 if I'm around.
December 16, 2024 at 7:51am
#1081213
Way the hell back in 2018, Quartz published the article / stealth book ad I'm linking today.



Does it? Does it really remain at the center of dining controversy? Because I thought that in 2018, and even now, the "center of dining controversy" is how to handle mobile phones at the table.

On June 25, 1633, when governor John Winthrop, a founding father of the Massachusetts Bay Colony, took out a fork, then known as a “split spoon,” at the dinner table, the utensil was dubbed “evil” by the clergy.

While this article is US-centric, and makes no attempt to be otherwise, other sources show that the fork has been considered a tool of the devil since it was introduced to Europe. This is, naturally, just another in a long list of humans considering anything new and different to be necessarily evil, because we're kinda stupid like that.

Forks were pretty much unheard of during Winthrop’s era. People would use their hands or wooden spoons to eat. The Museum of Fine Arts (MFA) in Boston says that only “a handful of well-to-do colonists,” adopted the use of the fork.

I mean, technically, you're using your hands either way.

When Americans finally started their love affair with the fork, their dining etiquette compared to their international peers became a source of controversy for centuries, whether it’s the way the fork is held, only eating with the fork, or using the “cut-and-switch.”

Oh, no, different countries do things differently. The horror.

During the time it took for Americans to widely start using the fork, dining cutlery was evolving in England. Knives changed to have rounded blade ends, since forks had “assumed the function of the pointed blade,” says Deetz.

I'm betting there were other reasons for the switch, like, maybe, deweaponization?

So if you've ever wondered why some cultures point fork tines up while others point them down, well, the article explains that. Sort of. Unsatisfactorily. Still not mentioned: why formal place settings are the way they are.

Also not mentioned in the article (perhaps one of the books it promotes says something about it, but it's unlikely I'll ever find out) is the abomination known as the spork.
December 15, 2024 at 8:49am
#1081177
It's time-travel time again. Today's random numbers brought me all the way back to July of 2008, with a short and ranty entry: "Those Naughty Brits"

Apparently, there was a link to a (London) Times article, in the chick section, about "kinky sex." It should be surprising to no one that the link is dead and now just redirects to the Times main page, which I didn't bother looking at.

"Why do many of us like kinky sex?" apparently opened the original article, based on what I said in that entry. These days, I have preconceived ideas about headline questions: First, if it's a yes/no question, the answer is probably "no." Second, if it's a "why" question, the answer is probably "money."

I think I'm wrong about the second idea, but only this time.

2008 Me: Why is this in the "women" section? Men don't want to read about kinky sex? Please.

I'm guessing men are less likely to consider it kinky, outrageous, or naughty. But what the fuck do I know (pun intended)?

2008 Me: In conclusion, the article seems to be designed to be provocative, but semantically null.

I guess that was me, waking up to the practices of major information outlets.

2008 Me: What happened to investigative journalism? Hell, what happened to comprehensive news stories?

Gods, 2008 Me was so young and naïve.

2008 Me: ...an excuse to link the blog of a friend of mine...

Said blog no longer exists, and I have no recollection of who the friend was now.

2008 Me: Journalism may not be dead yet, but it's starting to wander and stink.

Dead now. Mostly.

2008 Me: I blame bloggers.

Clearly, that was an attempt at irony. The reality was, and is, way more complicated than one single reason, as these things usually are. I'm not getting into it here, and I'm probably wrong, anyway. But this look into the far-distant past has been enlightening, and maddening. Still, one constant that hasn't changed, and was an old constant even in 2008: sex sells. And, apparently, kinky sex sells more.
December 14, 2024 at 9:32am
#1081148
Appropriately enough, the first entry after the completion of my five-year daily blogging streak is from Cracked:

    How the Tomato Became Torn Between the Lands of Fruits and Vegetables
A confusing, red, plant-based chimera


Right, because the most important characteristic of a tomato is which category we pigeonhole it into. But, okay, I'll play along.

I don’t know what it is about the fruit-versus-vegetable designation of a tomato that I find so particularly annoying, but it twists in my brain like a knife.

That sounds serious. Maybe, instead, take a break and think about Pluto for a while.

As it sits today, the tomato is indeed, botanically a fruit. At the same time, it is legally a vegetable...

Yes, and my mom was, to me, my mother, but to my dad, she was his wife. So?

First, let’s stick to the science, which decidedly declares a tomato a fruit according to botanical guidelines.

Well, botanically, it's a berry. And, according to botany, strawberries, raspberries, and blackberries are not berries. Why this would matter to anyone trying to fix dinner or dessert, though, is beyond my comprehension.

Where the other side of the argument comes from is the culinary world, the place where most people are interacting with tomatoes on a daily basis. It’s also the dominant layman’s classification, probably due to the fact that it’s based in common fucking sense.

Here's where I usually rant about how common sense is usually wrong and needs to be superseded by science. But the classification of a tomato isn't like studying what its nutritional characteristics are. Categories and classifications are imposed from the outside and are supposed to help us make sense of the universe, like what the definition of "planet" or "mammal" should be. Then something like a platypus comes along to remind us that the universe fundamentally doesn't make sense and we shouldn't expect it to. Point is, we could just as well say "any topping on a Big Mac is officially a vegetable," which might settle the tomato question once and for all, but move the discussion to whether cheese should be called a vegetable or not.

And yet, no less of an authority than the Supreme Court has ruled differently. Unsurprisingly, it’s money-related, specifically to do with tariffs. In the late 1800s in America, the taxation on fruits and vegetables was starkly different. Fruits could be imported with impunity, while bringing in foreign veggies would demand a steep 10-percent tariff. An importer named John Nix saw opportunity in the science, and refused to pay tariffs on a shipment of tomatoes, since they were technically fruits. The case climbed all the way to the Supreme Court, where it was heard in 1893.

It also should come as no shock that, in some cases, a thing can be categorized as one thing in one context, and another thing in other contexts. Like, astronomers consider any element that's not hydrogen or helium to be a "metal." That works for astronomy. It doesn't work for structural engineering.

As I read it, the Supreme Court agrees with the people, issuing the legal equivalent of “sure, technically, but come on, dude.”

Leaving aside for the moment that botanists and biologists are also (usually) people, all that means is that, in the US, tomatoes are vegetables by legal definition. I vaguely remember some nonsense a while back about whether ketchup, which doesn't have to be made from tomatoes but usually is, should also be considered a vegetable for the purpose of school lunch nutrition or something.

Left unsettled, then, is still the question of whether a hot dog (with or without ketchup) is a sandwich, and I maintain that no, it's a taco.
December 13, 2024 at 6:26am
#1081102
1827.

No, I'm not referring to the year. 1827 is what you get when you multiply 365 by 5, and then add 2.

Yes, today is not only Friday the 13th, but it's also the day I claim a five-year daily blogging streak, having shat an entry out every single day between December 14, 2019, and today: one thousand eight hundred twenty-seven entries. (There were two leap days in there, hence the "add 2.")
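For what it's worth, the arithmetic holds up; here's a quick sanity check using Python's standard datetime module, with the dates taken straight from this entry:

```python
from datetime import date

# First entry of the streak: 2019-12-14. Last: 2024-12-13 (today).
# Both endpoints count, hence the "+ 1".
first = date(2019, 12, 14)
last = date(2024, 12, 13)
entries = (last - first).days + 1

# Two leap days fall in the range (Feb 29 2020 and Feb 29 2024),
# which is where the "multiply 365 by 5, then add 2" comes from.
assert entries == 365 * 5 + 2
print(entries)  # 1827
```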

Granted, they weren't all great entries. Some of them were probably not even very good. But I put thought and effort into each of them, and I really did do one every day (as defined by WDC time, midnight to midnight); we're not set up here to release entries at some scripted time, or to make up for lost days.

But that's the limit. There will be no six-year blogging streak, at least not in this item. With fewer than 100 entries left in its capacity, the end looms like a kaiju over Tokyo. I thought about ending it today, but nah. Or maybe on the solstice, because that would be appropriate. Or on December 31, because the very first entry was on a January 1. No, I think I'll make the attempt to continue until entry #3000, and then... hell, I don't know. Take a break? Start a new one? Retire from writing? I haven't decided yet, and, knowing me, won't decide until the very last possible minute.

Well, I promised something different today, and there it is: a great big brag. Tomorrow, I'll be back to my usual humble self. Hey... stop laughing.
December 12, 2024 at 9:23am
#1081071
Another day, another book ad. But an interesting book ad, this one from Big Think. I promise something different tomorrow.

     A bold challenge to the orthodox definition of life
In “Life As No One Knows It,” Sara Imari Walker explains why the key distinction between life and other kinds of “things” is how life uses information.


I'm not going to weigh in on whether she's right or wrong, or somewhere in between. That's above my pay grade (not that that's ever stopped me before). I do think it's an interesting approach that adds to the conversation of science, even if it's ultimately a categorization issue, like the planetary status of Pluto or the sea status of the Great Lakes.

Sara Imari Walker is not messing around. From the first lines of the physicist’s new book, Life As No One Knows It, she calls out some big-name public intellectuals for missing the boat on the ancient, fundamental question, “What Is Life?”

I'm not sure how ancient, or fundamental, that question really is. With regard to humans and other animals, our distant ancestors could pretty much figure out the difference between life and not-life. With plants, it may have been a bit trickier, as they tend to not move even when they're alive. But I think Jo Cavewoman would scoff at the question. Dog: life. Rock: not life. (Yes, I'm aware that belief in animism might counter what I just said, but I'm talking in generalities here.)

It probably took until we started looking through microscopes that we began to question the boundaries. Is a spermatozoon "life?" How about a virus?

Since then, it's my understanding that people have proposed several different definitions for life, all necessarily based on conditions on Earth, and scientists and philosophers have been arguing ever since, as scientists and philosophers love to do.

Subtitled The Physics of Life’s Emergence, one of the book’s major themes is a critique of the orthodox view in the physical sciences that life is an “epiphenomenon.”

"Epiphenomenon" is another word with a kind of slippery definition. I don't like to quote dictionaries as sources, because they're descriptive and not prescriptive, but the definition I found was "a secondary effect or byproduct that arises from but does not causally influence a process." Which, well, thanks? That doesn't help.

The Wikipedia article on it is similarly confusing, at least to me, with the added bonus of also coming from a source people don't like to cite.

What's worse, in my view, is when people conflate "epiphenomenon" with "illusion:"

This is the argument, often heard in mainstream popular science, that life is a kind of illusion. It’s nothing special and fully explainable by way of atoms and their motions.

To address the latter assertion first: "nothing special" is a value judgement, and "fully explainable" is laughable hubris.

As for the "illusion" thing, well, I've banged on in here on several occasions against the "time is an illusion" declaration. But that can be generalized to anyone airily calling anything an "illusion." To me, an illusion is something that, upon further study, goes away: a stage magician's trick, or those seemingly moving lines in a popular optical illusion picture. But no matter how much we study, for instance, time, it doesn't go away. I think a better description would be "emergent phenomenon," meaning that it's not fundamental, but rather a bulk property. Like temperature. One atom doesn't have a temperature; it only has a vibration or speed or... whatever. Get a bunch of atoms together, though, and the group has an average speed, which we read as temperature.
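The temperature analogy can be made concrete with a toy calculation. For a monatomic ideal gas, the average kinetic energy per atom is (3/2)·k_B·T, so the temperature of a group of atoms is just a scaled average: T = 2⟨KE⟩/(3·k_B). A minimal sketch (the helium-atom speeds here are made up purely for illustration):

```python
# Toy illustration: temperature as a bulk property of many atoms,
# not a property of any single atom.
K_B = 1.380649e-23        # Boltzmann constant, J/K
M_HELIUM = 6.6464731e-27  # mass of one helium-4 atom, kg

def temperature(speeds_m_per_s, mass_kg=M_HELIUM):
    """Temperature implied by the mean kinetic energy of a group of atoms.

    Monatomic ideal gas: <KE> = (3/2) k_B T, so T = 2 <KE> / (3 k_B).
    """
    n = len(speeds_m_per_s)
    mean_ke = sum(0.5 * mass_kg * v**2 for v in speeds_m_per_s) / n
    return 2 * mean_ke / (3 * K_B)

# One atom has a speed; only the group has a temperature.
print(temperature([1300.0, 1400.0, 1500.0]))  # a few hundred kelvin
```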

Or, to use everyone's favorite example, the chair you're sitting in. "It's an illusion," some philosophers claim (generally after taking a few bong hits). "It's not real." Well, look, any philosophy that doesn't start with "the chair is real" is a failure, in my view. Your ass isn't sinking through it; therefore, it's real. Sure, it's made of smaller pieces. On the macro scale, it's got a seat, probably a back (sometimes in one continuous piece), maybe arms, legs and/or casters, maybe a cushion for said ass. This doesn't make the chair any less real; it just means there's a deeper level to consider.

Similarly, the cushion, for example, is usually a fabric stretched over some stuffing. The fabric itself can be further broken down into individual fibers. The fibers, in turn, are made of molecules, some of which have a particular affinity for one another, giving the fiber some integrity. The molecules are made of atoms. The atoms contain electrons, protons, and neutrons. Those latter two, at least, can be further broken down until you're left with, basically, energy. And maybe there's something even more fundamental than that.

None of that makes the chair any less real. It just shows that our understanding can go deeper than surface reality. But surface reality is still reality.

And so it is with life. I know I'm alive, for now, and that's reality. I'm pretty sure my cats are, too, and the white deer I saw munching on leaves in my backyard yesterday. Not so sure about the leaves, it being December and all, but I am as certain as I can be of anything that they are a product of life.

Whew. Okay. Point is, I'd like to see these macro-level phenomena labeled something other than "illusion." It's misleading.

In the standard physics perspective on life, living systems are fully reducible to the atoms from which they are constructed.

Yeah, well, physics gonna physic. Just as with your chair, things can be studied at different scales. Biology is usually the science concerned with life. But biology is basically chemistry, and chemistry is basically physics. This doesn't make biology an illusion, either.

Still, they will argue, nothing fundamentally new is needed to explain life. If you had God’s computer you could, in principle, predict everything about life via those atoms and their laws.

I'm gonna deliberately misquote James T. Kirk here: "What does God need with a computer?"

Walker is not having any of this. For her, the key distinction between life and other kinds of “things” is the role of information.

Well, that's amusing. Not because it's not true—like I said, I'm not weighing in on that—but because from everything I've read, physics is moving toward the view that everything is, at base, information. Yes, that might be what energy can be broken down into. Or maybe not. I don't know. But "information theory" is a big deal in physics.

Whether there's something even more fundamental than information, I haven't heard.

Life needs information. It senses it, stores it, copies it, transmits it, and processes it. This insight is, for Walker, the way to understand those strange aspects of life like its ability to set its own goals and be a “self-creating and self-maintaining” agent.

Okay. Great. Let's see some science about it.

As usual, there's more at the link, if you're interested. Might want to sit down for it, though. You know. On that chair which is definitely real and hopefully not alive.


© Copyright 2025 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
