Rated: 18+ · Book · Personal · #1196512
Not for the faint of art.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
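To make that last point concrete, here is a minimal sketch of the simplest such transformation, z → z² + c, the iteration behind the Mandelbrot set. This is my own illustration (plain Python, no external libraries), not part of the original description, and treating "still bounded after 100 steps" as "inside the set" is an approximation.

def stays_bounded(c, max_iter=100, escape_radius=2.0):
    z = 0 + 0j
    for _ in range(max_iter):
        z = z * z + c                  # the "very simple transformation"
        if abs(z) > escape_radius:
            return False               # escaped: c lies outside the set
    return True                        # still bounded: treat c as inside, for this sketch

for c in (0 + 0j, -1 + 0j, 1j, 1 + 1j):
    print(c, "stays bounded" if stays_bounded(c) else "escapes")

Sweep c over a grid instead of four sample points, color each point by how quickly it escapes, and the familiar fractal boundary appears.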




January 4, 2025 at 8:13am
#1081935
The article I'm featuring today is old in internet terms: nearly six years. It only popped up on my radar recently, and I have to admit, when I saw the headline, my knee-jerk reaction was close to rage.

Men Have No Friends and Women Bear the Burden
Toxic masculinity—and the persistent idea that feelings are a "female thing"—has left a generation of straight men stranded on emotionally-stunted island, unable to forge intimate relationships with other men. It's women who are paying the price.


I guess "rage" is one of the emotions we're allowed to feel. In this case, it was mainly for two reasons: the overgeneralized headline and subhead; and how a "man" problem is turned around to be a "woman" problem.

Then I remembered that the target audience for Harper's Bazaar is chicks, and my rage subsided somewhat. The goal, I figured, wasn't to fix men, but to commiserate with women.

Which brought my rage back to a simmer.

But okay. Okay. Being a man and all, I choked back all emotion, as we must always do or face social ridicule, and tried to give the article a fair chance.

Kylie-Anne Kelly can’t remember the exact moment she became her boyfriend’s one and only, his what would I do without you, but she does remember neglecting her own needs to the point of hospitalization.

Starting an article with an anecdote is a time-tested way of grabbing a reader's attention. That's fine. What I have an issue with is that, coming as it does immediately after that headline, it makes it sound like Kylie-Anne's problem (and it is, obviously, her problem, not that of her nameless and ultimately irrelevant boyfriend) is just an ordinary relationship scenario, ho-hum, this is what we're all facing, isn't it, girls? Can't you relate?

Kelly’s boyfriend refused to talk to other men or a therapist about his feelings, so he’d often get into “funks,” picking pointless fights when something was bothering him.

I don't mean to sidestep the issue or pretend that this sort of thing isn't a problem. It absolutely is, and it's one reason I don't conform to that version of masculinity. But, and I'll just point this out and leave it here, this guy had a girlfriend and I don't. So, clearly, it works, at least at first.

After three years together, when exhaustion and anxiety landed her in the hospital and her boyfriend claimed he was “too busy” to visit, they broke up.

I've been considering getting a t-shirt with a giant waving red flag on it to wear in public. That way I can skip the small talk and have people avoid me before starting a conversation that reveals whatever internal red flags I project. Saves time and energy that way.

Kelly’s story, though extreme, is a common example of modern American relationships.

Oh, now they admit it's extreme.

Women continue to bear the burden of men’s emotional lives, and why wouldn’t they? For generations, men have been taught to reject traits like gentleness and sensitivity, leaving them without the tools to deal with internalized anger and frustration.

Hey, look at that sneaky use of the passive voice! I suppose the implication is that men are taught exclusively and only by other men: fathers, uncles, brothers, male peers.

Meanwhile, the female savior trope continues to be romanticized on the silver screen (thanks Disney!), making it seem totally normal—even ideal—to find the man within the beast.

Yeah, you know, if it didn't resonate with viewers, they wouldn't keep depicting it. The Mouse has many faults, but this looks to me like another dodging of responsibility. They also, classically, depicted male heroes saving helpless damsels in distress, but that trope, at least, seems to be fading, or at least morphing, because people demand that women save themselves, instead.

Incidentally, though, I do sometimes wonder about a gender-swapped Beauty and the Beast. I'm nowhere near good enough to write something like that, and besides, if I can think of it, it's already been done.

The article continues to describe the problem, and, I'll grant, it's not unjustified. After a while, it finally names an actual man:

So Shepherd turned to the internet, downloaded a men’s group manual, and invited a few guy friends who he knew would be receptive. He capped the membership at eight and set up a structure with very clear boundaries; the most important being what’s talked about in men’s group stays in men’s group.

I'm not saying it's a bad idea by itself. But are we sure that downloading instructions from some site on the internet is really the way to go? At that point, you might as well use the movie Fight Club as an instructional guide. Not to rag on that film; it's still one of my favorites. But it's fiction.

Lots more at the link. My gender-role-approved rage has settled down to a dull feeling of numbness, which is also gender-role-approved. Now to order that giant red flag T-shirt.
January 3, 2025 at 10:35am
#1081896
Today, instead of picking an article at random, I decided to go with this one from Lifehacker, which maps out the year that's (mostly) still ahead of us. Considering the source, though, you might want to double-check any of these events you care about. The few that I did check didn't really track, so I don't trust the rest without verification.

Mark These 2025 Celestial Events on Your Calendar
Here's when to look up for full moons, meteor showers, and planetary parades.


No, I'm not going to repeat all of them. That's what the link is for.

Jan. 3–4: Quadrantid meteor shower. The Quadrantids are active from Dec. 28 to Jan. 12 but are expected to peak around 4 a.m. EST on Jan. 4.

Yeah, this is the main reason I jumped this one out of the queue. I've never had much luck with meteor showers—it's always cloudy when the good ones happen, and every time I travel to a darker place to see one, it ends up being a dud—but your experience may vary.

Jan. 13: Wolf Moon. The first full moon of 2025 has extra appeal, as it will pass close to (almost in front of) Mars.

If I had one wish, like from a genie or whatever, I could, of course, wish for world peace. Or an end to homelessness. Or, to be selfish, a billion dollars. But no, if I had one wish, it would be to end this bogus association of moon names with Gregorian calendar months. That's the crusade I'd choose (mainly because I can see how all those other wishes could be twisted to horrific effect, like ending homelessness by disappearing all the homeless).

The Wolf Moon is not defined as the full moon in January, no matter how many websites and "authoritative" sources say it is. It is the first full moon after the northern hemisphere winter solstice. Yes, this year, those happen to be the same thing. But they are not always.
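For the curious, here's a rough way to check that for yourself. This is my own sketch, not anything from the article, and it leans on two assumptions: the mean synodic month of about 29.53 days, and the January 13 full moon time cited further down (5:27 pm EST, 22:27 UTC) as a reference point. A mean-month estimate can drift by half a day or so, but the calendar month a full moon lands in is usually unambiguous.

import math
from datetime import datetime, timedelta

SYNODIC = timedelta(days=29.530588)        # mean synodic month, in days
REF_FULL = datetime(2025, 1, 13, 22, 27)   # the Jan 13, 2025 full moon, in UTC

def first_full_moon_after(t):
    """Estimate the first full moon after datetime t, stepping in mean synodic months."""
    n = (t - REF_FULL) / SYNODIC           # fractional months between t and the reference
    return REF_FULL + (math.floor(n) + 1) * SYNODIC

for year in (2023, 2024):
    solstice = datetime(year, 12, 21, 12)  # rough December solstice; good to about a day
    print(year, "solstice ->", first_full_moon_after(solstice).date())

Run it and the first full moon after the 2024 solstice lands on January 13, 2025, while the first one after the 2023 solstice lands in late December 2023: same rule, different calendar month, which is exactly why pinning the name to January doesn't hold up.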

This, as usual, maddened me so much that I almost missed the cool part about Mars. And I also almost missed this:

The red planet will appear to disappear behind the moon at 9:16 p.m. EST and reappear at 10:31 p.m. EST.

I'll give "appear to disappear" a pass, but the earlier line said "close to (almost in front of) Mars," while this one implies the Moon will occult Mars entirely.

This. This is why people don't trust Lifehacker.

So, to break this down a bit:

A Moon/Mars conjunction occurs just about every lunar month. Because most solar system bodies orbit at a slight tilt to each other, eclipses don't happen at every conjunction. This is true for the Sun/Moon conjunction, which is why we don't get a solar eclipse every month, but also for when the Moon appears closest to any given planet. Plus, sometimes, it happens when Mars is in the daytime sky and we can't see it.

Additionally, as with solar eclipses, the timing of the event varies with location, because of parallax. So if you're going to say "9:16 pm EST to 10:31 pm EST," you also have to note the location on Earth where that's true, and it's not "the entirety of places that observe Eastern Time."
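To put a number on that, here's a quick back-of-the-envelope sketch, mine, using round textbook values rather than anything from the article:

import math

EARTH_RADIUS_KM = 6_371         # round values; good enough for a sanity check
MOON_DISTANCE_KM = 384_400      # average Earth-Moon distance
MOON_RADIUS_KM = 1_737

# Maximum parallax: how far the Moon appears to shift against the background
# (and against Mars) between observers separated by a full Earth diameter.
max_shift_deg = 2 * math.degrees(math.asin(EARTH_RADIUS_KM / MOON_DISTANCE_KM))

# The Moon's apparent width, for comparison.
moon_width_deg = 2 * math.degrees(math.asin(MOON_RADIUS_KM / MOON_DISTANCE_KM))

print(f"max parallax shift:    ~{max_shift_deg:.1f} degrees")   # about 1.9
print(f"Moon's apparent width: ~{moon_width_deg:.1f} degrees")  # about 0.5

A shift of nearly two degrees is roughly four Moon-widths, which is plenty to change when, and even whether, the Moon covers Mars from a given spot.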

Now, a quick glance tells me that other sources give different timings for the eclipse (I guess it should be considered a Martian eclipse), but the January 13 date appears to be correct for both Full Moon and Martian Eclipse. Remember when I said eclipses don't happen at every conjunction? Well, one happening at a Full Moon is remarkable, both for its rarity and spectacle. Of course, it's not happening precisely at the Full Moon (5:27pm EST), but close enough for spectacle.

One wonders if Mars will even be visible against the glare of a Full Moon. I guess we'll find out. Well, other people will find out; something this rare and mars-velous (I couldn't resist) practically guarantees that, wherever I am, the sky will be covered in a thick blanket of clouds.

Nevertheless, I've noted it on my calendar. I've been known to beat the odds before, including during two solar eclipses, and I would very much like to see this.

Well, that took up way more space (pun also intended) than I expected. I'll just throw in some highlights from the rest of the calendar year 2025 skywatch forecast:

February's main event is a planetary parade, when the planets appear to be in one line in Earth's sky. The parade actually begins on Jan. 10 when the Moon joins up with Jupiter and continues through February.

See, again, misleading. The planets will appear to line up, sure, but halfway through that period, the Moon will be on the other side of the sky.

There were a few consecutive nights maybe 30 or so years ago when all the visible planets lined up in one quadrant of the sky. Or at least most of them; it's been 30 years and I barely remember the details. What I do remember is going up to the Blue Ridge Parkway on one of those nights, with some friends and a telescope. It wasn't cloudy. But it was colder than my ex-wife's lawyer's heart. As this is happening in January and February, well, it'll be cold again here.

Saturn will drop off mid-month, but tiny Mercury will be barely visible in the parade on Feb. 28.

And maybe, just maybe, I'll finally be able to see Mercury and know that it's Mercury. I've bitched about that before. I might have seen it at some point in the past, but it's only ever visible just after sunset or just before sunrise, and at those times, the twilight washes out a lot of contextual stars. So I don't know if I've ever actually seen Mercury in the sky.

As a side note, last night I got treated to a very bright post-sunset crescent Moon not far from a very bright Venus. It was cool. But I expect that tonight, the Moon will appear even closer to Venus and it'll look even more awesome.

Provided the clouds don't roll in.

March 14: Total lunar eclipse... Though the total lunar eclipse will be visible around the world, the full 65-minute totality will only happen in the Americas and Antarctica.

Nice for our continent. Unless it's cloudy. I fully expect it to be clear everywhere but Virginia that night, unless I go to, say, Wyoming to see it, in which case it'll be clear everywhere but Wyoming.

Saturn will be at opposition on the night of Sept. 21. Just like Mars in January, this event will show Saturn at its brightest, visible to the naked eye.

To clarify, if Saturn and Earth are in the right alignment such that Saturn appears at night, that planet is always visible to the naked eye. Doubt we'll be able to see the rings without aid, but even binoculars might be enough to make out that distinctive feature of the planet... provided, of course, that the rings aren't edge-on, which I don't know. And if it's not cloudy.

The rest is mostly meteor showers, which, as I said, are cool, but I've never had much luck viewing them. Maybe this year will be different, but I say that every year. And there are a few "supermoons," too, toward the end of the year. Again, though, that just describes a Full Moon near perigee, which happens every year and, I think, doesn't deserve all the hype attached to it.

Still, if it gets people to look up, I'll allow it. Just stop with the association of Full Moon names with Gregorian calendar months.
January 2, 2025 at 9:02am
#1081847
From aeon, an article aged over seven years:



Oh, a new word, eh? Cool, cool. This is an essay first published in 2017, so surely that word's been spread to the farthest reaches of the travel community by now, right?

One thing I’ve noticed over the years of bringing my students to Ireland – my homeland – is that they pay rapt attention to the little things. This heightened and delighted attention to the ordinary, which manifests in someone new to a place, does not seem to have a name. So I have given it one: allokataplixis (from the Greek allo meaning ‘other’, and katapliktiko meaning ‘wonder’).

I'm not really mocking. I've invented dozens of words that never caught on (and one that did in ways I could never have anticipated). Okay, but I'm also mocking, a little, because there's no fucking way a word like allokataplixis would ever go viral, except maybe if it were the name of a new penis-enlarging drug. Even then, we'd shorten it to allo, which could get confusing.

For the past five years, I have travelled around Ireland each summer with a bunch of allokataplixic American kids.

And there it is: the adjective form.

Marvellous to them also is the slight smell of salt in the air when you arrive in Dublin, the raucousness of seagulls crying overhead... [loads of poetic imagery] ...the sun setting on the Atlantic viewed from the beaches of the west, the melancholy slopes in County Kerry that were abandoned during the famine.

Well, yeah. What's ordinary to locals is often fresh and exciting to visitors. It's not just Dublin (never been, but want to go), but almost any place you're not used to. Like how millions of New Yorkers pass by the Empire State Building without admiring its art-deco grandeur.

This is why some of us travel: to find the beauty in the mundane, to see it with new eyes and, maybe, pass along some of that newness to those jaded by familiarity with it.

Yet over the years that I’ve been bringing students to Ireland I’ve observed that their thirst for fresh experience is contagious. It oftentimes brings out the best in people. A tourist generally has an eye for the things that, through repetitive familiarity, have become almost invisible to the resident.

It can also bring out the worst in people.

One does not need, however, to be an outsider or a tourist to be allokataplixic. Is it not the task of most writers to awaken us from the dull, the flat and the average sentiments that can dominate our lives? Many of the Irish writers that my students read before travelling have a knack for noticing the marvellous in the everyday, and of making the quotidian seem wholly other and amazing.

Just in case you were wondering if this had anything to do with writing.

I don't take issue with the general ideas in the article (there's even a foray into the fractal, which is always like candy to me). It's just... that word. You'd think we could come up with something better, something with fewer syllables, something less pretentious than an obscure phrase from Ancient Greek.

You'd think so, but I'm stumped.
January 1, 2025 at 9:55am
#1081789
Well, now that that's over, let's get back to it. What better way is there to start a new calendar year than by pointing out a mistake made in the previous calendar year? A bit from Ars Technica:

Journal that published faulty black plastic study removed from science index
Chemosphere cut from Web of Science, which calculates impact factors.


Some people might not have noticed the black plastic crisis. I didn't see anything about it until the retraction, myself, so I was less prone to primacy bias.

This article goes beyond one single retraction, but I'll point this out anyway: usually, people hear about the study through some breathlessly urgent reporting by someone trying to be first out of the gate, and then the retraction happens... and radio silence ensues, leaving people believing the first report. Worse, some people (exhibiting the aforementioned primacy bias) do hear about the retraction, but the debunked original stays in their brain.

Rarer is the case where an entire journal faces consequences for publishing shoddy studies.

The publisher of a high-profile, now-corrected study on black plastics has been removed from a critical index of academic journals after failing to meet quality criteria, according to a report by Retraction Watch.

If you've been lucky enough to avoid the whole made-up controversy, this article does a fair job explaining the timeline of events. It's there if you want to read it.

However, it gets worse.

It appears that the people responsible for the original, retracted study on black plastic kitchenware did make a math error. This is bad enough, as it contributes to primacy bias, though anyone can make math errors or other mistakes (which is one reason you have peer review in science). But the worst part is, it looks like the authors of the original study had an Agenda:

The statement says that, regardless of the math error, the study still found unnecessary flame retardants in some products and that the compounds can "significantly contaminate" those products.

That is not science. That is opinion contaminating science. It's like if the Committee for Bug-Free Food found that 1% of the contents of canned tomatoes was bugs (there is, as I understand it, a maximum allowable bug level in food, as attempting to remove all insect parts reaches a point of diminishing returns, but I can't be arsed to research what it is), but then said they misplaced a decimal point and it's actually 0.1%—and then still insisted that there are still bugs in the food and so canned food should be avoided at all costs.

Usually, the next thing you find out is that the Committee for Bug-Free Food is in the employ of someone with a vested interest in selling their own line of (more expensive) bug-free food.

Now, I'm not weighing in on whether you "should" use these plastic utensils or not. It's not an issue of grand global importance, the way the Wakefield disaster was, and still is. I just think any such decision should at least take the science into account. The actual science, not the one with math errors and strongly-held opinions.
December 31, 2024 at 9:32am
#1081753
Well, here we are at the end of another trip around the sun (or close enough), at a purely arbitrary date on a purely arbitrary calendar.

I had what I consider to be the beginning of the new cycle ten days ago, on the solstice. If anything should mark a transition on what's really a continuum, let it be something real and measurable. But most of the world uses the Gregorian calendar for recordkeeping and consistency (it is, I'll grant, remarkably good at calculating solar returns), and it's one of the few things most of us share. So, we want to impose meaning on December 31 / January 1? Fine. Impose it. At least it's usually celebrated with two of my favorite activities: drinking, and staying up late.
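That parenthetical comes down to the leap-year rule. A quick sketch of the arithmetic, mine, using the standard figure of about 365.2422 days for the tropical year:

# Gregorian leap-year rule, and the mean calendar year it produces.
def is_leap(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

leap_days = sum(is_leap(y) for y in range(2000, 2400))   # 97 leap days per 400-year cycle
mean_year = 365 + leap_days / 400                        # 365.2425 days

print(leap_days, mean_year)
print(mean_year - 365.2422)   # ~0.0003 days/year: about one day of drift in ~3,300 years

One day of slippage every three millennia or so is, for a civil calendar, remarkably good.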

Now, I don't usually make resolutions. I think they're artificial, and set the resolutor (or whatever) up for failure. Besides, it's a bandwagon thing, and I hate bandwagon things.

Another bandwagon thing is the concept of Dry January: the idea of abstaining from ethanol during that calendar month, perhaps in penance for December's overindulgence. This concept legitimately offends me, and I don't get offended easily. In past years, I've simply ignored it and gone on doing what I usually do.

Thing is, contrary to popular belief (that I promote), I don't actually drink every day. Usually once or twice a week. More, perhaps, when I'm on a trip, but only if I'm not subsequently driving. But this New Year's, I've decided to throw personal tradition into the trash and actually make a (gasp) resolution:

Be it hereby resolved that, in protest of the abominable concept of Dry January, Waltz is determined to drink an alcoholic beverage every day during the calendar month of January. This could be a cocktail, a shot of tequila, a dram of scotch, a tot of rum, a bottle of beer, a glass of wine, or the equivalent. More than that is acceptable. Less is not. I do reserve the right to deliberately fail at the resolution in the case of illness or severe injury that requires painkillers, but apart from that, no excuses.

Don't get me wrong: anyone who genuinely wants to stop drinking, temporarily or permanently, as a resolution or otherwise, has my best wishes. This is for me. I just have major issues with following a crowd or participating in what's probably little more than prohibitionist propaganda.

We'll see how it goes. Knowing me, I'll fail at it like most of us fail at resolutions.
December 30, 2024 at 9:40am
#1081711
For my last link of 2024 (I expect to do a personal update tomorrow, New Year's Eve), we have a two-and-a-half-year-old article from a source I don't think I've linked before, Planetocracy. I don't know anything about the site besides this one article.

Objections to Mars Colonisation
A summary of bad arguments that keep being repeated


Judging by the spelling in the headline, this author isn't from the US. Which is fine; it's good to have non-US perspectives on space.

When people who are either uninterested in space colonisation, or actively opposed to it, comment on the prospects of travelling to Mars they often make the same or very similar arguments, unaware that these arguments were either ill-formed to begin with or have already been convincingly refuted.

On the flip side, Kelly and Zach Weinersmith wrote some compelling arguments against it in a book that, if I recall correctly, came out after this article. I don't usually mention books directly here, but I'll make an exception: A City on Mars (Amazon link)

“Humans can’t live on Mars because it lacks a magnetic field”

Remember, that's an argument the author claims is refuted. But on the basis of this article alone, I'm not convinced.

The lack of such a field on Mars is by no means a showstopper for colonisation, however. The main protection we have from cosmic rays on the surface of Earth is not our magnetic field, it is our atmosphere.

Okay, and Mars has a much thinner atmosphere. On the plus side, it's further away from the source of most radiation, the Sun. But it seems to me that building habs underground (using robot diggers to start with) would mitigate a lot of the radiation hazard.

Replenishing Mars’ atmosphere, if necessary, would thus be fairly trivial.

Even with the math leading up to it, I find that statement questionable.

“Humans can’t live on Mars because terraforming is impossible/impractical/takes too long”

I'd hesitate to agree with "impossible," because engineers can be pretty smart, and technology continues to advance. I don't think we could do it right now, though. Hell, we can't even terraform Terra.

“Humans can’t live on Mars because they have to live underground and would go crazy”

I'd think a lot of that depends on available space and other factors.

“Humans can’t live on Mars because perchlorate in the soil will poison them”

That, however, has an engineering solution, so I don't think it'd be a dealbreaker.

There's much more at the link. Mind you, I'm not advocating for the author's position, or for its opposite; I don't have the information needed to support or refute these arguments by myself. If we really wanted to colonize Mars—as opposed to visiting it, or doing short-term stints on it—most of these are problems with engineering solutions. The question, then, is going to be: is it worth it?

And I don't just mean monetary return, but also scientific and engineering advances. Unless we find life (by which I mean microbial life or its equivalent) there, only two reasons to do it stand out to me: 1) as a stepping-stone to asteroid mining; and 2) as a hedge for the human race against an Earth catastrophe. Right now, I don't think we're anywhere close to making either of those things practical.

Even if we did find life there, I don't think it would require a permanent colony to study.

There is, however, one compelling reason to do what we need to do to make it happen, which is: because it's there. The one argument against it that I see most often is some variant of "why spend money on space when we need it to fix things down here," which I find disingenuous. It's not like we're "throwing money into space." It's like the proponents of that argument think we just bundle a bunch of bills into the payload of a rocket and shoot it into the Sun (which, by the way, would take a lot more money than sending it to Mars would). All that money goes back into the economy, creating jobs and helping to develop new technologies.

Besides, we could fix things down here, even with an active space program. But we don't.
December 29, 2024 at 8:07am
#1081680
Today's dip into the murky well of the past takes us back over five years, before the beginning of my current daily blogging streak (but not much before), to an entry for 30DBC: "None More Black".

The prompt was: What is your favorite color? Do you have a favorite color pairing? What’s something in your life that you picture when you think of your favorite color? Do you choose to wear clothing that is your favorite color? Has your favorite color changed over your life?

Now, we all know I have a lousy memory, but I'm pretty sure I remember that when this prompt appeared, lo these many years ago, I read the first question and immediately thought of the "Questions Three" from Monty Python and the Holy Grail.

Quoting certain movies and/or books has that effect on me. One cannot mention the number 42 without evoking my memories of Douglas Adams' Hitch Hiker's Guide, for example. I see a Twinkie, and the Twinkie scene from Ghostbusters plays in my head. Sometimes, I can suppress the urge to blurt it out, especially when I'm writing and can deliberately avoid the subject.

With this one, however, I managed to suppress the Python reference only to get caught up in a Spinal Tap reference. Luckily, the gif I found for "none more black" is still active after all this time. I'll post it again for anyone unwilling to go see the earlier entry:



There followed a short treatise on the beauty of black, and really, not much has changed for me in that regard since then.

And the film This Is Spinal Tap (released 40 years ago) continues to be a cultural touchstone.
December 28, 2024 at 8:53am
#1081623
SciAm takes on a spelling challenge. Or, it did, nine years ago. Well, actually, it was an opinion piece even then.

The Difference between ExtrAversion and ExtrOversion
What's the correct spelling: ExtrAversion or ExtrOversion?


Let's find out what they say, then, if we can crawl out of our introvert holes long enough to give them a look.

Jung may be rolling in his grave.

Lots of people are. I propose wrapping them in copper wire, installing some magnets, and turning them into power generators.

Folklore has it that when Carl Jung was once asked which was the correct spelling—ExtrAvert or ExtrOvert—Jung's secretary wrote back something like, "Dr. Jung says it's ExtrAverted, because ExtrOverted is just bad latin."

That's rich, claiming folklore in a story about Jung.

The thing about Latin is, as a dead language, all of its rules are set in stone. When people actually spoke it, though, it was widespread enough that it changed over time, and we, at some point, decided that the Latin in administrative use around Julius Caesar's time (if I recall all this correctly) was the Latin, and the usages and spellings were calcified. In reality, people went on speaking and writing it, and it eventually morphed into Italian, French, Spanish, etc.

If it helps, the French translation of the adjective extraverted is extraverti(e).

But. This is English. A very widespread living language, subject to change and regional variations. What's the correct way to spell humour? Gray? Tire? Kerb? Once something gets set loose in the public, at some point, it stops being a mistake and starts being a variant. Yes, sometimes I rail against those variants, but I have to remind myself that I'm witnessing a linguistic shift as it occurs.

It's always a mistake to use it's as a possessive pronoun, though.

One of the first times Carl Jung introduced the term is in 1917, in his book "Die Psychologie der Unbewussten Prozesse", he spelled it "ExtrAvert". Exhibit A (ha ha):

You'll have to go to the link to see the example, because it's a graphic, but please note that it's in German. English is about as much German as it is French, and neither language controls English spelling.

So why do so many people spell it ExtrOversion today?

At this point in my first reading of the article, I took a wild guess: to conform better with the spelling of its antonym, introvert.

The article then dates the English "o" spelling to one Phyllis Blanchard in 1918, which, as you might note, is but one year after Jung's book above. It also specifies American English, which, as we all know from the above examples I provided, need not conform to other Anglophone countries' spelling.

Not only did she change the spelling of the word, but she also changed the definition!

Definitions, too, change over time and culture. You know what the French translation of the English verb "to request" is? "Demander." This is, of course, cognate with another word in English that has a much stronger connotation than "request."

What I think probably happened is that she was translating Jung and used the "extro" form to imitate the "intro" form for symmetry.

That is, as the author admits, a guess. But it was my guess, too.

We now know that there are five fundamental dimensions of personality (extraversion, neuroticism, conscientiousness, agreeableness, and intellect/imagination), each one on a continuum.

Yeah, as with any other psychological "knowledge," this, too, is subject to alteration over time. But usually, it changes in a somewhat more technical manner than spelling or everyday word usage.

Under this framework, extraversion is defined as being outgoing, sociable, expressive, and assertive. Introversion is defined as the opposite of extraversion (reserved, quiet).

Anyone who's met me can tell you I'm not very reserved or quiet, and yet I don't identify as an extrovert.

Why does this matter? Trust me, I'm not usually this pedantic.

But I am. Being an introvert and all.

Maybe a solution is to just have both spellings in existence, but define the terms differently.

Oh, gods, no, no, no, please don't. I went down a rabbit hole recently on the difference between kluge and kludge (in the process discovering that neither one is actually of Yiddish origin, much to my disappointment), and, well, let's just say I left the rabbit hole even more confused than when I fell into it.

How about instead we bury ExtrOversion once and for all, and all embrace the same spelling to honor Jung.

The author, in passages I didn't quote, appears to be a massive fan of Jung. I can kind of understand this (he was better than Freud, at the very least), but, again, he wrote in German. Which is a fine language, but, as I said, has no direct connection to English (which I've come to understand as a mature creole of earlier forms of French and German).

Fortunately, the author ends with a sentiment I can certainly support:

I do believe it's helpful for scientists to listen to the experiences of individuals, but I also think it could be helpful for individuals to listen to the latest science.

Science, however, does not and should not dictate word spellings. Hell, they can't even dictate word usage; "theory," for example, means something completely different to a scientist than it does to an ordinary person. And yet, I'm going to continue to spell it "extrovert," because it's totally acceptable in English, if not Latin (or German). If you're more familiar than I am with the word in other Anglophone countries, or even in other languages, feel free to chime in.
December 27, 2024 at 8:29am
#1081585
From SciAm, an opinion piece that seems to align with my own opinions:

A Science Breakthrough Too Good to Be True? It Probably Isn’t
The more exciting, transformative and revolutionary a science result appears, especially if it comes out of nowhere, the more likely it is to be dead wrong. So approach science headlines with a healthy amount of skepticism and patience


Regular readers might have noticed that this is what I try to do.

I'd also add: pay attention to retractions. There are still people stubborn enough to believe vaccines cause autism.

In 2014 astronomers announced a whopper of a discovery: primordial waves from the earliest moments of the big bang.

You know, I have no memory of that one, or its retraction.

Or remember Tabby’s Star? In 2015 astronomers speculated that its strange light pattern might be the product of alien megastructures. Cue media circus, high-profile talks, the works. Further analysis revealed that it was ... dust, again.

That one, I recall fairly well. I wasn't blogging at the time, but I remember reading about it and going, yeah, let's do some more research before jumping to "aliens."

Further analysis revealed that it was ... dust, again.

Or... that's what They want us to think.

More recently, a group of astronomers claimed to find phosphine in abundance in the Venusian atmosphere, proposing that there might be some form of exotic life floating in the cloud tops.

That one's a bit more complicated. In brief, further studies are being done, and some are contradictory. That's okay. That's how we figure things out. Until there's something definitive, though, I think it's completely safe to assume "no life on Venus."

It’s not just astronomy. Neutrinos can travel faster than light. Mozart makes your kids smarter. Dyeing your hair gives you cancer. Smartphones make us stupid.

Very, very stunning. But very, very wrong.


Some of that is wishful thinking or deliberate misinformation. "Mozart makes your kids smarter," for example, sounds like something that music producers might push.

First, much, if not most, scientific research is wrong. That’s why it’s research; if we knew the answers ahead of time, we wouldn’t need to do science.

That's... well, it could be phrased better, I think. Not that scientific research is "wrong," which can easily imply a moral judgement, but that a) some hypotheses turn out to be falsified and b) sometimes scientists reach the wrong conclusion, which is later caught through peer review and replication attempts.

Second, scientists endure perverse incentives to publish as much as possible—to “publish or perish”—and to get their results in top-tier journals as much as possible.

That's a problem, but I wouldn't have the first idea how to fix it.

Lastly, there’s the modern-day hype machine. While many journalists respect scientists and want to faithfully represent the results of scientific research, publishers face their own incentives to capture eyeballs and clicks and downloads. The more sensational the story, the better.

This bit is what I mainly focus on in here, because I'm not involved in science or science publication, but I do try to recognize when a headline or link is deliberately sensationalized.

The more times that the public sees science contradict itself, the less likely people are to believe the next result that makes headlines. And the more times that scientists are loudly, publicly wrong, the more ammunition antiscience groups have in their fight against trusted experts.

This is absolutely a problem. I've used this example before, but it's like how eggs have gone from good to bad to good to bad to maybe okay to maybe not too many to good to bad (and then to way too expensive anyway), all just within my lifetime. Just because nutrition science has major flaws, however, doesn't mean astronomy and physics do.

And let me be clear here. I’m sure you find most, if not all, scientific research absolutely fascinating—as do I. But the more interesting a result is to the wider community, with more headlines, chatter, attention and raised eyebrows, the more likely it is to be worthy of a bit of healthy skepticism.

I think that's a pretty sound generality. Like any generality, it's not always true. One day, perhaps, someone will find unequivocal evidence of life on another world: microbes on Mars or eukaryotes on Europa, or something. That will be a Big Fucking Deal, one of the most important discoveries in human history. However, if the evidence is not unequivocal—as with the Venus phosphine, or, before that, the Antarctic meteorite from Mars—it's all sensation with no confirmation.

The best approach to take with science results, news, and headlines is the same approach scientists use themselves: healthy skepticism.

To be clear, "skepticism" doesn't mean "reflexive disbelief." It's more like looking at it critically while keeping an open mind. Seeing a science headline and immediately disbelieving it on principle is just as bad as immediately believing it because you want it to be true.

Beware big headlines; don’t believe everything you see. But when study after study comes out, building up an interconnected latticework of theory and experiment, allow your beliefs to shift, because that’s when the process of science has likely led to an interesting and useful conclusion.

That, to me, is of great importance. Being stubborn, clinging to what has been disproved or debunked, is not a positive character trait. But then, neither is naïve acceptance of every claim you see.
December 26, 2024 at 9:30am
#1081547
Just about a month ago, I linked an article about going gray, in the entry "A Touch of Gray". The PopSci article proclaimed with great certainty that gray hair can never return to its original color. Well, here's SciAm going "nuh-uh" in 2021:

Gray Hair Can Return to Its Original Color—and Stress Is Involved, of Course
The universal marker of aging is not always a one-way process


And here I thought the universal marker of aging was yelling at kids to get off your lawn.

As we grow older, black, brown, blonde or red strands lose their youthful hue.

Which is why investments in hair dye companies are likely to be lucrative.

Although this may seem like a permanent change, new research reveals that the graying process can be undone—at least temporarily.

Obviously, everything is temporary, especially if you're old enough to have gray hair in the first place.

Hints that gray hairs could spontaneously regain color have existed as isolated case studies within the scientific literature for decades.

Sometimes, all it takes to disprove a hypothesis is a single counter-example. Like if you said "all planets have moons," and then someone showed that Venus is a planet that doesn't have moons, your hypothesis would be broken. In biology, though, things aren't always that clean-cut, and a single counterexample could just be a fluke or a hoax.

In a study published today in eLife, a group of researchers provide the most robust evidence of this phenomenon to date in hair from around a dozen people of various ages, ethnicities and sexes.

While a sample size of around 12 also doesn't do much to make the findings definitive, this phenomenon is apparently rare enough that to expect a bigger sample would be wishful thinking.

It also aligns patterns of graying and reversal to periods of stress, which implies that this aging-related process is closely associated with our psychological well-being.

On this point, the article is in alignment with the one I posted last month.

Around four years ago Martin Picard, a mitochondrial psychobiologist at Columbia University...

Now, that's someone who either hates Star Trek with a burning passion, or takes every opportunity to tell his underlings to "make it so."

...was pondering the way our cells grow old in a multistep manner in which some of them begin to show signs of aging at much earlier time points than others.

I don't really have anything to say about that process or the science behind it; I just wanted to make a Picard joke in an entry about hair.

These patterns revealed something surprising: In 10 of these participants, who were between age nine and 39, some graying hairs regained color.

So, this is highly unlikely to apply to older folks who go gray. Still, research like this might lead to a way to artificially restore hair melanin without dyes. Not that I care. I think I'd look awesome with a gray mane.

Most people start noticing their first gray hairs in their 30s—although some may find them in their late 20s.

That early? This surprises me, though I've known quite young gray-haired people. But one never knows: those who turn gray then could be dyeing their hair, or shaving it off entirely.

The team also investigated the association between hair graying and psychological stress because prior research hinted that such factors may accelerate the hair’s aging process. Anecdotes of such a connection are also visible throughout history: according to legend, the hair of Marie Antoinette, the 18th-century queen of France, turned white overnight just before her execution at the guillotine.

If true, for a queen, that had to add insult to injury.

Eventually, Picard says, one could envision hair as a powerful tool to assess the effects of earlier life events on aging—because, much like the rings of a tree, hair provides a kind of physical record of elapsed events.

Or, you know, a sorcerer could use it to control you. You never know.
December 25, 2024 at 9:01am
#1081497
I've banged on in here many times over what is and is not an illusion. Here's a bit from Popular Mechanics (way back in 2020) about things we know are illusions.

Why We've Fallen for Optical Illusions for Thousands of Years
Mammoth or bison? Rabbit or duck? Ambiguous images have tricked our eyes forever.


Forever, huh? Our eyes haven't been around forever.

Okay, okay, no, I'm well aware of the emphatic connotation of "forever," and it's beneath me to rag on that usage. I did it anyway.

So, because pictures in here are hard to do, you'd need to visit the link to see the oldest known optical illusion: a carved mammoth / bison dated to 14,000 years ago, thus showing that humans have been fascinated by optical illusions, if not forever, for at least 14,000 years.

Like many optical illusions, these images play on the human brain’s urge for context and turn our first impressions upside down.

I say it also plays on our predilection for pareidolia. Much of art does, really; it can be deliberately-induced pareidolia, which is why we recognize the smiley-face emoji as a smiley-face. These illusions deliberately induce pareidolia in more than one way.

The ambiguity itself is the point, and researchers have studied how the regular human experiences and preconceived notions we all carry around influence the way we decide between ambiguous options or fill in missing information.

And there's a metaphor in there, somewhere.

There's not a lot more at the article, which doesn't even acknowledge my hypothesis about pareidolia, preferring instead to talk about brain plasticity and resolution of ambiguity. But it did get me wondering: why don't these illusions elicit the same kind of groaning hatred that their linguistic equivalent, the pun, does?

Maybe because evolution has worked to keep us punsters from contaminating the gene pool.
December 24, 2024 at 2:28am
#1081447
Posting early today because, like many people, I have stuff to do later. In my case, though, the stuff is completely unrelated to tomorrow's holiday.

I've written about Betelgeuse before, most recently here: "Betelgeuse 2". This is, however, a different article, more recent, from Big Think.

This is what we’ll see when Betelgeuse goes supernova
The closest known star that will soon undergo a core-collapse supernova is Betelgeuse, just 640 light-years away. Here’s what we’ll observe.


And already I have Quibbles.

1. "what we'll see." It's extremely unlikely that anyone alive as I write this will see it happen. I'm a gambling man, and I wouldn't bet on it, not unless some bookie was offering billion-to-one odds and I could bet, like, a dollar. The headline uses the same value of "we" as people do when they talk about when "we" will colonize distant star systems (hopefully not Betelgeuse).

2. "will soon undergo." As with "we," they're using a variant value of "soon." Best estimate I've seen is within 100,000 years. That's soon in cosmic terms. It's not soon in human terms. Hell, 100,000 years ago, we'd (entirely different definition of "we" this time) barely started using fire.

3. "640 light-years away." Yeah... maybe. For whatever technical reason (it's been explained to me, but it's over my head), B's distance has been tricky to pin down. Wiki claims 400-600 light years, and that's a huge margin which doesn't even include 640.

I should reiterate here that even if it's at the low end of that scale, astronomers expect no ill effects for any life remaining on Earth when it happens. Of course, astronomers have been known to be wrong, from time to time.

But, okay. Issues with the headline don't always transfer to the actual text. It's just that it's the first thing we see, so getting it right is kinda a big deal. I'm not saying that it's clickbait, but it is a bit sensationalized.

The stars in the night sky, as we typically perceive them, are normally static and unchanging to our eyes. Sure, there are variable stars that brighten and fainten, but most of those do so periodically and regularly, with only a few exceptions. One of the most prominent exceptions is Betelgeuse, the red supergiant that makes up one of the “shoulders” of the constellation Orion.

Hence the title of today's entry.

Over the past five years, not only has it been fluctuating in brightness, but its dimming in late 2019 and early 2020, followed by a strange brightening in 2023, indicates variation in a fashion never before witnessed by living humans.

It is necessary for a human to be living in order to witness anything (metaphysics and religion aside), but I think they mean it's weirder than it's been for the past 100 years or so.

There’s no scientific reason to believe that Betelgeuse is in any more danger of going supernova today than at any random day over the next ~100,000 years or so, but many of us — including a great many professional and amateur astronomers — are hoping to witness the first naked-eye supernova in our galaxy since 1604.

As unlikely as it might be, I've said before that it would be very, very cool if I got to see it. I'm just not betting on it.

Located approximately 640 light-years away, it’s more than 2,000 °C cooler than our Sun, but also much larger, at approximately 900 times our Sun’s radius and occupying some 700,000,000 times our Sun’s volume. If you were to replace our Sun with Betelgeuse, it would engulf Mercury, Venus, Earth, Mars, the asteroid belt, and even Jupiter!

See, those numbers don't hit very well with people, including me. Even comparing the size to our solar system doesn't give us a visceral idea of just how fucking huge that star is (not to mention I'd question the Jupiter orbit thing, because red giants like that just don't have a well-defined surface in the way that we think of the Sun as having one).
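For what it's worth, the quoted figures at least agree with each other, since volume scales as the cube of the radius. My arithmetic, not the article's:

radius_ratio = 900                 # Betelgeuse radius / Sun radius, per the quote above
volume_ratio = radius_ratio ** 3   # volume goes as the cube of the radius
print(f"{volume_ratio:,}")         # 729,000,000 -- consistent with the "700,000,000 times" figure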

This image might help with the size comparison.

Even when it transitions to the more advanced stages of life within its core, from carbon-burning to then neon and oxygen and eventually silicon fusion, we won’t have any directly observable signatures of those events.

Dude, people are easily confused. I get that stars, like people or cats, have a birth, time of existence, and death. As far as we know, though, stars themselves don't harbor life. Yes, the universe is weird, and it's fun to speculate that maybe they do, but "life" in this case is a metaphor for how stars change over time. Calling it life just begs people to misunderstand, deliberately or not, what's meant.

The article goes on to describe what whoever's on Earth when it happens can expect to experience when the event finally occurs. Not going to quote more, but it's pretty interesting, in my opinion. Pay no mind to the "it really happened 640 [or whatever] years ago" thing, though; it's irrelevant except as a way to acknowledge that information has a maximum speed.

Naturally, being science, everything there is based on our best knowledge at this point in time. I'd also expect surprises. But those surprises will only serve to advance the science.

Unless, of course, they're wrong about the "it won't irradiate and sterilize the Earth" thing.

Sleep tight!
December 23, 2024 at 7:31am
#1081423
In my partial listing of intelligent species yesterday, I left out an important one. But my random number generator reminded me today, by pointing to this article from Atlas Obscura:

Can You Outsmart a Raccoon?
Recent studies show just how tricky these trash pandas can be, from opening locks to nabbing DoorDash orders.


Well, that last bit won't happen to me because I don't use DoorTrash. But I have caught those masked marauders in the actual trash. I also once caught one that had opened the door to the house, snuck in, and scarfed down the cat food.

While many other species around the world are in decline, raccoons are actually thriving, and do particularly well in urban areas, says lead author Lauren Stanton, a cognitive ecologist at the University of California, Berkeley.

Okay, that tracks, but... "cognitive ecologist?"

Raccoons are strong—they can push a cinder block off a trash can—and tenacious. The more we do to keep them out, the more skills they learn for breaking in, leading to a cognitive arms race between people and raccoons.

You know how people keep saying that if cats ever got opposable thumbs, we'd be in big trouble? Well, raccoons don't have opposable thumbs, either, but their little paws grip stuff just fine without them.

In 2016, for example, the city of Toronto spent 31 million CAD (that’s about $23 million) on raccoon-resistant waste bins. While they deterred most would-be robbers, certain tricksters had no problem solving the new puzzle. The city has continued to release new versions of the bin, trying to outsmart Toronto’s most persistent trash invaders.

All due respect to our Canadian friends, that right there cracked me up.

The word raccoon can be traced back to the Proto-Algonquian word ärähkun, deriving from the phrase “he scratches with his hands.”

The name came to English more directly from a specific Algonquian language, the one spoken by the Powhatans here in what would become Virginia.

While it’s hard to compare intelligence across species, says Stanton, she says that some recent studies show the neural density of raccoons is “more similar to primates than other carnivore species.”

Also, from what I've been hearing, neither brain size nor neural density is strongly correlated with those traits we call intelligence. Still, there's no mistaking that at least some raccoons exhibit advanced problem-solving skills.

However, we have learned that raccoons, once thought to be solitary, are in cahoots with each other far more than we knew.

Great. Now we have to deal with raccoon gangs.

Lots more at the article, which, most importantly, features multiple pictures of impossibly cute raccoons.
December 22, 2024 at 6:24am
#1081398
As usual for Sunday entries these days, I selected an older entry at random to take a second look at. This time, I didn't land all that far in the past, just over a year (one year being my minimum for these exercises), and found this: "The Trouble with Quibbles."

Being relatively recent, the linked article from Vox is still up, and I didn't see any indication that it's been revised since then.

I will therefore address, primarily, my own thoughts at the time.

Quibble 1: "Intelligent." I've railed on this before, but, to summarize: What they really mean is "technology-using."

I have, in fact, bitched about this sort of thing on numerous occasions, and for reasons I go over in that entry. But, even apart from the tiresome jokes about humans not being intelligent, we know of other intelligent life on Earth: octopodes, dolphins, cats, crows, etc. It took entirely too long, from the perspective of our time as a species, to recognize these intelligences. Our communication with them is limited in scope; those species are related to us, so you'd think there would be enough common ground to establish a dialogue, but no. How much worse might it be to communicate with someone from an entirely different genetic lineage?

Of course, there's always the most basic form of communication: violence. I know it's fashionable to think that any culture advanced enough to get to the stars will have put all that behind them, but I'm skeptical. We certainly haven't. Humans fear the Other, and there are probably sound evolutionary reasons for that, but nothing would be more Other than space aliens. To them, we're the space aliens.

We're looking for signs that some extraterrestrial species has developed communication or other technology whose effects we can detect. This technology would indicate what we call intelligence, but not all intelligence develops technology. One might even successfully argue that it's kinda dumb to invent technology.

Quibble 2: UAP may be more or less silly than UFO, but I believe it to be a better fit.

UAP may have less baggage than UFO, but we have a history of taking new, neutral terms and giving them the same encrustations of connotation that we give the old terms. Like how "retarded" started out as a value-neutral term to describe humans of lower intelligence (see above), to replace words like idiot, cretin, and moron, which had turned into insults. Then "retarded" turned into an insult, and some say we shouldn't be using the term at all. Well, that's special.

Point is, I give UAP (unidentified anomalous (originally aerial) phenomena) about five more years before they have to come up with something new because UAP studies have taken a turn for the edge of the world.

And I don't doubt that there's something to study. Sure, there are hoaxes; there are always hoaxes, like with Bigfoot, but they're probably not all hoaxes. I just don't jump straight to the conclusion that if there's a sighting of something that can't be immediately identified, it must therefore be aliens. That's just retarded.

Quibble 5: What's the first thing we did when we started exploring space? Sent robots, not people. No reason to assume hypothetical aliens wouldn't do the same.

This can probably be nitpicked because some of our early ventures into space were crewed: first person in space, first human on the moon, etc. Still, Sputnik (not really a robot but not living, either) preceded Gagarin, and lunar probes preceded Tranquillity Base, and since then pretty much everything outside of Low Earth Orbit has been a robot.

Well, that's all I'm going to expand on today. My thoughts haven't changed much in the 14 months since that entry, and we have found no extraterrestrial life, intelligent or otherwise, during that time, so the search continues.
December 21, 2024 at 11:36am
#1081375
A neutron walks into a bar. "What'll it be?" asks the bartender. "How much for a beer?" "For you, no charge!"



While this Quartz article is from the faraway year of 2017, I found enough to snark on to make it worth sharing.

You’ve likely been asked how you see the proverbial glass: half full or half empty? Your answer allegedly reflects your attitude about life—if you see it half full, you’re optimistic, and if you see it half empty, you’re pessimistic.

I'm an engineer. All I see is an overdesigned glass. Or, depending on my mood, an inefficient use of available storage space.

I'm also a pessimist, but at least I'm a pragmatic one.

Implied in this axiom is the superiority of optimism.

Also, I don't know if even the most dedicated pessimist, unless they knew of and were deliberately following this particular cliché, would seriously consider "half-empty" to be a thing. Almost everything gets described relative to full. Like, if your fuel gauge is in the middle, you say "we have half a tank of gas," not "the tank's half-empty."

Thus, the good answer is to see the glass half full. Otherwise, you risk revealing a bad attitude.

Shut the fuck up about my attitude.

Actually, the glass isn’t half full or half empty. It’s both, or neither.

Come on now. No. It's not in a state of quantum indeterminacy. Or, at least, no more than any other object in view.

Things aren’t mutually exclusive, awesome or awful. Mostly they’re both, and if we poke around our thoughts and feelings, we can see multiple angles.

On that bit, though, I'm fully on board. I really hate it when people put things into two boxes: "awesome" and "suck." The moment Netflix went to shit was the moment it switched from star ratings to thumbs up or down. Of course, I'm fully aware that by hating it, I'm putting the idea of the awesome/suck binary into the "suck" box. Everyone is a hypocrite, including me.

Neutrality sets us free. It helps us see something more like the truth, what’s happening, instead of experiencing circumstances in relation to expectations and desires.

Ehhhh... nah. Pessimism, and only pessimism, sets us free. An optimist is doubly disappointed when their imaginings fail to materialize: once from the positive outcome not working out, and again from the ego hit of being wrong. A neutral person risks never experiencing the joy of anticipation. It is only the pessimist who takes a win either way: either something good happens, which is good; or they turn out to be right, which is its own pleasant feeling.

The article goes on to relate the quality of neutrality to Buddhism, I suppose in an effort to give neutrality some ancient gravitas, but instead, it only makes Buddhism seem even less appealing to me.

But hey, it's not about me. On the subject of whether this applies to anyone else or not, well, I'm neutral.
December 20, 2024 at 10:25am
#1081341
Solstice tomorrow (around 4:20 am here), so this is my last entry of astronomical fall. Today's article, from BBC, has nothing to do with seasons, though, and it's a subject I really shouldn't be weighing in on—but of course, I'm going to do it anyway.



What inspired me to jump in above my head here was the lede:

Far from triumphantly breezing out of Africa, modern humans went extinct many times before going on to populate the world, new studies have revealed.

Now, there's a poorly phrased sentence if I've ever seen one. It should be blindingly obvious to everyone who can read this that modern humans didn't go extinct. This is a fact on par with "Earth is roughly spherical" and "Space is mostly vacuum." Actually, wait, no, I'm even more sure that modern humans didn't go extinct than I am about those other things, because, last I checked, there were about 8 billion modern humans running around. Or sitting around. Whatever. You do you. Point is, we're not extinct yet.

Likely, the author meant "sub-populations of modern humans went extinct many times," which, okay, I guess they have science to back them up on that, and I'm not going to argue about it. But I feel like phrasing it that way is a bit like saying "humans went extinct in Pompeii in 79 C.E."

The new DNA research has also shed new light on the role our Neanderthal cousins played in our success.

This is, I think, the interesting bit here. But I'd like to emphasize the "cousins" metaphor there. Sapiens and neandertals (the spelling of the latter appears to have legitimate variants) shared a common ancestor. A certain ape population split at some point; genetic drift and selection then played out differently in each branch, until you got groups with clear physiological differences in the fossil record. But, apparently, the physiological differences weren't enough to prevent interbreeding.

The definition of a species is, to my understanding, a bit of a slippery concept in biology. That is, it's not always obvious what constitutes a separate species. If it were as easy as "a population that can breed to produce fertile offspring," we wouldn't consider sapiens and neandertals separate species because, according to DNA evidence, they produced fertile offspring together.

While these early European humans were long seen as a species which we successfully dominated after leaving Africa, new studies show that only humans who interbred with Neanderthals went on to thrive, while other bloodlines died out.

Again, I feel like this is poorly phrased, and puts too much emphasis on Europe. Apparently, there are populations in sub-Saharan Africa today with no neandertal genes and, again, obviously they didn't die out. And they're the same species as the rest of us mostly-hairless bipeds.

Apart from these nitpicks, I think the new findings are fascinating, delving into how, basically, hybridization led to greater hardiness. As with all science, it may be overturned or refined through later studies, but this article itself describes an overturning of previous hypotheses about early human ancestry. And it has helpful infographics and pictures.

But unless we invent time travel (unlikely), all we can do is make hypotheses and test them. It's really a very human thing to do.
December 19, 2024 at 8:22am
#1081306
Show of hands, please: how many of you are planning on eating over the next week or so?

Huh, that's a lot.

An eating-related article from PopSci:

    Is the five-second rule true? Don’t push your luck.
The scientific research on floor food has a clear answer.


I never heard about this "five-second rule" until I was fully an adult. Now, remember, I spent my childhood out in the country, on a farm, and we had our own vegetable garden. The door to the house opened into the kitchen. Clean floors were a "sometimes" thing. But I honestly can't remember what my mom (it was almost always my mom) did if something dropped onto said floor. Probably wouldn't have mattered because I used to pick veggies straight from the garden and eat them. Yes, even carrots. Especially carrots. I wouldn't eat vegetables that she'd cook, but I ate the hell out of raw, dirty-root carrots.

Nurses are always surprised and distrustful when they see "no known allergies" on my chart, but I credit my finely-tuned immune system (quite unscientifically) to eating dirt as a kid.

Anyway, I never believed the five-second rule, and now there's some science to back me up on this.

According to this popular belief, if you drop a piece of food on the floor and pick it up in less than five seconds, then it’s safe to eat. The presumption is that bacteria on the floor don’t have enough time to hitch a ride on the food.

Right, because bacteria are just little animals that take more than five seconds to realize there's a tasty treat above and jump onto it. Snort. No, any bacteria (or other unwanted contamination) hitches a ride on the floor dirt that the dropped food picks up immediately. And I don't care how clean you think your floor is; if it's just been cleaned, there's cleaning agent, which is also not very good for you; and if it hasn't, there's dirt.

In 2003, Jillian Clarke, a senior at the Chicago High School for Agricultural Sciences in Illinois, put the five-second rule to the test.

I will reiterate here that, as a high-schooler, she was younger than I was when I first heard about the five-second rule. Also, we never got to do cool science projects like that in my high school.

Clarke and her coworkers saw that bacteria transferred to food very quickly, even in just five seconds, thus challenging the popular belief.

While this supports my own non-scientific conclusion, one study, performed by a high-school team, is hardly definitive.

A few years later, food scientist Paul Dawson and his students at Clemson University in South Carolina also tested the five-second rule and published their results in the Journal of Applied Microbiology.

Additional studies and replication, now... that starts to move the needle to "definitive."

When they dropped bologna sausage onto a piece of tile contaminated with Salmonella typhimurium, over 99% of the bacteria transferred from the tile to the bologna after just five seconds. The five-second rule was just baloney, Dawson concluded.

One might think that the main reason I saved this article to share was because of the bologna/baloney pun.

One would be correct.

But in 2014, microbiology professor Anthony Hilton and his students at Aston University in the United Kingdom reignited the debate... According to their results (which were shared in a press release but not published in a peer-reviewed journal), the longer a piece of food was in contact with the floor, the more likely it was to contain bacteria. This could be interpreted as evidence in favor of the five-second rule, Hilton noted, but was not conclusive.

Well, maybe UK bacteria are less aggressive.

This prompted food science professor Donald Schaffner and his master’s thesis student, Robyn C. Miranda, at Rutgers University in New Jersey to conduct a rigorous study on the validity of the five-second rule, which they published in the journal Applied and Environmental Microbiology... By analyzing bacterial transfer at <1, 5, 30, and 300 seconds, they found that longer contact times resulted in more transfer but some transfer took place “instantaneously,” after less than 1 second, thus debunking the five-second rule once and for all.

Now that "definitive" needle has moved substantially. But shame on the source for applying "once and for all" to science.

“Based on our studies, the kitchen floor is one of the germiest spots in the house,” Charles P. Gerba, a microbiologist and professor of virology at the University of Arizona, tells Popular Science. Believe it or not, “the kitchen is actually germier than the restroom in the home,” he added.

I get really tired of the "more germs than a toilet seat" scaremongering.

The next time you’re tempted to eat that cookie you just dropped, remember: bacteria move fast.

Or they're hitching a ride on the larger particles that stick to the toast that you inevitably dropped butter-side-down.

Anyway, I'm not sharing this to shame anyone for eating stuff off the floor. You do you, as they say. Just don't make me eat it. My dirt-eating days are long behind me.
December 18, 2024 at 9:58am
#1081275
Getting back to science today, here's one from Quanta for all the opponents of nihilism out there.

    What Is Entropy? A Measure of Just How Little We Really Know.
Exactly 200 years ago, a French engineer introduced an idea that would quantify the universe’s inexorable slide into decay. But entropy, as it’s currently understood, is less a fact about the world than a reflection of our growing ignorance. Embracing that truth is leading to a rethink of everything from rational decision-making to the limits of machines.


It makes all kinds of sense that it took a French person to figure this out.

Life is an anthology of destruction. Everything you build eventually breaks. Everyone you love will die. Any sense of order or stability inevitably crumbles. The entire universe follows a dismal trek toward a dull state of ultimate turmoil.

That sounds more like a French (or possibly Russian) philosophy book than science, but I assure you, it's science (just without the math). As I've said before, philosophy guides science, while science informs philosophy.

To keep track of this cosmic decay, physicists employ a concept called entropy.

Keeping track of decay may sound like a paradox, and, in a way, it is.

Entropy is a measure of disorderliness, and the declaration that entropy is always on the rise — known as the second law of thermodynamics — is among nature’s most inescapable commandments.

That's slightly simplified. The Second Law of Thermodynamics states that in an isolated system (one that exchanges neither matter nor energy with its surroundings), entropy can never decrease. It can remain constant, just never decrease. And it specifies an isolated system, which the Earth most definitely is not; we have a massive energy source close by (in cosmic terms), at least for now.
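
For anyone who likes it compact, here's the standard textbook shorthand (my own addition, not something from the article):

```latex
% Second Law, textbook form: for an isolated system,
% entropy is non-decreasing over time.
\[
  \Delta S \geq 0
\]
% Equality holds only for idealized reversible processes;
% any real, irreversible process strictly increases S.
```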

Order is fragile. It takes months of careful planning and artistry to craft a vase but an instant to demolish it with a soccer ball.

I've also noted before that creation and destruction are actually the same thing. What we call it depends on our perspective at the time. Did you create a sheet of paper, or did you destroy a tree? Well, both, really, but maybe you needed the paper more than you needed the tree, so you lean toward the "creation" angle.

We spend our lives struggling to make sense of a chaotic and unpredictable world, where any attempt to establish control seems only to backfire.

Who's this "we?"

We are, despite our best intentions, agents of entropy.

At the risk of repeating myself once more, it could well be that the purpose of life, if such a thing exists at all, is to accelerate entropy.

But despite its fundamental importance, entropy is perhaps the most divisive concept in physics. “Entropy has always been a problem,” Lloyd told me. The confusion stems in part from the way the term gets tossed and twisted between disciplines — it has similar but distinct meanings in everything from physics to information theory to ecology. But it’s also because truly wrapping one’s head around entropy requires taking some deeply uncomfortable philosophical leaps.

Uncomfortable for some, maybe.

As physicists have worked to unite seemingly disparate fields over the past century, they have cast entropy in a new light — turning the microscope back on the seer and shifting the notion of disorder to one of ignorance.

What he's basically saying here, if I understand correctly (always in question), is that they're trying to fit entropy into information theory. Remember a few days ago when I said information theory is a big deal in physics? It was here: "Life Is."

The notion of entropy grew out of an attempt at perfecting machinery during the industrial revolution. A 28-year-old French military engineer named Sadi Carnot set out to calculate the ultimate efficiency of the steam-powered engine.

It's important, I think, to remember that the steam engine was the cutting edge of technology at the time.

Reading through Carnot’s book a few decades later, in 1865, the German physicist Rudolf Clausius coined a term for the proportion of energy that’s locked up in futility. He called it “entropy,” after the Greek word for transformation.

I find that satisfying, as well, given my philosophical inclination concerning creation and destruction. If they're the same thing, then "transformation" is a better word.

Physicists of the era erroneously believed that heat was a fluid (called “caloric”).

Yes, science is sometimes wrong, and later corrects itself. This should, however, not be justification to assume that the Second Law will somehow also be overturned (though, you know, if you want to do that in a science fiction story, just make it a good story).

This shift in perspective allowed the Austrian physicist Ludwig Boltzmann to reframe and sharpen the idea of entropy using probabilities.

So far, they've talked about a French person, a German, and an Austrian. This doesn't mean thermodynamics is inherently Eurocentric.

The second law becomes an intuitive probabilistic statement: There are more ways for something to look messy than clean, so, as the parts of a system randomly shuffle through different possible configurations, they tend to take on arrangements that appear messier and messier.

The article uses a checkerboard as an example, but as a gambler, I prefer thinking of a deck of cards. The cards come in from the factory all nice and clean and ordered by rank and suit. The chance of that same order being recreated after shuffling is infinitesimal.
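
Just to put a number on "infinitesimal," here's a quick back-of-the-envelope in Python (mine, not the article's):

```python
import math

# Number of distinct orderings of a standard 52-card deck.
arrangements = math.factorial(52)        # 52! is roughly 8.07 x 10^67

# Probability that a fair shuffle lands back on the factory order.
p_factory_order = 1 / arrangements

# Entropy of a well-shuffled deck in bits: log2 of the number of equally
# likely arrangements (Boltzmann's counting idea, minus his constant).
entropy_bits = math.log2(arrangements)

print(f"possible orderings:       {arrangements:.3e}")
print(f"chance of factory order:  {p_factory_order:.3e}")
print(f"entropy of the shuffle:   {entropy_bits:.1f} bits")   # about 225.6
```

That count of orderings has 68 digits, so "infinitesimal" is, if anything, an understatement.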

Entropy experienced a rebirth during World War II.

Now, there's a great double entendre. I wonder if it was intentional.

Claude Shannon, an American mathematician, was working to encrypt communication channels... Shannon sought to measure the amount of information contained in a message. He did so in a roundabout way, by treating knowledge as a reduction in uncertainty.

Sometimes, it really does take a shift in perspective to move things along.
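
If you want to see "reduction in uncertainty" pinned down, Shannon's measure is simple enough to sketch (a minimal illustration on a made-up coin example, not anything from the article):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximal uncertainty for two outcomes: one full bit per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin surprises you less, so learning its outcome
# removes less uncertainty.
print(shannon_entropy([0.9, 0.1]))   # about 0.47
```

Less uncertainty going in means less information gained when you find out, which is the whole trick of treating knowledge as uncertainty removed.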

In two landmark papers in 1957, the American physicist E.T. Jaynes cemented this connection by viewing thermodynamics through the lens of information theory.

Okay, so the connection between entropy and information isn't exactly new.

However, this unified understanding of entropy raises a troubling concern: Whose ignorance are we talking about?

And that's where I stop today. There is, of course, a lot more at the link. Just remember that by increasing your own knowledge, you're accelerating the entropy of the universe by an infinitesimal amount. You're going to do that whether you read the article or not, so you might as well read the article. As it notes, "Knowledge begets power, but acquiring and remembering that knowledge consumes power."
December 17, 2024 at 7:01am
#1081249
Today, from the Land of Party Poopers (actually, from Thrillist):

    No, That Isn't Duct Tape on Your Plane's Wings
An aircraft mechanic explains what the tape you sometimes see on plane wings really is.


Why Party Poopers? Well, because they're taking one of my few precious joys out of life.

See, I don't fly all that often. Once a year, maybe. (Okay, twice, if you count the round trip as two trips.) So I don't get to do this often, but when I see that tape on a plane, I usually wait until the plane starts to taxi away from the gate to loudly exclaim, "Hey, look, the wings are being held together by duct tape!"

I also find the fake outlets at waiting areas incredibly hilarious, though I've never done that prank myself.

Those are little moments of happiness for me, but this article sucks the joy out of the first one. Well, at least, it would, if people actually read Thrillist. Maybe my faux-freakout over the tape will still have its desired effect.

Anyway, after all that, I'm sure you're dying to know what it really is on the wings.

As a passenger, noticing that your plane's wings are seemingly held together by the same silver duct tape that your dad uses to fix anything around the house is, by all means, a frightening sight.

Or, you know, it would, if duct tape weren't so damn useful.

"That's not actually duct tape," says an aircraft mechanic in a TikTok video addressing the issue. "That's speed tape, [...] and speed tape is an aluminum-base tape that's designed specifically for aviation due to the large speeds and the large temperature differentials that aircraft are subjected to."

I actually knew that. But knowing that it's called "speed tape" doesn't help for shit. Like, from the sound of it, it should make the airplane go faster, but if that were the case, the whole plane would be covered in it, right? If it has something to do with the "large speeds" (eyetwitch) as well as temperature differentials, why call it speed tape and not cold tape?

Instead, sometimes, it's used as a temporary sealant to prevent moisture from entering specific components.

Uh huh. Okay. Doesn't tell me why it's called speed tape.

"Speed tape, also known as aluminum tape, is a material used for temporary, minor repairs to nonstructural aircraft components," an FAA spokesperson told Thrillist.

And it's called that because...?

Yes, I know I could ask some random AI the question and get some kind of answer, but that's not the point. The point is, why does an article purporting to explain all about speed tape not even bother to explain why it's called speed tape?

You can relax now and enjoy your flight stress-free.

HA! Like there aren't 498 other things about flying that cause stress.

Oh, right: 499 if I'm around.
December 16, 2024 at 7:51am
#1081213
Way the hell back in 2018, Quartz published the article / stealth book ad I'm linking today.



Does it? Does it really remain at the center of dining controversy? Because I thought that in 2018, and even now, the "center of dining controversy" is how to handle mobile phones at the table.

On June 25, 1633, when governor John Winthrop, a founding father of the Massachusetts Bay Colony, took out a fork, then known as a “split spoon,” at the dinner table, the utensil was dubbed “evil” by the clergy.

While this article is US-centric, and makes no attempt to be otherwise, other sources show that the fork has been considered a tool of the devil since it was introduced to Europe. This is, naturally, just another in a long list of humans considering anything new and different to be necessarily evil, because we're kinda stupid like that.

Forks were pretty much unheard of during Winthrop’s era. People would use their hands or wooden spoons to eat. The Museum of Fine Arts (MFA) in Boston says that only “a handful of well-to-do colonists,” adopted the use of the fork.

I mean, technically, you're using your hands either way.

When Americans finally started their love affair with the fork, their dining etiquette compared to their international peers became a source of controversy for centuries, whether it’s the way the fork is held, only eating with the fork, or using the “cut-and-switch.”

Oh, no, different countries do things differently. The horror.

During the time it took for Americans to widely start using the fork, dining cutlery was evolving in England. Knives changed to have rounded blade ends, since forks had “assumed the function of the pointed blade,” says Deetz.

I'm betting there were other reasons for the switch, like, maybe, deweaponization?

So if you've ever wondered why some cultures point fork tines up while others point them down, well, the article explains that. Sort of. Unsatisfactorily. Still not mentioned: why formal place settings are the way they are.

Also not mentioned in the article (perhaps one of the books it promotes says something about it, but it's unlikely I'll ever find out) is the abomination known as the spork.


© Copyright 2025 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
