Items to fit into your overhead compartment |
|
So, here's one from a source I don't usually follow, but it came to my attention thanks to Elisa, Stik of Clubs: "How to Avoid Falling for Fake Videos in the Age of AI Slop: Why fake videos spread so easily, and what actually helps people spot them." We’re entering an era of what’s often called AI slop: an endless stream of synthetic images, videos, and stories produced quickly, cheaply, and at scale. I have to admit, I'm getting more than a little tired of hearing/seeing the words "AI slop." From what I've seen, AI output has become more polished and professional than about 75% of human-generated content. I think some people might be jealous. I ain't saying it's right, mind you. Only that it's prettier. Sometimes these fake videos are harmless or silly, like 1001 cats waking up a human from sleep. Harmless? You dare to call something that triggering harmless? I don't even allow the significantly fewer than 1001 cats in my household to wake me up. Other times, they are deliberately designed to provoke outrage, manipulate identity, or push propaganda. Because human-generated content would never do that. Just like no one ever got lost using paper maps in the time before GPS. To navigate this new information environment, we need to combine psychological literacy, media literacy, and policy-level change. And here's where it gets difficult for most of us. Why should we change? It's the world that needs to change, dammit! The article provides a road map (or, if you prefer, a GPS route) to us changing: 1) Understand Our Own Psychological Biases (Psychological Literacy) The psychology behind falling for AI-generated misinformation isn’t fundamentally new. The process is largely the same as with other forms of misinformation, but AI takes it to a whole new level: it dramatically lowers the cost and effort required to produce and spread it at scale. My own simple solution: Right now, most of us have a bias that says "I saw it, so it must be real." I suggest turning that around. Assume everything you see on the internet, or on TV, is fake. Like you're watching a fictional movie or show. The burden of proof thus shifts. The downside to this (every simple solution has a downside) is that you get so you don't believe anything. And for some of these content generators, that's the goal: make you question reality itself so they can swoop in and substitute it with their own version. Hell, religion has been doing this for as long as there's been religion. As Matthew has written before about fake AI accounts, people are motivated to believe what fits their values, grievances, and group identities, not necessarily what’s true. When a video confirms what you already believe about politics, culture, or power, authenticity becomes secondary. I have noted this before: it is important to be just as, or preferably more, skeptical about the things that tickle our confirmation bias. The goal isn’t to suppress emotion. It’s to recognize when emotion is being used as a shortcut around verification, and being used to manipulate you. It sure would be nice to be able to suppress emotion, though. I've felt that way since watching Star Trek as a kid. Spock was my role model. 2) Lateral Reading Is Still the Best Tool We Have (Media Literacy) When people try to fact-check AI videos, their instinct is often to stare harder at the content itself, such as examining faces, counting fingers, looking for visual glitches. Guilty.
I've been seriously considering wearing a prosthetic extra pinkie finger so that anyone who looks at a surveillance photo of me will immediately assume it's an AI fake. The most effective fact-checking strategy we have isn’t vertical reading (scrutinizing the video itself). It’s lateral reading—leaving the content entirely to verify it elsewhere. I do that here, especially with notoriously unreliable sources, which, since I try to use free and easily accessible content, is almost everyone these days. 3) Policy Changes and Platform Accountability Individual skills matter. Community norms matter. But at this point, policy intervention is likely required. Well, I was trying to be funny with the "It's the world that needs to change" bit above, but I guess they're serious. Social media platforms are not optimized for truth, they’re optimized for engagement. I should fact-check this, but it aligns with what I already believe, so I won't. Conclusion The most dangerous thing about fake AI videos isn’t that people believe them once. It’s that repeated exposure erodes trust altogether: in media, in institutions, and eventually in one another. As I alluded to above, it makes us question the very meaning of "truth." I'd also add this: Be humble enough to know that you can be wrong. Be brave enough to admit when you're wrong. And allow space for the idea that sometimes, your ideological opponents are right. Not often, mind you. But sometimes. |
| You know those "which Hogwarts house are you?" quizzes designed to fill out your ad profile online? I don't know; maybe they've finally fallen out of favor. Here's a different kind to consider, from Big Think, and I'm not even building an ad profile of you: Which of the 5 philosophical archetypes best describes you? I'm definitely a Kitsune, but would a Kitsune actually say that? For clarity, that subhead there is the author describing himself as a Kitsune. I'm absolutely not a Kitsune, though I appreciate them. Sometimes. We are all philosophers. I don’t mean this in the “What do you make of Quine’s ‘Two Dogmas’?” sense. No, we are all philosophers in that we all do philosophy. Yeah, even that insipid song by Edie Brickell with the line "philosophy is the talk on a cereal box" is a kind of philosophy. Philosophy is a practice of wonder and logic; curiosity and introspection; dialectic and meditation; criticism and advocacy. I question the author's assertion here, but I guess that means I'm doing philosophy. So, without any empirical rigor whatsoever — another favorite characteristic of philosophy — I present here five different ways to be a philosopher. I feel like "The Fool" is conveniently left out, though maybe that's an aspect of the Kitsune. Yes, yes, I'm getting to what that is, if you don't already know. But that's because I assert that philosophers, by definition, have a stunted sense of humor, or none at all. We have a different word for philosophers with a sense of humor: comedians. The Sphinx The archetype: The Sphinx had the head of a woman, the body of a lion, and the wings of a bird. While that kind of chimera is probably highly symbolic, I don't know what the symbols might mean. Physical descriptions are probably the least important things in these archetypes. Each time, the Sphinx would ask a single riddle, the classic being, “What walks on four legs in the morning, two at noon, and three in the evening?” but I assume there were more. One of my favorite scenes in fiction is from a Zelazny novel. The MC meets a sphinx, who asks him a riddle. He asks, in return: "What's red and green and goes round and round and round?" This stumps the sphinx, because of course the sphinx isn't attuned to the modern definition of "riddle." He is thus able to pass while the sphinx ponders, much like when Spock set an android into an infinite loop with deliberate illogic. This is probably when I determined the essential difference between philosophers and comedians. Oh, the answer is "a frog in a blender." The Leviathan The archetype: The Leviathan is a demonic sea serpent that breathes fire. Its back is a row of shields and churns the oceans to a frothing boil. Not ever answered: what use fire-breathing has in a sea monster. This person has a transferable framework that they apply to everything. They’ve read a book, studied a philosophy, or watched a YouTube video and decided, “Yes, this idea is the one that will govern my life.” Every action in every minute of the day can be explained by this single system of ideas. Oh. That type. The Kitsune The archetype: In Japanese folklore, the kitsune is a fox spirit known for their ability to shapeshift. A kitsune might appear as a beautiful woman, an old man, a child, or a tree. Some are tricksters, and others are teachers. The "trickster" archetype can be funny. But not usually to the ones being tricked. The kitsune-person may say something outrageous and, when challenged, give a wide smile with a twinkle in their eye. 
They’re often impossible to argue with because they keep changing things. Oh, yeah, the goalpost-mover. The Minotaur The archetype: The Minotaur is a half-human, half-beast (typically a bull) locked in a labyrinth. The Minotaur is feral and brutal, no doubt — he will kill anyone he catches in his maze — but he is also lost and tormented. In my view, the "bull" part is essential to the minotaur's description. It's right there in the name. ("But, Waltz, what about centaurs? They're part horse, not bull." "Turns out one possible etymology for 'centaur' is 'bull-slayer.'") The minotaur-philosopher is someone lost in the mire of human suffering, mortality, freedom, and absurdity. They never escape the labyrinth but make a dark, resigned home within it. Here, you’ll find Pascal, Dostoevsky, Heidegger, Sartre, Camus, and Simone de Beauvoir pacing about in anguish. No comment. The Garuda The archetype: The Garuda is a great eagle of Indian mythology and is associated with clear sight and the dispelling of poisons — especially those of serpents and nagas. The Garuda soars above the landscape and sees the structure of things. One might think that because it's a big-ass bird associated with purification, I'd identify most closely with this. One would be wrong. The Garuda-person asks, “What do you mean by that?” a lot. They hate vagueness and metaphor used as arguments and will often call out both — “What does that actually mean?” they say. They generally don’t have time for “lived experience” or emotional reasoning. Or, I don't know. Maybe that's pretty close. Fuller descriptions exist at the link, of course. While, as the author notes, the list is by no means exhaustive, I find it amusing. I'm also quite pleased that it's not limited to one set of mythology, though there are certainly others that could be included, from other cultures. Though the "trickster" archetype seems to be pretty universal. And most of us are composites — a little Sphinx when we’re unsure, a little Minotaur late at night, a little Garuda when we’re fed up with nonsense. I'd venture that most of us just are, without thinking about archetypes. Hm. Maybe Edie Brickell was onto something, after all. |
| Here's one for your inner 12-year-old, from Live Science: How many holes does the human body have? You might think that the human body has many holes, but that number shrinks when you stop to consider what counts as a hole. Because I know your inner 12-year-old immediately said "which sex?" The human body is extraordinarily complex, with several openings and a few exits. Cue Beavis and Butt-Head. But exactly how many holes does each person have? I imagine it not only depends on your definition of "holes," but how recently someone's been shot. Maybe that only applies in war or the US. But it's not quite that easy once you start considering questions like: "What exactly is a hole?" "Does any opening count?" And "why don't mathematicians know the difference between a straw and a doughnut?" I've noted before that a "hole" isn't a thing. However you conceive of the concept, a hole can only be defined by what's around it. You can't just point to a random location in space and say "that's a hole." Or, well, you can, but people would look at you funny. "Black holes" may be the only exception to this, but their name is more metaphorical. Oh, and the branch of mathematics that doesn't know the difference between a straw and a donut (and a coffee mug, for that matter) is called topology, where all of those shapes are considered toroids: one hole going all the way through. Topologically, we're all toroids (assuming we haven't been shot through recently). Most animal life on Earth is. Katie Steckles, a lecturer in mathematics at Manchester Metropolitan University in the U.K. and a freelance mathematics communicator, told Live Science that mathematicians "use the term 'hole' to mean one like the hole in a donut: one that goes all the way through a shape and out the other side." Look, I don't care if you call it doughnut or donut. The former is more British; the latter is more US. Just do try to keep it consistent, and if you're quoting a Brit, use the former. Or do what I do, and say "bagel" instead. But if you dig a "hole" at the beach, your aim is probably not to dig right through to the other side of the world. Totally tried to do that when I was a kid. It's good to have goals. Similarly, mathematical communicator James Arthur, who is based in the U.K., told Live Science that "in topology, a 'hole' is a through hole, that is you can put your finger through the object." Um. Phrasing? And if you ask people how many holes a straw has you will get a range of different answers: one, two and even zero. This is a result of our colloquial understanding of what constitutes a hole. Are... are you telling me language can be ambiguous? Say it ain't so! In topology, objects can be grouped together by the number of holes they possess. For example, a topologist sees no difference between a golf ball, a baseball or even a Frisbee. And I knew that, obviously, but it's also another excuse for people to grumble about "common sense," as if that were a thing that existed. Armed with the topologists' definition of a hole, we can tackle the original question: How many holes does the human body have? Let's first try to list all the openings we have. The obvious ones are probably our mouths, our urethras (the ones we pee out of) and our anuses, as well as the openings in our nostrils and our ears. For some of us, there are also milk ducts in nipples and vaginas. At least they addressed the 12-year-old directly and shut down its gigglesnorts with all kinds of formal medical words. 
Unfortunately, that sentence needs another comma near the end. In total there are potentially millions of these openings in our bodies, but do they all count as holes? This is a bit like asking if a tomato is a fruit or a vegetable, in that a scientist will give you a different answer than a sous-chef. "They're not actually holes in the topological sense, as they don't go all the way through," Steckles said. "They're just blind pits." Again. Phrasing. A pair of underwear, for example, has three openings (one for the waist and one for each of the two legs), but it's not immediately clear how many holes a topologist would say it has. And again, it probably depends on the sex and/or gender of the person wearing it. And here comes the 12-year-old, giggling again. So the mathematician's answer is that humans have either seven or eight holes. And my answer? None. Think about it: how many holes does your house have? You can ignore the drafty cracks for my purposes; I'm talking about, like, windows and doors. Open one window: no holes in the topological sense. Open a window and a door: suddenly you have a topological hole. Open three windows, and you get the situation the article refers to with underpants. Might be complicated if you also consider water and sewer systems. And your digestive tract is also usually closed at at least one end, like a door or a window in your house. So while you could, technically and topologically, thread a string from mouth to asshole (preferably not the other way around), in practice, we're usually pretty closed off, apart from respiratory functions. So, again: it's all about how you look at it. And if you're 12, this shit is funny as hell.
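Since we're doing math anyway, here's a little sketch of how the topologists' bookkeeping works. One standard trick (not from the article; this is just my own illustration, and the `holes` function is something I made up for the purpose) is to squish a shape down to a skeleton of dots and connections, then count independent loops: edges minus vertices plus one, as long as the skeleton is all in one piece.

```python
# Back-of-the-envelope hole counting: squish a shape down to a connected
# skeleton (a graph), then independent loops = edges - vertices + 1.
# That count is what topologists mean by holes that go all the way through.

def holes(vertices: int, edges: int) -> int:
    """Independent loops in a connected skeleton."""
    return edges - vertices + 1

# A straw squishes down to a single circle: one dot, one connection looping back.
print("straw:", holes(vertices=1, edges=1))        # 1 hole

# Underpants squish down to a figure-eight: one dot, two loops.
print("underpants:", holes(vertices=1, edges=2))   # 2 holes

# A ball, or anything with only blind pits (an ear canal, say), squishes to a dot.
print("ball:", holes(vertices=1, edges=0))         # 0 holes
```

The seven-or-eight answer for the human body is the same bookkeeping, just with a much messier skeleton. |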
| Yes, sometimes I link to Outside. It's better than actually going outside. Can Jumping 50 Times Each Morning Actually Improve Your Health? Here’s what the science says about the Internet's latest trend. I dunno about science, but I have some idea about what your downstairs neighbors would say. You’ve tried everything to feel more awake in the mornings—caffeine, sunlight, water, stretching—but no matter what, you still feel groggy and unready to face the day. Have you tried attuning your schedule to your chronotype, instead of trying to fit your chronotype into someone else's schedule? Yeah, yeah, I know, few have the privilege of being able to do that. I certainly did not for most of my life. There’s one thing you probably haven’t tried that’s taking social media by storm: jumping. If it's "taking social media by storm," a) I'd be the last to hear about it and b) I'd immediately distrust it, like I did the "walking backwards" fad from, what, a year ago? Two? Now, even though I don't practice this these days, I can accept that some exercise is better than no exercise. I can also accept that, sometimes, you gotta try something new to break up your routine a bit. Near as I can tell, if you don't live above someone else or can do it (shudder) outside, there's nothing inherently wrong with this and it doesn't make you look as dumb as walking backwards does. And yet, I'd still shun it, simply because it's a trend. The article goes on to list the "benefits" of this particular exercise. I won't rehash them here. Just assume I'm skeptical. Not in the denial sense, but in the "I'm not going to trust this one source" sense. Who Should Skip the Jumping This section header is the actual reason I saved this article. Skip? Jumping? I'm dying over here. You might want to think twice about participating in this trend if you have a weak pelvic floor, significant knee, hip, ankle, or foot pain, Achilles tendinopathy, plantar fasciitis, recent sprains, a history of stress fractures, or balance issues, Wickham says. I admit, though, that putting this here assuages some of my skepticism. To get the most out of your jumps, jump 50 times in place at a rapid, consistent speed, making sure to drive through the balls of your feet and land softly on the balls of your feet. If I tried that right now, I'd end up in the hospital. Meanwhile, I'll continue my usual jumping exercises: the ones that lead me to conclusions. |
| Speaking of time, there's this article from aeon: "The shape of time." In the 19th century, the linear idea of time became dominant, forever changing how those in the West experience the world. It would be horribly remiss of me if I didn't include this famous quote: "People assume that time is a strict progression of cause to effect, but actually from a non-linear, non-subjective viewpoint - it's more like a big ball of wibbly-wobbly... timey-wimey... stuff." — The Doctor. ‘It’s natural,’ says the Stanford Encyclopedia of Philosophy, ‘to think that time can be represented by a line.’ We imagine the past stretching in a line behind us, the future stretching in an unseen line ahead. I have heard that there is a culture, perhaps in Australia, or maybe Papua New Guinea or South America (I don't recall), where they think of the future as behind them and the past as in front of them. This is, if I remember right, because you can "see" the past but you cannot "see" the future. So, no, it's no more "natural" to "think that time can be represented by a line" than it is to think of time moving from left to right on a page, the way Westerners read language. Perhaps those who write from right to left see time progressing from right to left. Even writing is arguably not "natural." However, this picture of time is not natural. Its roots stretch only to the 18th century, yet this notion has now entrenched itself so deeply in Western thought that it’s difficult to imagine time as anything else. Except, perhaps, as a big ball of wibbly-wobbly, timey-wimey stuff. Let’s journey back to ancient Greece. Amid rolls of papyrus and purplish figs, philosophers like Plato looked up into the night. His creation myth, Timaeus, connected time with the movements of celestial bodies. The god ‘brought into being’ the sun, moon and other stars, for the ‘begetting of time’. They trace circles in the sky, creating days, months, years. While it seems to be true that Western culture borrows a lot from ancient Greece, there really were other cultures in the world. I'd think the whole "this started with ancient Greece" thing would have fallen out of favor by now. Guess not. Such views of time are cyclical: time comprises a repeating cycle, as events occur, pass, and occur again. I can kind of understand why people would think time is a cycle. As the article notes, things do seem to have cycles: day/night, moon, year, planet alignments, etc. But the idea that "events occur, pass, and occur again" just seems wrong to me. Even though there's, e.g., Groundhog Day every year, not every Groundhog Day is the same. It’s even hinted at in the Bible. For example, Ecclesiastes proclaims: ‘What has been will be again … there is nothing new under the sun.’ Of all the laughably wrong things in the Bible, "there is nothing new under the sun" might well be the most laughably wrong. Well, right up there with "we live on a flat Earth between two waters," anyway. Maybe also with "there was a global flood in human history." And yet, like Greek ideas, it's part of culture. To quote the Battlestar Galactica remake: "All of this has happened before. All of this will happen again." Importantly, medievals and early moderns didn’t literally see cyclical time as a circle, or linear time as a line. Yet in the 19th-century world of frock coats, petticoats and suet puddings, change was afoot. Gradually, the linear model of time gained ground, and thinkers literally began drawing time as a line.
I believe it's important to note that, whether we conceive of time as cyclical, linear, wibbly-wobbly, or anything else we can come up with, this is a matter of perception, not reality. No one knows what time really is. People have guesses, and they'll tell you their guesses with great confidence, such as "time is like a river," but as we saw yesterday, some rivers are more rivery than others. But no. Time is time. The only thing I can say with great confidence is what it's not: an illusion. It may well be an emergent property of something deeper, but then, so is the chair you're sitting in right now. A crucial innovation lay in the invention of ‘timelines’. As Daniel Rosenberg and Anthony Grafton detail in their coffee-table gorgeous Cartographies of Time (2010), the ‘modern form’ of the timeline, ‘with a single axis and a regular, measured distribution of dates’, came into existence around the mid-18th century. In 1765, the scientist-philosopher Joseph Priestley, best known for co-discovering oxygen, invented what was arguably the world’s first modern timeline. What this brought to mind for me was the Periodic Table. Elements, like oxygen, exist with or without the Periodic Table, but Mendeleev's invention helped us visualize their relationships with each other, much like a timeline helps us visualize past events in relation to one another. Rosenberg and Grafton describe A Chart of Biography as ‘path-breaking’, a ‘watershed’. ‘Within very few years, variations on Priestley’s charts began to appear just about everywhere … and, over the course of the 19th century, envisioning history in the form of a timeline became second nature.’ Priestley’s influence was widespread. For example, William Playfair, the inventor of line graphs and bar charts, singled out Priestley’s timeline as a predecessor of his own work. I say this gives short shrift to Descartes, who basically invented graphs in the early 17th century. See, already I'm putting events on a timeline. The second key development concerns evolution. During the early 19th century, scientists created linear and cyclical models of evolutionary processes. For example, the geologist Charles Lyell hypothesised that the evolution of species might track repeatable patterns upon Earth. This led to his memorable claim that, following a ‘cycle of climate’, the ‘pterodactyle might flit again through the air.’ However, with the work of Charles Darwin, cyclical models faded. His On the Origin of Species (1859) conceives of evolution in linear terms. It literally includes diagrams depicting species’ evolution over time using splaying, branching lines. I think that once we realized that entropy only goes in one time direction, the old idea of cycles of time had to go right out the window. Entropy and time are intertwined, and physics's best guess as to the nature of time right now is that it is entropy. Now, I know some people mistakenly believe that evolution goes against entropy, but that discussion is outside the scope of this entry. The last development stemmed from mathematics: theories of the fourth dimension. Humans perceive three spatial dimensions: length, width, and depth. But mathematicians have long theorised there were more. In the 1880s, the mathematician Charles Hinton popularised these ideas, and went further. He didn’t just argue that space has a fourth dimension, he identified time with that dimension. Now that was something I wasn't aware of. I knew the idea of "spacetime" preceded Einstein, but I don't think I'd ever heard of Hinton. 
Nowadays, of course, mathematicians like to play with way more than four dimensions, and apparently, something like 10 or 11 (or 26, depending on the version) are required for string theory (which, if anything in science can be said to be "only a theory," it's string theory). Within history, conceiving of time as a line helped to fuel the notion that humanity is making progress. Joseph Priestley, our timeline inventor, is partly responsible for this. The man once listed inventions that have made people happier, including flour mills, linen, clocks, and window glass. Sadly, Priestley lived before sliced bread and the "Skip Intro" button. Within philosophy, conceiving of time as a line led to thinkers debating the reality of the past and future. Whereas I assert that only the past is real; the present is an illusion created by the very recent past, and the future doesn't even rise to the level of illusion, as it does not exist at all and won't until the past catches up. But I acknowledge that this, too, is a matter of perception and point of view. I've gone on long enough for today (see, I made a time reference there). The article also goes on for a while, but it's an interesting read. And an appropriate one for an outlet named aeon. |
| Some questions may not have meaningful answers, such as this one from LiveScience: What's the oldest river in the world? The oldest river predates the dinosaurs. But how do we know this? First, you have to define what you mean by "river," and that can be harder than it sounds. The dictionary definition (at least the first one I found) is: "a large natural stream of water flowing in a channel to the sea, a lake, or another such stream." (Oxford) So you've got "large," which is a judgement call; "natural," which is fuzzy; "stream," which implies flowing, but lots of water flows and some rivers sometimes don't; "water," which rules out, for example, the L.A. River (most of the time); "flowing," which I say is redundant after "stream"; and "channel," which seems straightforward enough until you consider that some rivers are braided and/or deltaed with multiple channels. And then you have bodies like the Potomac River, which for much of its lower reaches, all the way up past DC, isn't so much a river as a tidal estuary that happens to be fed by a higher river. Oh, but that's not all. Dictionary definitions don't cut it here in this blog. I can use them as examples, but they don't resolve arguments. You know the old saying, "You can't step into the same river twice"? I think it's supposed to be about how things change over time. Water goes in, water flows out, evaporation happens, shores get eroded, sandbars form, megatons of soil get transported, etc. Thing is, rivers (and other streams) don't just change over time; they, like living bodies, are in a constant state of flux ("flux," incidentally, shares a root with "flow" and "fluid"). Consequently, I say you can't step into the same river once. Because between the time your foot touches the surface and the bottom, the river has already changed. Hell, the mere act of stepping into it changes it, however minimally. So when you're asking a question like the one in the headline, you have to be careful. Rivers may seem as old as the hills, but they have life cycles just like other natural features do. Yeah. Like hills. Some rivers last longer than others, however. So which river is the oldest in the world today? Remember that a river isn't just its water. Sometimes, it's not even its water, but just its channel, such as the aforementioned L.A. River (which also stretches "natural" to its natural breaking point). Channels change over geological time, though, carved and altered by water flow and other processes such as continental drift. The winner is older than the dinosaurs: The Finke River in Australia, or Larapinta in the Indigenous Arrernte language, is between 300 million and 400 million years old. I'm certainly not going to argue about that, though. Australia is a remarkably stable continent (or island or whatever name you slap on the land mass). If I remember right, some of the oldest rocks in the world are also found there, presumably guarded by dangerous wildlife, but don't get me started on how they define how old a rock is. The arid conditions in the center of the continent mean the river flows only intermittently; most of the year, it exists as a string of isolated water holes. See? There's a whole lot of semi-technical geological explanation for how they figured it out at the article. While I have some experience with geology, it was rather secondary to hydrology in my education, so I'm not going to quibble about it. It is interesting, at least to me. But no quotes here.
"Rivers can disappear if a massive influx of sediment overwhelms them (e.g., volcanic eruptions) or if topography changes so dramatically that the flowing water takes a new course across the landscape (e.g., glacial advance and retreat)," Ellen Wohl, a geologist at Colorado State University, told Live Science in an email. Pretty sure there's more that can change or destroy a river. In the case of the Finke, Australia has been an unusually stable landscape for a very long time. Resting in the middle of the Australian Plate, the continent has experienced virtually no significant tectonic activity for the past several 100 million years, Baker explained. Like I said. Only with more detail. If the Finke ever dries up, the runner up may be the New River, which today is about 300 million years old, Baker said, and runs through Virginia, West Virginia and North Carolina. And so we get to the final bit in the article, and the main reason I'm sharing this. Australia is on almost the opposite side of the world from me, and I've never been there, but the New River is practically in my backyard, globally speaking. I've known about its ancient age since college, when I took the aforementioned geology and hydrology courses. Unlike most Virginia rivers, it doesn't flow into the Chesapeake Bay and thence into the Atlantic; instead, it's part of the Mississippi River basin. Which technically flows into the Atlantic, too, but via the Gulf of Mexico. And unlike the Finke / Larapinta, the New River is always wet. And flowing. They take people whitewater rafting on it. Not me, obviously. But people. In the interest of full disclosure, I should note this quote, which has cited sources, from the New River article The irony, of course, is that it's called the New River, and that's what I find endlessly amusing. |
| I'm linking to Lifehacker today. Yeah, yeah, I know. Bear with me. Let me guess: 1) You're 2) getting 3) money 4) for 5) this. Quality power tools are an investment, and if you take proper care of them, they’ll last a long time. It's been a while since I bought power tools, so I'm not even sure which brands can be trusted, these days. But power tools have seen a lot of advancement in recent years. While your old warhorses might still perform their core function well enough, if your drills, saws, and other power tools are five years old or older, it’s time to consider upgrading to a more modern version, for a range of reasons. Seriously, this strikes me less as helpful advice and more as tool companies paying for an ad that looks like an article. And, indeed, they mention some brands by name in the article. But let's see what they come up with: Advances in battery technology I suppose this is fair enough. But if you've purchased a battery-powered tool of any kind, hopefully you're aware that the battery isn't going to last forever, regardless. Such tools are going to need to be replaced sooner than corded ones, in general. Improved ergonomics This feels like a stretch. Get it? Ergonomics? Stretch? No? No. I'll be here all week. And yeah, it's looking more and more like a paid ad. More powerful motors Uh huh. If it was, and remains, adequate for what you need it for, are you just upgrading because you're a Manly Man Who Must Have More Power? Better safety features Seems to me that the best safety feature is familiarity (provided one doesn't get complacent). Smart technology Until it can do the job on its own, I'm not interested. I really didn't have much else to say, today. Just that stealth advertising sucks. |
| Regular readers know I'm a fan of etymology and other language studies. Here's one from NPR: For many in the business world, a return to work after the winter break will mean once again donning the dreaded suit and tie. Pretty sure that's falling out of fashion, except for, like, lawyers. The corporate neckwear is the everyday counterpart to the traditionally more luxurious cravat – a voluminous neckscarf that conjures up images of opulent dinners aboard a yacht sailing through the Mediterranean. It does no such thing for me. But I do know that what we call "a tie" is called "une cravate" in French, and France has a Mediterranean coast, so... whatever; I don't really have a point here, unlike my ties. Yes, I do own some. President Abraham Lincoln wore cravats, as did Hollywood actor Cary Grant and the extravagant entertainer Liberace. At least one, possibly all three, of those men were gay. Nothing wrong with that, of course, at least not from today's perspective; I'm just pointing out that it might be a factor. In more recent times, the garment has been popularized in the American mainstream by the likes of Madonna and the late Diane Keaton. Fashion has been moving toward more unisex styles, from what little I know of it. Nothing wrong with that, either. In this installment of NPR's "Word of the Week" series we trace the origins of the "cravat" (borrowed from the French "cravate") back to the battlefields of 17th century Europe and explore its links to the modern day necktie, patented in New York more than 100 years ago. That is, honestly, more recent than I thought modern neckties were. "Scarves worn around the neck existed long before, but the story of the cravat truly begins in the Thirty Years' War when it first gained wider European recognition," explains Filip Hren... As someone who has studied fighting skills, albeit briefly and without much enthusiasm, I've often wondered about that. Something worn tied around one's neck is a liability in a fight. Unless it's a fake, designed to throw the opponent off-guard when they grab it to strangle you, and it instead comes off in your hand, giving you at least a temporary advantage. Hren is referring to the 1618-48 conflict fought between Catholics and Protestants and known as Europe's last religious war. Heh. That's funny. The word "cravate" first appeared in the French language to describe military attire worn by Croatian mercenaries who were renowned among their enemies for their brutal fighting prowess. Looking like a fighter is at least half the battle. Not sure if Sun Tzu wrote that, but I believe it to be true. Made of silk or cotton, the cloth is said to have been used to protect their faces against cold weather and smoke in battle, and to treat injuries. For a while, neckwear existed with a practical purpose (for non-warriors): shirts didn't have top buttons, or had really bad top buttons, so they used ties (of various styles) to hold the collar closed for a cleaner, more formal look. For fighters, I can only imagine that they could turn its inherent disadvantage into an advantage: "I can kick your ass even with this liability looped around my most vital body connection." "The scarves took their names from Croats. It was tied in a Croatian manner, or in French – a la Croate," explains Filip Hren. And that, I didn't know until I read this article. As an aside, Croats should not be confused with the Croatoan, a native American tribe largely in what is now North Carolina. 
King Louis XIV introduced the cravat into French fashion and from Paris it soon spread across Europe. And who, in the history of the world, has had more impact on clothing fashion than the French? No one, I say. They also kick military ass. Coincidence? I think not. Over the years, the necktie has come to symbolize success, sophistication and status, but has also been criticized by some as a symbol of power, control and oppression. I don't really understand fashion, but I am rather attuned to symbolism (pretty much have to be, as a writer). Remaining unexplained, however, is the continued popularity of its cousin, the bowtie. |
| Here's a source I've never linked before, apparently some self-promoter called Michael Ashford: Not that self-promotion is inherently bad. But check how many times he (yes, I'm assuming gender) pushes his podcast, newsletter, book, etc. This does not mean the content is bad, either. Have you ever heard of the term “conflict entrepreneur?” Until my conversation with Martin Carcasson, I hadn’t heard it. That's because someone made it up. All words and phrases are made up, of course, just some more recently than others. This particular one isn't catchy or short enough to ever catch on, the way other phrases like "concern troll" have. I propose "strifemonger." ...the idea is simple: A conflict entrepreneur is someone who makes money and/or generates a large following by intentionally pitting people against each other. And they have been around since long before the internet. Unfortunately, conflict entrepreneurship is big business, and it’s scary. One of those things is opinion. It’s scary because it’s easy to rile up peoples’ sensitivities and emotions. You take that back RIGHT NOW! Perhaps most unsettling, it takes zero experience, financial backing, wisdom, or talent to become a successful conflict entrepreneur. Eh, I don't know about that. You gotta want to do it, and have some efficacy at it, and what's that besides talent? And you can earn experience along the way. We see example after example in popular media of people who make their living off of reducing complicated issues into black-and-white binaries, removing nuance from conversation in favor of parroted talking points, and stereotyping the many based off the actions of the few. This is, I think, the important part. Think about, for example, kiddy-diddlers. I know you don't want to think about kiddy-diddlers, but I'm making a point here. There's a meme (original sense of the word) going around that drag queens are bad and they shouldn't be around children because they'll diddle them. Whereas, here in reality, the vast, vast majority of kiddy-diddlers who aren't family (happens a lot) are fine, upstanding church or school leaders. And yet if ONE trans person got caught diddling a kid, they'd say it's because they're trans; while the fine, upstanding church or school leaders who diddle kids are "mentally ill" and "don't reflect the values of the group." In other words, if someone in the in-group does something bad, it's their fault (or we ignore it, as has been the case lately). If someone in the out-group does something bad, it's the entire out-group that's at fault. To a conflict entrepreneur, your anger and your discontent are their supply. Your desire to withdraw into a tribe and demonize anyone outside of it is the capital a conflict entrepreneur needs to continue to build their empire. Like I said. Our anger sustains them. Our frustration feeds them. We're raging all over the internet, and they're sitting there chuckling. Curious questions stop them in their tracks. Okay, first of all, no; second, first you'd have to find and identify them. This process of asking yourself questions, asking questions about others, and asking questions of others is at the heart of the... ...thing he's self-promoting. As usual, I'm not avoiding talking about something in here just because someone's trying to sell a book. We're mostly writers and readers here, with many interested in selling their books and many more (hopefully) interested in reading them. 
And I think the basic points here are sound: that strifemongers exist, that they're manipulating people for fun and profit, and there are ways to aikido the hell out of them. Now if I could just remember this the next time someone posts something deliberately inflammatory. |
| From The Conversation: No. There. Article over. Question answered. Done. Let's move on. Is the whole universe just a simulation? – Moumita B., age 13, Dhaka, Bangladesh. Sigh. Okay. Fine. It's a kid's question. Probably best to not be all Calvin's Dad. How do you know anything is real? Some things you can see directly, like your fingers. Other things, like your chin, you need a mirror or a camera to see. Other things can’t be seen, but you believe in them because a parent or a teacher told you, or you read it in a book. And then there are things that are not real, but you think they are, because someone lied to you. Maybe the world we live our whole lives inside isn’t the real one, maybe it’s more like a big video game, or the movie “The Matrix.” Okay, here's my biggest problem with the simulation hypothesis, apart from it being inherently untestable and non-falsifiable: I question the motives of anyone who insists that this is a simulation. I question them even more when someone uses the word "just" as a modifier. Now, I'm not going to apply that distrust to a 13-year-old who lives on damn near the exact opposite side of the planet from me, if indeed that person is real, but for grown adults, I wonder. Because when I'm in a simulation, and I know it's a simulation, my ethics go right out the window. I have no issue with depopulating entire towns in single-player games, for example. There are no consequences outside of the game. I also question them because this only became a popular question after The Matrix. Like, you couldn't come up with it yourself but had to have it fed to you on a screen? It was a science fiction movie, for fuck's sake. (So much for targeting this to kids.) It's like asking if Klingons are real, or if replicants are real. And to add another layer of whatever to it, I've studied religion, and the simulation hypothesis is just a modern incarnation of gnosticism. The simulation hypothesis is a modern attempt to use logic and observations about technology to finally answer these questions and prove that we’re probably living in something like a giant video game. This shouldn't be too advanced for a 13-year-old: the burden of proof is on the hypothesizers. It is not on the rest of us to prove that we're not living in a simulation. Twenty years ago, a philosopher named Nick Bostrom made such an argument based on the fact that video games, virtual reality and artificial intelligence were improving rapidly. The argument has been around longer than that. Matrix came out in what, 1999? 27 years ago. That's when people in my circles started asking the question. Here’s Bostrom’s shocking logical argument: If the 21st century planet Earth only ever existed one time, but it will eventually get simulated trillions of times, and if the simulations are so good that the people in the simulation feel just like real people, then you’re probably living on one of the trillions of simulations of the Earth, not on the one original Earth. And here's where that "logical" argument falls flat on its face: We do not currently have the capability to create a simulation where the people in the simulation feel just like real people. Maybe we're close, maybe not, but we're not there. This eliminates every one of the trillions (some say infinite, which is a hell of a lot more than trillions) of intermediate simulations, leaving us with exactly two possibilities: we're in the real world, or we're in an unadvanced simulation.
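Just to make the counting explicit (this is my own little illustration of the argument as the article frames it, not anything from The Conversation piece, and the function name is made up): the whole thing rides on how many convincing simulations you're willing to assume exist.

```python
# A sketch of the counting version of the argument, under an assumption granted
# only for illustration: every "convincing" simulated Earth contains people who
# can't tell they're simulated, and you're equally likely to be any of them or
# the one original.

from fractions import Fraction

def chance_you_are_simulated(convincing_sims: int) -> Fraction:
    """One real Earth plus N convincing simulations; odds you're in a simulation."""
    return Fraction(convincing_sims, convincing_sims + 1)

# Grant trillions of convincing simulations and the odds look overwhelming:
print(chance_you_are_simulated(10**12))  # 1000000000000/1000000000001

# But the number of convincing simulations that actually exist today is zero,
# and the "you're probably simulated" conclusion goes with it:
print(chance_you_are_simulated(0))       # 0
```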
The argument from probability thus evaporates like the words on a computer you've just turned off. If we are living in a simulation, does that explain anything? Maybe the simulation has glitches, and that’s why your phone wasn’t where you were sure you left it, or how you knew something was going to happen before it did, or why that dress on the internet looked so weird. Or, maybe, our brains are just plain weird. See, there is one simulation hypothesis that I am pretty well convinced of, which is that what we experience is filtered to our brains through our senses. No outside influence, no god, no monster, no advanced technology is required for that hypothesis, just natural evolution. But Bostrom’s argument doesn’t require any scientific proof. It’s logically true as long as you really believe that many powerful simulations will exist in the future. No, that doesn't work, and it takes real mental gymnastics to make it work. But, you know... our brains are weird and can make such gymnastics. That’s why famous scientists like Neil deGrasse Tyson and tech titans like Elon Musk have been convinced of it, though Tyson now puts the odds at 50-50. Calling Musk a "scientist" is like calling me a football player. Or a scientist, for that matter. He's not. Not by any stretch of the imagination. And apparently I get to mention Tyson twice in two consecutive entries; he seems to have reached the same conclusion that I did. Even though it is far from being resolved, the simulation hypothesis is an impressive logical and philosophical argument that has challenged our fundamental notions of reality and captured the imaginations of millions. Here's my essential caveat, though: I don't think we should dismiss these ideas out of hand, any more than we should dismiss the idea of space aliens out of hand. It's just that, in the words of a real scientist, "Extraordinary claims require extraordinary evidence." But I am moved to ask: even if this is a simulation, what difference would that make? If it's so you don't have to take responsibility for your actions, like when I "kill" everyone in a fantasy town, then we're going to have a problem. If it's so you can believe there's some higher power guiding it all, then it's basically techno-gnosticism. Religion. Which is not science. If it's so you can believe you're special and everyone else is an NPC, then it's techno-solipsism. And borders on conspiracy theory thinking. If it's merely an academic question, then fine. I'm all for searching for deeper realities. That's what science does, in part. And then it's not "just" a simulation; it's just reality. If it's true. Which it's not. |
| This Big Think article is from December. You know, that special time of year when they gotta retrospect all the things. 10 scientific truths that somehow became unpopular in 2025 Scientific truths remain true regardless of belief. These 10, despite contrary claims, remain vitally important as 2025 draws to a close. Yes, I'm going to quibble about the headline before even getting into the article: "Scientific truths remain true regardless of belief" right up until new science tweaks the old truths. Some take issue with this, but personally, I embrace it. Neil deGrasse Tyson once said, “The good thing about science is that it's true whether or not you believe in it.” As regular readers know, I'm a big fan of science. Science is cool. Science (combined with mathematics) is the absolute best method we have for understanding the universe (and perhaps beyond). No other philosophical system even comes close, not even actual philosophy. But that quote? a) science gets overwritten by more science all the time; and b) religious people can, and often do, make the same claim about religion. And that's not even getting into the accusations leveled against Tyson; it's possible to be right (or wrong) about some things and also be a sex pest. However. Science is overturned by more science, not by people who've seen a few YouTube videos or listened to the disinformation specialists on social media. Certainly not by people who claim divine inspiration. And when it comes to scientific "truths," some are more certain than others. For example, there's a really extraordinarily high level of certainty when it comes to things like how atoms combine to make molecules, but significantly less for things like nutrition science. So, with that lengthy disclaimer out of the way, here's (some of) the article. No matter what it is that humans do — what we think, feel, accomplish, believe, or vote for — our shared scientific reality is the one thing that unites us all. Well. Except for that subset of "us all" who insist that there's no such thing as objective reality. Moreover, some of the quantum rules that govern reality are fundamentally indeterminate, limiting our ability to predict a system’s future behavior from even an arbitrarily well-known starting point. I'm pretty sure that chaos theory (which is what he's describing there) doesn't rest on quantum mechanics alone. And I'd make a distinction between "indeterminate" and "unpredictable." But again, those are probably quibbles. Still, scientific truths remain true, even if there are very few who accept them. Gravity worked for billions of years before humans figured out the rules that govern massive objects. Life formed, thrived, and evolved for billions of years before humans discovered evolution, genetics, and DNA. There is a probably-untestable hypothesis that the universe sprang into being, fully formed, just a few seconds ago, along with all of our memories and literature and science. This seems far-fetched, of course, but by the rules of quantum mechanics, it's not impossible; and given infinite spacetime, anything that's not impossible happens. There is another, older, probably-untestable hypothesis that you are the only consciousness, and everything else is a product of your imagination. These things are fun to think about and maybe write science fiction about. I don't actually believe them. But I suspect some people do. If, at any rate, those people actually exist and aren't products of my admittedly twisted imagination. 
However, many scientific truths have fallen out of public favor in recent times. Now, in 2025, some of the misinformation that’s replaced those truths has been elevated to prominence, and many cannot tell fact from fiction any longer. Whether you believe them or not, here are 10 scientific truths that remain true, even though you might not realize it here in the final month of 2025. None of my commentary here is meant to override the matters covered in this article. It is only to say that I understand, on some level, how one could deny these things. As always, I'm only covering a portion of this. There's quite a bit more at the link, including pretty pictures and graphs of questionable utility. 2.) Interstellar interlopers are real, and while we found a new one (only the third ever) in 2025, they are still not aliens. When I was a kid first getting hooked on astronomy, as I recall, at one point I was learning about comets and their orbits. I have a memory of being taught that some comets could have a hyperbolic trajectory, not an elliptical one, because they came from outside the solar system and would return to outside the solar system. Apparently, that was hypothetical back then. If I can even trust my memory at all. It might be something like extrasolar planets: We knew they had to be there, but there was never any direct or even indirect evidence. As for "they are still not aliens," 1) technically, they are aliens, by some definition, as they are alien to our solar system; 2) dismissing the idea out of hand that they're the product of tech-using space aliens is contrary to science and inquiry; 3) continuing to believe that they're the product of tech-using space aliens when there's overwhelming evidence that they're not is also contrary to science and inquiry. In short, it's awesome that we can track objects visiting us from extrasolar space, but screaming about space aliens doesn't help anyone's credibility. Regardless of what you believe (or what anyone believes), this object is a natural comet-like interloper originating from beyond our Solar System, and has absolutely nothing to teach us about alien life beyond Earth. First part: high probability of truth. Second part: I wouldn't jump to that conclusion. Such objects might very well provide insights into the early stages of life's development. Not sentient life, mind you. 4.) Earth’s orbit has a finite “carrying capacity,” and if we exceed that, such as with megaconstellations of satellites, it will inevitably lead to Kessler syndrome. Remember, this is a "truth" that is dismissed or ignored. You'd have to go to the article, or elsewhere on the internet, for a fuller explanation (spoiler: Kessler syndrome has nothing to do with a starship making the Kessel run in 12 parsecs). But, to me, this is an absolutely prime example of the tragedy of the commons: there's no overriding authority to regulate the number of satellites in orbit, so people keep lofting them up there. Readers of science fiction have known about this problem since, I don't know, at least as long as I've been alive. It hasn't even been 100 years since we first figured out how to put satellites in orbit, and already we're fucking it up. 5.) The germ theory of disease is real, and vaccination is the safest, most effective strategy to combat these deadly pathogens. Denial of this royally pisses me off, and sometimes I wish there were a Hell so frauds like Andrew Wakefield, who falsely claimed a link between vaccines and autism, could burn in it forever. 
Besides, believing that falsehood is basically saying "I'd rather have a dead child than an autistic one," which I can only imagine pisses off actual autistic people. 7.) The Universe’s expansion is still accelerating, the Hubble tension remains an important puzzle, and the much-publicized evidence we have is insufficient to conclude that dark energy is evolving. Look, unlike the vaccine thing, this one's pretty damn esoteric. We have to live here on Earth with the consequences of vaccine denial (and of climate change, which the article covers but I didn't quote). But this? I say let the cosmologists sort it out. I want to know the answers, too, but it tells me absolutely nothing about whether I should get a measles booster or try to recycle more stuff. To be clear, this doesn't mean I'm dismissing anything. Just that it doesn't impact anything apart from my own innate curiosity—at least, as far as I know. 9.) We’ve found evidence for organics on Mars (again), but still have no good evidence for life on any planet other than Earth. It’s important to remember, especially when specious claims about the existence of aliens are at an all-time high, that we still have no robust evidence for the existence of life on any planet or world other than Earth. Sure, other worlds could be inhabited. As with the exoplanet thing, or the extrasolar comet thing, it would be absolutely shocking if life (by which I mean simple life) doesn't exist outside our tiny planet. But until they find actual evidence, I for one am not interested in leaping to conclusions. I mean, as a fiction writer, it's fun to play with the idea, and I like Star Trek as much as the next person (and probably more), but my answer to everything unknown isn't to shrug and say "must be space aliens." Unless I'm making a joke. Which, if there's anything I enjoy more than science, it would be that. These 10 truths, although they should be completely non-controversial in a world that values factual reality, are often disputed here in 2025. Despite their unpopularity, they’re just as true as they’ve ever been, and will likely remain true for a long time to come. Don’t let anyone convince you otherwise until they’ve obtained the extraordinary evidence needed to convince even a skeptic; if the evidence cannot yet decide the matter, then the matter hasn’t been decided. All that said, I would absolutely change my mind about space aliens if a flying saucer landed in my street. Actually, my personal level of proof is way lower than that; I don't need to experience something directly to believe in it. But there needs to be a higher level of support than any "alien hypothesis" has now, even when it comes to UFO/UAP sightings. So, in brief, while I think the article's on the right track, I do feel like it's a bit simplistic and/or misleading in a couple of places. That's okay, though. It gives me something to write about. Its true sin, though, in my view anyway, is not staying on track with the "this is a truth that some people choose to deny" thing, and the subject headers are all over the place with that. I think I figured it out through context clues, though. |
| Contrary to popular belief, it is not true that I do nothing. The truth is, I do nothing useful. Here's a Guardian article on how to do nothing: The perfect way to do nothing: how to embrace the art of idling We are often so busy and yet when the opportunity arises to do nothing, we can find it uncomfortable. Here's how to lean into boredom – and unlock the imagination You would think that, of all the things we do, doing nothing would be the last one to need a how-to guide. It'd be like if Lifehacker put out a "You're drinking water wrong" article. Please, please don't tell me they already have. There are limits to my curiosity, and one of those limits is not wanting to know just where the bottom of the barrel is. On a rainy afternoon last weekend, plans got cancelled and I found myself at a loose end. Given that I'm someone who likes to have backup plans for my backup plans, my initial response was panic. Now what? I wandered aimlessly from room to room, grumpily tidying away random items. In fairness, cleaning is the thing I do when I've absolutely, completely, and totally run out of anything else to do. For good measure, I organised a triage box containing plant food, a mister and a watering can. Why are we still calling them "misters"? That's sexist as hell. Despite the palpable benefits, my initial reluctance to slow down is not unusual. Research has shown that people often underestimate the extent to which they will enjoy inactivity. There's a tendency for human beings to prefer to do something, even something unpleasant, rather than the alternative. It is true that I do not enjoy inactivity. What I enjoy is doing things that benefit no one at all, such as playing video games. Well, I suppose if I pay for the video games, I'm benefiting someone. I'll have to try harder to benefit no one. This was proved to an extraordinary degree by Harvard University psychologists whose study revealed that given the choice between sitting alone with their thoughts for as little as six to 15 minutes or giving themselves an electric shock, participants preferred to be zapped. In fairness, lots of people enjoy being zapped. In skepticism, if you know that there will be no lasting negative (pun intended) effects from getting shocked, why not choose that over doing nothing? At least you're learning what it feels like to be shocked. A true study would determine if people would rather sit alone with their thoughts for 15 minutes, or have a finger cut off without anesthesia. But I suspect that would violate some pesky ethics rule. There's another factor: guilt – particularly about appearing to be lazy. Increasingly, being busy carries a sense of status and moral superiority. "Many of us grew up with the phrase 'the devil will find work for idle hands'," says Treanor. Aw, man, I thought that was an American Puritan thing. You know, the group England kicked out. Many of us simply fear boredom. Sandi Mann is a psychologist at the University of Central Lancashire and author of The Science of Boredom. Her research revealed that boredom, far from being a bad thing, can make us more creative. I'm sorry. I'm truly, truly, sorry. But the idea of a boredom book being written by someone named Sandi Mann just triggers every absurdist neuron in my brain. ...because Sandman? Get it? Huh? Huh? I'll be here all week. When we're alert and fully rational, our critical, judging mind is ruling the show.
Or as Mann puts it: “If you’re daydreaming, you haven’t got that inhibition, that voice in your head saying, ‘Don’t be silly, that’s a ridiculous idea!’ Instead, our minds are free to roam outside the box looking for things we wouldn’t necessarily come up with when we are more conscious.” Assertion without evidence. (That "we" can be fully rational and that "we" have critical minds.) If you want to get better at being productively unproductive, there are strategies. “See it as an experiment and bring some lightness and play into it,” suggests Treanor. Nah. I just want to find ways to be even more completely useless. If you’re feeling really brave, she suggests going cold turkey and sitting doing nothing for two minutes. “Be proud of yourself for having a go. Acknowledge that it’s really hard and uncomfortable. You don’t have to judge yourself for not enjoying it. Next time you could try for longer.” But that's two minutes I could have spent looking at cat videos. There's a lot more at the link. You can go visit it. Or you can do something else. Or you can do nothing. Whatever. |
| For no other reason than I found this amusing, an article from Smithsonian: A Cat Left Paw Prints on the Pages of This Medieval Manuscript When the Ink Was Drying 500 Years Ago An exhibition called "Paws on Parchment" tracks how cats were depicted in the Middle Ages through texts and artworks from around the world—including one example of a 15th-century "keyboard cat" Now, this might be a paid ad for the museum running the exhibition. But even if it is, the article is informative by itself. More than 500 years ago, after dedicating hours to the meticulous transcription of a crucial manuscript, a Flemish scribe set the parchment out to dry—only to later return and discover the page smeared, filled with inky paw prints. I hope the scribe didn't punish the poor kitty. "Objects like [the manuscript] have a way of bridging across time, as it's just so relatable for anyone who has ever had a cat," Lynley Anne Herbert, the museum's curator of rare books and manuscripts, tells Artnet's Margaret Carrigan. "Many medieval people loved their cats just as much as we do." The common perception is that Europeans, back then, hated and/or feared cats, believing them to be agents of the devil (which, to be honest, I can kind of understand). And I've heard they were blamed for the Plague, or at least one of the Plagues, and therefore killed en masse, thus eliminating a check on the rodent population, in turn enabling the spread of the fleas carrying the plague germs. I can just hear someone from that time if I tried to explain that to them: "But still, it's cats." Anyway, point is, I'm sure that then, as now, there were people who liked and appreciated cats. Though maybe they liked them a little less when they left paw prints on your manuscript. This affection is evidenced by the myriad illustrations of cats across cultures. After finding the Flemish manuscript, Herbert searched the museum archives and found no shortage of other feline mentions or depictions in Islamic, Asian and other European texts and images. Also, apparently, they're not limiting it to Europe. And a 15th-century painting called Madonna and Child With a Cat features a small kitten beside the newborn baby Jesus. The depiction is likely a reference to the lesser-told Christian legend that a cat gave birth to a litter of kittens inside the manger at the same time that Mary gave birth to Jesus, according to the museum. And yet, to the best of my knowledge, no one worships those kittens or their mother. It's just not fair. "Paws on Parchment" is the first of three exhibitions over the next two years dedicated to animals in art. Its displays have already made an impression on viewers, human and feline alike. Shortly after its grand opening, in partnership with the Baltimore Animal Rescue and Care Shelter, four 6-week-old foster kittens were given a private tour. Herbert adopted two of them. Hopefully they won't do that with the elephant exhibit. Anyway, not much to the article, which, as I say, may very well be an ad. But it has pictures. Including pictures of the 6-week-old foster kittens from that last quote. I'll just end with this: a while back, I had to get part of my basement slab redone. They poured new concrete and, as the concrete was curing, my cat at the time decided to walk in it. He did not like having his paws washed afterward, but I never did anything about the prints in the concrete. So the next owner of this house is going to get a nice surprise. |
| Here's one from Self that caught my eye. Intense Fear of Rejection Is Common in People With This Condition Paris Hilton just highlighted her experience with it in a new interview. What do you call the condition where you pay any attention at all to someone who's only famous for being famous? No one is excited to deal with social rejection, but people with a certain mental health condition may struggle with this more than others. "Yes! I got rejected by my peers again! Whoohoo!" It’s called rejection sensitivity dysphoria, and Paris Hilton just highlighted her experience with it in a new interview. So... wait. You're telling me that people with rejection sensitivity dysphoria are sensitive to rejection? Hilton says that people with rejection sensitivity dysphoria experience negative feelings “on such a deep level.” And why are we listening to her opinion on a psychological subject? Hilton said she wasn’t even aware that rejection sensitivity dysphoria was a thing before her diagnosis, but she’s learned that many people with ADHD feel the same way as she does when it comes to social rejection. I'm not sure if I can explain any better how utterly stupid this idea is. Not the aversion to rejection. I get that. I have it, which is why I almost never initiate conversations myself. It's like... let's take one of my biggest fears, which is anything touching my eyeballs. I can just say "I have a fear of something (other than my eyelids) touching my eyeballs." Or, we can make up a psychological condition called "eyeball touch aversion," and proclaim that the reason I have a fear of anyone touching my eyeballs is because I have eyeball touch aversion. Suddenly, it doesn't seem like an irrational phobia so much as a medical condition. I could join internet support groups like "Don't touch my eyeballs!" and "Alternatives to contact lenses." It's circular. It's tautological. Hell, it's even recursive. What is it and how does it differ from a standard fear of rejection? Psychologists explain. I'm slightly more willing to accept explanations from psychologists than from useless heiresses. Rejection sensitive dysphoria is not in the DSM-5, the handbook used by health care professionals to classify and diagnose mental health conditions... I'm shocked. Shocked! I must have bullshitshockophilia. “The term appears to have originated in popular discourse about ADHD but lacks a clear clinical definition, validated diagnostic criteria, or empirical research base in peer-reviewed medical literature,” Dr. Saltz says. A rational article would have stopped there, because here's a rough (but accurate) summary of what has transpired within it thus far. Celebrity: "I have a medical condition." Medical professional: "No, you don't." Reporter: "Well, let's hear both sides." Social rejection can be upsetting to anyone, but people with rejection sensitive dysphoria experience it differently. News flash: people experience things differently. We're not all alike. Who knew? Again, shocking. The rejection sensitivity part refers to the tendency to “anxiously expect, readily perceive, and intensely respond to cues of rejection or criticism from others,” Dr. Saltz says. She notes that this can cause “significant distress through unpleasant bodily sensations, anxiety, and misery.” And? Look, I'm not trying to minimize the feelings here. As I said, they definitely apply to me. But I'm not trying to fit into a little box by proclaiming that my intense aversion to rejection is, or should be, a named psychological condition. 
Ultimately, rejection sensitive dysphoria taps into a person's core beliefs about themselves, making someone feel that they're unloveable and unworthy, Dr. Gallagher says. I'm also not really ragging on Hilton. It's not her I have a problem with, so much. It's the willingness of media to fawn all over her. Less so now than in the noughties, of course, but all that does is reinforce the idea that women are only valuable when they're still young, which of course is bullshit. And yes, I'm completely aware that by posting this entry, I'm adding, if only a little bit, to the hype. In any case, the point is, some of us are unloveable and unworthy. This might come as a shock to a physically attractive and rich celebrity, but I made my peace with it long ago. So the article goes on to list the "symptoms," which, as with most lists of symptoms, mostly just invite people to go "OMG I have that! I'm not weird; I'm diagnosed!"
Feeling easily embarrassed or self-conscious
Having trouble believing in themselves
Struggling to contain emotions when they feel rejected
Suddenly turning their feelings inward, which can mimic severe depression
Being a "people pleaser"
Avoiding starting projects, tasks, or goals where there's a chance of failure
Compensating for fear of failure or rejection by striving for perfection
OMG I have that! I'm not weird; I'm diagnosed! There is, of course, more at the article. And maybe you disagree with my point of view on this. That's okay. I promise not to take it as personal rejection. Or, I don't know; maybe I will. I can't help it, because obviously I have RSD. |
| Another story about elements, this time from, believe it or not, Irish Times. Boy (7) strikes it lucky by finding one of the world's rarest minerals near his home in Cork Within seconds of handing it over to an expert, it was clear quartz discovery was very special Around lunchtime on March 1st, 2024, Patrick Roycroft, geology curator at the National Museum of Ireland, was given a piece of mineral, about the size of a Creme Egg, by a seven-year-old boy called Ben O'Driscoll. I wanted to talk about this without making Irish jokes, but that's not going to happen. For example, those are about the most Irish names that ever Irished. Just a few weeks earlier, in mid-February, Ben had returned home after soccer practice one Saturday morning and had decided to explore a field near his home in Rockforest East, near Mallow in Co Cork. I'm going to have to assume that "had decided to explore" meant "found the end of a rainbow." When he showed his mother, Melanie, what he'd found, she sensed he'd struck it lucky. Post your leprechaun jokes in the comment section. I'm already feeling the hot, burning stares of my Irish friends. Roycroft knew exactly what he was looking for. Within seconds, he realised what he had in his palm was genuine: a true cotterite, one of the rarest forms of quartz in the world. Okay, here's where I stop making fun. Quartz is, according to what I found on the internet, one of the most common minerals on Earth. But, much as carbon can be graphite or coal or diamond, there are situations that can make quartz rare and valuable. Add iron to the matrix and subject it to gamma rays, for example, and you get amethyst (which still isn't all that rare, but it sure is pretty). And I'd never heard of cotterite. What Ben had found was the first discovery of cotterite in 150 years. That's genuinely cool. There are about three dozen known authentic cotterite specimens, which are held by museums in Cork, Dublin, London and even the Smithsonian in Washington. They were all found within a few months of each other and derive from a single horizontal vein of calcite, quartz and ferruginous mud cut through carboniferous limestone in Rockforest. I understood most of those words. I didn't know "ferruginous," but I guessed it had something to do with iron, and, as usual, I was right. What amused me was that the place name is Rockforest. It was formed in a single geological event under conditions so specific that, as far as scientists know, they have never been repeated anywhere else in the world since. A bit misleading, maybe. I might have put it "has never been found anywhere else in the world since." This tale has one character: a woman called Grace Elizabeth Cotter, who grew up in Knuttery, a townland near Rockforest in Cork. Ah. The mystery of how cotterite got its name, solved. Anyway, the article goes further into what makes this particular form of quartz unique, and I think it's pretty cool. But that's because I'm a huge nerd. Also, when I saw the article, I knew I'd have to post it here just so I could make the pun in the entry title. |
| This article from The Independent seems to have been reissued from The Conversation, a source I've linked before. Why didn't I go look for it at Conversation? Because I'm lazy. Scientists mimicking the Big Bang accidentally turn lead into gold The physicists made an unexpected breakthrough Okay, well, first of all- No. Honestly, I don't even know where to begin with that headline. I'll try to just follow the article. Medieval alchemists dreamed of transmuting lead into gold. It is certain that some did. However, it is possible that the original idea was metaphorical: to turn something common and ordinary into something rare and precious. Today, we know that lead and gold are different elements, and no amount of chemistry can turn one into the other. I won't quibble about this except to say that what we call "elements," a category based on the number of protons in an atomic nucleus, isn't what alchemists called "elements." And actually, it seems to me that this quoted sentence is a bit tautological: an element is a substance that can't be turned into another substance through chemistry. Dictionary definition: "each of more than one hundred substances that cannot be chemically interconverted or broken down into simpler substances and are primary constituents of matter." Note the definition doesn't say they can't be transmuted. Only that it can't be done via chemistry. Perhaps I digress. But our modern knowledge tells us the basic difference between an atom of lead and an atom of gold: the lead atom contains exactly three more protons. First thing I've seen here that's unambiguously true. So can we create a gold atom by simply pulling three protons out of a lead atom? As it turns out, we can. But it’s not easy. In other words, the resources needed to do so exceed, by many orders of magnitude, the value of the substance transmuted. It would be like... I don't know, let's try this analogy. Somehow, you get knowledge that there's a gold nugget buried three miles beneath you. Is it worth the expense of excavation, drilling, time, etc., to get that nugget? Not in terms of the value of the gold, it's not. Or astronomers find an asteroid made of platinum: what's the cost of retrieving the asteroid, vs. the price of platinum? Nevertheless, from a purely scientific perspective, it's cool. However, we already knew transmutation was possible. The Sun does it all the time, converting hydrogen to helium. Other elements are easier to transmute. Some even do it spontaneously, like with radioactive decay. Scientifically, it's kind of like exoplanets. Until the 1990s, no one had imaged, or even inferred the existence of, a planet around a star other than our Sun. We were certain they had to be there; it made no sense whatsoever from a scientific perspective that our star, out of all the trillions and trillions of stars in the universe, was the only one with planets. Every space opera, every science fiction book or series, simply assumed that other stars had planets. But it's one thing to believe something, and another thing entirely to have experimental verification. While smashing lead atoms into each other at extremely high speeds in an effort to mimic the state of the universe just after the Big Bang... The "mimic the Big Bang" thing is hype for the public, and it led (pun intended) to all kinds of misunderstandings about what they were actually doing. To be fair, what they were actually doing is way above my pay grade, so of course they had to find a way to explain it to the general public. 
But this created its own set of problems, like people thinking it meant they were trying to create a whole nother universe. The funniest thing to come out of this misunderstanding was this web page: https://hasthelargehadroncolliderdestroyedtheworldyet.com/ Anyway, here's CERN itself. So, getting back to the article: Despite my misgivings about its sensationalism, it actually goes on to give a pretty good explanation of what's actually happening, without getting too technical. So it's there if you care; the article itself is pretty short, unlike the LHC. One final thing: it would be a mistake to scoff at those alchemists, based on our current knowledge of science and the universe. Just as astrology preceded astronomy, alchemy was an essential step on the road to chemistry. I know I've said it before, but even Isaac Newton had alchemical beliefs. What marks a scientist, though, isn't the beliefs they start out with; it's the conclusions they end up with based on observation and experiment. And that, folks, is the true alchemy: turning the lead of guesswork and wishful thinking into the gold of knowledge and understanding.
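A small postscript on the "exactly three more protons" line: here's the bookkeeping as a trivial Python sketch of my own (the atomic numbers are standard values, not something taken from the article). The subtraction is the easy part; the nuclear physics is what costs orders of magnitude more than the gold is worth.

```python
# Proton counts (atomic numbers) for a few heavy elements; standard values, supplied for illustration.
ATOMIC_NUMBER = {"lead": 82, "thallium": 81, "mercury": 80, "gold": 79}

def protons_to_remove(source: str, target: str) -> int:
    """How many protons separate two elements; 'transmutation' means changing exactly this number."""
    return ATOMIC_NUMBER[source] - ATOMIC_NUMBER[target]

print(protons_to_remove("lead", "gold"))  # 3
```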
| Here's a source I don't share often: PCMag. Kohler's Poop-Analyzing Toilet Cam Might Also Flush Your Privacy Down the Drain Kohler Health admits it can decrypt data collected from its $599 Dekoda toilet camera, which it advertises as 'end-to-end encrypted.' The funny part, the whole reason I saved this link, is the description of a "poop-analyzing toilet" system as "end-to-end encrypted." That's where my amusement stops. A toilet camera that can analyze your poop isn't as private as its marketing suggests. I'm shocked. Shocked, I say! In October, Kohler Health announced the Dekoda, a $599 camera that hangs on the rim of your toilet and analyzes your stool and urine for potential health insights. I am moved to wonder: is this article really a privacy warning or, given the repetition of the vendor, product name, and price tag, is it actually an ad? However, Kohler designed the camera's sensors to face downward and advertised the system as end-to-end encrypted, a term that often implies the provider can't read the user's data. What's it matter where the camera is facing? Some asshole seeing my asshole is far less worrisome to me than someone being able to exploit the data. For instance, selling it to health insurers. "Oh, don't be paranoid; that won't happen." Maybe not, but the risk is too high. It's like those period-tracker apps, which women living in red states quickly found out were notifying the authorities whenever a pregnancy was possible, so they could be investigated for abortion if the period started up again too soon. Oh, wait, that didn't happen. I know. At least I don't think it has, not yet. But it's not outside the realm of possibility. Government access to data normally considered private is absolutely possible, supposedly with a search warrant, but either way, "legal investigation" is one absolute exemption to privacy. But a former technology advisor to the Federal Trade Commission took a closer look at the encryption claims, and found them to be bogus. I'm not going to get into whether this one guy was correct or not. I don't much care because I'm not going to buy a poop anal-yzer either way. I know a lot of people have given up on privacy. Those people are as annoying to me as I'm sure I am to those who haven't yet given up on the idea that we'll do anything about climate change. End-to-end encryption is most often used when talking about messaging apps... The term means that only the sender and recipient's devices can decrypt any data, preventing the service provider from reading the messages. This is why WhatsApp and Signal can't hand the contents of your messages over to law enforcement. The encryption keys are stored on the devices, not the company's servers. I vaguely remember reading recently that at least one of those apps isn't truly secure from that, either. Kohler Health also confirmed that it can harness the collected data to train AI programs, a concern that Fondrie-Teitler, the former FTC advisor mentioned above, flagged. Great. Now the AI is literally up our asses. In response to the privacy concerns, it noted: "Privacy and security are foundational to Kohler Health because we know health data is deeply personal. We welcome user feedback and want to ensure they understand that every element of the product is designed with privacy and security in mind." My own internal poop-analyzer is tuned only to that which emerges from the male bovine, and it just flashed red. |
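Coda for the curious: since the entry above hinges on what "end-to-end encrypted" is supposed to mean, here's a minimal Python sketch of the idea using the PyNaCl library (my own illustration; it has nothing to do with Kohler's actual system). The whole point is where the keys live: each device generates its own keypair, and whatever sits in the middle only ever handles ciphertext. If a company can decrypt your data on its servers, the keys aren't confined to the endpoints, and the label is doing some heavy lifting.

```python
# pip install pynacl -- a toy end-to-end encryption sketch, not any vendor's real implementation
from nacl.public import PrivateKey, Box

# Each endpoint generates its keypair locally; the private keys never leave the devices.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"deeply personal health data")

# A server in the middle can store or forward this blob, but it holds no key that opens it.
# Bob decrypts on his own device with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'deeply personal health data'
```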
| This SciAm article is only half a year old, so maybe it's still relevant. Massive Study Flips Our Story of Addiction and the Brain Brain differences in children and teens who experiment with drugs early show up before they take their first puff or sip For decades, Americans have been told a simple story about addiction... That by itself should be the first clue that the story is bullshit: it's simple. ...taking drugs damages the brain... I'm not going to argue with this, but I will point out that American football also damages the brain, so banning drugs without banning football is hypocritical as all hell. "Kids can damage their brains in this approved manner that involves violence, but not this unapproved manner that involves feeling good." "But, Waltz, football has other benefits." I disagree, but that's a topic for another time. ...and the earlier in life children start using substances, the more likely they are to progress through a "gateway" from milder ones such as marijuana to more dangerous drugs such as opioids. Okay, first of all, I'll say that kids shouldn't be "using substances" either. But kids do a lot of things they shouldn't do. Source: me, former kid. Second, there is no solid evidence for the whole "gateway" idea, but I'll get back to that. Third, no mention of alcohol or nicotine? Nicotine is highly addictive for almost anyone, though the problem with it is probably more its delivery system than the chemical itself. And alcohol is objectively a way worse drug than cannabis, though from what I understand, neither of those chemicals is inherently addictive like nicotine or opioids are. So if you're keeping track, every quote so far has been the "simple story" they mentioned. But a recent study, part of an ongoing project to scan the brains of 10,000 kids as they move through childhood into adulthood, complicates the picture. It found that the brains of those who started experimenting with cannabis, cigarettes or alcohol before age 15 showed differences from those who did not—before the individuals took their first puff or sip. You know how I keep harping in here about the hazards of confusing correlation with causation? Or about getting the causation arrow backwards? This. This is that. Now, as always, I caution against using just one study to draw firm conclusions, even though—no, especially since—it agrees with my predetermined notion. But let's at least acknowledge the possibility that it's not the drugs that are the problem, but the brains. That said, there are obvious issues with the methodology as reported here. In separate interviews, the participants and their parents also provided information on diet and substance use. Nearly a quarter of the children had used drugs including alcohol, cannabis and nicotine before the study began. Self-reporting is one of the confounding factors in nutritional studies. How much worse can it be with kids who, maybe, tried smoking a joint but refuse to tell the scientists the truth? Having a bulkier and more heavily creased brain is generally linked to higher intelligence, though these factors are far from the only ones that matter. Bigger and groovier isn't always better... This really doesn't have much to do with the point I'm trying to make, but I wanted to point out the amusing absurdity of "Bigger and groovier." Other research has associated some of the brain differences found in the study with certain personality traits: curiosity, or interest in exploring the environment, and a penchant for risk-taking. So, if I'm reading this right, it's not the dumb kids who mess with drugs.
It's the smart ones. They only become dumb later if they get addicted. If these early brain differences aren’t caused by drugs, where do they come from? They could reflect certain genetic variations or childhood exposure to adverse experiences—both of which have previously been associated with addiction risk. In other words, it's either genetics or environment. Thanks, that clears everything up! While it’s still possible that substances could chemically interfere with brain development, contributing to the elevated risk for addiction among those who start drinking or taking other drugs early, the study suggests that there are other, preexisting factors at play. I'd assume that, yes, "substances could chemically interfere with brain development." There is no reason why both can't be true. It would be more complicated, sure, but we've tried the simple answers and they don't work. Conrod emphasizes that “risky” traits have pluses as well as minuses. For example, a tendency to seek new experiences can be critical for success in science, medicine and the arts. A willingness to take risks is useful in occupations ranging from firefighting to entrepreneurship. The trick is to help young people manage such predilections safely. Of course, we could also work toward excising curiosity, risk-taking, and intelligence. We're already making great strides in that direction. So, as usual, let's not get ahead of ourselves on the jumping to conclusions train of thought with regards to this article. It's promising that the research is even being done, and at least some people are moving past the "drugs are bad" thing and into a more nuanced perspective. But nothing's certain yet. Except that I'm about 99% sure that there's no such thing as a "gateway drug." |
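To make the causation-arrow point above a little more concrete, here's a toy simulation (my own, with completely made-up numbers; it is not the study's data or model). One preexisting trait nudges both early experimentation and the measured brain feature, and the two groups still end up looking different even though "using" causes nothing at all in this little pretend world.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# A hypothetical preexisting trait (call it novelty-seeking); nothing downstream causes it.
trait = rng.normal(size=n)

# The trait raises the odds of experimenting before 15 AND shifts the brain measurement.
early_use = (trait + rng.normal(size=n)) > 1.0     # True = tried something before age 15
brain_feature = 0.5 * trait + rng.normal(size=n)   # note: early_use never appears in this formula

# The early-use group still differs on the brain feature, with zero causal effect of use.
gap = brain_feature[early_use].mean() - brain_feature[~early_use].mean()
print(f"group difference: {gap:.2f}")  # comes out well above zero
```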
| Here's Better Homes & Gardens (me: "That's still a thing?") with one of the most important articles of this or any other century: What's the Difference Between Seltzer, Club Soda, and Sparkling Water? Pour yourself a bubbly beverage and study up on the difference between these popular fizzy drinks. You might just find your new favorite! I always looked at it like this: Seltzer is Yiddish, club soda is WASPish, and sparkling water is French. Hm. Maybe I'm not all that far off. Soon after being led to your table at certain sit-down restaurants, you'll be approached with a question: "Still or sparkling?" Somehow, I've never gotten that question at dive bars. While this seems like a straightforward ask, it's a deceptively layered query. No. No, it really isn't. Do you want bubbles in your water or not? I'm not judging either way. Still, is it tap or bottled? Bottled water is tap water, for the most part. The only difference is how far away the tap is. I remember people joking about Evian a while back: "hurr hurr it's 'naive' spelled backwards." But no, it's worse than that. No one cares about spelling backwards, otherwise no one would ever go to that pretentious Erewhon place. What's worse is that the French word for a kitchen sink is "évier." Which would make the associated adjective Evian. Okay, no, the French would probably be like "eau d'évier," for water from the sink. My point, though, is that it's far more amusing to me that the water sounds like it's named after the sink than that it's "naive" backwards. They look exactly the same, but "the main difference between seltzer, club soda, and sparkling water is the actual ingredients," explains Allison Kafalas... Water and vodka look exactly the same, too. Just saying. ...Pradhan summarizes it beautifully for us: "Sparkling water (or soda water) is naturally carbonated and often contains natural minerals, while club soda has added minerals, and seltzer has none." See that? That quoted part? That could have been the article. That could have been the whole thing. But no, they have to phrase the headline in the form of a question to get clicks and please advertisers. Joke's on them. My ad-blocker works well. If you're looking for a blank slate, seltzer is it. That bubbly you make in your SodaStream? It's seltzer. Since it doesn't contain any other minerals or sodium beyond the hydrogen and oxygen that make it water, seltzer has a very mild flavor, Kafalas tells BHG. Quibble: it may not contain added minerals (though I really do appreciate the acknowledgement that water is, itself, a mineral). But most water has trace minerals, usually calcium, magnesium, and other elements that get picked up from rocks and soil. Water without these trace minerals is called "soft" for historical reasons, and these naturally occurring minerals are generally good for you and make the water taste slightly better. So, mineral water? Either it's a hard water source, or they add the minerals later. Nothing wrong with either one, though hard water can be tough on plumbing. Point is, if you're making seltzer in your SodaStream or whatever, its mineral content will depend on your local supply. We like to use seltzer to stretch full-octane cocktails into low-ABV drinks... You do you, but to me that defeats the purpose. Club soda is also carbonated water, but unlike seltzer, the "recipe" contains "added minerals like sodium bicarbonate or potassium sulfate, giving it a slightly salty taste," according to Pradhan. Oooooh, scaaaaaary cheeeeeeemicals.
Both seltzer and club soda are sparkling, too, but when it comes to a beverage specifically branded as “sparkling water,” the carbonation is natural—as are the minerals in the water. Pro tip: Before a trip to France, learn how to properly pronounce Perrier. Hint: there are no sounds in that word that resemble what Anglophones think of as "r." The article goes into which to choose for what, and that's fine; I do like to enhance my mixological knowledge. Still (pun intended of course), if you're just drinking it for hydration, I think it's a matter of taste. |
| Today, I'm featuring this article from LiveScience. This is not what I'd call a trustworthy source, but I found the article amusing enough to share. What's the darkest place in the solar system? What about the universe? Space looks very dark from Earth. But does the solar system, and the universe for that matter, have an area that's the darkest of all? What's the darkest place in the universe? My heart, of course. ...yes, I did save this questionable article for the sole (pun intended) purpose of making that joke. Look into the night sky, and it might seem like space is a vast expanse of darkness. Making me feel right at home. But are any regions darker than others? Questions like that are what make me distrust this source. It should be painfully obvious to anyone with a working brain that some regions of space would have to be darker than others. The illuminated side of the moon, e.g., as compared to the... you know. In short, the answer isn't straightforward, and it depends on whom you ask, experts told Live Science. I imagine it would depend on one's definition of "darkness." We only see a small sliver of the EM spectrum. Do we limit the answer to light visible to humans, or expand it to include things like radio waves and gamma rays? True darkness, the blackest black, is surprisingly rare and hard to pinpoint. "It's like, how much more black could this be? And the answer is none. None more black." -Nigel Tufnel ...okay, I also saved this article so I could make a Spinal Tap reference joke. My alternative joke for this line involved Vantablack and Anish Kapoor, but I'm going with the Spinal Tap one in memory of Rob Reiner. This is because there is a lot of dust in the cosmos: Dust scatters light, making space glow far beyond stars... That's my excuse for not cleaning: the room's brighter when it's dusty. As a result, there is a background glow that permeates much of the universe. (The color of the universe is actually "cosmic latte," a beige shade not too far off white.) See, saying stuff like that may be true, but you need to explain it better lest people snort and say stuff about "common sense," and dismiss anything science comes up with as a result. Darkness also "depends on how you define it," Andreas Burkert, a theoretical astrophysicist at the University of Munich, told Live Science. Okay, I'm not the only one who quibbles about the EM spectrum. If you consider only visible light, there are some exceedingly dark places in space. And they all work in law firms. Firstly, cosmic objects can be made of light-absorbing material, making them appear very dark. Scientifically, this is known as albedo, or the amount of light reflected off a surface. We think of the illuminated surface of the moon as bright. But it's really rather dark, as anyone obsessed enough to pick up the background dialogue from Pink Floyd's greatest album can attest. The nucleus of comet Borrelly (also called 19P/Borrelly) is one of the darkest spots in our solar system, according to the Guinness Book of World Records. I trust Guinness World Records even less than this source. But wouldn't the interior of any planet be pretty damn dark in visible light? Black holes, too, are dark because they capture light that crosses the event horizon. But interestingly, "that doesn't mean that there is no light," Burkert said. "It simply is trapped." As a result, "when you enter the black hole, it's actually extremely bright," he explained. And stuff like this is misleading as hell, too. 
If the light is trapped, there is no light, from an outside perspective. And if you "enter the black hole," you're not coming back out to report on its brightness. And furthermore, we've all seen images of accretion disks around a black hole, which are, for various reasons, really bright. So anyway. There's more at the link. Like I said, it's an interesting question, and not one with an easy answer... unless you're a comedian. |
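One last coda, since "albedo" got name-dropped above: it's nothing fancier than the fraction of incoming light a surface throws back. A tiny sketch, with rough ballpark figures I'm supplying for illustration (commonly quoted values, not numbers from the article):

```python
def albedo(reflected: float, incident: float) -> float:
    """Fraction of incoming light bounced back: 0 is perfectly absorbing, 1 is a perfect reflector."""
    return reflected / incident

# Rough ballpark values: the Moon returns roughly 12% of the sunlight hitting it,
# a dark comet nucleus like Borrelly only about 3%, and fresh snow around 80%.
for name, frac in [("Moon", albedo(12, 100)), ("comet nucleus", albedo(3, 100)), ("fresh snow", albedo(80, 100))]:
    print(f"{name}: {frac:.2f}")
```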