Items to fit into your overhead compartment |
| Regular readers know I'm a fan of etymology and other language studies. Here's one from NPR: For many in the business world, a return to work after the winter break will mean once again donning the dreaded suit and tie. Pretty sure that's falling out of fashion, except for, like, lawyers. The corporate neckwear is the everyday counterpart to the traditionally more luxurious cravat – a voluminous neckscarf that conjures up images of opulent dinners aboard a yacht sailing through the Mediterranean. It does no such thing for me. But I do know that what we call "a tie" is called "une cravate" in French, and France has a Mediterranean coast, so... whatever; I don't really have a point here, unlike my ties. Yes, I do own some. President Abraham Lincoln wore cravats, as did Hollywood actor Cary Grant and the extravagant entertainer Liberace. At least one, possibly all three, of those men were gay. Nothing wrong with that, of course, at least not from today's perspective; I'm just pointing out that it might be a factor. In more recent times, the garment has been popularized in the American mainstream by the likes of Madonna and the late Diane Keaton. Fashion has been moving toward more unisex styles, from what little I know of it. Nothing wrong with that, either. In this installment of NPR's "Word of the Week" series we trace the origins of the "cravat" (borrowed from the French "cravate") back to the battlefields of 17th century Europe and explore its links to the modern day necktie, patented in New York more than 100 years ago. That is, honestly, more recent than I thought modern neckties were. "Scarves worn around the neck existed long before, but the story of the cravat truly begins in the Thirty Years' War when it first gained wider European recognition," explains Filip Hren... As someone who has studied fighting skills, albeit briefly and without much enthusiasm, I've often wondered about that. Something worn tied around one's neck is a liability in a fight. Unless it's a fake, designed to throw the opponent off-guard when they grab it to strangle you, and it instead comes off in your hand, giving you at least a temporary advantage. Hren is referring to the 1618-48 conflict fought between Catholics and Protestants and known as Europe's last religious war. Heh. That's funny. The word "cravate" first appeared in the French language to describe military attire worn by Croatian mercenaries who were renowned among their enemies for their brutal fighting prowess. Looking like a fighter is at least half the battle. Not sure if Sun Tzu wrote that, but I believe it to be true. Made of silk or cotton, the cloth is said to have been used to protect their faces against cold weather and smoke in battle, and to treat injuries. For a while, neckwear existed with a practical purpose (for non-warriors): shirts didn't have top buttons, or had really bad top buttons, so they used ties (of various styles) to hold the collar closed for a cleaner, more formal look. For fighters, I can only imagine that they could turn its inherent disadvantage into an advantage: "I can kick your ass even with this liability looped around my most vital body connection." "The scarves took their names from Croats. It was tied in a Croatian manner, or in French – a la Croate," explains Filip Hren. And that, I didn't know until I read this article. As an aside, Croats should not be confused with the Croatoan, a Native American tribe located largely in what is now North Carolina. 
King Louis XIV introduced the cravat into French fashion and from Paris it soon spread across Europe. And who, in the history of the world, has had more impact on clothing fashion than the French? No one, I say. They also kick military ass. Coincidence? I think not. Over the years, the necktie has come to symbolize success, sophistication and status, but has also been criticized by some as a symbol of power, control and oppression. I don't really understand fashion, but I am rather attuned to symbolism (pretty much have to be, as a writer). Remaining unexplained, however, is the continued popularity of its cousin, the bowtie. |
| Here's a source I've never linked before, apparently some self-promoter called Michael Ashford: Not that self-promotion is inherently bad. But check how many times he (yes, I'm assuming gender) pushes his podcast, newsletter, book, etc. This does not mean the content is bad, either. Have you ever heard of the term “conflict entrepreneur?” Until my conversation with Martin Carcasson, I hadn’t heard it. That's because someone made it up. All words and phrases are made up, of course, just some more recently than others. This particular one isn't catchy or short enough to ever catch on, the way other phrases like "concern troll" have. I propose "strifemonger." ...the idea is simple: A conflict entrepreneur is someone who makes money and/or generates a large following by intentionally pitting people against each other. And they have been around since long before the internet. Unfortunately, conflict entrepreneurship is big business, and it’s scary. One of those things is opinion. It’s scary because it’s easy to rile up people’s sensitivities and emotions. You take that back RIGHT NOW! Perhaps most unsettling, it takes zero experience, financial backing, wisdom, or talent to become a successful conflict entrepreneur. Eh, I don't know about that. You gotta want to do it, and have some efficacy at it, and what's that besides talent? And you can earn experience along the way. We see example after example in popular media of people who make their living off of reducing complicated issues into black-and-white binaries, removing nuance from conversation in favor of parroted talking points, and stereotyping the many based off the actions of the few. This is, I think, the important part. Think about, for example, kiddy-diddlers. I know you don't want to think about kiddy-diddlers, but I'm making a point here. There's a meme (original sense of the word) going around that drag queens are bad and they shouldn't be around children because they'll diddle them. Whereas, here in reality, the vast, vast majority of kiddy-diddlers who aren't family (happens a lot) are fine, upstanding church or school leaders. And yet if ONE trans person got caught diddling a kid, they'd say it's because they're trans; while the fine, upstanding church or school leaders who diddle kids are "mentally ill" and "don't reflect the values of the group." In other words, if someone in the in-group does something bad, it's their fault (or we ignore it, as has been the case lately). If someone in the out-group does something bad, it's the entire out-group that's at fault. To a conflict entrepreneur, your anger and your discontent are their supply. Your desire to withdraw into a tribe and demonize anyone outside of it is the capital a conflict entrepreneur needs to continue to build their empire. Like I said. Our anger sustains them. Our frustration feeds them. We're raging all over the internet, and they're sitting there chuckling. Curious questions stop them in their tracks. Okay, first of all, no; second, you'd first have to find and identify them. This process of asking yourself questions, asking questions about others, and asking questions of others is at the heart of the... ...thing he's self-promoting. As usual, I'm not avoiding talking about something in here just because someone's trying to sell a book. We're mostly writers and readers here, with many interested in selling their books and many more (hopefully) interested in reading them. 
And I think the basic points here are sound: that strifemongers exist, that they're manipulating people for fun and profit, and that there are ways to aikido the hell out of them. Now if I could just remember this the next time someone posts something deliberately inflammatory. |
| From The Conversation: No. There. Article over. Question answered. Done. Let's move on. Is the whole universe just a simulation? – Moumita B., age 13, Dhaka, Bangladesh Sigh. Okay. Fine. It's a kid's question. Probably best not to be all Calvin's Dad about this one. How do you know anything is real? Some things you can see directly, like your fingers. Other things, like your chin, you need a mirror or a camera to see. Other things can’t be seen, but you believe in them because a parent or a teacher told you, or you read it in a book. And then there are things that are not real, but you think they are, because someone lied to you. Maybe the world we live our whole lives inside isn’t the real one, maybe it’s more like a big video game, or the movie “The Matrix.” Okay, here's my biggest problem with the simulation hypothesis, apart from it being inherently untestable and non-falsifiable: I question the motives of anyone who insists that this is a simulation. I question them even more when someone uses the word "just" as a modifier. Now, I'm not going to apply that distrust to a 13-year-old who lives on damn near the exact opposite side of the planet from me, if indeed that person is real, but for grown adults, I wonder. Because when I'm in a simulation, and I know it's a simulation, my ethics go right out the window. I have no issue with depopulating entire towns in single-player games, for example. There are no consequences outside of the game. I also question them because this only became a popular question after The Matrix. Like, you couldn't come up with it yourself but had to have it fed to you on a screen? It was a science fiction movie, for fuck's sake. (So much for targeting this to kids.) It's like asking if Klingons are real, or if replicants are real. And to add another layer of whatever to it, I've studied religion, and the simulation hypothesis is just a modern incarnation of gnosticism. The simulation hypothesis is a modern attempt to use logic and observations about technology to finally answer these questions and prove that we’re probably living in something like a giant video game. This shouldn't be too advanced for a 13-year-old: the burden of proof is on the hypothesizers. It is not on the rest of us to prove that we're not living in a simulation. Twenty years ago, a philosopher named Nick Bostrom made such an argument based on the fact that video games, virtual reality and artificial intelligence were improving rapidly. The argument has been around longer than that. Matrix came out in what, 1999? 27 years ago. That's when people in my circles started asking the question. Here’s Bostrom’s shocking logical argument: If the 21st century planet Earth only ever existed one time, but it will eventually get simulated trillions of times, and if the simulations are so good that the people in the simulation feel just like real people, then you’re probably living on one of the trillions of simulations of the Earth, not on the one original Earth. And here's where that "logical" argument falls flat on its face: We do not currently have the capability to create a simulation where the people in the simulation feel just like real people. Maybe we're close, maybe not, but we're not there. This eliminates every one of the trillions (some say infinite, which is a hell of a lot more than trillions) of intermediate simulations, leaving us with exactly two possibilities: we're in the real world, or we're in an unadvanced simulation. 
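To put toy numbers on Bostrom's setup (my arithmetic, not his, and the simulation count is obviously a placeholder):

```python
# Toy version of Bostrom's probability argument. N is the number of
# fully convincing simulations of Earth that ever get run; there is
# exactly one original. If you can't tell which you're in, your odds
# of being in base reality are 1 in (N + 1).
def p_base_reality(n_simulations: int) -> float:
    return 1 / (1 + n_simulations)

print(p_base_reality(10**12))  # Bostrom's picture: effectively zero
print(p_base_reality(0))       # the demonstrated value of N so far: 1.0
```

The entire force of the argument comes from assuming N is enormous. The demonstrated value of N, as of this writing, is zero.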
The argument from probability thus evaporates like the words on a computer you've just turned off. If we are living in a simulation, does that explain anything? Maybe the simulation has glitches, and that’s why your phone wasn’t where you were sure you left it, or how you knew something was going to happen before it did, or why that dress on the internet looked so weird. Or, maybe, our brains are just plain weird. See, there is one simulation hypothesis that I am pretty well convinced of, which is that what we experience is filtered to our brains through our senses. No outside influence, no god, no monster, no advanced technology is required for that hypothesis, just natural evolution. But Bostrom’s argument doesn’t require any scientific proof. It’s logically true as long as you really believe that many powerful simulations will exist in the future. No, that doesn't work, and it takes real mental gymnastics to make it work. But, you know... our brains are weird and are capable of such gymnastics. That’s why famous scientists like Neil deGrasse Tyson and tech titans like Elon Musk have been convinced of it, though Tyson now puts the odds at 50-50. Calling Musk a "scientist" is like calling me a football player. Or a scientist, for that matter. He's not. Not by any stretch of the imagination. And apparently I get to mention Tyson twice in two consecutive entries; he seems to have reached the same conclusion that I did. Even though it is far from being resolved, the simulation hypothesis is an impressive logical and philosophical argument that has challenged our fundamental notions of reality and captured the imaginations of millions. Here's my essential caveat, though: I don't think we should dismiss these ideas out of hand, any more than we should dismiss the idea of space aliens out of hand. It's just that, in the words of a real scientist, "Extraordinary claims require extraordinary evidence." But I am moved to ask: even if this is a simulation, what difference would that make? If it's so you don't have to take responsibility for your actions, like when I "kill" everyone in a fantasy town, then we're going to have a problem. If it's so you can believe there's some higher power guiding it all, then it's basically techno-gnosticism. Religion. Which is not science. If it's so you can believe you're special and everyone else is an NPC, then it's techno-solipsism. And borders on conspiracy theory thinking. If it's merely an academic question, then fine. I'm all for searching for deeper realities. That's what science does, in part. And then it's not "just" a simulation; it's just reality. If it's true. Which it's not. |
| This Big Think article is from December. You know, that special time of year when they gotta retrospect all the things. 10 scientific truths that somehow became unpopular in 2025 Scientific truths remain true regardless of belief. These 10, despite contrary claims, remain vitally important as 2025 draws to a close. Yes, I'm going to quibble about the headline before even getting into the article: "Scientific truths remain true regardless of belief" right up until new science tweaks the old truths. Some take issue with this, but personally, I embrace it. Neil deGrasse Tyson once said, “The good thing about science is that it's true whether or not you believe in it.” As regular readers know, I'm a big fan of science. Science is cool. Science (combined with mathematics) is the absolute best method we have for understanding the universe (and perhaps beyond). No other philosophical system even comes close, not even actual philosophy. But that quote? a) science gets overwritten by more science all the time; and b) religious people can, and often do, make the same claim about religion. And that's not even getting into the accusations leveled against Tyson; it's possible to be right (or wrong) about some things and also be a sex pest. However. Science is overturned by more science, not by people who've seen a few YouTube videos or listened to the disinformation specialists on social media. Certainly not by people who claim divine inspiration. And when it comes to scientific "truths," some are more certain than others. For example, there's a really extraordinarily high level of certainty when it comes to things like how atoms combine to make molecules, but significantly less for things like nutrition science. So, with that lengthy disclaimer out of the way, here's (some of) the article. No matter what it is that humans do — what we think, feel, accomplish, believe, or vote for — our shared scientific reality is the one thing that unites us all. Well. Except for that subset of "us all" who insist that there's no such thing as objective reality. Moreover, some of the quantum rules that govern reality are fundamentally indeterminate, limiting our ability to predict a system’s future behavior from even an arbitrarily well-known starting point. I'm pretty sure that chaos theory (which is what he's describing there) doesn't rest on quantum mechanics alone. And I'd make a distinction between "indeterminate" and "unpredictable." But again, those are probably quibbles. Still, scientific truths remain true, even if there are very few who accept them. Gravity worked for billions of years before humans figured out the rules that govern massive objects. Life formed, thrived, and evolved for billions of years before humans discovered evolution, genetics, and DNA. There is a probably-untestable hypothesis that the universe sprang into being, fully formed, just a few seconds ago, along with all of our memories and literature and science. This seems far-fetched, of course, but by the rules of quantum mechanics, it's not impossible; and given infinite spacetime, anything that's not impossible happens. There is another, older, probably-untestable hypothesis that you are the only consciousness, and everything else is a product of your imagination. These things are fun to think about and maybe write science fiction about. I don't actually believe them. But I suspect some people do. If, at any rate, those people actually exist and aren't products of my admittedly twisted imagination. 
However, many scientific truths have fallen out of public favor in recent times. Now, in 2025, some of the misinformation that’s replaced those truths has been elevated to prominence, and many cannot tell fact from fiction any longer. Whether you believe them or not, here are 10 scientific truths that remain true, even though you might not realize it here in the final month of 2025. None of my commentary here is meant to override the matters covered in this article. It is only to say that I understand, on some level, how one could deny these things. As always, I'm only covering a portion of this. There's quite a bit more at the link, including pretty pictures and graphs of questionable utility. 2.) Interstellar interlopers are real, and while we found a new one (only the third ever) in 2025, they are still not aliens. When I was a kid first getting hooked on astronomy, as I recall, at one point I was learning about comets and their orbits. I have a memory of being taught that some comets could have a hyperbolic trajectory, not an elliptical one, because they came from outside the solar system and would return to outside the solar system. Apparently, that was hypothetical back then. If I can even trust my memory at all. It might be something like extrasolar planets: We knew they had to be there, but there was never any direct or even indirect evidence. As for "they are still not aliens," 1) technically, they are aliens, by some definition, as they are alien to our solar system; 2) dismissing the idea out of hand that they're the product of tech-using space aliens is contrary to science and inquiry; 3) continuing to believe that they're the product of tech-using space aliens when there's overwhelming evidence that they're not is also contrary to science and inquiry. In short, it's awesome that we can track objects visiting us from extrasolar space, but screaming about space aliens doesn't help anyone's credibility. Regardless of what you believe (or what anyone believes), this object is a natural comet-like interloper originating from beyond our Solar System, and has absolutely nothing to teach us about alien life beyond Earth. First part: high probability of truth. Second part: I wouldn't jump to that conclusion. Such objects might very well provide insights into the early stages of life's development. Not sentient life, mind you. 4.) Earth’s orbit has a finite “carrying capacity,” and if we exceed that, such as with megaconstellations of satellites, it will inevitably lead to Kessler syndrome. Remember, this is a "truth" that is dismissed or ignored. You'd have to go to the article, or elsewhere on the internet, for a fuller explanation (spoiler: Kessler syndrome has nothing to do with a starship making the Kessel run in 12 parsecs). But, to me, this is an absolutely prime example of the tragedy of the commons: there's no overriding authority to regulate the number of satellites in orbit, so people keep lofting them up there. Readers of science fiction have known about this problem since, I don't know, at least as long as I've been alive. It hasn't even been 100 years since we first figured out how to put satellites in orbit, and already we're fucking it up. (For a feel of why the cascade math is so unforgiving, there's a toy model at the end of this entry.) 5.) The germ theory of disease is real, and vaccination is the safest, most effective strategy to combat these deadly pathogens. Denial of this royally pisses me off, and sometimes I wish there were a Hell so frauds like Andrew Wakefield, who falsely claimed a link between vaccines and autism, could burn in it forever. 
Besides, believing that falsehood is basically saying "I'd rather have a dead child than an autistic one," which I can only imagine pisses off actual autistic people. 7.) The Universe’s expansion is still accelerating, the Hubble tension remains an important puzzle, and the much-publicized evidence we have is insufficient to conclude that dark energy is evolving. Look, unlike the vaccine thing, this one's pretty damn esoteric. We have to live here on Earth with the consequences of vaccine denial (and of climate change, which the article covers but I didn't quote). But this? I say let the cosmologists sort it out. I want to know the answers, too, but it tells me absolutely nothing about whether I should get a measles booster or try to recycle more stuff. To be clear, this doesn't mean I'm dismissing anything. Just that it doesn't impact anything apart from my own innate curiosity—at least, as far as I know. 9.) We’ve found evidence for organics on Mars (again), but still have no good evidence for life on any planet other than Earth. It’s important to remember, especially when specious claims about the existence of aliens are at an all-time high, that we still have no robust evidence for the existence of life on any planet or world other than Earth. Sure, other worlds could be inhabited. As with the exoplanet thing, or the extrasolar comet thing, it would be absolutely shocking if life (by which I mean simple life) doesn't exist outside our tiny planet. But until they find actual evidence, I for one am not interested in leaping to conclusions. I mean, as a fiction writer, it's fun to play with the idea, and I like Star Trek as much as the next person (and probably more), but my answer to everything unknown isn't to shrug and say "must be space aliens." Unless I'm making a joke. Which, if there's anything I enjoy more than science, it would be that. These 10 truths, although they should be completely non-controversial in a world that values factual reality, are often disputed here in 2025. Despite their unpopularity, they’re just as true as they’ve ever been, and will likely remain true for a long time to come. Don’t let anyone convince you otherwise until they’ve obtained the extraordinary evidence needed to convince even a skeptic; if the evidence cannot yet decide the matter, then the matter hasn’t been decided. All that said, I would absolutely change my mind about space aliens if a flying saucer landed in my street. Actually, my personal level of proof is way lower than that; I don't need to experience something directly to believe in it. But there needs to be a higher level of support than any "alien hypothesis" has now, even when it comes to UFO/UAP sightings. So, in brief, while I think the article's on the right track, I do feel like it's a bit simplistic and/or misleading in a couple of places. That's okay, though. It gives me something to write about. Its true sin, though, in my view anyway, is not staying on track with the "this is a truth that some people choose to deny" thing, and the subject headers are all over the place with that. I think I figured it out through context clues, though. |
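About that Kessler cascade: here's the toy model I promised. Every constant in it is made up; the only real content is the shape of the math, where collisions scale with the square of the object count and each collision breeds new colliders:

```python
# Toy Kessler cascade. Invented constants; the point is the n-squared
# coupling: more objects -> more collisions -> more objects.
n = 20_000      # tracked objects in one crowded orbital shell (made up)
k = 1e-8        # collision probability per object-pair per year (made up)
frag = 150      # new trackable fragments per collision (made up)

for year in range(1, 61):
    pairs = n * (n - 1) // 2
    n += int(k * pairs * frag)
    if year % 10 == 0:
        print(f"year {year}: {n:,} objects")
# Growth starts out almost flat, then the yearly increment snowballs.
# Run it a bit longer and the model diverges; that runaway is the syndrome.
```

That's the tragedy-of-the-commons part in miniature: each individual launch barely moves the number, but the collision term punishes the total.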
| Contrary to popular belief, it is not true that I do nothing. The truth is, I do nothing useful. Here's a Guardian article on how to do nothing: The perfect way to do nothing: how to embrace the art of idling We are often so busy and yet when the opportunity arises to do nothing, we can find it uncomfortable. Here’s how to lean into boredom – and unlock the imagination You would think that, of all the things we do, you wouldn't need a how-to guide for doing nothing. It'd be like if Lifehacker put out a "You're drinking water wrong" article. Please, please don't tell me they already have. There are limits to my curiosity, and one of those limits is not wanting to know just where the bottom of the barrel is. On a rainy afternoon last weekend, plans got cancelled and I found myself at a loose end. Given that I’m someone who likes to have backup plans for my backup plans, my initial response was panic. Now what? I wandered aimlessly from room to room, grumpily tidying away random items. In fairness, cleaning is the thing I do when I've absolutely, completely, and totally run out of anything else to do. For good measure, I organised a triage box containing plant food, a mister and a watering can. Why are we still calling them "misters"? That's sexist as hell. Despite the palpable benefits, my initial reluctance to slow down is not unusual. Research has shown that people often underestimate the extent to which they will enjoy inactivity. There’s a tendency for human beings to prefer to do something, even something unpleasant, than the alternative. It is true that I do not enjoy inactivity. What I enjoy is doing things that benefit no one at all, such as playing video games. Well, I suppose if I pay for the video games, I'm benefiting someone. I'll have to try harder to benefit no one. This was proved to an extraordinary degree by Harvard University psychologists whose study revealed that given the choice between sitting alone with their thoughts for as little as six to 15 minutes or giving themselves an electric shock, participants preferred to be zapped. In fairness, lots of people enjoy being zapped. In skepticism, if you know that there will be no lasting negative (pun intended) effects from getting shocked, why not choose that over doing nothing? At least you're learning what it feels like to be shocked. A true study would determine if people would rather sit alone with their thoughts for 15 minutes, or have a finger cut off without anesthesia. But I suspect that would violate some pesky ethics rule. There’s another factor: guilt – particularly about appearing to be lazy. Increasingly, being busy carries a sense of status and moral superiority. “Many of us grew up with the phrase ‘the devil will find work for idle hands’,” says Treanor. Aw, man, I thought that was an American Puritan thing. You know, the group England kicked out. Many of us simply fear boredom. Sandi Mann is a psychologist at the University of Central Lancashire and author of The Science of Boredom. Her research revealed that boredom, far from being a bad thing, can make us more creative. I'm sorry. I'm truly, truly, sorry. But the idea of a boredom book being written by someone named Sandi Mann just triggers every absurdist neuron in my brain. ...because Sandman? Get it? Huh? Huh? I'll be here all week. When we’re alert and fully rational, our critical, judging mind is ruling the show. 
Or as Mann puts it: “If you’re daydreaming, you haven’t got that inhibition, that voice in your head saying, ‘Don’t be silly, that’s a ridiculous idea!’ Instead, our minds are free to roam outside the box looking for things we wouldn’t necessarily come up with when we are more conscious.” Assertion without evidence. (That "we" can be fully rational and that "we" have critical minds.) If you want to get better at being productively unproductive, there are strategies. “See it as an experiment and bring some lightness and play into it,” suggests Treanor. Nah. I just want to find ways to be even more completely useless. If you’re feeling really brave, she suggests going cold turkey and sitting doing nothing for two minutes. “Be proud of yourself for having a go. Acknowledge that it’s really hard and uncomfortable. You don’t have to judge yourself for not enjoying it. Next time you could try for longer.” But that's two minutes I could have spent looking at cat videos. There's a lot more at the link. You can go visit it. Or you can do something else. Or you can do nothing. Whatever. |
| For no other reason than I found this amusing, an article from Smithsonian: A Cat Left Paw Prints on the Pages of This Medieval Manuscript When the Ink Was Drying 500 Years Ago An exhibition called “Paws on Parchment” tracks how cats were depicted in the Middle Ages through texts and artworks from around the world—including one example of a 15th-century “keyboard cat” Now, this might be a paid ad for the museum running the exhibition. But even if it is, the article is informative by itself. More than 500 years ago, after dedicating hours to the meticulous transcription of a crucial manuscript, a Flemish scribe set the parchment out to dry—only to later return and discover the page smeared, filled with inky paw prints. I hope the scribe didn't punish the poor kitty. “Objects like [the manuscript] have a way of bridging across time, as it’s just so relatable for anyone who has ever had a cat,” Lynley Anne Herbert, the museum’s curator of rare books and manuscripts, tells Artnet’s Margaret Carrigan. “Many medieval people loved their cats just as much as we do.” The common perception is that Europeans, back then, hated and/or feared cats, believing them to be agents of the devil (which, to be honest, I can kind of understand). And I've heard they were blamed for the Plague, or at least one of the Plagues, therefore killed en masse, thus eliminating a check on the rodent population, in turn enabling the spread of the flea with the plague germs. I can hear someone from that time right now, were I to try explaining that to them: "But still, it's cats." Anyway, point is, I'm sure that then, as now, there were people who liked and appreciated cats. Though maybe liked them a little less when they left paw prints on your manuscript. This affection is evidenced by the myriad illustrations of cats across cultures. After finding the Flemish manuscript, Herbert searched the museum archives and found no shortage of other feline mentions or depictions in Islamic, Asian and other European texts and images. Also, apparently, they're not limiting it to Europe. And a 15th-century painting called Madonna and Child With a Cat features a small kitten beside the newborn baby Jesus. The depiction is likely a reference to the lesser-told Christian legend that a cat gave birth to a litter of kittens inside the manger at the same time that Mary gave birth to Jesus, according to the museum. And yet, to the best of my knowledge, no one worships those kittens or their mother. It's just not fair. “Paws on Parchment” is the first of three exhibitions over the next two years dedicated to animals in art. Its displays have already made an impression on viewers, human and feline alike. Shortly after its grand opening, in partnership with the Baltimore Animal Rescue and Care Shelter, four 6-week-old foster kittens were given a private tour. Herbert adopted two of them. Hopefully they won't do that with the elephant exhibit. Anyway, not much to the article which, as I say, may very well be an ad. But it has pictures. Including pictures of the 6-week-old foster kittens from that last quote. I'll just end with this: a while back, I had to get part of my basement slab redone. They poured new concrete and, as the concrete was curing, my cat at the time decided to walk in it. He did not like having his paws washed afterward, but I never did anything about the prints in the concrete. So the next owner of this house is going to get a nice surprise. |
| Here's one from Self that caught my eye. Intense Fear of Rejection Is Common in People With This Condition Paris Hilton just highlighted her experience with it in a new interview. What do you call the condition where you pay any attention at all to someone who's only famous for being famous? No one is excited to deal with social rejection, but people with a certain mental health condition may struggle with this more than others. "Yes! I got rejected by my peers again! Whoohoo!" It’s called rejection sensitivity dysphoria, and Paris Hilton just highlighted her experience with it in a new interview. So... wait. You're telling me that people with rejection sensitivity dysphoria are sensitive to rejection? Hilton says that people with rejection sensitivity dysphoria experience negative feelings “on such a deep level.” And why are we listening to her opinion on a psychological subject? Hilton said she wasn’t even aware that rejection sensitivity dysphoria was a thing before her diagnosis, but she’s learned that many people with ADHD feel the same way as she does when it comes to social rejection. I'm not sure if I can explain any better how utterly stupid this idea is. Not the aversion to rejection. I get that. I have it, which is why I almost never initiate conversations myself. It's like... let's take one of my biggest fears, which is anything touching my eyeballs. I can just say "I have a fear of something (other than my eyelids) touching my eyeballs." Or, we can make up a psychological condition called "eyeball touch aversion," and proclaim that the reason I have a fear of anyone touching my eyeballs is because I have eyeball touch aversion. Suddenly, it doesn't seem like an irrational phobia so much as a medical condition. I could join internet support groups like "Don't touch my eyeballs!" and "Alternatives to contact lenses." It's circular. It's tautological. Hell, it's even recursive. What is it and how does it differ from a standard fear of rejection? Psychologists explain. I'm slightly more willing to accept explanations from psychologists than from useless heiresses. Rejection sensitive dysphoria is not in the DSM-5, the handbook used by health care professionals to classify and diagnose mental health conditions... I'm shocked. Shocked! I must have bullshitshockophilia. “The term appears to have originated in popular discourse about ADHD but lacks a clear clinical definition, validated diagnostic criteria, or empirical research base in peer-reviewed medical literature,” Dr. Saltz says. A rational article would have stopped there, because here's a rough (but accurate) summary of what has transpired within it thus far. Celebrity: "I have a medical condition." Medical professional: "No, you don't." Reporter: "Well, let's hear both sides." Social rejection can be upsetting to anyone, but people with rejection sensitive dysphoria experience it differently. News flash: people experience things differently. We're not all alike. Who knew? Again, shocking. The rejection sensitivity part refers to the tendency to “anxiously expect, readily perceive, and intensely respond to cues of rejection or criticism from others,” Dr. Saltz says. She notes that this can cause “significant distress through unpleasant bodily sensations, anxiety, and misery.” And? Look, I'm not trying to minimize the feelings here. As I said, they definitely apply to me. But I'm not trying to fit into a little box by proclaiming that my intense aversion to rejection is, or should be, a named psychological condition. 
Ultimately, rejection sensitive dysphoria taps into a person’s core beliefs about themselves, making someone feel that they’re unloveable and unworthy, Dr. Gallagher says. I'm also not really ragging on Hilton. It's not her I have a problem with, so much. It's the willingness of media to fawn all over her. Less so now than in the noughties, of course, but all that does is reinforce the idea that women are only valuable when they're still young, which of course is bullshit. And yes, I'm completely aware that by posting this entry, I'm adding, if only a little bit, to the hype. In any case, the point is, some of us are unloveable and unworthy. This might come as a shock to a physically attractive and rich celebrity, but I made my peace with it long ago. So the article goes on to list the "symptoms," which, as with most lists of symptoms, mostly just invite people to go "OMG I have that! I'm not weird; I'm diagnosed!"
- Feeling easily embarrassed or self-conscious
- Having trouble believing in themselves
- Struggling to contain emotions when they feel rejected
- Suddenly turning their feelings inward, which can mimic severe depression
- Being a “people pleaser”
- Avoiding starting projects, tasks, or goals where there’s a chance of failure
- Compensating for fear of failure or rejection by striving for perfection
OMG I have that! I'm not weird; I'm diagnosed! There is, of course, more at the article. And maybe you disagree with my point of view on this. That's okay. I promise not to take it as personal rejection. Or, I don't know; maybe I will. I can't help it, because obviously I have RSD. |
| Another story about elements, this time from, believe it or not, the Irish Times. Boy (7) strikes it lucky by finding one of the world’s rarest minerals near his home in Cork Within seconds of handing it over to an expert, it was clear quartz discovery was very special Around lunchtime on March 1st, 2024, Patrick Roycroft, geology curator at the National Museum of Ireland, was given a piece of mineral, about the size of a Creme Egg, by a seven-year-old boy called Ben O’Driscoll. I wanted to talk about this without making Irish jokes, but that's not going to happen. For example, those are about the most Irish names that ever Irished. Just a few weeks earlier, in mid-February, Ben had returned home after soccer practice one Saturday morning and had decided to explore a field near his home in Rockforest East, near Mallow in Co Cork. I'm going to have to assume that "had decided to explore" meant "found the end of a rainbow." When he showed his mother, Melanie, what he’d found, she sensed he’d struck it lucky. Post your leprechaun jokes in the comment section. I'm already feeling the hot, burning stares of my Irish friends. Roycroft knew exactly what he was looking for. Within seconds, he realised what he had in his palm was genuine: a true cotterite, one of the rarest forms of quartz in the world. Okay, here's where I stop making fun. Quartz is, according to what I found on the internet, one of the most common minerals on Earth: plain old silicon dioxide, the main ingredient of ordinary sand. But, much as carbon can be graphite or coal or diamond, there are situations that can make quartz rare and valuable. Add iron to the matrix and subject it to gamma rays, for example, and you get amethyst (which still isn't all that rare, but it sure is pretty). And I'd never heard of cotterite. What Ben had found was the first discovery of cotterite in 150 years. That's genuinely cool. There are about three dozen known authentic cotterite specimens, which are held by museums in Cork, Dublin, London and even the Smithsonian in Washington. They were all found within a few months of each other and derive from a single horizontal vein of calcite, quartz and ferruginous mud cut through Carboniferous limestone in Rockforest. I understood most of those words. I didn't know "ferruginous," but I guessed it had something to do with iron, and, as usual, I was right. What amused me was that the place name is Rockforest. It was formed in a single geological event under conditions so specific that, as far as scientists know, they have never been repeated anywhere else in the world since. A bit misleading, maybe. I might have put it "has never been found anywhere else in the world since." This tale has one character: a woman called Grace Elizabeth Cotter, who grew up in Knuttery, a townland near Rockforest in Cork. Ah. The mystery of how cotterite got its name, solved. Anyway, the article goes further into what makes this particular form of quartz unique, and I think it's pretty cool. But that's because I'm a huge nerd. Also, when I saw the article, I knew I'd have to post it here just so I could make the pun in the entry title. |
| This article from The Independent seems to have been reissued from The Conversation, a source I've linked before. Why didn't I go look for it at Conversation? Because I'm lazy. Scientists mimicking the Big Bang accidentally turn lead into gold The physicists made an unexpected breakthrough Okay, well, first of all- No. Honestly, I don't even know where to begin with that headline. I'll try to just follow the article. Medieval alchemists dreamed of transmuting lead into gold. It is certain that some did. However, it is possible that the original idea was metaphorical: to turn something common and ordinary into something rare and precious. Today, we know that lead and gold are different elements, and no amount of chemistry can turn one into the other. I won't quibble about this except to say that what we call "elements," a category based on the number of protons in an atomic nucleus, isn't what alchemists called "elements." And actually, it seems to me that this quoted sentence is a bit tautological: an element is a substance that can't be turned into another substance through chemistry. Dictionary definition: "each of more than one hundred substances that cannot be chemically interconverted or broken down into simpler substances and are primary constituents of matter." Note the definition doesn't say they can't be transmuted. Only that it can't be done via chemistry. Perhaps I digress. But our modern knowledge tells us the basic difference between an atom of lead and an atom of gold: the lead atom contains exactly three more protons. First thing I've seen here that's unambiguously true. So can we create a gold atom by simply pulling three protons out of a lead atom? As it turns out, we can. But it’s not easy. In other words, the resources needed to do so exceed, by many orders of magnitude, the value of the substance transmuted. (I've put some back-of-envelope arithmetic on that at the end of this entry.) It would be like... I don't know, let's try this analogy. Somehow, you get knowledge that there's a gold nugget buried three miles beneath you. Is it worth the expense of excavation, drilling, time, etc., to get that nugget? Not in terms of the value of the gold, it's not. Or astronomers find an asteroid made of platinum: what's the cost of retrieving the asteroid, vs. the price of platinum? Nevertheless, from a purely scientific perspective, it's cool. However, we already knew transmutation was possible. The Sun does it all the time, converting hydrogen to helium. Other elements are easier to transmute. Some even do it spontaneously, like with radioactive decay. Scientifically, it's kind of like exoplanets. Until the 1990s, no one had imaged, or even inferred the existence of, a planet around a star other than our Sun. We were certain they had to be there; it made no sense whatsoever from a scientific perspective that our star, out of all the trillions and trillions of stars in the universe, was the only one with planets. Every space opera, every science fiction book or series, simply assumed that other stars had planets. But it's one thing to believe something, and another thing entirely to have experimental verification. While smashing lead atoms into each other at extremely high speeds in an effort to mimic the state of the universe just after the Big Bang... The "mimic the Big Bang" thing is hype for the public, and it led (pun intended) to all kinds of misunderstandings about what they were actually doing. To be fair, what they were actually doing is way above my pay grade, so of course they had to find a way to explain it to the general public. 
But this created its own set of problems like people thinking it meant they were trying to create a whole nother universe. The funniest thing to come out of this misunderstanding was this web page: https://hasthelargehadroncolliderdestroyedtheworldyet.com/ Anyway, here's CERN itself, if you want the official word. So, getting back to the article: Despite my misgivings about its sensationalism, it actually goes on to do a pretty good explanation of what's actually happening, without getting too technical. So it's there if you care; the article itself is pretty short, unlike the LHC. One final thing: it would be a mistake to scoff at those alchemists, based on our current knowledge of science and the universe. Just as astrology preceded astronomy, alchemy was an essential step on the road to chemistry. I know I've said it before, but even Isaac Newton had alchemical beliefs. What marks a scientist, though, isn't the beliefs they start out with; it's the conclusions they end up with based on observation and experiment. And that, folks, is the true alchemy: turning the lead of guesswork and wishful thinking into the gold of knowledge and understanding. |
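And the promised back-of-envelope arithmetic on why transmutation won't pay. The production rate below is a number I invented to be absurdly generous; the atomic mass, the unit conversion, and the rough gold price are real:

```python
# Why lead-into-gold won't pay, even at a wildly generous production
# rate. Only the first constant is invented; the rest are real-ish.
NUCLEI_PER_SECOND = 1e9                  # hypothetical, very generous
SECONDS_PER_YEAR = 3.15e7
GOLD_197_MASS_KG = 196.97 * 1.66054e-27  # atomic mass units -> kg
GOLD_PRICE_PER_GRAM = 100.0              # USD, rough 2025 ballpark

kg_per_year = NUCLEI_PER_SECOND * SECONDS_PER_YEAR * GOLD_197_MASS_KG
print(f"yield: {kg_per_year * 1e9:.0f} micrograms of gold per year")
print(f"value: ${kg_per_year * 1000 * GOLD_PRICE_PER_GRAM:.4f} per year")
```

About ten micrograms a year, worth a tenth of a cent, against a particle accelerator's power bill. The alchemists' business model does not survive contact with arithmetic.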
| Here's a source I don't share often: PCMag. Kohler's Poop-Analyzing Toilet Cam Might Also Flush Your Privacy Down the Drain Kohler Health admits it can decrypt data collected from its $599 Dekoda toilet camera, which it advertises as 'end-to-end encrypted.' The funny part, the whole reason I saved this link, is the description of a "poop-analyzing toilet" system as "end-to-end encrypted." That's where my amusement stops. A toilet camera that can analyze your poop isn’t as private as its marketing suggests. I'm shocked. Shocked, I say! In October, Kohler Health announced the Dekoda, a $599 camera that hangs on the rim of your toilet and analyzes your stool and urine for potential health insights. I am moved to wonder: is this article really a privacy warning or, given the repetition of the vendor, product name, and price tag, is it actually an ad? However, Kohler designed the camera’s sensors to face downward and advertised the system as end-to-end encrypted, a term that often implies the provider can’t read the user’s data. What's it matter where the camera is facing? Some asshole seeing my asshole is far less worrisome to me than someone being able to exploit the data. For instance, selling it to health insurers. "Oh, don't be paranoid; that won't happen." Maybe not, but the risk is too high. It's like those period-tracker apps, which women living in red states quickly found out were notifying the authorities whenever a pregnancy was possible, so they could be investigated for abortion if the period started up again too soon. Oh, wait, that didn't happen. I know. At least I don't think it has, not yet. But it's not outside the realm of possibility. Government access to data normally considered private is absolutely possible, supposedly with a search warrant, but either way, "legal investigation" is one absolute exception to privacy. But a former technology advisor to the Federal Trade Commission took a closer look at the encryption claims, and found them to be bogus. I'm not going to get into whether this one guy was correct or not. I don't much care because I'm not going to buy a poop anal-yzer either way. I know a lot of people have given up on privacy. Those people are as annoying to me as I'm sure I am to those who haven't yet given up on the idea that we'll do anything about climate change. End-to-end encryption is most often used when talking about messaging apps... The term means that only the sender and recipient’s devices can decrypt any data, preventing the service provider from reading the messages. This is why WhatsApp and Signal can’t hand the contents of your messages over to law enforcement. The encryption keys are stored on the devices, not the company’s servers. (There's a toy sketch of how that works at the end of this entry.) I vaguely remember reading recently that at least one of those apps isn't truly secure from that, either. Kohler Health also confirmed that it can harness the collected data to train AI programs, a concern that Fondrie-Teitler flagged. Great. Now the AI is literally up our asses. In response to the privacy concerns, it noted: “Privacy and security are foundational to Kohler Health because we know health data is deeply personal. We welcome user feedback and want to ensure they understand that every element of the product is designed with privacy and security in mind.” My own internal poop-analyzer is tuned only to that which emerges from the male bovine, and it just flashed red. |
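As for that toy sketch of end-to-end encryption: here it is in a few lines of Python, using the PyNaCl library. The names and the message are mine, and this is the general concept, not Kohler's (or anyone's) actual implementation:

```python
# End-to-end means the secret keys live only on the endpoints. Whatever
# sits in the middle (a server, a toilet) relays ciphertext it can't read.
from nacl.public import PrivateKey, Box

you = PrivateKey.generate()      # secret key never leaves your device
doctor = PrivateKey.generate()   # likewise on the receiving end

ciphertext = Box(you, doctor.public_key).encrypt(b"deeply personal data")
# The provider stores and forwards only `ciphertext`. Without one of the
# private keys, it's noise. If the vendor can decrypt it server-side,
# it was never end-to-end encrypted, whatever the marketing says.
assert Box(doctor, you.public_key).decrypt(ciphertext) == b"deeply personal data"
```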
| This SciAm article is only half a year old, so maybe it's still relevant. Massive Study Flips Our Story of Addiction and the Brain Brain differences in children and teens who experiment with drugs early show up before they take their first puff or sip For decades, Americans have been told a simple story about addiction... That by itself should be the first clue that the story is bullshit: it's simple. ...taking drugs damages the brain... I'm not going to argue with this, but I will point out that American football also damages the brain, so banning drugs without banning football is hypocritical as all hell. "Kids can damage their brains in this approved manner that involves violence, but not this unapproved manner that involves feeling good." "But, Waltz, football has other benefits." I disagree, but that's a topic for another time. ...and the earlier in life children start using substances, the more likely they are to progress through a “gateway” from milder ones such as marijuana to more dangerous drugs such as opioids. Okay, first of all, I'll say that kids shouldn't be "using substances" either. But kids do a lot of things they shouldn't do. Source: me, former kid. Second, there is no "gateway"; more on that at the end. Third, no mention of alcohol or nicotine? Nicotine is highly addictive for almost anyone, though the problem with it is probably more its delivery system than the chemical itself. And alcohol is objectively a way worse drug than cannabis, though from what I understand, neither of those chemicals are inherently addictive like nicotine or opioids are. So if you're keeping track, every quote so far has been the "simple story" they mentioned. But a recent study, part of an ongoing project to scan the brains of 10,000 kids as they move through childhood into adulthood, complicates the picture. It found that the brains of those who started experimenting with cannabis, cigarettes or alcohol before age 15 showed differences from those who did not—before the individuals took their first puff or sip. You know how I keep harping in here about the hazards of confusing correlation with causation? Or about getting the causation arrow backwards? This. This is that. (There's a toy simulation of exactly this trap at the end of this entry.) Now, as always, I caution against using just one study to draw firm conclusions, even though—no, especially since—it agrees with my predetermined notion. But let's at least acknowledge the possibility that it's not the drugs that are the problem, but the brains. That said, there are obvious issues with the methodology as reported here. In separate interviews, the participants and their parents also provided information on diet and substance use. Nearly a quarter of the children had used drugs including alcohol, cannabis and nicotine before the study began. Self-reporting is one of the confounding factors in nutritional studies. How much worse can it be with kids who, maybe, tried smoking a joint but refuse to tell the scientists the truth? Having a bulkier and more heavily creased brain is generally linked to higher intelligence, though these factors are far from the only ones that matter. Bigger and groovier isn’t always better... This really doesn't have much to do with the point I'm trying to make, but I wanted to point out the amusing absurdity of "Bigger and groovier." Other research has associated some of the brain differences found in the study with certain personality traits: curiosity, or interest in exploring the environment, and a penchant for risk-taking. So, if I'm reading this right, it's not the dumb kids who mess with drugs. 
It's the smart ones. They only become dumb later if they get addicted. If these early brain differences aren’t caused by drugs, where do they come from? They could reflect certain genetic variations or childhood exposure to adverse experiences—both of which have previously been associated with addiction risk. In other words, it's either genetics or environment. Thanks, that clears everything up! While it’s still possible that substances could chemically interfere with brain development, contributing to the elevated risk for addiction among those who start drinking or taking other drugs early, the study suggests that there are other, preexisting factors at play. I'd assume that, yes, "substances could chemically interfere with brain development." There is no reason why both can't be true. It would be more complicated, sure, but we've tried the simple answers and they don't work. Conrod emphasizes that “risky” traits have pluses as well as minuses. For example, a tendency to seek new experiences can be critical for success in science, medicine and the arts. A willingness to take risks is useful in occupations ranging from firefighting to entrepreneurship. The trick is to help young people manage such predilections safely. Of course, we could also work toward excising curiosity, risk-taking, and intelligence. We're already making great strides in that direction. So, as usual, let's not get ahead of ourselves on the jumping to conclusions train of thought with regards to this article. It's promising that the research is even being done, and at least some people are moving past the "drugs are bad" thing and into a more nuanced perspective. But nothing's certain yet. Except that I'm about 99% sure that there's no such thing as a "gateway drug." |
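Here's that toy simulation of the correlation trap, promised above. One hidden trait nudges both the brain measurement and the odds of early substance use; every number is invented:

```python
# Confounder demo: 'trait' influences both variables; neither variable
# causes the other, yet they correlate. All numbers are made up.
import numpy as np

rng = np.random.default_rng(42)
trait = rng.normal(size=100_000)                # e.g., novelty-seeking
brain = trait + rng.normal(size=100_000)        # trait shapes the scan
early_use = (trait + rng.normal(size=100_000)) > 1.5  # trait raises odds

r = np.corrcoef(brain, early_use)[0, 1]
print(f"correlation between brain measure and early use: r = {r:.2f}")
# Prints a solidly positive r, with zero direct causation in the model.
```

Swap "genetic variation" or "adverse childhood experience" in for the trait, and you get the study's alternative explanations for free.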
| Here's Better Homes & Gardens (me: "That's still a thing?") with one of the most important articles of this or any other century: What's the Difference Between Seltzer, Club Soda, and Sparkling Water? Pour yourself a bubbly beverage and study up on the difference between these popular fizzy drinks. You might just find your new favorite! I always looked at it like this: Seltzer is Yiddish, club soda is WASPish, and sparkling water is French. Hm. Maybe I'm not all that far off. Soon after being led to your table at certain sit-down restaurants, you’ll be approached with a question: “Still or sparkling?” Somehow, I've never gotten that question at dive bars. While this seems like a straightforward ask, it’s a deceptively layered query. No. No, it really isn't. Do you want bubbles in your water or not? I'm not judging either way. Still, is it tap or bottled? Bottled water is tap water, for the most part. The only difference is how far away the tap is. I remember people joking about Evian a while back "hurr hurr it's 'naive' spelled backwards." But no, it's worse than that. No one cares about spelling backwards, otherwise no one would ever go to that pretentious Erewhon place. What's worse is that the French word for "sink" is "évier." Which would make the associated adjective Evian. Okay, no, the French would probably be like "eau d'évier," for water from the sink. My point, though, is that it's far more amusing to me that the water is named similarly to a sink than it's "naive" backwards. They look exactly the same, but “the main difference between seltzer, club soda, and sparkling water is the actual ingredients,” explains Allison Kafalas... Water and vodka look exactly the same, too. Just saying. ...Pradhan summarizes it beautifully for us: “Sparkling water (or soda water) is naturally carbonated and often contains natural minerals, while club soda has added minerals, and seltzer has none.” See that? That quoted part? That could have been the article. That could have been the whole thing. But no, they have to phrase the headline in the form of a question to get clicks and please advertisers. Joke's on them. My ad-blocker works well. If you’re looking for a blank slate, seltzer is it. That bubbly you make in your SodaStream? It’s seltzer. Since it doesn’t contain any other minerals or sodium beyond the hydrogen and oxygen that make it water, seltzer has a very mild flavor, Kafalas tells BHG. Quibble: it may not contain added minerals (though I really do appreciate the acknowledgement that water is, itself, a mineral). But most water has trace minerals, usually calcium, magnesium, and other elements that get picked up from rocks and soil. Water without these trace minerals is called "soft" for historical reasons, and these naturally occurring minerals are generally good for you and make the water taste slightly better. So, mineral water? Either it's a hard water source, or they add the minerals later. Nothing wrong with either one, though hard water can be tough on plumbing. Point is, if you're making seltzer in your SodaStream or whatever, its mineral content will depend on your local supply. We like to use seltzer to stretch full-octane cocktails into low-ABV drinks... You do you, but to me that defeats the purpose. Club soda is also carbonated water, but unlike seltzer, the “recipe” contains “added minerals like sodium bicarbonate or potassium sulfate, giving it a slightly salty taste,” according to Pradhan. Oooooh, scaaaaaary cheeeeeeemicals. 
Both seltzer and club soda are sparkling, too, but when it comes to a beverage specifically branded as “sparkling water,” the carbonation is natural—as are the minerals in the water. Pro tip: Before a trip to France, learn how to properly pronounce Perrier. Hint: there are no sounds in that word that resemble what Anglophones think of as "r." The article goes into which to choose for what, and that's fine; I do like to enhance my mixological knowledge. Still (pun intended of course), if you're just drinking it for hydration, I think it's a matter of taste. |
| Today, I'm featuring this article from LiveScience. This is not what I'd call a trustworthy source, but I found the article amusing enough to share. What's the darkest place in the solar system? What about the universe? Space looks very dark from Earth. But does the solar system, and the universe for that matter, have an area that's the darkest of all? What's the darkest place in the universe? My heart, of course. ...yes, I did save this questionable article for the sole (pun intended) purpose of making that joke. Look into the night sky, and it might seem like space is a vast expanse of darkness. Making me feel right at home. But are any regions darker than others? Questions like that are what make me distrust this source. It should be painfully obvious to anyone with a working brain that some regions of space would have to be darker than others. The illuminated side of the moon, e.g., as compared to the... you know. In short, the answer isn't straightforward, and it depends on whom you ask, experts told Live Science. I imagine it would depend on one's definition of "darkness." We only see a small sliver of the EM spectrum. Do we limit the answer to light visible to humans, or expand it to include things like radio waves and gamma rays? True darkness, the blackest black, is surprisingly rare and hard to pinpoint. "It's like, how much more black could this be? And the answer is none. None more black." -Nigel Tufnel ...okay, I also saved this article so I could make a Spinal Tap reference joke. My alternative joke for this line involved Vantablack and Anish Kapoor, but I'm going with the Spinal Tap one in memory of Rob Reiner. This is because there is a lot of dust in the cosmos: Dust scatters light, making space glow far beyond stars... That's my excuse for not cleaning: the room's brighter when it's dusty. As a result, there is a background glow that permeates much of the universe. (The color of the universe is actually "cosmic latte," a beige shade not too far off white.) See, saying stuff like that may be true, but you need to explain it better lest people snort and say stuff about "common sense," and dismiss anything science comes up with as a result. Darkness also "depends on how you define it," Andreas Burkert, a theoretical astrophysicist at the University of Munich, told Live Science. Okay, I'm not the only one who quibbles about the EM spectrum. If you consider only visible light, there are some exceedingly dark places in space. And they all work in law firms. Firstly, cosmic objects can be made of light-absorbing material, making them appear very dark. Scientifically, this is known as albedo, or the amount of light reflected off a surface. We think of the illuminated surface of the moon as bright. But it's really rather dark, as anyone obsessed enough to pick up the background dialogue from Pink Floyd's greatest album can attest. The nucleus of comet Borrelly (also called 19P/Borrelly) is one of the darkest spots in our solar system, according to the Guinness Book of World Records. I trust Guinness World Records even less than this source. But wouldn't the interior of any planet be pretty damn dark in visible light? Black holes, too, are dark because they capture light that crosses the event horizon. But interestingly, "that doesn't mean that there is no light," Burkert said. "It simply is trapped." As a result, "when you enter the black hole, it's actually extremely bright," he explained. And stuff like this is misleading as hell, too. 
If the light is trapped, there is no light, from an outside perspective. And if you "enter the black hole," you're not coming back out to report on its brightness. And furthermore, we've all seen images of accretion disks around a black hole, which are, for various reasons, really bright. So anyway. There's more at the link. Like I said, it's an interesting question, and not one with an easy answer... unless you're a comedian. |
| Short one today, a supposed travel article from TimeOut. These four U.S. cities were voted among the most inauthentic in the world. A new study reveals the cities that lean heavily into tourist-trap experiences, according to travelers. "According to travelers." How about according to locals, or, I dunno, neutral parties? Whatever. No one should take lists like this seriously, anyway. If you’ve ever come home from a trip feeling like you spent more time in souvenir shops than local hotspots, you're not alone. Never been to a beach town, huh? The souvenir shops are the local hotspots. A new study from travel insurance provider InsureandGo... How very scientific. ...suggests that several major U.S. cities have earned a reputation for coming off as tourist productions rather than authentic, lived-in places. There is, of course, a little bit more to the world than the US, but this article just focuses on that. The study reviewed more than 1.3 million Google Maps reviews across 144 global cities, tracking how often travelers described experiences as "authentic," "local" or "traditional" versus "tourist trap" or "overpriced." Okay. At least they had criteria in mind and presumably didn't just sit there and brainstorm a list of cities they personally hated. It's still a bit subjective. Chicago topped the global list as the least authentic city, earning a remarkably low score of 2 out of 100. Pfft. The only thing inauthentic about Chicago is their "pizza." Las Vegas landed close behind at number three... Oh, come on. If I were pulling city names out of my ass for "inauthentic," Vegas would be at the very tippity-top of the list. Nashville came in at number four globally. Okay. I have no opinions about Nashville. Boston rounds out the U.S. cities deemed the most inauthentic, at number six in the world. Its rich history and walkability draw millions, but travelers frequently described central areas of the city as crowded and geared toward visitors rather than residents. ...so what? "We want that sweet, sweet tourism money" is an authentic declaration. It's worth noting that these rankings don't mean these cities aren't worth visiting. They're popular for a reason. "No one goes there anymore. It's too crowded." -Yogi Berra The article ends with the full Top 10 list of "inauthentic" cities. Venice, I could see, though I've never been there. But come on. Brussels? Get the fuck outta heah, as they say authentically in Boston. |
| Today's article, from Vox, is about something that really chaps my ass. Airports and airlines have a fake service dog problem. Why so many people take advantage of airlines’ service dog loophole. Admittedly, I don't fly all that often. I don't think I was on even one plane last year. And the last flight I took was rather pleasant, because it didn't have that issue. But I remember one flight in particular that, I'm fairly certain, held more canines than humans. The stench was horrific, and made me long for the good old days when cigarette smoke was the worst odor in an airplane cabin. But it's not just airlines that have this problem. As for "why," I'll tell you why, at the risk of mortally offending any readers who might be faking their service dogs: because they're inconsiderate twats. There are people with legitimate need for a service animal. I get that. I'm in no way ragging on them. The fakers make their lives worse, though, because it leads the rest of us to stop taking true service animals, and their accompanying humans, seriously. On the first leg of that trip, from New York to Los Angeles, a dog in a “service dog” vest barked at me at the gate. The dog (not its given name) looked to be a stout French bulldog, paced back and forth, and yapped at a couple of other travelers. Okay, "(not its given name)" is legitimately hilarious. It all made me realize how many dogs traveling these days are designated service dogs, so many that there’s no way each one was a thoroughly trained working canine. Some of these pooches had to be impostors. Ya think? The trouble is, it's rude (and sometimes illegal) to call them out on this. Which is one reason the humans can take advantage of their Main Character Syndrome. I'm fully aware that conditions requiring service animals aren't always visible. Hell, my ex-wife had epilepsy. She never had a service dog, but other epileptics do, and for very good reason. Point is, you'd never know she was in any way "differently abled" or whatever the current proper nomenclature is, unless she told you or had a seizure (rare) in front of you; she was a belly dancer, for fuck's sake. So yeah, not everyone with a service animal is obviously in need of one. But, again, that just makes the problem worse, because it's easier to fake. Why are there so many? Why and how do so many people have them? Is certification that easy to get? Do this many people need them? Why is this one barking at me? Are these people who just want to take their dog on their trip? Does being suspicious of some of them make me awful? Is a fake service dog really that bad? Because many people are inconsiderate twats, because many other people support and enable inconsiderate twattery, yes, no, because it's not a real service dog, yes, no, and hell to the exponent of yes. Sadly, I could not speak to an actual service dog for an interview regarding this contentious subject. Funny shit like this does manage to dilute my rage somewhat. More and more people want to travel with their pets, and despite airline assurances about safety, owners still harbor some overall worry about traveling with their animals in cargo. That's legitimate. It does not, however, excuse inconsiderate twattery. Also, "want to travel with their pets" is a whole different ball game from "need to travel with their pets." At the same time, traveling in the US with a pet dog in cabin — thanks to a multitude of rules — is actually difficult. We could re-regulate the airline industry, but... hey, why are you laughing? 
“There are plenty of owner-trained, well-behaved service dogs, and they are training their dogs to do actual physical tasks, and they should be given access. But I think we’re also talking about a lot of people not wanting to leave their dogs at home,” Reiss says. Again—because this comes up anytime someone rants about this on the internet—I am in no way saying that people who actually need a service animal are the problem. I am, in fact, saying the opposite: that the people with fake service animals make the lives of people with real service animals more difficult, and the last thing they need is more difficult lives. That said, it’s even more complicated, because no one wants to be a person who treats someone with a disability with suspicion or doubt. Then tighten the requirements, for shit's sake. “That’s the thing, the rules don’t even matter,” Molly Carta, a woman living with cerebral palsy who has a service dog named Slate, tells Vox. “I feel that way half the time too. I’m like, why did I pay $50 for this vet visit to get this form filled out? This person over here is just going to walk on with their dog.” I just want to say that if that were my name, I'd absolutely change my first name to Magna. For a long time, Carta believed that educating people about how service dogs are a medical need was the answer. But the more time that passes, the more she’s realized that more public awareness doesn’t work if people aren’t willing to listen. Words of wisdom, indeed. There's a lot more at the link, of course. And yes, I'm aware that the solution would need to be more nuanced than "tighten the requirements," as I said above. In the meantime, I'm avoiding flights as much as I can. Except, of course, for flights of beer, wine, whiskey, or fancy. |
| Breaking one of my until-now unspoken rules here, I'm going to link to HuffPo today. As a reminder, I browse using ad and script blockers, so hopefully you'll be able to see the content through whatever popups they push at you. What 'Only Children' Bring Up The Most In Therapy. From feeling misunderstood to putting unnecessary pressure on themselves, here's what the only child may need help with. I'm shattering my rule because, as a former "only child," I was interested in what they had to say. Of course, it's been over 40 years since I could be considered a "child," but it's not like I suddenly grew siblings as I got older. Okay, that's not entirely true. There are people who I consider brothers and sisters, though I can't be sure if it's the same kind of relationship because all I have to go by are other people's stories. And judging by some of those stories, I didn't miss out on anything good. Some, but definitely not all. If you grew up as an only child, you’ve likely heard some of these stereotypical phrases at some point in your life: “That’s sad you grew up all alone.” “Your parents must’ve spoiled you.” “Do you have a hard time making friends?” And yet, they never ask people who had siblings things like "Was it hard, not being the center of attention?" Or, "How did it feel to feud with your siblings over the inheritance?" To address the quoted questions from my point of view: Not sad at all, it prepared me for a life of something close to self-sufficiency; yeah, they kind of did, but so what; and no, what I have a hard time with is meeting people. Yet recent research shows that many of these portrayals of only children are inaccurate. Color me shocked. Even though growing up without siblings is becoming more common, there’s still a long-lasting stigma around only children. And? At least when I was a kid, there was an even bigger stigma around childless people (the concept of "childfree" wasn't a thing yet). My parents could adopt exactly one brat, and that was, to my great good fortune, Me. We talked to therapists about the most common issues they hear only children bring up. What's not immediately clear is that this is mostly about what they bring up when they're older. You're always someone's child, but you're not always a "child." English is weird. In therapy, adult only children sometimes share that they feel lonely because they come from a smaller family and don’t have any sibling relationships. Not meaning to minimize others' experiences, but one thing I don't remember ever feeling was "lonely," either as a child or as a (technical) adult. There was always someone around to interact with. But I am moved to ask, perhaps rhetorically: "What about people with siblings who feel lonely?" In my view, it's better to never have had siblings at all than it is to be in a shitty relationship with the ones you have. “Holidays can be especially lonely for some only children because they often don’t have the big family gatherings that you see in movies and on TV.” Yeah... those movie and TV gatherings are generally idealized (or, possibly, whatever the opposite of "idealized" is when it shows a dysfunctional family). I've been to big family gatherings—both my ex-wives came from more traditional families, though one was also adopted—and while it was never exactly an unpleasant experience, I personally prefer to stay home by myself and relax rather than put my best face on. 
As adults, many only children will seek out close friendships that feel like family members to fill that void, Clark said. See, this kind of wording is something I bristle at. It implicitly makes having siblings the "norm" while keeping one-child families as the "weird." Yes, as the article notes, they're a minority. But so are lots of other minorities—gay people, for example—and very few professionals these days would say something like, "many gay people will seek out close friendships with another gender to fill that void." At least not without getting pushback from both gay and straight folks. It's natural to seek out close friendships. Even I do it. It's not an exclusively "only child" thing. “Many adult only children feel overwhelmed and stressed being the only person in their family to handle all the elder care responsibilities for their elderly parents,” Greene said. Well, I had parents and a childfree aunt to deal with. And, to be honest, I couldn't. My parents both developed dementia, and that was way beyond what I was able to handle, so yeah, I hired professionals for that. It wasn't like I could quit my job to care for them full-time. Can I just point out, though, how weird it is that we put all the burden of elder care on the kids? That sort of thing may have made sense in a pre-industrial society, but now, it's just weird. Though having lots of attention from parents can lead to closer relationships with them, some only children may also feel like their every move is being watched. Well, that just prepares them for the reality of a surveillance state. Oh, and again, that's not limited to onlies. “Growing up as an only child can create a large sense of independence, which can be both a strength and a weakness,” said Priya Tahim... While I admit that it can be a weakness, I see it as a strength, at least in myself. I've never been reluctant to ask for help when I truly needed it. Speaking of which, anyone want to come over on Monday and shovel sn*w, so I don't get another heart attack? I'll pay. They may feel misunderstood or judged for being an only child. Okay, sure. But partly, that's because of articles like this one. Further perpetuating these stereotypes, only children are often portrayed negatively in movies and TV shows, such as being spoiled, selfish and having poor social skills, Greene added. I remember doing an article on a similar subject, recently, focused on adopted children. As I was both, my representation in media is fucked. At least until I remember Clark Kent. If you’d like to connect with and seek support from other adults who grew up without siblings, Greene recommends joining support groups on Facebook for only children. No. Therapy can also be an effective place to explore how your childhood is shaping who you are — no matter what your birth order is. I'm not going to rag on therapy in general. I've done it, with mixed results. But here's the problem: one of the things I'd like to talk about in therapy is my lack of motivation to do just about anything. To do that, I'd have to find a therapist. To find a therapist, I'd have to do work. And I don't have the motivation to do work, so I don't go looking for shrinks. Is that a vicious cycle, or a catch-22? I tried reading that book once and got bored very quickly. “Whether you are an only child, [oldest child], middle child or [youngest] child, there are pros and cons to each,” Tahim said. 
“It’s how we choose to grow, learn and adapt … that truly matters.” While I could quibble about "choose," I'm reminded of one of my favorite quotes of all time, from cartoonist R.K. Milholland: In the end, we decide if we're remembered for what happened to us or for what we did with it. |
| Unlike yesterday's entry, I know exactly why I saved this one from Mental Floss: because words are fun. 15 Words Derived From Mythological Creatures—From “Money” to “Cereal.” Characters of ancient Greek and Roman mythologies have worked their way into modern vocabularies. I'm also going to brag that I knew almost every one of these. You can believe me or not; doesn't change anything. As the first month of the year, January somewhat appropriately takes its name from the Roman god Janus, who was associated with entrances, doorways, gates, and beginnings. Knew that one, too, though January wasn't always the first month of the year. 15 more words we owe to the Greeks and Romans are explored here. And I'm not covering all 15. Aurora was the Roman goddess of the dawn... As the early-morning bringer of daily light, Aurora’s name later came to be attached to the famous dawn-like phenomenon of swirling colored arches of light that appear in the night sky at high and low latitudes. Okay, well, auroras aren't very dawn-like, from what little I've seen. And I've been trying to see; there have been reports of auroras being seen all through the continental US due to a recent solar storm, but, as usual, I saw nothing. Hyacinth is said to have been a beautiful young man who was struck on the head and killed while the god Apollo taught him how to throw a discus. At least he didn't teach him how to throw a disco. This, by the way, was the one I hadn't been aware of. Both money and the coin-producing mint where it is made take their names from Juno Moneta, an epithet for the Roman goddess Juno specifically associated with an ancient temple erected in her honor on Rome’s Capitoline Hill. What the linked article doesn't say, and what's only mentioned in passing at the link given there, is that the actual translation of "Moneta" is "warner," as in "she who warns." I find this, in connection to money, amusing. Derived ultimately from a Greek word meaning a distribution or doling out of something, Nemesis was the name of a Greek (and later Roman) goddess of retribution and divine vengeance, who was tasked with either punishing or rewarding people for their evil or benevolent actions. And yet the "rewarding" part gets neglected. There are, of course, more at the link, for those who enjoy etymology. Just, as usual with MF, don't take anything too seriously unless you've double-checked the facts. |
| This is one of those times when I don't remember the original reason I saved something. But whatever; I'll find something to yap about. From NPR: I guess I might have kept it because it's a word origin thing, and I do like knowing origins. But I've known this word's origin for decades, so I don't know. Since the word was coined in the 18th century, "serendipity" has been used to describe all kinds of scientific and technological breakthroughs, including penicillin, the microwave oven and Velcro. I'll take their word for it. For now. And let's not forget that it was the name of the charming 2001 romantic comedy... I'd already forgotten, thanks. "Serendipity" — as the Merriam-Webster dictionary defines it — is "the ability to find valuable or agreeable things not sought for" or "luck that takes the form of such finding." A dictionary, being descriptive and not prescriptive, is the beginning of understanding, not the end. While the word has often been associated with good fortune or happy accidents, its origin suggests that serendipity goes beyond just happenstance. Some researchers argue that serendipity can be acquired through skill and that opportunities for serendipitous moments occur more frequently than we realize. Okay, but wouldn't that give it a different definition? The term was introduced by English politician and writer Horace Walpole in a letter dated Jan. 28, 1754. Walpole is widely credited with writing the first gothic novel, The Castle of Otranto, but he was also the inventor of dozens of words in the English language, including "souvenir" and "nuance,"... Well, I thought "souvenir" was French, but I suppose someone had to port it to English. "Nuance" is definitely from French. Walpole said he drew inspiration from a Persian fairy tale, "The Three Princes of Serendip." (Serendip is a historical name for Sri Lanka.) No idea why I remembered that word origin over lo these many years, when I've forgotten so much else. Over the years, the definition of "serendipity" has broadened slightly. "I think often now people will use it in a bit more of a generic sense to mean a positive thing that happened by chance," Gorrie said. "It's the same basic meaning, but it's less to do with finding and more just to do with happening." Yeah, words have a tendency to do that. Personally, I don't know if I've ever used the word in other writing (besides today). I don't particularly like it. It's too close to "serenity," for one thing; and, for another, I suppose I was never quite sure of its nuance (see what I did there?). For a third thing, I can't say or even think the word without thinking "Dippity Do." However, to Sanda Erdelez, a professor at the School of Library and Information Science at Simmons University, serendipity involves more than just being at the right place at the right time. "What matters is not just chance, but how people recognize this opportunity and then how they act on that opportunity," she said. "There is actually an element of human agency in it." I could argue that the ability to recognize and act on an opportunity is itself a form of luck: either you start out with that character trait, or you find an article like this one, by chance, and decide to work on that aspect of yourself. (Whether such efforts can be successful, I leave up to the reader.) In her research, Erdelez focused on how people come across information important to them either unexpectedly or when they are not actively looking for it. She called them "super-encounterers." 
"These are people who have a high level of curiosity," Erdelez said. "[They] have either a number of hobbies or interest areas so they can see connections between various things." Oh. Yeah. That's why I saved this article: I consider myself a curious person with many areas of interest, and for as long as I can remember, I've tried to see connections between disparate things. It is, I think, a good trait for a writer to have. So, for those on the hunt for serendipitous moments, Erdelez suggests carving out time from a busy schedule to give chance a good chance to happen. Yeah, that borders on mysticism, but I'm not going to quibble about that; serendipity or not, I can't help but feel it's important to do that anyway. |
| While LiveScience isn't where I'd go for trustworthy scientific information, this article had enough of interest for me to share. Is the sun really a dwarf star? Our sun is huge, at least compared to Earth and the other planets. So is it really a dwarf? Well, I don't know. Is the Dead Sea really a sea? Are the Blue Ridge really mountains? Is the East River a river? And I won't get us started on Pluto again. The sun is the biggest object in the solar system; at about 865,000 miles (1.4 million kilometers) across, it's more than 100 times wider than Earth. Using linear measurements to compare celestial bodies can be misleading. Sure, you can try to picture 100 Earths edge-to-edge across the sun's apparent disc, or find one of the many illustrations of such that exist. Or you can look up the volume comparison, which works out to about 1.3 million. But this doesn't mean that 1.3M Earths would fit inside the thing. Think of a crate of oranges, and how there's always space between the spheres. (I'll put rough numbers on that below.) Despite being enormous, our star is often called a "dwarf." So is the sun really a dwarf star? We could call it a "tank" if we wanted to. So is the sun really a tank star? My point here is that, at first glance, this isn't a science question; it's one of categorization or nomenclature. It's like asking "is Homo sapiens really sapiens?" Dwarf stars got their name when Danish astronomer Ejnar Hertzsprung noticed that the reddest stars he observed were either much brighter or much fainter than the sun. He called the brighter ones "giants" and the dimmer ones "dwarfs..." I do like knowing the history of science, and of words. Here, just as a wild guess, Hertzsprung was probably drawing on Norse mythology, which is absolutely crawling with giants and dwarfs. Incidentally, there's some debate over the difference between "dwarfs" and "dwarves." Best I can tell, "dwarves" is generally used for the fantasy race popularized by Tolkien and blatantly stolen by D&D (Tolkien himself stole it from Norse mythology). From what I understand, humans of smaller stature prefer "dwarfs," and it's also the nomenclature for astronomical objects. The sun is currently more similar in size and brightness to smaller, dimmer stars called red dwarfs than to giant stars, so the sun and its brethren also became classified as dwarf stars. Like I said, it's a categorization thing. Also, "currently" is misleading. Yes, based on our best available information, the sun won't stay the same forever; it'll eventually blow up and turn red, or vice-versa. But "eventually" means billions of years from now. Calling the sun yellow is a bit of a misnomer, however, as the sun's visible output is greatest in the green wavelengths, Guliano explained. But the sun emits all visible colors, so "the actual color of sunlight is white," Wong said. One reason some non-scientists can't get into science is the nomenclature, though. For instance, the sun is also described, by astrophysicists at least, as a black body. As in black-body radiation. This confuses the fuck out of people, and they start muttering about "common sense," as if that were something that existed. "The sun is yellow, but less-massive main sequence stars are orange or red, and more massive main sequence stars are blue," Carles Badenes, a professor of physics and astronomy at the University of Pittsburgh, told Live Science. One of the science things that confused me as a kid was that, with stars, red is cooler and blue is hotter. Our bathroom faucet was labeled with blue for cold water and red for hot. Thus began my journey of understanding. 
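Since this entry leans on two bits of arithmetic, here's a minimal back-of-the-envelope sketch of both. To be clear about what's mine and what's the article's: the ~109 radius ratio, the ~64% random-packing fraction, and the round-number stellar temperatures below are my own assumed figures, not anything LiveScience supplied.

```python
# Two quick back-of-the-envelope checks (my arithmetic, not the article's).

# 1) The orange-crate point: the oft-quoted "1.3 million Earths" is a
#    volume ratio, not a count of spheres you could actually pack.
RADIUS_RATIO = 109        # assumed sun/Earth radius ratio, roughly
PACKING_FRACTION = 0.64   # random close packing of equal spheres, ~64%

volume_ratio = RADIUS_RATIO ** 3
packed_earths = volume_ratio * PACKING_FRACTION
print(f"Volume ratio: {volume_ratio:,}")                         # 1,295,029
print(f"Earths that would actually pack: {packed_earths:,.0f}")  # ~828,819

# 2) The color point: Wien's displacement law gives the wavelength where
#    a black body's output peaks, which is why a ~5800 K sun peaks in the
#    green, cooler stars peak toward the red, and hotter ones toward the blue.
WIEN_B = 2.898e-3  # Wien's displacement constant, in meter-kelvins

for name, temp_k in [("red dwarf", 3000), ("the sun", 5772), ("blue star", 10_000)]:
    peak_nm = WIEN_B / temp_k * 1e9  # peak wavelength in nanometers
    print(f"{name}: peak output near {peak_nm:.0f} nm")
```

So the honest orange-crate count is somewhere around 830,000 Earths, not 1.3 million, and the red-is-cooler, blue-is-hotter business falls straight out of where the black-body peak lands: about 966 nm (infrared-ish red) at 3,000 K, about 502 nm (green) for the sun, and about 290 nm (ultraviolet, looks blue-white) at 10,000 K.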
Color is probably less confusing than "dwarf" vs. "giant," though one can take those descriptions as being "smaller than average" or "larger than average" without getting too far off track. And yet, as we've seen, color descriptions can be misleading, as well. Whatever box you put the sun in, and no matter how much I mutter about "the accursed daystar," it's still the sun, and while I avoid its direct rays like a vampire, it would suck if it weren't there. |
| Today's entry is a brief introspection that you can blame on "26 Paychecks": Write a 300 to 500 word piece about a writing project that you have been working on, but aren't pushing through to completion. Explain the project: genre, plot synopsis, expected length (short story, saga, epic, novel, series). Tell us how long ago you started writing it. Tell us why you stopped working on it, or why the work is not advancing. Tell us what people in this group or on Writing.Com could do to help you see your project through to the end. By “have been working on,” I suppose “only in my head” counts. There was a NaNo project I did lo these many years ago. It’s meant to be a science fiction novel set in the next century, where human travel outside the Earth-Moon system is still not done for various technical and political reasons. Without giving away too much, the story is mostly about one pilot who breaks that barrier in a newly designed ship, built in secret and in contravention of international laws, in order to retrieve an ice asteroid that will make her orbital community more self-reliant and less dependent on Earth or Luna (such self-sufficiency is, of course, what those laws were written to prevent). How long ago? I don’t know. It’s gotta be going on 20 years now. This is how I know I’m just not cut out to be a real writer: not because of lack of writing ability or ideas, but an utter inability to see things through. Why did I stop working on it? Well, for starters, every time I looked at it as an editing project, I found something less like work to do. For finishers, the political milieu of the story is: a conservative, fascist, racist, protectionist hybrid corpo-theocracy has taken over most of the US, and, after Civil War II, the US is no longer the US but fractured into, basically, Good States (California, New York, etc.) and Bad States (Texas, Florida, etc.). That’s not actually what they’re called, but that’s the idea. Other countries are aligned with one or the other, but the biggest global power in the novel is a different, rival theocracy to the one in the former US. Since I started writing the story, the US started heading for Civil War II, thanks to a conservative, fascist, racist, protectionist hybrid corpo-theocracy, so the milieu I envisioned has gone from “yeah, right” science fiction to “it took no genius to predict that” science fiction. So that’s why I’m not working on it now, apart from sheer laziness: my inability to do so without cackling at just how spot-on my political, if not technological, projections were would get in the way. It pays to be a pessimist; you can always find something to cackle about. Tell us what people in this group or on Writing.Com could do to help you see your project through to the end. If anyone could “help” me, I’d have completed it already. No, at some point, I simply gave up all hope of ever finishing that, or the three other novels I have in draft form, all promising, none actually finished. (And that's still less than 500 words except for the italicized bits, which were just the assignments.) |