Not for the faint of art. |
Complex Numbers A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is a square root of -1). For example, 3 + 2i is a complex number. The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi. Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary. Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty. |
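That last sentence is the Mandelbrot set in miniature. As an illustrative sketch (not part of the original entry), here's the classic "very simple transformation" written out in Python using its built-in complex type: repeatedly apply z → z² + c, and call c a member of the set if z never escapes. The iteration cap and the escape radius of 2 are the conventional choices.

```python
# Sketch of the iteration behind the Mandelbrot set: z -> z*z + c.
# A point c is in the set if z stays bounded forever; in practice we
# stop after a fixed number of iterations and use |z| > 2 as "escaped".

def escapes(c: complex, max_iter: int = 100) -> bool:
    """Return True if iterating z -> z*z + c from z = 0 escapes (|z| > 2)."""
    z = 0 + 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return True
    return False

print(escapes(0 + 0j))  # False: z stays at 0, so 0 is in the set
print(escapes(1 + 0j))  # True: 0 -> 1 -> 2 -> 5, escapes quickly
```

Coloring each pixel of the complex plane by how fast it escapes is all it takes to produce those famous pictures of enormous intricacy.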
Well, that was a year. I know I said yesterday that I'd do a recap or something, but I can't be arsed. Besides, there's not much about this year that I want to remember. Best thing about it is I got my eyes fixed. Yay me. I'm not one for making New Year's resolutions. I used to, but I do try to learn things in life, like not to keep doing the same thing over and over and expect different results. My mother liked to chant, "If at first you don't succeed, try, try again." Which, now I think of it, is ironic for someone who tried over and over to have a child only to end up adopting one. One who ended up being a gigantic disappointment. I'm also not a fan of social media in general, but there is one app that I use regularly. It's called Untappd, and it's a place to record what beers you've tried and rate them. I call it "social media for drunks." I kind of wish they also did wine, or at least that I could find a different one for that purpose. But my first love is beer, so whatever. So I was looking in my Untappd records a few days ago, and discovered something distressing. In 2020, I checked in 648 times. In 2021, my number of check-ins was 520. I can't go on like this. I mean, a 20% reduction? That's outrageous. So my goal for 2022 is to drink at least 650 beers. They don't have to be all new to me, but I want my numbers back up, dammit. I know, I know, that sounds like a lot, average almost two beers a day, but remember, a good number of those were from tasting flights. As in, a lot of times, I just took a couple of sips of the brew. The only way, therefore, that I can get back to my 2020 numbers is to visit more breweries and get more tasting flights. What with everything going on right now, whether that's possible or not is up in the air. But it's something to reach for, and everyone needs a hobby. Speaking of hobbies... 
One-Sentence Movie Review: The King's Man A movie that reimagines the events leading up to and surrounding WWI and the Russian Revolution, the best thing about this movie was the character of Rasputin; other than that, it's pretty standard political/spy stuff that thinks it's more clever than it actually is, but also features several great fight scenes. Rating: 3/5 And so ends 2021, and good riddance. Everyone have a happy New Year, and, at the risk of sounding clichéd, see you next year. As it has become my habit to post one of these just after midnight, remember tomorrow's entry won't follow that pattern, as I will be in no shape to post then. But I'll get one in on New Year's day. Sometime. |
This one has been hanging around in my queue for a while, and I don't remember the point I was going to make about it. If there ever was one. It doesn't help that a friend of mine smuggled me a bottle of delicious mead all the way from Utah, and I just drank it. Our Worst Idea About “Safety” A concept that took hold in the ’70s has haunted everything from seat belts to masks—and it’s going to keep putting us in danger. The concept many medical experts can’t seem to loosen their grip on is known as “risk compensation.” It’s an idea that comes from the study of road safety and posits that people adjust their behavior in response to perceived risk: the safer you feel, the more risks you’ll take. As far as transportation engineering is concerned, yes, that is a thing. If you’re driving on a precarious cliff-side road without guardrails, you’d probably drive more cautiously. But some proponents of the idea make a stronger claim: that guardrails cause so much reckless driving that any potential safety benefits of guardrails are offset or even reversed. Under this reasoning, a road with guardrails would cause more accidents than a road without guardrails. Guardrails aren’t helpful; they’re counterproductive. When I was in England several years ago, I was walking along a trail next to a stream when I noticed that something seemed off. After a while, I figured out what it was: even though the trail was, for the most part, significantly higher than the stream, there was no railing, no fence, no barrier of any kind. Here in the US, there would have been something like that -- or else they would have put the trail somewhere else, or, because this is America and we do what we want, moved the stream. The idea behind the railing is to keep people from accidentally falling into the stream. Without the railing, though, I think we had a greater sense of the risks involved and were therefore more careful. That's the kind of thing this article is talking about. 
But whenever risk compensation has been subjected to empirical scrutiny, the results are usually ambiguous, or the hypothesis fails spectacularly. Okay. I'm not going to argue with the science, if indeed it's saying what they claim it's saying. But I've thought for a while now that things are just too safe -- at least in the US. There needs to be some risk, some idea of the consequences of, say, falling off a two-meter retaining wall. But in 1975, University of Chicago economist Sam Peltzman elevated what might have remained armchair speculation to a powerful argument against safety regulations. Writing in the Journal of Political Economy, Peltzman hypothesized that 1960s-era federally mandated vehicle regulations such as seat belts were actually making the roads less safe because they encouraged so much reckless and careless driving. In his thinking, any safety advantage of the new regulations was being offset. He analyzed traffic accident data before and after the regulations and found that not only did the regulations fail to decrease fatal accidents, but traffic-related fatalities increased after regulatory action. I vaguely remember some discussion about that at the time, but I was nine or ten years old, so I generally ignored it while riding unsecured on a tractor, swimming in questionable water, or learning to drive a truck without seatbelts. Yes, I learned the basics of driving before I hit puberty -- one dubious advantage of living on a farm. Note how one always "hits" puberty, by the way. One never eases into it. Decades of traffic data now leave little doubt that, overall, safety regulations have indeed reduced traffic-related fatalities. The fun thing about science is that sometimes science gets it wrong. And then it self-corrects, but people are so attached to the first conclusion that they ignore the correction. This is not a problem with science. This is a problem with humans. 
Anyway, like I said, I don't remember the point I originally had in mind, but it came up today so I'm sharing it. I've been thinking about risk management in general for a while now, and there is, as I think the article points out (after all that mead, I can't be sure), a difference between personal and public risk management. Tomorrow is the last day of the year, so it's very likely I'll do something other than riffing off of an article from somewhere, opting instead to do a review of the year or possibly a crystal-ball prediction for 2022. Maybe both. I haven't decided yet and probably won't until like 11:45 tonight. And as the following day is New Year's, don't expect a blog entry at the usual time, because I may be drunk now, but I plan to be completely danchu at that time. |
Short one today. Bit of a relief after yesterday for all of us, I'm sure. Pretty much just what it says in the headline, but I think it's worth looking at, because "recreate" isn't the precise word I'd use. Echo, maybe. Update. Transform. Chris photographed people back in the 1970s, 1980s, and 1990s. And during 2020 and 2021, he tracked them down and recreated the photos he took decades ago – one of them being nearly 50 years old. Chris kindly shared his photos with us, and if you ask me, they’re the most heartwarming thing you’ll see today. Of course, you'll have to go to the link to see the pictures, and I think they're clever as well as artistic. “I entered the digital age with a Canon EOS 1000D, 60D, 70D and finally a 5D MIII as I often photograph in low light and adverse weather. I do not use flash and I am hopeless at Photoshop so do minimal editing with the camera software. There is no substitute for getting it right first time in the camera.” Me, I lost interest in doing photography around the time of the switch to digital (causation, not correlation), but I can still appreciate a good photography project. It's not that I don't appreciate digital photography - like everyone else, I use my mobile phone to record important things like beers and cats - it's just that, since everyone these days is a photographer, I'd have to work hard to stand out. And I'm allergic to work. Anyway, like I said, short, and more photos than text. It's not so much about the technical quality of either the originals or the "reshoots," but more about the interesting ways the photographer recalled the original scenes with the newer photos. |
You're damn right I have something to say about this. For starters, "sci-fi" is a shitty abbreviation (though not as shitty as SyFy). Mostly because the second "i" is pronounced completely differently. What do we talk about when we talk about science fiction? Is it our hope for the future, or our fear of creating the very thing that will destroy us? If the most influential sci-fi books of all time are any indication, the answer is both. I see science fiction (which I will helpfully abbreviate as SF to avoid the shitty shortening) as more than either of those things, but okay. The most influential sci-fi books of all time have shaped not just science fiction and its myriad sub-genres, but horror, fantasy, and manga, as well. Nowhere in this article do I see who gets to define "most influential." So I'm going to assume it's the opinion of one person, or perhaps a committee. Frankenstein by Mary Shelley (1818) Widely regarded as the grandmother of all sci-fi novels, Mary Shelley’s Frankenstein not only laid the foundation for science fiction as an exploration of what happens when Man plays God, but also asked the timeless question: what is it that makes us human? Well, the list starts off well enough. I consider it the first SF book, but I think a lot of people who talk about it miss the point. We had to read it in engineering school and were supposed to come to the conclusion of "don't 'play god.'" Ever the contrarian, I interpreted it as "If you must play god, get it right." Blake, or the Huts of America by Martin R. Delany (1859-1862) Having never read this one, I have no opinion - except that since I've never even heard of it, and I've been consuming SF for my entire life, I have to doubt the "influential" label. Perhaps it should be. Perhaps not. Maybe I'll read it one day and judge for myself -- taking the historical context into account, of course. 
Twenty Thousand Leagues Under the Sea by Jules Verne (1869) Jules Verne’s novels predicted many modern technologies, from solar-powered space flight to Zoom, but Captain Nemo’s Nautilus is a particular stand-out. It's been a very long time since I read this one, and I have no doubt of its influence, but science fiction doesn't "predict." It warns. Or, alternatively, it plants the seeds of ideas that later come to fruition. In other words, it can be a self-fulfilling prophecy, or a blueprint, like when they deliberately made flip-phones to look like the old-style Star Trek communicators. The Strange Case of Dr. Jekyll and Mr. Hyde by Robert Louis Stevenson (1886) Victor Frankenstein might be the first hubristic scientist in literary history, but the trials and tribulations of Robert Louis Stevenson’s well-meaning Dr. Jekyll would be rehashed for years to come as the archetypal “mad scientist.” No argument on this one. The Time Machine by H.G. Wells (1895) Time travel. Human evolution. Post-apocalyptic visions of Earth. Cli-fi (climate fiction). H.G. Wells’s The Time Machine has all of these sci-fi staples and more. Or this one, except that hopefully the author of this article is now rubbing their face because I just smacked them for "Cli-Fi." Of One Blood, or The Hidden Self by Pauline Hopkins (1902-1903) Pauline Hopkins’s Of One Blood follows Reuel, a Harvard-educated mixed-race man who passes for white, who uncovers mind-blowing truths about Africa’s history — and its present — when he stumbles upon a technologically advanced underground civilization beneath an archeological dig site in Ethiopia. Another one I never even heard of. I get the impression that the compiler of this list wanted to be "inclusive." That's great, to be inclusive. But in order to be influential, something has to actually have, you know, influence. 
Perhaps the book is great and was simply overlooked due to *ism, and that should absolutely be rectified in some way, but not by retroactively calling something "influential." A Princess of Mars by Edgar Rice Burroughs (1912) Whether you love ’em or hate ’em, pulpy adventures have had a strong presence in the sci-fi world for more than a century. Tarzan creator Edgar Rice Burroughs also penned this novel, the first of 11 books about his fictional “Uncle Jack,” AKA John Carter of Mars. This one, I read as a kid. More recently, I read it as an adult upon the 100th anniversary of its publication and my gods, the writing was horrible. But I can't deny he had Ideas. I'd only vaguely classify it as science fiction, though. Still, good or bad, it's undeniable that the book was influential. We wouldn't have Star Wars without it -- hell, Lucas lifted parts of it whole-cloth. Nor would we have Superman, at least not as we know the character. As an aside, it's a real shame the movie (John Carter) was a critical and audience flop. I don't think Disney pushed hard enough on the history of the story when marketing it. People thought it was ripping off Star Wars and Superman and the like, when the reality is precisely the opposite. We by Yevgeny Zamyatin (1924) Yevgeny Zamyatin’s We appeared in print in English 30 years before the original Russian version was published. One of the first dystopian novels, We introduces readers to the One State: a unified world government that demands conformity of its citizens, who live and work in glass buildings, and have numbers instead of names. From the description, this definitely falls into the "warning" category, though I'm pretty sure some governments and/or businesses would put it in the "blueprint" category. Again, never even heard of it, though. 
Metropolis by Thea von Harbou (1925) Existential dread over automation might have begun with the Luddites in the early 19th century, but Thea von Harbou dragged their fears into the modern age with her 1925 novel. The movie made from this one is undeniably a classic, and I've never read the book. But since the book inspired the movie, I'm going to go ahead and agree with the "influential" label. Brave New World by Aldous Huxley (1932) Aldous Huxley’s best-known novel takes place in a far-future version of London, one marked by free love, mandatory drug consumption, and the total destruction of the nuclear family. No way can I disagree with this one. The Collected Stories of Arthur C. Clarke by Arthur C. Clarke (1937-1999) It’s hard to imagine any writer working to reconcile religion and science with more intent than Arthur C. Clarke. They cannot, and should not, be reconciled... but that doesn't mean I don't respect Clarke as an author. But if I did have to choose one of his works for this list, it wouldn't be the collection. Nor would it be 2001. Or even the distressing Childhood's End. No, it would be an article he wrote in 1945 proposing geosynchronous satellites. Again, not a prediction or a prophecy, but a blueprint, one that absolutely transformed the world as we knew it. It's also not a "book," so it doesn't really belong on this list, but this is my blog so I note it because I can. The Complete Robot by Isaac Asimov (1939-1977) It’s hard to pick just one of Isaac Asimov’s books to include on this list, but the ubiquitousness of his Three Laws gives The Complete Robot a positronic leg up on its competition. All due respect to Asimov -- he's actually ahead of Clarke on my personal list of favorite SF authors -- but his "robot" stories don't really have much of a bearing on the realities of AI. Shadow Over Mars by Leigh Brackett (1944) Also published as The Nemesis from Terra, this debut novel from the “Queen of the Space Opera” follows Rick Urquhart... 
Shadow over Mars is pulpy, it’s prototypical, and it’s largely overlooked. Again, it may be a freakin' masterpiece, but if it's "overlooked," it's not "influential." Nineteen Eighty-Four by George Orwell (1949) This one is so influential that everyone knows it, even if they haven’t read it. Yeah, we know. Astro Boy by Osamu Tezuka (1952-1968) Astro Boy may not have been the first work of literature to introduce robots with feelings, but it’s one of the most influential by far. I keep meaning to read these, but I never seem to get around to it. Fahrenheit 451 by Ray Bradbury (1953) “It was a pleasure to burn.” It’s one of the best-known first lines in literature, and it comes from a novel that’s on par with Nineteen Eighty-Four, at least as far as being misinterpreted goes. No one is arguing that Bradbury wasn't an excellent writer, or that this wasn't a great book, but influential? Meh. Starship Troopers by Robert A. Heinlein (1959) Space wars and buglike aliens still abound in sci-fi today, thanks in large part to Robert A. Heinlein’s Starship Troopers. And finally we get to my actual favorite SF author -- with one of his worst books. No. Pick Stranger in a Strange Land instead. A Canticle for Leibowitz by Walter M. Miller Jr. (1959) Let’s talk apocalypse and what comes after. You can’t throw a rock without hitting a post-apocalyptic story these days. Okay, no argument here either. A Wrinkle in Time by Madeleine L’Engle (1962) If science fiction has a gateway drug, A Wrinkle in Time is it. Writing for Early Bird Books, Molly Reiniger points out that Madeleine L’Engle’s novel for children “created the space, especially for girls, to be interested in science fiction and fantasy, and to go on to be dedicated readers and writers of the genre.” Or here. The movie sucked ass, though. Dune by Frank Herbert (1965) It’s impossible to talk about the most influential sci-fi books of all time without talking about Dune. No, it really isn't. 
Well, okay, it is, but that doesn't mean I have to actually like political SF as a subgenre. The universe he created has the lasting power to capture the imagination, though, for sure. Babel-17 by Samuel R. Delany (1966) Although it’s not Delany’s best-known work — that honor belongs to 1975’s Dhalgren — Babel-17 was highly influential upon its release for its use of, and play with, language. Delany (this Delany, not the unrelated one way up there near the top) is one of those intellectual SF writers whose books will probably never be made into movies -- but he had an undeniable influence on other SF writers. Do Androids Dream of Electric Sheep? by Philip K. Dick (1968) A century and a half after Mary Shelley cracked open the question of what makes us human, Philip K. Dick dumped that can of worms out onto the page to create what remains one of the most influential sci-fi novels ever written. I could write an entire thesis comparing and contrasting Blade Runner with Frankenstein... but not today. The Left Hand of Darkness by Ursula K. Le Guin (1969) I would hope that, by this point in this list, I’ve put any arguments about the “sudden” politicization of science fiction to rest. If you’re still unconvinced, check out Ursula K. Le Guin’s The Left Hand of Darkness... You know, one of the great things about science fiction isn't the part about imagining possible future technologies, but about challenging social norms, and I agree that no one did that better than Le Guin. Slaughterhouse-Five by Kurt Vonnegut (1969) Another sci-fi novel with broad crossover appeal, Slaughterhouse-Five stands out for its depiction of post-traumatic stress disorder as time travel. As I recall, Vonnegut objected strenuously to being called an SF writer. He's no longer with us, so he can't argue about it any more... but he was an SF writer. So it goes. 
Where Late the Sweet Birds Sang by Kate Wilhelm (1976) Kate Wilhelm’s Where Late the Sweet Birds Sang imagines a post-apocalyptic future in which humanity has attempted to circumvent rampant infertility through a dedicated cloning program. No. The Ultimate Hitchhiker’s Guide to the Galaxy by Douglas Adams (1979-1992) Whether you’re cheeky enough to call a five-book series — six, if you count Eoin Colfer’s And Another Thing… — a “trilogy,” base an entire series around the adventures of one bumbling Englishman who stumbles unawares into being the last living Earth-man, or make the Earth’s destroyers into aliens who write poetry so bad it literally hurts, nobody does sci-fi comedy quite like Douglas Adams. Yes. Daughters of a Coral Dawn by Katherine V. Forrest (1984) It’s a story reminiscent of Laura Lam’s Goldilocks: a group of women set out from Earth to colonize a distant planet, away from patriarchal influence and persecution... An early work of lesbian sci-fi that gave way to a Lambda Literary Award–winning sequel, Daughters of a Coral Dawn is an oft-overlooked classic in the genre. Again, if it's "oft-overlooked," I find it hard to reconcile that with "influential." I've never read it and it may be a shining example of the writer's craft, but that doesn't earn it a spot on this list. Okay, I'm going to go ahead and skip over some now. You can read the full list at the link. Watchmen by Alan Moore and Dave Gibbons (1986-1987) If you’re a fan of Invincible, The Boys, or any other series that depicts superheroes as, well, not so super, you have Alan Moore and Dave Gibbons’s Watchmen to thank. I'm agreeing with this one -- conditionally. I've said before that comic book superheroes aren't strictly SF. That's not a value judgement -- I love comics -- just one of categorization. But I'm allowing this one because it does have significant SF elements. 
And also because it's still my second-favorite graphic novel of all time, after Sandman, which is most definitely not SF. And I'm going to stop there. The list is roughly in chronological order, and anything published after Watchmen simply hasn't had enough time to earn the label "influential." Great? Maybe. Popular? Sometimes. Well-written? Not gonna argue. And some of the later books listed, I've read and greatly enjoyed, and certainly some of them have planted the seeds for other ideas -- but whether those seeds bear fruit, or wither and die like clothing fads, remains to be seen. Meanwhile, I do like lists of this sort, because, yes, I've found some books I now want to read, and it's also nice to see some of my own choices made the cut. |
Earlier this month, in "Too Little, Too Late," I noted how I was Done With Caring About Climate Change. We had a high here of over 70F on Christmas Day (great weather for walking to the movie theater), and all I could think about was how nice it was to be warm (average high for late December around here is low-40s). Anyway, here's another article that tries to sugarcoat our impending doom. I'm posting this because it's another angle on the topic. The rare spots of good news on climate change It looks increasingly clear that we'll at least sidestep the worst-case scenarios. You know, one of the most frustrating things about being right and actually doing something about it is that nothing happens, so people think the original prognostication is overblown. For example, back in 1999, a lot of people were concerned about a "Y2K problem" that, as I recall, had some people screaming doom and gloom over its potential impact on computer systems everywhere and what happens to them when their odometers turn over. A bunch of people took this seriously, tackled the problem, and solved it before the end of that year. Then, as the clock ticked midnight, nothing that was predicted came to pass -- because people had done something about it. As another example, suppose some terrorists have a plan to, I dunno, blow up the Washington Monument. Their plan comes to the attention of the NSA or whatever, and they stop it from happening. And all the general public knows is that the National Phallic Symbol is still standing; they're blissfully unaware that we were about to lose our permanent erection. Or the people who warned that COVID could kill millions by the end of 2020 if nothing was done. Then, the end of 2020 came along, and it "only" killed half a million (or whatever the hell the actual numbers were; it makes no difference to the point I'm making). 
So you get people who scoff at the original prognostications, forgetting the qualifier of "if we do nothing" and the fact that we actually did something. No one notices when nothing happens. And that leads to thinking that nothing would have happened, when the fact is that it absolutely would have happened had we not staged an intervention. So if, as I maintain, we'd done something about climate change when it was still possible to avoid extreme outcomes, then the extreme outcomes wouldn't have happened, opening the door for idiots to go, "See? We did all this shit and nothing happened." Yes, idiots; that's because we did all the shit. Or would have. All of this is to comment on the subhead above: "we'll at least sidestep the worst-case scenarios." Yes, that's because while we haven't done anywhere close to enough, we've at least done something, and the worst-case scenarios were based on us doing nothing. To be sure, the limited progress isn’t nearly enough. We’ve taken far too long to begin making real changes. World events and politics could still slow or reverse the trends. "Could?" Probably it'll make things worse. So what are the signs of progress amid the climate gloom? Well, we're still brewing beer. Today, if you layer in all the climate policies already in place around the world, we’re now on track for 2.7 °C of warming this century as a middle estimate, according to Climate Action Tracker. Look. You don't get to be relieved at 2.7°C just because someone warned that it could have been twice that. That amount of temperature increase is Bad. It would be like warning that Grandma might have liver cancer and bladder cancer, and then saying, "Well, turns out she doesn't have bladder cancer after all, so what a relief, right?" If you assume that nations will meet their emissions pledges under the Paris agreement... HA! 
Given the increasingly strict climate policies and the plummeting costs of solar and wind, we’re about to witness an absolute boom in renewables development. And just to be clear, if this is true, it's undeniably a good thing -- with or without climate change. Meanwhile, there are plenty of signs of technological progress. Researchers and companies are figuring out ways to produce carbon-free steel and cement. Plant-based meat alternatives are getting tastier and more popular faster than anyone expected. This has nothing to do with climate change, but I finally got to try a Beyond Burger at the drafthouse. I ordered one because I was curious about it. It was pretty good. It tasted almost entirely like meat doesn't, so I don't know where people get off thinking it's anything close to tasting like dead cow -- but it was good in its own right. The kicker, though, was that the menu included calorie counts, and the calorie load of the fauxburger was about 10% higher than that of the actual beef burger. So if you're eating it to save the planet? Fine. If you're eating it to be healthier? Jury's out. But like I said, at least it tasted good; most vegan food tastes like penance. And here’s an important and counterintuitive finding: While dangerous, extreme weather events are becoming increasingly common or severe, the world seems to be getting a lot better at keeping people safer from them. The average number of deaths from natural disasters has generally dropped sharply in recent decades. Gee, that's nice, but what about property damage and the resulting costs to individuals and insurance companies who then pass the costs on to individuals? People's loss of houses and businesses, like we just saw with the tornado in the Midwest? Displacement? Poverty? Death isn't the worst possible outcome here. Progressive US politicians now casually repeat the claim that climate change is an “existential threat,” suggesting it will wipe out all of humanity. 
That's probably hyperbole, but consider all the scaremongering articles about lower birth rates and you'll understand that you don't have to wipe out all of humanity for society to collapse; just decimate it. (I use that word with its proper definition of "remove 1/10th of.") While I'm admittedly a fan of the idea of reducing the population, I'd rather it be by reducing the birth rate gradually and sustainably, not losing a big chunk of living, breathing people to floods, famines, or fires. But insisting that the world is at the edge of collapse, when it’s not, is a terrible message for young people and carries some real risks as well. It clearly undermines credibility. It could lead some people to simply lose hope. We should be losing hope. I know mine's completely gone. Anyway, the article makes good points and ones that are maybe not so good, but there's a lot more to it than I quoted, and it's worth a look. |
Today's link is ancient (2007), but short and about cats. A Brief History of House Cats It may be that “nobody owns a cat,” but scientists now say the popular pet has lived with people for 12,000 years It's possible that new science has come out on the subject since then, but who cares when the illustration is a kitty with a ball of yarn? Speaking of science, that picture reminded me of string theory: The Universe is a big ball of string, and God is a cat. Cats were first domesticated in the Near East, and some of the study authors speculate that the process began up to 12,000 years ago. Headline: 12,000 years. Body text: "speculation." And online journalism has only gotten worse since the noughties. While 12,000 years ago might seem a bold estimate—nearly 3,000 before the date of the Cyprus tomb's cat—it actually is a perfectly logical one, since that is precisely when the first agricultural societies began to flourish in the Middle East's Fertile Crescent. Sure, I get the logic behind it, but it's still speculative. Cats, on the other hand, only became useful to people when we began to settle down, till the earth and—crucially—store surplus crops. With grain stores came mice, and when the first wild cats wandered into town, the stage was set for what the Science study authors call "one of the more successful 'biological experiments' ever undertaken." The cats were delighted by the abundance of prey in the storehouses; people were delighted by the pest control. I still don't know why everyone who writes about cats has to defend their "usefulness." How "useful" are the paintings in the Sistine Chapel? How "useful" is a ballet? Cats are living art, and they don't need to justify their existence. Of course, sometimes they're living art that pukes on the couch, but still. Anyway, worth reading the article, but I wanted to get on with my movie review. 
One-Sentence Movie Review: Spider-Man: No Way Home It is not easy to write a review that does this movie justice without spoiling the best surprises; all I can really say is that if you're already a fan you'll love it, and if you're not, you'll have no idea what's going on, but if that's the case you're not going to a comics-adapted movie anyway, and you should probably go see Licorice Pizza instead (which I haven't seen yet but looks artsy as hell). Rating: 5/5 |
As I pointed out in "Ain't Talkin' 'bout Love," sometimes I get a random link that makes sense with whatever day it is. Today is not one of those times. Which came first, the butt or the mouth? New research gives an answer It's a chicken-and-the-egg question, but "which came first?" might not be the right way to think about it Given enough time, I'd probably find a way to relate this article to today being Christmas, probably involving some pun on "ass," as in "And Mary rode Joseph's ass all the way to Bethlehem." But, as I like to say, I can't be arsed. Besides, Christmas is just another day for me, and it's always relevant to talk about assholes. For instance, a couple of months ago, I posted another article about buttholes. It's here: "The End" Today's article is similar, but shorter and covers a few different aspects of the digestive tract. First, though, let me address the "chicken-and-the-egg question" mentioned in the subheading above: I don't know why this is even still in debate; eggs existed long before chickens did. I'm sure you've seen the fossilized dinosaur embryo they found recently. If not, here. Like many people, you may have spent hours pondering the question of which came first, the anus or the mouth? I assure you, it keeps me up at night. Most mammals have one major tract for solids going "in," and one for "out"; one for eating, a mouth, and another for defecation, typically considered the anus. One of these days, maybe I'll try to find out why they start with "most." But I'm really not all that interested in searching for counterexamples. Some scientists believe that, in evolutionary history, the mouth developed first, based on how embryos develop in the womb. Others dispute this theory, arguing that the anus developed first based on embryo development in different animals. Recently, a paper published in Nature reviewed these theories to get to the bottom of it. 
These are legitimate questions because it's not like soft tissue is often preserved in the fossil record. There's a concept in evolutionary history called Haeckel's Biogenetic Law, developed in 1866 by Ernst Haeckel. Haeckel believed that an embryo's developmental stages provide information about the adult stages of that organism's ancestors. If an embryo looks different at different stages of development, those differences correspond to how the ancestors of the species looked at adult stages. This "law" is often expressed as "ontogeny recapitulates phylogeny," but it's not really a law and has been shown to not be the case. Here's the important part, though: Researchers in the Nature study attempted to review all plausible theories by examining the blastopore, tissues, nerves and bands, and other pieces that make up the development of early human embryos. The three major theories they examined, for the evolution of mouths and anuses in bilaterians (which includes humans), were as follows: 1. The blastopore becomes the mouth, and the anus develops secondarily. 2. The blastopore becomes the anus, and the mouth develops secondarily. 3. The blastopore divides into the mouth and the anus through the fusion and separation of a tubular gut. They found that the evidence supports the third scenario, in which the blastopore elongates and closes laterally at both ends, giving rise to the mouth and the anus simultaneously (or as simultaneous as embryonic development can get). And maybe that explains why, with some people, it's hard to tell whether they're talking out of their mouth or out of their ass. Anal openings first appeared around 550 million years ago, around the time of the first worm-like creatures. That sounds like a long time, but consider: Life on Earth is probably about four billion years old, eight times longer than the time since the first asshole appeared. 
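For fellow pedants: that "eight times" is a round-up, as a quick back-of-the-envelope check shows (the figures below are the post's own rough approximations, not precise dates):

```python
# Sanity check on the deep-time ratio above, using the post's own
# round numbers (both are approximations, not precise dates).
age_of_life = 4_000_000_000   # life on Earth: roughly 4 billion years
first_anus = 550_000_000      # first anal openings: ~550 million years ago

ratio = age_of_life / first_anus
print(round(ratio, 1))  # 7.3 -- which rounds up to "about eight times"
```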
Granted, until about 600 million years ago, there was no multicellular life, and single-celled organisms have different means of ingestion, digestion, and excretion. That still leaves about 50 million years between the first multicellular organism and the time when the first shit was taken. And 50 million years is certainly a long time in human terms. The dinosaurs disappeared about 60 million years ago, for instance, and recognizable humans have been around for less than a million years. So here we are, half a billion years later, surrounded by assholes. |
Unless something unexpected happens, I believe today will be the last time I'll use my eye surgery as an excuse to slack off on blog posts. I had the final procedure yesterday morning, and everything went smoothly enough (though not even Valium could keep me from freaking out about what was going on right in front of, and inside, my eye). But between recovering from that and wanting to do my daily French lessons tonight, both my eyes are really tired now, and the right one is, additionally, sore and unfocused. As is normal for this situation, I'm told. Still, everything seems to be going in the right direction. I won't be able to get examined for new glasses until after the new year, but I can see enough to be able to do stuff online. Or at least I will be after another day of recuperation. Most importantly, I already have a ticket for the Spider-Man movie on Saturday, and I should be able to watch it just fine. Meanwhile, like I said, I'm slacking off tonight to keep my eyes from getting any more tired. I expect I'll be back to my normal snarky self for my Christmas morning blog entry. Until then, for those who observe it, I hope you have a great Christmas Eve! |
Warning: today's link is a slideshow. I tried some usual tricks for unsliding it, but I think they're on to my attempts to deprive them of precious clicks, and I can't be arsed to figure out any new method tonight. As such, I normally wouldn't share something like this, but in this case, the combination of "hope" and "future" made me laugh, so here are some excerpts along with my snark. mRNA Vaccines and the end of disease If there’s a silver lining to the global pandemic, it’s the speed of the vaccine roll-out. Spurred by massive amounts of public cash and backed by decades of research, Pfizer’s COVID-19 vaccine was the first “Messenger RNA” medicine to be fully approved by the FDA. It’s unlikely to be the last. ...and half the population won't trust them. Self-driving cars Widespread adoption of self-driving cars is taking longer than was initially predicted, but once the technology is locked down and the public is comfortable with robot-driving, everything will change for the better. Hahahaha. Oh wait, you're serious. HAHAHAHAHAHA. Look, I'm a big proponent of autonomous vehicles. As far as I'm concerned, they can't roll out fast enough (and this is coming from someone who loves to drive). Mainly so I can get home from bars, but also, incidentally, because they will be safer. Yes. They will. Stop laughing; that's my job. Oh, sure, you've heard the stories of test vehicles crashing or hitting people. Those only stand out because it's new technology. Every year in the US, something like 40,000 people die as a result of meat-driven vehicles. Imagine if every single one of them were reported on the way a single AV mishap is. Would AVs reduce fatalities to zero? No. Nothing can make it perfectly safe. But as far as I'm concerned, if the fatality rate (and, in general, injury rate) could be reduced by a standard deviation or two, it's worth adopting. But no, people focus on the "it kills people!" 
part without considering that the current system kills people and we just accept it as the cost of doing business. What we need to focus on is "it will kill fewer people." Also remember that there are a lot of individuals and groups that make money from the current system. Not just car companies, who could probably retool, and not just professional drivers, but municipalities who get their revenue from speeding tickets. That revenue would dry right up like the Mojave after a drizzle. They have a vested interest in promoting the "It's not safe!" propaganda. Point, and the reason I laughed above, is that because of these and a myriad other factors (including loss of the illusion of being in control), "the public" won't ever be comfortable around robot drivers. Half the holdouts will be afraid of handing decisions over to an AI, and the other half will be afraid the AI will take over the world and enslave humanity. So no, I'll have to plan on staggering home from bars until the day I die (probably from being hit by a drunk driver). Just so you know, I had a lot more written in this section, but I deleted it. I have a lot to say about it, which is one sign that I need to shut up about it. So this is me shutting up now. De-extinction Advances in genetic engineering are getting us ever closer to saving endangered animal species from extinction and even bringing extinct species back to life. Living robots Scientists at Tufts University have created living robots. Science fiction and horror movies and books ought to be required reading/viewing for anyone going into science, especially bioscience and robotics. Full body haptic suits The idea of a full body suit that allows users to experience the feeling of a virtual or augmented space along with its sights and sounds is a little creepy. It’s like the background material in a dystopian science fiction movie or something. So is everything else on this list. 
In any case, of all the ones I'm highlighting, this is the one that I could see becoming reality. Because sex. Carbon vodka and garbage sunglasses The Air Company markets a Vodka they say they is made from carbon dioxide, so you can get drunk and eliminate carbon from the atmosphere at the same time. How about some advances in editing technology while we're at it? "they say they is made," indeed. Feh. Regardless, all vodka is made from carbon dioxide. Because it's made from plants. And plants sequester carbon. They turn the carbon into starches and/or sugars, which yeast then turns into magic fluid. I can't be arsed to find out if this company's process is somehow more efficient or environmentally friendlier. Anyway, I don't mean to be negative about scientific advances. Of course I'm excited about real progress. Let's just remember that almost no invention comes without a cost of some sort, sometimes one that only becomes obvious after the fact. The only question is whether the benefits outweigh the costs. So no, none of these make me hopeful about the future. That would require us having done something about climate change 30 years ago, not now that it's too late. |
I don't have a whole lot to add to today's link. It's something that tracks with things I've been saying for a while, though, so here it is. Why it's time to stop pursuing happiness Positive thinking and visualising success can be counterproductive – happily, other strategies for fulfilment are available Essentially, in my view anyway, one doesn't find "happiness" by demanding happiness; it is, rather, the result of doing things that are fulfilling. These things are different for everyone. Over the past 10 years, numerous studies have shown that our obsession with happiness and high personal confidence may be making us less content with our lives, and less effective at reaching our actual goals. Indeed, we may often be happier when we stop focusing on happiness altogether. Ever had trouble finding the right words to express something? Pretty sure we all have. Or remembering the name of that actress from that movie, you know, the one with the... whatever. Then you go think about something else and the answer just eventually bubbles up unbidden. I think of this as kind of like that. Again, though, not much to add for me here. The article does take a dig at Norman Vincent Peale, who I'm pretty sure I've ragged on in here before. And it doesn't come out and say it, but it also negates a lot of new-age type crap like The Secret. So if you're interested in this sort of thing, give it a look. It is, as is often the case here, a sort of ad for a book, but the article has good information anyway. Or not. You know. Whatever makes you happy. |
With over 1800 entries in this blog, I'm pretty sure I've repeated a blog title before, though I haven't gone searching for duplicates, so I don't know when it might have happened. This problem is only exacerbated on important dates, like my birthday or the winter solstice, both of which I tend to bang on about when they happen. And today is the winter solstice, at 10:59 am in my time zone. Hence the twisting of a Metallica lyric in my title today; I'm pretty sure I've never used that one. It has been my personal tradition, for a while now, to stay awake between sunset and sunrise on the longest night of the year. I do this to remind myself that "even the darkest night will end, and the sun will rise." As an aside, I can't be certain that tonight is the longest night. The information I found goes only to the minute, and the number of minutes is the same tonight and tomorrow night (though today will definitely be the shortest daylight of the year). I'm sure I'd know if they broke it down to the second. Either way, though, the difference, if there is one, is measured in seconds so it hardly matters. I'm only pointing this out because I like to get these technical matters straight. This staying up through the longest night thing is something I try to do but don't always have the opportunity. Sometimes, I'm traveling, and that's not conducive to staying up all night. Well, I'm not traveling this year. I vaguely remember that one year, I had a cold, and so I didn't do it, because that would suck. One requires rest when one is ill. But I don't have so much as a sniffle right now (which is weird, because usually winter makes my sinuses act up even if I don't have a cold or you-know-what). This year, my excuse is that I'm between eye surgeries -- as I noted, I had my left eye done on Thursday, and the right eye is scheduled to get poked at this coming Thursday. So no, I'm not going to attempt it tonight. 
My sleep is disrupted enough this week, as I'm pretty sure I mentioned a couple of days ago. But I have to mark the occasion in my mind, anyway, and yesterday, I realized that I already had. The moment of sunrise on Solstice day is like a rebirth or renewal for me -- as I've said before, quoting Doctor Who, it's the moment I know that we're "halfway out of the dark." Well, I had a moment like that on Sunday, instead. See, before I go see a movie, I sit at the theater bar and drink some beer. These past few months, I've had to take a picture of the menu on the wall and order from the image on my phone, because the writing on the wall was just too blurry for me to read -- but I could see fine up close. Being able to read beer menus on walls is an important life skill for me, and not being able to do it makes me feel like I'm too old and decrepit to be there. ("Don't they have printed menus?" "Yes, but they can't be arsed to keep them up to date.") Well, before I went to see West Side Story, I sat down at the bar and... I could read everything on the menu. Even the small print, like ABV and prices. Though only out of my left eye. So yeah. I'm halfway out of the dark. |
Coincidence that this particular article came up today. You'll see why in a minute. Eddie Van Halen endured a 'horrifying racist environment' before becoming a rock legend In an interview with Marc Maron, former bandmate David Lee Roth revealed just how painful the experience was for the late artist, who was of Indonesian and Dutch descent. It would be very weird if articles appropriate to other topics of the day didn't come up every now and then, but this one's almost eerie -- there are about 40 links in my queue, and this is the only one that I recall with the theme of racism, and it just happens to be right after I saw West Side Story, which also deals with racism and is set in the same general era. You know the story. Movie review at the bottom, as usual. Anyway, the article. Music fans around the world are mourning the loss of iconic Van Halen rock star Eddie Van Halen. Yes, this article is from last year. But considering I'm still mourning Leonard Cohen from 2016, it wouldn't surprise me. And while many today honor his legacy as one of the all-time greatest guitarists, fans are also highlighting past interviews describing his encounters with painful racism and discrimination because of his mixed race in his early years. Bit of background for me: I've never particularly liked Van Halen's music. I've never particularly hated it, either. It was just kind of there, on the radio. But I recognize that the band was talented, and that a lot of people were big fans. However, I didn't much care about things like the David Lee Roth / Sammy Hagar spat. I only say this to point out that at no time did Eddie Van Halen's ethnicity matter to me one bit. I mean, from the name, I figured he had some Dutch in there, but my favorite musician also has Dutch ancestry, so I didn't think anything of it one way or another. And I had absolutely no idea that he was part Indonesian. Van Halen... was the son of Dutch and Indonesian immigrants and spent his childhood in the Netherlands. 
It doesn't take much knowledge of history to figure out how that might have happened. His former bandmate David Lee Roth, a fellow rock superstar, once revealed on the podcast "WTF with Marc Maron" just how painful the experience was for the young Van Halen and his brother, drummer Alex Van Halen. I would, however, like to point out that from what little I know about Roth, I'm not sure I'd fully trust anything he said about Van Halen. I suppose they might have reconciled at some point, but again, I have no idea and I can't be arsed to look it up. Still, I don't know why he'd make up something like that, and there seems to be some primary-source confirmation later. He added that the brothers, who were often referred to as "half-breed" in the Netherlands, still met difficult circumstances after immigrating to the U.S. I'm not exactly surprised. "My first friends in America were Black," Eddie told the journalist. "It was actually the white people that were the bullies. They would tear up my homework and papers, make me eat playground sand, all those things, and the Black kids stuck up for me." I really wonder why he said "actually" there. I mean, that's what I'd expect from racism in the US. Who did he expect to be the bullies? Other marginalized people? There's not much more to the article, just a few more details which you can see for yourself at the link, but since I didn't know about it I figured other people might not, either. Whether you're a Van Halen fan or not -- and like I said, their music didn't usually do it for me -- you gotta admit that they were talented, and everyone loves a rags-to-riches story. I just hope that someday, we won't have to hear stories about people being picked on for being different. I doubt it, but I can hope. Even if it did provide the impetus for him to become one of the most famous guitarists in rock. After all, nothing could have annoyed those bullies more. Living well can be the best revenge, or so they say. 
I wonder how many of them ended up being fans? "Oh, him, yeah, that's the weirdo we used to shove into a locker at school. Dude can shred!" One-Sentence Movie Review: West Side Story: Absolutely gorgeous movie of the Oscar-bait variety (not that the Oscars have been worth a damn for many years), but I'm not sure there's anything new to the story, and the whole "horny teens create unnecessary drama and widespread collateral damage" thing has been played out for at least decades, if not longer. Rating: 4/5 |
Kinda worn out right now so no major post today, or link. Just been sleeping irregularly since the surgery. I think it's knowing that I have to go through that horrific five minutes again next week. Tonight is a full moon, but it's cloudy here so I doubt I'll get to see it. The solstice is on Tuesday (which is not something you look at, so cloud cover won't matter), and that usually manages to lift my mood, knowing that days will start getting longer. We'll see. As for personal plans, hopefully tomorrow I'll have a movie to review, and I'm also planning to go see the Spider-Man movie on Christmas Day. My personal Christmas tradition is to drink and watch a movie in the theater. Last year, that ended up being Wonder Woman 1984, and I was nowhere near drunk enough to ignore that movie's terrible plot with its idea that humanity could act as one to achieve a common goal. I mean, read the room (yes, I know it was written before the pandemic). I trust Spider-Man will be better. Hell, it would have to be. Like I said, short entry, but I may not be able to get one in later, so this is what you get. Until tomorrow. |
Things continue to improve, so I thought I'd once again tackle something from my queue. This time, without glasses on. What do near-death experiences mean, and why do they fascinate us? Psychiatrist Bruce Greyson has spent decades talking to people about near-death experiences. His work raises questions about what happens when we die, and how we ought to choose to live. I can take a stab at the answer to the second question in the headline: "Why do they fascinate us?" Because they're the Unknown, and the Unknown fascinates humans. It's how we got to where we are now, for good or ill. The meeting had been organised by Bruce Greyson, now a professor emeritus in psychiatry at the University of Virginia. Part of the reason I saved this link was its connection to my hometown and alma mater. But to my knowledge, I've never met the guy. If I had, I probably would have made a "Bruce Wayne / Dick Grayson" joke, because I'm an asshole. It wouldn't matter to me that the last name is spelled differently. I have to make a conscious effort to remember that "gray" is more common in the US, and "grey" in the UK. But that's irrelevant to names. Here's where I note that the link above is from the Guardian, which uses British spellings and other conventions, and I make no attempt to Americaniz(s)e them here. A month into his psychiatric training, in the 1960s, he had been “confronted by a patient who claimed to have left her body” while unconscious on a hospital bed, and who later provided an accurate description of events that had taken place “in a different room”. This made no sense to him. “I was raised in a scientific household,” he says, over Zoom. “My father was a chemist. Growing up, the physical world was all there was.” I can understand that, as my father was a chemist too (sailor by profession, chemistry degree). I like to think I got my fascination with science from him. 
The thing about science, though, is that its whole purpose is to investigate things that we don't understand. To me, that includes subjects that are traditionally in the realm of mysticism, spirituality, and, well, the Unknown. Not all scientists agree, and so you get people scoffing at research into parapsychology. UVA has, or at least had, a dedicated parapsychology department. Not as famous as Duke's, maybe, and they had a rough time of it right after Ghostbusters came out, but personally I think applying scientific methods to study the things we don't understand is overall a good thing -- so long as the researchers can try to be objective. Greyson presents his research in a new book, After, which is bound by a series of case studies. I do seem to find a lot of articles that are basically book ads. As usual, I don't have a problem with this as long as the article is interesting. Greyson says. Some people recall out-of-body experiences, or report travelling through a long tunnel; others meet entities they think of as God or Allah or long-dead family members; some feel time bend and warp, as though it were elastic. Leaving aside for the moment that these sound a lot like dreams -- which could be the mind's way of trying to make sense of misfiring neurons or whatever -- the "long tunnel" thing has been a mainstay of NDE descriptions since at least the time when Moody's book came out (it's mentioned in the article) back in the 70s. At the time, I was an impressionable child, and I read it, like many people, hoping to make sense of the Unknown of death. One thing about the tunnel imagery always bugged me: some people use it as evidence for reincarnation, saying it's reminiscent of the process of being born, from the point of view of the baby. I'll leave the myriad problems with this "hypothesis" up to the reader. Given that near-death experiences happen with limited warning, they are almost impossible to test. 
I have a vague memory of a movie, maybe back in the 70s or 80s, that dealt with testing this. But it was only a movie, a work of fiction. At the University of Kentucky, the neurologist Kevin Nelson, who, like Greyson, has spent years recording NDEs as a kind of academic side-gig, thinks of the experiences as “a blending of two states of consciousness – wakefulness and REM sleep – during a time of great physical or emotional danger,” and argues that many NDEs are “dream-like”, existing in a neurological “borderland”. Like I said, similar to dreams. In After, Greyson writes: “I take seriously the possibility that NDEs may be brought on by physical changes in the brain,” though he also accepts that the mind might be able to function “independent” of it. There have been reports of people experiencing near-death episodes while their brains are inactive, he says, and “yet that’s when they say they have the most vivid experience of their lives.” This doesn’t make sense to him. Partway through our conversation, he asks: “Are these the final moments of consciousness? Or the beginning moments of the afterlife?” I think we all know I'm a materialist. But if actual evidence is produced, I'm willing to accept it. This isn't evidence. This is speculation. When I ask him what his current logical understanding is, he looks resigned. “It seems most likely to me that the mind is somehow separate to the brain,” he says, “and, if that’s true, maybe it can function when the brain dies.” Then he adds, “But if the mind is not there in the brain, where is it? And what is it?” Despite the subject matter, I feel these are legitimate questions to ask. But they need to be addressed with scientific rigor, not biased to what someone believes to be true on faith alone. To Greyson, the impact near-death experiences have on people’s lives has been his most surprising discovery. 
Whatever they are, whatever the mechanisms behind them, I have little doubt that NDEs are real, any more than I doubt that some people have seen strange phenomena in the sky. But as with making the immediate cognitive leap to "must be space aliens" in the latter case, assuming that NDEs are proof of some afterlife is a massive leap to conclusions. Having a mind separate from the brain makes no sense with our current understanding of science. Which doesn't mean our current understanding is completely correct, and that's why research is needed. You never know when a paradigm shift can occur. Well, I've banged on long enough. Most people have already drawn their own conclusions, and that's fine. Me? I'm not making a conclusion, but do consider this: if the mind is separate from the brain, then why do physical changes in the brain (injury, trauma, chemical imbalance, Alzheimer's, etc.) get reflected in the individual's personality and thought processes? There's no doubt, though, that there is much we don't yet understand. And I'd hope that there will always be things we don't understand. |
Not going to do my usual thing today. After eye surgery yesterday, it's difficult for me to focus on a screen. My left eye with its new lens protests against any attempt to see up close, and the right one (scheduled for next week) no longer works properly, even with my glasses. Trying to use both at the same time just gives me a headache. I'm sure things will get better (optimism from me -- who'd'a thunk it), but for tonight, I'm taking it easy. It did occur to me today that after the next surgery, my prescription glasses will no longer be of any use, and I'm not sure yet how I'm supposed to see up close to do things like reading, writing, or language lessons. Perhaps a cheap-o pair of reading glasses from the pharmacy can tide me over until my regular eye doc can measure me for new ones. But that will have to wait for at least another day or two so my eye can begin to heal. With any luck, though, I'll be able to see a movie in the theater this weekend. I can already tell that the distance vision, past about six meters or so, has greatly improved, and it's not like I need binocular vision for movie-watching. I spent a few minutes tonight just looking at the moon (with my left eye), which I haven't seen clearly in months. As for the surgery itself, the less said about that, the better, especially since I have to go through it once more. And I'm trying to be more focused (pun intended) on the results rather than the process. Well, I don't see too many red squiggly underscores here, so hopefully there isn't a huge number of typos. I don't expect to be online much today, though. With little else to do, I might empty the refrigerator of beer. At least then I'll have a good excuse for seeing double. |
As I prepare to face one of my phobias later today, it's only fitting that this story about spiders came up this time. Why so many of us are casual spider-murderers It's officially arachnicide season in the Northern Hemisphere. Millions of spiders have appeared in our homes – and they'd better be on their guard. Why do we kill them so casually? Plus, the new Spider-Man movie hits theaters today. I don't know when I'll get a chance to see it, but I do hope it's soon. Now, I kind of like spiders. What I mean by that is: when I encounter one outside, I appreciate its elegance and its insectivorous qualities. When I encounter one inside, I try to turn it into an outside spider -- but sometimes, like when I saw a telltale red hourglass marking on one of them, well... things can get ugly. First I needed to fetch something from the shed – the domain of monstrous spiders the size of baby mice, who lurk in corners with just their furry, gangling legs protruding. On the other hand, I am not a fan of enormous spiders jumping out at me. That doesn't give me time to do a threat assessment. Eventually my journey ended on the patio – and here there was a shock. Lying on the paving, legs splayed out wildly, as though he had fallen from a great height – was the pallid corpse of Stripy. One thing I have never, ever done, though, is give a spider a name. I once had a praying mantis that liked to hang out on my ceiling, and his name was Batman. Mantises (mantes?) share the spiders' penchant for eating annoying bugs without all the extra legs, eyes, and fuzz. Why do many of us kill spiders so casually, swatting out their lives with our god-like power, almost like it's a reflex? Because we're bigger (normally) and prefer to remain unbitten? Spider massacres like these are even more jarring when you consider that spiders and humans are not so different. 
Though our evolutionary paths diverged at least 530 million years ago, we share many of the same organs and body parts – such as kneecaps – and similar brain chemicals, from dopamine to adrenaline. No one has ever studied spider emotions directly, but it's easy to imagine that they might be more relatable than you would think. Except I don't like to eat bugs. They may also have their own unique kind of intelligence, in which they're able to use their webs to help them think. I've seen articles about this before. The webs act as kind of an external brain. In that, they're not so dissimilar to us, as we use the Web as an external brain. According to Jeffrey Lockwood, there are a number of reasons we struggle to empathise with spiders – in fact, these unlucky creatures possess a constellation of separate features that chance has combined into a package we find uniquely repulsive. All those bloody damn eyes, for starters. Human infants as young as just five months old tend to be more threatened by images of spiders than those of other organisms, suggesting that our aversion to them is partly innate, perhaps having evolved to prevent us from casually picking up ones that are venomous. I do have a vague memory from early childhood of being freaked out by a spider that was webbing down from the ceiling toward my crib. Though whether that's a memory of a real event or of a nightmare, I'm not certain. Many of the most chilling stories about spiders have an element of surprise – such as the time a friend donned an old Halloween costume that had been stashed away in the loft for years, and someone said "wow, I love the spider detail on your neck! It's so realistic…". Much screaming ensued, because this was most certainly not part of the look, but a real spider who had silently slunk down from their hat. This is comedy gold -- if it doesn't happen to me, that is. 
Apart from their menacing fangs and scampering legs, spiders face another challenge in the looks department, at least from a human perspective: they don't look like human babies. To me, this is a point in favor of spiders. Anyway, a good article to read, and there aren't even that many pictures of spiders to freak people out. The ones that are there are kind of cute. I'm sure most people know intellectually that spiders are our friends, but reason and rationality are poor weapons against the true enemy, which is fear. I'll have to keep that in mind when I freak out as they drill into my eyeball. |
One thing I find interesting is the origin of words. A subset of that is the origin of names. One can look at that map and find some notable patterns -- how, for instance, Smith seems to be the most popular name in so many Anglophone countries. Or that, although there are several different popular surnames in different Spanish-speaking countries, most of them seem to be ancestral. Sorry, you'll have to go to the link to know what I'm talking about. I'm not going through the process to be able to embed the maps here. The most intriguing thing to me isn't the names themselves, though. I mean, I'm sure you already knew about the abundance of Smith here in the US, and that Nguyen is predominant in Vietnam. No, the bit I like is the categories: Surnames generally fall into 1 of 5 categories: toponymic (location-based), occupational, personal descriptor, patronymic (from the name of a father or ancestor), and names that signify patronage. What surprised me is that the only places where occupational names reign supreme are Europe and some of its former colonies (the US, NZ, etc.). They're entirely missing from the map in Asia, South America, and Africa. Which is not to say that those countries don't have occupational names, but they're not represented among the countries' most numerous names. This might be saying something profound about what different cultures choose to honor by taking names -- occupation for English speakers, ancestry for most others -- or maybe it doesn't. But it does track with how we here in the US tend to define ourselves by our occupations, while other cultures identify more as a part of a family or clan. But I may be reading too much into this; I don't know. Names can get popular through fertility, too -- someone who has lots of descendants who share their name. There may be other things to be gleaned from the maps; feel free to let me know if you notice anything. |
Here's one from The New Yorker on why what you're doing is Bad and You Should Feel Bad. Economics 101: There is no such thing as a free lunch. Listening to music on the Internet feels clean, efficient, environmentally virtuous. Alternatively, it feels like a lot of work and, while it has the advantage of breadth, sometimes finding a particular song can be a pain. And just try doing this while driving across country, through places with no service. And I never thought of it as particularly environmentally virtuous. Instead of accumulating heaps of vinyl or plastic, we unpocket our sleek devices and pluck tunes from the ether. Kind of like radio, only without inane DJ chatter. The ostensibly frictionless nature of online listening has other hidden or overlooked costs. Not many things don't have such costs. Exploitative regimes of labor enable the production of smartphone and computer components. And everything else. Conditions at Foxconn factories in China have long been notorious; recent reports suggest that the brutally abused Uighur minority has been pressed into the production of Apple devices. And yet, no one cares enough to do anything about it. Child laborers are involved in the mining of cobalt, which is used in iPhone batteries. Gotta find some kind of use for kids. Spotify, the dominant streaming service, needs huge quantities of energy to power its servers. Huge compared to what? A house? A city? A Bitcoin mine? No less problematic are the streaming services’ own exploitative practices, including their notoriously stingy royalty payments to working musicians. And yet, again, we put up with it. But Devine isn’t interested in inducing guilt; he simply wants us to become more aware of the materiality of music. 
He writes, “There is a highly intoxicating form of mystification at work in the ideology of musical culture more generally.” As a result, music is “seen as a special pursuit that somehow transcends the conditions of its production.” Devine’s critical history of recording formats throws a necessary wrench into that mythology of musical purity. Yes, this guy's promoting his book. Fine. Anyway, I'm pretty sure Neil Peart said it better:
One likes to believe in the freedom of music
But glittering prizes and endless compromises
Shatter the illusion of integrity, yeah
And that song ("The Spirit of Radio") came out in 1980, when the height of music technology was the cassette player, you either bought LPs or tapes because CDs weren't a thing yet, and the internet was barely a spark in ARPA's eyes. Point being, it's not a new observation. The linked article goes into some of said history, and for once for this publication, it's succinct and worth reading. “Musically, we may need to question our expectations of infinite access and infinite storage,” he writes. Our demand that all of musical history should be available at the touch of a finger has become gluttonous. It may seem a harmless form of consumer desire, but it leaves real scars on the face of the Earth. Some sacrifices are going to be necessary, sure -- purposeful or otherwise. But no. You can't have the Apocalypse without a soundtrack. It just isn't done. And since I quoted the song above, here it is -- also to demonstrate that yeah, there is indeed "magic at your fingers," even today. For now.
Off on your way, hit the open road
There is magic at your fingers
For the Spirit ever lingers
Undemanding contact in your happy solitude |
For the rest of the month, I may be late posting or, if things don't work out the way I'm expecting, I might miss a day or two. Just a reminder, if you've gotten used to my schedule. I'll get back to commenting on offsite links tomorrow. Today, I just want to thank everyone who's been reading, and those who commented yesterday. ⭐Princette♥PengthuluWrites - 837-day streak here. They say I'm in the top 1%. Your position is even more rarefied. Well done! Congratulations, and I hope you keep it going. 🌕 HuntersMoon , 10 years beats my marriage record by a lot. Awesome! Write_Mikey_Write! , lots of people wait until the last possible minute to enter a contest. Even people who run contests and challenges. As for the bonus achievement, one advantage of not having a car is I don't run out to fast food places on a whim. Yes, there's Uber Eats, but so far I've resisted. Sumojo , the I Write activity is a great motivator, I've found. I know I participated in it once. Can't remember if I finished. I seem to remember getting hung up in September when I just couldn't take another "birthday" prompt. And congrats on the anniversary - you even have 🌕 HuntersMoon beat there! QPdoll is Grateful , that's awesome about the degrees! And obviously I'm a fan of 30DBC, but I don't think I ever participated in all six in one year - that's certainly an achievement. Lilli 🧿 ☕ , you know, that's a major accomplishment these days. Leger~ , thanks, always good to hear from you. And waking up every morning is something I've never managed to accomplish, so yay! Prosperous Snow celebrating , I tried to do Contest Challenge but I got freaked out at all the backlog. Putting it off is only making it worse. Awesome that you're able to do it. Congrats! NaNoNette , I've heard that communication is key to a successful marriage. I wouldn't know firsthand. Kåre เลียม Enga , I appreciate all the encouragement you provide, not just to me, but to all bloggers. That's an achievement too.
No, I don't feel obligated to write every day. If I did, I'd probably stop because it will have become "work." As long as I want to do it, I'll continue. And since, like I said, I'm feeling generous, I'll send everyone who commented yesterday a Merit Badge (to be sent out later today). Also, I know I said CR not guaranteed, but if I've sent you one in the past two weeks (I know that applies to a few people here), I will hold off on sending yours until closer to the end of the month. Again, thanks for reading, and congratulations on all your achievements! |
Little bit different format today; I'm not up for dealing with a link right now. Yesterday, an old friend stopped by, with her husband, and we all took advantage of the 60s (Freedom Units) outdoor temperatures in December by hanging out on the deck and getting completely plastered. Good to do that sometimes. Instead, I wanted to note that today is December 12, which means that if I make a blog entry tomorrow, the 13th, then I will have written an entry in here every day for two calendar years. I didn't really set out to make this accomplishment, but once I realized I was doing it, I figured, why not keep it up? With eye surgery coming up fast (eek), I'm not sure if I'll be able to extend the streak. But I'm going to try. Just don't expect the regular schedule for the next couple of weeks. I may post late, if at all. Still, the real satisfaction, for me, comes not from slavishly sticking to the daily writing exercise, but from the interaction I have with readers. I appreciate all the comments. So to mark two full years of blogging every day, let's have a... Merit Badge Mini-Contest! As this is, after all, my personal soapbox, I get to use it to celebrate some of my few accomplishments in life, such as the aforementioned two years of blogging every day. But I also want to hear about yours. So in the comments on this entry, tell us about one of your recent accomplishments -- or one that you hope to achieve soon; either way. Notes: I will award at least one Merit Badge if there are any comments today. I may award more than one at my discretion because I'm feeling generous. Since I just sent a metric asston of Merit Badges to people who completed October's NaNo Prep, CR is not guaranteed. Merit Badge(s) will go out tomorrow, December 13, so you'll need to comment here by midnight tonight, Sunday 12/12/21, to be considered. But if you happen to see this entry later, your comment is still welcome. |