Not for the faint of art. |
Complex Numbers A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number. The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi. Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary. Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty. |
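Since the description above is the closest thing this blog has to a mission statement, here's a toy illustration (mine, purely illustrative, not part of the original description) of just how simple those transformations are. Repeatedly squaring a complex number and adding a constant c is enough to sketch the Mandelbrot set: c belongs to the set if the iteration never runs off to infinity.

```python
# Toy Mandelbrot sketch: iterate z -> z*z + c and check whether the
# orbit stays bounded. Python's built-in complex type handles the
# i^2 = -1 bookkeeping for us.

def escape_step(c, max_iter=100):
    """Return the step at which the orbit escapes, or None if it stays bounded."""
    z = 0 + 0j
    for n in range(max_iter):
        z = z * z + c          # the "very simple transformation"
        if abs(z) > 2:         # once |z| > 2, escape to infinity is guaranteed
            return n
    return None                # still bounded: c is (probably) in the set

# Crude text rendering of a patch of the complex plane; '#' marks members.
for im in range(10, -11, -2):
    print("".join(
        "#" if escape_step(complex(re / 20, im / 10)) is None else " "
        for re in range(-40, 21)))
```

Even at ASCII resolution, the characteristic cardioid-and-bulbs silhouette shows up; the "enormous intricacy" lives on the boundary, which only reveals itself as you zoom in. |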
I'm not entirely sure why I saved this particular Live Science article, and it's only been like three weeks. Yeah... it's not really a paradox. Flanked with fjords and inlets, Alaska is the state with the most coastline in the United States. Easy to accomplish when you're a giant peninsula with many craggy islands offshore. But what is the length of its oceanic coast? It depends on whom you ask. According to the Congressional Research Service, the number is 6,640 miles (10,690 kilometers). But if you consult the National Oceanic and Atmospheric Administration (NOAA), the coastal edges of the state total 33,904 miles (54,563 km). Yep, that's a big difference, all right. Not to brag, but I knew the answer. Still, I want to say that, obviously, the former number is incorrect, because Congress always lies. The coastline paradox occurs because coasts are not straight lines, and this makes them difficult, or impossible, to measure definitively. From an aircraft, you can see that the coast has many features, including bays, inlets, rocks and islands. And the closer you look, the more nooks and crannies you'll find. Oh, now I remember why I saved it. It's related to fractals like the Julia set or Mandelbrot set, which involve complex numbers, and, well... you know. As a result, the length of a coastline depends on the size of the ruler you use. This isn't just a coastline issue. Lots of survey boundaries follow the thread of a river (or, in the case of VA/MD, the low-tide shoreline of the Potomac), which has similar characteristics. But if you use a smaller ruler, you'll capture more complexity, resulting in a longer measurement. Hence, a paradox. Okay, I suppose, for some definitions of paradox. Regardless, that's what it's called, so I'll run with it. According to work published in 1961, English mathematician Lewis Fry Richardson noted how different countries had different lengths for the same shared border because of differences in scales of measurement. In 1967, mathematician Benoit Mandelbrot expanded on Richardson's work, writing a classic Science paper on the length of Britain's coastline. This later led him to discover and conceptualize the shape of fractals, a curve that has increased complexity the more you zoom in. This is also related to why no one can agree whether China or the US is the third-largest country by area: it depends how you measure some of the boundaries, including the coasts. Also, vertical differences get thrown in; this is the same fractal problem, only in two dimensions (surface), not one (boundary line). The concept of "dimensions" also gets modified when you're dealing with fractals; you can get fractional dimensions. Which, it should come as no surprise, gave fractals their name. I object, however, to the idea of "increased complexity the more you zoom in." I'd argue that you get the same complexity, just at different scales. I guess the sentence can be read as saying that zooming in reveals greater complexity. The article also features some nice Mandelbrot set zoom animations, which I always find fascinating. This can hold true for coastlines. You could technically measure a coastline down to the grain of sand or atomic level or smaller, which would mean a coastline's length could be close to infinity, Sammler said. Another nitpick: no such thing as "close to infinity." In the real world, as opposed to a purely mathematical construct, there's a minimum length (it's very, very small). That minimum length implies an upper bound to how long a fractal boundary can be.
It can be very, very long... but that's still not infinity. Coastlines are also shifting entities. Tides, coastal erosion and sea level rise all contribute to the fluctuating state of coastlines. So maps from the 1900s, or even satellite imagery from a few years ago, may not resemble what coastlines really are today. And if you want to get really technical, it changes from moment to moment, as portions are eroded and others built up. Not to mention general changes in sea level. As I've noted before, you can't step into the same river once. So how much coastline does Alaska, the United States, or our entire planet, have? We may never know the accurate number. It's a paradox, and like many things in nature, escapes our ability to define it. Which shouldn't mean we throw up our hands and give up. I like to think of it as a metaphor for life itself: always approaching an answer, never quite getting there. But learning more and more along the way. |
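Speaking of rulers, the measurement effect is easy to demonstrate on an idealized coastline. Here's a minimal sketch (mine, not the article's) using the Koch curve, where every 3x-finer ruler resolves each straight segment into 4 segments a third as long:

```python
# Richardson's coastline effect on the Koch curve: each time the ruler
# shrinks by a factor of 3, the measured length grows by 4/3, forever.

ruler, length = 1.0, 1.0
for _ in range(8):
    print(f"ruler = {ruler:.5f}   measured length = {length:.3f}")
    ruler /= 3          # look three times closer...
    length *= 4 / 3     # ...and find a third more "coastline"

# The log-log slope of ruler vs. length gives the fractal dimension:
# log(4)/log(3) is about 1.26 -- a fractional dimension, hence "fractal."
```

And that's where the "not infinity" point lands: in the real world, the loop stops once the ruler hits whatever minimum physical length applies, leaving the measured length enormous but finite. |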
Today, we have an article from Fast Company about something of great worldwide import. How Comic Sans became the Crocs of fonts After 30 years of abuse, Comic Sans is ready for its redemption. Objection! Comic Sans never deserved the opprobrium heaped upon it by self-proclaimed font snobs, whereas Crocs deserve every criticism and then more on top of that. Comic Sans has turned 30, and it’s done being your punch line. I have long said that it should be the Official Sarcasm Font of the Internet. For three whole decades, Comic Sans cowered at your reproaches and winced at your jokes. That's obviously poetic license, but I have occasionally wondered how the anticomicsans vitriol might have affected the poor, innocent font creator. It barely flinched when Google’s practical joke made sure that searching for “Helvetica” would render all results in Comic Sans. Okay, I hadn't heard of that, but that's legitimately hilarious. But Comic Sans has just hit the big 3-0—and it’s ready for its second act. Great, make the rest of us feel even older than we are. Don’t take it from us. Take it from various studies that have been done on the subject of “turning 30.” And from the three experts who contributed to this story and said that turning 30 marks a period of introspection and change. And this is where it goes from whimsical poetic license to stretching a metaphor beyond its elastic limit. But, whatever, I'm entertained. For starters, Comic Sans wants you to know it wasn’t ever meant to be taken seriously. Vincent Connare, who was then a typographic engineer working at Microsoft, created the typeface in 1994. Well, that answers part of my musing above, musing that never reached the level of "why don't I just google it?" Most of us were blissfully offline back then, so Microsoft had devised a program called Microsoft Bob to teach people how to use computers. I'm pretty sure 1994 was the year I first obtained an internet provider. But I'd been using computers for at least 15 years before that, both for work and recreation. Well, "recreation" included learning how to code, which, let me tell you, was a lot harder to do before the internet. We had to buy books. It also included playing early video games, which is why I'm not good at coding to this day. Comic Sans was inspired by the comic books Connare had lying around in his office. Hence the name. I always figured it was from comic books, not funny-ha-ha comics. Comic Sans appeared on restaurant menus, funeral announcements, official government letters, bumper stickers, business signs—so many places, in fact, that there’s a popular subreddit on the topic. Even the Vatican used it in 2013 to commemorate Pope Benedict XVI. Like I said, I don't hate the font, but if I saw it everywhere, I'd learn to. Like with Crocs. The boom continued well into the early aughts... Noughties, dammit! ...at which point Microsoft released a licensed version of Comic Sans in 2010. And the hate might also have spilled over from a generalized dislike of Microsoft, who have definitely made some... questionable... design decisions. Like Clippy. Over the past few years, the font has become a favorite among people with dyslexia because “the letters can appear less crowded” and “it uses few repeated shapes, creating distinct letters.” I'd heard that the font was originally designed to be dyslexic-friendly. Perhaps that was fake news. Designed that way or not, it seems to be so. Perhaps some people got less vocal about their distaste for CS for fear of being labeled ableist. 
Like the Eiffel Tower (which drew a slew of protests while it was still under construction), or that Mariah Carey Christmas album, the typeface has become nothing short of iconic. (Though we can agree to disagree on the Mariah front.) Oh, yeah, damn right we disagree. I have no hate for Mariah Carey and acknowledge that she's talented, but that album is about 70% of the reason I don't go out in public in December. Today, Comic Sans is the Crocs of fonts. First we hated it, then we loved to hate it, now we kinda, maybe love it because we’re experiencing it through a different lens. "We" my ass. My issues with the writing, and the author's questionable taste in music and footwear, aside, I'm glad to see Comic Sans finally getting some... well, not love, exactly, but less hate. I still say it should become the Sarcasm Font, but no one listens to me. |
Sundays are when I usually dip into the past, and today is a Sunday, so here's a blog entry I did way, way back in 2007: "Useless" The internet has changed somewhat since 2007, so the links aren't what they used to be. They're still active URLs, surprisingly enough, but I wouldn't go in there without a condom. First was the virtual bubble wrap. As I noted, the bubble wrap makes the popping noise as you mouse over it, but then regenerates. Infinite bubble wrap, right? Wrong! It was Flash-powered, and Flash is dead. The Web lost a lot of awesome stuff that day, including the utterly useless bubble wrap site. I don't, in fact, know why the URL is even still there. Just to taunt us, maybe. Or perhaps for some more nefarious purpose, hence the condom. The second URL at the site was http://www.papertoilet.com/ That one is, to my vast surprise, still operating. I guess it's not Flash. So many great things on the internet lost to time, and the one that remains is a virtual roll of toilet paper that you can unroll to reveal... absolutely nothing? Fitting. And I'd recommend staying away from the third link, even if you're behind seven proxies. It was, as far as I can tell, just a stapler that you could make go kachunk. Well, not a stapler. A picture of one. Or whatever. And it seems to be gone, anyway. Useless to us, maybe not to malware producers. And so the unweirdening of the internet proceeds. We have lost so much that we'll never, ever get back, and what do we have to show for it? Influenzas and trolls. Useless. |
Something interesting I found at BBC Future: It's also a book ad: In our new book, we explore the many internal and external factors that influence and manipulate the way we think – from genetics to digital technology and advertising. And it appears that language can have a fascinating effect on the way we think about time and space. And I'll do my usual pointing out that this isn't settled science. But a growing number of experts believe language can influence how we think just as our thoughts and culture can shape how language develops. Honestly, I'd kind of figured that was the case. I think in words, myself. Usually English ones. But I've asked other people, and some say they don't. I imagine kids these days think in emoji. For example, we know that people remember things they pay more attention to. And different languages force us to pay attention to an array of different things, be it gender, movement or colour. I'm still just focused on getting French pronunciation close to correct. Linguists, neuroscientists, psychologists and others have spent decades trying to uncover the ways in which language influences our thoughts, often focusing on abstract concepts such as space and time which are open to interpretation. I hope they took into account my anecdotal evidence about not everyone thinking in words. There follows a good bit of examples of the science people did to investigate this. I could probably nitpick some of the methods, but I don't feel like it today. This is the part that most interested me, as someone trying to be bilingual: Things start to get really strange, however, when looking at what happens in the minds of people who speak more than one language fluently. "With bilinguals, you are literally looking at two different languages in the same mind," explains Panos Athanasopoulos, a linguist at Lancaster University in the UK. "This means that you can establish a causal role of language on cognition, if you find that the same individual changes their behaviour when the language context changes." While it is true that I complained about how bad this year's Beaujolais Nouveau was (it was really bad), and joked about how that means I'm turning French, that doesn't mean I'm turning French. When you learn a new language at my advanced age, things go a lot slower. We aren't necessarily prisoners to thinking a certain way, though. Intriguingly, Casasanto has shown that you can quickly reverse people's mental time representation by training them to read mirror-reversed text, which goes in the opposite direction to what they're used to. This refers to a part I didn't quote, which stated that people generally conceive of time moving in the same direction as their language writing. In contrast to language learning, I taught myself how to read mirror-reversed text (and upside-down, and upside-down mirror-reversed) when I was a kid, and I still conceive of time's arrow as moving left to right. Maybe that's a case of just because I can do it doesn't mean I've internalized the connection to conception of time. There's a whole lot more to the article, but I'll skip to near the end: As this body of research grows, it is becoming increasingly clear that language is influencing how we think about the world around us and our passage through it. Again, though, I wouldn't take it as settled science but more of a working hypothesis. 
And while being multilingual won't necessarily make you a genius, we all can gain a fresh perspective and a more flexible understanding of the world by learning a new language. A reasonable assertion, I think. One reason to actually learn a language rather than relying entirely on smartphone apps for translation. |
Several years ago, I did an entry on the Stoned Ape Hypothesis, and expressed great skepticism: "Expanded Consciousness" I promptly forgot all about it, until this Big Think article pinged my radar. A new spin on the “Stoned Ape Hypothesis” The controversial theory about magic mushrooms and human evolution gets a much-needed update. One might wonder (fairly) why I even give this attention if I dismiss it so readily. After all, I'm not here repeatedly sharing flat-Earth links, right? Well, it's different because, for one thing, it's not completely falsified the way flat-Earth doctrine is; for another, talking about it might help normalize the use of psychedelics. In the realm of human evolution, few theories have captured the public imagination quite like the “Stoned Ape Hypothesis.” It also might increase understanding of evolution in general, even if this particular hypothesis turns out to be a truckload of manure. Originally proposed by ethnobotanist Terence McKenna in his 1992 book Food of the Gods, this provocative idea has recently resurged in popular discourse, thanks in large part to its discussion on Joe Rogan’s widely followed podcast. Well, now I'm even less inclined to believe it. However, matching the enthusiasm for the theory is the skepticism that opposes it, and critics have branded it “pseudoscience,” successfully demoting it from a legitimate scientific hypothesis to fringe status. I'm not going that far. But I still haven't seen any real evidence. Since most academics approve of this characterization, I’ve long felt motivated to “steelman” McKenna’s theory, which I think will prove to be more right than wrong. Okay, fair enough. Your opinion, man. The article goes on to do just that, and it's easy enough to follow. McKenna’s highly amusing and admittedly speculative answer to the puzzle was that psychedelic substances helped spark the rapid evolution in human cognition, consciousness, and culture. According to his story, our early hominid ancestors would have inevitably encountered psychedelic fungi while foraging for food in locations like the African savanna. The psilocybin in these mushrooms would have provided adaptive advantages to those who consumed them, including enhanced cognition, creativity, and elevated states of consciousness. Okay, so, what happened to the other species who consumed them? Because I would find it even harder to believe that it was only our ancestors who ate magic mushrooms. Did it have an effect on the antelope? The zebra? The... whatever the hell other foraging species roamed Africa at the same time? Or maybe it only works on primates? Well, plenty of primates lived in places with shrooms, and we don't see them doing rocket science or writing novels. For evolutionary theorists, this sounded too close to Lamarckism, the idea that acquired traits could be passed down to offspring, a theory that fell out of fashion with the emergence of Darwin’s theory of natural selection. Which is exactly what I said in my earlier entry, but of course, I'm not a biologist. Still, I understand there's some leeway for heritability of certain acquired traits. This is called epigenetics (or so I'm told). McKenna, though, had more than a few answers to these criticisms, which makes the theory difficult to judge as flat-out right or wrong, since some of his explanations could be more or less correct. You don't get to just push a theory out there and expect us to judge it as "right" or "wrong." Like, I hereby theorize that there's life on Pluto. 
You can't prove me wrong, so it's a legitimate theory, right? No. No, it is not. One promising alternative explanation, which you could say represents the “status quo alternative” to McKenna’s theory, is that social and cultural factors played a unique role, such that increasing social complexity created a natural selection pressure that strongly favored intelligence over physical attributes. Looking around, I find the idea that, in humans, intelligence can be favored over physical attributes almost as unlikely as the magic mushroom hypothesis. ...why would psychedelics then mostly disappear from our diet, rather than being a regular part of our contemporary lives, the way a drug like caffeine is? Now, that right there is cultural bias. Other cultures incorporate, or used to incorporate before missionaries came along, psychedelics into their sacred rituals. (In the author's defense, he does acknowledge some of these instances later in the article.) I realize that my statement there works in favor of Stoned Ape. That's okay. Skepticism doesn't mean outright rejection. According to the New Stoned Ape Theory, psychedelics likely served as a “chemical catalyst” for a special kind of “cognitive-cultural phase transition,” characterized by a shift in perspective at the individual level that propagates through culture (“goes viral”) and restructures the worldview of society, bringing about a transition at the societal level. Which, looking back at my earlier entry, I acknowledged as a possibility that I could accept (given evidence). I quote Younger Me: "And maybe - just maybe - I can see psychedelics being an engine for social evolution." The article is fairly long, as BT articles tend to be. I'm not going to critique each claim, though there's plenty to critique. I'll just point out one other quote, one that claims to sum things up: To summarize the theory in a sentence: Psychedelics, as “worldview shifters,” can create a cognitive phase transition whose spread creates a social phase transition — a shift in culture. It’s that simple! I'm a big fan of Occam's Razor, but when it comes to evolution and human cognition, I reflexively distrust anything that's "that simple." Which, again, doesn't mean it's wrong. The article kind of undercuts itself at the end by proposing something even weirder and more speculative, but I'm not going to weigh in on that except to say that it sounds like the ramblings of someone who just ate mushrooms. Which is fine. There's plenty of actual evidence that hallucinogens can, under certain circumstances, be beneficial. I just think the whole thing needs more science. |
I've written about the Trolley Problem before. At length. I even wrote a very short story featuring it: "The Trolley Problem" [18+]. This is very likely to be the last time I feature an article about it; this blog is steadily approaching its end. The article itself is a few years old, but I'm not aware of any progress in Trolleyproblemology since it came out in 2018. It's also from Slate, so no surprise they got it wrong. Does the Trolley Problem Have a Problem? What if your answer to an absurd hypothetical question had no bearing on how you behaved in real life? It was never meant to have bearing on how you behaved in real life. Consider this article from Philosophy Now (limited free articles), which concludes: The answer, in my view, is that there is no definitive solution. Like most philosophical problems, the Trolley Problem is not designed to have a solution. It is, rather, intended to provoke thought, and create an intellectual discourse in which the difficulty of resolving moral dilemmas is appreciated, and our limitations as moral agents are recognized... I do not believe there will ever be a perfect solution to the Trolley Problem, nor a consensus as to the best possible solution. All we can hope for – and should hope for, as I have argued – is to utilize the tools of philosophy as well as the scientific method to continue this discourse. The Trolley Problem does not have to be resolved; it merely needs to be contemplated, and to be the topic of our conversations from time to time. That is, of course, the opinion of one philosopher, but it rings true to me. Philosophers, however, aren't known for having a sense of humor. They're all Very Serious Thinkers. We have a different name for philosophers with senses of humor: we call them "comedians." And comedians have been having a field day with various permutations of the Trolley Problem, many of which are legitimately hilarious. Which is, as the Very Serious Philosopher notes, the point—even if he'd be appalled at the humor elements. So, back to the Slate article: I ask because the trolley-problem thought experiment described above—and its standard culminating question, Would it be morally permissible for you to hit the switch?—has in recent years become a mainstay of research in a subfield of psychology. And there's the "problem," right there: It's not psychology. It's philosophy. In November 2016, though, Dries Bostyn, a graduate student in social psychology at the University of Ghent, ran what may have been the first-ever real-life version of a trolley-problem study in the lab. In place of railroad tracks and human victims, he used an electroshock machine and a colony of mice—and the question was no longer hypothetical: Would students press a button to zap a living, breathing mouse, so as to spare five other living, breathing mice from feeling pain? Right, because our moral calculus involving mice is obviously exactly the same as it would be with fellow humans. I'm not saying people don't feel sorry for mice. I always feel sorry for the ones that Edgar Allan Purr leaves on the doorstep. But I'd be horrified if he brought us a dead human for a present, instead. Not that he could, but, you know, as long as we're talking hypothetically. It’s a discomfiting result, and one that seems—at least at first—to throw a boulder into the path of this research. Scientists have been using a set of cheap-and-easy mental probes (Would you hit the railroad switch?) to capture moral judgment.
But if the answers to those questions don’t connect to real behavior, then where, exactly, have these trolley problems taken us? I suppose the answer to that depends on whether you ask a philosopher, a psychologist, a lawyer, or a comedian. It also seemed a little off that trolley problems were often posed in funny, entertaining ways, while real-life moral dilemmas are unfunny as a rule. Except that it's the comedian's job to make things funny when they're not. There's a lot more at the link, of course, but I've banged on long enough. In short, I disagree with the basic premise that it's a psychology issue instead of a philosophy one. Still, as I've noted in here before, it's not completely hypothetical: there are real-life situations where exercising agency can make a difference, one way or the other. So it's worth thinking about. And it's worth making jokes about, because comedians can often do philosophy better than philosophers. |
I'll admit it: I'm only linking this NPR article because the headline is glorious. I mean, I come up with good puns sometimes, sure. But that one nailed it. (Nail? Scratch?... No? Okay.) We've all had bug bites, or dry scalp, or a sunburn that causes itch. But what if you felt itchy all the time — and there was no relief? Well, then you know you've truly sinned in the eyes of the Lord. Seriously, though, that happened to me. Not, like, permanent, the way these unforgivable souls suffer, but a side-effect of some medication I had to take for a few weeks. Journalist Annie Lowrey suffers from primary biliary cholangitis (PBC), a degenerative liver disease in which the body mistakenly attacks cells lining the bile ducts, causing them to inflame. The result is a severe itch that doesn't respond to antihistamines or steroids. Yeah, that's gotta suck. PBC is impacts approximately 80,000 people in the U.S., the majority of whom are women. At its worst, Lowrey says, the itch caused her to dig holes in her skin and scalp. She's even fantasized about having limbs amputated to escape the itch. "PBC is impacts?" NPR, you used to be better than this. Also, while I totally get the limb-amputation fantasy, seems to me that would just make the itching worse, what with phantom limb syndrome and all. And finally, 80 thousand? I know that's not much on a percentage basis, but my whole city doesn't have 80,000 people in it. There's a bit more at the link, mostly going into some detail about the phenomenon of itch. Some of it is accepted science, and some of it isn't (the evo-psych bits, I mean). I managed to read the whole thing without experiencing the sympathetic itch the article talks about, which is more than I can say for when there's an article about yawning. But mostly, I linked it for the absolutely stellar headline. |
This one from Mental Floss reminds us of our mortality. The article is a couple of years old, but I'm sure the subjects remain deceased. I did a similar entry a few years ago: "Inevitable". And I revisited that one just a few weeks ago: "Revisited: "Inevitable"" Authors are used to killing their darlings and that sometimes means offing characters in creatively unconventional ways. Non-authors, too, die on a predictably regular basis. But when they die in a weird way, it's usually not irony. 1. Sherwood Anderson As far as I know, I've never read anything by this author, so now the manner of his death is the only thing I know about him. An autopsy revealed the culprit to be a 3-inch-long wooden toothpick in an olive that the author had swallowed while enjoying a martini. Having imbibed more than my share of martinis (which, incidentally, have to contain gin and vermouth and an olive garnish, otherwise it's not a martini), I can say with some certainty that one would need to imbibe a whole hell of a lot of them to not notice a 3-inch toothpick entering one's digestive system. So many, in fact, that it's not the toothpick that I'd be worried about. 2. Aeschylus Oh, we're going old-school now. This one was in the earlier entry, though at the time I apparently accepted the story at face value. According to writer Valerius Maximus, Aeschylus was hit by a falling tortoise while sitting outside Sicily’s city walls: “An eagle carrying a tortoise was above him. Deceived by the gleam of his hairless skull, it dashed the tortoise against it, as though it were a stone, in order to feed on the flesh of the broken animal.” The truth may be stranger than fiction, but that particular story has all the hallmarks of being, well... fiction. 3. Gustav Kobbé Music critic and author Gustav Kobbé loved to sail, but the hobby led to his death. Whereas with most music critics, pissing off musicians is what does them in. 4. Margaret Wise Brown In 1952, Goodnight Moon (1947) author Margaret Wise Brown was in France on a publicity tour when she developed appendicitis and was taken to the hospital for emergency surgery. As the article notes, it wasn't the appendicitis that killed her. Not directly, anyway. 5. Tennessee Williams Dr. Annette J. Saddik, Distinguished Professor of Theatre and Literature at the City University of New York, explained in 2010 that the false cause of death was due to John Uecker, Williams’s assistant, telling “the Medical Examiner, ‘Look, people are going to think it’s suicide or AIDS or something bizarre and we don't know what happened.’ So the Medical Examiner said, ‘OK, he choked on a bottle cap.’” A rare case of fiction being stranger than truth. In any case, I think the last article of this sort that I featured also brought up Williams, but apparently, the "bottle cap" story was taken as true, there. 6. Pietro Aretino Pietro Aretino was an Italian satirist, playwright, and poet, who is credited with inventing written pornography. Say what, now? It's a near-certainty that as soon as writing was invented, someone used it to make porn. In any case, as with the ancient Greek above (and some of the others here), the story of his death is questionable. 7. Sir Thomas Urquhart The Scottish writer and translator died in 1660, supposedly because the news that Charles II... had retaken the throne caused him to burst into a fit of joyful, but deadly, giggles. Look, if you're going to claim these are true death stories, at least make them true death stories. 8.
Edgar Allan Poe I want to say "we all know this one," but there's always someone learning something for the first time. In this case, you'll just have to check out the link. Or read the earlier entry. Or look up his Wiki page. The circumstances surrounding his death were mysterious, but well-reported. 9. Sir Fulke Greville Dying on the toilet isn’t the most dignified way to go, and though Elizabethan poet and dramatist Sir Fulke Greville managed to avoid that fate, the toilet certainly played a part in his death. Eh, that's a stretch. 10. Mark Twain Another really famous one, and as far as I know, well-documented. I remember as a kid being told about this, and thinking, "What a shame that he never got to see the comet," though he might have seen it before he died; I don't know. In any case, I couldn't see it when it swung through in 1986, so I know I'll never see it. 11. Molière Legend often has it that Molière died onstage, but that’s not actually true. Yeah, and I wonder about some of these others. Some of Molière’s lines as Argan were eerily prophetic of his imminent demise. Look, if you talk about death, and then die, it's not all that prophetic. Everyone dies, and lots of people talk about it beforehand. 12. Dan Andersson On September 16, 1920, Swedish author and poet Dan Andersson checked in to the Hotel Hellman in Stockholm, settling in Room 11. And this is why we don't have "Hell" in the name of hotels today. 13. Francis Bacon The only ironic death I would have accepted here would be "fried on a stove." The actual story, if true, does involve food, though. And science. There's a deep human curiosity to know how someone died. I can understand that; most of us want to know if it's something we should avoid. Failing to avoid a particular method of demise, however, should definitely warrant internet fame. |
Today's link is a few years old and from Popular Science, so one shouldn't take it as the final word on the subject. Still an interesting read. I always figured it was more fright than stress. All the melanocytes just up and run away. Not really, but I find it amusing. In 1902, the British Medical Journal reported an unusual case of rapid hair whitening. A 22-year-old woman “witnessed a tragedy of a woman’s throat being cut and the victim falling dead at her feet,” according to a physician at the London Temperance Hospital. The next day, the right side of her pubic hair turned white, while the left half remained black. One wonders if that was the inspiration for Cruella de Vil. And it’s not just random violence that sends people’s pigment running—college exams, children, and work pressure appear to change our coloring, too. Ah. No wonder I haven't gone gray yet. But for millennia, scholars have been relying mostly on anecdotal proof and intuition to rationalize this phenomenon. In the absence of clear evidence, many scientists did not believe stress could turn hair snow white, instead arguing the change must be triggered by chemicals or strange immune system behavior. Right, because stress couldn't possibly change body chemicals or the immune system. A recent paper, published Wednesday in the journal Nature, may put some of these arguments to rest. Again, it's not so "recent" anymore. And it's one paper. It's not definitive. That doesn't mean it's wrong. In the study, stem cell and regenerative biologists from the United States and Brazil reported that stress can indeed cause hair to lose its pigment—and they identified a cellular pathway by which it can occur. That last bit is, in my view, the important part. To study this vexing relationship, the researchers created an elaborate animal model, which basically involved trying to turn black-haired rats white with lab-made stressors. I get why they need to do animal testing for this sort of thing, and stressing out rats isn't going to raise a lot of public ire. Still... poor rats. Perhaps unsurprisingly, nociception-induced stress, which the scientists stimulated by injecting the rats with resiniferatoxin, an analogue of the chili pepper compound capsaicin, worked best and fastest. As a sample of one, I can definitively say that overdosing on capsaicin doesn't necessarily turn one's hair white. Having identified the optimal way to make a rat panic... Ah, there it is: the phrase at which point I hit "save bookmark." ...the team began searching for corresponding changes in the physiological pathways that give rise to coat color. There follows more description of the experiment. Hair still holds many secrets. We don’t know why hair loss plays out differently on someone’s scalp than on their face or, for that matter, their back. See, I don't mind getting gray hair. Baldness concerns me way more, though I have to admit it would make a few things easier. In recent years, there’s been a surge in research and development for anti-balding solutions—and many of them show promise. Terskikh, for his part, is working on regenerating hair from scratch using things like pluripotent stem cells. If it works, we’ll have an unlimited supply of hair—presumably in every shade. Great, just what we need: anime hair, no hair dye necessary. Won't happen, though. The Manic Panic industry lobby is just too powerful. |
My dip into the past this morning takes me all the way back to January of 2022, when I was cold and disease raged across the land. So, not much different from today: "Plus Ça Change..." In it, I responded to a then-current prompt from "JAFBG" [XGC], which is still active; other bloggers should totally check out its prompts. The prompt was: "Imagine the pandemic never happened. How do you think your life would be different now, if at all?" Most of us have forgotten all about it, apparently. ...okay, look, I'd like to say that the only thing that would be different is that I wouldn't have canceled my gym membership. But let's be real: I still would have canceled my gym membership. Eventually. Because I'm lazy. I still haven't re-enrolled. Laziness is my only excuse there. There was a long stretch, before everything went to hell, when I was going there every day. I can't say I miss it, only that I feel like I could benefit from going back. Before the pandemic, I was going out only rarely, ordering delivery groceries, and buying most of my shit from Amazon. Now, I go out only rarely, order delivery groceries, and buy most of my shit from Amazon. I just placed a grocery delivery order today, and I'm expecting a package from Amazon later. I thought about weaning myself off that site, but... nah. I will continue to wear this mask in public, regardless of what diseases may or may not be circulating. Yeah, I need to stop making promises. I probably would have traveled more, but even that would have led to me being, right now, sitting here at home wishing winter would end already because fuck, I've had it with winter. You know how stores keep stocking holiday items earlier and earlier every year? Well, "I've had it with winter" is now my default state, year-round. Hm. One thing that might be different. I might not be so completely, utterly done with approximately half of the population of the US. Even if that had been the case, I'd be completely and utterly done now. I might still have some tiny thread of hope for the future. Let's see... nope. All gone. |
Here's something sure to brighten everyone's day, if for no other reason than this Time article is shorter than some of the others that have popped up lately. Well, I'm not sure that's true for everyone. You never know what'll make someone happy, sometimes. Like, I've known people who were only happy after they made everyone around them miserable. So it's not a stretch to believe that obsessing over... whatever... might be pleasing to someone. Given all the obsessing people tend to do, I'd almost guarantee it. Happiness is a worthy pursuit. Is it, though? Is it really? Or are other things worthy of pursuit, and happiness is a byproduct of that? But fixating too much on achieving it often leads to bad feelings when you fall short—which ultimately makes you less happy. Especially if you expect not to fall short. That’s the finding of a new study published in the journal Emotion. For once, I actually clicked on the link provided to the study. I didn't read it too closely, but I'd take it with the same level of skepticism as any other psych study, even if it does speak to my biases. In the study, people who said they were worried about achieving and maintaining happiness tended to have more depressive symptoms, worse well-being, and less life satisfaction than those who simply held happiness as a goal—and didn’t fret about whether they were meeting it. Which is, honestly, exactly the result I'd expect. Which doesn't mean it's right. Or wrong. But to me it's like a study that investigated whether cumulus clouds look soft and fluffy. What’s the secret, then? Take the pressure off and stop taking your own happiness temperature so often, Zerwas says. Okay. Embrace all of your feelings—both happy and sad ones—since all emotions can be informative, providing us insights into our psychic makeup. Okay. And practice cognitive-behavioral strategies such as mindfulness—being present in one’s emotions and aware of what those feelings are—to truly tune in. Aaaaand you lost me. I feel like this "pursuit of happiness" thing is mostly an American phenomenon, thanks to the famous words in the Declaration of Independence. There's some discussion over what exactly was meant by "happiness" therein, but I'm pretty sure the connotation of the word has changed in the last 250 years. It probably didn't mean a feeling of unbridled joy, but something closer to security. And in my experience, living without fear is happiness enough. |
Mostly, I'm just pleased that there exists a magazine called Far Out. Really, I didn't think this was much of a mystery. Led Zeppelin’s fourth studio album might be most famous for the all-out hard rock of opening tracks ‘Black Dog’ and ‘Rock and Roll’, as well as the slow-burning firepower of ‘When the Levee Breaks’, and, of course, the ethereal majesty of ‘Stairway to Heaven’. They apparently used up all their creative energy on the songs, because they never bothered to give the album a name. I guess, technically, the album is untitled, but most of us called it Zeppelin 4. But ‘The Battle of Evermore’ is an underrated favourite for many fans, as well as the band’s singer Robert Plant. It's absolutely near the top of my list. It sounds as though ‘Evermore’ could suit a fictionalised or dramatised narrative of Agincourt or Bosworth Field, but that’s not what’s going on here. Zeppelin’s lyricist has taken his inspiration from the realm of pure fantasy, as compelling as that realm appears in this context. That fantasy drew from reality, so it's not unreasonable to think that a real-life historical battle may have been the ultimate inspiration. Plant didn’t just suck this legendary battle out of his thumb, however. I'm just including that line here because I wish I'd come up with that particular turn of cliché. He draws heavily on Tolkien’s high-fantasy novel The Lord of the Rings, including specific details in certain lines of the lyrics that indicate exactly which battle he’s referring to. And somehow, I realized that even before I read Tolkien. Perhaps someone told me of the connection. Lord of the Rings fans pinpoint Plant’s descriptions as a lyrical account of the Battle of the Pelennor Fields, which serves as the climactic battle in the War of the Ring during the third volume of the novel. And that's a level of detail I'd never even considered exploring. Plant's lyrics on Zeppelin songs were always good examples of "This feels like it could be a metaphor, but for what?" It's probably more true for Stairway, which, incidentally, was the only song on the album with printed lyrics included, so I could only assume that there was something important there that he was trying to convey. I finally figured out what it was, decades later. But Evermore was always pretty obvious: he was channeling Tolkien. The legendary island of Avalon, the place where the sword Excalibur was forged in Arthurian legend, which also appears in the works of Geoffrey Chaucer, gets a mention, too. The frontman seems to have played around with mythical imagery without much care for its relation to any pre-existing narrative structure. Okay, maybe not that obvious. Still, I thought the reference to Avalon in the song was mostly because it scanned well. Normally, when I do an entry about a song, I'll include the video. But there's little need, as there's one at the linked article. |
My random number generator likes to have a laugh at my expense sometimes; today, it came up with another Quanta piece. This one, however, doesn't involve scary numbers. ‘Metaphysical Experiments’ Probe Our Hidden Assumptions About Reality Experiments that test physics and philosophy as “a single whole” may be our only route to surefire knowledge about the universe. It does, however, discuss science and philosophy, as the headline warns. Metaphysics is the branch of philosophy that deals in the deep scaffolding of the world: the nature of space, time, causation and existence, the foundations of reality itself. Once something can be experimentally verified, though, it ceases to be metaphysics and becomes... physics. Or fact. Plenty of things we (for varying definitions of "we") know (for varying definitions of "know") now (for varying definitions of "now") were once in the realm of metaphysics. It’s generally considered untestable, since metaphysical assumptions underlie all our efforts to conduct tests and interpret results. And if it's untestable, it's not science. At least, not yet; the untestable can become testable. I've said before that philosophy guides science, while science informs philosophy. I stand by that assertion. What the article (and it's a fairly long one) focuses on is what to do when we don't even know that we're using philosophy to guide science. Intuitions we have about the way the world works rarely conflict with our everyday experience. Well, yeah, because intuition is largely based on everyday experience. But at the uncharted edges of experience — at high speeds and tiny scales — those intuitions cease to serve us, making it impossible for us to do science without confronting our philosophical assumptions head-on. Suddenly we find ourselves in a place where science and philosophy can no longer be neatly distinguished. A place, according to the physicist Eric Cavalcanti, called “experimental metaphysics.” My first reaction is to say that the phrase is misleading. However, sufficient explanation can suffice to un-mislead it. If this article isn't sufficient explanation, I don't know what would be. A book, maybe? For once, I don't see a book ad hidden in the article. You should definitely go to the link just to see the pictures of Cavalcanti, though. I think he and Orlando Bloom were separated at birth. In experimental metaphysics, the tools of science can be used to test our philosophical worldviews, which in turn can be used to better understand science. Okay, I don't have a problem with that. THE DIVIDING LINE between science and philosophy has never been clear. Often, it’s drawn along testability. Yeah, that's where I, an absolute amateur, generally draw it, at least in my head. I will note, though, that the system we call "science" developed out of what used to be called "natural philosophy." However, astronomy developed out of astrology, and chemistry out of alchemy, so I'm not sure that's a win for natural philosophy. As it turns out, though, the testability distinction doesn’t hold. Philosophers have long known that it’s impossible to prove a hypothesis. Which is why scientists test for whether something can be falsified, not proven. In 1906, though, the French physicist Pierre Duhem showed that falsifying a single hypothesis is impossible. Every piece of science is bound up in a tangled mesh of assumptions, he argued. Dang ol' French, messing things up for everyone. Take, for instance, the geometry of space-time. 
Immanuel Kant, the 18th-century philosopher, declared that the properties of space and time are not empirical questions. He thought not only that the geometry of space was necessarily Euclidean, meaning that a triangle’s interior angles add up to 180 degrees, but that this fact had to be “the basis of any future metaphysics.” And this is why we don't take the words of philosophers as absolute truth. Especially those of Kant. (I will resist the obvious pun.) The unit of empirical significance is a combination of science and philosophy. The thinker who saw this most clearly was the 20th-century Swiss mathematician Ferdinand Gonseth. You will note that he wasn't French. He probably spoke the language, though. For Gonseth, science and metaphysics are always in conversation with one another, with metaphysics providing the foundations on which science operates, science providing evidence that forces metaphysics to revise those foundations, and the two together adapting and changing like a living, breathing organism. As he said in a symposium he attended in Einstein’s honor, “Science and philosophy form a single whole.” I don't think I'd ever heard of this dude before, but that sounds awfully familiar. Like it's very close to my own ideas, reached independently, as noted above. Which means I have to be careful, lest my own confirmation bias kick in. The article goes on, like I said, for a while, with examples of the intersection of philosophy and science. Then, what to me is a pretty important example: Michele Besso, Einstein’s best friend and sounding board, was the only person Einstein credited with helping him come up with the theory of relativity. But Besso helped less with the physics than with the philosophy. Einstein had always been a realist, believing in a reality behind the scenes, independent of our observations, but Besso introduced him to the philosophical writings of Ernst Mach, who argued that a theory should only refer to measurable quantities. Mach, by way of Besso, encouraged Einstein to give up his metaphysical notions of absolute space, time and motion. The result was the special theory of relativity. Also not French. But the real point here is: in order to come up with his famous, and now well-supported, theory of relativity, Einstein had to shed some basic assumptions that he, maybe, didn't even realize he had. Once he did that, he made history and changed both science and philosophy completely. I won't quote more from the article, but I read the whole thing. I'll just say this: Einstein was famous for, among other things, popularizing the concept of the thought experiment, and the article leans heavily on that particular technique for figuring things out. And thought experiments, when based on known science, are basically philosophy. I rest my case. |
From Quanta, and as usual for that source, may not be suitable for numerophobes: How Base 3 Computing Beats Binary Long explored but infrequently embraced, base 3 computing may yet find a home in cybersecurity. There's been talk of quantum computers being used for that, as well. Difference is, from what I understand, quantum computing is still very much in its infancy. Three, as Schoolhouse Rock! told children of the 1970s, is a magic number. Thanks. Now I have an earworm. The number 3 also suggests a different way of counting. Our familiar base 10 decimal system uses the 10 digits from zero to 9. Binary, our digital lingua franca, represents numbers using only the two digits zero and 1. One of my favorite nerdy jokes: "There are 10 kinds of people in this world: Those who understand binary, and those who don't." The hallmark feature of ternary notation is that it’s ruthlessly efficient. With two binary bits, you can represent four numbers. Two “trits” — each with three different states — allow you to represent nine different numbers. "Trits?" No. "Bits" is short for binary digits. If you have trinary digits, using the same convention would yield something way funnier. It turns out that ternary is the most economical of all possible integer bases for representing big numbers. The article explains why, or we can just take their word for it. But: For large numbers, base 3 has a lower radix economy than any other integer base. (Surprisingly, if you allow a base to be any real number, and not just an integer, then the most efficient computational base is the irrational number e.) That doesn't surprise me at all, except the surprise of being allowed to have a non-integer base number. The irrational number e is 2.71828..., which is closer to 3 than to any other integer. Despite its natural advantages, base 3 computing never took off, even though many mathematicians marveled at its efficiency. In 1840, an English printer, inventor, banker and self-taught mathematician named Thomas Fowler invented a ternary computing machine to calculate weighted values of taxes and interest. What pleases mathematicians doesn't usually please the rest of us. Why didn’t ternary computing catch on? ... Binary was easier to implement. Thus showing once again that "easier to implement" doesn't always translate to "most efficient to run." How does this affect us? Well, it doesn't, much. Numbers have to be shifted to decimal notation either way, so we can do things like taxes and budgets. Mostly, I'm just disappointed with "trits." |
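The efficiency claim is easy to check, by the way. Here's a minimal sketch (mine; the article includes no code) of "radix economy," the usual cost measure: the base multiplied by the number of digits needed to write a number N in that base. Lower is cheaper:

```python
# Radix economy: cost of representing N in base b, taken as
# b * (number of digits of N in base b). Smaller is more economical.
import math

N = 123456789  # a reasonably big number

for b in [2, math.e, 3, 4, 10]:
    digits = math.floor(math.log(N, b)) + 1   # digit count of N in base b
    print(f"base {b:7.4f}: {digits:2d} digits, economy = {b * digits:6.2f}")
```

For that N, binary costs 54, ternary 51, and base e about 51.7; asymptotically the cost scales as b / ln(b), which is minimized exactly at b = e, and 3 just happens to be the nearest integer. Hence the article's not-actually-surprising surprise. |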
I'll admit it. The only reason I'm sharing this Atlas Obscura article is that I'm actually 12 years old. Penistone Paramount Cinema Penistone, England This century-old, single-screen cinema still puts a 1937 Compton organ to good use. Always good to read about an organ being put to good use. Along Shrewsbury Road in Penistone, England, an unassuming single-screen theater keeps cinematic nostalgia alive. There's something to be said for having a single screen rather than dividing your attention among a dozen. Inside, the star of the show is the Compton organ. The instrument was originally built by the John Compton Organ Co. in 1937. It was first installed in Birmingham’s Paramount Theatre, where it entertained audiences for over 30 years. In 1988, it was bought by a private cinema owner and installed in the Regal Cinema at Oswestry in Shropshire. I suppose I'm disappointed that it didn't come from Scunthorpe. After four years at Oswestry, it was brought to the Penistone Paramount Cinema by organist Kevin Grunill. The instrument was restored in 2000 and again in 2013. Only 13 years between restorations, for such an old organ? Yeah, that's all I have today. I'm spent. |
Big Think offers up another way for people to annoy pedants, and vice-versa. It’s important that weight and mass are not the same Here on Earth, we commonly use terms like weight (in pounds) and mass (in kilograms) as though they’re interchangeable. They’re not. By "we," I suppose they mostly mean "Americans." But even people in countries that use SI units will use kg as if it's a unit of weight. This is fine, as far as I'm concerned, because damn near 100% of us live on the Earth's surface. And while there are gravity variations due to latitude, elevation, or anomaly (such as the one from "Anomalies" a couple of weeks ago), they mostly don't make much difference unless you're a scientist who requires greater precision. Still, as a pedant, I think it's important to know the difference. Also, as a pedant, I think it's important to keep it to oneself if one does not wish to be uninvited from social gatherings for being a pedant. Since this is my blog, I'm making an exception here. Conventionally, here on the surface of the Earth, we can convert between the two using only a minimal amount of effort: 1 kilogram is 2.205 pounds, and vice versa, 2.205 pounds converts to 1 kilogram. Going back-and-forth requires only multiplication or division, which seems easy enough. "Easy?" Have you met people? Some of them break out in hives if you ask them to add 10% to something. Incidentally, I don't usually bother with the 0.005. For most practical purposes, 2.2 is close enough and much easier to do head-math with. A kilogram is an example of mass, not of weight, while a pound is an example of a weight, not of a mass. It’s only here on the surface of the Earth, where we’re at rest relative to the rotating Earth, that these two concepts can rightfully be used interchangeably. I mean, as long as we're being pedantic, it's a really stupendously big universe and it wouldn't surprise me if there are other planets with the same acceleration due to gravity as that of the Earth. But for nearly 100% of the volume of the universe, weight is irrelevant and only mass matters (pun absolutely intended). If you release an object from rest and allow it to fall, it falls straight down, accelerating at a constant rate. It gains speed directly proportional to the amount of time that it’s been falling, and the distance it covers is proportional to the amount of time squared that the object has been falling. Again, if you're going to get technical about it, you'd add "without air resistance." This phenomenon, however, appears to not depend on mass or weight. A light object will fall just as quickly as a heavy object, especially if air resistance isn’t a factor. There it is. Someone did that on the Moon, incidentally. This proved two things: 1) Physics is right; 2) at least that particular Moon expedition wasn't faked on an Earthbound sound stage. The article proceeds to go into a lengthy (and weighty) discussion of the differences between scales and balances, and how the former measures weight while the latter measures mass. It's remarkably math-light. Then: We can sum up the difference succinctly: your mass is an inherent quality of the atoms that make up your body, but your weight is dependent on how those atoms accelerate under the influence of all the factors and forces acting on it. Which I suppose is the main point of the article; everything else exists to support it. 
There are plenty of physics textbooks (and physics teachers) that ignore this difference, and simply state that your weight, W, always obeys the equation W = mg. This is incorrect; it is only true when you are at rest on the surface of the Earth. There's a popular trick question: "Which weighs more, a pound of feathers or a pound of lead?" It's a trick for several reasons, not least that a pound is a unit of weight, so by definition they weigh exactly the same. We often talk about “watching our weight” or “trying to lose weight,” but if that were truly your goal, you could simply go to a higher elevation, move to a different planet, or even get into an elevator and wait for the door to close after you hit the “down” button. Yes, "simply... move to a different planet." Sounds like a good idea for other reasons, these days.
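Speaking of that elevator: here's a hedged little sketch of apparent weight, W = m(g + a), where a is the elevator's upward acceleration. Again, the numbers and names are mine, not the article's.

    # Apparent weight: what a scale under your feet would read, in newtons.
    # A sketch with made-up numbers, assuming constant acceleration and no
    # complications like air resistance or a snapped cable.
    G = 9.80665  # standard gravity at Earth's surface, m/s^2

    def apparent_weight(mass_kg, accel_up):
        # accel_up is the elevator's acceleration in m/s^2;
        # negative means it's accelerating downward.
        return mass_kg * (G + accel_up)

    m = 70.0  # mass in kilograms; this never changes
    print(apparent_weight(m, 0.0))   # at rest: ~686.5 N, the familiar W = mg
    print(apparent_weight(m, -1.5))  # accelerating down: ~581.5 N. Instant weight loss!
    print(apparent_weight(m, -G))    # free fall: 0.0 N. Weightless; same mass.

Same atoms, same mass, three different weights. |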
Diving into the past again today, I came up with a relatively recent entry, from May of last year: "A Frank Discussion" The entry revolved around a piece on the bon appétit site; as of right now, the article is still there. And it's about figuring out which big-brand hot dogs are best. I doubt the world of wieners has changed much in a year and a half. The article has a "summer's coming so here's something about grilling" slant, which, of course, is precisely the wrong time of year here in the One True Hemisphere. Apparently, at the time, I'd overlooked one of the article's biggest flaws: while aimed at summer grill cooks, the testing featured boiled hot dogs. As anyone with taste buds knows, boiled dogs taste way different from grilled dogs. Me: It's been many years since I've actually eaten a hot dog, frankfurter, or wiener; anything requiring a hot dog bun. And now it's been many years plus a year and a half. Not to mention I know what they're made of, but that doesn't stop me from eating breakfast sausages. I've also eaten way worse than breakfast sausages, since then. Here's the thing: it's hard to be objective about food (or drinks) during a taste test. Taste is, well, a matter of taste. Beer, for example, is highly personal; some love *shudder* IPAs, while I prefer darker, less hoppy brews. Additionally, taste changes over time, and can be affected by numerous factors, such as your overall health and the last thing you ate or drank. I concluded that entry with what may have been my first assertion in here that a hot dog is actually a taco. I'm not sure I actually believe it, myself, but it does tend to get one to think about categorization problems and their edge cases. |
It's okay, folks. No need to worry about global warming; we've got the important bit covered: Century-old experiment secures beer and whiskey’s future Genetic insights could help grains endure climate change Thanks to an experiment started before the Great Depression, researchers have pinpointed the genes behind the remarkable adaptability of barley, a key ingredient in beer and whiskey. These insights could ensure the crop’s continued survival amidst rapid climate change. Whew! Coffee and chocolate future still uncertain, but those are less important. Grown everywhere from Asia and Egypt to Norway and the Andes mountains of South America, barley is one of the world’s most important cereal crops and has been for at least 12,000 years. As it has spread across the globe, random changes to its DNA allowed it to survive in each new location. I know it's necessary to summarize for an article, but "random changes" are only one component of adaptability. The article talks about the experiment promised in the headline, and I won't quote it; I have little to say about the details except that it seems legitimate to me. Then, towards the end: Using modern technology like genome engineering and CRISPR, researchers could try to engineer other crops that flower at specific, more advantageous times. And approximately 15 seconds later, someone's going to screech about genetically engineered crops. But the really important quote, they saved for the end: “Barley’s ability to adapt has served as a cornerstone to the development of civilization. Understanding it is important not just to keep making alcoholic beverages, but also for our ability to develop the crops of the future and enhance their ability to adapt as the world changes,” Koenig said. Or, to put it in layman's terms: Beer. Is there anything it can't do? |
We all know I'm not religious. I'm also not spiritual, unless by "spirit" one means "distilled beverage." The closest I get is when I listen to music. But, of course, it has to be good music—which has nothing to do with whether it was created with religion in mind or not. So here's a Cracked article about songs you might not have realized were religious (and, somehow, Leonard Cohen isn't on the list). 4 Famous Songs That Are Secretly Religious This was supposed to be a pop song. We didn’t know it was about Mormons For example, you might realize the band you’re listening to is named “Creed,” and the song is talking about going “higher, to a place where blind men see.” Clearly, this is about heaven. And yet the band actually insists they are not religious, as they’re not trying to preach anything to anyone. Songs (and other works of art) can be religious without being preachy. However: 1) A band naming itself "Creed" and claiming to not be religious sets off my bullshit detectors; 2) regardless of their inspiration, they suck. I like to change the lyrics of their most famous song to "With Legs Wide Open." Or, you might hear that 1960s classic “Spirit in the Sky” and note the lyric, “Gotta have a friend in Jesus.” You then might conclude it’s a Christian song. Singer-songwriter Norman Greenbaum is actually Jewish, but penning that song was a smart move. Insert Jewish joke about "business is business" here. 4 Imagine Dragons’ ‘Radioactive’ Is About Leaving Mormonism If you were told that one of Imagine Dragons’ songs is religious, maybe you’d go with “Believer,” because it talks about being a believer. Maybe you’d go with “Demons” because it talks about demons. Maybe you’d go with “Thunder” because that one’s so terrible that only divine intervention can explain its success. Okay, that's funny. But... who? The band formed at Brigham Young University, and when you go on to leave the Church of Latter-day Saints, the memory of your time there stays with you. We know for sure that one song of theirs is about that subject: “Radioactive.” Yeah, so, kind of the opposite of religious, I guess? Hell if I know. I didn't watch the video they embedded in the article, so I still don't think I've ever heard any of their stuff. 3 Simon and Garfunkel’s ‘Cecilia’ Is St. Cecilia The idea that “Cecilia” is a song directed at a saint sounds absurd. Yeah, it kind of does, especially since Paul Simon and Art Garfunkel are in the same tribe as Norman Greenbaum. But hey, Simon said it, and while he may have been trolling, it kind of works on a metaphorical level. It doesn't help that the song is probably the duo's least awesome. 2 ‘Dem Bones’ Is a Spiritual About the Promised Land Unlike the others, this isn't a relatively recent pop song, but, as the article notes, an older gospel tune. Which means it's not "secretly" religious at all, but I guess there's a secularized version for broader appeal. Today, the song is most important during Halloween, because it gets children talking about spooky skeletons. That’s fine. When you get down to it, Halloween is the holiest night of the year. And technically, anything involving an afterlife has religious connotations. 1 ‘The Star-Spangled Banner’ Is About the Greek Pantheon This is the one I saved this article for, even though the header is misleading. "The Star-Spangled Banner" is the name of Key's poem, which, as we all know, was set to an existing tune. If you’re a fan of trivia, you might already know that this melody came from a British drinking song.
What's weird to me is that I've known this for decades, but I never could be arsed to find out exactly what the drinking song was. Well, now I know... it's "To Anacreon in Heaven." The song is about a bunch of Greek gods arguing with each other. When the poet Anacreon urges humanity to drink and screw, the god Jupiter gets angry and considers intervening. Apollo disagrees, claiming Jupiter's mighty thunderbolts are nothing against the power of music. And whaddaya know, Apollo was right. The United States may not have been founded as a religious country, but if you ever feel the need to pledge allegiance to one nation under God, feel free to specify which god it is. Why, Dionysus, of course. |
Back in the early days of this blog, I had a running joke about ragging on Hello Kitty. Very recently, I found this article from Fast Company: Hello Kitty turns 50: Here’s how she became a global moneymaker As Hello Kitty’s commercial success expanded beyond Asia, so did her personal profile. At least, I thought of it as a running joke. Looking back, it might have seemed like a serious hate-on. Either way, though, the whole reason for the gag was the Nefarious Neko's worldwide appeal, how the character seemed to be everywhere, on all things, in all contexts. This, of course, is successful marketing. As I've said before, I sometimes put marketing articles in here because many people have things they want to promote, like, maybe, their books. And also, the psychology of it can be interesting. Hello Kitty turns 50 on Friday. For context, if you don't want to click on the link, "Friday" was November 1 of this year. As a tabula rasa open to interpretation, the non-threatening creation was the perfect vehicle for making money, she said. This strikes me as opposite to most marketing advice, which is to know your audience and pander to it. In the past, I'd have taken issue with the "non-threatening" description, but again, I think those jokes fell flat. There have been anniversary editions of merchandise ranging from pet collars, cosmetics and McDonald’s Happy Meals to Crocs and a Baccarat crystal figurine. On the other hand, I'd have had a field day with Hello Kitty Crocs. Hell, I might still be able to work up a rant about that particular combination. By the late 1970s, Sanrio revealed the character’s name as Kitty White, her height as five apples tall and her birthplace as suburban London, where the company said she lived with her parents and twin sister Mimmy. That much, I knew. ("Know your enemy," I might have said in the past.) Her TV appearances required co-stars, including a pet cat named Charmmy Kitty that made its debut 20 years ago. This was also always suspect to me. A cat owning a cat? Isn't that, like, slavery? But Hello Kitty’s 40th birthday brought an update that astonished fans. Sanrio clarified to a Los Angeles museum curator that Kitty, despite her feline features, was a little girl. Thus sidestepping the "slavery" issue and pushing the conversation back to animal rights. Just kidding; I doubt anyone else thought deeply about the ethics involved. “She is supposed to be Kitty White and English. But this is part of the enigma: Who is Hello Kitty? We can’t figure it out. We don’t even know if she is a cat,” art historian Joyce S. Cheng, a University of Oregon associate professor, said. “There is an unresolved indeterminacy about her that is so amazing.” Like if she were in a box with a radioactive atom. Schrödinger's Hello Kitty. Part of the confusion stems from a misunderstanding of “kawaii,” which is Japanese for “cute” but also connotes a lovable or adorable essence. It may be cliché, but some things really do get lost in translation. During a presentation earlier this year in Seoul, Hello Kitty designer Yamaguchi said one of her unfulfilled goals was finding a way “to develop a Hello Kitty for men to fall in love with as well.” But she’s still working on it. I have some ideas, but I try to keep this blog 18+.
Even if I'd been serious about my ragging on Hello Kitty, that came to an end some years ago, when I read an article (which I can't find now) featuring some village in northern Siberia, considered the most remote human settlement in the world (pretty sure they didn't include Antarctic research bases, just places where humans naturally settled). Of course, the village isn't entirely unreachable, or they wouldn't have done an article on it, but apparently, it's only accessible by some rickety train line during the one week of the year when it's summer there. The article included pictures, and in one of them, a little girl in this remote village unconnected to the world at large was wearing a Hello Kitty shirt. That is when I knew that her dominance was complete, and no amount of rage, joking or otherwise, would end her hegemony. Honestly, these days, I think we could do a lot worse. |