Not for the faint of art. |
Complex Numbers A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number. The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi. Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary. Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
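Since that last sentence is the kind of claim that benefits from a demonstration, here's a minimal sketch in Python (my own illustration; the text above implies no code): iterate the very simple map z → z² + c and mark which values of c never escape. That recipe produces the most famous fractal of all, the Mandelbrot set.

# Minimal Mandelbrot sketch: repeatedly apply z -> z*z + c on the complex
# plane and check whether the orbit stays bounded.
def escape_count(c, limit=50):
    z = 0j
    for n in range(limit):
        z = z * z + c  # the "very simple transformation"
        if abs(z) > 2:  # once |z| exceeds 2, the orbit is guaranteed to diverge
            return n
    return limit  # never escaped within the limit: treat c as inside the set

for im in range(10, -11, -2):
    print("".join("#" if escape_count(complex(re / 10, im / 10)) == 50 else "."
                  for re in range(-20, 6)))

Run it and a crude ASCII silhouette of the set appears; the enormous intricacy lives along its boundary, and finer grids with more iterations reveal more of it. |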
No, the title isn't political commentary. My random selection of past entries yielded this bit from the end of 2018: "Worst Year Ever" The (very short) entry is basically a few comments on a link to an article from Science.org, which is still available and claims that the year 536 C.E. was "the worst [year] to be alive." My commentary back then was pretty much focused on praising the interdisciplinary science that produced this result. But now, six years on, I might have a few more things to say about the article itself. Ask medieval historian Michael McCormick what year was the worst to be alive, and he's got an answer: "536." "Worst" is obviously a matter of opinion. It's also dependent on the opinion-holder's experience and knowledge. Ask people what the worst movie ever made is, and you'll get a bunch of different answers, but none of them are definitive because I guarantee you none of them have seen every single movie ever made, and one person's "worst" may be another's "yeah, it was pretty bad." In this case, we've got a guy whose area of focus seems to be medieval Europe, so his answer is understandably Eurocentric. A mysterious fog plunged Europe, the Middle East, and parts of Asia into darkness, day and night—for 18 months. See? Nothing about North America, which had a decent population at the time, or anything in the Southern Hemisphere. It might have been a shitty time in those places, but maybe not the worst year to be alive. Especially for the Native Americans or Indigenous Australians. Temperatures in the summer of 536 fell 1.5°C to 2.5°C, initiating the coldest decade in the past 2300 years. Hey, guys, I think I've figured out how to stop global warming. If we can't make a volcano go boom, there's always nuclear winter, with its similar effects. Historians have long known that the middle of the sixth century was a dark hour in what used to be called the Dark Ages, but the source of the mysterious clouds has long been a puzzle. Spoiler: volcanic eruptions in Iceland. Probably. I mean, it's science, so results can have varying confidence levels, and I haven't seen any updated articles since then. Still, my main point remains: when different disciplines cross-reference each other, you get better science. |
When science meets art, are they both elevated? Or do they make each other suck worse? From CNN: Turbulent skies of Vincent Van Gogh’s ‘The Starry Night’ align with a scientific theory, study finds I usually groan when I see "scientific theory" in a headline, because the word "theory" means something different in science than it does in everyday speech. In this case, though, my fears proved unfounded. Which doesn't mean I don't have some issues with the rest of the piece. Now, a new analysis by physicists based in China and France suggests the artist had a deep, intuitive understanding of the mathematical structure of turbulent flow. Or, and hear me out here, he was an artist and thus observed turbulent flow in, perhaps, a river or whatever, and incorporated that observation without being able to do math beyond "this paint costs 3 francs and this other one costs 4; which one is cheaper?" As a common natural phenomenon observed in fluids — moving water, ocean currents, blood flow, billowing storm clouds and plumes of smoke — turbulent flow is chaotic, as larger swirls, or eddies, form and break down into smaller ones. "Chaotic" is another word that means something different in science than it does in everyday speech. Again, though, credit where it's due; the article uses it in the scientific sense: It may appear random to the casual observer, but turbulence nonetheless follows a cascading pattern that can be studied and, at least partially, explained using mathematical equations. "Partially" is doing a lot of the work in that sentence. “The Starry Night” is an oil-on-canvas painting that, the study noted, depicts a view just before sunrise from the east-facing window of the artist’s asylum room at Saint-Rémy-de-Provence in southern France. There are, I think, a few paintings that even the art-blind (like me) can identify at first glance. Mona Lisa. That Michelangelo thing with God and Human. The Scream. Maybe that one with the farmers. And The Starry Night. But, on the off-chance you have no idea what we're talking about, the article provides helpful illustrations. Using a digital image of the painting, Huang and his colleagues examined the scale of its 14 main whirling shapes to understand whether they aligned with physical theories that describe the transfer of energy from large- to small-scale eddies as they collide and interact with one another. I would love to have seen their grant proposal. "Yeah, we're going to study... art." The atmospheric motion of the painted sky cannot be directly measured, so Huang and his colleagues precisely measured the brushstrokes and compared the size of the brushstrokes to the mathematical scales expected from turbulence theories. To gauge physical movement, they used the relative brightness or luminance of the varying paint colors. One wonders if they already had a conclusion in mind when they picked those criteria, which would make it questionable science. (I'll sketch what that kind of check looks like at the end of this entry.) Huang and the team also found that the paint, at the smallest scale, mixes around with some background swirls and whirls in a fashion predicted by turbulence theory, following a statistical pattern known as Batchelor’s scaling. Batchelor’s scaling mathematically represents how small particles, such as drifting algae in the ocean or pieces of dust in the wind, are passively mixed around by turbulent flow. And here's where most of the red flags appear, to me. Paint is, and you might want to sit down for this one, a fluid. 
Granted, that's just about the limit of my knowledge of artists' paint, but I have a high degree of confidence in my assertion, having seen paint in its fluid form. I've even seen paint mixed, and noted the swirls and eddies of turbulence. This is kind of like seeing milk added to coffee, and I don't drink coffee either. Where I become more speculative is in thinking: well, he painted the thing with wet paint, so of course there's turbulence at the small-scale boundaries of brushstrokes. Beattie agreed: “It’s an amazing coincidence that Van Gogh’s beautiful painting shares many of the same statistics as turbulence,” he said. While there is, indeed, such a thing as coincidence, I don't agree that this is an example of it. The study team performed the same analysis and detected the same phenomenon in two other images, one a painting, “Chain Pier, Brighton,” created by British artist John Constable in 1826-7, and the other a photograph of Jupiter’s Great Red Spot, taken by NASA’s Voyager 1 spacecraft on March 5, 1979. Now, that seems a little more like what I'd expect from science. First, another painting; perhaps as a control of sorts. Hell if I know; I don't know that painting, and CNN neglected to illustrate it for us (there is, however, a link). As for Jupiter, we're pretty sure it exhibits all the hallmarks of turbulence on its visible surface, so it's a check on their modeling assumptions. And yet, it shouldn't boggle anyone's mind that an artist noticed turbulence and tried to recreate it, or that one as brilliant as van Gogh was able to do it.
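As promised above, here's roughly what that kind of check looks like in practice, sketched in Python with NumPy. This is my own toy version under loud assumptions (random noise standing in for a painting's luminance field); the study's actual pipeline was certainly far more careful.

import numpy as np

def radial_spectrum(luminance):
    # 2D power spectrum, averaged over rings of constant wavenumber
    f = np.fft.fftshift(np.fft.fft2(luminance))
    power = np.abs(f) ** 2
    ny, nx = luminance.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - nx // 2, y - ny // 2).astype(int)
    sums = np.bincount(r.ravel(), power.ravel())
    counts = np.bincount(r.ravel())
    return sums / np.maximum(counts, 1)

rng = np.random.default_rng(0)
fake_sky = rng.standard_normal((256, 256))  # stand-in for measured luminance
spectrum = radial_spectrum(fake_sky)
k = np.arange(2, 100)
slope = np.polyfit(np.log(k), np.log(spectrum[2:100]), 1)[0]
print(f"spectral slope: {slope:.2f} (Kolmogorov's cascade predicts about -5/3)")

Feed it real luminance data instead of noise and the fitted slope is the number you'd compare against turbulence theory; white noise, for the record, comes out roughly flat. |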
Today, we have an exercise in metaphor-stretching: This starts out looking like a piece about housing: When I was a kid, I lived in a bungalow for a while. You know, one of those houses that’s compact and square and has a half-upper floor that’s basically just a loft with sloped ceilings. Yeah, well, the house I lived in as a kid was originally a shotgun shack. Now if someone told me to live in a loft with sloped ceilings I’d ask if there was any regular house available, please. Nice to be able to be picky, isn't it? Something else I’ve noticed that’s shaped like a bungalow is the way we talk to each other. Human speech. And that's the metaphor part, which, as it turns out, is what the post is really about. We waste an unbelievable amount of time — in our daily lives, on podcasts, in interviews, in blogs and articles, even in Tweets and Notes for God’s sakes — qualifying everything we say with caveats. We say “now I know not all x are y, and I know that historically abc, and I’m not trying to say that lmnop so please don’t take this wrong…” Granted, I avoid X/Twatter like the plague it is, and I don't listen to podcasts. But while I've seen what this dude's talking about, and even engaged in it on occasion (for instance, when I note that I might be talking about something with a US bias), I don't think disclaimers like that rise to the level of wasting "an unbelievable amount of time." Saying anything even remotely controversial on the internet is terrifying. Is it though? Is it more terrifying than saying it in a crowd of Americans with guns? No matter what argument you make on the internet, you will get people who reject it wholesale because you forgot an asterisk. Because you forgot to mention their particular edge case. Yes, and? We all know that not all heterosexual dating advice or sex advice applies to transgender people. It doesn’t even apply to all straight people. We are all intelligent enough to know that. [citation needed] on the "intelligent" bit. People have managed to make it insensitive to speak about things that the majority of people deal with. People have managed to make it insensitive to be normal. And what, exactly, is "normal?" And then the tyrannical minority ostracizes you for it, and in turn makes it okay for everyone else to ostracize you. "Tyrannical minority?" Loud, sometimes, maybe, but that seems like an oxymoron. Unless the "minority" is an actual, political tyrant, a minority of one. If you drove down a street consisting entirely of overly-decorated bungalows, with nice upper windows and big, furnished porches, you’d call bullshit on the entire street. No, I don't think I would. Everywhere I look, I see people decorating their speech with nuance when all they really want to say is some simple, normal thing. It feels like bullshit. Well, it's not everywhere I look. Perhaps examine your own biases, first? People don't change their minds on the internet. They usually do that in books, battlefields, or not at all. Yeah, that's a little bleak. Some people are just actively searchlighting for reasons to get outraged. They aren’t worth listening to. On that, we can agree. People who need that much nuance weren’t going to learn from your argument anyway. Aaaannnd you've lost me again. As I've said before — we as a culture have become profoundly unserious. To me, the opposite of "serious" is "funny." So the opposite of "unserious" would be "unfunny." Me? I'd rather be funny. 
Maybe if we just start saying what we mean and placing the impetus on the reader to read nuance into the topic, we’ll all grow up a little. Well, sure, calling me childish certainly helps your argument. People are tired of having to decorate their speech to make it marketable. I think by "people," he means "I." As in him, not me. I think we’re past peak Woke, and I think part of what that means is that we’re past peak not-being-able-to-speak-like-adults. Ah, there it is. I'm not dismissing his argument, mind you. I read the whole thing, top to bottom (it's really not that long). I simply don't agree with most of it, though I accept that my opinion could change. If you're going to do a metaphor, make sure it's one that we can actually relate to. Bungalows may not be exotic or ritzy, but they're generally better than no home at all. So that's me, saying what I mean. |
I've linked Aeon articles in here before. This seems to be an affiliated site, Psyche. Apparently, someone loves Ancient Greek. How to do mental time travel Feeling overwhelmed by the present moment? Find a connection to the longer view and a wiser perspective on what matters Bullshit sense... tingling! But let's give this a chance. You have a remarkable talent – the ability to step outside the present, and imagine the past and future in your mind’s eye. Hey, you know what else you can imagine? Things that never happened, places that don't exist, and impossible scenarios. It's a rather important ability for fiction writers. Or city planners. Some people apparently don't have the capability for visualization, while others can experience it more vividly. But I think it's fair to say most people have some ability to imagine. Known as ‘mental time-travel’, some psychologists propose it’s a trait that allowed our species to thrive. Well, at least this is the polar opposite of the "staying in the present moment" crap that's been circulating. I'll give it that. If I ask you to imagine what you did yesterday, or what you’re planning for tomorrow, you can conjure up rich scenes in the theatre of your mind. Well. Some of us can. I doubt the author intended to be ableist, but this is a bit like saying, "If I ask you to walk a mile, you can get up and walk a mile," ignoring or forgetting that paraplegics exist. In the accelerating, information-rich, target-driven culture of the early 21st century, the present often dominates thoughts and priorities instead. And some people seem to want to make "the present" the only priority. We need to be present-minded sometimes. However, too much focus on the ‘now’ can also lead to the kind of harmful short-termism that infuses business, politics and media – a near-term perspective that worsens many of the long-term challenges we face this century, such as the climate crisis. I can't really disagree with that. I've said similar things. Whether we can visualize things or not, we can learn from the past and make plans for the future, and both of those things are important. But it’s also compounded by a host of unhelpful human habits and biases too, such as our ‘present bias’, whereby we tend to prioritise short-term rewards over long-term benefits (the classic example is the marshmallow test, in which some children can’t resist eating a single treat now, rejecting the chance to chomp two later on). Yeah... that might not be the best example to quote. A longer view provides a deeper, richer awareness of how we fit into the human story – and the planet’s – and reveals just how fortunate you are to be here, right now. The geologist Marcia Bjornerud calls this perspective ‘timefulness’. I'm not sure I like that name any more than I like its apparent inspiration, mindfulness. Why a geologist gets to weigh in at all should be obvious: they generally cultivate a real sense of deep time, working as they do with rocks that sometimes predate eukaryotic life. In this Guide, I’ll share practical tips and exercises that can help you escape the unwanted, short-termist distractions of the present, and discover the upsides of a longer time perspective. The author proceeds to do just that, and at length. I don't think I need to copy anything else; if you're interested, go to the link (hopefully it won't rot anytime soon). I will note, however, that he does turn back to "mindfulness" at one point in there. As for my bullshit sense, well, jury's still out for me. 
I tend to distrust pop psychology (for example, the marshmallow study, above), though that doesn't mean it's all bullshit. But right now, I have plans for the rest of the day because, no, I don't live in the present. |
Well, that's finally over. Now I just get to grump at holiday season chatter. Today's article, from BBC, has nothing to do with politics or seasons, and everything to do with the invention you're using right now. There have been differences of opinion concerning the actual beginning of the internet. It's not like a human birth, or Armstrong's boot on the moon: a clear and obvious transition point. In my view, this article is more about a precursor technology, but a vital one for what the internet became. On 29 October 1969, two scientists established a connection between computers some 350 miles apart and started typing a message. Halfway through, it crashed. 1969 would have been long before "try rebooting and reinstalling all your drivers" would become tech support's second suggestion, after "make sure it's plugged in." At the height of the Cold War, Charley Kline and Bill Duvall were two bright-eyed engineers on the front lines of one of technology's most ambitious experiments. I should note, for context, that this was the same year as the aforementioned moon landing. Unlike NASA's stated mission, though, this early attempt at remote networking was in service of more military pursuits. Funded by the US Department of Defense, the project aimed to create a network that could directly share data without relying on telephone lines. Instead, this system used a method of data delivery called "packet switching" that would later form the basis for the modern internet. Like I said, not the actual invention of the internet, and military. (Curious what packet switching actually means? A toy sketch awaits at the end of this entry.) It was the first test of a technology that would change almost every facet of human life. But before it could work, you had to log in. Some things don't change. But Kline didn't even make it all the way through the word "L-O-G-I-N" before Duvall told him over the phone that his system had crashed. Thanks to that error, the first "message" that Kline sent Duvall on that autumn day in 1969 was simply the letters "L-O". And that's what I find amusing about the story: it's very Biblical. "Lo!" As in "Lo and behold." On the other talon, I want to think that once they got it working (which, as the article notes, they did, after about an hour), the second message sent over this proto-internet was "Send nudes." The BBC spoke to Kline and Duvall for the 55th anniversary of the occasion. The rest is a transcript of that interview. It goes into more depth over what happened (or didn't happen) at the time, but there's no reason to repeat it here. As compelling as this origin story is, I had this vague memory of a different origin story for the internet, one which took place some years later. In a rare case of me actually looking something up, I found this entry from 2019: "Birthed in Beer" So if I had to choose which one was the actual invention of the internet, I'd pick the 1970s one, because it involved beer.
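As promised, a toy sketch (Python, mine; real ARPANET protocols were enormously more involved) of the core packet-switching idea: chop a message into independently deliverable chunks, each carrying enough header to put things back in order. Fittingly, the first packet of "LOGIN" comes out as "LO".

# Toy packet switching: split a message into numbered packets, then
# reassemble them regardless of arrival order.
def packetize(message, size=2):
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    return "".join(data for _, data in sorted(packets))

packets = packetize("LOGIN")
print(packets)                        # [(0, 'LO'), (2, 'GI'), (4, 'N')]
print(reassemble(reversed(packets)))  # arrival order doesn't matter: LOGIN

The point of the scheme is that each chunk can take any available route through the network and still arrive as a coherent message, which is exactly the resilience the Defense Department was paying for. |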
Ah, yes, November 5, and Election Day in the US. The UK will be celebrating an attempted terrorist attack, while over here, we're trying to avoid a terrorist attack. The Random Number Gods have chosen to bless us with another Cracked link today. Okay, but, no, there are no spots on Earth where the laws of physics actually fall apart. Well, maybe, sometimes, at CERN, but they do it on purpose. Still, these are rather interesting. There are some places on the planet where things get weird. For instance, you ever heard of the Bermuda Triangle? Well, it turns out there’s nothing weird about that bit of the ocean at all — it sees a lot of traffic, but vessels that travel there are no more at risk than those anywhere else. Like Bigfoot, that's not going to stop humans from making shit up about it. 5 Gravity Drops Near Sri Lanka Once you learn that the force of gravity is slightly variable across the planet, it stands to reason that there are some spots where it's less and others where it's more. Finding out where it's less, though, that's what science does. Gravity varies from place to place — and in some places, it varies a lot. In the ocean near Sri Lanka, gravity is so much weaker than in the rest of the world that the sea level is more than 300 feet lower than it would otherwise be. Hey, I just came up with a fix for rising sea levels! Just increase the gravity of the Indian Ocean, and presto! To know why gravity’s so low there, we’d have to burrow deep into the planet, and possibly cut it in two, which is inadvisable. Awww. Now, look, what that article's not telling you is that it's a minuscule effect. The variation from average, above or below, is about 0.5%. We'd never feel it. Sure, it has a profound effect on sea level, but look at what the comparatively really very tiny effect of the Moon's gravity does to that on a daily basis. 4 The Toasty Bit of Norway In the Norwegian Sea, we have one disturbing bit called the Lofoten Vortex, where the water stores an unusual level of heat. Lofoten Vortex can be the name of my Bjork cover band. Yes, yes, I know Bjork is from Iceland, but come on, look at a globe. (A flat map inevitably distorts the distance between Iceland and Norway.) 3 On Top of Paraguay, Magnetism Disappears I recently saw an argument for why the Earth's magnetic field isn't as important as we thought for maintaining our atmosphere. But this isn't about that. Without looking at the map, if you had to guess one spot where the magnetic field gets weird, maybe you’d point your finger at one of the poles. But the planet’s rotational axis, which defines where we put the north and south poles, isn’t the same as its magnetic axis (which creates the magnetic field). As a result, we have this belt of radiation around the globe that dips down and comes close to us at this unlikely spot above Paraguay. So, it doesn't "disappear." It just gets weaker there. How much weaker, I can't be arsed to look up. I can forgive a comedy site for hyperbolic headers, but they made it sound like, I don't know, compasses won't work in Paraguay or something. 2 The Tulsa Center of the Universe You might not have known that the center of the universe is in Tulsa, but that’s what this spot is named, and Oklahoma wouldn’t lie to us. Can't be the center of the universe; I don't live in Tulsa. We’d go investigate ourselves, but that would mean having to spend time in Tulsa. Good reason. 1 The Cave Where Energy Comes from Rock I mean, technically, coal is rock. Uranium ore is rock. 
But then you have Movile Cave in Romania. The interior is totally cut off from the outside world, and the creatures in there get no energy through photosynthesis, either directly or indirectly. Instead, the producers of this food web are bacteria that get their energy through chemosynthesis. Here, though, they're talking about an entire cave ecosystem that doesn't rely, at its base, on solar energy input (apart from, you know, it not being frozen solid and all). Which is outside our normal experience, and definitely doesn't break the laws of physics any more than the other examples do, but is really interesting. Because now we know for sure that life can exist without photosynthesis. Exist, yes, but imagine crawling out of that cave to go vote. Who are the candidates, now? |
This Cracked article might have been better to post before Halloween, but remember, tomorrow is Election Day here. Scary stuff is still in our future. I have often looked at articles about what some scientists are doing and thought: "You fools! Have you never seen a horror movie?" Animal testing is an unfortunate but necessary part of certain scientific fields — that is, until the scientists decide to go all Dr. Moreau on some rats. Then it’s just weird. It’s like they’ve never seen a single monster movie. See? I'm not the only one. (Sometimes, "horror" gets replaced with "science fiction," but usually, horror is still a subgenre.) 5 The Spider Goat Spider silk is super useful for a lot of different things, but there’s a reason you don’t see a lot of spider farms. I saw a video recently on how silkworms do their thing. Silkworms are larval moths, and generally unpleasant to look at, but they lack the visceral horror of arachnids. Still, spiders are cool... from a distance. In 2012, Utah State University geneticists rectified that problem by splicing spider DNA into goat embryos, who eventually grew up to lactate spider silk. Friendly neighborhood spider-goat. 4 The Man Mice In 2013, scientists at the University of Rochester implanted human glial brain cells into the brains of newborn mice, who became much smarter and learned faster than other mice as a result. "What are we going to do tomorrow night, Brain?" Also, anyone who has ever seen a horror movie should have implored them to stop. 3 Acid Elephant In the ‘60s, experimenting with LSD was all the rage, in both the scientific and “hanging out in your friend’s cousin’s basement” senses of the word. By 1962, scientists at the University of Oklahoma had run out of ideas until one of them asked, “What if we gave an elephant a thousand doses?” Which makes me wonder: if we see pink elephants, what do elephants see? Gray humans? 2 Magnetized Cockroaches “Can you turn a cockroach into a magnet?” is a question you only ask after you’ve been seriously scientifically jaded... ...or on a serious acid trip. 1 Zombie Dogs Believe it or not, we do know how to bring once-living creatures back from the dead. Thus shifting the definition of "dead." Used to be: heart stopped = death, but then CPR came along, and now doctors have to pretty much guess when the point of no return occurs. Cornish hoped to try his method on humans, specifically a recently executed prisoner, but the government forbade it not out of any fear of a zombie apocalypse but because they weren’t sure how double jeopardy laws applied to a revived corpse. Seems like an important legal loophole to fix. Add to this mix the penchant of certain scientists to revive millennia-old bacteria found in ice cores, and you definitely have the makings of a horror movie. But still not as scary as tomorrow's election. |
I have an article in the queue about author deaths. Back in June of 2021, I shared a different article about author deaths: "Inevitable" As you know, one reason I do these entries is to see how things have changed. I can assure you that every person listed in that original article is still dead. The original article, from LitHub, is still up. Though I clicked on it to check, I no longer follow that site and want nothing to do with them, which is why we haven't had any LitHub links in over a year. So here's my 2024 take on my 2021 words: Most of us don't choose the time and place of our demise, with notable exceptions such as Hunter S. Thompson. On manner of death, Thompson, of course, plagiarized Hemingway. Quote from the article: "Camus died in a car crash. Simple enough, right? ...Apparently, Camus once said that the most absurd way to die was in a car accident." My response: I can think of far more absurd ways to die, but most of them involve alcohol and maybe prostitutes. I think, with that, I was trying to evoke the same ironic twist ending that Camus experienced. Interesting as some of these are, I think it's better to be remembered for how you lived than for how you died. But, failing the former, I'll take the latter. |
I guess sometimes Lifehacker is good for a laugh. Laughacker? I don't know. Spatchcocking Your Chicken Is Worth the Effort Those thighs aren't going to crisp tucked way under there. Settle down, Beavis. Roast chicken is an everyday pleasure—a good fit for both special occasions and midnight snacks. Which is why my grocery store sells rotisserie chicks. While you might be familiar with the classic roasting style, with trussed legs and tucked wings, this method can lead to overcooked breasts and soggy thighs, two phrases I want nowhere near my chicken. Heh heh heh huh huh There’s a better way to roast your chicken for more even cooking: spatchcocking. Bwaaaaahahahaha You can spatchcock, or butterfly, any bird. That bit might only be funny if you're familiar with British slang. Traditional roasting puts the driest cut of meat (the breast of the chicken) at the top, often closest to the heating element—before you've even turned up the heat, it’s a recipe for overcooking. The parts that are juiciest (i.e. the thighs) are lower, if not completely under the rest of the body, and shielded from direct heat. Somehow I'm hungry for wings, now, and I hate wings. Illustrated instructions follow. I'd suggest not going to the link if you're vegan or vegetarian, or, like me, are allergic to hard work. |
Did my early voting today, so I can spend Election Day doing more important things, like drinking. Unrelated to that or, well, anything else, really, is this Atlas Obscura article featuring an obscure city in an obscure state: The Strange Heat Island Lurking Beneath Minneapolis An urban explorer ventured deep below downtown in search of Schieks Cave. What he found changed science. Greg Brick knew it was there, lurking beneath his city, hidden within the Minneapolis water and sewer system: an enticing geologic anomaly called Schieks Cave. That sort of thing fascinates me, but I can't be arsed to do all the work or break the laws necessary to explore for myself. When Brick arrived at the cave, he shined a flashlight into the thick darkness of the city’s underbelly. He found not just the natural void but also concrete walls that previous generations of civil engineers had built to support the natural structure. If those civil engineers were anything like me, they wrote something like "put a concrete wall here" on a map, and some minimum-wage temps did the actual work. The water was about 20 degrees hotter than it was supposed to be. It was more like the groundwater of Mississippi than that of Minneapolis. Something was warming the water beneath his city. Well, clearly, that's because Minneapolis is actually a gateway to Hell. [The cave's] discovery in 1904 by a city sewer engineer was initially kept secret lest the public fear Minneapolis had been built on unstable ground. Apparently, it was, else generations of civil engineers wouldn't have specced out concrete support walls in the cave. He finally had a way to access the cave—but there was another problem. Along the route, the raw sewage poured from shafts overhead, shooting bacteria into the air as it splashed down and creating what’s politely known as “coliform aerosols.” Unsurprisingly, Brick got sick. Another reason for us civil engineers to stay at our desks. In 2008, a separate team from the University of Minnesota had predicted that heat from Minneapolis’s urban surface was conducting itself deep underground, heating the groundwater there like a metropolitan microwave. As the article notes, this turned out to be the correct explanation, not the "gateway to Hell" one. Much to my disappointment. But it’s not all bad news: Canadian and European researchers recently suggested recycling underground heat and using it as a low-carbon way to heat homes, while also cooling the groundwater back down to normal temps. I hope their "suggestion" included exactly how to do that. Now, from the headline, I was expecting some discovery of dark matter or a violation of the Second Law of Thermodynamics or whatever. But no. Still, at least it's not boring. |
Samhain, and the temperature is supposed to get up into the 80s (Freedom Units) today. Not that I'm unhappy about that, but it is somewhat unnerving for the end of October. This Halloween night coincides closely with a new moon (8:47 Eastern tomorrow), so it's not nearly as cool as when it coincides with a full moon. So I'm doing... absolutely nothing for the occasion. Except, of course, trying to keep both of the black cats indoors, because some people are idiots. Speaking of the moon, though, here's a lunar mythology article from Discover: Myths of the Moon Shaped Ancient Cultures and Modern Cultures of Today From ancient myths to modern superstitions, all cultures — even modern ones — tell stories about the moon. The moon is something all people on Earth, no matter where or when they’ve lived, have in common. Unless, I suppose, you spend your entire life underground. That we all share the moon does not mean that we imagine it the same way, however. In our myths and stories, the moon plays many different roles. We know more about the moon than ever before, and we continue to learn more. That dead hunk of rock in the sky still has secrets to give up. Myths and legends don't tell us much about the moon, but they do reveal a lot about us. The relationship between the moon and the tides, for example, was clear to early peoples who lived near the sea, explains Tok Thompson, an anthropologist at the University of Southern California who specializes in folklore and mythology. That's probably an example of noticing a correlation without understanding the deeper effects (in this case, gravity) behind the correlation. In other words, the "observation" part of the scientific method. The moon, with its regular phases, also functioned as a calendar. You could track the days by the sun, but for longer periods, the moon was useful. The English word moon comes from mensis, the Latin word for month, which is also the origin of the word ‘menstruation.’ Much as I enjoy a good etymology, I'm not sure this is right. Wiktionary traces the word back through Proto-Germanic, all the way to PIE. This suggests to me that, rather than the English word having its origin in Latin, the Latin and the Germanic words simply share a common root. I'm not an expert, by any means, but this is enough to make me question the "comes from Latin" assertion. Pretty sure Luna is from Latin, though; that was the name of a moon goddess. All that aside, let's stop pretending that Gregorian calendar months are related to lunar cycles. So yes, ancient cultures knew a lot about the moon, and they wanted to share that knowledge to keep a record of it. I also question the qualifier "a lot" in that sentence. Thompson’s favorite myth involving the moon is a creation story from the Tlingit people of the northwest coast of North America. In this story, an old man keeps all the light in the world stashed away in a box. Through wiles that vary from telling to telling, the trickster Raven steals the box and releases the Sun, the moon, and all the stars, bringing light to the world. One could be forgiven for wondering how Raven was able to see the box, considering there was no light at the time, but myths follow their own, dreamlike, rules. In Chinese mythology, a woman named Chang E drank a magic elixir, whereupon she floated all the way to the moon, and there she lives still — with, in some versions, a rabbit. Radical feminists: "Hey, gimme some of that elixir!" We still have some fascinating myths about our favorite satellite. 
Probably the most persistent is that the moon can drive us mad. I find that it's necessary to separate the two meanings of "myth." A foundational story, like the ones about Raven or Chang E, is not the same thing as a persistent falsehood. I don't think this article draws the distinction. Many people — including some healthcare workers and police officers — believe that crime, traffic accidents, psychiatric hospital admissions, and even suicides spike during a full moon. Or that could be cognitive bias. You notice things more when they fit your pre-existing worldview. Like when you're frustrated by machines and bureaucracy, and think "Mercury must be in retrograde." These days, we don’t blame the effect on moon goddesses or magic elixirs but on something at least quasi-scientific: the moon’s gravitational effect on the water in our bodies. On its surface, that makes a lot of sense. No. No, it does not. After all, the moon creates tides, and roughly half our body weight is water. Okay, but have they ever observed tides in a mud puddle? And indeed, though a few small studies have found some possible effects of the moon on mental health, research over the past decades has not borne this out. For example, a Swiss study published in 2019 looked at almost 18,000 cases of in-patient psychiatric admissions over ten years and found no correlation between the phase of the moon and psychiatric admissions or length of stay. There's some nuance there, but basically, the "moon causes madness" bit is the second definition of myth. Pausing to gaze at the moon can create an intense feeling of both awe and peace. Rather than making us mad, the moon might just make us a little bit more sane. Nice to think, but there's little evidence for that, too. Still... go look at the moon. Might want to wait a few days, though; it's too close to the sun right now.
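To put numbers on my mud-puddle skepticism (my own back-of-the-envelope, not anything from the article): the tidal acceleration the moon exerts across a body of size $r$ at distance $d$ is roughly

$$a_{tidal} \approx \frac{2 G M_{moon}\, r}{d^3} \approx \frac{2\,(6.67\times10^{-11})(7.35\times10^{22})(1\ \text{m})}{(3.84\times10^{8}\ \text{m})^3} \approx 2\times10^{-13}\ \text{m/s}^2,$$

which is about fourteen orders of magnitude smaller than the $9.8\ \text{m/s}^2$ holding you in your chair. Tides need oceans; a human-sized bag of water doesn't qualify. |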
From Cracked, an extraordinarily important article: Puns are the highest form of humor, though the joy of them comes not from an audience's response, but from the punster watching the audience's response. And we have to watch carefully, lest one of them yeet something harder than a rotten tomato at us. But there are two things about puns that I don't like: 1) When someone steals one of mine, even inadvertently; and 2) when someone comes up with one that I really should have, but didn't. Such is the case with the second one at the link. Because it's Cracked, it's a countdown list, so it's the one numbered 13. 13. My girlfriend is the square root of negative 100. She’s a perfect 10, but also imaginary. Our relationship is complex. See? See? May the creator of that one be darned to the grayest middle parts of Heck. Obviously, I'm not going to reproduce all of them here. But this one made me laugh: 6. I told my husband he was awful at directions He got so mad that he packed his bags and right. Now, I should come up with a really witty pun of my own to wrap up this entry. I should, but I'm coming up empty. My muse is on vacation. So none of us get to be a-mused. |
From Big Think, an article about how the numbers sometimes lie: The “McNamara fallacy”: When data leads to the worst decision Don’t make the mistake of blindly following quantitative metrics — whether you’re helping clients or looking for lunch. Or do they? In 2014, I took a weekend break to York. You know what sometimes lies? Anecdotes. York is a lovely city in the north of the UK, with an ancient cathedral, quaint cobbled roads... Oh, hell no. ...and an interactive Viking experience. Is that where you get hung by the ankles and forced to watch your clan murdered and village pillaged while you're upside-down? The reason for the anecdote is quickly revealed: the author made the mistake of trusting online reviews whilst choosing a restaurant, Skewers, which turned out to have high ratings due to drunks lovingly appreciating their late-night-early-morning fare. Let's be clear, here: This is not a failure of TripAdvisor. It's not a failure of the food establishment. It's certainly not a failure of the well-meaning drunks. The blame lies entirely on the author. I travel quite a bit, and I prefer to form my own opinions of places I visit, rather than relying on others' (sometimes real, sometimes not) experiences. Sometimes, it sucks, but usually, I'm pleasantly surprised. Either way, I get to write reviews of my own, which may or may not factor into someone else choosing to visit. The story of Skewers is an example of the McNamara fallacy, and learning about it can help us all (especially the underprepared tourists among us). In spite of my misgivings about the opening anecdote, I'm willing to read on. Fortunately, the author again gets quickly to the point, which is how I know I'm reading Big Think and not The New Yorker: The McNamara fallacy is what occurs when decision-makers rely solely on quantitative metrics while ignoring qualitative factors. So, like, when you rely only on the studies that claim booze is bad for you, and ignore how good it makes you feel. The fallacy is named after Robert McNamara, the U.S. Secretary of Defense during the Vietnam War, where his over-reliance on measurable data led to several misguided strategies where considering certain human and contextual elements would have been successful. I'm no expert on war history, but claiming that different decisions "would have been successful" strikes me as arrogant. I'd weasel out with "might have been successful," instead. The McNamara fallacy is not saying that using data is bad or that collecting as much information as you can is wasted time. I'm just leaving this here lest someone be thinking, "What's the point in data, then?" For example, it’s not uncommon for someone to deeply love a book that has few or no reviews on Goodreads. It’s possible to enjoy a restaurant or a movie in spite of what others say. Data is a great starting point, and a great many idiotic and dangerous things are done when we ignore data, but it doesn’t always make for the best decisions. I'm very familiar with that assessment, anyway. People are different, and you might love something others hate, or vice-versa. Reviews and the like are aggregations, and blindly following them is roughly the same thing as blindly following the crowd. You can do that if you want, but I don't, because that way, I miss out on the joy of making my own discoveries. If everyone liked the same things, there'd only be one beer, and that would be a boring world indeed. 
I have a much bigger problem with the explicit comparison of "finding a place to eat" and "running a war where people die." Still, I suspect the basic ideas of the fallacy are sound, and the article goes into other examples and applications thereof, which I don't feel the need to reproduce here. Fear not, though; it's short, and there are very few actual numbers. So, a solid four stars out of five from me. Your experience may vary. |
I don't know if I've used anything from Inverse before. This article caught my eye a while back. Just as the answer to any yes/no headline question is "no" by default, the answer to "why do [businesses] do [thing]" is almost always "money." It's been a while since I saved this article, so let's see if I'm right. We live on a planet where people still die of starvation, and yet we still waste so much food — it’s a problem, and not just for sustenance but the environment. We can solve the problem, or we can learn to live with it. I did. Part of it is just a category problem: Americans are used to seeing a wide and alluring variety of foods on shelves, and a lot of it, especially for produce and meats. One of the first times I set foot in a Whole Fools, long before they merged with Amazon, I noticed the marketing gimmick of their produce section: an abundance of everything, all bins topped off and on the verge of overflowing. This was obviously a long time ago, but I remember thinking: "Some marketing expert decided that the store has to look like it'll never run out." I imagined stock clerks (or "associates" or whatever) being paid minimum wage to watch the produce section, and run in and replace every cantaloupe or courgette some customer put in their cart. This is in contrast with other local grocery stores, which seemed to have no problem letting bins get nearly empty before replacing the contents. I imagine that's bad in a couple of ways. Some people, seeing that there are only three oranges left, will move on to buy something else, thinking, perhaps, "Those three must have been the ones no one else wanted," or, if they're more generous in their estimation of humans, "Let someone who really needs them have them." Others will grab all three, thinking something like "I'd better get that now," and then they don't eat them, thus simply moving the food waste problem from the producer to the consumer. To be clear, though, I have no idea if there's any difference in food waste between WF and, say, Kroger. So what do the stores do? They overpurchase, “knowing that some food waste must be built into their bottom line,” says Jennifer Molidor, senior food campaigner at the Center for Biological Diversity. And what do they do with the unbought food? Do they donate it to homeless shelters? Not from what I've heard; they just toss it into a dumpster under a surveillance camera to make sure no one's getting free food. Profit margins on perishable foods are so high that stores would rather overstock so as not to miss even one sale. Yeah, I'm going to need a citation for that. But it does tie in to my "money" answer above. On the high-tech side, retailers are starting to use artificial intelligence to better determine how much and when to order food items. Using AI, huh? Well, that should calm consumer fears. Another high-tech solution is “dynamic pricing,” or flexible price points that can shift depending on real-world market factors, in this case allowing stores to discount items that are getting close to the end of their shelf life. That's actually pretty cool, in my opinion. Though stores have done a version of that for as long as I can remember, selling yesterday's bakery products, for example, at a discount. (The article does point that out.) Still, when food does go unsold, it can go somewhere more productive than a landfill, and there, too, ReFED has seen progress. 
In the Pacific Coast Food Waste Commitment, signatories increased the percentage of unsold food that was composted by 28 percent and donated by 20 percent. Okay, so maybe some of it does get donated. My inner cynic (which is like 90% of me) wants to know if the companies get tax breaks for doing so. One challenge for dynamic pricing is how to reduce prices without a ton of extra labor. Some startups are experimenting with digital labels that could be easily changed. If a price can be easily changed, it can be easily increased. When gas stations moved from physical signs to digital ones, their prices became more dynamic, too, probably because it takes less labor to change them. (I'll sketch the discounting logic at the end of this entry.) There are also plenty of hurdles for finding useful places for food waste to go. For one thing, “landfilling in the United States is dirt cheap,” Sanders says, so doing just about anything else with unsold food costs more. "Landfilling." "Dirt cheap." I see what they did there. Also, again: money. Donating food, meanwhile, not only comes with liability risk to keep the food well preserved, but it also requires new processes and labor for store employees to collect the food and new partnerships for how it gets picked up and where it goes. Yeah, you have to wonder about liability in those situations. That's gotta affect their bottom line, too. At some point, change will require mandatory measures, not just voluntary ones, to disrupt that dynamic. “We need laws and regulations from the government to hold industry accountable and make food waste prevention a requirement,” Molidor says. Oh, more regulations? You just lost half your support. Anyway, I don't have solutions. Like I said, I forced myself to get comfortable with food waste, because it's like when you were a kid and you wouldn't eat your vegetables and your mom was like "Think of all the starving children in Ethiopia!" (or whatever place was well-known for having famine when you were a kid). And you pushed the plate toward her and said "Send it to them!"
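Back to dynamic pricing for a moment, since it's the one idea in there concrete enough to sketch. Here's a minimal Python version under invented assumptions (the 50% cap and the linear ramp are mine; real retail systems weigh demand, margin, and much more):

# Discount an item linearly as its shelf life runs out, up to an assumed cap.
def dynamic_price(base_price, days_left, shelf_life, max_discount=0.5):
    freshness = max(days_left, 0) / shelf_life
    discount = max_discount * (1 - freshness)
    return round(base_price * (1 - discount), 2)

for days in (7, 3, 1, 0):
    print(f"{days} days left: ${dynamic_price(4.99, days, shelf_life=7)}")

Nothing in that function stops a retailer from running the ramp the other way, of course, which is exactly my worry about easily changed prices. |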
Almost four years ago, I wrote this entry in response to a 30DBC prompt: "Stylish" The prompt was fairly long for a prompt, so the entry was fairly long for an entry. Prompt: "What is your blogging style? In your response, consider the following questions: What is your process of writing a blog entry - do you plan it out in advance, or just start writing? Who is your ideal reader? How did your unique blogging style emerge? Has your blog changed over time?" Upon re-reading the entry, I was a bit disappointed that few of the answers would change between then and now. There's something to be said for consistency, but there's also something to be said for growth. Yesterday, I made my habitual Monday foray to the local taphouse. That's one thing that's changed. I still like that taphouse, but I don't go regularly anymore. I sit outside, on their patio, because it's a better bet than dining indoors. There are no guarantees, of course, but the science points to a lower risk of Trump Mumps transmission if you're not inside. This has changed, of course, because it's not 2020 anymore. As far as I know, I still haven't gotten Trump Mumps... but I've had a few bouts of cold-type illness, and never did bother getting tested. None of which really has anything to do with the prompt, except to illustrate that I don't really plan out these entries, other than maybe giving them an hour or so of thought, usually while doing something else (in this case watching YouTube videos about science, philosophy, and the philosophy of science), and then, in the entry, I could write about almost anything. My YouTube consumption has diminished since then, also. I despise ads, and they want to charge too much for no-ads. And even without ads, I got really weary of everyone telling me to "like and subscribe." Urging me to like, subscribe, comment, etc. is an absolutely certain way to ensure I never will. I don't want to get tied down to any one subject, because so many things are interesting to me, and in the end, I'm not writing for any particular type of reader, but just to write. That... wasn't exactly true then, and it's not exactly true now. It occurred to me several years ago that some of the best works of fiction were written, not with a particular demographic in mind, targeted to what market research suggests would be the most lucrative audience, but for an individual or small group of people. So, I imagine writing for one particular person. It's not always the same person. Still, it's not a lie; I don't write for a "type." Thus, I really haven't tried to push myself into a "style." Sometimes I'm funny (or try to be; jury's still out), and sometimes I'm completely serious. Sometimes both at the same time. The problem there is I'm not sure anyone can tell the difference. This bit hasn't changed. When I started blogging, lo these many years ago, it was mostly about personal stuff, like the crap I started with today. Clearly, I still do that when the situation warrants, like when I was traveling and had experiences to relate. Obviously, I do still talk about personal shit sometimes, but I've arranged my life specifically to avoid drama, so very little happens to me that anyone else would consider interesting. The downside of this is that when it does, I barely have the practice to handle it. But writing isn't work. I suspect that if I ever made actual money from it, I'd probably start to consider it work and break out in hives. Still true. 
As I noted recently, I've managed to add an entry every day of this calendar year thus far, and I'm hoping to make it to December 31 (not that I'll stop then, but I do expect to take a couple of breaks next year). Those breaks didn't happen, and I'm closing in on a five-year uninterrupted streak. So, I guess, some things did change, but mostly things outside my control. |
From SciAm, and an author I hung out with for a week a few years ago: Why Does the Moon Look Bigger Near the Horizon? The rising moon looks huge on the horizon, but it’s all in your head I can smugly assert that this is something I'd already figured out. I remember watching the full moon rise one early evening a while back. You'd think astronomers would spend more time watching moonrises. As it cleared the horizon, the moon looked huge! Eastern Colorado, where this occurred, is basically a continuation of Kansas: Flat as the proverbial pancake. There's an actual horizon there. Anyone who is capable of seeing the moon (or the sun) near the horizon has experienced this effect. I don't have an unobstructed view of the horizon here, but I've spent time on the coasts, so yes. But it’s not real. Simple measurements of the moon show it’s essentially the same size on the horizon as when it’s overhead. This really is an illusion. As opposed to time, which really is not (yes, another time article is in the queue). Attempts to explain it are as old as the illusion itself, and most come up short. Aristotle wrote about it, for example, attributing it to the effects of mist. As smart as Aristotle was, he got lots of stuff wrong. This was one of those things. A related idea, still common today, is that Earth’s air acts like a lens, refracting (bending) the light from the moon and magnifying it. The "simple measurements" noted in an earlier quote refute that bit instantly. Refraction does make it appear redder than usual, though, just like it does with the rising or setting sun. Another common but mistaken explanation is that when the moon’s on the horizon, you’re subconsciously comparing it with nearby objects such as trees and buildings, making it look bigger. But that can’t be right; the illusion still occurs when the horizon is empty, such as at sea or on the plains. But, see, coming up with explanations like that, even when they can be disproven... that's part of science. So what’s the cause? Like so many things in science, there are two effects at play here. Phil indeed goes on to explain the cause, but no need for me to quote it word-for-word. The first part of the answer is related to this well-known optical illusion. The second part is devoted to showing that we don't see the sky as a hemisphere, but as a nearly-flat surface. This is the harder part to accept, I think, but next time you're out on the plains or the ocean on a partly cloudy day, note how the bottoms of the clouds make the sky seem flat. This is for the same reason that the Earth itself appears (but only appears, dammit) flat in such locations. So you've got two (nearly) planes, earth and sky (the temptation exists to make a plains/planes pun, but I'm not in the mood), but only one moon. Still, presumably, you've seen the moon high in the sky quite often; when it sits on the horizon instead, your brain judges it to be farther away along that flattened sky, and a same-sized image at a greater apparent distance reads as a bigger object (I'll condense that logic below). Moon illusion misconceptions still abound, and like so many myths, they likely won’t go away no matter how much someone like me writes about them. Yes, this is, indeed, the curse of fact-checkers.
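The flattened-sky half of the explanation condenses to one line (my own summary of the standard account, not a quote from Phil): perceived linear size is perceived distance times angular size,

$$s_{perceived} = D_{perceived} \cdot \theta,$$

and the moon's $\theta$ is the same half a degree wherever it sits. If your brain models the horizon sky as farther away than the zenith sky, the horizon moon gets assigned a bigger $s$. Same image, different bookkeeping. |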
Today in You're Doing It Wrong... No, Salted Water Doesn't Boil Faster and 7 Other Pasta Myths We asked an Italian chef about the eight biggest myths and misnomers linked to making pasta at home. Most likely they asked "an Italian chef" because if they'd asked 10 Italian chefs, they'd have gotten a hundred different opinions. If you think salting your pasta water is going to get dinner on the table faster, think again. I have to wonder how that bit got started. Adding salt to water increases its boiling point, making it take longer to come to a boil. The higher-temperature salt water would then, presumably, cook the noodles faster. It would be pretty simple to do an experiment to see which effect takes precedence... and then you'd probably find that the amount of salt used in cooking makes no meaningful difference to boiling temperature (I'll run the numbers at the end of this entry). To separate pasta protocol from the false and fabricated, we asked an expert about the biggest pasta myths, mistakes and misnomers that could be ruining your rotini and putting your pappardelle in peril. Really, I'm just quoting this bit to illustrate how proud the author is of his alliteration. Filippo de Marchi is chef de cuisine at De Majo Restaurant & Terrace. Stealth ad! We grilled Marchi on nine of the top-circulated pasta cooking myths. Tough to grill pasta. "Cooking pasta isn't difficult at all. It's all about timing and the right water-to-pasta ratio," he says. So, let me tell you what I remember of my mom's pasta-cooking technique. First, fill a small pot halfway with water. Then throw in the pasta (breaking the hell out of it if it's spaghetti). Then put it on the stove over high heat. Wait 45 minutes. Drain and serve. My mom had no Italian heritage. 1. Throwing pasta against a wall to see if it sticks proves it's done Chef's take: FALSE Of course this is false. I've always known it to be false. And yet, the false information persists, as it usually does. 2. Adding olive oil to pasta water keeps noodles from sticking Chef's take: FALSE "The oil just floats on top of the water and doesn't coat the pasta effectively," says de Marchi. Pretty sure there's more to it than that. Once the water boils, the oil gets stirred in more, though it's still going to be gloppy and not stick to the noodles. 3. Fresh pasta is always better than dry pasta Chef's take: FALSE Pretty sure that's a question of individual taste and opinion, not fact or fiction. It's all about personal preference. Fresh, dry or frozen; chefs aren't here to dictate what your taste buds like and don't like. Like I said. 4. Leave the pot covered while the pasta is cooking Chef's take: FALSE Of all the "myths" on the list, this is the only one I'd never even heard of. All pasta-making recipes I've seen demand open-top pots. On the flip side, everyone's other favorite cooked starch, rice, requires a covered pot. Unless you cheat and get instant rice or whatever. 5. Adding salt helps water boil faster Chef's take: FALSE As I noted above, middle-school chemistry disproves this one. "If you're cooking without enough salt, the pasta can end up tasting a bit bland," warns de Marchi, whose signature dish at NHC Murano Villa is a spaghetti alle vongole. The stealth ad continues! 6. Drain pasta until it's completely dry Chef's take: FALSE Besides, "completely" is misleading, here. Presumably, the pasta started out dry; that's why you stick it in boiling water. 7. 
You should run cooked pasta under water before serving Answer: FALSE Yeah, I'm pretty sure there are situations where you do want to do that (creating pasta salad, e.g.), but for your hot spaghetti dishes? Never. 8. You should precook sheets of lasagna Answer: FALSE I have to admit, this one confused me for a long time. Most store-bought lasagna has pre-cooking on the box instructions, as I recall (I haven't made it in a very long time). It wasn't until I met my second wife, who also had zero Italian ancestry, who introduced me to the lazy wonders of cook-it-in-the-baking-pan lasagna. Left out of the article: the other two cooking tips mentioned above (they said nine total), which presumably aren't myths. I guess we'll all have to go to this guy's restaurant and ask him ourselves. |
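As promised, the arithmetic on the salt thing. This is a minimal sketch (Python; the idealized textbook model of boiling-point elevation, so treat it as an estimate rather than a lab result) for a typical pasta-water salting level:

```python
# Back-of-the-envelope boiling-point elevation for salted pasta water,
# using the ideal van 't Hoff model: dT = i * Kb * molality.
KB_WATER = 0.512         # ebullioscopic constant of water, degC*kg/mol
NACL_MOLAR_MASS = 58.44  # g/mol
VANT_HOFF_I = 2          # NaCl dissociates into two ions

def boiling_point_rise(grams_salt, kg_water):
    molality = (grams_salt / NACL_MOLAR_MASS) / kg_water
    return VANT_HOFF_I * KB_WATER * molality

# "Salt it like the sea" is an exaggeration; ~10 g per liter is typical.
print(f"{boiling_point_rise(10, 1.0):.2f} degC")  # ~0.18 degC
```

Two-tenths of a degree, give or take. Even actual seawater salinity (about 35 grams per liter) only buys you roughly 0.6°C, and that same tiny effect is all that delays the boil in the first place. Salt for flavor; the clock doesn't care. |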
I've written before about the bad-news bias toward lottery winners, where the media latches on to a tragedy connected to someone who won the lottery as if to say: "See? You have to work for your money, or bad things will happen." Whereas in reality, the majority of big-ticket lottery winners go on to live decent lives. Sure, they eventually get sick and/or die, but that's going to happen whether you have lots of money or not. Well, here's a rare article, from Business Insider, that takes an honest look at one guy's lottery win. The temptation there is to react with envy or sour grapes. I'm just looking at it as one person's experience, which of course may not be representative of everyone's. This as-told-to essay is based on a transcribed conversation with Timothy Schultz, who won the Powerball Lottery in 1999. So, a quarter-century ago. Plenty of time for someone to figure out what to do with all that money. I started playing the lottery once or twice weekly, buying a single ticket. I visualized winning and told people about it. They said, "Well if anyone's going to win, you're going to win." Then I did. Plenty of people "visualize" winning and continue to lose. I keep visualizing a date with Halle Berry, but it hasn't happened yet. A press conference announced I had won the $28 million Powerball lottery. After that, our phone was inundated with messages. People I knew congratulated me, but there were stacks of letters from strangers, some of whom asked for money. I'd like to see a study about whether this happens more often to major lottery winners than to other rich people. I'd always imagined what I'd do if I ever won: pay off debt and put myself through college, but I'd never thought about how it would change my life. I immediately question how a 21-year-old without student loans can get themselves into debt so deep the lottery has to get involved. I've been in debt, but it took me longer than 21 years to get there. Suddenly, I'd gone from a gas station attendant to retired at 21. I felt like I was holding a magic wand. Everything was possible, but I also wanted to be financially responsible. Some would say that one important aspect of financial responsibility is to not play the lottery. Those who are irresponsible before winning a jackpot will generally continue to be irresponsible. There are, of course, exceptions. Before turning in the ticket, I consulted with wealth professionals to understand how much I could afford to spend and give to others. Not a bad idea, depending on how much the "wealth professionals" gouge you for the advice. Before I received the money, I set up a plan with advisors to invest it. We invested conservatively so the returns could last me over a lifetime. Also a good idea. Basic napkin calculation: you win $20M. Assuming that's the actual lump sum payout and not the amortized version, you then want to do immediate things with it like, maybe, set your family up with houses or give some to the local animal shelter or whatever. Maybe take a round-the-world cruise. Whatever floats your boat. Say you set aside half of it just for funsies. That leaves $10M to invest. Conservative investments tend to return an average of something like 8% a year. This is the stock market we're talking about, so some years it might be negative, while other years it'll be higher. In 1999, the dot-com crash was still a few months away, but you'd have no way of knowing that.
Because inflation is also a thing, there's a rule of thumb about living on 4% of the investment account balance on an annual basis. Some question that rule, but again, just being quick here. 4% of $10M is $400K. That's more than most people make in the US, even now, let alone in 1999. Keep in mind that these assumptions maintain, or possibly slightly grow, the balance. It can stay there, working for you, while you sit on the beach very much not-working, until a) someone fucks up the portfolio, b) someone gets irresponsible with it, c) society collapses, or d) hyperinflation takes over. (I'll make this napkin math runnable at the end of this entry.) But as a 21-year-old, the first thing I bought was the latest video game system. Honestly, I can't blame him one bit. And what did that cost in 1999? $500? That's a rounding error. I went back to college to study film and broadcast journalism, a dream come true. Perhaps the best thing about having money at a young age would be the ability to pursue whatever college degree you wanted, without concerning yourself about ROI on tuition. Or maybe that's not everyone's priority. People were supportive, but some treated me differently. Some tried to get closer to me, which made me feel like a walking, talking ATM. It may be true that, once you have money and people know it, you always wonder if someone's becoming your friend because you're cool, or because they see a way to get money out of you. When you win the lottery, people don't view the money as something you've earned. A family member explicitly told me I got something for nothing by winning the lottery and should keep giving them and others money. Bullshit. Also, "I want unearned income too!" is massively hypocritical. It was a steep learning curve navigating the social aspect of winning the lottery. Technically, it would have been a shallow learning curve, as per an entry I did here a while back, but there's no accounting for English idioms. People ask all the time, "Does money buy happiness?" Money doesn't necessarily change who you are. It can affect happiness by buying time, providing opportunities, and alleviating stress about debt. I know I've harped on this before, but "Does money buy happiness?" is the wrong question, with wrong assumptions. But whoever says it doesn't has never enjoyed a really good single-malt scotch. I wish I had invested in bitcoin a few years ago, but that's my only regret about how I've spent the winnings. An investor's worst enemy is hindsight. Personally, I'm glad I never got into Dunning-Krugerrands. Don't trust 'em. These days, I don't buy anything too crazy. Like many people, I live within a budget. Budgeting is important, rich or poor. But if I were 21 now and had the option, I would consider claiming the prize anonymously, especially if it was a large prize. I'd recommend that for lottery winners at any age. So. Nothing exceptionally bad happened to him, some good things did, but it did change his life and how he looked at things. Basically, life happened. This is just one person, though, as I noted. But it's nice to see an honest look at the results for that one person, without sensationalism. Even if the article is an ad for his media conglomerate.
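And here's that napkin math in runnable form. A minimal sketch under deliberately crude assumptions (flat 8% nominal return, 3% inflation, one withdrawal per year, no taxes or fees; all figures hypothetical), so it illustrates the compounding argument rather than constituting financial advice:

```python
# The hypothetical $10M left over after the fun money.
balance = 10_000_000
RETURN_RATE = 0.08           # assumed flat nominal return
INFLATION = 0.03             # assumed flat inflation
withdrawal = 0.04 * balance  # the 4% rule: $400K in year one

for year in range(25):
    balance = balance * (1 + RETURN_RATE) - withdrawal
    withdrawal *= 1 + INFLATION  # keep the payout's purchasing power level

print(f"balance after 25 years: ${balance:,.0f}")  # roughly $30M nominal
```

Even while cutting a $400K-and-rising check every year, the pile roughly triples in nominal terms over the quarter-century since his win. That's the "working for you while you sit on the beach" part; outcomes a) through d) are left as an exercise. |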
Comedy meets science in another Cracked article. Questions It Looks Like Science Is Never Gonna Be Able to Answer Those eggheads aren’t so smart after all, are they? Right, because not knowing everything is the same as knowing nothing. Science isn’t a discipline known for moving fast, breaking things or shooting first and asking questions later. Well... kind of. For various definitions of "fast." 4. The Riemann Hypothesis It’s ridiculously complicated, but let’s just say that according to the hypothesis, proposed by Bernhard Riemann, there’s a pattern to the distribution of prime numbers along the number line. Oh, come on, that's not science; that's math, a separate discipline that will never know everything. 3. Goldbach’s Conjecture If 160 years seems like a long time, try 282. That’s how long ago Russian mathematician Christian Goldbach theorized that “every even positive integer greater than 3 is the sum of two (not necessarily distinct) primes.” I invented the principle "never let facts get in the way of a joke," but again... math, not science. (Though math has one virtue here: anyone can check the small cases, as in the sketch at the end of this entry.) 2. The Clarendon Dry Piles It sounds like a sex act that only exists on Urban Dictionary, but it may be even weirder. It’s two brass bells connected by a pair of batteries called dry piles powered by electrostatic forces. The amount of charge carried between the bells is so small that it appears to be a perpetual motion machine, but the truth is, the batteries are just draining extremely slowly. So science has, in fact, answered a question: extremely slow drain. At least this one is actually science, not math. It’s anyone’s guess when the batteries might die, but we’ll probably never know because the hardware will probably break first. Seems to me that question will eventually be answered, so this section still doesn't meet the promise of the headline. 1. The 500-Year Microbiology Experiment Also science. It’s a stunning act of futility and optimism to bet on society surviving another 500 years, but that’s what scientists at the University of Edinburgh did in 2014 when they endeavored to find out if certain strains of dried bacteria can survive five centuries. This headline sucked. That's another question that will eventually be answered. That is, if, as the article notes, science is still around in 490 years.
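About checking those small cases: here's a minimal brute-force sketch (Python; my own toy, nothing to do with how the serious verification projects work) confirming Goldbach's conjecture for every even number up to 10,000:

```python
def is_prime(n):
    """Trial division; plenty fast for this range."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return False
        f += 2
    return True

def goldbach_pair(n):
    """Return (p, q) with p + q == n and both prime, or None."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return p, n - p
    return None

# Every even number from 4 through 10,000 has such a pair.
assert all(goldbach_pair(n) for n in range(4, 10_001, 2))
print(goldbach_pair(100))  # (3, 97)
```

Exhaustive searches have reportedly pushed the bound well past 10^18, which, as any mathematician will happily remind you, is exactly as far from a proof as checking n = 4. |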
And now, one of my rare posts about actual writing... It is a truth universally acknowledged that of all the bad opening lines of literature, with the possible exception of the one that begins, "It is a truth universally acknowledged," the one that begins, "It was a dark and stormy night" is the nadir of all possible opening lines. This was from Bulwer-Lytton's novel Paul Clifford, about which literally no one outside of academia knows anything apart from the opening line. Like all truths universally acknowledged, it's not necessarily true. But it still provides the impetus for writers to try to do worse, generally to hilarious effect. I've posted about this before, I know, but today's entry is about this year's "winners." Now, you'll have to go to the site to see the Grand Prize Winner. Suffice it to say that I don't agree. It's bad, but not dark-and-stormy-night bad. I'll do what that site doesn't, which is to first paste the inspiration for the contest, the actual opening line of Paul Clifford (they put it way down the page): It was a dark and stormy night; the rain fell in torrents—except at occasional intervals, when it was checked by a violent gust of wind which swept up the streets (for it is in London that our scene lies), rattling along the housetops, and fiercely agitating the scanty flame of the lamps that struggled against the darkness. Now, some of my personal favorites: It was a dark and stormy night, which makes perfect sense when you realize we’re on Neptune, with a mean distance from the Sun of 4.5 billion kilometers (or 30 astronomical units), and winds that howl at 100 meters per second, composed of mostly hydrogen and helium (and only trace amounts of methane), which is way better than Uranus, which stinks to high heaven. —Jon A. Bell, Porto, Portugal Yes, I'd probably have selected that one as the Grand Prize Loser, mostly because I'm sick and tired of seventh-planet puns, but also because of the completely unnecessary science data. And, of course, I'm quite fond of the Vile Puns section: "I do enjoy turning a prophet," said Torquemada, as he roasted the heretic seer on a spit. —A. R. Templeton, Stratford, Canada And: "My laddies may not be the fastest sugar cane harvesters," Fergus confessed, "but they're not as slow as my lasses..." —Mark Meiches, Dallas, TX Just one more; like I said, the page is there to view the other stinkers at your leisure: Ralf Smalborgson kept a small shop in Direperil, Minnesota, and his goods consisted only of medieval stringed instruments, lanyards and backstays, and some limited apothecary supplies, giving the store its uninviting signage: Lute, Rope, and Pillage. —Ciarán McGonagle, Derry, Northern Ireland Which reminds me of my D&D-adjacent bard, one of whose catchphrases is, "Come on, baby, fight my lyre!" |